Unseen Is Free


Wednesday, April 26, 2017

Scientists Identify Chemical Causes Of Battery "Capacity Fade"

Like you, me and everyone we know, batteries have a finite lifespan.

When a battery enters “old age,” scientists refer to its diminished performance as “capacity fade,” in which the amount of charge a battery can supply decreases with repeated use. Capacity fade is the reason why a cell phone battery that used to last a whole day will, after a couple of years, last perhaps only a few hours.

But what if scientists could reduce this capacity fade, allowing batteries to age more gracefully?

When manganese ions (gray) are stripped out of a battery’s cathode (blue), they can react with the battery’s electrolyte near the anode (gold), trapping lithium ions (green/yellow).

Credit: Robert Horn/Argonne National Laboratory

Researchers at the U.S. Department of Energy’s (DOE) Argonne National Laboratory identified one of the major culprits in capacity fade of high-energy lithium-ion batteries in a paper published in The Journal of the Electrochemical Society.
For a lithium-ion battery – the kind that we use in laptops, smartphones, and plug-in hybrid electric vehicles – the capacity of the battery is tied directly to the amount of lithium ions that can be shuttled back and forth between the two terminals of the battery as it is charged and discharged.

This shuttling is enabled by certain transition metal ions, which change oxidation states as lithium ions move in and out of the cathode. However, as the battery is cycled, some of these ions – most notably manganese – get stripped out of the cathode material and end up at the battery’s anode.

Once near the anode, these metal ions interact with a region of the battery called the solid-electrolyte interphase, which forms because of reactions between the highly reactive anode and the liquid electrolyte that carries the lithium ions back and forth. For every electrolyte molecule that reacts and becomes decomposed in a process called reduction, a lithium ion becomes trapped in the interphase. As more and more lithium gets trapped, the capacity of the battery diminishes.

Some molecules in this interphase are incompletely reduced, meaning that they can accept more electrons and tie up even more lithium ions. These molecules are like tinder, awaiting a spark.

When the manganese ions become deposited into this interphase they act like a spark igniting the tinder: these ions are efficient at catalyzing reactions with the incompletely reduced molecules, trapping more lithium ions in the process.

“There’s a strict correlation between the amount of manganese that makes its way to the anode and the amount of lithium that gets trapped,” said study coauthor and Argonne scientist Daniel Abraham. “Now that we know the mechanisms behind the trapping of lithium ions and the capacity fade, we can find methods to solve the problem.”

The study, “Transition Metal Dissolution, Ion Migration, Electrocatalytic Reduction and Capacity Loss in Lithium-Ion Full Cells,” appeared in the online edition of The Journal of the Electrochemical Society on January 5. The other two authors were James Gilbert and Ilya Shkrob, both with Argonne.

The research was funded by DOE’s Vehicle Technologies Office.

Contacts and sources:
Argonne National Laboratory

Orange Essential Oil May Help Alleviate Post-Traumatic Stress Disorder

Researchers find evidence that essential oil reduces fear, diminishes immune system markers of stress in mice. 

About 8 percent of people will develop post-traumatic stress disorder at some point in their lives, according to the U.S. Department of Veterans Affairs, yet treatments for this debilitating condition remain limited. In a new study, mice exposed to orange essential oil after a stressful situation showed improvements in markers of stress and fear, suggesting essential oil may offer a nonpharmaceutical option to help alleviate PTSD.
Credit: Pixabay

Cassandra Moshfegh, research assistant in Paul Marvar’s laboratory at the George Washington University, presented the work at the American Physiological Society’s annual meeting during the Experimental Biology 2017 meeting, held April 22–26 in Chicago.

“Relative to pharmaceuticals, essential oils are much more economical and do not have adverse side effects,” said Moshfegh. “The orange essential oil showed a significant effect on the behavioral response in our study mice. This is promising, because it shows that passively inhaling this essential oil could potentially assuage PTSD symptoms in humans.”

Essential oils are aromatic compounds produced naturally by plants. Orange essential oil is typically extracted from the peel of the orange fruit. People use essential oils for therapeutic purposes by diffusing them into the air, applying them to the skin or ingesting them in foods or beverages.

The researchers tested the effects of orange essential oil using Pavlovian fear conditioning, a behavioral mouse model used to study the formation, storage and expression of fear memories as a model for PTSD. Mice were exposed to the orange essential oil by passive inhalation 40 minutes before and after fear conditioning. Typically, mice freeze in fear when they later hear a certain auditory tone, a response that diminishes gradually over time.

Twelve mice received the tone by itself, 12 mice received water and fear conditioning, and 12 mice received orange essential oil and fear conditioning. Mice exposed to orange essential oil by passive inhalation showed a significant reduction in freezing behavior and stopped freezing earlier than the water-exposed, fear-conditioned mice. They also showed significant differences in the types of immune cells present after fear conditioning. The immune system contributes to the inflammation associated with chronic stress and fear, so immune cells are a marker of the biochemical pathways involved in PTSD.

Preliminary results point to differences in the gene expression in the brain between the mice that were exposed to essential oil and those that were not, hinting at a potential mechanism to explain the behavioral results. Moshfegh said further studies would be needed to understand the specific effects of orange essential oil in the brain and nervous system and shed light on how these effects might help to reduce fear and stress in people with PTSD.

Cassandra Moshfegh presented this research 12:45–3 p.m. Monday, April 24, in Hall F, McCormick Place Convention Center (poster F158 882.5). Contact the media team for more information or to obtain a free press pass to attend the meeting.

Contacts and sources:
 Experimental Biology 2017
 American Physiological Society (APS)

In-Flight, On-Demand Hydrogen Production Could Mean “Greener” Aircraft

Technion researchers develop safe, efficient way to produce hydrogen from aluminum particles and water to meet in-flight aircraft energy needs

Aerospace engineers at the Technion-Israel Institute of Technology have developed and patented a process that can be used onboard aircraft while in flight to produce hydrogen from water (including waste water on the plane) and aluminum particles, safely and cheaply. The hydrogen can then be converted into electrical energy for inflight use. The breakthrough could pave the way for less-polluting, more-electric aircraft that replace hydraulic and pneumatic systems typically powered by the main engine.

Credit: ATS

The groundbreaking work was reported in a recent paper published in the International Journal of Hydrogen Energy.

“Hydrogen produced onboard the aircraft during flight can be channeled to a fuel cell for electrical energy generation,” said lead researcher Dr. Shani Elitzur of the Technion Faculty of Aerospace Engineering. “This technology offers a good solution to several challenges, such as hydrogen storage, without the problems associated with storing hydrogen in a liquid or gas state.”

While the use of hydrogen fuels has been a potential greener energy solution for some time, storing hydrogen has always been a problem. The engineers were able to work around the hydrogen storage problem by using non-polluting Proton Exchange Membrane (PEM) fuel cells and a process of aluminum activation patented by the paper’s co-authors, Prof. Alon Gany and Dr. Valery Rosenband.

Dr. Elitzur’s research focused on the chemical reaction between the activated aluminum powder and water from various sources to produce hydrogen. Either fresh water or wastewater already onboard the aircraft can be used for the reaction, which means the aircraft does not need to carry any additional water.

The spontaneous and sustained reaction between powdered aluminum and water is enabled by a special thermo-chemical aluminum activation process the researchers developed. A small fraction of a lithium-based activator, diffused into the aluminum bulk, modifies the protective properties of the oxide or hydroxide film covering the aluminum particle surface, allowing water at room temperature to react spontaneously with the aluminum.
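The underlying chemistry can be sketched as the commonly cited overall aluminum–water reaction (a textbook form, not taken from the paper; the actual product, Al(OH)3 versus AlOOH, depends on reaction temperature):

```latex
% Overall aluminum-water reaction commonly cited for hydrogen generation:
2\,\mathrm{Al} + 6\,\mathrm{H_2O} \longrightarrow 2\,\mathrm{Al(OH)_3} + 3\,\mathrm{H_2}\uparrow
\quad (\text{strongly exothermic})
```

The exothermic character of this reaction is what supplies the waste heat the researchers propose to reuse for galley heating and de-icing.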

The process does generate heat, which the researchers say can be used for a number of tasks, including heating water and food in the galley, de-icing operations, or heating aircraft fuel prior to starting the engines.

According to the researchers, their technology would provide:
  • Quieter operations on board an aircraft
  • Drastic reductions in CO2 emissions
  • Compact storage; no need for hydrogen storage tanks onboard aircraft
  • More efficient electric power generation
  • A reduction in wiring (multiple fuel cells can be located near their point of use)
  • Thermal efficiency (fuel cell generated heat can be used for de-icing, heating jet fuel)
  • Reduced flammable vapors in fuel tanks (Inert gas generation)

“The possibility of using available, onboard wastewater boosts both the efficiency and safety of the system,” explained Dr. Rosenband. “Also, the PEM fuel cells exhibit high efficiency in electric energy generation.”

Aircraft manufacturers, including Boeing and Airbus, have already investigated using onboard fuel cells. Boeing has experimented with them in smaller aircraft, in anticipation of using them on its 787-8, the current state-of-the-art electric airplane. According to the Technion researchers, fuel cells can even play an energy-saving role in airline and airport ground support operations when they are used for systems such as de-icing and runway light towers.

“Efficient hydrogen production and storage represents the future for efficient and safe aircraft inflight energy needs,” summarized Prof. Gany.

The Technion-Israel Institute of Technology is a major source of the innovation and brainpower that drives the Israeli economy, and a key to Israel’s renown as the world’s “Start-Up Nation.” Its three Nobel Prize winners exemplify academic excellence. Technion people, ideas and inventions make immeasurable contributions to the world including life-saving medicine, sustainable energy, computer science, water conservation and nanotechnology. The Joan and Irwin Jacobs Technion-Cornell Institute is a vital component of Cornell Tech, and a model for graduate applied science education that is expected to transform New York City’s economy.

Contacts and sources:
Technion-Israel Institute of Technology

One Step Closer to an “Exercise Pill”

Suppressing production of the protein myostatin enhances muscle mass and leads to significant improvements in markers of heart and kidney health, according to a study conducted in mice. 

The researchers zeroed in on myostatin because it is known as a powerful inhibitor of skeletal muscle growth, meaning that people with more myostatin have less muscle mass and people with less myostatin have more muscle mass. Studies suggest obese people produce more myostatin, which makes it harder to exercise and harder to build muscle mass. 

Credit: Wikimedia Commons

Joshua T. Butcher, PhD, a postdoctoral fellow at the Vascular Biology Center at Augusta University, presented the work at the American Physiological Society’s annual meeting during the Experimental Biology 2017 meeting, held April 22–26 in Chicago.

“Given that exercise is one of the most effective interventions for obesity, this creates a cycle by which a person becomes trapped in obesity,” Butcher said.

Obesity is linked with a range of factors that increase the risk of heart disease and diabetes, including high blood pressure, high cholesterol, insulin resistance and kidney damage. The researchers bred four groups of mice: lean and obese mice with uninhibited myostatin production and lean and obese mice that were unable to produce myostatin. 

As expected, mice that were unable to produce myostatin developed markedly higher muscle mass, though the obese mice remained obese even with more muscle. The obese mice that were unable to produce myostatin showed markers of cardiovascular and metabolic health that were on par with their lean counterparts and dramatically better than obese mice with uninhibited myostatin production.

“In our muscular obese mouse, despite full presentation of obesity, it appears that several of these key pathologies are prevented,” Butcher said. “While much more research is needed, at this point myostatin appears to be a very promising pathway for protection against obesity-derived cardiometabolic dysfunction.

“Ultimately, the goal of our research would be to create a pill that mimics the effect of exercise and protects against obesity. A pill that inhibits myostatin could also have applications for muscle wasting diseases, such as cancer, muscular dystrophy and AIDS,” he added.

Joshua Butcher presented this research on Tuesday, April 25, in Hall F, McCormick Place Convention Center (poster E266 1011.7).

Contacts and sources:
 American Physiological Society (APS)

Eye Expressions Offer a Glimpse Into the Evolution of Emotion

New research by Adam Anderson, professor of human development at Cornell University’s College of Human Ecology, reveals why the eyes offer a window into the soul.

According to the recent study, published in Psychological Science, we interpret a person’s emotions by analyzing the expression in their eyes – a process that began as a universal reaction to environmental stimuli and evolved to communicate our deepest emotions.

For example, people in the study consistently associated narrowed eyes – which enhance our visual discrimination by blocking light and sharpening focus – with emotions related to discrimination, such as disgust and suspicion. In contrast, people linked open eyes – which expand our field of vision – with emotions related to sensitivity, like fear and awe.

Subjects in the new study linked wide-open eyes, left, with emotions related to sensitivity, like fear and awe. Narrowed eyes, right, were linked to emotions such as disgust and suspicion.
Credit: Cornell University

“When looking at the face, the eyes dominate emotional communication,” Anderson said. “The eyes are windows to the soul likely because they are first conduits for sight. Emotional expressive changes around the eye influence how we see, and in turn, this communicates to others how we think and feel.”

This work builds on Anderson’s research from 2013, which demonstrated that human facial expressions, such as raising one’s eyebrows, arose from universal, adaptive reactions to one’s environment and did not originally signal social communication.

Both studies support Charles Darwin’s 19th-century theories on the evolution of emotion, which hypothesized that our expressions originated for sensory function rather than social communication.

“What our work is beginning to unravel,” said Anderson, “are the details of what Darwin theorized: why certain expressions look the way they do, how that helps the person perceive the world, and how others use those expressions to read our innermost emotions and intentions.”

Anderson and his co-author, Daniel H. Lee, professor of psychology and neuroscience at the University of Colorado, Boulder, created models of six expressions – sadness, disgust, anger, joy, fear and surprise – using photos of faces in widely used databases. Study participants were shown a pair of eyes demonstrating one of the six expressions and one of 50 words describing a specific mental state, such as discriminating, curious, bored, etc. Participants then rated the extent to which the word described the eye expression. Each participant completed 600 trials.

Participants consistently matched the eye expressions with the corresponding basic emotion, accurately discerning all six basic emotions from the eyes alone.

Anderson then analyzed how these perceptions of mental states related to specific eye features. Those features included the openness of the eye, the distance from the eyebrow to the eye, the slope and curve of the eyebrow, and wrinkles around the nose, the temple and below the eye.

The study found that the openness of the eye was most closely related to our ability to read others’ mental states based on their eye expressions. Narrow-eyed expressions reflected mental states related to enhanced visual discrimination, such as suspicion and disapproval, while open-eyed expressions related to visual sensitivity, such as curiosity. Other features around the eye also communicated whether a mental state is positive or negative.

Further, he ran more studies comparing how well study participants could read emotions from the eye region to how well they could read emotions in other areas of the face, such as the nose or mouth. Those studies found the eyes offered more robust indications of emotions.

This study, said Anderson, was the next step in Darwin’s theory, asking how expressions for sensory function ended up being used for communication function of complex mental states.

“The eyes evolved over 500 million years ago for the purposes of sight but now are essential for interpersonal insight,” Anderson said.

Contacts and sources:
Cornell University

Movie Research Results: Multitasking Overloads the Brain

The brain works most efficiently when it can focus on a single task for a longer period of time.

Previous research shows that multitasking, which means performing several tasks at the same time, reduces productivity by as much as 40%. Now a group of researchers specialising in brain imaging has found that changing tasks too frequently interferes with brain activity. This may explain why the end result is worse than when a person focuses on one task at a time.

‘We used functional magnetic resonance imaging to measure different brain areas of our research subjects while they watched short segments of the Star Wars, Indiana Jones and James Bond movies,’ explains Aalto University Associate Professor Iiro Jääskeläinen.

The subjects’ brain areas functioned more smoothly when they watched the films in longer segments.
 Image: Juha Lahnakoski

Cutting the films into segments of approximately 50 seconds fragmented their continuity. In the study, the subjects’ brain areas functioned more smoothly when they watched the films in segments of 6.5 minutes. The posterior temporal and dorsomedial prefrontal cortices, the cerebellum and the dorsal precuneus are the most important brain areas for combining individual events into coherent event sequences; they make it possible to turn fragments into complete entities. According to the study, these brain regions work more efficiently when they can deal with one task at a time.

Inadequacy and overloading

Jääskeläinen recommends completing one task each day rather than working on a dozen different tasks simultaneously.

The impact of a short segment of an Indiana Jones movie on diverse areas of the brain.
Image: Juha Lahnakoski

‘It’s easy to fall into the trap of multitasking. In that case, it seems like there is little real progress and this leads to a feeling of inadequacy. Concentration decreases, which causes stress. Prolonged stress hinders thinking and memory,’ says Jääskeläinen.

The neuroscientist also sees social media as a challenge.

‘Social media is really nothing but multitasking, with several parallel plots and issues. You might end up reading the news or playing a game recommended by a friend. From the brain’s perspective, social media only increases the load.’

In addition to Jääskeläinen, Juha Lahnakoski of the Max Planck Institute, Mikko Sams of Aalto University and Lauri Nummenmaa of the University of Turku participated in the research.

Contacts and sources:
Aalto University

Study Correlates Climate Change and Early Human Activities at the Algerian Site of El Kherba 1.7 Million Years Ago

Mohamed Sahnouni from CENIEH leads a study that reconstructs the ecology of this lower Pleistocene site and its relationship to the behavior of the hominids who inhabited this Northern Africa region

Mohamed Sahnouni, coordinator of the Prehistoric Technology Program at the Centro Nacional de Investigación sobre la Evolución Humana (CENIEH), leads a study, published online in the journal L’Anthropologie, that uses fossil fauna and stable carbon isotopes to reconstruct the paleoenvironments of the newly discovered site of El Kherba (Algeria), dated to 1.7 million years ago, in relation to hominid behavioral activities.

El Kherba
Credit: J. Mestre

The results of this paleoecological study indicate an increasingly open landscape, supported by pedogenic carbonate data showing a climate change consistent with the documented Plio-Pleistocene continental trend of increasing aridification and grassland expansion.

The climate change likely impacted hominid foraging activities, particularly in archaeological level A. Level A witnessed a drastic decrease in hominid activity, with a considerably lower density of stone tools and fossil bones than the underlying level B, which is characterized by a closed habitat and abundant archaeological materials.

“The open habitat in level A would have caused major constraints for early hominids, such as limited access to food and water as a result of their dispersion and scarcity on the landscape, as well as riskier meat acquisition due to competition with carnivores,” explains Mohamed Sahnouni.

Collaborators from the Centre National de Recherches Préhistoriques, Anthropologiques et Historiques (CNRPAH, Algeria), the Museo Nacional de Ciencias Naturales (Madrid), Indiana University Bloomington and Chevron Energy (USA) also participated in the study, entitled “Evidence of climate change in the Lower Pleistocene site of El Kherba (Algeria) and its possible impact on hominid activities at 1.7 Ma.”

Contacts and sources:
Centro Nacional de Investigación sobre la Evolución Humana (CENIEH)

Citation:  Sahnouni, M., Everet, M., Van der Made, J., & Harichane, Z. (2017). Mise en évidence d’un changement climatique dans le site pléistocène inférieur d’El Kherba (Algérie), et son possible impact sur les activités des hominidés, il y a 1,7 Ma. L'Anthropologie (0). http://dx.doi.org/10.1016/j.anthro.2017.03.015.

Century-Old Mystery of Blood Falls Solved

A research team led by the University of Alaska Fairbanks and Colorado College has solved a century-old mystery involving a famous red waterfall in Antarctica. New evidence links Blood Falls to a large source of salty water that may have been trapped under Taylor Glacier for more than 1 million years.

The team's study, published in the Journal of Glaciology, describes the brine's 300-foot path from beneath Taylor Glacier to the waterfall. This path has been a mystery since geoscientist Griffith Taylor discovered Blood Falls in 1911.

Lead author Jessica Badgeley, then an undergraduate student at Colorado College, worked with University of Alaska Fairbanks glaciologist Erin Pettit and her research team to understand this unique feature. They used a type of radar to detect the brine feeding Blood Falls.

Blood Falls is a famous iron-rich outflow of water that scientists suspected was connected to a water source that may have been trapped under an Antarctic glacier for more than a million years.

Photo by Erin Pettit

"The salts in the brine made this discovery possible by amplifying contrast with the fresh glacier ice," Badgeley said.

Blood Falls is famous for its sporadic releases of iron-rich salty water. The brine turns red when the iron contacts air.

The team tracked the brine with radio-echo sounding, a radar method that uses two antennas -- one to transmit electromagnetic pulses and one to receive the signals.

Erin Pettit, left, and Cece Mortenson collect radar data on Taylor Glacier in front of Blood Falls.
Photo by Erin Pettit

"We moved the antennae around the glacier in grid-like patterns so that we could 'see' what was underneath us inside the ice, kind of like a bat uses echolocation to 'see' things around it," said co-author Christina Carr, a doctoral student at UAF.

Pettit said the researchers made another significant discovery - that liquid water can persist inside an extremely cold glacier. Scientists previously thought this was nearly impossible, but Pettit said the freezing process explains how water can flow in a cold glacier.

"While it sounds counterintuitive, water releases heat as it freezes, and that heat warms the surrounding colder ice," she said. The heat and the lower freezing temperature of salty water make liquid movement possible. "Taylor Glacier is now the coldest known glacier to have persistently flowing water."
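The scale of this latent-heat effect can be illustrated with a rough back-of-envelope calculation. The figures below are standard textbook values for pure water and ice, not numbers from the study:

```python
# Back-of-envelope: heat released by freezing water vs. heat needed to warm ice.
# Standard textbook values (illustrative, not from the study):
L_FUSION = 334.0  # kJ/kg, latent heat of fusion of water
C_ICE = 2.1       # kJ/(kg*K), specific heat capacity of ice

def ice_warmed_per_kg_frozen(delta_t_kelvin: float) -> float:
    """Mass of ice (kg) that the heat from freezing 1 kg of water
    could warm by delta_t_kelvin."""
    return L_FUSION / (C_ICE * delta_t_kelvin)

# Freezing 1 kg of water releases enough heat to warm roughly 16 kg
# of surrounding ice by 10 K.
print(round(ice_warmed_per_kg_frozen(10.0), 1))  # prints 15.9
```

This simple energy balance is why a slowly freezing brine pocket can keep warming the ice around it, sustaining liquid flow inside an otherwise cold glacier.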

Pettit said she enlisted Badgeley as an undergraduate student to help with the overall mission of understanding the hydrological plumbing of cold-based glaciers.

"Jessica's work is a perfect example of the high level of work undergraduate students can do when you give them a challenge and set the expectations high," she said.

The National Science Foundation sponsored the research.

Contacts and sources:
Meghan Murphy
University of Alaska Fairbanks

Citation: "An Englacial Hydrologic System of Brine Within a Cold Glacier: Blood Falls, McMurdo Dry Valleys, Antarctica," Journal of Glaciology, April 24, 2017, http://bit.ly/2peicdz

'Whispering' Humpbacks Keep Calves Safe from Killer Whales, Study Finds

Newborn humpback whales 'whisper' to their mothers to avoid being overheard by killer whales, researchers have discovered. The recordings - the first obtained from tags directly attached to the whales - are published today in Functional Ecology.

Ecologists from Denmark and Australia used temporary tags on humpback mothers and their calves in Exmouth Gulf off Western Australia to learn more about the first months of a humpback's life.

According to lead author Simone Videsen of Aarhus University: "We know next to nothing about the early life stages of whales in the wild, but they are crucial for the calves' survival during the long migration to their feeding grounds."

This image shows a mother-calf pair in Exmouth Gulf.
Credit: Fredrik Christiansen

"This migration is very demanding for young calves. They travel 5,000 miles across open water in rough seas and with strong winds. Knowing more about their suckling will help us understand what could disrupt this critical behaviour, so we can target conservation efforts more effectively."

Humpbacks spend their summer in the food-rich waters of the Antarctic or Arctic, and in the winter migrate to the tropics to breed and mate. While in tropical waters such as Exmouth Gulf, calves must gain as much weight as possible to embark on their first, epic migration.

Together with colleagues from Murdoch University, Videsen tagged eight calves and two mothers. To capture the faint sounds of the calves, they used special tags developed by the University of St Andrews.

The tags attach to whales via suction cups and record sounds made and heard by whales, along with their movements, for up to 48 hours before detaching to float at the surface.

The study found that mothers and calves spend significant amounts of time nursing and resting. The recordings also revealed that newborn humpbacks communicate with their mothers using intimate grunts and squeaks - a far cry from the loud, haunting song of the male humpback whale.

The data tags showed that these quiet calls usually occurred while whales were swimming, suggesting they help mother and calf keep together in the murky waters of Exmouth Gulf. "We also heard a lot of rubbing sounds, like two balloons being rubbed together, which we think was the calf nudging its mother when it wants to nurse," says Videsen.

Such quiet communication helps reduce the risk of being overheard by killer whales nearby, she believes: "Killer whales hunt young humpback calves outside Exmouth Gulf, so by calling softly to its mother the calf is less likely to be heard by killer whales, and avoid attracting male humpbacks who want to mate with the nursing females."

The findings will help conserve this important humpback habitat and - crucially - ensure these nursery waters are kept as quiet as possible.

"From our research, we have learned that mother-calf pairs are likely to be sensitive to increases in ship noise. Because mother and calf communicate in whispers, shipping noise could easily mask these quiet calls."

There are two major humpback whale populations, one in the northern hemisphere and the other in the south. Both breed in the tropics during the winter and then migrate to the Arctic or Antarctic during the summer to feed.

Humpback whales are slow to reproduce. Pregnancy lasts around one year, and calves - which are about 5 metres long at birth - stay with their mothers until they are one year old. During their first weeks of life, calves can grow by up to one metre per month.

Contacts and sources:
Dr. Simone Videsen
Aarhus University

Citation: Simone K. A. Videsen, Lars Bejder, Mark Johnson and Peter T. Madsen (2017). 'High suckling rates and acoustic crypsis of humpback whale neonates maximise potential for mother-calf energy transfer', Functional Ecology, 26 April 2017. DOI: 10.1111/1365-2435.12871.

How Walking Benefits The Brain

You probably know that walking does your body good, but it's not just your heart and muscles that benefit. Researchers at New Mexico Highlands University (NMHU) found that the foot's impact during walking sends pressure waves through the arteries that significantly modify and can increase the supply of blood to the brain. The research was presented at the APS annual meeting at Experimental Biology 2017 in Chicago.

Until recently, the blood supply to the brain (cerebral blood flow or CBF) was thought to be involuntarily regulated by the body and relatively unaffected by changes in the blood pressure caused by exercise or exertion. The NMHU research team and others previously found that the foot's impact during running (4-5 G-forces) caused significant impact-related retrograde (backward-flowing) waves through the arteries that sync with the heart rate and stride rate to dynamically regulate blood circulation to the brain.

In the current study, the research team used non-invasive ultrasound to measure internal carotid artery blood velocity waves and arterial diameters to calculate hemispheric CBF to both sides of the brain of 12 healthy young adults during standing upright rest and steady walking (1 meter/second). The researchers found that though there is lighter foot impact associated with walking compared with running, walking still produces larger pressure waves in the body that significantly increase blood flow to the brain. While the effects of walking on CBF were less dramatic than those caused by running, they were greater than the effects seen during cycling, which involves no foot impact at all.
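The release does not spell out the calculation, but volumetric flow through an artery is conventionally estimated from the ultrasound-measured mean velocity and vessel diameter as Q = v × π(d/2)². A minimal sketch, using illustrative values typical of an internal carotid artery rather than the study's own data:

```python
import math

def volumetric_flow_ml_per_min(velocity_cm_s: float, diameter_cm: float) -> float:
    """Estimate volumetric blood flow (mL/min) from mean velocity (cm/s)
    and internal vessel diameter (cm): Q = v * pi * (d/2)^2, scaled to
    minutes (1 cm^3 = 1 mL)."""
    area_cm2 = math.pi * (diameter_cm / 2.0) ** 2
    return velocity_cm_s * area_cm2 * 60.0

# Illustrative inputs: mean velocity ~30 cm/s, diameter ~0.5 cm
q = volumetric_flow_ml_per_min(30.0, 0.5)
print(round(q))  # prints 353, i.e. a few hundred mL/min per carotid
```

Because flow scales with the square of the diameter, small measured changes in arterial diameter during walking translate into proportionally larger changes in the calculated cerebral blood flow.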

"New data now strongly suggest that brain blood flow is very dynamic and depends directly on cyclic aortic pressures that interact with retrograde pressure pulses from foot impacts," the researchers wrote. "There is a continuum of hemodynamic effects on human brain blood flow within pedaling, walking and running. Speculatively, these activities may optimize brain perfusion, function, and overall sense of wellbeing during exercise."

"What is surprising is that it took so long for us to finally measure these obvious hydraulic effects on cerebral blood flow," first author Ernest Greene explained. "There is an optimizing rhythm between brain blood flow and ambulating. Stride rates and their foot impacts are within the range of our normal heart rates (about 120/minute) when we are briskly moving along."

Ernest R. Greene, PhD, a researcher at New Mexico Highlands University, presented "Acute Effects of Walking on Human Internal Carotid Blood Flow" in a poster session on Monday, April 24, at the McCormick Place Convention Center.

Contacts and sources:
Stacy Brooks
Experimental Biology 2017

Researchers Discover Lull in Mars' Giant Impact History: Calm Before The Storm Supports Late Heavy Bombardment Theory

From the earliest days of our solar system's history, collisions between astronomical objects have shaped the planets and changed the course of their evolution. Studying the early bombardment history of Mars, scientists at Southwest Research Institute (SwRI) and the University of Arizona have discovered a 400-million-year lull in large impacts early in Martian history.

Mars bears the scars of five giant impacts, including the ancient giant Borealis basin (top of globe), Hellas (bottom right), and Argyre (bottom left). An SwRI-led team discovered that Mars experienced a 400-million-year lull in impacts between the formation of Borealis and the younger basins.
Image Courtesy of University of Arizona/LPL/Southwest Research Institute

This discovery is published in the latest issue of Nature Geoscience in a paper titled, "A post-accretionary lull in large impacts on early Mars." SwRI's Dr. Bill Bottke, who serves as principal investigator of the Institute for the Science of Exploration Targets (ISET) within NASA's Solar System Exploration Research Virtual Institute (SSERVI), is the lead author of the paper. Dr. Jeff Andrews-Hanna, from the Lunar and Planetary Laboratory at the University of Arizona, is the paper's coauthor.

"The new results reveal that Mars' impact history closely parallels the bombardment histories we've inferred for the Moon, the asteroid belt, and the planet Mercury," Bottke said. "We refer to the period for the later impacts as the 'Late Heavy Bombardment.' The new results add credence to this somewhat controversial theory. However, the lull itself is an important period in the evolution of Mars and other planets. We like to refer to this lull as the 'doldrums.'"

The early impact bombardment of Mars has been linked to the bombardment history of the inner solar system as a whole. Borealis, the largest and most ancient basin on Mars, is nearly 6,000 miles wide and covers most of the planet's northern hemisphere. New analysis found that the rim of Borealis was excavated by only one later impact crater, known as Isidis. This sets strong statistical limits on the number of large basins that could have formed on Mars after Borealis. 

Moreover, the preservation states of the four youngest large basins -- Hellas, Isidis, Argyre, and the now-buried Utopia -- are strikingly similar to that of the larger, older Borealis basin. The similar preservation states of Borealis and these younger craters indicate that any basins formed in-between should be similarly preserved. No other impact basins pass this test.

"Previous studies estimated the ages of Hellas, Isidis, and Argyre to be 3.8 to 4.1 billion years old," Bottke said. "We argue the age of Borealis can be deduced from impact fragments from Mars that ultimately arrived on Earth. These Martian meteorites reveal Borealis to be nearly 4.5 billion years old -- almost as old as the planet itself."

The new results reveal a surprising bombardment history for the red planet. A giant impact carved out the northern lowlands 4.5 billion years ago, followed by a lull of approximately 400 million years. Then another period of bombardment produced giant impact basins between 4.1 and 3.8 billion years ago. The age of the impact basins requires two separate populations of objects striking Mars. The first wave of impacts was associated with formation of the inner planets, followed by a second wave striking the Martian surface much later.

Contacts and sources:
Deb Schmid 
Southwest Research Institute (SwRI)

Clues to the Exotic Origin for the Universe's Supervoid Cold Spot

A supervoid is unlikely to explain a 'Cold Spot' in the cosmic microwave background, according to the results of a new survey, leaving room for exotic explanations like a collision between universes. 

The cosmic microwave background (CMB), a relic of the Big Bang, covers the whole sky. At a temperature of 2.73 degrees above absolute zero (or -270.43 degrees Celsius), the CMB has some anomalies, including the Cold Spot. This feature, about 0.00015 degrees colder than its surroundings, was previously claimed to be caused by a huge void, billions of light years across, containing relatively few galaxies.

The accelerating expansion of the universe causes voids to leave subtle imprints on light passing through them, via the integrated Sachs-Wolfe effect; in the CMB these appear as slightly colder patches. It was proposed that a very large foreground void could, in part, imprint the CMB Cold Spot, which has been a source of tension for standard cosmological models.

Figure 1. The map of the cosmic microwave background (CMB) sky produced by the Planck satellite. Red represents slightly warmer regions, and blue slightly cooler regions. The Cold Spot is shown in the inset, with coordinates on the x- and y-axes, and the temperature difference in millionths of a degree in the scale at the bottom. 
Credit: ESA and Durham University.  

The researchers, led by postgraduate student Ruari Mackenzie and Professor Tom Shanks in Durham University's Centre for Extragalactic Astronomy, publish their results in Monthly Notices of the Royal Astronomical Society.

Previously, most searches for a supervoid connected with the Cold Spot have estimated distances to galaxies using their colours. With the expansion of the universe more distant galaxies have their light shifted to longer wavelengths, an effect known as a cosmological redshift.

The more distant the galaxy is, the higher its observed redshift. By measuring the colours of galaxies, their redshifts, and thus their distances, can be estimated. These measurements though have a high degree of uncertainty.
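A rough sketch of the idea: at small redshift, distance follows Hubble's law, d ≈ cz/H0 (an approximation; the survey itself relies on a full cosmological model), and the observed wavelength is stretched by a factor of (1 + z). The H0 value below is an assumed round number, not one used by the study:

```python
C_KM_S = 299_792.458   # speed of light, km/s
H0 = 70.0              # Hubble constant, km/s/Mpc (assumed round value)

def distance_mpc(z: float) -> float:
    """Approximate comoving distance for small redshift via Hubble's law:
    d ~ c*z / H0. Valid only for z << 1."""
    return C_KM_S * z / H0

def redshifted_wavelength(rest_nm: float, z: float) -> float:
    """The observed wavelength is stretched by a factor of (1 + z)."""
    return rest_nm * (1 + z)

# A galaxy at z = 0.05 is ~214 Mpc away; the H-alpha line (656.3 nm)
# shifts to ~689 nm, which is how colours encode distance.
print(round(distance_mpc(0.05)))                      # 214
print(round(redshifted_wavelength(656.3, 0.05), 1))   # 689.1
```

Spectroscopy measures this wavelength shift directly, which is why the 7,000 spectroscopic redshifts give far tighter distances than colour-based estimates.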

In their new work, the Durham team presented the results of a comprehensive survey of the redshifts of 7,000 galaxies, harvested 300 at a time using a spectrograph deployed on the Anglo-Australian Telescope. From this higher fidelity dataset, Mackenzie and Shanks see no evidence of a supervoid capable of explaining the Cold Spot within the standard theory.

Figure 2. The 3-D galaxy distribution in the foreground of the CMB Cold Spot, where each point is a cluster of galaxies. The galaxy distribution in the Cold Spot (black points, at right) is compared to the same in an area with no background Cold Spot (red points, at left). The number and size of low galaxy density regions in both areas are similar, making it hard to explain the existence of the CMB Cold Spot by the presence of “voids”. 
Credit: Durham University. 

The researchers instead found that the Cold Spot region, before now thought to be underpopulated with galaxies, is split into smaller voids, surrounded by clusters of galaxies. This 'soap bubble' structure is much like the rest of the universe, illustrated in Figure 2 by the visual similarity between the galaxy distributions in the Cold Spot area and a control field elsewhere.

Mackenzie commented: "The voids we have detected cannot explain the Cold Spot under standard cosmology. There is the possibility that some non-standard model could be proposed to link the two in the future but our data place powerful constraints on any attempt to do that."

If there really is no supervoid that can explain the Cold Spot, simulations of the standard model of the universe give odds of 1 in 50 that the Cold Spot arose by chance.

Shanks added: "This means we can't entirely rule out that the Spot is caused by an unlikely fluctuation explained by the standard model. But if that isn't the answer, then there are more exotic explanations.

"Perhaps the most exciting of these is that the Cold Spot was caused by a collision between our universe and another bubble universe. If further, more detailed, analysis of CMB data proves this to be the case then the Cold Spot might be taken as the first evidence for the multiverse – and billions of other universes may exist like our own."

For the moment, all that can be said is that the lack of a supervoid to explain the Cold Spot has tilted the balance towards these more unusual explanations, ideas that will need to be further tested by more detailed observations of the CMB.

Contacts and sources:
Robert Massey, Royal Astronomical Society
Leighton Kitson. Durham University

Degradable Electronic Components Created from Corn Starch

As consumers upgrade their gadgets at an increasing pace, the amount of electronic waste we generate continues to mount. To help combat this environmental problem, researchers have modified a degradable bioplastic derived from corn starch or other natural sources for use in more eco-friendly electronic components. They report their development in ACS’ journal Industrial & Engineering Chemistry Research.

To address the world’s growing e-waste problem, researchers have created a degradable material for electronic components out of corn starch.
Credit: cheyennezj/Shutterstock.com

In 2014, consumers around the world discarded about 42 million metric tons of e-waste, according to a report by the United Nations University. This poses an environmental and human threat because electronic products are made up of many components, some of which are toxic or non-degradable. To help address the issue, Xinlong Wang and colleagues sought to develop a degradable material that could be used for electronic substrates or insulators.

The researchers started with polylactic acid, or PLA, which is a bioplastic that can be derived from corn starch or other natural sources and is already used in the packaging, electronics and automotive industries. PLA by itself, however, is brittle and flammable, and doesn’t have the right electrical properties to be a good electronic substrate or insulator. But the researchers found that blending metal-organic framework nanoparticles with PLA resulted in a transparent film with the mechanical, electrical and flame retardant properties that make the material a promising candidate for use in electronics.

Contacts and sources:
Katie Cottingham
American Chemical Society

What Do Electrolytes Actually Do?

Sports drink commercials love talking about them, but what are electrolytes, why do we need them, and what happens if we don’t have enough? 
Credit: Reactions

Electrolytes are salts that, once in our bodies, help our cells move water around. 

They also enable the nerve impulses that keep our hearts beating, our lungs breathing and our brains learning. But we can also lose them — for example, by sweating. Given all the ins and outs of electrolytes, should you reach for that bright orange sports drink after running around the block?

Credit: Reactions

Contacts and sources:
Katie Cottingham
American Chemical Society

Tuesday, April 25, 2017

New Artificial Photosynthesis Process Cleans Air and Produces Energy at the Same Time

A chemistry professor in Florida has just found a way to trigger the process of photosynthesis in a synthetic material, turning greenhouse gases into clean air and producing energy all at the same time.

The process has great potential for creating a technology that could significantly reduce greenhouse gases linked to climate change, while also creating a clean way to produce energy.

"This work is a breakthrough," said UCF Assistant Professor Fernando Uribe-Romo. "Tailoring materials that will absorb a specific color of light is very difficult from the scientific point of view, but from the societal point of view we are contributing to the development of a technology that can help reduce greenhouse gases."

Professor Fernando Uribe-Romo and his team of students created a way to trigger a chemical reaction in a synthetic material called metal-organic frameworks (MOF) that breaks down carbon dioxide into harmless organic materials. Think of it as an artificial photosynthesis process similar to the way plants convert carbon dioxide (CO2) and sunlight into food. But instead of producing food, Uribe-Romo's method produces solar fuel.

Credit: UCF: Bernard Wilchusky

The findings of his research are published in the Journal of Materials Chemistry A.


Credit: University of Central Florida

It's something scientists around the world have been pursuing for years, but the challenge is finding a way for visible light to trigger the chemical transformation. Ultraviolet rays have enough energy to allow the reaction in common materials such as titanium dioxide, but ultraviolet light makes up only about 4 percent of the light Earth receives from the sun. The visible range -- the violet to red wavelengths -- represents the majority of the sun's rays, but few materials can absorb these colors of light to drive the chemical reaction that transforms CO2 into fuel.
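The energy argument can be checked with the rule-of-thumb conversion E (eV) ≈ 1240 / λ (nm): only photons at or above a material's bandgap can drive the reaction. A sketch, using an approximate 3.2 eV bandgap for titanium dioxide (an assumed textbook value, not from the paper):

```python
def photon_energy_ev(wavelength_nm: float) -> float:
    """Photon energy from E = hc / lambda, using hc ~ 1239.84 eV*nm."""
    return 1239.84 / wavelength_nm

TIO2_BANDGAP_EV = 3.2  # anatase titanium dioxide, approximate

for name, nm in [("UV", 350), ("blue", 450), ("red", 700)]:
    e = photon_energy_ev(nm)
    print(f"{name} ({nm} nm): {e:.2f} eV, "
          f"above TiO2 bandgap: {e >= TIO2_BANDGAP_EV}")
```

Only the UV photon clears titanium dioxide's bandgap, which is why visible-light absorbers like the MOF antennae described here are needed.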

Researchers have tried it with a variety of materials, but the ones that can absorb visible light tend to be rare and expensive materials such as platinum, rhenium and iridium that make the process cost-prohibitive.

Uribe-Romo used titanium, a common nontoxic metal, and added organic molecules that act as light-harvesting antennae to see if that configuration would work. The light harvesting antenna molecules, called N-alkyl-2-aminoterephthalates, can be designed to absorb specific colors of light when incorporated in the MOF. In this case he synchronized it for the color blue.

His team assembled a blue LED photoreactor to test the hypothesis. Measured amounts of carbon dioxide were slowly fed into the photoreactor -- a glowing blue cylinder that looks like a tanning bed -- to see if the reaction would occur. The glowing blue light came from strips of LED lights inside the chamber of the cylinder that mimicked the sun's blue wavelengths.

UCF Assistant Professor Fernando Uribe-Romo has found a way to trigger the process of photosynthesis in a synthetic material, turning greenhouse gases into clean air and producing energy all at the same time.
Credit: UCF

It worked: the chemical reaction transformed the CO2 into two reduced forms of carbon, formate and formamides (two kinds of solar fuel), cleaning the air in the process.

"The goal is to continue to fine-tune the approach so we can create greater amounts of reduced carbon so it is more efficient," Uribe-Romo said.

He wants to see if the other wavelengths of visible light may also trigger the reaction with adjustments to the synthetic material. If it works, the process could be a significant way to help reduce greenhouse gases.

"The idea would be to set up stations that capture large amounts of CO2, like next to a power plant. The gas would be sucked into the station, go through the process and recycle the greenhouse gases while producing energy that would be put back into the power plant."

Perhaps someday homeowners could purchase rooftop shingles made of the material, which would clean the air in their neighborhood while producing energy that could be used to power their homes.

"That would take new technology and infrastructure to happen," Uribe-Romo said. "But it may be possible."

Other members of the team who worked on the paper include UCF graduate student Matt Logan, who is pursuing a Ph.D in chemistry, and undergraduate student Jeremy Adamson, who is majoring in biomedical sciences. Kenneth Hanson and his research group at Florida State University helped interpret the results of the experiments.

Contacts and sources:
Zenaida Gonzalez Kotala
The University of Central Florida (UCF)

'Diet' Products Can Make You Fat, Study Shows

High-fat foods are often the primary target when fighting obesity, but sugar-laden "diet" foods could be contributing to unwanted weight gain as well, according to a new study from the University of Georgia (UGA).

Researchers found that rats fed a diet high in sugar but low in fat--meant to imitate many popular diet foods--increased body fat mass when compared to rats fed a balanced rodent diet. The high-sugar diet induced a host of other problems, including liver damage and brain inflammation.

"Most so-called diet products containing low or no fat have an increased amount of sugar and are camouflaged under fancy names, giving the impression that they are healthy, but the reality is that those foods may damage the liver and lead to obesity as well," said the study's principal investigator, Krzysztof Czaja, an associate professor of veterinary biosciences and diagnostic imaging in UGA's College of Veterinary Medicine.

Krzysztof Czaja is an associate professor of veterinary biosciences and diagnostic imaging in UGA's College of Veterinary Medicine.

Credit: Peter Frey/UGA

"What's really troubling in our findings is that the rats consuming high-sugar, low-fat diets didn't consume significantly more calories than the rats fed a balanced diet," Czaja said. "Our research shows that in rats fed a low-fat, high-sugar diet, the efficiency of generating body fat is more than twice as high--in other words, rats consuming low-fat high-sugar diets need less than half the number of calories to generate the same amount of body fat."
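To make the "twice the efficiency" claim concrete, here is a toy calculation; the fat-gain and calorie numbers are hypothetical, chosen only to reproduce the reported greater-than-2x ratio:

```python
def fat_efficiency(fat_gain_g: float, calories_kcal: float) -> float:
    """Grams of body fat gained per kcal consumed (illustrative metric)."""
    return fat_gain_g / calories_kcal

# Hypothetical numbers (the study reports a >2x ratio, not these values):
balanced = fat_efficiency(20.0, 2000.0)    # 0.010 g of fat per kcal
high_sugar = fat_efficiency(42.0, 2000.0)  # 0.021 g of fat per kcal

print(round(high_sugar / balanced, 1))  # ratio > 2, matching the finding
```

At equal caloric intake, the same ratio can be read the other way: the high-sugar group would need less than half the calories to lay down the same fat.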

Over a four-week period, researchers monitored body weight, caloric intake, body composition and fecal samples in three groups of rats. One group of test subjects consumed a diet high in fat and sugar, another group was fed a low-fat, high-sugar diet and a third group was given a balanced or "normal" diet.

Both the low-fat, high-sugar and high-fat, high-sugar groups displayed an increase in liver fat and significant increases in body weight and body fat when compared to the balanced diet group. Liver fat accumulation was significant in the high-sugar, low-fat group, which Czaja said "is a very dangerous situation, because the liver accumulating more fat mimics the effect of non-alcoholic fatty liver disease."

Non-alcoholic fatty liver disease is caused by fat buildup in the liver, and serious forms of the disease can result in liver damage comparable to that caused by heavy alcohol use.

The unbalanced diets also induced chronic inflammation in the intestinal tract and brain. Former studies in rats conducted by Czaja have shown that brain inflammation alters gut-brain communication by damaging the vagus nerve, which controls sensory signals, including the brain's ability to determine when one is full.

"The brain changes resulting from these unbalanced diets seem to be long term, and it is still not known if they are reversible by balanced diets," Czaja said.

This study expands upon the researchers' previous work that determined high-fat diets alter the gut microbiome, the collection of bacteria, viruses and other microbes that live in the digestive tract. The recent study found that the unbalanced diets decreased the microbiome's bacterial diversity, and the low-fat, high-sugar diet increased gut bacteria that are associated with liver damage.

Contacts and sources:
Krzysztof Czaja
University of Georgia

The study was published online in the journal Physiology and Behavior and is available at http://www.sciencedirect.com/science/article/pii/S0031938416309489. Co-author Claire de La Serre, an assistant professor of foods and nutrition in UGA's College of Family and Consumer Sciences, conducted metabolic analysis for the study.

NASA's Fermi Catches Gamma-ray Flashes from Tropical Storms

About a thousand times a day, thunderstorms fire off fleeting bursts of some of the highest-energy light naturally found on Earth. These events, called terrestrial gamma-ray flashes (TGFs), last less than a millisecond and produce gamma rays with tens of millions of times the energy of visible light. Since its launch in 2008, NASA's Fermi Gamma-ray Space Telescope has recorded more than 4,000 TGFs, which scientists are studying to better understand how the phenomenon relates to lightning activity, storm strength and the life cycle of storms.
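The "tens of millions of times" figure is easy to sanity-check: a visible photon carries a few electron-volts, while TGF gamma rays reach tens of MeV. A sketch, assuming a 40 MeV gamma ray (a representative value, not a number from this study):

```python
def photon_energy_ev(wavelength_nm: float) -> float:
    """Photon energy from E = hc / lambda, using hc ~ 1239.84 eV*nm."""
    return 1239.84 / wavelength_nm

visible_ev = photon_energy_ev(550)  # green light, ~2.25 eV
tgf_gamma_ev = 40e6                 # tens of MeV, assumed typical TGF photon

ratio_millions = tgf_gamma_ev / visible_ev / 1e6
print(round(ratio_millions, 1))  # ~17.7 million times the energy
```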

Now, for the first time, a team of NASA scientists has analyzed dozens of TGFs launched by the largest and strongest weather systems on the planet: tropical storms, hurricanes and typhoons. A paper describing the research was published March 16 in the Journal of Geophysical Research: Atmospheres.

During a period of rapid strengthening on Aug. 23, 2012, Typhoon Bolaven launched its only TGF from an outer rain band located nearly 490 miles (785 km) from the storm's center (roughly bottom center of the full image). The Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA's Terra satellite captured this natural-color image of Bolaven the following day.
Credit: NASA Goddard Space Flight Center/Jeff Schmaltz, LANCE MODIS Rapid Response Team

"One result is a confirmation that storm intensity alone is not the key factor for producing TGFs," said Oliver Roberts, who led the study at University College Dublin, Ireland, and is now at NASA's Marshall Space Flight Center in Huntsville, Alabama. "We found a few TGFs were made in the outer rain bands of major storms, hundreds of kilometers from the powerful eye walls at their centers, and one weak system that fired off several TGFs in a day."


Credit: NASA Goddard

Scientists suspect TGFs arise from the strong electric fields near the tops of thunderstorms. Under certain conditions, these fields become strong enough to drive an "avalanche" of electrons upward at nearly the speed of light. When these accelerated electrons race past air molecules, their paths become deflected slightly. This change causes the electrons to emit gamma rays.

Fermi's Gamma-ray Burst Monitor (GBM) detects TGFs occurring within about 500 miles (800 kilometers) of the location directly beneath the spacecraft. In 2012, GBM scientists employed new techniques that effectively upgraded the instrument, increasing its sensitivity and leading to a higher rate of TGF detections.

This enhanced discovery rate helped the GBM team show that most TGFs also generate a strong pulse of very low frequency radio waves, signals previously attributed only to lightning. Facilities like the Total Lightning Network operated by Earth Networks in Germantown, Maryland, and the World Wide Lightning Location Network, a research collaboration run by the University of Washington in Seattle, can pinpoint lightning- and TGF-produced radio pulses to within 6 miles (10 km) anywhere on the globe.

"Combining TGF data from GBM with precise positions from these lightning detection networks has opened up our ability to connect the outbursts to individual storms and their components," said co-author Michael Briggs, assistant director of the Center for Space Plasma and Aeronomic Research at the University of Alabama in Huntsville (UAH).

The team studied 37 TGFs associated with, among other storms, typhoons Nangka (2015) and Bolaven (2012), Hurricane Paula (2010), the 2013 tropical storms Sonia and Emang and Hurricane Manuel, and the disturbance that would later become Hurricane Julio in 2014.

"In our study, Julio holds the record for TGFs, firing off four within 100 minutes on Aug. 3, 2014, another the day after, and then no more for the life of the storm," Roberts said. "Most of this activity occurred as Julio underwent rapid intensification into a tropical depression, but long before it had even become a named storm."

What the scientists have learned so far is that TGFs from tropical systems do not have properties measurably different from other TGFs detected by Fermi. Weaker storms are capable of producing greater numbers of TGFs, which may arise anywhere in the storm. In more developed systems, like hurricanes and typhoons, TGFs are more common in the outermost rain bands, areas that also host the highest lightning rates in these storms.

Most of the tropical storm TGFs occurred as the systems intensified. Strengthening updrafts drive clouds higher into the atmosphere where they can generate powerful electric fields, setting the stage for intense lightning and for the electron avalanches thought to produce TGFs.

TGFs were discovered in 1992 by NASA's Compton Gamma-Ray Observatory, which operated until 2000.

The Fermi Gamma-ray Space Telescope is an astrophysics and particle physics partnership managed by NASA's Goddard Space Flight Center in Greenbelt, Maryland. Fermi was developed in collaboration with the U.S. Department of Energy, with important contributions from academic institutions and partners in France, Germany, Italy, Japan, Sweden and the United States.

The GBM Instrument Operations Center is located at the National Space Science Technology Center in Huntsville. The GBM team includes a collaboration of scientists from UAH, NASA's Marshall Space Flight Center, the Max Planck Institute for Extraterrestrial Physics in Germany, University College Dublin in Ireland and other institutions.

Contacts and sources: 
Francis Reddy
NASA’s Goddard Space Flight Center

Solar Cell Design with Over 50% Energy-Conversion Efficiency

Solar cells convert the sun’s energy into electricity by converting photons into electrons. A new solar cell design could raise the energy conversion efficiency to over 50% by absorbing the spectral components of longer wavelengths that are usually lost during transmission through the cell. These findings were published on April 6 in the online edition of Nature Communications.

Theoretical prediction of conversion efficiency. The efficiency changes in response to the use of two different bandgaps in a hetero-interface. The highest conversion efficiency is 63%.
Credit: Kobe University

This research was carried out by a team led by Professor KITA Takashi and Project Assistant Professor ASAHI Shigeo at the Kobe University Graduate School of Engineering.

In theory, 30% energy-conversion efficiency is the upper limit for traditional single-junction solar cells, as most of the solar energy that strikes the cell passes through without being absorbed, or is converted to heat instead. Experiments have been taking place around the world to create solar cell designs that lift these limits on conversion efficiency and reduce the loss of energy. The current world record is 46%, for a four-junction solar cell. If the energy-conversion efficiency of solar cells surpasses 50%, it would have a big impact on the cost of producing electricity.

In order to reduce these large energy losses and raise efficiency, Professor Kita's research team used two low-energy photons from the light transmitted through a single-junction solar cell containing a hetero-interface formed from semiconductors with different bandgaps. Using these photons, they developed a new solar cell structure for generating photocurrent.
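The idea behind two-step photon up-conversion can be illustrated with a toy energy budget: two photons that individually fall below the absorber's bandgap (and would normally be transmitted and lost) can, in combination, supply enough energy to promote an electron. The bandgap and wavelength values below are illustrative, not taken from the paper:

```python
def photon_energy_ev(wavelength_nm: float) -> float:
    """Photon energy from E = hc / lambda, using hc ~ 1239.84 eV*nm."""
    return 1239.84 / wavelength_nm

BANDGAP_EV = 1.42  # e.g. a GaAs-like absorber (assumed, for illustration)

e1 = photon_energy_ev(1300)  # ~0.95 eV, below the bandgap
e2 = photon_energy_ev(1800)  # ~0.69 eV, below the bandgap

print(e1 < BANDGAP_EV and e2 < BANDGAP_EV)  # True: each alone passes through
print(e1 + e2 >= BANDGAP_EV)                # True: together they clear the gap
```

This is the sense in which light "usually lost during transmission" can be recovered: the hetero-interface provides the intermediate step that lets the two absorptions add up.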

A solar cell structure using a hetero-interface and the up-conversion of the two-photon system (yellow and red arrows). The light represented by the red and yellow arrows normally passes through when only a single semiconductor is used, but with this system the light is absorbed, greatly increasing the flow of electricity.

Credit: Kobe University

As well as demonstrating theoretical results of up to 63% conversion efficiency, the team experimentally achieved up-conversion based on two photons, a mechanism unique to this solar cell. The reduction in energy loss demonstrated by this experiment is over 100 times more effective than previous methods that used intermediate bands.

The team will continue to design solar cells, and assess their performance based on conversion efficiency, working towards a highly efficient solar cell for low-cost energy production.

Contacts and sources:
Kobe University

Citation: “Two-step photon up-conversion solar cells”
Authors: Shigeo Asahi, Haruyuki Teranishi, Kazuki Kusaki, Toshiyuki Kaizu and Takashi Kita, Department of Electrical and Electronic Engineering, Graduate School of Engineering, Kobe University
Journal: Nature Communications 8, Article number: 14962 (2017)

Self Charging Batteries May Make Chargers Obsolete

New technology developed by Hydro-Québec and McGill University could one day make battery chargers obsolete.

Who hasn’t lived through the frustrating experience of being without a phone after forgetting to recharge it? This could one day be a thing of the past thanks to technology being developed by Hydro-Québec and McGill University.

Lithium-ion batteries have allowed the rapid proliferation of all kinds of mobile devices such as phones, tablets and computers. These tools however require frequent re-charging because of the limited energy density of their batteries.

Credit: McGill University

“With smart phones now, you can basically carry your whole office in that device, they are loaded with all sorts of applications so you need a lot of power to use it everyday and sometimes, you don’t have access to a plug to recharge,” explains Professor George P. Demopoulos, chair of Mining and Materials Engineering at McGill University.

This has led to the development of portable solar chargers but these hybrid devices are difficult to miniaturize due to their complex circuitry and packaging issues.

To solve this problem, scientists at McGill University and the Hydro-Québec’s research institute are working on a single device capable of harvesting and storing energy using light. In other words, a self-charging battery.

A first milestone

A novel concept presented in a Nature Communications paper by Professor Demopoulos and researchers at Hydro-Québec paves the way to these so-called light-charged batteries.

The study shows that a standard cathode from a lithium-ion battery can be “sensitized” to light by incorporating photo-harvesting dye molecules. “In other words,” says Dr. Andrea Paolella, the study’s lead author and researcher at Hydro-Québec, “our research team was able to simulate a charging process using light as a source of energy.”

Scientists will now have to build an anode, the storage component, which will close the device’s circuit, allowing energy produced by the cathode described in Nature Communications to be transferred and stored. If they succeed, they will have built the world’s first 100% self-charging lithium-ion battery.

Potential for mobile devices

The research team is already working on phase two of this project, thanks to a $564,000 grant from the Natural Sciences and Engineering Research Council of Canada.

“We have done half of the job,” says Professor Demopoulos, co-senior author of the paper with Hydro-Québec’s Dr. Karim Zaghib, a world leading expert on batteries. “We know that we can design the electrode that absorbs light. “This grant will give us the opportunity to bridge the gap and demonstrate that this new concept of a light-chargeable battery is possible.”

“I’m an optimist and I think we can get a fully working device,” says Paolella, who is also a former post-doctoral student from McGill. “Theoretically speaking, our goal is to develop a new hybrid solar-battery system, but depending on the power it can generate when we miniaturize it, we can imagine applications for portable devices such as phones”.

“Hydro-Québec has a strong global position with regard to the development of innovative, high-performance and safe battery materials,” says Karim Zaghib Director – Energy Storage and Conservation at IREQ, Hydro-Québec’s research institute.

While it may take a few years to complete the second phase of the project, Professor Demopoulos believes this “passive form of charging” could play an important role in portable devices of the future…

Funding for the research was provided by Hydro-Québec. The Natural Sciences and Engineering Research Council of Canada provides funding for the next phase of the project via Grant STPG# 493929.

Contacts and sources:
Justin Dupuis
McGill University

Chili Peppers and Marijuana Calm the Gut

You wouldn't think chili peppers and marijuana have much in common. But when eaten, both interact with the same receptor in our stomachs, according to a paper by UConn researchers published in the April 24 issue of the journal Proceedings of the National Academy of Sciences. The research could lead to new therapies for diabetes and colitis, and opens up intriguing questions about the relationship between the immune system, the gut and the brain.

Touch a chili pepper to your mouth and you feel heat. And biochemically, you aren't wrong. The capsaicin chemical in the pepper binds to a receptor that triggers a nerve that fires off to your brain: hot! Those same receptors are found throughout the gastrointestinal tract, for reasons that have been mysterious.

Credit: Public Domain Pictures

Curious, UConn researchers fed capsaicin to mice, and found the mice fed with the spice had less inflammation in their guts. The researchers actually cured mice with Type 1 diabetes by feeding them chili pepper. When they looked carefully at what was happening at a molecular level, the researchers saw that the capsaicin was binding to a receptor called TRPV1, which is found on specialized cells throughout the gastrointestinal tract. When capsaicin binds to it, TRPV1 causes cells to make anandamide. Anandamide is a compound chemically akin to the cannabinoids in marijuana. It was the anandamide that caused the immune system to calm down. And the researchers found they could get the same gut-calming results by feeding the mice anandamide directly.

The brain also has receptors for anandamide. It's these receptors that react with the cannabinoids in marijuana to get people high. Scientists have long wondered why people even have receptors for cannabinoids in their brains. They don't seem to interact with vital bodily functions that way opiate receptors do, for example.

"This allows you to imagine ways the immune system and the brain might talk to each other. They share a common language," says Pramod Srivastava, Professor of Immunology and Medicine at UConn Health School of Medicine. And one word of that common language is anandamide.

Srivastava and his colleagues don't know how or why anandamide might relay messages between the immune system and the brain. But they have found out the details of how it heals the gut. The molecule reacts with both TRPV1 (to produce more anandamide) and another receptor to call in a type of macrophage, immune cells that subdue inflammation. The macrophage population and activity level increases when anandamide levels increase. 

The effects pervade the entire upper gut, including the esophagus, stomach and pancreas. They are still working with mice to see whether it also affects disorders in the bowels, such as colitis. And there are many other questions yet to be explored: what is the exact molecular pathway? Other receptors also react with anandamide; what do they do? How does ingesting weed affect the gut and the brain?

It's difficult to get federal license to experiment on people with marijuana, but the legalization of pot in certain states means there's a different way to see if regular ingestion of cannabinoids affects gut inflammation in humans.

"I'm hoping to work with the public health authority in Colorado to see if there has been an effect on the severity of colitis among regular users of edible weed," since pot became legal there in 2012, Srivastava says. If the epidemiological data shows a significant change, that would make a testable case that anandamide or other cannabinoids could be used as therapeutic drugs to treat certain disorders of the stomach, pancreas, intestines and colon.

It seems a little ironic that both chili peppers and marijuana could make the gut chill out. But how useful if it's true.

Contacts and sources:
Kim Krieger 
University of Connecticut

Climate Change Clues Revealed by Ice Sheet Collapse

The rapid decline of ancient ice sheets could help scientists predict the impact of modern-day climate and sea-level change, according to research by the universities of Stirling in Scotland and Tromsø in Norway.

Ice sheets are massive land-based reservoirs of frozen water. For the first time, scientists have reconstructed in detail the evolution of the last ice sheet that covered Iceland around 20,000 years ago.

The recently published study shows the greatest changes took place at a time when temperatures in the Northern Hemisphere rose by around 3°C in just 500 years.

This model shows ice sheet gradually reducing in size between 22,000 and 11,000 years ago.

Credit: Henry Patton

The maximum rate of ice loss in Iceland then was on the same scale seen in West Antarctica and Greenland today, providing worrying evidence of how climate change can dramatically alter the world's ice sheets, leading to rapid sea level rise.

Dr Tom Bradwell, from Stirling's Faculty of Natural Sciences, said: "About 22,000 years ago, the climate awoke from the last Ice Age, and entered a prolonged but gradual period of warming. This triggered the melting of the huge ice sheets that once covered North America and Eurasia.

"We used seafloor data to map the full extent of the last Icelandic ice sheet and fed this geological information into our ice sheet model. The new modelling experiments, driven by climate data from Greenland ice cores, replicate ice sheet behaviour over the last 35,000 years, showing when it melted the fastest and how it behaved.

"We found that, at certain times, the Icelandic ice sheet retreated at an exceptionally fast rate - more than double the present-day rate of ice loss from the much larger West Antarctic ice sheet - causing global sea level to rise significantly."

These high-resolution model experiments, published in Earth-Science Reviews, provide an unprecedented view of how the Icelandic ice sheet rapidly reduced in size and volume between 21,000 and 18,000 years ago, mainly through icebergs breaking away from its marine margins. It then collapsed 14,000 years ago, this time abruptly in response to rapid climate warming.

The Icelandic ice sheet reached a maximum size of 562,000 sq. km - an area about the size of France. During its dramatic collapse the ice sheet melted rapidly over much of its surface area, decreasing in size by almost two-thirds, in only 750 years. This large volume of ice melting caused a 46 cm-rise in global sea levels -- or more than 1mm rise every two years for over seven centuries -- and is equivalent to the ice losses currently being experienced in Greenland.

When compared to the length of time it took the Icelandic ice sheet to grow to its full size -- approximately 10,000 years -- this rate of change is all the more remarkable.

These abrupt events, seen in former ice sheets and mirrored today, put present-day rates of ice sheet change in a new perspective. However, until recently, much of the data needed to reconstruct and model their shape, size and flow existed unseen below sea level.

Dr Henry Patton, from UiT The Arctic University of Norway, said: "Satellite data show that the present polar ice sheets can respond on alarmingly short timescales to climate and ocean changes. By using data from the geological record to constrain model reconstructions of rapid ice sheet change thousands of years ago, we can better predict how contemporary ice sheets will probably react in the future and the serious impact they have on sea level rise."

Prof Alun Hubbard, who works at UiT Norway and Aberystwyth University, said: "Just like the Icelandic ice sheet, some 20,000 year ago, the retreat of the Greenland ice sheet is now contributing up to approximately 1.2 mm per year to global sea-level rise. That doesn't sound much but given the time-scales involved, and that Greenland's ice loss has increased from nothing 20 years ago to over roughly 350 cubic per year now, makes it a significant cause for concern -- particularly for those low lying, coastal regions where much of the planet's population lives."

The research, supported by the Research Council of Norway, is part of an ongoing collaboration between the scientists in the Universities of Tromsø, Aberystwyth and Stirling to understand ice sheet evolution, past and present.

Contacts and sources:
Corrie Campbell
University of Stirling

Rocky Super-Earth Discovered in The Habitable Zone Close to Cool Star

One of the most successful techniques presently in use for detecting expolanets is the search for transits. Similar to the way the Moon cuts off the light of the Sun during an eclipse, a transit is produced when a planet orbiting a distant star cuts off a small fraction of its light when it passes between us and the star. There are many projects dedicated to detecting and monitoring small variations in the light from many stars in the hope of discovering an extrasolar planet.

One of these projects is MEarth, which uses a network of 40 cm telescopes to measure the light from hundreds of stars. In Setember 2014 MEarth detected a possible transit in the star named LHS 1140.

An artist's impression of the newly-discovered rocky exoplanet, LHS 1140b. This planet is located in the liquid water habitable zone surrounding its host star, a small, faint red star named LHS 1140. The planet weighs about 6.6 times the mass of Earth and is shown passing in front of LHS 1140. Depicted in blue is the atmosphere the planet may have retained.

Credit: M. Weiss/CfA.

Thanks to a thorough piece of research using data from MEarth-South, at the Interamerican Observatory of Cerro Tololo, Chile, and with the HARPS spectrograph on the 3.6m telescope at the La Silla Observatory of the European Southern Observatory, ESO (also in Chile) a planet was confirmed orbiting around this star with a period of 25 days. This spectrograph was designed especially for the detection and study of extrasolar planets. An almost identical twin instrument is installed at the Telescopio Nazionale Galileo (TNG) at the Roque de los Muchachos Obseratory, in Garafía (La Palma).

The planet in question, given the name LHS 1140b is in orbit around an M-type star. This type of stars, with sizes and luminosities less than those of the Sun,are the most abundant stars in the Galaxy. This planetary system is at only 40 light years from Earth, in other words in the solar neighbourhood. An international team, of which the IAC researcher Felipe Murgas is a member, was able to establish the size and mass of the planet as 1.4 times the radius of the Earth, and 6.6 times the mass of the Earth, respectively. Because of its size, and high mass, it is very probable that the planet has a rocky composition.

"This is the most exciting exoplanet which I have seen in the last ten years" comments Jason Dittmann of the Harvard-Smithsonian Centre for Astrophysics (CfA), the first author of the article in Nature. "It would be difficult to find a better objective for carrying out one of the most important searches in science: for evidence of life beyond the Earth".

Another important aspect of the discovery is that LMS 1140b is orbiting its star in the so-called "zone of habitability", the region around a star in which the temperature makes it possible for water to exist in all of its three phases: solid liquid, and gas. This is, as we know, one of the requirements for the existence of life as we know it on Earth.

"Because it is at a distance from its star which permits relatively cool temperatures, and a mass which is big enough to prevent the evaporation of an atmosphere due to the wind of its star, LHS 1140b has become one of the most promising candidates for the detection and study of its atmosphere, using the next generation of telescopes, such as the James Webb Space Telescope, (JWST)and the European Extrmely Large Telescope (E-ELT)" concludes Felipe Murgas.

Contacts and sources:
Elena Mora
Instituto De Astrofísica De Canarias (IAC)

Citation: “A temperate rocky super-Earth transiting a nearby cool star”. Nature, 20 abril 2017. doi:10.1038/nature22055

Plastic Eating Caterpillar Can Biodegrade Plastic Clogging Landfills

Scientists have found that a caterpillar commercially bred for fishing bait has the ability to biodegrade polyethylene: one of the toughest and most used plastics, frequently found clogging up landfill sites in the form of plastic shopping bags.

The wax worm, the larvae of the common insect Galleria mellonella, or greater wax moth, is a scourge of beehives across Europe. In the wild, the worms live as parasites in bee colonies. Wax moths lay their eggs inside hives where the worms hatch and grow on beeswax - hence the name.

A chance discovery occurred when one of the scientific team, Federica Bertocchini, an amateur beekeeper, was removing the parasitic pests from the honeycombs in her hives. The worms were temporarily kept in a typical plastic shopping bag that became riddled with holes.

These are wax worm specimens in a Petri dish.

Credit: César Hernández/CSIC

Bertocchini, from the Institute of Biomedicine and Biotechnology of Cantabria (CSIC), Spain, collaborated with colleagues Paolo Bombelli and Christopher Howe at the University of Cambridge's Department of Biochemistry to conduct a timed experiment.

Around a hundred wax worms were exposed to a plastic bag from a UK supermarket. Holes started to appear after just 40 minutes, and after 12 hours there was a reduction in plastic mass of 92mg from the bag.

Scientists say that the degradation rate is extremely fast compared to other recent discoveries, such as bacteria reported last year to biodegrade some plastics at a rate of just 0.13mg a day.

"If a single enzyme is responsible for this chemical process, its reproduction on a large scale using biotechnological methods should be achievable," said Cambridge's Paolo Bombelli, first author of the study published today in the journal Current Biology.

"This discovery could be an important tool for helping to get rid of the polyethylene plastic waste accumulated in landfill sites and oceans."

Polyethylene is largely used in packaging, and accounts for 40% of total demand for plastic products across Europe - where up to 38% of plastic is discarded in landfills. People around the world use around a trillion plastic bags every single year.

Generally speaking, plastic is highly resistant to breaking down, and even when it does the smaller pieces choke up ecosystems without degrading. The environmental toll is a heavy one.

Yet nature may provide an answer. The beeswax on which wax worms grow is composed of a highly diverse mixture of lipid compounds: building block molecules of living cells, including fats, oils and some hormones.

While the molecular detail of wax biodegradation requires further investigation, the researchers say it is likely that digesting beeswax and polyethylene involves breaking similar types of chemical bonds.

"Wax is a polymer, a sort of 'natural plastic,' and has a chemical structure not dissimilar to polyethylene," said CSIC's Bertocchini, the study's lead author.

Close-up of wax worm next to biodegraded holes in a polyethylene plastic shopping bag from a UK supermarket as used in the experiment.

Credit: The research team.

The researchers conducted spectroscopic analysis to show the chemical bonds in the plastic were breaking. The analysis showed the worms transformed the polyethylene into ethylene glycol, representing un-bonded 'monomer' molecules.

To confirm it wasn't just the chewing mechanism of the caterpillars degrading the plastic, the team mashed up some of the worms and smeared them on polyethylene bags, with similar results.

"The caterpillars are not just eating the plastic without modifying its chemical make-up. We showed that the polymer chains in polyethylene plastic are actually broken by the wax worms," said Bombelli.

"The caterpillar produces something that breaks the chemical bond, perhaps in its salivary glands or a symbiotic bacteria in its gut. The next steps for us will be to try and identify the molecular processes in this reaction and see if we can isolate the enzyme responsible."

As the molecular details of the process become known, the researchers say it could be used to devise a biotechnological solution on an industrial scale for managing polyethylene waste.

Added Bertocchini: "We are planning to implement this finding into a viable way to get rid of plastic waste, working towards a solution to save our oceans, rivers, and all the environment from the unavoidable consequences of plastic accumulation."

Contacts and sources:
University of Cambridge

Monday, April 24, 2017

How Fear Of Death Affects Human Attitudes Toward Animal Life

When reminded of death, humans become more likely to support killing animals, regardless of their existing attitudes about animal rights, according to new research from the University of Arizona.

The research provides new insight into the psychology behind humans' willingness to kill animals for a variety of reasons, and could also potentially help scientists better understand the psychological motivations behind the murder and genocide of humans, said lead researcher Uri Lifshin, a doctoral student in the UA Department of Psychology.

Lifshin and his colleagues conducted a series of experiments based on their existing work on terror management theory — the idea that humans' awareness of their own mortality is a strong motivator for behaviors that may help quell the fear of death.

"If we ever want to really understand how to reduce or fight human-to-human genocide, we have to understand our killing of animals," says UA researcher Uri Lifshin.
"If we ever want to really understand how to reduce or fight human-to-human genocide, we have to understand our killing of animals," says UA researcher Uri Lifshin.
Credit:  UA

During the experiments, half of participants were presented with a subliminal or subtle "death prime"; either they saw the word "dead" flash briefly on a computer screen or they saw an image of a T-shirt featuring a skull made up of several iterations of the word "death."

The other half of participants — the controls — instead saw the word "pain" or "fail" flash across the screen, or they saw an image of a plain T-shirt.

Study participants were then asked to rate how much they agree with a series of statements about killing animals, such as, "It is often necessary to control for animal overpopulation through different means, such as hunting or euthanasia," or, "An experiment should never cause the killing of animals." The researchers avoided asking questions about some of the more broadly accepted justifications for killing animals, like doing so for food.

In all experiments, those who received the death prime were more likely to support killing animals.

Prior to the start of experiments, participants were asked to report their feelings about animal rights. Surprisingly, it didn't matter if people self-identified as supporters of animal rights. While those individuals were overall less likely than others to support killing animals, the death prime still had the same effect on them.

"If you're an animal lover or if you care about animals rights, then overall, yes, you are going to support the killing of animals much less; however when you're reminded of death you're still going to be a little bit more reactive," Lifshin said. Worth noting, the study did not include overt animal rights activists, who might be affected differently. Additional research is needed for that population, Lifshin said.

Uri Lifshin holds his cat, Chupchik. Lifshin's own love of animals is, in part, what drove him to study humans' psychological reasons for supporting killing them.
 Credit:  UA

Gender also didn't change the effect of the death prime. Consistent with existing literature, male participants were generally more likely than females to support killing animals, but males and females were both affected in the same way by the death prime.

Self-Esteem Helps Us Manage Fear of Death

The UA researchers' paper, "The Evil Animal: A Terror Management Theory Perspective on the Human Tendency to Kill Animals," was published in the Personality and Social Psychology Bulletin. Their findings are based on psychology's terror management theory, which is derived from anthropologist Ernest Becker's 1974 Pulitzer Prize-winning book, "The Denial of Death." The theory posits that humans use self-esteem as a buffer against fear of death.

Lifshin's UA co-authors on the paper were: psychology professor Jeff Greenberg, one of the originators of terror management theory; psychology doctoral student Colin Zestcott; and assistant professor of psychology Daniel Sullivan.

Self-esteem can be achieved in different ways. In a previous study, Lifshin and his colleagues showed that when people who enjoy playing basketball are reminded of their mortality, they improve their performance on the basketball court, and thereby their self-esteem, to manage their fear of death.

In the animal study, researchers think death-primed participants supported killing animals more because it provided them with a sense of power or superiority over animals that indirectly helped them fend off fear of mortality, Lifshin said.

This all happens subconsciously.

"Sometimes, our self-esteem depends on the idea that we are special and not just sacks of meat. We want to feel powerful, immortal — not like an animal," said Lifshin, a proud pet owner whose own love of animals is, in part, what drove him to study why anyone would do them harm.

To further test the terror management connection, Lifshin and his colleagues designed one of their experiments to look at whether giving participants an alternative self-esteem boost would change the effect of the death prime.

It did.

Before each of the experiments conducted by Lifshin and his colleagues, participants were told a cover story to conceal the researchers' actual aim. In the self-esteem boost experiment, participants were told they were taking part in a word relationship study, and were asked to identify whether pairs of word on a computer screen were related. During the course of the experiment, the word "dead" appeared on the screen for 30 milliseconds to some participants.

When the experimenters praised those who had seen the death prime — telling them: "Oh wow, I'm not sure I've seen a score this high on this task, this is really good" — the effect of the death prime was eliminated when participants went on to answer the questions about killing animals. In other words, seeing the death prime did not make participants more supportive of killing animals if they subsequently received a self-esteem boost from a different source.

"We didn't find that people's general state of self-esteem made a difference; it was this self-esteem boost," Lifshin said. "Once your self-esteem is secured, you no longer need to satisfy the need for terror management by killing animals."

Those who saw the death prime and were given neutral feedback from the experimenters ("OK you did good, just as well as most people do on this task") still supported killing animals more. The neutral feedback did not change the effect of the death prime.

Findings Could Contribute to Understanding Psychology of Genocide

When researchers asked participants to rate statements about killing humans under various conditions, the death prime did not have the same effect; those who saw the death prime were not more likely to support killing humans.

Even so, the research could still have important implications for the study of the psychology behind murder and genocide of humans who fall into outgroups because of their race, religion or other characteristics, since those individuals tend to be dehumanized by those who would do them harm, Lifshin said.

"We dehumanize our enemies when there is genocide. There is research in social psychology showing that if you go to places where genocide is happening and you ask the people who are doing the killing to try to explain, they'll often say things like, 'Oh, they're cockroaches, they're rats, we just have to kill them all,'" Lifshin said. "So if we ever want to really understand how to reduce or fight human-to-human genocide, we have to understand our killing of animals."

Contacts and sources:
Alexis Blue
University of Arizona