Unseen Is Free

Thursday, February 23, 2017

Tiny Fibers' Three-in-One Design Allows Genetic, Chemical, Optical, and Electrical Inputs and Outputs for the Brain

For the first time ever, a single flexible fiber no bigger than a human hair has successfully delivered a combination of optical, electrical, and chemical signals back and forth into the brain, putting into practice an idea first proposed two years ago. With some tweaking to further improve its biocompatibility, the new approach could provide a dramatically improved way to learn about the functions and interconnections of different brain regions.

The fibers are designed to mimic the softness and flexibility of brain tissue. This could make it possible to leave implants in place and have them retain their functions over much longer periods than is currently possible with typical stiff, metallic fibers, thus enabling much more extensive data collection. For example, in tests with lab mice, the researchers were able to inject viral vectors that carried genes called opsins, which sensitize neurons to light, through one of two fluid channels in the fiber. They waited for the opsins to take effect, then sent a pulse of light through the optical waveguide in the center, and recorded the resulting neuronal activity, using six electrodes to pinpoint specific reactions. All of this was done through a single flexible fiber just 200 micrometers across — comparable to the width of a human hair.

Graduate student Seongjun Park holds an example of a new flexible fiber, which is no bigger than a human hair and has successfully delivered a combination of optical, electrical, and chemical signals back and forth into the brain.

Photo: Young Gyu Yoon

The new fibers were developed through a collaboration among material scientists, chemists, biologists, and other specialists. The results are reported in the journal Nature Neuroscience, in a paper by Seongjun Park, an MIT graduate student; Polina Anikeeva, the Class of 1942 Career Development Professor in the Department of Materials Science and Engineering; Yoel Fink, a professor in the departments of Materials Science and Engineering, and Electrical Engineering and Computer Science; Gloria Choi, the Samuel A. Goldblith Career Development Professor in the Department of Brain and Cognitive Sciences, and 10 others at MIT and elsewhere.

Previous research efforts in neuroscience have generally relied on separate devices: needles to inject viral vectors for optogenetics, optical fibers for light delivery, and arrays of electrodes for recording, adding a great deal of complication and the need for tricky alignments among the different devices. Getting that alignment right in practice was “somewhat probabilistic,” Anikeeva says. “We said, wouldn’t it be nice if we had a device that could just do it all.”

After years of effort, that’s what the team has now successfully demonstrated. “It can deliver the virus [containing the opsins] straight to the cell, and then stimulate the response and record the activity — and [the fiber] is sufficiently small and biocompatible so it can be kept in for a long time,” Anikeeva says.

Since each fiber is so small, “potentially, we could use many of them to observe different regions of activity,” she says. In their initial tests, the researchers placed probes in two different brain regions at once, varying which regions they used from one experiment to the next, and measuring how long it took for responses to travel between them.

The key ingredient that made this multifunctional fiber possible was the development of conductive “wires” that maintained the needed flexibility while also carrying electrical signals well. After much work, the team was able to engineer a composite of conductive polyethylene doped with graphite flakes. The polyethylene was initially formed into layers, sprinkled with graphite flakes, then compressed; then another pair of layers was added and compressed, and then another, and so on. A member of the team, Benjamin Grena, a recent graduate in materials science and engineering, referred to it as making “mille feuille” (literally “a thousand leaves,” the French name for a Napoleon pastry). That method increased the conductivity of the polymer by a factor of four or five, Park says. “That allowed us to reduce the size of the electrodes by the same amount.”

One immediate question that could be addressed through such fibers is that of exactly how long it takes for the neurons to become light-sensitized after injection of the genetic material. Such determinations could only be made by crude approximations before, but now could be pinpointed more clearly, the team says. The specific sensitizing agent used in their initial tests turned out to produce effects after about 11 days.

The team aims to reduce the width of the fibers further, to make their properties even closer to those of the neural tissue. “The next engineering challenge is to use material that is even softer, to really match” the adjacent tissue, Park says. Already, though, dozens of research teams around the world have been requesting samples of the new fibers to test in their own research.

The research team included members of MIT’s Research Laboratory of Electronics, Department of Electrical Engineering and Computer Science, McGovern Institute for Brain Research, Department of Chemical Engineering, and Department of Mechanical Engineering, as well as researchers at Tohoku University in Japan and Virginia Polytechnic Institute. It was supported by the National Institute of Neurological Disorders and Stroke, the National Science Foundation, the MIT Center for Materials Science and Engineering, the Center for Sensorimotor Neural Engineering, and the McGovern Institute for Brain Research.



Contacts and sources:
David L. Chandler 
Massachusetts Institute of Technology (MIT)

Ultracool Dwarf and the Seven Earth-like Planets


A total of seven Earth-like, potentially habitable worlds have been discovered orbiting a nearby star known as TRAPPIST-1. Just 40 light-years away, the star is classed as an ultracool dwarf because of its diminutive size and dim light output.

ESOcast 96 explores this important discovery, from how the astronomers made the incredibly intricate measurements required to find and study the planets — including observations with ESO’s Very Large Telescope — to each world’s potential to support life as we know it. Excitingly, three of the planets in the system orbit in the habitable zone around TRAPPIST-1, and could harbour oceans of water on their surfaces.



Dwarf stars like TRAPPIST-1 are very common in our galaxy, making rich planetary systems like this some of the best targets in humanity’s search for life elsewhere in the Universe. This ESOcast takes you on a journey through one such system, which contains both the largest number of Earth-sized planets and the largest number of potentially habitable worlds ever discovered.

This infographic displays some artist's illustrations of how the seven planets orbiting TRAPPIST-1 might appear — including the possible presence of water oceans — alongside some images of the rocky planets in our Solar System. Information about the size and orbital periods of all the planets is also provided for comparison; the TRAPPIST-1 planets are all approximately Earth-sized.

Credit: NASA

Astronomers have found a system of seven Earth-sized planets just 40 light-years away. Using ground- and space-based telescopes, including ESO’s Very Large Telescope, they detected all the planets as they passed in front of their parent star, the ultracool dwarf star known as TRAPPIST-1. According to the paper appearing today in the journal Nature, three of the planets lie in the habitable zone and could harbour oceans of water on their surfaces, increasing the possibility that the star system could play host to life. This system has both the largest number of Earth-sized planets yet found and the largest number of worlds that could support liquid water on their surfaces.

Astronomers using the TRAPPIST–South telescope at ESO’s La Silla Observatory, the Very Large Telescope (VLT) at Paranal and the NASA Spitzer Space Telescope, as well as other telescopes around the world [1], have now confirmed the existence of at least seven small planets orbiting the cool red dwarf star TRAPPIST-1 [2]. All the planets, labelled TRAPPIST-1b, c, d, e, f, g and h in order of increasing distance from their parent star, have sizes similar to Earth [3].

Dips in the star’s light output caused by each of the seven planets passing in front of it — events known as transits — allowed the astronomers to infer information about their sizes, compositions and orbits [4]. They found that at least the inner six planets are comparable in both size and temperature to the Earth.
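The size inference from a transit is straightforward in principle: the fractional dip in starlight equals the square of the planet-to-star radius ratio. A minimal sketch, using illustrative numbers rather than the study's measured values:

```python
import math

# Transit-sizing sketch: the fractional dip in starlight during a transit
# equals the projected area ratio, depth ~ (R_planet / R_star)**2, so
# R_planet = R_star * sqrt(depth). The stellar radius and depth below are
# illustrative assumptions, not the study's measured values.

R_SUN_KM = 695_700.0
R_EARTH_KM = 6_371.0

r_star = 0.12 * R_SUN_KM   # an ultracool dwarf roughly 12% the Sun's radius
depth = 0.007              # a 0.7 percent dip in the light curve

r_planet = r_star * math.sqrt(depth)
print(round(r_planet / R_EARTH_KM, 2), "Earth radii")  # 1.1 Earth radii
```

Repeating the calculation for each planet's transit depth is how a survey like this pins down seven sizes from one star's light curve.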

Lead author Michaël Gillon of the STAR Institute at the University of Liège in Belgium is delighted by the findings: “This is an amazing planetary system — not only because we have found so many planets, but because they are all surprisingly similar in size to the Earth!”

With just 8% of the mass of the Sun, TRAPPIST-1 is very small in stellar terms — only marginally bigger than the planet Jupiter — and though nearby in the constellation Aquarius (The Water Carrier), it appears very dim. Astronomers expected that such dwarf stars might host many Earth-sized planets in tight orbits, making them promising targets in the hunt for extraterrestrial life, but TRAPPIST-1 is the first such system to be found.

Co-author Amaury Triaud expands: “The energy output from dwarf stars like TRAPPIST-1 is much weaker than that of our Sun. Planets would need to be in far closer orbits than we see in the Solar System if there is to be surface water. Fortunately, it seems that this kind of compact configuration is just what we see around TRAPPIST-1!”

This diagram compares the orbits of the newly-discovered planets around the faint red star TRAPPIST-1 with the Galilean moons of Jupiter and the inner Solar System. All the planets found around TRAPPIST-1 orbit much closer to their star than Mercury is to the Sun, but as their star is far fainter, they are exposed to similar levels of irradiation as Venus, Earth and Mars in the Solar System.
Credit: ESO/O. Furtak

The team determined that all the planets in the system are similar in size to Earth and Venus in the Solar System, or slightly smaller. The density measurements suggest that at least the innermost six are probably rocky in composition.

The planetary orbits are not much larger than that of Jupiter’s Galilean moon system, and much smaller than the orbit of Mercury in the Solar System. However, TRAPPIST-1’s small size and low temperature mean that the energy input to its planets is similar to that received by the inner planets in our Solar System; TRAPPIST-1c, d and f receive similar amounts of energy to Venus, Earth and Mars, respectively.
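The comparison follows from the inverse-square law: a planet's insolation scales as the stellar luminosity divided by the square of the orbital distance. A minimal sketch with illustrative values for a TRAPPIST-1-like dwarf (the luminosity and orbital distances here are assumptions, not the paper's figures):

```python
# Irradiation sketch: in units where Earth's insolation is 1, a planet's
# stellar flux is S = L / d**2, with L in solar luminosities and d in
# astronomical units. The luminosity and orbital distances below are
# illustrative assumptions for a TRAPPIST-1-like dwarf, not the paper's
# measured values.

L_STAR = 0.0005   # roughly five ten-thousandths of the Sun's output

for name, d_au in [("c", 0.015), ("d", 0.021), ("f", 0.037)]:
    s_earth = L_STAR / d_au ** 2
    print(f"TRAPPIST-1{name}: {s_earth:.2f} x Earth's insolation")
```

With these assumed numbers the three planets come out near the insolations of Venus (about 1.9), Earth (1), and Mars (about 0.4), matching the qualitative comparison above.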

This diagram compares the sizes of the newly-discovered planets around the faint red star TRAPPIST-1 with the Galilean moons of Jupiter and the inner Solar System. All the planets found around TRAPPIST-1 are of similar size to the Earth.
Credit: ESO/O. Furtak

All seven planets discovered in the system could potentially have liquid water on their surfaces, though their orbital distances make some of them more likely candidates than others. Climate models suggest the innermost planets, TRAPPIST-1b, c and d, are probably too hot to support liquid water, except maybe on a small fraction of their surfaces. The orbital distance of the system’s outermost planet, TRAPPIST-1h, is unconfirmed, though it is likely to be too distant and cold to harbour liquid water — assuming no alternative heating processes are occurring [5]. TRAPPIST-1e, f, and g, however, represent the holy grail for planet-hunting astronomers, as they orbit in the star’s habitable zone and could host oceans of surface water [6].

These new discoveries make the TRAPPIST-1 system a very important target for future study. The NASA/ESA Hubble Space Telescope is already being used to search for atmospheres around the planets and team member Emmanuël Jehin is excited about the future possibilities: “With the upcoming generation of telescopes, such as ESO’s European Extremely Large Telescope and the NASA/ESA/CSA James Webb Space Telescope, we will soon be able to search for water and perhaps even evidence of life on these worlds.”



Contacts and sources: 
Richard Hook
ESO

Neural Networks Promise Sharpest Ever Images of Deep Space


Telescopes, the workhorse instruments of astronomy, are limited by the size of the mirror or lens they use. Using 'neural nets', a form of artificial intelligence, a group of Swiss researchers now have a way to push past that limit, offering scientists the prospect of the sharpest ever images in optical astronomy. The new work appears in a paper in Monthly Notices of the Royal Astronomical Society.

The diameter of its lens or mirror, the so-called aperture, fundamentally limits any telescope. In simple terms, the bigger the mirror or lens, the more light it gathers, allowing astronomers to detect fainter objects, and to observe them more clearly. A statistical concept known as 'Nyquist sampling theorem' describes the resolution limit, and hence how much detail can be seen.
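One common way to quantify this resolution limit is the classical diffraction limit, theta = 1.22 * lambda / D radians for a telescope of aperture D observing at wavelength lambda. A short sketch (the aperture and wavelength are illustrative):

```python
import math

# Resolution-limit sketch: a telescope with aperture D observing at
# wavelength lam cannot resolve angles finer than roughly
# theta = 1.22 * lam / D radians (the classical diffraction limit).
# The aperture and wavelength below are illustrative.

def diffraction_limit_arcsec(wavelength_m, aperture_m):
    theta_rad = 1.22 * wavelength_m / aperture_m
    return math.degrees(theta_rad) * 3600.0

# 550 nm visible light through a 2.4 m mirror (a Hubble-class aperture):
print(round(diffraction_limit_arcsec(550e-9, 2.4), 3), "arcsec")  # 0.058 arcsec
```

Doubling the aperture halves this angle, which is why sharper images have traditionally required ever-larger mirrors.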

The Swiss study, led by Prof Kevin Schawinski of ETH Zurich, uses the latest in machine learning technology to challenge this limit. They teach a neural network, a computational approach that simulates the neurons in a brain, what galaxies look like, and then ask it to automatically recover a blurred image and turn it into a sharp one. Just like a human, the neural net needs examples - in this case a blurred and a sharp image of the same galaxy - to learn the technique.

The frames here show an example of an original galaxy image (left), the same image deliberately degraded (second from left), the image after recovery with the neural net (second from right), and the image processed with deconvolution, the best existing technique (right).


Credit: K. Schawinski / C. Zhang / ETH Zurich

Their system uses two neural nets competing with each other, an emerging approach popular with the machine learning research community called a "generative adversarial network", or GAN. The whole teaching programme took just a few hours on a high performance computer.
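In a GAN, a generator learns to produce outputs the discriminator cannot tell from real data, while the discriminator is trained to tell them apart. The toy sketch below captures that adversarial loop on one-dimensional data with hand-derived gradients; it illustrates the GAN idea only, not the image-restoration network used in the study.

```python
import math
import random

# Toy 1-D GAN sketch: a generator g(z) = a*z + b tries to mimic samples
# from N(4, 1), while a logistic discriminator D(x) = sigmoid(w*x + c)
# learns to tell real from generated. Everything here is an illustrative
# assumption; real GANs use neural networks and autodiff.

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

a, b = 1.0, 0.0   # generator parameters
w, c = 0.0, 0.0   # discriminator parameters
lr = 0.05

for _ in range(2000):
    z = random.gauss(0, 1)
    x_real = random.gauss(4, 1)
    x_fake = a * z + b

    # Discriminator ascent on log D(real) + log(1 - D(fake)).
    d_real = sigmoid(w * x_real + c)
    d_fake = sigmoid(w * x_fake + c)
    w += lr * ((1 - d_real) * x_real - d_fake * x_fake)
    c += lr * ((1 - d_real) - d_fake)

    # Generator ascent on log D(fake) (the non-saturating GAN loss).
    d_fake = sigmoid(w * x_fake + c)
    signal = (1 - d_fake) * w        # d/dx_fake of log D(x_fake)
    a += lr * signal * z
    b += lr * signal

samples = [a * random.gauss(0, 1) + b for _ in range(1000)]
gen_mean = sum(samples) / len(samples)
print(round(gen_mean, 1))  # the generator's mean drifts toward the real mean of 4
```

The image-sharpening application swaps the scalar generator for a network that maps blurred images to sharp ones, with the discriminator judging whether a sharp image looks real.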

The trained neural nets were able to recognise and reconstruct features that the telescope could not resolve - such as star-forming regions, bars and dust lanes in galaxies. The scientists checked it against the original high-resolution image to test its performance, finding it better able to recover features than anything used to date, including the 'deconvolution' approach used to improve the images made in the early years of the Hubble Space Telescope.

Schawinski sees this as a big step forward: "We can start by going back to sky surveys made with telescopes over many years, see more detail than ever before, and for example learn more about the structure of galaxies. There is no reason why we can't then apply this technique to the deepest images from Hubble, and the coming James Webb Space Telescope, to learn more about the earliest structures in the Universe."

Professor Ce Zhang, the collaborator from computer science, also sees great potential: "The massive amount of astronomical data is always fascinating to computer scientists. But, when techniques such as machine learning emerge, astrophysics also provides a great test bed for tackling a fundamental computational question - how do we integrate and take advantage of the knowledge that humans have accumulated over thousands of years, using a machine learning system? We hope our collaboration with Kevin can also shed light on this question."

The success of the project points to a more "data-driven" future for astrophysics in which information is learned automatically from data, instead of manually crafted physics models. ETH Zurich is hosting this work on the space.ml cross-disciplinary astrophysics/computer-science initiative, where the code is available to the general public.






Contacts and sources:
Robert Massey
The Royal Astronomical Society

Citation:  "Generative Adversarial Networks recover features in astrophysical images of galaxies beyond the deconvolution limit", Kevin Schawinski, Ce Zhang, Hantian Zhang, Lucas Fowler, and Gokula Krishnan Santhanam, Monthly Notices of the Royal Astronomical Society, in press. After the embargo expires, a copy of the paper will be available at no cost from http://doi.org/10.1093/mnrasl/slx008

A preprint is available at http://www.ras.org.uk/images/stories/press/Computation/Schawinski_et_al.pdf

Caught in the Act: First-Ever Global View of Transshipment in Commercial Fishing Industry



Analysis of satellite data broadcast from ships at sea enables automatic identification and monitoring of transshipments, a practice associated with illegal, unregulated, and unreported fishing

Transshipment, the transfer of goods from one boat to another, is a major pathway for illegally caught and unreported fish to enter the global seafood market. It has also been associated with drug smuggling and slave labor. Illegal in many cases, transshipment has been largely invisible and nearly impossible to manage, because it often occurs far from shore and out of sight. Until now.

Today, with the release of our report, The Global View of Transshipment: Preliminary Findings, we present the first-ever global footprint of transshipment in the fishing industry. The report explains how data scientists from SkyTruth and Global Fishing Watch (a partnership of Oceana, SkyTruth and Google) analyzed Automatic Identification System (AIS) signals from ships at sea to develop a tool that identifies and tracks 90 percent of the world's large refrigerated cargo vessels, ships that collect catch from multiple fishing boats at sea and carry it to port.

In the Indian Ocean, off the remote Saya de Malha bank, the refrigerated cargo vessel (reefer) Leelawadee was seen with two unidentified likely fishing vessels tied alongside. Image Captured by DigitalGlobe on Nov. 30, 2016.

Imagery by DigitalGlobe © 2017

According to the analysis, from 2012 through 2016, refrigerated cargo vessels, known as "reefers," participated in more than 5,000 likely transshipments (instances in which they rendezvoused with an AIS-broadcasting fishing vessel and drifted long enough to receive a catch). In addition, the data revealed more than 86,000 potential transshipments in which reefers exhibited transshipment-like behavior, but there were no corresponding AIS signals from fishing vessels. Brian Sullivan, Google's lead for Global Fishing Watch, will present the findings at the Economist World Ocean Summit in Indonesia today. The report, along with the underlying data and our list of likely and suspected transshipments, will be freely available on our website, globalfishingwatch.org.

The global scale of transshipment and its ability to facilitate suspicious activity, such as illegal fishing and human rights abuses, is exposed in a complementary report being issued today by our partners at Oceana. The opportunity for mixing legal and illegal catch during the collection of fish from multiple fishing boats provides an easy route for illegal players to get their product to market. This obscures the seafood supply chain from hook to port and hobbles efforts at sustainability because it prevents an accurate measurement of the amount of marine life being taken from the sea.

Among the many findings, Global Fishing Watch data documents that transshipment in offshore coastal waters is more common in regions with a high proportion of Illegal, Unregulated and Unreported (IUU) fishing than in regions where management is strong, such as North America and Europe. The data also revealed clusters of transshipment along the Exclusive Economic Zones (EEZs) of some countries, and inside the zones of nations rated high for corruption and limited in their monitoring capabilities. "These correlations do not provide any proof of specific illegal behavior," says David Kroodsma, Global Fishing Watch research program director and lead author of the report, "but they raise important questions and can lead to more informed international efforts by fisheries management organizations to prevent or better regulate transshipment."

According to Oceana's report, three of the top eight countries visited by reefers have not yet ratified an international treaty meant to eliminate illegal, unregulated and unreported fishing, and therefore may have weaker regulations that would make it easier for illegally caught fish to enter the global marketplace. The report calls for the banning of transshipment at sea and expanded mandates for unique identifiers and vessel tracking for fishing vessels.


The Hai Feng 648 is shown with an unidentified fishing vessel off the coast of Argentina. A large, mostly Chinese squid fleet operates just beyond the EEZ boundary. The Hai Feng 648 was previously with the squid fleet at the edge of the Peruvian EEZ, and in 2014 it took illegally processed catch from the Lafayette into port in Peru. This image was acquired on Nov. 30, 2016.

Imagery by DigitalGlobe © 2017


The new analytical tools SkyTruth and Global Fishing Watch have developed using public domain AIS data can enable fisheries managers to identify and monitor transshipment anywhere in the world, permanently lifting the veil from the previously invisible practice of transshipment.

The results were obtained through an analysis of over 21 billion satellite signals from Automatic Identification System messages broadcast by ocean-going vessels between 2012 and 2016. Using an artificial intelligence system developed by Global Fishing Watch, Kroodsma's team identified refrigerated cargo vessels based on their movement patterns. Verifying their results with confirmed fishery registries and open source online resources, they identified 794 reefers. That represents 90 percent of the world's reefer vessels identified in 2010 according to the US Central Intelligence Agency World Factbook. 

Through further analysis, they mapped 5,065 instances in which a reefer and a fishing vessel were moving at a certain speed within a certain proximity to one another for a certain length of time. Our algorithm was verified by matching a subset of these "likely transshipments" to known transshipments recorded by fishing registries. The data also revealed 86,490 potential transshipments: instances in which reefers that appeared to be alone traveled in a pattern and at a speed consistent with transshipment. Their activity cannot be verified, but given that many fishing vessels turn off their AIS devices when they do not want to be detected, and some fishing vessels do not carry AIS at all, these events must be considered potential transshipments.
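The rendezvous criterion described above (two vessels close together, moving slowly, for long enough) can be sketched as a simple filter over paired position tracks. The thresholds and track format below are illustrative assumptions, not Global Fishing Watch's actual parameters:

```python
import math

# Rendezvous-detection sketch: flag a likely transshipment when a reefer
# and a fishing vessel stay within a small distance, at low speed, for
# several consecutive hours. The thresholds and the (hour, lat, lon,
# speed) track format are illustrative assumptions.

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 6371.0 * 2 * math.asin(math.sqrt(h))

def likely_transshipment(track_a, track_b, max_km=0.5, max_knots=2.0, min_hours=3):
    """track_* are hourly-sampled lists of (hour, lat, lon, speed_knots)."""
    run = 0  # consecutive hours spent close together and slow
    for (_, la, lo_a, sa), (_, lb, lo_b, sb) in zip(track_a, track_b):
        close = haversine_km(la, lo_a, lb, lo_b) <= max_km
        slow = sa <= max_knots and sb <= max_knots
        run = run + 1 if (close and slow) else 0
        if run >= min_hours:
            return True
    return False

# Two vessels drifting together for five hours:
reefer = [(h, 10.000, 55.000 + 0.001 * h, 1.0) for h in range(5)]
fisher = [(h, 10.001, 55.000 + 0.001 * h, 1.0) for h in range(5)]
print(likely_transshipment(reefer, fisher))  # True
```

The "potential transshipment" category corresponds to running this logic on a reefer track alone, looking for the same slow, loitering movement pattern without a matching fishing-vessel track.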



Contacts and sources:
Kimbra Cutlip 
Global Fishing Watch

Using Dogs To Find and Save Big Cats


Investigators are using specially-trained detection dogs to determine the numbers and distribution of cheetah in a region of Western Zambia. The research represents the first demonstration of this strategy for wide-ranging species that are often threatened.

While traditional survey methods failed to detect any cheetah, using dogs specially trained to locate scat and other signs allowed the team to detect cheetah presence throughout the survey area. The researchers estimated a density of 5.9 to 6.6 cheetah per 1,000 km².

This is a detection dog searching for cheetah scat.

Credit:  Dave Hamman
 
"With the alarming global decline of cheetah, we need new methods to be able to monitor and evaluate the remaining populations, many of which are in very remote ecosystems where traditional survey methods are challenging at best," said Dr. Matthew Becker, lead author of the Journal of Zoology study. "With this study, detection dogs once again demonstrate they are a powerful conservation tool and an important ally for threatened African carnivores like cheetah."

"Rapid global large carnivore declines make evaluations of remaining populations critical. Yet landscape-scale evaluations of presence, abundance and distribution are difficult, as many species are wide-ranging, occur only at low densities and are elusive," say the authors of "Using dogs to find cats: detection dogs as a survey method for wide-ranging cheetah."


Contacts and sources:
Lauren Elkins
Wiley

Citation: Using dogs to find cats: detection dogs as a survey method for wide-ranging cheetah http://dx.doi.org/10.1111/jzo.12445

Ancient Rocks in Colorado Give Evidence of a 'Chaotic Solar System'


Plumbing a 90 million-year-old layer cake of sedimentary rock in Colorado, a team of scientists from the University of Wisconsin-Madison and Northwestern University has found evidence confirming a critical theory of how the planets in our solar system behave in their orbits around the sun.

The finding, published Feb. 23, 2017 in the journal Nature, is important because it provides the first hard proof for what scientists call the "chaotic solar system," a theory proposed in 1989 to account for small variations in the present conditions of the solar system. The variations, playing out over many millions of years, produce big changes in our planet's climate -- changes that can be reflected in the rocks that record Earth's history.

The discovery promises not only a better understanding of the mechanics of the solar system, but also a more precise measuring stick for geologic time. Moreover, it offers a better understanding of the link between orbital variations and climate change over geologic time scales.

The layer cake of sedimentary rock near Big Bend, Texas, shows the alternating layers of shale and limestone characteristic of the rock laid down at the bottom of a shallow ocean during the late Cretaceous period. The rock holds the 87 million-year-old signature of a 'resonance transition' in the orbits of Mars and Earth: definitive geologic evidence that the planets in our solar system do not behave as prevailing theory held, orbiting like clockwork in a quasiperiodic manner.

Credit: Bradley Sageman, Northwestern University

Using evidence from alternating layers of limestone and shale laid down over millions of years in a shallow North American seaway at the time dinosaurs held sway on Earth, the team led by UW-Madison Professor of Geoscience Stephen Meyers and Northwestern University Professor of Earth and Planetary Sciences Brad Sageman discovered the 87 million-year-old signature of a "resonance transition" between Mars and Earth. A resonance transition is the consequence of the "butterfly effect" in chaos theory. It plays on the idea that small changes in the initial conditions of a nonlinear system can have large effects over time.
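That sensitivity to initial conditions can be illustrated with the logistic map, a standard toy chaotic system (not the orbital model used in the study): two trajectories that start almost identically end up completely different.

```python
# Butterfly-effect sketch: in a chaotic nonlinear map, two trajectories
# that begin almost identically diverge completely over time. The
# logistic map is a standard toy example of such sensitivity, not the
# orbital dynamics analyzed in the study.

def logistic_orbit(x0, r=3.9, steps=60):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.200000)
b = logistic_orbit(0.200001)   # initial difference of only 1e-6

early_gap = abs(a[1] - b[1])                          # still tiny after one step
late_gap = max(abs(x - y) for x, y in zip(a[40:], b[40:]))
print(early_gap, late_gap)
```

Running it shows the gap between the trajectories growing from about a millionth to order one, which is the "large effects over time" behavior described above.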

In the context of the solar system, the phenomenon occurs when two orbiting bodies periodically tug at one another, as occurs when a planet in its track around the sun passes in relative proximity to another planet in its own orbit. These small but regular ticks in a planet's orbit can exert big changes on the location and orientation of a planet on its axis relative to the sun and, accordingly, change the amount of solar radiation a planet receives over a given area. Where and how much solar radiation a planet gets is a key driver of climate.

"The impact of astronomical cycles on climate can be quite large," explains Meyers, noting as an example the pacing of the Earth's ice ages, which have been reliably matched to periodic changes in the shape of Earth's orbit, and the tilt of our planet on its axis. "Astronomical theory permits a very detailed evaluation of past climate events that may provide an analog for future climate."

To find the signature of a resonance transition, Meyers, Sageman and UW-Madison graduate student Chao Ma, whose dissertation work this comprises, looked to the geologic record in what is known as the Niobrara Formation in Colorado. The formation was laid down layer by layer over tens of millions of years as sediment was deposited on the bottom of a vast seaway known as the Cretaceous Western Interior Seaway. The shallow ocean stretched from what is now the Arctic Ocean to the Gulf of Mexico, separating the eastern and western portions of North America.

"The Niobrara Formation exhibits pronounced rhythmic rock layering due to changes in the relative abundance of clay and calcium carbonate," notes Meyers, an authority on astrochronology, which utilizes astronomical cycles to measure geologic time. "The source of the clay (laid down as shale) is from weathering of the land surface and the influx of clay to the seaway via rivers. The source of the calcium carbonate (limestone) is the shells of organisms, mostly microscopic, that lived in the water column."

 Meyers explains that while the link between climate change and sedimentation can be complex, the basic idea is simple: "Climate change influences the relative delivery of clay versus calcium carbonate, recording the astronomical signal in the process. For example, imagine a very warm and wet climate state that pumps clay into the seaway via rivers, producing a clay-rich rock or shale, alternating with a drier and cooler climate state which pumps less clay into the seaway and produces a calcium carbonate-rich rock or limestone."

The new study was supported by grants from the National Science Foundation. It builds on a meticulous stratigraphic record and important astrochronologic studies of the Niobrara Formation, the latter conducted in the dissertation work of Robert Locklair, a former student of Sageman's at Northwestern.

Dating of the Mars-Earth resonance transition found by Ma, Meyers and Sageman was confirmed by radioisotopic dating, a method for dating the absolute ages of rocks using known rates of radioactive decay of elements in the rocks. In recent years, major advances in the accuracy and precision of radioisotopic dating, devised by UW-Madison geoscience Professor Bradley Singer and others, have been introduced and contribute to the dating of the resonance transition.
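The principle behind radioisotopic dating reduces to the exponential decay law: given a decay constant, a rock's age follows from the measured daughter-to-parent ratio as t = ln(1 + D/P) / lambda. A minimal sketch with illustrative numbers (the isotope system and ratio are assumptions, not the study's measurements):

```python
import math

# Radioisotopic-dating sketch: for a decay constant lam = ln(2) / t_half,
# a mineral's age follows from its measured daughter-to-parent ratio as
# t = ln(1 + D/P) / lam. The isotope system and ratio below are
# illustrative, not the study's measurements.

def age_years(daughter_parent_ratio, half_life_years):
    lam = math.log(2) / half_life_years
    return math.log(1 + daughter_parent_ratio) / lam

# A 235U -> 207Pb system (half-life ~704 Myr) with an assumed D/P of
# 0.0894 gives an age close to the ~87-million-year signal discussed above:
print(round(age_years(0.0894, 704e6) / 1e6, 1), "Myr")  # 87.0 Myr
```

In practice the field uses carefully cross-calibrated systems and corrects for initial daughter content, but the age equation above is the core of the method.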

The motions of the planets around the sun have been a subject of deep scientific interest since the advent of the heliocentric theory -- the idea that the Earth and planets revolve around the sun -- in the 16th century. From the 18th century, the dominant view of the solar system was that the planets orbited the sun like clockwork, having quasiperiodic and highly predictable orbits. In 1988, however, numerical calculations of the outer planets showed Pluto's orbit to be "chaotic" and the idea of a chaotic solar system was proposed in 1989 by astronomer Jacques Laskar, now at the Paris Observatory.

Following Laskar's proposal of a chaotic solar system, scientists have been looking in earnest for definitive evidence that would support the idea, says Meyers.

"Other studies have suggested the presence of chaos based on geologic data," says Meyers. "But this is the first unambiguous evidence, made possible by the availability of high-quality, radioisotopic dates and the strong astronomical signal preserved in the rocks."



Contacts and sources:
Stephen Meyers
University of Wisconsin-Madison 

Popular Heartburn Drugs Linked to Gradual Yet 'Silent' Kidney Damage

Taking popular heartburn drugs for prolonged periods has been linked to serious kidney problems, including kidney failure. The sudden onset of kidney problems often serves as a red flag for doctors to discontinue their patients' use of so-called proton pump inhibitors (PPIs), which are sold under the brand names Prevacid, Prilosec, Nexium and Protonix, among others.

But a new study evaluating the use of PPIs in 125,000 patients indicates that more than half of patients who develop chronic kidney damage while taking the drugs don't experience acute kidney problems beforehand, meaning patients may not be aware of a decline in kidney function, according to researchers at Washington University School of Medicine in St. Louis and the Veterans Affairs St. Louis Health Care System. Therefore, people who take PPIs, and their doctors, should be more vigilant in monitoring use of these medications.

Taking popular heartburn medication for prolonged periods may lead to serious kidney damage, even in people who show no signs of kidney problems, according to researchers at Washington University School of Medicine in St. Louis and the Veterans Affairs St. Louis Health Care System
Credit:  Michael Worful/Washington University School of Medicine in St. Louis

The study is published Feb. 22 in Kidney International.

"The onset of acute kidney problems is not a reliable warning sign for clinicians to detect a decline in kidney function among patients taking proton pump inhibitors," said Ziyad Al-Aly, MD, the study's senior author and an assistant professor of medicine at Washington University School of Medicine. "Our results indicate kidney problems can develop silently and gradually over time, eroding kidney function and leading to long-term kidney damage or even renal failure. Patients should be cautioned to tell their doctors if they're taking PPIs and only use the drugs when necessary."

More than 15 million Americans suffering from heartburn, ulcers and acid reflux have prescriptions for PPIs, which bring relief by reducing gastric acid. Many millions more purchase the drugs over the counter and take them without being under a doctor's care.

The researchers -- including first author Yan Xie, a biostatistician at the St. Louis VA -- analyzed data from the Department of Veterans Affairs databases on 125,596 new users of PPIs and 18,436 new users of other heartburn drugs referred to as H2 blockers. The latter are much less likely to cause kidney problems but often aren't as effective.

Over five years of follow-up, the researchers found that more than 80 percent of PPI users did not develop acute kidney problems, which often are reversible and are characterized by too little urine leaving the body, fatigue and swelling in the legs and ankles.

However, more than half of the cases of chronic kidney damage and end-stage renal disease associated with PPI use occurred in people without acute kidney problems.

In contrast, among new users of H2 blockers, 7.67 percent developed chronic kidney disease in the absence of acute kidney problems, and 1.27 percent developed end-stage renal disease.

End-stage renal disease occurs when the kidneys can no longer effectively remove waste from the body. In such cases, dialysis or a kidney transplant is needed to keep patients alive.

"Doctors must pay careful attention to kidney function in their patients who use PPIs, even when there are no signs of problems," cautioned Al-Aly, who also is the VA's associate chief of staff for research and education and co-director of the VA's Clinical Epidemiology Center. "In general, we always advise clinicians to evaluate whether PPI use is medically necessary in the first place because the drugs carry significant risks, including a deterioration of kidney function."



Contacts and sources:
Diane Duke Williams
Washington University School of Medicine in St. Louis

Achieving a Strong, Lasting 'Blue Economy' Is Possible, Says Marine Ecologist

Incentive-based solutions offer significant hope for addressing the myriad environmental challenges facing the world’s oceans – that’s the central message a leading marine ecologist delivered today during a presentation at the annual meeting of the American Association for the Advancement of Science.

Jane Lubchenco, a distinguished professor in the Oregon State University College of Science, shared lessons from around the world about ways “to use the ocean without using it up” as nations look to the ocean for new economic opportunities, food security or poverty alleviation.

Elizabeth Cerny-Chipman, a former postdoctoral scholar under Lubchenco who’s now a Knauss Fellow at the National Oceanic and Atmospheric Administration, co-authored the presentation, titled “Getting Incentives Right for Sustained Blue Growth: Science and Opportunities.”

Credit: OSU


In her presentation, Lubchenco pointed out that achieving the long-term potential of blue growth will require aligning short- and long-term economic incentives to achieve a diverse mix of benefits. Blue growth refers to long-term strategies for supporting sustainable growth in the marine and maritime sectors as a whole.

“If we harness human ingenuity and recognize that a healthy ocean is essential for long-term prosperity, we can tackle the enormous threats facing the ocean,” Lubchenco says, “and we can make a transition from vicious cycles to virtuous cycles.”

Lubchenco and her collaborators note that the world’s oceans are the main source of protein production for 3 billion people; are directly or indirectly responsible for the employment of more than 200 million people; and contribute $270 billion to the planet’s gross domestic product.

“The right incentives can drive behavior that aligns with both desired environmental outcomes and desirable social outcomes,” Lubchenco says.

The first step in building increased support for truly sustainable blue growth, she says, is highlighting its potential. That means working with decision-makers to promote win-win solutions with clear short-term environmental and economic benefits. Governments, industry and communities all have important roles to play, Lubchenco notes.

“Another key step is transforming the social norms that drive the behavior of the different actors, particularly in industry,” Lubchenco says. “Finally, it will be critical to take a cross-sector approach.

“Some nations, like the Seychelles, Belize and South Africa, are doing integrated, smart planning to deconflict use by different sectors while also growing their economies in ways that value the health of the ocean, which is essential to jobs and food security. They are figuring out how to be smarter about ocean uses, not just to use the ocean more intensively.”

Prior to her presentation, Lubchenco gave a related press briefing on how to create the right incentives for sustainable uses of the ocean.

In November 2016, Lubchenco, Cerny-Chipman, OSU graduate student Jessica Reimer and Simon Levin, the distinguished university professor in ecology and evolutionary biology at Princeton University, co-authored a paper on a related topic for the Proceedings of the National Academy of Sciences.





Contacts and sources:
By Steve Lundeberg,
Jane Lubchenco,
Oregon State University

Wednesday, February 22, 2017

Impacts of Mass Coral Die-Off on Indian Ocean Reefs Revealed

Warming seawaters, caused by climate change and extreme climatic events, threaten the stability of tropical coral reefs, with potentially devastating implications for many reef species and the human communities that reefs support.

New research by the University of Exeter shows that increased surface ocean temperatures during the strong 2016 El Niño led to a major coral die-off event in the Maldives, and that this has caused reef growth rates to collapse. The researchers also found that the rates at which some reef species, in particular parrotfish, erode the reefs had increased following this coral die-off event.

Similar magnitudes of coral death have been reported on many other reefs in the region, including on the northern Great Barrier Reef, suggesting similar impacts may be very widespread.

Picture was taken in September 2016 along the shallow (2-3 m depth) fore-reef slope habitat around Kandahalagala showing the extent of coral bleaching-driven coral mortality that has preferentially impacted Acropora sp.
Credit:   University of Exeter

Professor Chris Perry and Dr Kyle Morgan, of the University of Exeter's Geography department, studied the impact of the 2016 El Niño event at sites in the southern Maldives and found that the event had not only caused widespread coral bleaching, a phenomenon whereby corals expel their photosynthesising algae when stressed by high temperatures, but that this had also led to extensive coral death in all shallow water reef habitats examined.

"A very major concern now is how quickly these reefs might recover. Recovery from similar past disturbances in the Maldives has taken 10-15 years, but major bleaching events are predicted to become far more frequent than this. If this is the case it could lead to long-term loss of reef growth and so limit the coastal protection and habitat services these reefs presently provide," Professor Perry said.

"The most alarming aspect of this coral die-off event is that it has led to a rapid and very large decline in the growth rate of the reefs. This in turn has major implications not only for the capacity of these reefs to match any increases in sea-level, but is also likely to lead to a loss of the surface structure of the reefs that is so critical for supporting fish species diversity and abundance."

Coral reefs are formed by the accumulation of coral skeletons (made of calcium carbonate) that build up over hundreds to thousands of years, forming the complex structures that support a huge diversity of marine life. The so-called 'carbonate budget' of a reef, which represents the balance between the rate at which this carbonate is produced by corals and the rate at which it is removed (by biological or physical erosion or chemical dissolution), influences the development of these structures and how fast a reef can grow.


The effect of these combined factors was a major decline in the carbonate budgets of these reefs, with an average reduction of 157%. Before the warming event, the reefs had been in a period of rapid growth, but after the period of higher sea temperatures a negative carbonate budget was recorded at all sites. Put simply, the structure of these reefs is now eroding at a faster rate than it is growing. Based on past studies the researchers suggest that, given the severity of the bleaching impacts, it may take 10 to 15 years for full recovery to occur.
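As a rough illustration of how a budget reduction of more than 100 percent flips a reef from net growth to net erosion, the sketch below uses hypothetical production and erosion rates, not the study's measured values:

```python
# Illustrative only: the rates below are hypothetical, not the study's data.
# A reef's net carbonate budget is production minus removal, typically
# expressed in kg of CaCO3 per square meter per year.

def net_carbonate_budget(production, erosion):
    """Net budget: positive means the reef framework is accreting,
    negative means it is eroding faster than it grows."""
    return production - erosion

# Hypothetical pre-bleaching state: strong coral production, modest erosion.
before = net_carbonate_budget(production=7.0, erosion=2.0)   # +5.0, growing

# Hypothetical post-bleaching state: production collapses while parrotfish
# erosion rises, so the budget turns negative.
after = net_carbonate_budget(production=1.0, erosion=3.0)    # -2.0, eroding

# A reduction larger than 100% means the budget fell past zero:
# here the drop from +5.0 to -2.0 is a 140% reduction.
reduction_pct = (before - after) / before * 100
print(before, after, round(reduction_pct))
```

The point of the sketch is that a decline exceeding 100 percent is not an arithmetic error: it simply means the budget crossed zero, which is exactly the transition from a growing reef to an eroding one that the study reports.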

The extent of the 2016 bleaching, which also affected reefs in other parts of the Indian Ocean and Pacific, was so severe that it was subsequently named the 'Third Global Coral Bleaching Event'.

Dr Kyle Morgan said: "Coral reefs provide a wealth of benefits. They are vital habitats, essential for a vast number of species and they are also important for tourism and food provision. The reduction in carbonate budget threatens these benefits and may well also lead to the structural collapse of reefs. The key issue to consider now is whether, and when, these reefs will recover, both ecologically and in terms of their growth. Based on past trajectories, we predict recovery will take at least a decade, however it all depends on the extent of future warming events and climate change."

University of Exeter scientists warned there could be further rises in sea temperatures owing to global warming with potentially devastating effects on coral reefs.

Professor Mat Collins, an expert in climate modelling at the University of Exeter, said:

"We expect El Niño variability to continue into the future which, when combined with rising temperatures due to global warming, means we will see unprecedented sea temperatures and increasing incidence of coral bleaching."

The paper, "Bleaching drives collapse in reef carbonate budgets and reef growth potential on southern Maldives reefs," is published in Scientific Reports.


Contacts and sources:
Marie Woolf
University of Exeter

Single-Payer Reform Is 'The Only Way to Fulfill the President's Pledge' on Health Care

In Annals of Internal Medicine, researchers say a single-payer health reform would save an estimated $504 billion annually in administrative costs, allowing for universal coverage, full benefits and lower costs

Proposals floated by Republican leaders won't achieve President Trump's campaign promises of more coverage, better benefits, and lower costs, but a single-payer reform would, according to a commentary published today [Tuesday, Feb. 21] in Annals of Internal Medicine, one of the nation's most prestigious and widely cited medical journals.

Republicans promised to repeal the Affordable Care Act on the first day of the Trump presidency. But the health reform effort has stalled because Republicans in Congress have been unable to come up with a better replacement and fear a backlash against plans that would deprive millions of coverage and raise deductibles.

Credit: Wikimedia Commons

In today's Annals commentary, longtime health policy experts Drs. Steffie Woolhandler and David Himmelstein warn that the proposals by Speaker Paul Ryan, R-Wis., and Secretary of HHS Tom Price would slash Medicaid spending for the poor, shift the ACA's subsidies from the near-poor to wealthier Americans, and replace Medicare with a voucher program, even as they would cut Medicare's funding and raise the program's eligibility age.

Woolhandler and Himmelstein review evidence that, in contrast, single-payer reform could provide comprehensive first-dollar coverage to all Americans within the current budgetary envelope because of vast savings on health care bureaucracy and profits. The authors estimate that a streamlined, publicly financed single-payer program would save $504 billion annually on health care paperwork and profits, including $220 billion on insurance overhead, $150 billion on hospital billing and administration and $75 billion on doctors' billing and paperwork. They estimate that an additional $113 billion could be saved each year by hard bargaining with drug companies over prices. The data supporting their estimates are summarized in a table.

The savings would cover the cost of expanding insurance to the 26 million who remain uninsured despite the ACA, as well as "plugging the gaps in existing coverage -- abolishing copayments and deductibles, covering such services as dental and long-term care that many policies exclude."

The lead author of the commentary, Dr. Steffie Woolhandler, is an internist, distinguished professor of public health and health policy at CUNY's Hunter College, and lecturer in medicine at Harvard Medical School. She said: "We're wasting hundreds of billions of health care dollars on insurance paperwork and profits. Private insurers take more than 12 cents of every premium dollar for their overhead and profit, as compared to just over 2 cents in Medicare. Meanwhile, 26 million are still uninsured and millions more with coverage can't afford care. It's time we make our health care system cater to patients instead of bending over backward to help insurance companies."

Dr. David Himmelstein, the senior author, is a primary care doctor and, like Woolhandler, a distinguished professor at CUNY's Hunter College and lecturer at Harvard Medical School. He noted: "We urgently need reform that moves forward from the ACA, but the Price and Ryan plans would replace Obamacare with something much worse. Polls show that most Americans -- including most people who want the ACA repealed, and even a strong minority of Republicans -- want single-payer reform. And doctors are crying out for such reform. The Annals of Internal Medicine is one of the most respected and traditional medical journals. Their willingness to publish a call for single payer signals that it's a mainstream idea in our profession."





Contacts and sources:
Mark Almberg
Physicians For A National Health Program

The Annals of Internal Medicine is the flagship journal of the American College of Physicians (ACP), the nation's largest medical specialty organization with 148,000 internal medicine physicians, related subspecialists, and medical students. In 2007, the Annals published a lengthy policy article in which the ACP said a single-payer system was one pathway to achieving universal coverage. In early 2008, it published a study showing 59 percent of U.S. physicians support "government legislation to establish national health insurance," a leap of 10 percentage points from five years before.

The commentary is believed to be the first full-length, direct call for single payer, or national health insurance, that the journal has published in its 90-year history.

"Single-Payer Reform: The Only Way to Fulfill the President's Pledge of More Coverage, Better Benefits, and Lower Costs," by Steffie Woolhandler, M.D., M.P.H., and David U. Himmelstein, M.D. Annals of Internal Medicine. Published online first, Feb. 21, 2017. doi:10.7326/M17-0302.

Disclosures: Drs. Woolhandler and Himmelstein co-founded Physicians for a National Health Program, a nonprofit educational and research organization that supports a single-payer national health plan; they also served as advisers to Sen. Bernie Sanders' presidential campaign. Neither the Sanders campaign nor PNHP played any role in funding or otherwise supporting the commentary.

High-Sensitivity Cameras Show the Atomic Structure of Metal-Organic Frameworks

Highly sensitive electron cameras allow researchers to see the atomic structure of metal-organic frameworks.

Researchers at KAUST have developed a method for fine-scale imaging of metal-organic frameworks (MOFs), three-dimensional structures made up of metal ions connected by organic ligands. MOFs are useful for gas storage and separation because they can be designed to have precise pore sizes of molecular dimensions and large void spaces (porosity) within their frameworks.

Symmetry-imposed and lattice-averaged HRTEM image of the metal-organic framework ZIF-8 (black and white) with a structural model overlaid to show the position of the zinc ions and organic ligands (in color).

Credit: (c) 2017 KAUST Ivan D. Gromicho

Typically, high-resolution transmission electron microscopy (HRTEM) is used to visualize structures with atomic resolution; however, this method is unsuitable for observing MOFs because the electron beams destroy their structures.

"To thoroughly understand the performance of metal-organic frameworks in various applications, we need to know their structures at the atomic level because their macroscopic behavior is determined by their microscopic structure," explained KAUST Associate Professor of Chemical Science Yu Han. By visualizing these structures, researchers can uncover important clues about how these materials self-assemble to create their trademark pores.

Several members of the University's Advanced Membranes and Porous Materials Center, including Han's research scientist and first author of the paper, Yihan Zhu, Associate Professor of Chemical and Biological Engineering Zhiping Lai and Professor of Chemical and Biological Engineering and Director of the Center Ingo Pinnau, joined forces with the University's Imaging and Characterization Core Lab and with colleagues from Gatan, Lawrence Berkeley National Laboratory and others in China. Their collaboration resulted in an adaptation of HRTEM using state-of-the-art direct-detection electron-counting cameras.

The high sensitivity of these detectors enabled them to acquire images with an electron dose low enough that it does not damage the structure of MOFs, allowing the group to produce high-resolution images of their atomic structures.

The team applied their method to ZIF-8, a MOF comprising zinc ions connected by organic 2-methylimidazole linkers. They were able to image its structure with a resolution of 0.21 nanometers (one nanometer is one billionth of a meter), a resolution high enough to image the individual columns of zinc atoms and organic linkers.

This helped the researchers to reveal the surface and interfacial structures of ZIF-8 crystals. "The results revealed that the porosity generated at the interfaces of ZIF-8 crystals is different from the intrinsic porosity of ZIF-8, which influences how gas molecules transport in ZIF-8 crystals," explained Han.



Contacts and sources:
KAUST - King Abdullah University of Science and Technology

Citation: Zhu, Y., Ciston, J., Zheng, B., ... & Han, Y. Unravelling surface and interfacial structures of a metal-organic framework by transmission electron microscopy. Nature Materials advance online publication, 20 February 2017. http://dx.doi.org/10.1038/nmat4852

What’s Next for Plant Breeders? Drones Are

Crop breeders grow thousands of potential varieties at a time; until now, observations of key traits were made by hand. In a new study, unmanned aerial vehicles, or drones, were used successfully to remotely evaluate and predict soybean maturity timing in tests of potential varieties. The use of drones for this purpose could substantially reduce the man-hours needed to evaluate new crops.

When plant breeders develop new crop varieties, they grow a great many plants, and every one needs to be checked. Repeatedly.

“Farmers might have a 100-acre field planted with one soybean variety, whereas breeders may have 10,000 potential varieties planted on one 10-acre field. The farmer can fairly quickly determine whether the single variety in a field is ready to be harvested. However, breeders have to walk through research fields several times in the fall to determine the date when each potential variety matures,” explains University of Illinois soybean breeder Brian Diers.

Drones are increasingly being used in agriculture. A new study demonstrates their benefits for soybean breeders.
Credit: University of Illinois

“We have to check every three days,” master’s student Nathan Schmitz adds. “It takes a good amount of time during a busy part of the year. Sometimes it’s really hot, sometimes really muddy.”

To make things easier, an interdisciplinary team including breeders, computer scientists, engineers, and geographic information specialists turned to unmanned aerial vehicles – commonly known as UAVs or drones.

“When drones became available, we asked ourselves how we could apply this new technology to breeding. For this first attempt, we tried to do a couple simple things,” Diers says.

One goal was to predict the timing of pod maturity using images from a camera attached to the drone, along with sophisticated data and image analysis techniques. “We used multi-spectral images,” Schmitz explains. “We set up an equation in the program to pick up changes in the light frequency reflected off the plant. That color change is how we differentiate a mature plant from an immature one.”

The researchers developed an algorithm to compare images from the drone with pod maturity data measured the old-fashioned way, by walking the fields. “Our maturity predictions with the drone were very close to what we recorded while walking through the fields,” Diers notes.
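The general idea the team describes, tracking a reflectance-based "greenness" signal per plot across repeated flights and flagging the first date it drops, can be sketched as below. The index formula, threshold value and data here are illustrative assumptions, not the study's actual algorithm:

```python
# Hypothetical sketch of drone-based maturity dating, assuming an
# NDVI-style index and a fixed senescence threshold. Both are
# illustrative choices, not the published method.

def greenness_index(nir, red):
    """NDVI-style index from near-infrared and red reflectance (0-1)."""
    return (nir - red) / (nir + red)

def estimate_maturity_date(flights, threshold=0.3):
    """flights: list of (date, nir, red) tuples ordered by flight date.
    Returns the first date the plot's greenness falls below the
    threshold, or None if the plot stayed green through the last flight."""
    for date, nir, red in flights:
        if greenness_index(nir, red) < threshold:
            return date
    return None

# Example plot: green in the first two flights, senesced by the third.
flights = [
    ("2016-09-01", 0.60, 0.10),  # index ~0.71, still green
    ("2016-09-04", 0.50, 0.20),  # index ~0.43, still green
    ("2016-09-07", 0.35, 0.30),  # index ~0.08, below threshold
]
print(estimate_maturity_date(flights))  # 2016-09-07
```

In practice a breeder would calibrate the threshold against plots scored by hand, which is essentially the comparison against field-walked maturity data that the researchers describe.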

Predictions made by the model achieved 93 percent accuracy, but Diers says they might have done even better without some of the inherent limitations of flying drones. For example, they could only fly it and obtain good images on sunny days with little wind.

Drones are increasingly recognized for their potential to improve efficiency and precision in agriculture—especially after new FAA rules went into effect in August 2016—but this is one of the first studies to use drones to optimize breeding practices. Diers notes that the application could be particularly useful to large breeding companies, which test hundreds of thousands of potential varieties annually. If breeders can save time and effort using this technology, new varieties could potentially be developed and made available to farmers on a faster timeline—a welcome improvement.

The article, “Development of methods to improve soybean yield estimation and predict plant maturity with an unmanned aerial vehicle based platform,” is published in Remote Sensing of Environment. In addition to Diers and Schmitz, Neil Yu, Liujun Li, Lei Tian, and Jonathan Greenberg, all from the University of Illinois, are co-authors.



Contact and sources:
University of Illinois College of Agricultural, Consumer and Environmental Sciences

New Evidence That E-Cigarettes May Harm Your Heart

It’s been more than 50 years since the U.S. surgeon general warned the public about the lethal dangers of cigarette smoking.

Last year, Surgeon General Dr. Vivek Murthy issued the first-ever report on electronic cigarettes, warning that their use posed a significant and avoidable risk to young people in the United States. E-cigarettes, or e-cigs, arrived on the U.S. market about 10 years ago. Since then, their popularity has exploded, especially among teenagers.

E-cigs are not actually cigarettes. There is no combustion or tobacco. Instead, these electronic, handheld devices deliver nicotine with flavoring and other chemicals in a vapor instead of smoke. Although traditional cigarettes are widely known as the most common preventable cause of heart disease, not much is known about the cardiovascular risks of e-cigarettes.

Credit: www.ecigclick.co.uk, CC BY-SA 2.0/Wikimedia Commons

To shed light on that issue, UCLA researchers decided to see if two health indicators that promote heart disease in tobacco users were also prevalent in people who use e-cigarettes.

The 42-person study, whose findings were published online Feb. 1 in the journal JAMA Cardiology, found that 23 study participants who were habitual users of e-cigarettes were more likely to have signs of two heart risk factors than 19 other participants who did not use e-cigarettes. The risk factors were oxidative stress, which hampers the body’s ability to defend itself against free radicals — a type of particle that has been associated with heart disease — and higher levels of adrenaline in the heart, which can lead to an increased heart rate and high blood pressure.

“The results were a bit surprising because it is widely believed that e-cigarettes are less harmful than tobacco cigarettes,” the study’s co-author, Dr. Holly Middlekauff, a professor of medicine in the division of cardiology at UCLA, told HealthDay. “Instead, we found the same types of abnormalities in our e-cigarette users that are reported in tobacco cigarette users, and these abnormalities are associated with increased cardiac risk.”

Cardiac risk factors the same as those of smokers

The study's authors noted that the findings only show an association, not a cause-and-effect link between e-cigarettes and the heart risks.

“We do not know if a tobacco cigarette smoker is better off switching to e-cigarettes. Most studies show that carcinogens are present at much lower levels in e-cigarettes compared to tobacco cigarettes,” said Middlekauff. “So it is conceivable that the risk for heart disease is similar for e-cigarettes and tobacco cigarettes, but that the risk for cancer is much greater with tobacco cigarettes."

To further their research, the team is now directly comparing the heart effects of tobacco cigarettes with those of e-cigarettes.

"The key finding from our study is that e-cigarettes have real, adverse physiologic effects that have been associated with heart disease,” added Middlekauff. “My advice is, if you don't already smoke tobacco cigarettes, don't start using e-cigarettes — they are not harmless.”

Other authors on the study include Roya Moheimani, May Bhetraratana, Fen Yin, Kacey Peters, Jeffrey Gornbein and Jesus Araujo. All are from UCLA.



Contacts and sources:
Amy Albin
UCLA

As 3-D Printing Grows, So Will the Need to Reclaim Plastic Waste

UC Berkeley is a leader in 3-D printing. From creating a prosthetic hand for an 8-year-old girl to a “smart cap” that senses spoiled food to large-scale cement buildings, engineers and designers on campus are pushing the technology to the limits, using it in ways never seen before.

The technology got its start on campus about a decade ago and continues to grow. With this surge in popularity has come a surge in plastic waste. Now, with more than 100 printers on campus, at least 600 pounds of trash is generated each year.

Scott Silva and Nicole Panditi
Credit: UC Berkeley

But Nicole Panditi, a mechanical engineering student, and Scott Silva, an environmental sciences student, have a solution.

Panditi and Silva, who work for the Student Environmental Resource Center, and Ph.D. student Mickey Clemon are leading the 3-D Printer Filament Reclamation Project to decrease the amount of plastic waste made by 3-D printers on campus. They’re creating a campuswide system that takes used 3-D printer plastic, grinds it up, melts it down and produces a new spool of plastic that can be used again in the campus’s 3-D printers.

UC Berkeley students have established an initiative to collect and recycle the plastic left behind from 3-D printers on campus.

Credit: UC Berkeley video by Stephen McNally and Roxanne Makasdjian

First-of-its-kind 3-D recycling infrastructure at UC Berkeley

Although there have been several smaller student efforts to recycle 3-D printer plastic on campus, and some campus labs buy recycled filament, this would be the first time UC Berkeley had an infrastructure to recycle all the 3-D plastic waste on campus, something Panditi says will be essential as 3-D printing becomes more and more popular.

“It’s my personal goal to reduce inefficiencies in 3-D printing so that the tech industry can reach its full beneficial potential without being haunted by mountains of ugly waste,” she says.

The project is spearheaded by Cal Zero Waste, which aims to decrease waste on campus and to find new, creative ways to reuse and recycle items, particularly plastics. Lin King is manager of Cal Zero Waste and adviser to the 3-D printer filament reclamation team. He says creating a closed loop of 3-D printer plastic waste on campus not only decreases the amount of plastic waste, but also lightens the carbon footprint.

“The idea is that the plastics would never have to leave campus,” says King. “We would provide Berkeley-produced recycled filament and any discarded items would be sent right back to us.”

Discarded 3-D prints are ground up, melted down and used to make new spools of 3-D printer plastic.

Credit: UC Berkeley

Preparing for 3-D printing to go mainstream

As 3-D printing becomes more mainstream, one area where it will prove useful is mass customization, says mechanical engineering professor Tarek Zohdi, the lead faculty member of the project. “One very special niche is for biological implants for prosthetics — for children in particular. Children who have lost a limb need new prosthetics designed for them as they start to grow.”

Born with symbrachydactyly, Sophie doesn’t have fully developed finger bones in her left hand. But with the help of a CITRIS Invention Lab team, she is the new user of a 3-D-printed super hand.
Credit: CITRIS video by Adriel Olmos

Experts aren’t the only ones using 3-D printers at UC Berkeley. Students in labs across campus, from Jacobs Hall to the Digital Fabrication Lab, can print models for their projects, quickly creating new prototypes. Even novices can now print their own projects at Moffitt Library’s Makerspace. For people new to 3-D printing, says Panditi, up to half of their projects can fail.

“That’s what is great about 3-D printing, and that’s also why there is so much trash generated,” says Panditi. “In rapid prototyping, you’re making iteration after iteration until you get it perfect. What happens with all those iterations is you throw them away. And that’s where all the plastic trash comes in.”

The most popular type of plastic for 3-D printing on campus is bio-based PLA. (The plastic of choice used to be petroleum-based ABS plastic, which has since been found to release a carcinogen when it’s heated.) Although PLA is marketed as compostable, most facilities, including the facility that UC Berkeley uses — Republic Services West Contra Costa Landfill in Richmond — don’t use a long enough cycle to break down these plastics, which can take 90 to 120 days to completely decompose. After a normal 45-day cycle, the facility will sift through the compost material and pull out any plastics that remain, then throw them into a landfill.

The facility says it’s in the process of changing how long plastics stay in the compost cycle, so that eventually compostable plastics will break down, but hasn’t done so yet. That’s why it’s even more crucial to find alternative ways to recycle plastics on campus, says King.

The campus does not recycle PLA because there isn’t a U.S. market for it. That means it would be shipped to another country, such as China or Vietnam.

The 3-D printer filament reclamation team has proven that the process works, but is missing one key piece before it can officially pilot the program on campus. Right now, the team is using a kitchen blender to break down the plastics — hardly efficient for managing the growing amount of plastic waste from 3-D printers — so they’ve launched a crowdfunding effort to raise $5,000 to buy a grinder, along with machinery they will need for the pilot program.




Contacts and sources:
Anne Brice
UC Berkeley

Compound From Marine Snail Is Potent Pain Reliever: An Alternative to Opioids?


A tiny snail may offer an alternative to opioids for pain relief. Scientists at the University of Utah have found a compound that blocks pain by targeting a pathway not associated with opioids. Research in rodents indicates that the benefits continue long after the compound has cleared the body. The findings were reported online in the February 20 issue of the Proceedings of the National Academy of Sciences.

The opioid crisis has reached epidemic proportions. Opioids are highly addictive and, according to the Centers for Disease Control and Prevention, 91 Americans die every day from an opioid overdose. The medical community is in need of alternative therapies that do not rely on the opioid pathways to relieve pain.

“Nature has evolved molecules that are extremely sophisticated and can have unexpected applications,” says Baldomero Olivera, Ph.D., professor of biology at the University of Utah. “We were interested in using venoms to understand different pathways in the nervous system.”

The compound (RgIA) in the study was obtained from the venom of Conus regius, the royal cone.

Credit: My Huynh

Conus regius, a small marine cone snail common to the Caribbean Sea, packs a venomous punch, capable of paralyzing and killing its prey.

In this study, the researchers found that a compound isolated from the snail’s venom, RgIA, acts on a pain pathway distinct from that targeted by opioid drugs. Using rodent models, the scientists showed that the α9α10 nicotinic acetylcholine receptor (nAChR) functions as a pain pathway receptor and that RgIA4 is an effective compound to block this receptor. The pathway adds to a small number of nonopioid pathways that could be further developed to treat chronic pain.

Interestingly, the duration of the pain relief is long, greatly outlasting the presence of the compound in the animal’s system.

The compound works its way through the body in 4 hours, but the scientists found the beneficial effects lingered. “We found that the compound was still working 72 hours after the injection, still preventing pain,” said J. Michael McIntosh, M.D., professor of psychiatry at the University of Utah Health Sciences. The duration of the outcome may suggest that the snail compound has a restorative effect on some components of the nervous system.

“What is particularly exciting about these results is the aspect of prevention,” said McIntosh. “Once chronic pain has developed, it is difficult to treat. This compound offers a potential new pathway to prevent pain from developing in the first place and offers a new therapy to patients who have run out of options.”

The researchers will continue to the next step of pre-clinical testing to investigate the safety and effectiveness of a new drug therapy.

Testing a new nonopioid compound

Previous research had shown that RgIA was effective in rodents, but the scientists wanted to ensure they had a compound that would work in people. To do this, they used synthetic chemistry to engineer 20 analogs of the compound. In essence, the scientists started with a key (RgIA) that fits into a lock (the pain pathway receptor α9α10 nAChR). Using the key as a template, they developed new keys (analogs) with slightly different configurations.

The scientists found one key that best fit the lock: the analog RgIA4 tightly bound to the human receptor.
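The screening logic described above amounts to ranking candidates by binding affinity and keeping the tightest binder. A minimal sketch of that idea, with entirely invented dissociation constants (lower Kd means tighter binding; the real measured values are not given in the article):

```python
# Hypothetical illustration of the analog-screening idea: rank candidate
# analogs by dissociation constant (Kd, nanomolar; lower = tighter binding)
# and keep the best "fit for the lock". All values here are invented.

def best_analog(kd_by_analog):
    """Return the analog with the lowest dissociation constant."""
    return min(kd_by_analog, key=kd_by_analog.get)

affinities_nM = {
    "RgIA":  50.0,   # parent compound (hypothetical value)
    "RgIA2": 30.0,   # hypothetical analog
    "RgIA3": 12.0,   # hypothetical analog
    "RgIA4":  1.5,   # tightest binder in this made-up data set
}

print(best_analog(affinities_nM))  # -> RgIA4
```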

To test whether the compound relieved pain, the scientists administered it to rodents that were exposed to a chemotherapy drug that causes extreme cold sensitivity, as well as hypersensitivity to touch. “Interactions that are not normally painful, like sheets rubbing against the body or pants against the leg, become painful,” said McIntosh.

While the untreated rodents experienced pain after exposure to the chemotherapy drug, rodents given the compound did not. Nor did rodents genetically altered to lack the pain pathway receptor. This work demonstrates that the α9α10 nAChR acts as a pain pathway receptor, and that RgIA4 prevents the receptor from being activated.

Most pain medications available today work through a limited number of pathways and are not sufficient to alleviate chronic pain. “RgIA4 works by an entirely new pathway, which opens the door for new opportunities to treat pain,” said McIntosh. “We feel that drugs that work by this pathway may reduce the burden of opioid use.”



Contacts and sources:
University of Utah Health Sciences

Can’t We All Just Get Along – Like India’s Cats and Dogs?


A new WCS (Wildlife Conservation Society) study in India shows that three carnivores – tigers, leopards, and dholes (Asian wild dogs) – seemingly in direct competition with one another, are living side by side with surprisingly little conflict. Usually, big cats and wild canids live in different locations to avoid each other.

Yet in four relatively small reserves in India’s wildlife-rich Western Ghats region, WCS researchers have found that they are co-existing, despite competing for much of the same prey, including sambar deer, chital, and pigs.

Using dozens of non-invasive camera traps to sample entire populations, rather than tracking a handful of individuals, the research team recorded some 2,500 images of the three predators in action.

A new WCS study in India shows that three carnivores – tigers, leopards, and dholes (Asian wild dogs) – seemingly in direct competition with one another, are living side by side with surprisingly little conflict.


Credit: Ullas Karanth/WCS


The authors found that in reserves with an abundance of prey, dholes, which are active during the day, did not come in much contact with the more nocturnal tigers and leopards. But in Bhadra Reserve, where prey was scarcer, their active times overlapped, yet dholes still managed to avoid the big cats. In Nagarahole, a park teeming with all three carnivores and their prey, leopards actively avoided tigers.
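The day/night partitioning described above can be quantified from camera-trap detection times. A minimal sketch (not the authors' method, which uses more sophisticated statistics) computes the overlap coefficient Δ = Σᵢ min(pᵢ, qᵢ) between two species' hourly activity distributions; the detection hours below are invented:

```python
# Minimal sketch of quantifying temporal avoidance from camera-trap data:
# bin detections by hour of day and compute the overlap coefficient
# Delta = sum_i min(p_i, q_i) between two activity distributions.
# Detection times below are hypothetical, not from the study.
from collections import Counter

def activity_overlap(hours_a, hours_b, n_bins=24):
    """Overlap coefficient: 0 = no shared activity, 1 = identical activity."""
    ca = Counter(h % n_bins for h in hours_a)
    cb = Counter(h % n_bins for h in hours_b)
    pa = {h: ca[h] / len(hours_a) for h in range(n_bins)}
    pb = {h: cb[h] / len(hours_b) for h in range(n_bins)}
    return sum(min(pa[h], pb[h]) for h in range(n_bins))

# Hypothetical detections: dholes mostly by day, tigers mostly by night.
dhole_hours = [7, 8, 9, 10, 11, 14, 15, 16, 17]
tiger_hours = [20, 21, 22, 23, 0, 1, 2, 3, 4]
print(round(activity_overlap(dhole_hours, tiger_hours), 2))  # -> 0.0
```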

Overall, the authors say that these carnivores have developed smart adaptations to coexist, even while they exploit the same prey base. However, these mechanisms vary depending on density of prey resources and possibly other habitat features.


A new study says despite direct competition, tigers, leopards, and dholes are living side by side in protected areas

Credit: Ullas Karanth/WCS

Said Ullas Karanth, WCS Director for Science in Asia and lead author of the study: “Tigers, leopards, and dholes are doing a delicate dance in these protected areas, and all are managing to survive. We were surprised to see how each species has remarkably different adaptations to prey on different prey sizes, use different habitat types, and be active at different times. Because these high-prey-density reserves are small and isolated, such adaptations are helpful for conservationists trying to save all three.”

Both tigers and dholes are classified as Endangered by IUCN; leopards are considered Vulnerable.

Understanding these separate yet overlapping species’ needs is critical to managing predators and prey in small reserves, a scenario that will become increasingly common. The authors say that by carefully managing populations of flagship predators, like tigers, overall biodiversity can also be conserved.



Credit: Ullas Karanth/WCS

The study titled “Spatio-temporal interactions facilitate large carnivore sympatry across a resource gradient” authored by Dr. Ullas Karanth, Mr. Arjun Srivathsa, Dr. Divya Vasudev, Ms. Mahi Puri, Dr. Ravishankar Parameshwaran and Dr. Samba Kumar, appeared in the journal Proceedings of the Royal Society of London B: Biological Sciences in February 2017.

This research was supported by the Department of Biotechnology and Department of Science and Technology, Government of India; The Forest Department and Department of Science and Technology, Government of Karnataka; and the Liz Claiborne and Art Ortenberg Foundation.



Contacts and sources:
WCS (Wildlife Conservation Society)

Breakthrough Process Can Produce Renewable Car Tires from Trees and Grasses


A team of researchers, led by the University of Minnesota, has invented a new technology to produce automobile tires from trees and grasses in a process that could shift the tire production industry toward using renewable resources found right in our backyards. The discovery could have a major impact on the multi-billion dollar tire industry.

Conventional car tires are viewed as environmentally unfriendly because they are predominantly made from fossil fuels. The car tires produced from biomass that includes trees and grasses would be identical to existing car tires, with the same chemical makeup, color, shape, and performance.

The technology has been patented by the University of Minnesota and is available for licensing through the University of Minnesota Office of Technology Commercialization.


Catalytic conversion of biomass-derived chemicals to renewable polymers occurs in laboratory stirred-tank reactors.

Credit: University of Minnesota


The new study is published by the American Chemical Society’s ACS Catalysis, a leading journal in the chemical and catalysis sciences. Authors of the study include researchers from the University of Minnesota, the University of Massachusetts Amherst, and the Center for Sustainable Polymers, a National Science Foundation-funded center at the University of Minnesota.

“Our team created a new chemical process to make isoprene, the key molecule in car tires, from natural products like trees, grasses, or corn,” said Paul Dauenhauer, a University of Minnesota associate professor of chemical engineering and materials science and lead researcher of the study. “This research could have a major impact on the multi-billion dollar automobile tires industry.”

“Collaboration was really the key to this research taking biomass all the way to isoprene,” said Carol Bessel, the deputy director for the chemistry division at the National Science Foundation (NSF), which funds the Center for Sustainable Polymers. “This collaboration and synergy among researchers with different approaches and skills is really what we are trying to promote within the NSF Centers for Chemical Innovation Program.”

Currently, isoprene is produced by thermally breaking apart molecules in petroleum that are similar to gasoline in a process called “cracking.” The isoprene is then separated out of hundreds of products and purified. In the final step, the isoprene is reacted with itself into long chains to make a solid polymer that is the major component in car tires.

Biomass-derived isoprene has been a major initiative of tire companies for the past decade, with most of the effort focused on fermentation technology (similar to ethanol production). However, renewable isoprene has proven a difficult molecule to generate from microbes, and efforts to make it by an entirely biological process have not been successful.

Funded by NSF, researchers from the Center for Sustainable Polymers have focused on a new process that begins with sugars derived from biomass including grasses, trees and corn. They found that a three-step process is optimized when it is “hybridized,” meaning it combines biological fermentation using microbes with conventional catalytic refining that is similar to petroleum refining technology.

The first step of the new process is microbial fermentation of sugars, such as glucose, derived from biomass to an intermediate called itaconic acid. In the second step, itaconic acid is reacted with hydrogen to form a chemical called methyl-THF (methyltetrahydrofuran). This step was optimized when the research team identified a unique metal-metal combination that served as a highly efficient catalyst.

The process technology breakthrough came in the third step: dehydrating methyl-THF to isoprene. Using a catalyst recently discovered at the University of Minnesota called P-SPP (Phosphorus Self-Pillared Pentasil), the team was able to demonstrate a catalytic efficiency as high as 90 percent, with most of the catalytic product being isoprene. By combining all three steps into a single process, isoprene can be renewably sourced from biomass.
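Chaining three steps means the overall yield is the product of the per-step yields, which is why the 90 percent efficiency of the final step matters so much. A back-of-the-envelope sketch; only the ~90% step-3 figure comes from the article, and the step-1 and step-2 values are invented placeholders:

```python
# Back-of-the-envelope illustration: in a sequential multi-step process, the
# overall yield is the product of the per-step yields. Only the ~90% step-3
# efficiency is reported in the article; other values are hypothetical.
from functools import reduce

def overall_yield(step_yields):
    """Fraction of starting material that survives every step."""
    return reduce(lambda acc, y: acc * y, step_yields, 1.0)

steps = [
    0.80,  # step 1: fermentation of glucose to itaconic acid (hypothetical)
    0.85,  # step 2: hydrogenation to methyl-THF (hypothetical)
    0.90,  # step 3: dehydra-decyclization to isoprene (reported ~90%)
]
print(f"{overall_yield(steps):.0%}")  # -> 61%
```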

“The performance of the new P-containing zeolite catalysts such as P-SPP was surprising,” says Dauenhauer. “This new class of solid acid catalysts exhibits dramatically improved catalytic efficiency and is the reason renewable isoprene is possible.”

“Economically bio-sourced isoprene has the potential to expand domestic production of car tires by using renewable, readily available resources instead of fossil fuels,” said Frank Bates, a world-renowned polymer expert and University of Minnesota Regents Professor of Chemical Engineering and Materials Science. “This discovery could also impact many other technologically advanced rubber-based products.”

In addition to Professor Dauenhauer, researchers who were part of the study from the University of Minnesota were professors Michael Tsapatsis and Kechun Zhang, postdoctoral researchers Omar Abdelrahman, Dae Sung Park, Charles Spanjers and Limin Ren, and current student Katherine Vinter. University of Massachusetts Amherst professor Wei Fan and student Hong Je Cho were also part of the research team.

To read the full research paper entitled “Renewable Isoprene by Sequential Hydrogenation of Itaconic Acid and Dehydra-Decyclization of 3-Methyl-Tetrahydrofuran,” visit the ACS Catalysis website.

The invention of renewable tire technology is part of a larger mission of the Center for Sustainable Polymers, an NSF-funded Center for Chemical Innovation led by the University of Minnesota. Initiated in 2009, the CSP has focused on transforming how plastics are made and unmade through innovative research. Researchers aim to design, prepare and implement polymers derived from renewable resources for a wide range of advanced applications.



Contacts and sources: 
University of Minnesota College of Science and Engineering

Microbes Roil Oceans: Microorganisms Play Significant Role in Oceanic Nutrient and Energy Cycling

A single drop of seawater can contain a million microorganisms, but these microorganisms have a much bigger influence than their size would imply. For example, microbes influence an ocean’s nutrient and energy cycles, and in turn, these cycles affect atmospheric levels of carbon dioxide, methane, and other gases. 

Traditional models of the Earth system do not include the influence of microbes in these cycles. Now, researchers have developed a model that explains key microbial metabolic processes and their influence on greenhouse gas production and consumption in a model marine ecosystem. The new model uses information about the ocean’s geochemistry in combination with genomic and metabolic information about the microbes.

The Impact

Within the ocean, oxygen-starved zones affect the productivity of fisheries, emissions of greenhouse gases, biodiversity in coastal zones, and more. The results of this work significantly improve the crude models of microbial activity in these important oceanic zones. And these results provide holistic insights not only into how microbes drive nutrient and energy flow within the oceanic environment, but also into how they influence the atmosphere.


Researchers developed a biogeochemical model that integrates multiomic sequence data to explain key metabolic processes in oxygen-starved waters.
Credit: U.S. Department of Energy’s Environmental Molecular Sciences Laboratory


Summary

Oxygen minimum zones are widespread areas in the ocean. In these zones, oxygen is depleted due to the metabolic activity of microbes. Rising temperatures drive expansion of oxygen minimum zones, making these areas especially relevant as model ecosystems for climate science. In turn, microbial metabolic networks in these areas are predicted to have a growing influence on nutrient and energy cycling in the ocean. 

These cycles will affect atmospheric levels of greenhouse gases such as carbon dioxide, methane, and nitrous oxide. Despite important interactions between microbial activity and global biogeochemical processes, climate models have largely neglected modern molecular sequencing data containing critical information about metabolic networks. 

Moreover, climate models often do not incorporate sufficient information about biogeochemical processes in the ocean. Researchers from the University of British Columbia, University of Minnesota, Canadian Institute for Advanced Research, and Max Planck Institute for Marine Microbiology worked together to develop a biogeochemical model. The model integrates observational geochemical data with metagenomic, metatranscriptomic, and metaproteomic sequence data on the distribution of DNA, messenger RNA (mRNA), and proteins from waters in the Saanich Inlet, British Columbia, Canada. 

This site is serving as a model ecosystem for studying key metabolic processes of the oceanic microbial community and their responses to oxygen minimum zone expansion. The team used resources from two U.S. Department of Energy Office of Science user facilities: the DOE Joint Genome Institute and EMSL, the Environmental Molecular Sciences Laboratory. The new model reproduced measured biogeochemical reaction rates as well as DNA, mRNA, and protein concentration profiles at the ecosystem scale. 

Moreover, simulations predicted the role of ubiquitous microorganisms in mediating carbon, nitrogen, and sulfur cycling. These results quantitatively improve previous conceptual models describing microbial metabolic networks in oxygen minimum zones. The integration of real geochemical and multiomic sequence data in a biogeochemical model provides holistic insight into microbial metabolic networks driving nutrient and energy flow at ecosystem scales.
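The coupling such models capture can be illustrated with a toy example, far simpler than the published model: microbial respiration drawing down oxygen with Michaelis-Menten kinetics, integrated with a simple Euler step. All parameter values below are invented for illustration:

```python
# Toy illustration (not the published model) of microbially driven oxygen
# drawdown: respiration follows Michaelis-Menten kinetics in oxygen, and the
# concentration is stepped forward with Euler integration. All parameter
# values are hypothetical.

def simulate_o2(o2_0=200.0, biomass=1.0, vmax=5.0, km=20.0,
                dt=0.1, steps=1000):
    """Return the oxygen concentration (uM) after `steps` Euler steps of
    size `dt` (days), starting from `o2_0` uM."""
    o2 = o2_0
    for _ in range(steps):
        rate = vmax * biomass * o2 / (km + o2)  # respiration rate (uM/day)
        o2 = max(o2 - rate * dt, 0.0)           # oxygen cannot go negative
    return o2

print(round(simulate_o2(), 1))  # oxygen remaining after 100 simulated days
```

A real biogeochemical model couples many such rate laws (for carbon, nitrogen, and sulfur species) and constrains them with the geochemical and multiomic data described above.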

Funding

This work was supported by the U.S. Department of Energy (DOE), Office of Science, Office of Biological and Environmental Research, including support of the Joint Genome Institute and Environmental Molecular Sciences Laboratory, both DOE Office of Science user facilities; G. Unger Vetlesen and Ambrose Monell foundations; Tula Foundation; Natural Sciences and Engineering Research Council of Canada; Genome British Columbia; Canada Foundation for Innovation; and Canadian Institute for Advanced Research.



Contacts and sources:
Department of Energy, Office of Science

Citation: S. Louca, A.K. Hawley, S. Katsev, M. Torres-Beltran, M.P. Bhatia, S. Kheirandish, C.C. Michiels, D. Capelle, G. Lavik, M. Doebeli, S.A. Crowe, and S.J. Hallam, “Integrating biogeochemistry with multiomic sequence information in a model oxygen minimum zone.” Proceedings of the National Academy of Sciences (USA) 113(40), E5925-E5933 (2016). [DOI: 10.1073/pnas.1602897113]