Dec 14, 2019

Water common -- yet scarce -- in exoplanets

The most extensive survey of atmospheric chemical compositions of exoplanets to date has revealed trends that challenge current theories of planet formation and has implications for the search for water in the solar system and beyond.

A team of researchers, led by the University of Cambridge, used atmospheric data from 19 exoplanets to obtain detailed measurements of their chemical and thermal properties. The exoplanets in the study span a large range in size -- from 'mini-Neptunes' of nearly 10 Earth masses to 'super-Jupiters' of over 600 Earth masses -- and temperature, from nearly 20 degrees Celsius to over 2000 degrees Celsius. Like the giant planets in our solar system, their atmospheres are rich in hydrogen, but they orbit different types of stars.

The researchers found that while water vapour is common in the atmospheres of many exoplanets, its abundance was surprisingly lower than expected, whereas the abundances of other elements measured in some planets were consistent with expectations. The results, which are part of a five-year research programme on the chemical compositions of planetary atmospheres outside our solar system, are reported in the Astrophysical Journal Letters.

"We are seeing the first signs of chemical patterns in extra-terrestrial worlds, and we're seeing just how diverse they can be in terms of their chemical compositions," said project leader Dr Nikku Madhusudhan from the Institute of Astronomy at Cambridge, who first measured low water vapour abundances in giant exoplanets five years ago.

In our solar system, the amount of carbon relative to hydrogen in the atmospheres of giant planets is significantly higher than that of the sun. This 'super-solar' abundance is thought to have originated when the planets were being formed, and large amounts of ice, rocks and other particles were brought into the planet in a process called accretion.

The abundances of other elements have been predicted to be similarly high in the atmospheres of giant exoplanets -- especially oxygen, which is the most abundant element in the universe after hydrogen and helium. This means that water, a dominant carrier of oxygen, is also expected to be overabundant in such atmospheres.

The researchers used extensive spectroscopic data from space-based and ground-based telescopes, including the Hubble Space Telescope, the Spitzer Space Telescope, the Very Large Telescope in Chile and the Gran Telescopio Canarias in Spain. The range of available observations, along with detailed computational models, statistical methods, and atomic properties of sodium and potassium, allowed the researchers to obtain estimates of the chemical abundances in the exoplanet atmospheres across the sample.

The team reported the abundance of water vapour in 14 of the 19 planets, and the abundance of sodium and potassium in six planets each. Their results suggest a depletion of oxygen relative to other elements and provide chemical clues to how these exoplanets may have formed without substantial accretion of ice.

"It is incredible to see such low water abundances in the atmospheres of a broad range of planets orbiting a variety of stars," said Madhusudhan.

"Measuring the abundances of these chemicals in exoplanetary atmospheres is something extraordinary, considering that we have not been able to do the same for giant planets in our solar system yet, including Jupiter, our nearest gas giant neighbour," said Luis Welbanks, lead author of the study and PhD student at the Institute of Astronomy.

Various efforts to measure water in Jupiter's atmosphere, including NASA's current Juno mission, have proved challenging. "Since Jupiter is so cold, any water vapour in its atmosphere would be condensed, making it difficult to measure," said Welbanks. "If the water abundance in Jupiter were found to be plentiful as predicted, it would imply that it formed in a different way to the exoplanets we looked at in the current study."

"We look forward to increasing the size of our planet sample in future studies," said Madhusudhan. "Inevitably, we expect to find outliers to the current trends as well as measurements of other chemicals."

These results show that different chemical elements can no longer be assumed to be equally abundant in planetary atmospheres, challenging assumptions in several theoretical models.

Read more at Science Daily

Mitochondria are the 'canary in the coal mine' for cellular stress

Mitochondria, tiny structures present in most cells, are known for their energy-generating machinery. Now, Salk researchers have discovered a new function of mitochondria: they set off molecular alarms when cells are exposed to stress or chemicals that can damage DNA, such as chemotherapy. The results, published online in Nature Metabolism on December 9, 2019, could lead to new cancer treatments that prevent tumors from becoming resistant to chemotherapy.

"Mitochondria are acting as a first line of defense in sensing DNA stress. The mitochondria tell the rest of the cell, 'Hey, I'm under attack, you better protect yourself,'" says Gerald Shadel, a professor in Salk's Molecular and Cell Biology Laboratory and the Audrey Geisel Chair in Biomedical Science.

Most of the DNA that a cell needs to function is found inside the cell's nucleus, packaged in chromosomes and inherited from both parents. But mitochondria each contain their own small circles of DNA (called mitochondrial DNA or mtDNA), passed only from a mother to her offspring. And most cells contain hundreds -- or even thousands -- of mitochondria.

Shadel's lab group previously showed that cells respond to improperly packaged mtDNA similarly to how they would react to an invading virus -- by releasing it from mitochondria and launching an immune response that beefs up the cell's defenses.

In the new study, Shadel and his colleagues set out to look in more detail at which molecular pathways are activated by the release of damaged mtDNA into the cell's interior. They homed in on a subset of genes known as interferon-stimulated genes, or ISGs, that are typically activated by the presence of viruses. In this case, the team realized, a particular subset of ISGs was being turned on -- the same subset that is often found to be activated in cancer cells that have developed resistance to chemotherapy with DNA-damaging agents like doxorubicin.

To destroy cancer, doxorubicin targets the nuclear DNA. But the new study found that the drug also causes damage to and release of mtDNA, which in turn activates ISGs. This subset of ISGs, the group discovered, helps protect nuclear DNA from damage -- and thus causes increased resistance to the chemotherapy drug. When Shadel and his colleagues induced mitochondrial stress in melanoma cancer cells, the cells became more resistant to doxorubicin when grown in culture dishes and even in mice, as higher levels of the ISGs were protecting the cells' DNA.

"Perhaps the fact that mitochondrial DNA is present in so many copies in each cell, and has fewer of its own DNA repair pathways, makes it a very effective sensor of DNA stress," says Shadel.

Most of the time, he points out, it's probably a good thing that the mtDNA is more prone to damage -- it acts like a canary in a coal mine to protect healthy cells. But in cancer cells, it means that doxorubicin -- by damaging mtDNA first and setting off molecular alarm bells -- can be less effective at damaging the nuclear DNA of cancer cells.

"It says to me that if you can prevent damage to mitochondrial DNA or its release during cancer treatment, you might prevent this form of chemotherapy resistance," Shadel says.

Read more at Science Daily

Dec 13, 2019

Transformative change can save humans and nature

The survival of life on Earth is not a battle of humans versus nature. In this week's Science, an independent group of international experts, including one from Michigan State University (MSU), delivers a sweeping assessment of nature, concluding that victory requires both humans and nature to thrive.

"Pervasive human-driven decline of life on Earth points to the need for transformative change" explores how human impacts on life on Earth are unprecedented, requiring transformative action to address root economic, social and technological causes.

It's a notable assessment not just for its unflinching examination of "living nature" -- Earth's fabric of life, which provides food, water, energy and health security -- but also for taking up where the recent Global Assessment by the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES) leaves off, following an intergovernmental process from start to end. The report covers not only the history of humanity's interactions with nature -- with particular focus on the last 50 years -- but also how these might change in the future.

"We cannot save the planet -- and ourselves -- until we understand how tightly woven people and the natural benefits that allow us to survive are," said Jianguo "Jack" Liu, MSU Rachel Carson Chair in Sustainability and a co-author. "We have learned new ways to understand these connections, even as they spread across the globe. This strategy has given us the power to understand the full scope of a problem, which allows us to find true solutions."

Nature's capacity to provide beneficial regulation of environmental processes, such as modulating air and water quality, sequestering carbon, building healthy soils, pollinating crops, and providing coastal protection from hazards such as storms and storm surges, has decreased globally, but not evenly. Scientists, the paper notes, have gotten better at collecting information and modeling situations to reflect more accurately how the world truly works.

Among the methodologies increasingly adopted by scientists across the world is telecoupling, a framework introduced by Liu in 2008 and since applied in more than 500 scientific papers. The telecoupling framework is an integrative way to study coupled human and natural systems that are linked over long distances. It keeps both the human and the natural in focus and shows how changes can reverberate far beyond their origin, and even double back.

Their group's dedication to integrative approaches has produced a litany of human impacts: 70% of land surfaces altered, 77% of major rivers no longer flowing from source to sea, a rising tally of animal species going extinct, and ongoing loss of biodiversity.

The group applies different scenarios to see how plausible changes have an effect. Starkly, they note nothing on Earth ultimately wins in the "business as usual" scenario.

They say that what our planet needs -- quickly -- is transformative change: a new way of doing business, what they term "a system-wide reorganization across technological, economic and social factors, making sustainability the norm rather than the altruistic exception."

Read more at Science Daily

Largest study of its kind reveals that many psychiatric disorders arise from common genes

Many distinct psychiatric diseases share a common genetic structure, according to new research by scientists at Massachusetts General Hospital (MGH) and the Psychiatric Genomics Consortium, an international team of investigators. Psychiatric disorders affect more than 25 percent of the population in a given year. In the largest-ever study of its kind, published in the journal Cell, researchers identified more than 100 genetic variants that affect the risk for more than one mental health condition.

A gene is made up of segments of DNA; an alteration in the DNA sequence produces a gene variant, which can increase or decrease the risk for disease. Many individual gene variants that affect the risk for specific psychiatric disorders have been identified. However, genes are often pleiotropic, meaning they produce multiple effects in the body.

Identifying gene variants that influence the risk for more than one psychiatric disorder is an important step toward improving the diagnosis and treatment of these conditions, says the study's senior author, Jordan W. Smoller, MD, ScD, director of MGH's Psychiatric and Neurodevelopmental Genetics Unit and a professor of Psychiatry at Harvard Medical School (HMS). "Understanding how specific genetic variations may contribute to a broad spectrum of illnesses can tell us something about the degree to which these disorders may have a shared biology," says Smoller.

To identify these multi-purpose gene variants, the researchers used a technique called genome-wide association to analyze genetic data from 494,162 healthy control subjects and 232,964 people diagnosed with at least one of eight common psychiatric disorders. The analysis identified 109 gene variants that affect the risk for more than one psychiatric disorder.
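At its core, the genome-wide association technique mentioned above repeats a simple case-control comparison of allele frequencies at each of millions of genetic variants. As a rough illustration (all counts below are hypothetical, not data from the study), one such test can be a 2x2 chi-square statistic on allele counts in cases versus controls:

```python
# Minimal sketch of a single-variant case-control association test,
# the core operation a genome-wide association study repeats at
# millions of genetic sites. All counts are hypothetical.

def allele_chi_square(case_alt, case_ref, ctrl_alt, ctrl_ref):
    """2x2 chi-square statistic comparing alternate vs reference
    allele counts in cases and controls (no continuity correction)."""
    table = [[case_alt, case_ref], [ctrl_alt, ctrl_ref]]
    total = case_alt + case_ref + ctrl_alt + ctrl_ref
    row = [case_alt + case_ref, ctrl_alt + ctrl_ref]
    col = [case_alt + ctrl_alt, case_ref + ctrl_ref]
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / total
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# Hypothetical variant slightly enriched in cases:
chi2 = allele_chi_square(case_alt=5200, case_ref=4800,
                         ctrl_alt=4900, ctrl_ref=5100)
print(round(chi2, 2))  # 18.0
```

In practice, studies at this scale use dedicated tools and apply a stringent genome-wide significance threshold (conventionally p < 5x10^-8) to correct for the millions of tests performed.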

Certain disorders shared many variants, allowing the researchers to divide the conditions into three groups of genetically related conditions: disorders characterized by compulsive behaviors (anorexia nervosa, obsessive-compulsive disorder and, to a lesser extent, Tourette syndrome); mood and psychotic disorders (bipolar disorder, major depression and schizophrenia); and early-onset neurodevelopmental disorders (autism spectrum disorder, ADHD and Tourette syndrome). The researchers also found evidence that genes associated with multiple disorders show increased expression beginning in the second trimester of pregnancy and appear to play an important role in brain development.

Knowing which gene variants increase the odds for developing multiple psychiatric disorders provides new clues about the biological pathways that contribute to mental illness, says computational geneticist Phil H. Lee, PhD, of the Center for Genomic Medicine at MGH and HMS, lead author of the study. "And learning how disorders are related at a biological level may inform how we classify and diagnose mental health conditions," says Lee.

Read more at Science Daily

New study enhances knowledge about widespread diseases

When proteins in the brain form deposits consisting of insoluble aggregates, diseases such as Alzheimer's or Parkinson's can occur. Now a research team has come a step closer to understanding this process.

In Parkinson's disease the alpha-synuclein (α-synuclein) protein forms aggregates leading to impaired brain function and the development of the disease.

Now researchers at the University of Gothenburg and universities in Basel and Zurich have published a new study in Nature that demonstrates how a certain class of proteins can regulate and prevent α-synuclein from forming protein deposits and insoluble aggregates within healthy cells.

"In each cell there are auxiliary proteins called molecular chaperones. They take care of newly made proteins to help them in the process of protein folding and prevent misfolding or unfolding," says Björn Burmann, assistant professor at the Department of Chemistry and Molecular Biology at the University of Gothenburg.

A breakthrough in research

Protein folding describes the process by which a protein takes on its specific three-dimensional shape, enabling the protein to fulfil its function.

Countless proteins in mammalian cells do not have a stable protein fold, despite the important functions they serve in the cells. One of these proteins is α-synuclein. In the new study, the research team was able to reveal the basic process that affects how the α-synuclein protein is folded and aggregated, as well as how molecular chaperones in living mammalian cells can prevent the misfolding of α-synuclein.

"A large pool of various chaperones prevents α-synuclein from forming protein aggregates in healthy cells. By studying the protein directly in mammalian cells, we have found that inhibition of chaperones leads to aggregation of α-synuclein at the amino acid level."

Disruption of the α-synuclein-chaperone interaction may be the long-sought first step that initiates the development of α-synuclein-related diseases, according to Björn Burmann and his research colleagues.

From Science Daily

First identified comet to visit our solar system from another star

Comet 2I/Borisov is only the second interstellar object known to have passed through the solar system. These two images, taken by NASA's Hubble Space Telescope, capture the comet appearing near a background galaxy (left) and soon after its closest approach to the Sun (right).
When astronomers see something in the universe that at first glance seems one of a kind, it's bound to stir up a lot of excitement and attention. Enter comet 2I/Borisov. This mysterious visitor from the depths of space is the first identified comet to arrive here from another star. We don't know from where or when the comet started heading toward our Sun, but it won't hang around for long. The Sun's gravity is slightly deflecting its trajectory, but can't capture it because of the shape of its orbit and its high velocity of about 100,000 miles per hour.

Telescopes around the world have been watching the fleeting visitor. NASA's Hubble Space Telescope has provided the sharpest views as the comet skirts by our Sun. Since October the space telescope has been following the comet like a sports photographer following horses speeding around a racetrack. Hubble revealed that the heart of the comet, a loose agglomeration of ices and dust particles, is likely no more than about 3,200 feet across, about the length of nine football fields. Though comet Borisov is the first of its kind, no doubt there are many other comet vagabonds out there, plying the space between stars. Astronomers will eagerly be on the lookout for the next mysterious visitor from far beyond.

These two images, taken by Hubble, capture comet 2I/Borisov streaking through our solar system and on its way back to interstellar space. It is only the second interstellar object known to have passed through the solar system.

Nov. 16, 2019, photo

The comet appears in front of a distant background spiral galaxy (2MASX J10500165-0152029). The galaxy's bright central core is smeared in the image because Hubble was tracking the comet. Comet Borisov was approximately 203 million miles from Earth in this exposure. Its tail of ejected dust streaks off to the upper right. The comet has been artificially colored blue to discriminate fine detail in the halo of dust, or coma, surrounding the central nucleus. It also helps to visually separate the comet from the background galaxy.

Dec. 9, 2019, photo

Hubble revisited the comet shortly after its closest approach to the Sun where it received maximum heating after spending most of its life in frigid interstellar space. The comet also reached a breathtaking maximum speed of about 100,000 miles per hour. Comet Borisov is 185 million miles from Earth in this photo, near the inner edge of the asteroid belt but below it. The nucleus, an agglomeration of ices and dust, is still too small to be resolved. The bright central portion is a coma made up of dust leaving the surface. The comet will make its closest approach to Earth in late December at a distance of 180 million miles.

"Hubble gives us the best upper limit of the size of comet Borisov's nucleus, which is the really important part of the comet," said David Jewitt, a UCLA professor of planetary science and astronomy, whose team has captured the best and sharpest look at this first confirmed interstellar comet. "Surprisingly, our Hubble images show that its nucleus is more than 15 times smaller than earlier investigations suggested it might be. Our Hubble images show that the radius is smaller than half a kilometer. Knowing the size is potentially useful for beginning to estimate how common such objects may be in the solar system and our galaxy. Borisov is the first known interstellar comet, and we would like to learn how many others there are."
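As a quick arithmetic check, the two size figures quoted in this article agree with each other: a nucleus "no more than about 3,200 feet across" corresponds to a radius just under half a kilometer:

```python
# Consistency check of the two quoted size figures for comet
# Borisov's nucleus: "about 3,200 feet across" vs a radius
# "smaller than half a kilometer".
FEET_TO_METERS = 0.3048

diameter_m = 3200 * FEET_TO_METERS   # ~975 m across
radius_km = diameter_m / 2 / 1000    # ~0.49 km

print(round(diameter_m), round(radius_km, 2))  # 975 0.49
```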

Crimean amateur astronomer Gennady Borisov discovered the comet on Aug. 30, 2019, and reported the position measurements to the International Astronomical Union's Minor Planet Center in Cambridge, Massachusetts. The Center for Near-Earth Object Studies at NASA's Jet Propulsion Laboratory in Pasadena, California, working with the Minor Planet Center, computed an orbit for the comet, which shows that it came from elsewhere in our Milky Way galaxy, point of origin unknown.

Read more at Science Daily

Dec 12, 2019

When penguins ruled after dinosaurs died

What waddled on land but swam supremely in subtropical seas more than 60 million years ago, after the dinosaurs were wiped out on sea and land?

Fossil records show giant human-sized penguins flew through Southern Hemisphere waters -- alongside smaller forms, similar in size to some species that live in Antarctica today.

Now the newly described Kupoupou stilwelli has been found on the geographically remote Chatham Islands in the southern Pacific near New Zealand's South Island. It appears to be the oldest penguin known with proportions close to its modern relatives.

It lived between 62.5 million and 60 million years ago at a time when there was no ice cap at the South Pole and the seas around New Zealand were tropical or subtropical.

Flinders University PhD palaeontology candidate and University of Canterbury graduate Jacob Blokland made the discovery after studying fossil skeletons collected from Chatham Island between 2006 and 2011.

He helped build a picture of an ancient penguin that bridges a gap between extinct giant penguins and their modern relatives.

"Next to its colossal human-sized cousins, including the recently described monster penguin Crossvallia waiparensis, Kupoupou was comparatively small -- no bigger than modern King Penguins which stand just under 1.1 metres tall," says Mr Blokland, who worked with Professor Paul Scofield and Associate Professor Catherine Reid, as well as Flinders palaeontologist Associate Professor Trevor Worthy on the discovery.

"Kupoupou also had proportionally shorter legs than some other early fossil penguins. In this respect, it was more like the penguins of today, meaning it would have waddled on land.

"This penguin is the first that has modern proportions both in terms of its size and in its hind limb and foot bones (the tarsometatarsus) or foot shape."

As published in the US journal Palaeontologia Electronica, the animal's scientific name acknowledges the Indigenous Moriori people of the Chatham Islands (Rēkohu), with Kupoupou meaning 'diving bird' in Te Re Moriori.

The discovery may even link the origins of penguins themselves to the eastern region of New Zealand -- from the Chatham Islands archipelago to the eastern coast of the South Island, where other of the most ancient penguin fossils have been found, 800 km away.

University of Canterbury adjunct Professor Scofield, Senior Curator of Natural History at the Canterbury Museum in Christchurch, says the paper provides further support for the theory that penguins rapidly evolved shortly after the period when dinosaurs still walked the land and giant marine reptiles swam in the sea.

"We think it's likely that the ancestors of penguins diverged from the lineage leading to their closest living relatives -- such as albatross and petrels -- during the Late Cretaceous period, and then many different species sprang up after the dinosaurs were wiped out," Professor Scofield says.

"It's not impossible that penguins lost the ability to fly and gained the ability to swim after the extinction event of 66 million years ago, implying the birds underwent huge changes in a very short time. If we ever find a penguin fossil from the Cretaceous period, we'll know for sure."

Read more at Science Daily

Study of elephant, capybara, human hair finds that thicker hair isn't always stronger

Despite being four times thicker than human hair, elephant hair is only half as strong -- that's just one finding from researchers studying the hair strength of many different mammals. Their work, appearing in a paper publishing December 11 in the journal Matter, shows that thin hair tends to be stronger than thick hair because of the way that it breaks.

"We were very surprised by the result," says first author Wen Yang, a nanoengineering researcher at the University of California, San Diego. "Because, intuitively, we would think thick hair is stronger. Natural materials have undergone thousands of years of evolution, so to us, these materials are very well developed. We hope to learn from nature and develop synthetic products with comparable properties."

Previous studies have found that human hair has strength comparable to that of steel when adjusted for density. This is because of hair's hierarchical structure: human hair is composed of an outer layer called the cuticle that wraps around an inner cortex made of many small fibers linked by chemical bonds. Within each fiber, there are even smaller fibers embedded. This structural design allows hair, which is made of proteins, to be resistant to deformation.

Yang and her team, including researchers from the Meyers and Ritchie groups at University of California, San Diego, and University of California, Berkeley, were curious if hair from other animals shares similar characteristics. They collected hair samples from eight different mammals, including humans, bears, boars, horses, capybaras, javelinas, giraffes, and elephants. These hairs vary in thickness: human hair is as thin as 80 µm in diameter, while those of elephants and giraffes are over 350 µm in diameter.
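These diameters line up with the "four times thicker" comparison made at the top of the article; a quick check:

```python
# Check that the quoted diameters match the earlier "four times
# thicker" comparison: elephant/giraffe hair (over 350 µm) vs
# the thinnest human hair (80 µm).
human_um = 80      # thinnest human hair diameter quoted
elephant_um = 350  # elephant/giraffe hair diameter quoted

ratio = elephant_um / human_um
print(round(ratio, 2))  # 4.38
```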

The researchers tied individual strands of hair to a machine that gradually pulled them apart until they broke. To their surprise, they found thin hair was able to endure greater tension before it broke compared to thick hair. This also applied to hairs from the same species. For example, thin hair from a child was stronger than thicker hair from an adult.

By studying the broken hairs using a scanning electron microscope, the team found that although most hairs share a similar structure, they broke in different ways. Hairs with a diameter greater than 200 µm, such as those of boars, giraffes and elephants, tended to break in a normal fracture mode -- a clean break, similar to what happens when a banana snaps in the middle. Hairs thinner than 200 µm, such as those of humans, horses and bears, broke in a shear mode: the break is uneven, like a tree branch snapped in a storm. The distinction in cracking path arises because the structural elements in different hairs interact differently.

"Shearing is when small zig-zag cracks are formed within the material as a result of stress," Yang says. "These cracks then propagate, and for some biological materials, the sample isn't completely broken until the small cracks meet. If a material shears, it means it can withstand greater tension and thus is tougher than a material that experiences a normal fracture."

"The notion of thick being weaker than thin is not unusual, and we have found that happening when studying brittle materials like metal wires," says co-author Robert Ritchie at the University of California, Berkeley. "This is actually a statistical thing: a bigger piece will have a greater possibility of having a defect. It's a bit surprising to see this in hair, as hair is not a brittle material, but we think it's because of the same reason."
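Ritchie's statistical point -- a bigger piece is more likely to contain a critical defect -- is the classic weakest-link argument, and it can be sketched with a small simulation. The flaw counts and strength distribution below are illustrative assumptions, not measurements from the study:

```python
# Weakest-link sketch: a strand fails at its worst flaw, and a
# bigger strand samples more flaws, so on average its weakest
# flaw is weaker. All numbers are illustrative.
import random

random.seed(0)

def strand_strength(n_flaws):
    """Strength of one strand = the strength of its weakest flaw."""
    return min(random.uniform(0.5, 1.0) for _ in range(n_flaws))

def mean_strength(n_flaws, trials=20000):
    """Average strength over many simulated strands."""
    return sum(strand_strength(n_flaws) for _ in range(trials)) / trials

thin = mean_strength(n_flaws=10)    # thin strand: fewer flaws
thick = mean_strength(n_flaws=100)  # thick strand: more flaws

assert thick < thin  # more flaws -> weaker weakest link
print(round(thin, 3), round(thick, 3))
```

For brittle materials this size effect is usually formalized with Weibull statistics; the surprise reported here is that hair, which is not brittle, shows a similar trend.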

The researchers believe that their findings could help scientists design better synthetic materials. But Yang says her team's bio-inspired materials manufacturing is still in its infancy. Current technologies are not yet able to create materials that are as fine as hair and have a sophisticated hierarchical structure.

"There are many challenges in synthetic materials we haven't had a solution for, from how to manufacture very tiny materials to how to replicate the bonds between each layer as seen in natural hair," Yang says. "But if we can create metals that have a hierarchical structure like that of hair, we could produce very strong materials, which could be used as rescue ropes and in construction."

Read more at Science Daily

Scientists discover key neural circuit regulating alcohol consumption

Scientists have known that a region of the brain called the central nucleus of the amygdala (CeA) plays a role in behaviors related to alcohol use and consumption in general. It's been less known which precise populations of brain cells and their projections to other brain regions mediate these behaviors. Now, UNC School of Medicine scientists discovered that specific neurons in the CeA contribute to reward-like behaviors, alcohol consumption in particular.

Published in the Journal of Neuroscience, this research pinpoints a specific neural circuit that, when altered, caused animal models to drink less alcohol.

"The fact that these neurons promote reward-like behavior, that extremely low levels of alcohol consumption activate these cells, and that activation of these neurons drives alcohol drinking in animals without extensive prior drinking experience suggests that they may be important for early alcohol use and reward," said senior author Zoe McElligott, PhD, assistant professor of psychiatry and pharmacology. "It's our hope that by understanding the function of this circuit, we can better predict what happens in the brains of people who transition from casual alcohol use to subsequent abuse of alcohol, and the development of alcohol use disorders."

McElligott, who is also a member of the UNC Bowles Center for Alcohol Studies, set out to investigate if a population of neurons that express a specific neuropeptide (neurotensin or NTS) contributes to reward-like behaviors and alcohol drinking. She was especially interested in these neurons in the context of inexperienced alcohol use, such as when a person first begins to drink alcohol. Also, NTS neurons are a subpopulation of other neurons in this CeA brain region that have been implicated in anxiety and fear -- known as the somatostatin and corticotropin releasing factor neurons.

Using modern genetic and viral technologies in male mice, McElligott and colleagues found that selectively lesioning or ablating the NTS neurons in the CeA, while maintaining other types of CeA neurons, would cause the animals to drink less alcohol. This manipulation did not alter anxiety-like behavior. It also did not affect the consumption of other palatable liquids such as sucrose, saccharin, and bitter quinine solutions.

"We found that these NTS neurons in the CeA send a strong projection to the hindbrain, where they inhibit the parabrachial nucleus, near the brainstem," McElligott said.

Using optogenetics -- a technique where light activates these neurons -- the researchers stimulated the terminal projections of the CeA-NTS neurons in the parabrachial and found that this stimulation inhibited the neurons in the parabrachial. When the scientists stimulated this projection with a laser in one half of the animal's box, animals would spend more time where the stimulation would occur.

Animals also learned to perform a task to get the laser stimulation to turn on, and they would do this repeatedly, suggesting that they found this stimulation to be rewarding.

"Furthermore, when we stimulated this projection, animals would drink more alcohol as compared to when they had an opportunity to drink alcohol without laser stimulation," McElligott said. "In contrast to our study where we ablated the NTS neurons, laser stimulation of this parabrachial pathway also caused the animals to consume caloric and non-caloric sweetened beverages. When the animals were presented with regular food and a sweet food, however, laser stimulation did not enhance the consumption regardless of the mouse's hunger state. This suggests that different circuits may regulate the consumption of rewarding fluids and solids."

McElligott and her graduate student María Luisa Torruella Suarez, the first author of this study, hope to explore how alcohol experience may change these neurons over time.

"Would these cells respond differently after animals have been drinking high quantities of alcohol over time?" McElligott said. "We also want to discover which populations of neurons in the parabrachial are receiving inputs from these neurons. Fully understanding this circuit could be the key to developing therapeutics to help people with alcohol use disorders."

Read more at Science Daily

Earth was stressed before dinosaur extinction

New evidence gleaned from Antarctic seashells confirms that Earth was already unstable before the asteroid impact that wiped out the dinosaurs.

The study, led by researchers at Northwestern University, is the first to measure the calcium isotope composition of fossilized clam and snail shells, which date back to the Cretaceous-Paleogene mass extinction event. The researchers found that -- in the run-up to the extinction event -- the shells' chemistry shifted in response to a surge of carbon in the oceans.

This carbon influx was likely due to long-term eruptions from the Deccan Traps, a 200,000-square-mile volcanic province located in modern India. During the years leading up to the asteroid impact, the Deccan Traps spewed massive amounts of carbon dioxide (CO2) into the atmosphere. The rising CO2 concentration acidified the oceans, directly affecting the organisms living there.

"Our data suggest that the environment was changing before the asteroid impact," said Benjamin Linzmeier, the study's first author. "Those changes appear to correlate with the eruption of the Deccan Traps."

"The Earth was clearly under stress before the major mass extinction event," said Andrew D. Jacobson, a senior author of the paper. "The asteroid impact coincides with pre-existing carbon cycle instability. But that doesn't mean we have answers to what actually caused the extinction."

The study will be published in the January 2020 issue of the journal Geology, which comes out later this month.

Jacobson is a professor of Earth and planetary sciences in Northwestern's Weinberg College of Arts and Sciences. Linzmeier was a postdoctoral researcher with the Ubben Program for Climate and Carbon Science at the Institute for Sustainability and Energy at Northwestern when the research was conducted. He is now a postdoctoral fellow at the University of Wisconsin-Madison in the Department of Geoscience.

'Each shell is a snapshot'

Previous studies have explored the potential effects of the Deccan Traps eruptions on the mass extinction event, but many have examined bulk sediments and used different chemical tracers. By focusing on a specific organism, the researchers gained a more precise, higher-resolution record of the ocean's chemistry.

"Shells grow quickly and change with water chemistry," Linzmeier said. "Because they live for such a short period of time, each shell is a short, preserved snapshot of the ocean's chemistry."

Seashells are mostly composed of calcium carbonate, the same mineral found in chalk, limestone and some antacid tablets. Carbon dioxide dissolved in seawater dissolves calcium carbonate, and during shell formation CO2 likely affects shell composition even without dissolving the shells outright.

For this study, the researchers examined shells collected from the Lopez de Bertodano Formation, a well-preserved, fossil-rich area on the west side of Seymour Island in Antarctica. They analyzed the shells' calcium isotope compositions using a state-of-the-art technique developed in Jacobson's laboratory at Northwestern. The method involves dissolving shell samples to separate calcium from various other elements, followed by analysis with a mass spectrometer.

"We can measure calcium isotope variations with high precision," Jacobson said. "And those isotope variations are like fingerprints to help us understand what happened."

Using this method, the team found surprising information.

"We expected to see some changes in the shells' composition, but we were surprised by how quickly the changes occurred," Linzmeier said. "We also were surprised that we didn't see more change associated with the extinction horizon itself."

A future warning

The researchers said that understanding how the Earth responded to past extreme warming and CO2 input can help us prepare for how the planet will respond to current, human-caused climate change.

"To some degree, we think that ancient ocean acidification events are good analogs for what's happening now with anthropogenic CO2 emissions," Jacobson said. "Perhaps we can use this work as a tool to better predict what might happen in the future. We can't ignore the rock record. The Earth system is sensitive to large and rapid additions of CO2. Current emissions will have environmental consequences."

Read more at Science Daily

Dec 11, 2019

ALMA spots most distant dusty galaxy hidden in plain sight

Astronomers using the Atacama Large Millimeter/submillimeter Array (ALMA) have spotted the light of a massive galaxy seen only 970 million years after the Big Bang. This galaxy, called MAMBO-9, is the most distant dusty star-forming galaxy that has ever been observed without the help of a gravitational lens.
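The quoted cosmic age can be tied to a redshift under standard cosmology. A rough sketch, assuming flat Lambda-CDM with Planck-like parameters (the article does not state a cosmology), shows that an age of 0.97 billion years corresponds to a redshift near 6:

```python
import math

# Assumed flat-LCDM parameters (Planck-like); not specified by the article
H0 = 67.7                      # Hubble constant, km/s/Mpc
OMEGA_M = 0.31                 # matter density
OMEGA_L = 1.0 - OMEGA_M        # dark-energy density
HUBBLE_TIME_GYR = 977.79 / H0  # 1/H0 converted to Gyr

def age_at_redshift(z, steps=20000):
    """Cosmic age in Gyr at redshift z, by midpoint integration over scale factor."""
    a_max = 1.0 / (1.0 + z)
    da = a_max / steps
    total = 0.0
    for i in range(steps):
        a = (i + 0.5) * da
        total += da / math.sqrt(OMEGA_M / a + OMEGA_L * a * a)
    return HUBBLE_TIME_GYR * total

def redshift_at_age(t_gyr):
    """Bisect for the redshift at which the universe had age t_gyr."""
    lo, hi = 0.0, 20.0  # age decreases monotonically with redshift
    for _ in range(50):
        mid = 0.5 * (lo + hi)
        if age_at_redshift(mid) > t_gyr:
            lo = mid  # universe still older than target: look further back
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(f"z at 0.97 Gyr ~ {redshift_at_age(0.97):.2f}")
```

The result, just under z = 6, matches the era described: light emitted when the universe was under a billion years old.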

Dusty star-forming galaxies are the most intense stellar nurseries in the universe. They form stars at a rate up to a few thousand times the mass of the Sun per year (the star-forming rate of our Milky Way is just three solar masses per year) and they contain massive amounts of gas and dust. Such monster galaxies are not expected to have formed early in the history of the universe, but astronomers have already discovered several of them as seen when the cosmos was less than a billion years old. One of them is galaxy SPT0311-58, which ALMA observed in 2018.

Because of their extreme behavior, astronomers think that these dusty galaxies play an important role in the evolution of the universe. But finding them is easier said than done. "These galaxies tend to hide in plain sight," said Caitlin Casey of the University of Texas at Austin and lead author of a study published in the Astrophysical Journal. "We know they are out there, but they are not easy to find because their starlight is hidden in clouds of dust."

MAMBO-9's light was already detected ten years ago by co-author Manuel Aravena, using the Max-Planck Millimeter BOlometer (MAMBO) instrument on the IRAM 30-meter telescope in Spain and the Plateau de Bure Interferometer in France. But these observations were not sensitive enough to reveal the distance of the galaxy. "We were in doubt if it was real, because we couldn't find it with other telescopes. But if it was real, it had to be very far away," says Aravena, who was at that time a PhD student in Germany and is currently working for the Universidad Diego Portales in Chile.

Thanks to ALMA's sensitivity, Casey and her team have now been able to determine the distance of MAMBO-9. "We found the galaxy in a new ALMA survey specifically designed to identify dusty star-forming galaxies in the early universe," said Casey. "And what is special about this observation is that this is the most distant dusty galaxy we have ever seen in an unobstructed way."

The light of distant galaxies is often obstructed by other galaxies closer to us. These galaxies in front work as a gravitational lens: they bend the light from the more distant galaxy. This lensing effect makes it easier for telescopes to spot distant objects (this is how ALMA could see galaxy SPT0311-58). But it also distorts the image of the object, making it harder to make out the details.

In this study, the astronomers saw MAMBO-9 directly, without a lens, and this allowed them to measure its mass. "The total mass of gas and dust in the galaxy is enormous: ten times more than all the stars in the Milky Way. This means that it has yet to build most of its stars," Casey explained. The galaxy consists of two parts, and it is in the process of merging.

Casey hopes to find more distant dusty galaxies in the ALMA survey, which will give insight into how common they are, how these massive galaxies formed so early in the universe, and why they are so dusty. "Dust is normally a by-product of dying stars," she said. "We expect one hundred times more stars than dust. But MAMBO-9 has not produced that many stars yet and we want to find out how dust can form so fast after the Big Bang."

"Observations with new and more capable technology can produce unexpected findings like MAMBO-9," said Joe Pesce, National Science Foundation Program Officer for NRAO and ALMA. "While it is challenging to explain such a massive galaxy so early in the history of the universe, discoveries like this allow astronomers to develop an improved understanding of, and ask ever more questions about, the universe."

Read more at Science Daily

The secret to a long life? For worms, a cellular recycling protein is key

Longevity clock concept illustration.
Scientists at Sanford Burnham Prebys Medical Discovery Institute have shown that worms live longer lives if they produce excess levels of a protein, p62, which recognizes toxic cell proteins that are tagged for destruction. The discovery, published in Nature Communications, could help uncover treatments for age-related conditions, such as Alzheimer's disease, which are often caused by accumulation of misfolded proteins.

"Research, including our own, has shown that lifespan can be extended by enhancing autophagy -- the process cells use to degrade and recycle old, broken and damaged cell components," says Malene Hansen, Ph.D., a professor in the Development, Aging and Regeneration Program at Sanford Burnham Prebys and senior author of the study. "Prior to this work, we understood that autophagy as a process was linked to aging, but the impact of p62, a selective autophagy protein, on longevity was unknown."

Scientists used to think that cellular recycling worked the same way for all waste products. In recent years, researchers have learned that autophagy can be highly selective -- meaning the cell uses distinct "recycling trucks," such as the protein p62, to deliver different types of trash to cellular "recycling centers." For example, p62 is known to selectively deliver aggregated proteins and worn-out mitochondria (the power plants of the cell) to recycling centers.

To better understand p62's role in cellular recycling and longevity, the scientists used short-lived, transparent roundworms called C. elegans for their studies. Previously, Hansen's team found that levels of p62 are increased after a short heat shock is administered to the worms. This proved to be beneficial to the animals and required for the longevity that is caused by mild heat stress.

These findings prompted the scientists to genetically engineer C. elegans to produce excess levels of the protein p62. Instead of their usual three-week lifespan, these worms lived for a month -- equivalent to a 20 to 30% lifespan extension. The researchers were intrigued to find that by increasing the levels of p62, the "recycling truck," the "recycling centers" became more abundant and were able to recycle more "trash," indicating that p62 is a driver of the recycling process.

"Now that we have confirmed that selective autophagy is important for longevity, we can move to our next step: identifying what harmful cellular 'trash' it is removing. With this knowledge, we hope to target specific cell components that are risk factors for longevity," says Caroline Kumsta, Ph.D., a research assistant professor in Hansen's lab and lead author of the study.

Many age-related diseases, including Alzheimer's and Huntington's disease, are caused by accumulation of toxic, misfolded proteins. Hansen and Kumsta previously showed that increased levels of p62 were able to improve lifespan in a C. elegans Huntington's disease model. Scientists are hopeful that studying selective autophagy via proteins like p62 could lead to therapies that clear the proteins that are detrimental to living a long, healthy life. Finding possible therapeutic avenues to age-related diseases is increasingly important as the U.S. population ages: In about a decade, about 20 percent of Americans -- about 71 million people -- will be 65 and older and at higher risk for chronic diseases.

While the scientists see much potential in their findings -- and are encouraged that the benefits of increasing p62 levels on longevity seem to be evolutionarily conserved as a recent study in fruit flies demonstrated -- they urge caution for direct translations to humans: High levels of p62 have been shown to be associated with cancer in humans.

Read more at Science Daily

NASA's treasure map for water ice on Mars

Mars illustration.
NASA has big plans for returning astronauts to the Moon in 2024, a stepping stone on the path to sending humans to Mars. But where should the first people on the Red Planet land?

A new paper published in Geophysical Research Letters will help by providing a map of water ice believed to be as little as an inch (2.5 centimeters) below the surface.

Water ice will be a key consideration for any potential landing site. With little room to spare aboard a spacecraft, any human missions to Mars will have to harvest what's already available for drinking water and making rocket fuel.

NASA calls this concept "in situ resource utilization," and it's an important factor in selecting human landing sites on Mars. Satellites orbiting Mars are essential in helping scientists determine the best places for building the first Martian research station. The authors of the new paper make use of data from two of those spacecraft, NASA's Mars Reconnaissance Orbiter (MRO) and Mars Odyssey orbiter, to locate water ice that could potentially be within reach of astronauts on the Red Planet.

"You wouldn't need a backhoe to dig up this ice. You could use a shovel," said the paper's lead author, Sylvain Piqueux of NASA's Jet Propulsion Laboratory in Pasadena, California. "We're continuing to collect data on buried ice on Mars, zeroing in on the best places for astronauts to land."

Buried Treasure on Mars

Liquid water can't last in the thin air of Mars; with so little air pressure, exposed ice sublimates, passing directly from a solid to a gas.
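The phase argument can be made concrete with water's triple point: below roughly 611.7 Pa, no liquid phase is stable at any temperature, so ice goes straight to vapor. The Mars figure below is a commonly quoted planetary average, an assumption rather than a number from the article:

```python
# Reference values (assumptions, not from the article)
TRIPLE_POINT_PA = 611.657      # water's triple-point pressure
MARS_MEAN_SURFACE_PA = 610.0   # rough Martian average surface pressure
EARTH_SEA_LEVEL_PA = 101_325.0

def liquid_water_stable(pressure_pa):
    """Below the triple-point pressure no liquid phase of water exists at
    any temperature; ice sublimates directly to vapor."""
    return pressure_pa > TRIPLE_POINT_PA

print(liquid_water_stable(MARS_MEAN_SURFACE_PA))  # False
print(liquid_water_stable(EARTH_SEA_LEVEL_PA))    # True
```

Mars sits almost exactly at the threshold on average, which is why buried ice can persist while surface ice is quickly lost.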

Martian water ice is locked away underground throughout the planet's mid-latitudes. Regions closer to the poles have been studied by NASA's Phoenix lander, which scraped up ice, and MRO, which has taken many images from space of meteor impacts that have excavated this ice. To find ice that astronauts could easily dig up, the study's authors relied on two heat-sensitive instruments: MRO's Mars Climate Sounder and the Thermal Emission Imaging System (THEMIS) camera on Mars Odyssey.

Why use heat-sensitive instruments when looking for ice? Buried water ice changes the temperature of the Martian surface. The study's authors cross-referenced temperatures suggestive of ice with other data, such as reservoirs of ice detected by radar or seen after meteor impacts. Data from Odyssey's Gamma Ray Spectrometer, which is tailor-made for mapping water ice deposits, were also useful.

As expected, all these data suggest a trove of water ice throughout the Martian poles and mid-latitudes. But the map reveals particularly shallow deposits that future mission planners may want to study further.

Picking a Landing Site

While there are lots of places on Mars scientists would like to visit, few would make practical landing sites for astronauts. Most scientists have homed in on the northern and southern mid-latitudes, which have more plentiful sunlight and warmer temperatures than the poles. But there's a heavy preference for landing in the northern hemisphere, which is generally lower in elevation and provides more atmosphere to slow a landing spacecraft.

A large portion of a region called Arcadia Planitia is the most tempting target in the northern hemisphere. The map shows lots of blue and purple in this region, representing water ice less than one foot (30 centimeters) below the surface; warm colors are over two feet (60 centimeters) deep. Sprawling black zones on the map represent areas where a landing spacecraft would sink into fine dust.

What's Next?

Piqueux is planning a comprehensive campaign to continue studying buried ice across different seasons, watching how the abundance of this resource changes over time.

"The more we look for near-surface ice, the more we find," said MRO Deputy Project Scientist Leslie Tamppari of JPL. "Observing Mars with multiple spacecraft over the course of years continues to provide us with new ways of discovering this ice."

Read more at Science Daily

Greenland ice losses rising faster than expected

Melting icebergs by the coast of Greenland.
Greenland is losing ice seven times faster than in the 1990s and is tracking the Intergovernmental Panel on Climate Change's high-end climate warming scenario, which would see 40 million more people exposed to coastal flooding by 2100.

A team of 96 polar scientists from 50 international organisations have produced the most complete picture of Greenland ice loss to date. The Ice Sheet Mass Balance Inter-comparison Exercise (IMBIE) Team combined 26 separate surveys to compute changes in the mass of Greenland's ice sheet between 1992 and 2018. Altogether, data from 11 different satellite missions were used, including measurements of the ice sheet's changing volume, flow and gravity.
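One standard statistical tool for reconciling independent surveys of the same quantity is the inverse-variance weighted mean, which favors the more precise estimates. This is only a minimal sketch with invented numbers; the actual IMBIE reconciliation is considerably more sophisticated:

```python
import math

def combine(estimates):
    """Inverse-variance weighted mean of independent (value, sigma) pairs.
    A toy sketch; the real IMBIE methodology is far more involved."""
    weights = [1.0 / sigma ** 2 for _, sigma in estimates]
    mean = sum(w * value for w, (value, _) in zip(weights, estimates)) / sum(weights)
    combined_sigma = math.sqrt(1.0 / sum(weights))
    return mean, combined_sigma

# Hypothetical mass-balance estimates in Gt/yr (illustrative numbers only)
surveys = [(-250.0, 30.0), (-260.0, 40.0), (-245.0, 25.0)]
mean, sigma = combine(surveys)
print(f"combined: {mean:.0f} +/- {sigma:.0f} Gt/yr")
```

Note that the combined uncertainty is smaller than that of any single survey, which is the point of pooling 26 of them.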

The findings, published today in Nature, show that Greenland has lost 3.8 trillion tonnes of ice since 1992 -- enough to push global sea levels up by 10.6 millimetres. The rate of ice loss has risen from 33 billion tonnes per year in the 1990s to 254 billion tonnes per year in the last decade -- a seven-fold increase within three decades.
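The mass-to-sea-level conversion behind these figures is straightforward: spread the meltwater volume over the global ocean surface, roughly 3.6 x 10^14 square metres. A sketch using approximate constants:

```python
OCEAN_AREA_M2 = 3.618e14  # approximate global ocean surface area

def sea_level_rise_mm(ice_gt):
    """Global-mean sea-level rise (mm) from melting ice_gt gigatonnes of ice,
    assuming the meltwater spreads evenly over the ocean surface."""
    volume_m3 = ice_gt * 1e9                # 1 Gt of water occupies ~1e9 m^3
    return volume_m3 / OCEAN_AREA_M2 * 1e3  # metres -> millimetres

print(f"{sea_level_rise_mm(3800):.1f} mm")  # 3.8 trillion tonnes = 3800 Gt
```

This lands at about 10.5 mm, within rounding of the paper's 10.6 mm; the small gap comes from the approximate ocean area used here.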

The assessment, led by Professor Andrew Shepherd at the University of Leeds and Dr Erik Ivins at NASA's Jet Propulsion Laboratory in California, was supported by the European Space Agency (ESA) and the US National Aeronautics and Space Administration (NASA).

In 2013, the Intergovernmental Panel on Climate Change (IPCC) predicted that global sea levels will rise by 60 centimetres by 2100, putting 360 million people at risk of annual coastal flooding. But this new study shows that Greenland's ice losses are rising faster than expected and are instead tracking the IPCC's high-end climate warming scenario, which predicts 7 centimetres more.

Professor Shepherd said: "As a rule of thumb, for every centimetre rise in global sea level another six million people are exposed to coastal flooding around the planet."

"On current trends, Greenland ice melting will cause 100 million people to be flooded each year by the end of the century, so 400 million in total due to all sea level rise."

"These are not unlikely events or small impacts; they are happening and will be devastating for coastal communities."

The team also used regional climate models to show that half of the ice losses were due to surface melting as air temperatures have risen. The other half has been due to increased glacier flow, triggered by rising ocean temperatures.

Ice losses peaked at 335 billion tonnes per year in 2011 -- ten times the rate of the 1990s -- during a period of intense surface melting. Although the rate of ice loss dropped to an average 238 billion tonnes per year since then, this remains seven times higher and does not include all of 2019, which could set a new high due to widespread summer melting.

Dr Ivins said: "Satellite observations of polar ice are essential for monitoring and predicting how climate change could affect ice losses and sea level rise."

"While computer simulation allows us to make projections from climate change scenarios, the satellite measurements provide prima facie, rather irrefutable, evidence."

"Our project is a great example of the importance of international collaboration to tackle problems that are global in scale."

Guðfinna Aðalgeirsdóttir, Professor of Glaciology at the University of Iceland and lead author of the Intergovernmental Panel on Climate Change's sixth assessment report, who was not involved in the study, said:

"The IMBIE Team's reconciled estimate of Greenland ice loss is timely for the IPCC. Their satellite observations show that both melting and ice discharge from Greenland have increased since observations started."

"The ice caps in Iceland had similar reduction in ice loss in the last two years of their record, but this last summer was very warm here and resulted in higher loss. I would expect a similar increase in Greenland mass loss for 2019."

Read more at Science Daily

Dec 10, 2019

Community characteristics shape climate change discussions after extreme weather

Political affiliations, the presence of local environmental organizations and prior local media coverage of climate change play a role in how a community reacts to an extreme weather event, an article published today in Nature Climate Change concludes.

"Extreme weather events such as a catastrophic wildfire, a 500-year flood or a record-breaking heatwave may result in some local discussion and action around climate change, but not as much as might be expected and not in every community," said Hilary Boudet, the paper's lead author and an associate professor of public policy in Oregon State University's School of Public Policy in the College of Liberal Arts.

"In terms of making links to climate change, local reactions to an extreme weather event depend both on aspects of the event itself, but also on political leanings and resources within the community before the event took place."

Boudet's work is part of a growing body of research exploring the links between personal experience with an extreme weather event and social mobilization around climate change. The study was part of a project examining community reactions to extreme weather in the U.S.

The researchers sought to better understand how extreme weather events might influence local attitudes and actions related to climate change.

Researchers conducted 164 interviews with local residents and community leaders in 15 communities across the United States that had experienced extreme weather events that caused at least four fatalities between 2012 and 2015. They also analyzed media coverage to better understand what kinds of public discussions, actions and policies around climate change occurred in those communities.

Of the 15 communities, nine showed evidence of public discussion about the event's connection to climate change in the wake of the disaster.

"Although many of the extreme events we studied spurred significant emergency response from volunteers and donations for rescue and recovery efforts, we found these events sparked little mobilization around climate change," Boudet said. "Yet there was also a distinct difference between cases where community climate change discussion occurred and where it did not, allowing us to trace pathways to that discussion."

When there was some scientific certainty that the weather event was related to climate change, discussion about the connection was more likely to occur, particularly in communities that leaned Democratic or where residents were highly educated, Boudet said.

However, even in communities where climate change was discussed in relation to the weather event, it was often a marginal issue. Other more immediate concerns, such as emergency response management and economic recovery, generated far more discussion and subsequent action.

Some of those interviewed suggested that broaching the topic of climate change amid disaster recovery efforts could be interpreted as using a tragedy to advance a political agenda, Boudet said.

"Recent shifts in U.S. opinions on climate change suggest that it may become a more acceptable topic of conversation following an extreme weather event," Boudet said. "Yet our results indicate that it may take time for such discussions to take place, particularly in Republican-leaning communities."

While the work challenges the notion that a single extreme weather event will yield rapid local social mobilization around climate change, the researchers found that communities may still make important changes post-event to ensure more effective responses to future events.

Read more at Science Daily

What blocks bird flu in human cells?

Normally, bird flu viruses do not spread easily from person to person. But if this does happen, it could trigger a pandemic. Researchers from the MDC and RKI have now explained in the journal Nature Communications what makes the leap from animals to humans less likely.

Whenever people suddenly become infected with a bird flu virus such as H5N1, H7N9, and H5N6, the World Health Organization (WHO) has to assess the risk: Are these the first signs of a pandemic? Or is it just a few dozen or hundred cases that have only arisen through close contact with infected poultry? Researchers led by Professor Matthias Selbach from the Max Delbrueck Center for Molecular Medicine (MDC) have now found another piece of the puzzle that may be important in this initial assessment. In a paper published in Nature Communications, the researchers explain that avian influenza A viruses (IAVs) are unable to transform infected human cells into effective virus factories, because they do not produce enough of the matrix protein M1 following infection. The virus requires this protein, however, to export its many copies of its genetic material from the cell nucleus -- a prerequisite for building new viruses.

Not all flu is the same -- the name refers to a large family of viruses. Each member of this family is named after two prickly growths on the virus's surface: hemagglutinin (H), which enables the virus to infect human and animal cells where it can multiply, and neuraminidase (N), which helps the virus's offspring to extract themselves from the infected cell. In waterfowl, there are 16 known hemagglutinin subtypes and nine known neuraminidase subtypes. That results in at least 144 possible combinations that are constantly changing and adapting to new hosts -- like chickens, for example, but also mammals including horses, pigs, and humans.
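The combinatorics in this paragraph are easy to enumerate directly:

```python
from itertools import product

# 16 hemagglutinin (H) and 9 neuraminidase (N) subtypes known in waterfowl
subtypes = [f"H{h}N{n}" for h, n in product(range(1, 17), range(1, 10))]

print(len(subtypes))                            # 144
print("H5N1" in subtypes, "H7N9" in subtypes)   # True True
```

All the bird flu strains named earlier in the article (H5N1, H7N9, H5N6) appear among these 144 combinations.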

Such new virus variants are often more dangerous than seasonal flu, because the human immune system has never encountered them before. Some people find themselves defenseless, while the immune system of others reacts so violently that the person's own resistance damages the body. In the worst case scenario, a pandemic could cost millions of lives. The Spanish flu of 1918, for example, claimed more than 50 million victims. Researchers around the world are therefore trying to understand the rules that determine when there is the possibility of a pandemic, and when there is not.

Why are human cells bad virus factories for bird flu?

"Hemagglutinin in humans and birds has a slightly different chemical structure, for example, which makes it more difficult for an avian influenza virus to infiltrate a human cell than a bird's cell," explains Selbach. Boris Bogdanow, a PhD student in Selbach's research group and the lead author of the current study, focused his research specifically on what other natural species barriers exist in flu viruses.

Matthias Selbach's group analyses proteins using quantitative mass spectrometry. In collaboration with the Robert Koch Institute (RKI), Boris Bogdanow and his colleagues infected human pulmonary epithelial cells separately with a bird flu virus and a human flu virus. They then measured the quantity of all newly produced proteins in the mass spectrometer. Postdoctoral researcher Dr. Katrin Eichelbaum had also developed a method that enables the precise differentiation of new and old proteins. "In the first analysis, we did not find any major differences between the two strains," reports Boris Bogdanow. "At first glance, the avian flu virus and the human virus displayed little difference with regard to protein production, which was quite surprising."

But the devil is in the detail, so Bogdanow performed more in-depth analyses to take a closer look at the protein distribution. In doing so, he came across the matrix protein M1, much larger quantities of which were produced in the lung cells infected with the human virus. The M1 protein is responsible, among other things, for exporting the replicated viral RNA from the nucleus of the infected cells and then assembling it with other newly produced viral proteins to form flu virus offspring. Could it be, therefore, that the viral RNA of bird flu viruses in human cells remains trapped in the cell nucleus because too little M1 protein is present?

Another piece of the puzzle

Fluorescence microscopic investigations confirmed these suspicions. The genetic material of the bird flu virus was far less capable of breaking out of the cell nucleus than the RNA of the human flu virus. But why? With the help of the MDC's sequencing platform and Professor Irmtraud Meyer, they discovered a small segment in the viral RNA of the avian flu virus that affects alternative splicing. "We call this a cis-regulatory element," says Bogdanow. "Alternative splicing regulates which proteins are ultimately made from a single gene, because many genes code for more than one protein. When human cells are attacked by bird flu, this element ensures that more M2 rather than M1 protein is produced."

In order to assess the relevance of this result, Professor Thorsten Wolff and his research team from the Robert Koch Institute transferred the cis-regulatory element from the bird virus to the human virus. This did indeed result in the human flu virus replicating less effectively in human lung cells. Selbach's team even conducted a similar experiment with Spanish flu viruses, whose genetic material was isolated in the nineties from graves in the permafrost soil of Alaska. However, they only used a small part of the viral RNA and not the entire virus for the experiment. Nevertheless, they were also able to confirm their theory on the cis-regulatory element for this virus.

Read more at Science Daily

Ice in motion: Satellites capture decades of change

New time-lapse videos of Earth's glaciers and ice sheets as seen from space -- some spanning nearly 50 years -- are providing scientists with new insights into how the planet's frozen regions are changing. At the annual meeting of the American Geophysical Union in San Francisco, scientists released new time series of images of Alaska, Greenland, and Antarctica using data from satellites including the NASA-U.S. Geological Survey Landsat missions. One series of images illustrates the dramatic changes of Alaska's glaciers and could warn of future retreat of the Hubbard Glacier. Over Greenland, different satellite records show a speed-up of glacial retreat starting in 2000, as well as meltwater ponds spreading to higher elevations in the last decade, which could potentially speed up ice flow. And in Antarctic ice shelves, the view from space could reveal lakes hidden beneath the winter snow.

Using images from the Landsat mission dating back to 1972 and continuing through 2019, glaciologist Mark Fahnestock of the University of Alaska Fairbanks has stitched together six-second time-lapses of every glacier in Alaska and the Yukon.

"We now have this long, detailed record that allows us to look at what's happened in Alaska," Fahnestock said. "When you play these movies, you get a sense of how dynamic these systems are and how unsteady the ice flow is."

The videos clearly illustrate what's happening to Alaska's glaciers in a warming climate, he said, and highlight how different glaciers respond in varied ways. Some show surges that pause for a few years, or lakes forming where ice used to be, or even the debris from landslides making its way to the sea. Other glaciers show patterns that give scientists hints of what drives glacier changes.

The Columbia Glacier, for example, was relatively stable when the first Landsat satellite launched in 1972. But starting in the mid-1980s, the glacier's front began retreating rapidly, and by 2019 it was 12.4 miles (20 kilometers) upstream. In comparison, the Hubbard Glacier has advanced 3 miles (5 km) in the last 48 years. But Fahnestock's time-lapse ends with a 2019 image that shows a large indentation in the glacier, where ice has broken off.
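
The figures above translate into very different average rates of front movement. A quick back-of-the-envelope calculation (the mid-1980s start of Columbia's retreat is approximate, so these rates are rough):

```python
# Back-of-the-envelope rates of glacier front movement from the figures quoted
# in the article. The 1985 start date for Columbia's retreat is approximate.

def rate_km_per_year(distance_km, start_year, end_year):
    """Average rate of front movement over a period, in km per year."""
    return distance_km / (end_year - start_year)

columbia = rate_km_per_year(20, 1985, 2019)  # ~20 km of retreat since the mid-1980s
hubbard = rate_km_per_year(5, 1972, 2019)    # ~5 km of advance over the Landsat record

print(f"Columbia Glacier: ~{columbia:.2f} km/yr retreat")
print(f"Hubbard Glacier:  ~{hubbard:.2f} km/yr advance")
```

Columbia's retreat, averaged over the period, was roughly five times faster than Hubbard's advance.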

"That calving embayment is the first sign of weakness from Hubbard Glacier in almost 50 years -- it's been advancing through the historical record," he said. If such embayments persist in the coming years, it could be a sign that change could be coming to Hubbard, he said: "The satellite images also show that these types of calving embayments were present in the decade before Columbia retreated."

The Landsat satellites have provided the longest continuous record of Earth from space. The USGS has reprocessed old Landsat images, which allowed Fahnestock to handpick the clearest Landsat scenes for each summer, over each glacier. With software and computing power from Google Earth Engine, he created the series of time-lapse videos.

Scientists are using long-term satellite records to look at Greenland glaciers as well. Michalea King of Ohio State University analyzed data from Landsat missions dating back to 1985 to study more than 200 of Greenland's large outlet glaciers. She examined how far the glacier fronts have retreated, how fast the ice flows, and how much ice glaciers are losing over this time span.

She found that Greenland's glaciers retreated an average of about 3 miles (5 km) between 1985 and 2018 -- and that the most rapid retreat occurred between 2000 and 2005. And when she looked at the amount of glacial ice entering the ocean, she found that it was relatively steady for the first 15 years of the record, but then started increasing around 2000.

"These glaciers are calving more ice into the ocean than they were in the past," King said. "There is a very clear relationship between the retreat and increasing ice mass losses from these glaciers during the 1985-through-present record."

While King is analyzing ice lost from the fronts of glaciers, James Lea of the University of Liverpool in the United Kingdom is using satellite data to examine ice melting on top of Greenland's glaciers and ice sheets, which creates meltwater lakes.

These meltwater lakes can be up to 3 miles (5 km) across and can drain through the ice in a matter of hours, Lea said, which can impact how fast the ice flows. With the computing power of Google Earth Engine, Lea analyzed images of the Greenland ice sheet from the Moderate Resolution Imaging Spectroradiometer (MODIS) on the Terra satellite for every day of every melt season over the last 20 years -- more than 18,000 images in all.

"We looked at how many lakes there are per year across the ice sheet and found an increasing trend over the last 20 years: a 27 percent increase in lakes," Lea said. "We're also getting more and more lakes at higher elevations -- areas that we weren't expecting to see lakes in until 2050 or 2060."

When these high-elevation meltwater ponds punch through the ice sheet and drain, it could cause the ice sheet to speed up, he said, thinning the ice and accelerating its demise.

It doesn't always take decades worth of data to study polar features -- sometimes just a year or two will provide insights. The Antarctic ice sheet experiences surface melt, but there are also lakes several meters below the surface, insulated by layers of snow. To see where these subsurface lakes are, Devon Dunmire of the University of Colorado, Boulder, used microwave radar images from the European Space Agency's Sentinel-1 satellite. Snow and ice are basically invisible to microwave radiation, but liquid water strongly absorbs it.

Dunmire's new study, presented at the AGU meeting, found lakes dotting the George VI and Wilkins ice shelves near the Antarctic Peninsula -- even a few that remained liquid throughout the winter months. These hidden lakes might be more common than scientists had thought, she said, noting that she is continuing to look for similar features across the continent's ice shelves.

Read more at Science Daily

Killer whale grandmothers boost survival of calves

Pod of killer whales
Post-menopausal killer whale grandmothers improve the chances of survival for their grand-calves, new research has found.

The study found that grandmothers who were no longer able to reproduce had the biggest beneficial impact on the survival chances of their grand-offspring. This may be because grandmothers without calves of their own are free to focus time and resources on the latest generation, the researchers suggest.

The research team also found that grandmothers had a particularly important role in times of food scarcity, as the impact on a calf of losing a post-menopausal grandmother was highest in years when salmon was scarce.

Previous research has shown that post-reproductive female killer whales are the most knowledgeable and provide an important leadership role for the group when foraging in salmon grounds.

These benefits to the group may help to solve the long-standing mystery of why the menopause has evolved in some species of whales and in humans, the authors of the study say.

Senior author of the study, Dr Dan Franks from the Department of Biology, at the University of York, said: "The study suggests that breeding grandmothers are not able to provide the same level of support as grandmothers who no longer breed. This means that the evolution of menopause has increased a grandmother's capacity to help her grand-offspring.

"The death of a post-menopausal grandmother can have important repercussions for her family group, and this could prove to be an important consideration when assessing the future of these populations. As salmon populations continue to decline, grandmothers are likely to become even more important in these killer whale populations."

The study involved an international research team from the Universities of York and Exeter (UK), the Centre for Whale Research (USA) and Fisheries and Oceans Canada.

The scientists analysed 36 years of data gathered by the Center for Whale Research and Fisheries and Oceans Canada on two populations of resident killer whales. The populations (which include several pods, made up of multiple family groups) live off the North West Pacific Coast of Canada and the US and feed on Chinook salmon.

In resident killer whales, both sons and daughters stay with their mothers for life, but they mate with individuals from a different family group. Male killer whales typically have a shorter lifespan than females with many not surviving beyond 30 years. Females usually stop reproducing in their 30s-40s, but just like humans they can live for many decades following menopause.

Lead author, Dr Stuart Nattrass, from the University of York, added: "The findings help to explain factors that are driving the whales' survival and reproductive success, which is essential information given that the Southern Resident killer whales -- one of the whale populations under study -- is listed as endangered and at risk of extinction.

"We suspect when breeding grandmothers are supporting their own calves, their movement and activity patterns are constrained and they are not able to provide support and leadership in the same way as post-menopausal females. Also, grandmothers with their own calves will be busy caring for their own calves, and be able to invest less in their grand-offspring, compared to post-menopausal grandmothers.

"We are currently conducting observational studies with drones to directly study helping behaviour between family members in these killer whales."

Co-author of the study, Prof Darren Croft from the University of Exeter, said: "The menopause has only evolved in humans, killer whales and three other species of toothed whales, and understanding why females of these species stop reproducing well before the end of life is a long-standing evolutionary puzzle."

Read more at Science Daily

Dec 9, 2019

Explaining the tiger stripes of Saturn's moon Enceladus

Saturn's tiny, frozen moon Enceladus is a strange place. Just 300 miles across, the moon is thought to have an outer shell of ice covering a global ocean 20 miles deep, encasing a rocky core. Slashed across Enceladus' south pole are four straight, parallel fissures or "tiger stripes" from which water erupts. These fissures aren't quite like anything else in the Solar System.

"We want to know why the eruptions are located at the south pole as opposed to some other place on Enceladus, how these eruptions can be sustained over long periods of time and finally why these eruptions are emanating from regularly spaced cracks," said Max Rudolph, assistant professor of earth and planetary sciences at the University of California, Davis.

Rudolph and colleagues Douglas Hemingway of the Carnegie Institution for Science in Washington, D.C., and Michael Manga of UC Berkeley now think they have a good explanation for Enceladus' erupting stripes. They used numerical modeling to understand the forces acting on Enceladus' icy shell.

Saturn's gravity exerts tidal forces on Enceladus, which cause heating and cooling of the tiny world. Those forces are strongest at the poles. As liquid water solidifies into ice under the outer ice shell, it expands in volume, putting pressure on the ice until it cracks.

Enceladus' surface temperature is about negative 200 degrees Celsius, so if a crack formed in the ice, you would expect it to freeze shut pretty quickly. Yet the south polar fissures remain open, and in fact reach all the way to the liquid ocean below.

That's because liquid water within the fissure is sloshed around by tidal forces produced by Saturn's gravity, releasing energy as heat, Rudolph said. That stops the crack from freezing shut.

The release of pressure from the fissures stops new cracks from forming elsewhere on the moon, such as at the north pole. But at the same time, water vented from the crack falls back as ice, building up the edges of the fissure and weighing it down a bit. That causes the ice sheet to flex, the researchers calculate, just enough to set off a parallel crack about 20 miles away.

"Our model explains the regular spacing of the cracks," Rudolph said.

From Science Daily

How planets may form after dust sticks together

Scientists may have figured out how dust particles can stick together to form planets, according to a Rutgers co-authored study that may also help to improve industrial processes.

In homes, adhesion on contact can cause fine particles to form dust bunnies. Similarly in outer space, adhesion causes dust particles to stick together. Large particles, however, can combine due to gravity -- an essential process in forming asteroids and planets. But between these two extremes, how aggregates grow has largely been a mystery until now.

The study, published in the journal Nature Physics, found that particles under microgravity -- similar to conditions believed to be in interplanetary space -- develop strong electrical charges spontaneously and stick together, forming large aggregates. Remarkably, although like charges repel, like-charged aggregates form nevertheless, apparently because the charges are so strong that they polarize one another and therefore act like magnets.

Related processes seem to be at work on Earth, where fluidized bed reactors produce everything from plastics to pharmaceuticals. During this process, blowing gas pushes fine particles upwards and when particles aggregate due to static electricity, they can stick to reactor vessel walls, leading to shutdowns and poor product quality.

"We may have overcome a fundamental obstacle in understanding how planets form," said co-author Troy Shinbrot, a professor in the Department of Biomedical Engineering in the School of Engineering at Rutgers University-New Brunswick. "Mechanisms for generating aggregates in industrial processes have also been identified and that -- we hope -- may be controlled in future work. Both outcomes hinge on a new understanding that electrical polarization is central to aggregation."

The study, led by researchers at the University of Duisburg-Essen in Germany, opens avenues to potentially controlling fine particle aggregation in industrial processing. It appears that introducing additives that conduct electricity may be more successful for industrial processes than traditional electrostatic control approaches, according to Shinbrot.

The researchers want to investigate effects of material properties on sticking and aggregation, and potentially develop new approaches to generating and storing electricity.

From Science Daily

You create your own false information, study finds

Along with partisan news outlets and political blogs, there's another surprising source of misinformation on controversial topics -- it's you.

A new study found that people given accurate statistics on a controversial issue tended to misremember those numbers to fit commonly held beliefs.

For example, when people are shown that the number of Mexican immigrants in the United States declined recently -- which is true but goes against many people's beliefs -- they tend to remember the opposite.

And when people pass along this misinformation they created, the numbers can get further and further from the truth.

"People can self-generate their own misinformation. It doesn't all come from external sources," said Jason Coronel, lead author of the study and assistant professor of communication at The Ohio State University.

"They may not be doing it purposely, but their own biases can lead them astray. And the problem becomes larger when they share their self-generated misinformation with others."

Coronel conducted the study with Shannon Poulsen and Matthew Sweitzer, both doctoral students in communication at Ohio State. The study was published online in the journal Human Communication Research and will appear in a future print edition.

The researchers conducted two studies.

In the first study, the researchers presented 110 participants with short written descriptions of four societal issues that involved numerical information.

On two of those societal issues, the researchers did pre-tests and found that the factually accurate numerical relationship fit with many people's understanding of the issue. For example, many people generally expect more Americans to support same-sex marriage than oppose it, which coincides with public opinion polls.

But the researchers also presented participants with two issues for which the numbers didn't fit with how most people viewed the topics.

For example, most people believe that the number of Mexican immigrants in the United States grew between 2007 and 2014. But in fact, the number declined from 12.8 million in 2007 to 11.7 million in 2014.

After reading all the descriptions of the issues, the participants got a surprise. They were asked to write down the numbers that were in the descriptions of the four issues. They were not told in advance that they would have to memorize the numbers.

The researchers found that people usually got the numerical relationship right on the issues for which the stats were consistent with how many people viewed the world. For example, participants typically wrote down a larger number for the percentage of people who supported same-sex marriage than for those who opposed it -- which is the true relationship.

But when it came to the issues where the numbers went against many people's beliefs -- such as whether the number of Mexican immigrants had gone up or down -- participants were much more likely to remember the numbers in a way that agreed with their probable biases rather than the truth.

"We had instances where participants got the numbers exactly correct -- 11.7 and 12.8 -- but they would flip them around," Coronel said.

"They weren't guessing -- they got the numbers right. But their biases were leading them to misremember the direction they were going."

By using eye-tracking technology on participants while they read the descriptions of the issues, the researchers had additional evidence that people really were paying attention when they viewed the statistics.

"We could tell when participants got to numbers that didn't fit their expectations. Their eyes went back and forth between the numbers, as if they were asking 'what's going on.' They generally didn't do that when the numbers confirmed their expectations," Coronel said.

"You would think that if they were paying more attention to the numbers that went against their expectations, they would have a better memory for them. But that's not what we found."

In the second study, the researchers investigated how these memory distortions could spread and grow more distorted in everyday life. They designed a study similar to the childhood game of "telephone."

For example, the first person in the "telephone chain" in this study saw the accurate statistics about the trend in Mexican immigrants living in the United States (that it went down from 12.8 million to 11.7 million). They had to write those numbers down from memory, and their answers were then passed along to the second person in the chain, who had to remember them and write them down. The second person's estimates were then sent to a third participant.

Results showed that, on average, the first person flipped the numbers, saying that the number of Mexican immigrants increased by 900,000 from 2007 to 2014 instead of the truth, which was that it decreased by about 1.1 million.

By the end of the chain, the average participant had said the number of Mexican immigrants had increased in those 7 years by about 4.6 million.
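
The drift described above can be pictured as each reteller blending what they actually read with what they expected to be true. A minimal toy sketch, not the study's model: the prior belief of a +4 million increase and the 0.5 "pull" strength are illustrative assumptions chosen only to show how a sign flip and subsequent drift can arise.

```python
# Toy sketch of bias-driven retelling (NOT the study's model): each person in
# the chain recalls the change as a blend of the figure they were given and a
# prior belief that immigration rose. The prior (+4.0 million) and pull
# strength (0.5) are illustrative assumptions.

def retell(change_millions, prior=4.0, pull=0.5):
    """One person's recalled change: part what they read, part what they expect."""
    return change_millions + pull * (prior - change_millions)

estimate = -1.1  # true change, 2007-2014: a decline of about 1.1 million
chain = [estimate]
for _ in range(3):  # three retellings, as in the study's chains
    estimate = retell(estimate)
    chain.append(estimate)

print(chain)  # the sign flips at the first retelling, then drifts upward
```

Even this crude blending rule reproduces the qualitative pattern: the first reteller flips a decline into an increase, and each subsequent retelling moves the estimate further toward the shared prior.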

"These memory errors tended to get bigger and bigger as they were transmitted between people," Sweitzer said.

Coronel said the study did have limitations. For example, it is possible that the participants would have been less likely to misremember if they were given explanations as to why the numbers didn't fit expectations. And the researchers didn't measure each person's biases going in -- they used the biases that had been identified by pre-tests they conducted.

Finally, the telephone game study did not capture important features of real-life conversations that may have limited the spread of misinformation.

But the results did suggest that we shouldn't worry only about the misinformation that we run into in the outside world, Poulsen said.

"We need to realize that internal sources of misinformation can possibly be as significant as or more significant than external sources," she said.

Read more at Science Daily

How playing the drums changes the brain

People who play drums regularly for years differ from unmusical people in their brain structure and function. The results of a study by researchers from Bochum suggest that they have fewer, but thicker fibres in the main connecting tract between the two halves of the brain. In addition, their motor brain areas are organised more efficiently. This is the conclusion drawn by a research team headed by Dr. Lara Schlaffke from the Bergmannsheil university clinic in Bochum and Associate Professor Dr. Sebastian Ocklenburg from the biopsychology research unit at Ruhr-Universität Bochum following a study with magnetic resonance imaging (MRI). The results have been published in the journal Brain and Behavior, online on 4 December 2019.

Drummers were never previously studied


"It has long been understood that playing a musical instrument can change the brain via neuroplastic processes," says Sarah Friedrich, who wrote her bachelor's thesis on this project. "But no one had previously looked specifically into drummers," she adds.

The researchers from Bochum were interested in this group because their motor coordination far surpasses that of untrained people. "Most people can only perform fine motor tasks with one hand and have problems playing different rhythms with both hands at the same time," explains Lara Schlaffke. "Drummers can do things that are impossible for untrained people."

Drumming first, then brain scans

The team intended to gain new insights into the organisation of complex motor processes in the brain by identifying the changes in the brain caused by this training. The researchers tested 20 professional drummers who have played their instrument for an average of 17 years and currently practice for more than ten hours per week. They examined them using various MRI imaging techniques that provide insights into the structure and function of the brain. They then compared the data with measurements of 24 unmusical control subjects. In the first step, both groups had to play drums to test their abilities and were then examined in the MRI scanner.

More efficient motor processing

Drummers presented clear differences in the front part of the corpus callosum, a brain structure that connects the two hemispheres and whose front part is responsible for motor planning. The data indicated that the drummers had fewer but thicker fibres in this important connecting tract between the brain hemispheres. This allows musicians to exchange information between the hemispheres more quickly than the controls. The structure of the corpus callosum also predicted the performance in the drum test: the higher the measure of the thickness of the fibres in the corpus callosum, the better the drumming performance.

Moreover, the brain of drummers was less active in motor tasks than that of control subjects. This phenomenon is referred to as sparse sampling: a more efficient brain organisation in the areas leads to less activation in professionals.

Read more at Science Daily

Aspirin's health benefits under scrutiny

Taking a baby aspirin every day to prevent a heart attack or stroke should no longer be recommended to patients who haven't already experienced one of these events.

That's according to a new study published in Family Practice.

Nearly one-quarter of Americans over the age of 40 have reported taking aspirin daily even if they don't have a history of heart disease or stroke.

That's a problem, says study author Mark Ebell, a researcher at the University of Georgia.

As a physician and epidemiologist at UGA's College of Public Health, Ebell's work evaluates the evidence underpinning clinical practice and health behaviors. The current recommendation for taking aspirin as the primary form of heart attack or stroke prevention is limited to adults aged 50 to 69 who have an increased cardiovascular risk.

"We shouldn't just assume that everyone will benefit from low-dose aspirin, and in fact the data show that the potential benefits are similar to the potential harms for most people who have not had a cardiovascular event and are taking it to try to prevent a first heart attack or stroke," said Ebell.

Aspirin was first found to reduce the risk of fatal and nonfatal heart attacks 30 years ago, and subsequent studies found evidence that aspirin may also reduce risk of stroke and colon cancer.

But aspirin use has always carried risks, said Ebell, namely bleeding in the stomach and brain.

More recent studies have begun to suggest that potential harms of taking aspirin may outweigh the benefits by today's medical standards.

"If you look back in the 1970s and '80s when a lot of these original studies were done, patients were not taking statin drugs to control cholesterol, their blood pressure was not as well controlled, and they weren't getting screenings for colorectal cancer," said Ebell.

Ebell and his colleague Frank Moriarty of the Royal College of Surgeons in Ireland compared aspirin studies using patient data from 1978 to 2002 to four large-scale aspirin trials occurring after 2005, when statin use and colorectal cancer screenings had become more widespread.

They found that for 1,000 patients treated for five years, there were four fewer cardiovascular events and seven more major hemorrhages. Ebell was particularly alarmed by the number of brain bleeds experienced by aspirin users.

"About 1 in 300 persons who took aspirin for five years experienced a brain bleed. That's pretty serious harm. This type of bleeding can be fatal. It can be disabling, certainly," he said. "One in 300 is not something that the typical doctor is going to be able to pick up on in their practice. That's why we need these big studies to understand small but important increases in risk."
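
The numbers quoted above can be restated as number-needed-to-treat and number-needed-to-harm, a standard way of weighing absolute benefit against absolute risk. This sketch simply re-expresses the article's figures; it is not an additional analysis.

```python
# Restating the trial figures above as number-needed-to-treat/harm.
# All inputs come directly from the numbers quoted in the article.

PATIENTS = 1000          # per 1,000 patients treated for five years
FEWER_CV_EVENTS = 4      # cardiovascular events avoided
EXTRA_HEMORRHAGES = 7    # additional major hemorrhages

nnt = PATIENTS / FEWER_CV_EVENTS      # patients treated per CV event prevented
nnh = PATIENTS / EXTRA_HEMORRHAGES    # patients treated per extra major bleed

brain_bleeds_per_1000 = 1000 / 300    # "1 in 300" expressed per 1,000 patients

print(f"NNT (benefit): {nnt:.0f}, NNH (harm): {nnh:.0f}")
print(f"Brain bleeds per 1,000 treated: {brain_bleeds_per_1000:.1f}")
```

Because the number needed to harm (about 143) is smaller than the number needed to treat (250), harms accrue faster than benefits in this population, which is the core of Ebell's argument.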

Ebell cautions people who are concerned about their cardiovascular risk, but who haven't had a heart attack or stroke, to talk with their doctors about other ways to prevent a major event.

These days, he says, treatment for blood pressure, cholesterol and diabetes are more aggressive, and the rate of other risk factors like smoking has dropped.

Read more at Science Daily

Dec 8, 2019

Scientists reliably predict people's age by measuring proteins in blood

The carnival worker who tries to guess your age relies on aspects of your appearance, such as your posture and whether any wrinkles emanate from the corners of your eyes and lips. If the carny's guess is more than a few years off, you win a stuffed koala.

But a team of Stanford University School of Medicine scientists doesn't need to know how you look to guess your age. Instead, it watches a kind of physiological clock: the levels of 373 proteins circulating in your blood. If the clock is off, you don't win a plush toy. But you may find out important things about your health.

"We've known for a long time that measuring certain proteins in the blood can give you information about a person's health status -- lipoproteins for cardiovascular health, for example," said Tony Wyss-Coray, PhD, professor of neurology and neurological sciences, the D. H. Chen Professor II and co-director of the Stanford Alzheimer's Disease Research Center. "But it hasn't been appreciated that so many different proteins' levels -- roughly a third of all the ones we looked at -- change markedly with advancing age."

Changes in the levels of numerous proteins that migrate from the body's tissues into circulating blood not only characterize, but quite possibly cause, the phenomenon of aging, Wyss-Coray said.

A paper describing the research will be published Dec. 5 in Nature Medicine. Wyss-Coray is the senior author. The lead author is neurology instructor Benoit Lehallier, PhD.

'Proteins are the workhorses'

The researchers analyzed plasma -- the cell-free, fluid fraction of blood -- from 4,263 people ages 18-95. "Proteins are the workhorses of the body's constituent cells, and when their relative levels undergo substantial changes, it means you've changed, too," Wyss-Coray said. "Looking at thousands of them in plasma gives you a snapshot of what's going on throughout the body."

The study's results suggest that physiological aging does not simply proceed at a perfectly even pace, but rather seems to chart a more herky-jerky trajectory, with three distinct inflection points in the human life cycle. Those three points, occurring on average at ages 34, 60 and 78, stand out as distinct times when the number of different blood-borne proteins that are exhibiting noticeable changes in abundance rises to a crest. This happens because instead of simply increasing or decreasing steadily or staying the same throughout life, the levels of many proteins remain constant for a while and then at one point or another undergo sudden upward or downward shifts. These shifts tend to bunch up at three separate points in a person's life: young adulthood, late middle age and old age.

The investigators built their clock by looking at composite levels of proteins within groups of people rather than in individuals. But the resulting formula proved able to predict individuals' ages within a range of three years most of the time. And when it didn't, there was an interesting upshot: People whose predicted age was substantially lower than their actual one turned out to be remarkably healthy for their age.
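
At its core, a clock of this kind is a regression that maps protein levels to age. A minimal sketch on synthetic data, assuming a plain least-squares fit: the Stanford team's actual pipeline (373 selected proteins, regularisation, cohort handling) is far richer, and every number below is invented for illustration.

```python
# Minimal sketch of a "proteomic clock": fit a linear model mapping plasma
# protein levels to age. Synthetic data only -- the real study used 373
# proteins and a more sophisticated model; nothing here reproduces it.
import numpy as np

rng = np.random.default_rng(0)
n_people, n_proteins = 500, 20
ages = rng.uniform(18, 95, n_people)

# Synthetic plasma data: each protein drifts linearly with age, plus noise.
slopes = rng.normal(0, 1, n_proteins)
proteins = np.outer(ages, slopes) + rng.normal(0, 5, (n_people, n_proteins))

# Least-squares fit of age on protein levels (with an intercept column).
X = np.column_stack([proteins, np.ones(n_people)])
coef, *_ = np.linalg.lstsq(X, ages, rcond=None)

predicted = X @ coef
mae = np.mean(np.abs(predicted - ages))
print(f"mean absolute error: {mae:.1f} years")
```

In this idealised setting the fit recovers age closely; in the real study, people whose predicted age fell well below the fitted value turned out to be unusually healthy, which is what makes the residual itself informative.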

The researchers obtained their samples from two large studies. One of them, known as the LonGenity study, has assembled a registry of exceptionally long-lived Ashkenazi Jews. It was able to provide many blood samples from people as old as 95.

On measuring the levels of roughly 3,000 proteins in each individual's plasma, Wyss-Coray's team identified 1,379 proteins whose levels varied significantly with participants' age.

Divergence

A reduced set of 373 of those proteins was sufficient for predicting participants' ages with great accuracy, the study said. But there were cases of substantial divergence between participants' chronological and physiological age -- for example, among the subjects in the LonGenity study, with their genetic proclivity toward exceptionally good health in what for most of us is advanced old age.

"We had data on hand-grip strength and cognitive function for that group of people," Wyss-Coray said. "Those with stronger hand grips and better measured cognition were estimated by our plasma-protein clock to be younger than they actually were."

The study also strengthened the case that men and women, who were about equally represented in the study, age differently. Of the proteins the analysis found to change with age, 895 -- nearly two-thirds -- were significantly more predictive for one sex than for the other.

"The differences were striking," Wyss-Coray said. He added that this finding strongly supports the rationale for the National Institutes of Health's policy, instituted in 2016, promoting increased inclusion of women in clinical trials and the designation of sex as a biological variable.

Any clinical applications of the technique are a good five to 10 years off, he said. With further validation, though, it could be used not only to identify individuals who appear to be aging rapidly -- and, therefore, at risk of age-linked conditions such as Alzheimer's disease or cardiovascular disease -- but also to find drugs or other therapeutic interventions, like leafy green vegetables, that slow the aging process, or conversely to flash an early warning of a drug's unanticipated tendency to accelerate aging.

"Ideally, you'd want to know how virtually anything you took or did affects your physiological age," Wyss-Coray said.

While the words "373 proteins" may conjure up the image of a transfusion-sized blood extraction, a drop is all it takes for a 373-protein readout.

Read more at Science Daily