Mar 7, 2020

World-first system forecasts warming of lakes globally

A groundbreaking study will enable scientists to better predict future warming of the world's lakes due to climate change, and the potential threat to cold-water species such as salmon and trout.

Pioneering research led by the UK Centre for Ecology & Hydrology (UKCEH) has devised the first system that classifies lakes globally, placing each of them in one of nine 'thermal regions' (see map).

Lakes are grouped depending on their seasonal patterns of surface water temperatures, with the coldest thermal region including lakes in Alaska, Canada, northern Russia and China, and the warmest covering lakes in equatorial South America, Africa, India and south-east Asia.

By incorporating climate change models, the scientists predict that, under the most extreme climate change scenario, the average lake temperature will be around 4 degrees Celsius warmer by the year 2100, and that 66 per cent of lakes globally will be classified in a warmer thermal region than they are now.

The study -- carried out by UKCEH, the Universities of Dundee, Glasgow, Reading and Stirling, plus the Dundalk Institute of Technology -- was funded by the Natural Environment Research Council (NERC) and has been published in the journal Nature Communications.

Professor Stephen Maberly of UKCEH, lead author of the study, explains: "Thanks to cutting-edge analysis using satellite images of more than 700 lakes, taken twice a month over 16 years, we produced the first global lake temperature classification scheme. By combining this with a lake model and climate change scenarios we were able to identify that northern lakes, such as those in the UK, will be particularly sensitive to climate change."
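
The classification idea itself is easy to sketch: represent each lake by its seasonal cycle of surface temperature, then group lakes with similar cycles. The toy example below uses k-means clustering on synthetic monthly climatologies purely to illustrate that idea; it is not the study's published method, and all numbers are invented.

```python
# Illustrative sketch only: cluster lakes into "thermal regions" by the
# shape of their seasonal surface-temperature cycle. This is NOT the
# study's actual method; the data below are synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_lakes, n_months = 700, 12

# Synthetic monthly climatologies: a sinusoidal seasonal cycle whose
# mean level and amplitude vary from lake to lake.
months = np.arange(n_months)
mean_temp = rng.uniform(2, 28, size=(n_lakes, 1))   # warm vs cold lakes
amplitude = rng.uniform(1, 12, size=(n_lakes, 1))   # seasonality strength
cycle = np.cos(2 * np.pi * (months - 7) / 12)       # peak in late summer
profiles = mean_temp + amplitude * cycle + rng.normal(0, 0.5, (n_lakes, n_months))

# Group the seasonal profiles into nine clusters ("thermal regions").
km = KMeans(n_clusters=9, n_init=10, random_state=0).fit(profiles)
for k in range(9):
    members = profiles[km.labels_ == k]
    print(f"region {k}: {len(members):3d} lakes, "
          f"mean annual temp {members.mean():5.1f} C")
```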

Even relatively small changes in temperature can have a significant negative impact on aquatic wildlife, affecting the speed at which organisms grow and feed, and when they reproduce. As species do not react in the same way, prey and predators have increasingly different breeding and feeding cycles, reducing the amount of potential food available.

Warming also increases the risk of harmful algal blooms, which can have a negative impact on aquatic plants and fish.

Professor Maberly says: "Cold-water fish species in particular can be stressed by warmer temperatures. The potential negative impact on salmonids such as salmon, trout and Arctic charr, for example, is concerning because they play a central ecological role within food webs and also have great economic importance."

The research will be of interest to scientists studying freshwater ecology, climate change, greenhouse gas emissions and biogeochemical cycles.

Professor Andrew Tyler of the University of Stirling, who led the overall project, GloboLakes, says: "This is an example of pioneering UK-led research that has delivered the capability to monitor our inland waters at the global scale from satellite based platforms.

Read more at Science Daily

Caffeine boosts problem-solving ability but not creativity, study indicates

Caffeine increases the ability to focus and problem solve, but a new study by a University of Arkansas researcher indicates it doesn't stimulate creativity.

"In Western cultures, caffeine is stereotypically associated with creative occupations and lifestyles, from writers and their coffee to programmers and their energy drinks, and there's more than a kernel of truth to these stereotypes," wrote Darya Zabelina, assistant professor of psychology and first author of the study recently published in the journal Consciousness and Cognition.

While the cognitive benefits of caffeine -- increased alertness, improved vigilance, enhanced focus and improved motor performance -- are well established, she said, the stimulant's effect on creativity is less well known.

In the paper, Zabelina differentiates "convergent" from "divergent" thinking. The former is defined as seeking a specific solution to a problem, for example, the "correct" answer. The latter is characterized by idea generation where a large set of apt, novel or interesting responses would be suitable. Caffeine was shown to improve convergent thinking in the study, while consuming it had no significant impact on divergent thinking.

For the study, 80 volunteers were randomly given either a 200mg caffeine pill, equivalent to one strong cup of coffee, or a placebo. They were then tested on standard measures of convergent and divergent thinking, working memory and mood. In addition to the results on creativity, caffeine did not significantly affect working memory, but test subjects who took it did report feeling less sad.
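
The underlying design is a standard two-arm randomized comparison, with group means on each measure compared statistically. The sketch below simulates that kind of analysis with invented scores and effect sizes; it is not the study's data or pipeline.

```python
# Sketch of a two-sample comparison as used in randomized designs like
# this one. Scores are simulated and effect sizes invented; they do not
# come from the published study.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 40  # participants per arm (80 total, as in the study)

# Simulated convergent-thinking scores: caffeine arm gets a small boost.
placebo = rng.normal(50, 10, n)
caffeine = rng.normal(55, 10, n)
t, p = stats.ttest_ind(caffeine, placebo)
print(f"convergent thinking: t = {t:.2f}, p = {p:.3f}")

# Simulated divergent-thinking scores: no true group difference.
placebo_d = rng.normal(50, 10, n)
caffeine_d = rng.normal(50, 10, n)
t, p = stats.ttest_ind(caffeine_d, placebo_d)
print(f"divergent thinking:  t = {t:.2f}, p = {p:.3f}")
```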

"The 200mg enhanced problem solving significantly, but had no effect on creative thinking," said Zabelina. "It also didn't make it worse, so keep drinking your coffee; it won't interfere with these abilities."

From Science Daily

Mar 6, 2020

ALMA spots metamorphosing aged star

An international team of astronomers using the Atacama Large Millimeter/submillimeter Array (ALMA) has captured the very moment when an old star first starts to alter its environment. The star has ejected high-speed bipolar gas jets which are now colliding with the surrounding material; the age of the observed jet is estimated to be less than 60 years. These features help scientists understand how the complex shapes of planetary nebulae are formed.

Sun-like stars evolve to puffed-up Red Giants in the final stage of their lives. Then, the star expels gas to form a remnant called a planetary nebula. There is a wide variety in the shapes of planetary nebulae; some are spherical, but others are bipolar or show complicated structures. Astronomers are interested in the origins of this variety, but the thick dust and gas expelled by an old star obscure the system and make it difficult to investigate the inner workings of the process.

To tackle this problem, a team of astronomers led by Daniel Tafoya at Chalmers University of Technology, Sweden, pointed ALMA at W43A, an old star system around 7000 light years from Earth in the constellation Aquila, the Eagle.

Thanks to ALMA's high resolution, the team obtained a very detailed view of the space around W43A. "The most notable structures are its small bipolar jets," says Tafoya, the lead author of the research paper published by the Astrophysical Journal Letters. The team found that the velocity of the jets is as high as 175 km per second, which is much higher than previous estimates. Based on this speed and the size of the jets, the team calculated the age of the jets to be less than a human life-span.
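
The age estimate is simple kinematics: travel time equals the jet's extent divided by its speed. Here is the back-of-the-envelope version, with the 175 km/s speed from the paper and a jet length that is a purely illustrative round number:

```python
# Back-of-the-envelope jet age: time = distance / speed.
# The 175 km/s speed is from the paper; the jet extent below is a
# hypothetical round number chosen only to illustrate the arithmetic.
KM_PER_AU = 1.496e8
SECONDS_PER_YEAR = 3.156e7

speed_km_s = 175.0   # measured jet velocity
extent_au = 2000.0   # hypothetical jet length in astronomical units

age_years = extent_au * KM_PER_AU / speed_km_s / SECONDS_PER_YEAR
print(f"kinematic age ~ {age_years:.0f} years")  # ~54 years
```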

"Considering the youth of the jets compared to the overall lifetime of a star, it is safe to say we are witnessing the 'exact moment' that the jets have just started to push through the surrounding gas," explains Tafoya. "The jets carve through the surrounding material in as little as 60 years. A person could watch their progress throughout their lifetime."

In fact, the ALMA image clearly maps the distribution of dusty clouds entrained by the jets, telltale evidence that they are impacting on the surrounding material.

The team suggests that this entrainment could be the key to forming a bipolar-shaped planetary nebula. In their scenario, the aged star originally ejects gas spherically and the core of the star loses its envelope. If the star has a companion, gas from the companion pours onto the core of the dying star, and a portion of this new gas forms the jets. Therefore, whether or not the old star has a companion is an important factor to determine the structure of the resulting planetary nebula.

"W43A is one of the peculiar so called 'water fountain' objects," says Hiroshi Imai at Kagoshima University, Japan, a member of the team. "These are old stars which show characteristic radio emission from water molecules. Our ALMA observations lead us to think that the water heated to generate the radio emission is located the interface region between the jets and the surrounding material. Perhaps all these 'water fountain' sources consist of a central binary system which has just launched a new, double jet, just like W43A."

The team are already working on new ALMA observations of other, similar stars. They are hoping to gain new insight into how planetary nebulae form, and what lies in the future for stars like the Sun.

Read more at Science Daily

As farming developed, so did cooperation -- and violence

The growth of agriculture led to unprecedented cooperation in human societies, a team of researchers has found, but it also led to a spike in violence, an insight that offers lessons for the present.

A new study out today in Environmental Archaeology by collaborators from UConn, the University of Utah, Troy University, and California State University, Sacramento examines the growth of agriculture in Eastern North America 7,500 to 5,000 years ago, and finds that while the domestication of plants fostered new cooperation among people, it also saw the rise of organized, intergroup violence.

"We were interested in understanding why people would make the shift from hunting and gathering to farming," says Elic Weitzel, a UConn Ph.D. student in anthropology. "Then I started to get interested in what happened in society after they made that shift and started farming on a larger scale."

The team used the "ideal free distribution" model, which assumes that people begin by occupying the best locations available, to look at patterns of how individuals distribute themselves in an area. A number of factors make an area more suitable, such as access to food, water, raw materials, and shelter. To measure suitability, the team looked at an indicator called "net primary productivity," a measure of available energy based on the plants in the area. In areas of higher net primary productivity, there were more people clustered together -- and more conflict.
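
The ideal free distribution can be captured in a few lines: each newcomer settles wherever per-capita suitability is currently highest, so the best patches fill first and end up most crowded. The toy simulation below uses invented patch qualities standing in for net primary productivity:

```python
# Toy ideal free distribution: each individual in turn settles in the
# patch with the highest current per-capita suitability. Patch "quality"
# stands in for net primary productivity; values are invented.
quality = [100.0, 60.0, 30.0]   # intrinsic suitability of three patches
occupants = [0, 0, 0]

def per_capita(q, n):
    # Suitability experienced by the next settler if n are already there.
    return q / (n + 1)

for _ in range(60):  # settle 60 individuals one at a time
    best = max(range(len(quality)),
               key=lambda i: per_capita(quality[i], occupants[i]))
    occupants[best] += 1

print("occupants per patch:", occupants)
# The best patch ends up most crowded -- matching the finding that
# high-productivity areas held more people (and saw more conflict).
```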

"If you are living in a suitable area, you can lay claim and keep others from accessing what you have. That becomes a cooperative process, because one person is not as effective as a whole group is at defending a territory," says Weitzel.

A growing population can decrease the suitability of a location over time, but that does not always mean declining quality of life. To study this, the team also took into consideration the concept known as Allee's Principle, which states that individual fitness, or likelihood of survival and reproduction, increases as the density of the population increases, due to cooperative behaviors. Weitzel explains that something like a crop of plants represents a valuable resource, and in defending it the value of cooperative behavior becomes apparent.

"The transition from a hunting and gathering society to an agricultural society is dependent on collaboration," says co-author Stephen Carmody, of Troy University. "The development of agriculture appears to only have happened in nine places around the world so Eastern North America is a unique part of the world to study. Agriculture was one of the most consequential transitions that happened in the past. It changed our whole economic situation."

Developments such as combined efforts for harvesting and defense, and possibly even sharing seeds among groups, became possible through interpersonal cooperation, leading to greater chances of survival for the group.

As the saying goes, many hands make light work, and, Weitzel says, the research is about cooperation and competition at the same time.

"When a resource like domesticated crops is dense and predictable, that is when we expect that it would be defendable," he says. "Other groups may want access to your crop in case their crop failed, for example. There is cooperation and there are aspects of competition. Harvesting and defending."

Weitzel explains that this time period -- 7,500 to 5,000 years ago -- is not only when researchers found people aggregating and living cooperatively in high-quality locations; it is also when they saw an uptick in intergroup violence, as evidenced by skeletons bearing the marks of "trophy-taking."

"Of course there are signs of violence throughout history, but trophy-taking is a different type of violence," Weitzel says. "The victor removes a part of the loser as a signal they won. They took scalps, hands, feet, heads -- that first evidence appears to have happened at the same time as plant management."

This reflects the Allee Principle's limit: a point at which population density surpasses an optimum number, and suitability declines as a result.

"As the ideal free distribution and Allee effects predict, at a certain point, the benefits of cooperation start to wane and you see dispersal again. There are incentives to be around other people, but not too many other people," says Weitzel.

After the spike in trophy-taking violence, populations dispersed once again, although some aggregation persisted. During the dispersal period, researchers found a corresponding decrease in trophy-taking violence.

"We see a lot of things that look modern to us, for example social inequality and climate change," Carmody says. "However, these are fundamental processes and large-scale issues. A lot of these issues tie back to the origin of agriculture."

Weitzel says that understanding early human interactions can help us understand our present and even influence the way we think about the future.

Read more at Science Daily

One step closer to understanding the human brain

An international team of scientists led by researchers at Karolinska Institutet in Sweden has launched a comprehensive overview of all proteins expressed in the brain, published today in the journal Science. The open-access database offers medical researchers an unprecedented resource to deepen their understanding of neurobiology and develop new, more effective therapies and diagnostics targeting psychiatric and neurological diseases.

The brain is the most complex organ of our body, both in structure and function. The new Brain Atlas resource is based on the analysis of nearly 1,900 brain samples covering 27 brain regions, combining data from the human brain with corresponding information from the brains of the pig and mouse. It is the latest database released by the Human Protein Atlas (HPA) program which is based at the Science for Life Laboratory (SciLifeLab) in Sweden, a joint research centre aligned with KTH Royal Institute of Technology, Karolinska Institutet, Stockholm University and Uppsala University. The project is a collaboration with the BGI research centre in Shenzhen and Qingdao in China and Aarhus University in Denmark.

"As expected the blueprint for the brain is shared among mammals, but the new map also reveals interesting differences between human, pig and mouse brains," says Mathias Uhlén, Professor at the Department of Protein Science at KTH Royal Institute of Technology, Visiting professor at the Department of Neuroscience at Karolinska Institutet and Director of the Human Protein Atlas effort.

The cerebellum emerged in the study as the most distinct region of the brain. Many proteins with elevated expression levels in this region were found, including several associated with psychiatric disorders, supporting a role for the cerebellum in the processing of emotions.

"Another interesting finding is that the different cell types of the brain share specialised proteins with peripheral organs," says Dr. Evelina Sjöstedt, researcher at the Department of Neuroscience at Karolinska Institutet and first author on the paper. "For example, astrocytes, the cells that 'filter' the extracellular environment in the brain share a lot of transporters and metabolic enzymes with cells in the liver that filter the blood."

When comparing the neurotransmitter systems, responsible for the communication between neurons, some clear differences between the species could be identified.

"Several molecular components of neurotransmitter systems, especially receptors that respond to released neurotransmitters and neuropeptides, show a different pattern in humans and mice," says Dr. Jan Mulder, group leader of the Human Protein Atlas brain profiling group and researcher at the Department of Neuroscience at Karolinska Institutet. "This means that caution should be taken when selecting animals as models for human mental and neurological disorders."

For selected genes/proteins, the Brain Atlas also contains microscopic images showing the protein distribution in human brain samples and detailed, zoomable maps of protein distribution in the mouse brain.

Read more at Science Daily

Scientists monitor brains replaying memories in real time

In a study of epilepsy patients, researchers at the National Institutes of Health monitored the electrical activity of thousands of individual brain cells, called neurons, as patients took memory tests. They found that the firing patterns of the cells that occurred when patients learned a word pair were replayed fractions of a second before they successfully remembered the pair. The study was part of an NIH Clinical Center trial for patients with drug-resistant epilepsy whose seizures cannot be controlled with drugs.

"Memory plays a crucial role in our lives. Just as musical notes are recorded as grooves on a record, it appears that our brains store memories in neural firing patterns that can be replayed over and over again," said Kareem Zaghloul, M.D., Ph.D., a neurosurgeon-researcher at the NIH's National Institute of Neurological Disorders and Stroke (NINDS) and senior author of the study published in Science.

Dr. Zaghloul's team has been recording electrical currents of drug-resistant epilepsy patients temporarily living with surgically implanted electrodes designed to monitor brain activity in the hopes of identifying the source of a patient's seizures. This period also provides an opportunity to study neural activity during memory. In this study, his team examined the activity used to store memories of our past experiences, which scientists call episodic memories.

In 1957, the case of an epilepsy patient known as H.M. provided a breakthrough in memory research. H.M. could not remember new experiences after part of his brain was surgically removed to stop his seizures. Since then, research has pointed to the idea that episodic memories are stored, or encoded, as neural activity patterns that our brains replay when triggered by such things as the whiff of a familiar scent or the riff of a catchy tune. But exactly how this happens was unknown.

Over the past two decades, rodent studies have suggested that the brain may store memories in unique neuronal firing sequences. After joining Dr. Zaghloul's lab, Alex P. Vaz, B.S., an M.D./Ph.D. student at Duke University, Durham, North Carolina, and the leader of this study, decided to test this idea in humans.

"We thought that if we looked carefully at the data we had been collecting from patients we might be able to find a link between memory and neuronal firing patterns in humans that is similar to that seen in rodents," said Vaz, a bioengineer who specializes in deciphering the meaning of electrical signals generated by the body.

To do this they analyzed the firing patterns of individual neurons located in the anterior temporal lobe, a brain language center. Currents were recorded as patients sat in front of a screen and were asked to learn word pairs such as "cake" and "fox." The researchers discovered that unique firing patterns of individual neurons were associated with learning each new word pair. Later, when a patient was shown one of the words, such as "cake," a very similar firing pattern was replayed just milliseconds before the patient correctly recalled the paired word "fox."
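
Detecting replay of this kind amounts to asking whether the population firing pattern just before recall resembles the pattern recorded during learning more than it resembles unrelated activity. A minimal sketch of that comparison, using synthetic firing-rate vectors rather than the study's actual analysis pipeline:

```python
# Sketch of pattern-similarity analysis: compare the population
# firing-rate vector recorded during learning with the vector recorded
# just before recall. Synthetic data; not the study's actual pipeline.
import numpy as np

rng = np.random.default_rng(2)
n_neurons = 200

# Firing-rate vector while learning the pair "cake" - "fox".
learning = rng.poisson(5, n_neurons).astype(float)

# Pre-recall activity: the learning pattern re-expressed with noise.
replay = learning + rng.normal(0, 1.5, n_neurons)

# Unrelated activity from a different memory, for comparison.
other = rng.poisson(5, n_neurons).astype(float)

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

print(f"learning vs pre-recall:  r = {corr(learning, replay):.2f}")  # high
print(f"learning vs other item:  r = {corr(learning, other):.2f}")   # ~0
```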

"These results suggest that our brains may use distinct sequences of neural spiking activity to store memories and then replay them when we remember a past experience," said Dr. Zaghloul.

Last year, his team showed that electrical waves, called ripples, may emerge in the brain a split second before we remember something correctly. In this study, the team discovered a link between the ripples recorded in the anterior temporal lobe and the spiking patterns seen during learning and memory. They also showed that ripples recorded in another area called the medial temporal lobe slightly preceded the replay of firing patterns seen in the anterior temporal lobe during learning.

Read more at Science Daily

Mar 5, 2020

Apes' inner ears could hide clues to evolutionary history of hominoids

Studying the inner ear of apes and humans could uncover new information on our species' evolutionary relationships, suggests a new study published today in eLife.

Humans, gorillas, chimpanzees, orangutans and gibbons all belong to a group known as the hominoids. This 'superfamily' also includes the immediate ancestors and close relatives of these species, but in many instances, the evolutionary relationships between these extinct ape species remain controversial. The new findings suggest that looking at the structure (or morphology) of the inner ears across hominoids as a whole could go some way to resolving this.

"Reconstructing the evolutionary history of apes and humans and determining the morphology of the last common ancestor from which they evolved are challenging tasks," explains lead author Alessandro Urciuoli, a researcher at the Institut Català de Paleontologia Miquel Crusafont (ICP) in Barcelona, Spain. "While DNA can help evolutionary biologists work out how living species are related to one another, fossils are typically the principle source of information for extinct species, although they must be used with caution."

The bony cavity that houses the inner ear, which is involved in balance and hearing and is fairly common in the fossil record, has proven useful for tracing the evolution of certain groups of mammals. But until now, no studies have explored whether this structure could provide insights into the evolutionary relatedness among living and extinct hominoids.

To address this, Urciuoli and his team used a 3D imaging technique to capture the complex shapes of the inner ear cavities of 27 species of monkeys and apes, including humans and the extinct ape Oreopithecus and fossil hominin Australopithecus. Their results confirmed that the shape of these structures most closely reflected the evolutionary relationships between the species and not, for example, how the animals moved.

The team next identified features of these bony chambers that were shared among several hominoid groups, and estimated how the inner ears of these groups' ancestors might have looked. Their findings for Australopithecus were consistent with this species being more closely related to modern humans than to other apes, while those for Oreopithecus supported the view that this was a much older species of ape that shared some features with other apes still alive today.

"Our work provides a testable hypothesis about inner ear evolution in apes and humans that should be subjected to further scrutiny based on the analysis of additional fossils, particularly for great apes that existed during the Miocene," says senior author David Alba, Director of the ICP. The Miocene period, which extends from about 23 to five million years ago, is when the evolutionary path to hominoids became distinct.

Read more at Science Daily

New telescope observations shed new light on black hole ejections

A black hole, ejecting material at close to the speed of light, has been observed using e-MERLIN, the UK's radio telescope array based at Jodrell Bank Observatory.

A research team based at Oxford University used e-MERLIN, as well as the VLA and MeerKAT telescopes based in the US and South Africa respectively, to track the ejecting material over a period of months.

The observations have allowed a deeper understanding of how black holes feed energy into their environment. Co-lead of the project, and author on the paper appearing in Nature Astronomy, Rob Fender said: "We've been studying these kinds of jets for over 20 years and never have we tracked them so beautifully over such a large distance."

The group successfully tracked the ejections from this particular system, known as MAXI J1820+070, after it went into outburst in the summer of 2018. The extreme distances reached from the black hole and the final angular separation are among the largest seen from such systems.

The ejections are moving so fast that they appear to be moving faster than the speed of light. They are not; rather, this is a phenomenon known as apparent superluminal motion.
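
The geometry behind apparent superluminal motion is standard: for material moving at true speed beta (in units of the speed of light) at angle theta to the line of sight, the apparent transverse speed is beta*sin(theta) / (1 - beta*cos(theta)), which can exceed the speed of light even though the true speed never does. A quick numerical check (the jet speed here is an illustrative value, not a measurement of MAXI J1820+070):

```python
# Apparent transverse speed of a relativistic jet (standard formula):
#   beta_app = beta * sin(theta) / (1 - beta * cos(theta))
# For fast jets viewed at small angles this exceeds 1 (the speed of
# light in these units) even though the true speed never does.
import math

def beta_apparent(beta, theta_deg):
    th = math.radians(theta_deg)
    return beta * math.sin(th) / (1 - beta * math.cos(th))

for theta in [5, 15, 30, 60]:
    print(f"beta = 0.99, theta = {theta:2d} deg -> "
          f"beta_app = {beta_apparent(0.99, theta):.2f}")
```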

Dr Rob Beswick, Head of e-MERLIN science operations at Jodrell Bank stated: "This work shows the power of world-class instruments such as e-MERLIN, MeerKAT and the VLA working in tandem.

"e-MERLIN's unique combination of resolution, sensitivity and rapid response made it the perfect instrument for this sort of study."

"Using our radio observations we were able to better estimate how much energy is contained in these ejections using a novel method for this type of system." explains Joe Bright, DPhil student at Oxford University's Department of Physics.

"Galactic black holes, such as MAXI J1820+070, are thought to be miniature versions of the supermassive black holes that are found at the centre of galaxies. The feedback from these black holes is thought to be a vital component regulating the growth of galaxies."

From Science Daily

The catch to putting warning labels on fake news

After the 2016 U.S. presidential election, Facebook began putting warning tags on news stories fact-checkers judged to be false. But there's a catch: Tagging some stories as false makes readers more willing to believe other stories and share them with friends, even if those additional, untagged stories also turn out to be false.

That is the main finding of a new study co-authored by an MIT professor, based on multiple experiments with news consumers. The researchers call this unintended consequence -- in which the selective labeling of false news makes other news stories seem more legitimate -- the "implied-truth effect" in news consumption.

"Putting a warning on some content is going to make you think, to some extent, that all of the other content without the warning might have been checked and verified," says David Rand, the Erwin H. Schell Professor at the MIT Sloan School of Management and co-author of a newly published paper detailing the study.

"There's no way the fact-checkers can keep up with the stream of misinformation, so even if the warnings do really reduce belief in the tagged stories, you still have a problem, because of the implied truth effect," Rand adds.

Moreover, Rand observes, the implied truth effect "is actually perfectly rational" on the part of readers, since there is ambiguity about whether untagged stories were verified or just not yet checked. "That makes these warnings potentially problematic," he says. "Because people will reasonably make this inference."

Even so, the findings also suggest a solution: Placing "Verified" tags on stories found to be true eliminates the problem.

The paper, "The Implied Truth Effect," has just appeared in online form in the journal Management Science. In addition to Rand, the authors are Gordon Pennycook, an assistant professor of psychology at the University of Regina; Adam Bear, a postdoc in the Cushman Lab at Harvard University; and Evan T. Collins, an undergraduate researcher on the project from Yale University.

BREAKING: More labels are better

The researchers conducted a pair of online experiments with a total of 6,739 U.S. residents, recruited via Amazon's Mechanical Turk platform. Participants were given a variety of true and false news headlines in a Facebook-style format. The false stories were chosen from the website Snopes.com and included headlines such as "BREAKING NEWS: Hillary Clinton Filed for Divorce in New York Courts" and "Republican Senator Unveils Plan To Send All Of America's Teachers Through A Marine Bootcamp."

The participants viewed an equal mix of true stories and false stories, and were asked whether they would consider sharing each story on social media. Some participants were assigned to a control group in which no stories were labeled; others saw a set of stories where some of the false ones displayed a "FALSE" label; and some participants saw a set of stories with warning labels on some false stories and "TRUE" verification labels for some true stories.

In the first place, stamping warnings on false stories does make people less likely to consider sharing them. For instance, with no labels being used at all, participants considered sharing 29.8 percent of false stories in the sample. That figure dropped to 16.1 percent of false stories that had a warning label attached.

However, the researchers also saw the implied truth effect take hold: readers were willing to share 36.2 percent of the remaining false stories that did not have warning labels, up from 29.8 percent in the unlabeled control condition.

"We robustly observe this implied-truth effect, where if false content doesn't have a warning, people believe it more and say they would be more likely to share it," Rand notes.

But when the warning labels on some false stories were complemented with verification labels on some of the true stories, participants were less likely to consider sharing false stories, across the board. In those circumstances, they shared only 13.7 percent of the headlines labeled as false, and just 26.9 percent of the nonlabeled false stories.

"If, in addition to putting warnings on things fact-checkers find to be false, you also put verification panels on things fact-checkers find to be true, then that solves the problem, because there's no longer any ambiguity," Rand says. "If you see a story without a label, you know it simply hasn't been checked."

Policy implications

The findings come with one additional twist that Rand emphasizes: participants in the survey did not seem to reject warnings on the basis of ideology. They were still likely to change their perceptions of stories with warning or verification labels, even if discredited news items were "concordant" with their stated political views.

"These results are not consistent with the idea that our reasoning powers are hijacked by our partisanship," Rand says.

Rand notes that, while continued research on the subject is important, the current study suggests a straightforward way that social media platforms can take action to further improve their systems of labeling online news content.

Read more at Science Daily

Almost alien: Antarctic subglacial lakes are cold, dark and full of secrets

More than half of the planet's fresh water is in Antarctica. While most of it is frozen in the ice sheets, underneath the ice, pools and streams of water flow into one another and into the Southern Ocean surrounding the continent. Understanding the movement of this water, and what is dissolved in it as solutes, reveals how carbon and nutrients from the land may support life in the coastal ocean.

Gathering data on the biogeochemistry of these systems is an undertaking of Antarctic proportions. Trista Vick-Majors, Assistant Professor of Biological Sciences at Michigan Technological University, is part of a team that gathered samples from the Whillans Subglacial Lake in West Antarctica and is lead author on a paper about the lake, recently published in Global Biogeochemical Cycles.

"Life is tough -- it can handle a lot," Vick-Majors said. "This paper is putting together what we know about the biology and how active it is under Antarctic ice with information about the composition of organic carbon in the lake."

Life beneath the ice puts up with a lot -- there is no sunlight, and pressure from the ice above, in combination with heat radiating up from the Earth's core, melts the ice to form the lake, so the temperature hovers just below freezing. Organic carbon, an important food source for microorganisms, is present in relatively high concentrations in Whillans Subglacial Lake, even if it lacks the verdant mess of a Midwest pond in late August. Instead, as cameras dropped down the borehole of Mercer Subglacial Lake (a neighbor of Whillans) reveal, the subglacial lake is dark, cold, full of soft and fluffy sediment, and lined with bubble-filled ice.

The lake bed looks more alien than earthly, and studying extreme environments like this does provide insight into what extraterrestrial life could be like or how earthly life might survive in similar conditions. Not that humans, penguins or fish could handle it; life in the waters beneath Antarctica's ice is mostly microbial. These microbes still show signs of life -- organic carbon and other chemical byproducts of living, eating, excreting and dying -- that Vick-Majors and her team can measure and budget.

Using mass balance calculations, the team showed that the pool of dissolved organic carbon in the Whillans Subglacial Lake can be produced in 4.8 to 11.9 years. As the lake fills and drains, which takes about the same amount of time, all those nutrients slip and slide their way to the ice-covered coast of the Southern Ocean. Based on the team's calculations, the subglacial lakes in the region provide 5,400% more organic carbon than what microbial life in the ice-covered ocean downstream needs to survive.
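
The mass-balance logic is plain division: the time needed to produce the standing pool of dissolved organic carbon is the pool size divided by the net production rate, and comparing exported carbon with downstream demand gives the percentage surplus. A sketch with invented placeholder numbers (the paper's actual pools and fluxes differ):

```python
# Mass-balance arithmetic of the kind used in the study. All numbers
# below are invented placeholders; only the logic matches the text.
doc_pool = 1000.0        # standing dissolved organic carbon pool (tonnes C)
production = 120.0       # net DOC production (tonnes C per year)

fill_time = doc_pool / production
print(f"time to produce the DOC pool: {fill_time:.1f} years")  # ~8.3

export = 550.0            # carbon exported to the coastal ocean (tonnes C/yr)
downstream_demand = 10.0  # what downstream microbial life needs (tonnes C/yr)
surplus_pct = 100.0 * export / downstream_demand
print(f"export is {surplus_pct:.0f}% of downstream demand")
```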

"There's no photosynthesis under the ice in the ocean downstream of this lake -- this limits the available food and energy sources in a way that you wouldn't find in a surface lake or the open ocean," Vick-Majors said. "The idea is that these subglacial lakes that are upstream could provide important sources of energy and nutrients for things living in the ice-covered regions of the Southern Ocean."

While the Whillans Subglacial Lake on its own indicates that upstream nutrients may be an important factor, it is only a single source of data in an ice-covered complex of underground lakes, streams and estuary-like mixing zones that undergo seasonal and sporadic fluxes.

To expand their view, Vick-Majors and the rest of the team have been gathering data at other sites (Mercer Subglacial Lake was sampled by the SALSA team in early 2019), and doing so is no small feat. They make it happen with a hot water drill, a specially designed hose, a 10-liter water sampling bottle, some sediment coring devices, and a week of summery polar weather that can plunge to 20 below. The crew wears Tyvek suits and all equipment is thoroughly cleaned. They also filter the drilling water, run it past several banks of ultra-violet lights to knock down microbial contamination, and then heat it up to use the hot water to open an approximately 1000-meter borehole down to the lake.

"Some of that melted ice water, which has now circulated through the drill, is removed from the hole so that when the lake is punctured, water from the lake moves up into the borehole," Vick-Majors said, explaining that the crew has to keep the hot water from the drill separate from the lake water to keep their samples and the lake clean. "It takes about 24 hours to drill the borehole and we keep it open for a few days; gathering a single sample or letting down the cameras can take two hours or more, depending on the equipment."

And the hole keeps trying to refreeze. Plus, Vick-Majors is not a lone scientist; she is embedded in an interdisciplinary team and everyone needs access to the borehole for different experiments. But for all the tight logistics and cold toes, she says it's worth it.

"There is water and there is life under the ice," Vick-Majors said. "These can teach us a lot about our planet because this is a great place to look at somewhat simplified ecosystems, without higher levels of organisms. So we can answer questions about life that can be really hard to answer in other places."

Read more at Science Daily

Mar 4, 2020

5,000-year-old milk proteins point to the importance of dairying in eastern Eurasia

Today dairy foods sustain and support millions around the world, including in Mongolia, where dairy foods account for up to 50% of calories consumed during the summer. Although dairy-based pastoralism has been an essential part of life and culture in the eastern Eurasian Steppe for millennia, the eastward spread of dairying from its origin in southwest Asia and the development of these practices are little understood. The current study, led by Shevan Wilkin and Jessica Hendy of the Max Planck Institute for the Science of Human History, presents the earliest evidence for dairy consumption in East Asia, circa 3000 BCE, and offers insights into the arrival and evolution of dairy pastoralism in prehistoric Mongolia.

Earliest dairy consumption & a possible path of entry

The highly mobile nature of pastoralist societies and the severe winds of the Eastern Steppe make occupied sites with direct evidence of the lives and culture of ancient Mongolians exceedingly rare. Instead, the researchers looked for clues in ritual human burial mounds, often marked by stone monuments and occasionally featuring satellite animal graves.

In collaboration with the National University of Mongolia, researchers analyzed dental calculus from individuals ranging from the Early Bronze Age to the Mongol Period. Three-quarters of all individuals contained evidence that they had consumed dairy foods, which demonstrates the widespread importance of this food source in both prehistoric and historic Mongolia. The study's results include the earliest direct evidence for dairy consumption in East Asia, identified in an individual from the Afanasievo site of Shatar Chuluu, which dates to roughly 3000 BCE. Previous DNA analysis on this individual revealed non-local genetic markers consistent with Western Steppe Herder populations, presenting Early Bronze Age Afanasievo migrations eastward via the Russian Altai as a viable candidate for the introduction of dairy and domestic livestock into eastern Eurasia.

Multiple different animal species were used for their milk

By sequencing the milk proteins extracted from the dental calculus, the scientists were able to determine which animal species were being used for dairy production, and thereby help to trace the progression of domestication, dairying, and pastoralism in the region. "Modern Mongolians use cow, sheep, goat, yak, camel, horse and reindeer for milk today, yet when each of these species were first utilized for dairy in Mongolia remains unclear," says Shevan Wilkin, lead author of the study. "What is clear is that the crucial renewable calories and hydration made available through the incorporation of dairying would have become essential across the arid and agriculturally challenging ancient Eastern Steppe."

The earliest individuals to show evidence of dairy consumption lived around 5000 years ago and consumed milk from ruminant species, such as cattle, sheep, and goats. A few thousand years later, at Bronze Age sites dated to after 1200 BCE, the researchers find the first evidence of horse milk consumption, occurring at the same time as early evidence for horse bridling and riding, as well as the use of horses at ritual burial sites. In addition, the study shows that during the Mongol Empire circa 1200-1400 CE, people also consumed the milk of camels. "We are excited that through the analysis of proteins we are able to see the consumption of multiple different animal species, even sometimes in the same individual. This gives us a whole new insight into ancient dairying practices," says Jessica Hendy, senior author of the study.

Millennia after the first evidence of horse milk consumption, horses remain vital to the daily lives of many in modern Mongolia, where mounted pastoralists rely on them to manage large herds of livestock, transport people and supplies, and provide a primary source of meat and milk. "Our findings suggest that the incorporation of horses into dairy pastoralism in Eastern Eurasia was closely linked to a broader economic transformation in the use of horses for riding, movement, and diet," says William Taylor of the University of Colorado-Boulder, one of the study's coauthors.

Read more at Science Daily

The brains of shrimps and insects are more alike than we thought

New research shows that crustaceans such as shrimps, lobsters and crabs have more in common with their insect relatives than previously thought -- when it comes to the structure of their brains.

Both insects and crustaceans possess mushroom-shaped brain structures known in insects to be required for learning, memory and possibly negotiating complex, three-dimensional environments, according to the study, led by University of Arizona neuroscientist Nicholas Strausfeld.

The research, published in the open-access journal eLife, challenges a widely held belief in the scientific community that these brain structures -- called "mushroom bodies" -- are conspicuously absent from crustacean brains.

In 2017, Strausfeld's team reported a detailed analysis of mushroom bodies discovered in the brain of the mantis shrimp, Squilla mantis. In the current paper, the group provides evidence that neuro-anatomical features that define mushroom bodies -- at one time thought to be an evolutionary feature proprietary to insects -- are present across crustaceans, a group that includes more than 50,000 species.

Crustaceans and insects are known to descend from a common ancestor that lived about a half billion years ago and has long been extinct.

"The mushroom body is an incredibly ancient, fundamental brain structure," said Strausfeld, Regents Professor of neuroscience and director of the University of Arizona's Center for Insect Science. "When you look across the arthropods as a group, it's everywhere."

In addition to insects and crustaceans, other arthropods include arachnids, such as scorpions and spiders, and myriapods, such as millipedes and centipedes.

Characterized by their external skeletons and jointed appendages, arthropods make up the most species-rich group of animals known, populating almost every conceivable habitat. About 480 million years ago, the arthropod family tree split, with one lineage producing the arachnids and another the mandibulates. The second group split again to provide the lineage leading to modern crustaceans, including shrimps and lobsters, and six-legged creatures, including insects -- the most diverse group of arthropods living today.

Decades of research have untangled arthropods' evolutionary relationships using morphological, molecular and genetic data, as well as evidence from the structure of their brains.

Mushroom bodies in the brain have been shown to be the central processing units where sensory input converges. Vision, smell, taste and touch all are integrated here, as studies on honeybees have shown. Arranged in pairs, each mushroom body consists of a column-like portion, called the lobe, capped by a dome-like structure, called the calyx, where neurons that relay information sent from the animal's sensory organs converge. This information is passed to neurons that supply thousands of intersecting nerve fibers in the lobes that are essential for computing and storing memories.

Recent research by other scientists has also shown that those circuits interact with other brain centers in strengthening or reducing the importance of a recollection as the animal gathers experiences from its environment.

"The mushroom bodies contain networks where interesting associations are being made that give rise to memory," Strausfeld said. "It's how the animal makes sense of its environment."

A more evolutionarily "modern" group of crustaceans called Reptantia, which includes many lobsters and crabs, does indeed appear to have brain centers that don't look at all like the insect mushroom body. This, the authors suggest, helped create the misconception that crustaceans lack the structures altogether.

Brain analysis has revealed that while the mushroom bodies found in crustaceans appear more diverse than those of insects, their defining neuroanatomical and molecular elements are all there.

Using crustacean brain samples, the researchers applied tagged antibodies that act like probes, homing in on and highlighting proteins that have been shown to be essential for learning and memory in fruit flies. Sensitive tissue-staining techniques further enabled visualization of mushroom bodies' intricate architecture.

"We know of several proteins that are necessary for the establishment of learning and memory in fruit flies," Strausfeld said, "and if you use antibodies that detect those proteins across insect species, the mushroom bodies light up every time."

Using this method revealed that the same proteins are not unique to insects; they show up in the brains of other arthropods, including centipedes, millipedes and some arachnids. Even vertebrates, including humans, have them in a brain structure called the hippocampus, a known center for memory and learning.

"Corresponding brain centers -- the mushroom body in arthropods, marine worms, flatworms and, possibly, the hippocampus of vertebrates -- appear to have a very ancient origin in the evolution of animal life," Strausfeld said.

So why do the most commonly studied crustaceans have mushroom bodies that can appear so drastically different from their insect counterparts? Strausfeld and his co-authors have a theory: Crustacean species that inhabit environments that demand knowledge about elaborate, three-dimensional areas are precisely the ones whose mushroom bodies most closely resemble those in insects, a group that has also mastered the three-dimensional world by evolving to fly.

"We don't think that's a coincidence," Strausfeld says. "We propose that that the complexity of inhabiting a three-dimensional world may demand special neural networks that allow a sophisticated level of cognition for negotiating that space in three dimensions."

Lobsters and crabs, on the other hand, spend their lives confined mostly to the seafloor, which may explain why they've historically been said to lack mushroom bodies.

"At the risk of offending colleagues who are partial to crabs and lobsters: I view many of these as flat-world inhabitants," Strausfeld says. "Future studies will be able to tell us which are smarter: the reef dwelling mantis shrimp, a top predator, or the reclusive lobster."

Read more at Science Daily

Scientists shed light on mystery of dark matter

Scientists have identified a sub-atomic particle that could have formed the "dark matter" in the Universe during the Big Bang.

Up to 80% of the Universe could be dark matter, but despite many decades of study, its physical origin has remained an enigma. While it cannot be seen directly, scientists know it exists because of its interaction via gravity with visible matter like stars and planets. Dark matter is composed of particles that do not absorb, reflect or emit light.

Now, nuclear physicists at the University of York are putting forward a new candidate for the mysterious matter -- a particle they recently discovered called the d-star hexaquark.

The particle is composed of six quarks -- the fundamental particles that usually combine in trios to make up protons and neutrons. Importantly, the six quarks in a d-star result in a boson particle, which means that when many d-stars are present they can combine together in very different ways to the protons and neutrons.

The research group at York suggest that in the conditions shortly after the Big Bang, many d-star hexaquarks could have grouped together as the universe cooled and expanded to form the fifth state of matter -- a Bose-Einstein condensate.

Dr Mikhail Bashkanov and Professor Daniel Watts from the department of physics at the University of York recently published the first assessment of the viability of this new dark matter candidate.

Professor Daniel Watts from the department of physics at the University of York said: "The origin of dark matter in the universe is one of the biggest questions in science and one that, until now, has drawn a blank. Our first calculations indicate that condensates of d-stars are a feasible new candidate for dark matter. This new result is particularly exciting since it doesn't require any concepts that are new to physics."

Co-author of the paper, Dr Mikhail Bashkanov from the Department of Physics at the University of York said: "The next step to establish this new dark matter candidate will be to obtain a better understanding of how the d-stars interact -- when do they attract and when do they repel each other.

Read more at Science Daily

Chest CT findings in coronavirus disease (COVID-19) pneumonia

A multi-center study (n=101) of the relationship between chest CT findings and the clinical conditions of coronavirus disease (COVID-19) pneumonia -- published ahead-of-print and open-access in the American Journal of Roentgenology (AJR) -- determined that most patients with COVID-19 pneumonia have ground-glass opacities (GGO) (86.1%) or mixed GGO and consolidation (64.4%) and vascular enlargement in the lesion (71.3%).

In addition, lead authors Wei Zhao, Zheng Zhong, and colleagues revealed that lesions present on CT images were more likely to have peripheral distribution (87.1%) and bilateral involvement (82.2%) and be lower lung predominant (54.5%) and multifocal (54.5%).

Zhao, Zhong, et al. collected their 101 cases of COVID-19 pneumonia across four institutions in China's Hunan province, comparing clinical characteristics and imaging features between two groups: nonemergency (mild or common disease) and emergency (severe or fatal disease).

Most of the cohort (70.2%) were 21-50 years old, and most patients (78.2%) had fever as the onset symptom. Only five patients showed disease associated with a family outbreak.

While the emergency group patients were older than the patients in the nonemergency group, the rate of underlying disease was not significantly different in the two groups -- suggesting that viral load could be a better reflection of the severity and extent of COVID-19 pneumonia.

As Zhao and Zhong explained further: "Architectural distortion, traction bronchiectasis, and pleural effusions, which may reflect the viral load and virulence of COVID-19, were statistically different between the two groups and may help us to identify the emergency type disease."

The authors of this AJR article also noted that CT involvement score can help evaluate the severity and extent of COVID-19 pneumonia.

From Science Daily

Mar 3, 2020

Two stars merged to form massive white dwarf

A massive white dwarf star with a bizarre carbon-rich atmosphere could be two white dwarfs merged together, according to an international team led by University of Warwick astronomers, and may have only narrowly avoided destruction.

They have discovered an unusual ultra-massive white dwarf around 150 light years from us with an atmospheric composition never seen before, the first time that a merged white dwarf has been identified using its atmospheric composition as a clue.

The discovery, published today (2 March) in the journal Nature Astronomy, could raise new questions about the evolution of massive white dwarf stars and on the number of supernovae in our galaxy.

This star, named WDJ0551+4135, was identified in a survey of data from the European Space Agency's Gaia telescope. The astronomers followed up with spectroscopy taken using the William Herschel Telescope, focusing on those white dwarfs identified as particularly massive -- a feat made possible by the Gaia mission. By breaking down the light emitted by the star, the astronomers were able to identify the chemical composition of its atmosphere and found that it had an unusually high level of carbon present.

Lead author Dr Mark Hollands, from the University of Warwick Department of Physics, said: "This star stood out as something we had never seen before. You might expect to see an outer layer of hydrogen, sometimes mixed with helium, or just a mix of helium and carbon. You don't expect to see this combination of hydrogen and carbon at the same time as there should be a thick layer of helium in between that prohibits that. When we looked at it, it didn't make any sense."

To solve the puzzle, the astronomers turned detective to uncover the star's true origins.

White dwarfs are the remains of stars like our own Sun that have burnt out all their fuel and shed their outer layers. Most are relatively lightweight, around 0.6 times the mass of our Sun, but this one weighs in at 1.14 solar masses, nearly twice the average mass. Despite being heavier than our Sun, it is compacted into two-thirds the diameter of Earth.

The age of the white dwarf is also a clue. Older stars orbit the Milky Way faster than younger ones, and this object is moving faster than 99% of the other nearby white dwarfs with the same cooling age, suggesting that this star is older than it looks.

Dr Hollands adds: "We have a composition that we can't explain through normal stellar evolution, a mass twice the average for a white dwarf, and a kinematic age older than that inferred from cooling. We're pretty sure of how one star forms one white dwarf and it shouldn't do this. The only way you can explain it is if it was formed through a merger of two white dwarfs."

The theory is that when one star in a binary system expands at the end of its life it will envelope its partner, drawing its orbit closer as the first star shrinks. The same will happen when the other star expands. Over billions of years, gravitational wave emission will shrink the orbit further, to the point that the stars merge together.
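
The "billions of years" figure follows from the standard timescale for orbital decay by gravitational-wave emission in a circular binary (the Peters 1964 formula). A quick evaluation for two typical white dwarfs, with the separation an illustrative choice rather than a measurement of this system's progenitors:

```python
# Orbital decay by gravitational-wave emission (Peters 1964):
#   t_merge = 5 c^5 a^4 / (256 G^3 m1 m2 (m1 + m2))
# Masses are typical white-dwarf values; the separation is an
# illustrative choice, not a measurement of WDJ0551+4135's progenitors.
G = 6.674e-11      # m^3 kg^-1 s^-2
c = 2.998e8        # m/s
M_SUN = 1.989e30   # kg
YEAR = 3.156e7     # s

m1 = m2 = 0.6 * M_SUN
a = 1.0e9          # separation in metres (~1.4 solar radii), hypothetical

t = 5 * c**5 * a**4 / (256 * G**3 * m1 * m2 * (m1 + m2))
print(f"time to merge ~ {t / YEAR:.1e} years")  # ~1.5e9: billions of years
```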

While white dwarf mergers have been predicted to occur, this one would be particularly unusual. Most of the mergers in our galaxy will be between stars with different masses, whereas this merger appears to be between two similarly sized stars. There is also a limit to how big the resulting white dwarf can be: at more than 1.4 solar masses it is thought that it would explode in a supernova, though it may be possible that these explosions can occur at slightly lower masses, so this star is useful in demonstrating how massive a white dwarf can get and still survive.

Because the merging process restarts the cooling of the star, it is difficult to determine how old it is. The white dwarf probably merged around 1.3 billion years ago but the two original white dwarfs may have existed for many billions of years prior.

It is one of only a handful of merged white dwarfs to be identified so far, and the only one identified via its composition.

Dr Hollands adds: "There aren't that many white dwarfs this massive, although there are more than you would expect to see which implies that some of them were probably formed by mergers.

"In the future we may be able to use a technique called asteroseismology to learn about the white dwarf's core composition from its stellar pulsations, which would be an independent method confirming this star formed from a merger.

Read more at Science Daily

Life on Titan cannot rely on cell membranes, according to computational simulations

Researchers from Chalmers University of Technology, Sweden, have made a new contribution to the ongoing search into the possibility of life on Titan, Saturn's largest moon. Using quantum mechanical calculations, they have shown that azotosomes, a proposed alternative to cell membranes, could not form under the conditions there. Their research is published in the journal Science Advances.

Titan, Saturn's largest moon, has a dynamic surface, with seasonal rainfall, and lakes and seas at its polar regions, as well as a dense, nitrogen-rich atmosphere. These similarities to Earth have led many to consider the possibility of life there. The liquids on Titan are not water, however, but seas of methane and ethane, and the surface temperature is around -180C. Lipid membranes, of the type common to life on Earth, could not function under such conditions. This has led researchers looking for signs of life on Titan to contemplate alternative forms of cell membrane that could tolerate these extremes. One such alternative form, suggested by a team from Cornell University, is called an 'azotosome'.

The idea of azotosomes has gained traction in the field of astrobiology, and it has been shown computationally that such structures would survive the conditions on Titan. The azotosome was proposed to be formed from the organic compound acrylonitrile -- which was later confirmed to exist on Titan.

"Titan is a fascinating place to test our understanding of the limits of prebiotic chemistry -- the chemistry that precedes life. What chemical, or possibly biological, structures might form, given enough time under such different conditions? The suggestion of azotosomes was a really interesting proposal for an alternative to cell membranes as we understand them," says Martin Rahm, Assistant Professor at the Department of Chemistry and Chemical Engineering at Chalmers University of Technology.

"But our new research paper shows that, unfortunately, although the structure could indeed tolerate the extremes of Titan, it would not form in the first place," he explains.

Using advanced quantum mechanical calculations, the researchers compared the energy of the proposed azotosome membrane embedded in methane with that of the molecular crystal form of acrylonitrile -- its molecular ice. They discovered that each building block added to the azotosome increased its energy significantly, making its formation progressively less likely thermodynamically. They conclude therefore that while azotosomes could survive on Titan, they would not self-assemble under such conditions. Instead, acrylonitrile would crystallise into its molecular ice.
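The thermodynamic logic can be pictured with a toy Boltzmann-factor calculation. This is a hedged sketch only: the per-building-block energy cost below is an invented placeholder, not a value from the paper, which relied on detailed quantum mechanical calculations.

    # Toy sketch: if each acrylonitrile unit added to an azotosome costs
    # energy relative to the molecular ice, the relative Boltzmann weight
    # of the assembled membrane collapses as it grows.
    import math

    k_B = 8.617e-5   # Boltzmann constant in eV/K
    T = 93.0         # approximate Titan surface temperature (about -180C), K
    dE = 0.01        # ASSUMED energy cost per building block vs. ice, in eV

    for n in (1, 10, 100):
        # relative probability of an n-unit azotosome vs. n ice molecules
        print(n, math.exp(-n * dE / (k_B * T)))

The weights fall off exponentially with size, which is one way to see why each added building block makes formation progressively less likely.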

Despite the 'negative' results of their work, Martin Rahm sees the study, which was done together with PhD student Hilda Sandström, as providing highly valuable information for ongoing research in astrobiology.

"With this work we hope to contribute to the ongoing discussion on the limits of chemistry and biology in environmental extremes. Though we have shown that acrylonitrile is not a viable building block for workable cell membranes on Titan, we now have a better understanding of the environmental limits for cell membranes. Titan is a highly interesting and unique environment with many unanswered questions and possibilities left to explore," he explains.

Their work is also an important step forward in demonstrating the potential of computational astrobiology, which offers the chance to evaluate, prior to experiments or sampling, whether or not a particular structure or process might be a biosignature, a marker for potential biology.

Interest in astrobiology on Titan is very high in the scientific community -- so much so that in 2026, NASA will launch the billion-dollar Dragonfly spacecraft, which will make the 8-year journey to Titan to investigate its surface. It will spend around 3 years flying to different locations around the moon, assessing the conditions for habitability, studying prebiotic chemistry, and looking for signs of past and present life.

The likelihood of life on Titan and other similar worlds

While stressing that life under the extreme conditions found on Titan and other such worlds is highly unlikely, the researchers do, however, consider another possibility: that cell membranes, necessary as they are for life on Earth, may not be a necessity for life everywhere.

One crucial function of cell membranes on Earth is to protect the contents of a cell from being diluted and destroyed in larger bodies of liquid water. However, on the surface of Titan, any hypothetical life-bearing biomolecule would exist in the solid state, due to the low temperature, and never risk such destruction by dissolution.

Because hypothetical biomolecules on Titan would be immobile, they would need to rely on the diffusion of small energetic molecules, such as hydrogen gas or acetylene, to reach them before they could grow or replicate. Such molecules would need to be transported through the surrounding atmosphere or through liquid hydrocarbons, and a membrane would, in this case, hinder the desired diffusion. A membrane would likely be a similar obstacle in the opposite direction, for the necessary removal of waste products from the biomolecule's metabolism.

Read more at Science Daily

To predict an epidemic, evolution can't be ignored

When scientists try to predict the spread of something across populations -- anything from a coronavirus to misinformation -- they use complex mathematical models to do so. Typically, they'll study the first few steps in which the subject spreads, and use that rate to project how far and wide the spread will go.

But what happens if a pathogen mutates, or information becomes modified, changing the speed at which it spreads? In a new study appearing in this week's issue of the Proceedings of the National Academy of Sciences (PNAS), a team of Carnegie Mellon University researchers shows for the first time how important these considerations are.

"These evolutionary changes have a huge impact," says CyLab faculty member Osman Yagan, an associate research professor in Electrical and Computer Engineering (ECE) and corresponding author of the study. "If you don't consider the potential changes over time, you will be wrong in predicting the number of people that will get sick or the number of people who are exposed to a piece of information."

Most people are familiar with epidemics of disease, but information itself -- nowadays traveling at lightning speeds over social media -- can experience its own kind of epidemic and "go viral." Whether a piece of information goes viral or not can depend on how the original message is tweaked.

"Some pieces of misinformation are intentional, but some may develop organically when many people sequentially make small changes like a game of 'telephone,'" says Yagan. "A seemingly boring piece of information can evolve into a viral Tweet, and we need to be able to predict how these things spread."

In their study, the researchers developed a mathematical theory that takes these evolutionary changes into consideration. They then tested their theory against thousands of computer-simulated epidemics in real-world networks, such as Twitter for the spread of information or a hospital for the spread of disease.

In the context of infectious disease, the team ran thousands of simulations using data from two real-world networks: a contact network among students, teachers and staff at a US high school, and a contact network among staff and patients in a hospital in Lyon, France.

These simulations served as a test bed: the theory that matches what is observed in the simulations would prove to be the more accurate one.
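The authors' mathematical theory is far more general, but the flavor of such a test bed can be shown with a toy simulation. The sketch below is not the paper's model: it assumes networkx is installed, the parameter values are invented, and the network is random rather than one of the real-world contact networks used in the study.

    # SIR-style spread on a random contact network in which the pathogen
    # can mutate into a more transmissible strain during transmission.
    import random
    import networkx as nx

    def simulate(n=2000, avg_deg=8, t1=0.05, t2=0.15, mu=0.2, seed=1):
        """t1, t2: per-contact transmission probabilities of strains 1 and 2;
        mu: chance a strain-1 transmission mutates into strain 2."""
        random.seed(seed)
        G = nx.erdos_renyi_graph(n, avg_deg / n, seed=seed)
        strain = {0: 1}              # node -> strain it was infected with
        infected, recovered = {0}, set()
        while infected:
            new = {}
            for u in infected:
                t = t1 if strain[u] == 1 else t2
                for v in G.neighbors(u):
                    if v not in strain and v not in new and random.random() < t:
                        s = strain[u]
                        if s == 1 and random.random() < mu:
                            s = 2    # evolutionary change mid-outbreak
                        new[v] = s
            recovered |= infected    # infectious for one step, then recover
            infected = set(new)
            strain.update(new)
        return len(recovered)

    print("final size with mutation:   ", simulate())
    print("final size without mutation:", simulate(mu=0.0))

A forecast fitted only to the early, strain-1 growth rate corresponds to the mu=0 run; allowing mutation typically produces a much larger outbreak, which is the gap a theory that accounts for evolution must capture.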

"We showed that our theory works over real-world networks," says the study's first author, Rashad Eletreby, who was a Carnegie Mellon Ph.D. student when he wrote the paper. "Traditional models that don't consider evolutionary adaptations fail at predicting the probability of the emergence of an epidemic."

While the study isn't a silver bullet for predicting the spread of today's coronavirus or the spread of fake news in today's volatile political environment with 100% accuracy -- one would need real-time data tracking the evolution of the pathogen or information to do that -- the authors say it's a big step.

Read more at Science Daily

Geologists determine early Earth was a 'water world' by studying exposed ocean crust

The Earth of 3.2 billion years ago was a "water world" of submerged continents, geologists say after analyzing oxygen isotope data from ancient ocean crust that's now exposed on land in Australia.

And that could have major implications for the origin of life.

"An early Earth without emergent continents may have resembled a 'water world,' providing an important environmental constraint on the origin and evolution of life on Earth as well as its possible existence elsewhere," geologists Benjamin Johnson and Boswell Wing wrote in a paper just published online by the journal Nature Geoscience.

Johnson is an assistant professor of geological and atmospheric sciences at Iowa State University and a recent postdoctoral research associate at the University of Colorado Boulder. Wing is an associate professor of geological sciences at Colorado. Grants from the National Science Foundation supported their study and a Lewis and Clark Grant from the American Philosophical Society supported Johnson's fieldwork in Australia.

Johnson said his work on the project started when he talked with Wing at conferences and learned about the well-preserved, 3.2-billion-year-old ocean crust from the Archaean eon (4 billion to 2.5 billion years ago) in a remote part of the state of Western Australia. Previous studies meant there was already a big library of geochemical data from the site.

Johnson joined Wing's research group and went to see ocean crust for himself -- a 2018 trip involving a flight to Perth and a 17-hour drive north to the coastal region near Port Hedland.

After taking his own rock samples and digging into the library of existing data, Johnson created a cross-section grid of the oxygen isotope and temperature values found in the rock.

(Isotopes are atoms of a chemical element with the same number of protons within the nucleus, but differing numbers of neutrons. In this case, differences in oxygen isotopes preserved within the ancient rock provide clues about the interaction of rock and water billions of years ago.)

Once he had two-dimensional grids based on whole-rock data, Johnson created an inverse model to come up with estimates of the oxygen isotopes within the ancient oceans. The result: Ancient seawater was enriched with about 4 parts per thousand more of a heavy isotope of oxygen (oxygen with eight protons and 10 neutrons, written as 18O) than an ice-free ocean of today.
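For readers unfamiliar with the notation: isotope enrichments like this are conventionally reported in delta notation, the per-mil deviation of a sample's 18O/16O ratio from the VSMOW seawater standard. A minimal sketch of that arithmetic follows; the inverse model in the paper is far more involved, and the sample value here is illustrative only.

    # Delta notation for oxygen isotopes.
    R_VSMOW = 2005.2e-6   # 18O/16O ratio of Vienna Standard Mean Ocean Water

    def delta18O(r_sample):
        """Per-mil deviation of a sample's 18O/16O ratio from VSMOW."""
        return (r_sample / R_VSMOW - 1) * 1000

    # A seawater sample enriched by 4 per mil relative to a modern ice-free
    # ocean (delta18O near 0) has an 18O/16O ratio about 0.4% higher:
    print(delta18O(R_VSMOW * 1.004))   # -> ~4.0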

How to explain that decrease in heavy isotopes over time?

Johnson and Wing suggest two possible ways: Water cycling through the ancient ocean crust could have differed from today's, with many more high-temperature interactions that enriched the ocean with the heavy isotopes of oxygen. Or, water cycling from continental rock could have reduced the percentage of heavy isotopes in ocean water.

"Our preferred hypothesis -- and in some ways the simplest -- is that continental weathering from land began sometime after 3.2 billion years ago and began to draw down the amount of heavy isotopes in the ocean," Johnson said.

The first idea -- that water cycled through the ancient ocean crust in a way distinct from how it happens today, causing the difference in isotope composition -- "is not supported by the rocks," Johnson said. "The 3.2-billion-year-old section of ocean crust we studied looks exactly like much, much younger ocean crust."

Johnson said the study demonstrates that geologists can build models and find new, quantitative ways to solve a problem -- even when that problem involves seawater from 3.2 billion years ago that they'll never see or sample.

Read more at Science Daily

Mar 2, 2020

Not a 'math person'? You may be better at learning to code than you think

Want to learn to code? Put down the math book. Practice those communication skills instead.

New research from the University of Washington finds that a natural aptitude for learning languages is a stronger predictor of learning to program than basic math knowledge, or numeracy. That's because writing code also involves learning a second language: it requires an ability to learn that language's vocabulary and grammar, and how they work together to communicate ideas and intentions. Other cognitive functions tied to both areas, such as problem solving and the use of working memory, also play key roles.

"Many barriers to programming, from prerequisite courses to stereotypes of what a good programmer looks like, are centered around the idea that programming relies heavily on math abilities, and that idea is not born out in our data," said lead author Chantel Prat, an associate professor of psychology at the UW and at the Institute for Learning & Brain Sciences. "Learning to program is hard, but is increasingly important for obtaining skilled positions in the workforce. Information about what it takes to be good at programming is critically missing in a field that has been notoriously slow in closing the gender gap."

Published online March 2 in Scientific Reports, an open-access journal from the Nature Publishing Group, the research examined the neurocognitive abilities of more than three dozen adults as they learned Python, a common programming language. Following a battery of tests to assess their executive function, language and math skills, participants completed a series of online lessons and quizzes in Python. Those who learned Python faster, and with greater accuracy, tended to have a mix of strong problem-solving and language abilities.

In today's STEM-focused world, learning to code opens up a variety of possibilities for jobs and extended education. Coding is associated with math and engineering; college-level programming courses tend to require advanced math to enroll and they tend to be taught in computer science and engineering departments. Other research, namely from UW psychology professor Sapna Cheryan, has shown that such requirements and perceptions of coding reinforce stereotypes about programming as a masculine field, potentially discouraging women from pursuing it.

But coding also has a foundation in human language: Programming involves creating meaning by stringing symbols together in rule-based ways.

Though a few studies have touched on the cognitive links between language learning and computer programming, some of the data is decades old, using languages such as Pascal that are now out of date, and none of them used natural language aptitude measures to predict individual differences in learning to program.

So Prat, who specializes in the neural and cognitive predictors of learning human languages, set out to explore the individual differences in how people learn Python. Python was a natural choice, Prat explained, because it resembles English, with structures such as paragraph-style indentation, and it uses many real words rather than symbols for functions.

To evaluate the neural and cognitive characteristics of "programming aptitude," Prat studied a group of native English speakers between the ages of 18 and 35 who had never learned to code.

Before learning to code, participants took two completely different types of assessments. First, participants underwent a five-minute electroencephalography scan, which recorded the electrical activity of their brains as they relaxed with their eyes closed. In previous research, Prat showed that patterns of neural activity while the brain is at rest can predict up to 60% of the variability in the speed with which someone can learn a second language (in that case, French).

"Ultimately, these resting-state brain metrics might be used as culture-free measures of how someone learns," Prat said.

Then the participants took eight different tests: one that specifically covered numeracy; one that measured language aptitude; and others that assessed attention, problem-solving and memory.

To learn Python, the participants were assigned ten 45-minute online instruction sessions using the Codecademy educational tool. Each session focused on a coding concept, such as lists or if/then conditions, and concluded with a quiz that a user needed to pass in order to progress to the next session. For help, users could turn to a "hint" button, an informational blog from past users and a "solution" button, in that order.

From a shared mirror screen, a researcher followed along with each participant and was able to calculate their "learning rate," or speed with which they mastered each lesson, as well as their quiz accuracy and the number of times they asked for help.

After completing the sessions, participants took a multiple-choice test on the purpose of functions (the vocabulary of Python) and the structure of coding (the grammar of Python). For their final task, they programmed a game -- Rock, Paper, Scissors -- considered an introductory project for a new Python coder. This helped assess their ability to write code using the information they had learned.
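The study does not publish participants' code, but for context, a minimal version of that final task might look like the sketch below (purely illustrative; any resemblance to what participants actually wrote is assumed).

    # A minimal Rock, Paper, Scissors in Python.
    import random

    BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

    player = input("rock, paper or scissors? ").strip().lower()
    computer = random.choice(list(BEATS))
    print("computer chose", computer)

    if player not in BEATS:
        print("Invalid choice.")
    elif player == computer:
        print("Tie!")
    elif BEATS[player] == computer:
        print("You win!")
    else:
        print("You lose!")

Even a program this small exercises both the "vocabulary" (names such as input and random.choice) and the "grammar" (conditionals, indentation) that the post-test measured.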

Ultimately, researchers found that scores from the language aptitude test were the strongest predictors of participants' learning rate in Python. Scores from tests in numeracy and fluid reasoning were also associated with Python learning rate, but each of these factors explained less variance than language aptitude did.

Put another way, across learning outcomes, participants' language aptitude, fluid reasoning and working memory, and resting-state brain activity were all greater predictors of Python learning than was numeracy, which explained an average of 2% of the differences between people. Importantly, Prat also found that the same characteristics of resting-state brain data that previously explained how quickly someone would learn to speak French also explained how quickly they would learn to code in Python.

Read more at Science Daily

Are grandma, grandpa sleepy during the day? They may be at risk for diabetes, cancer, more

Older people who experience daytime sleepiness may be at risk of developing new medical conditions, including diabetes, cancer and high blood pressure, according to a preliminary study released today that will be presented at the American Academy of Neurology's 72nd Annual Meeting in Toronto, Canada, April 25 to May 1, 2020.

The condition called hypersomnolence is defined as excessive daytime sleepiness even after having seven or more hours of sleep. It can be debilitating for some people, affecting the way that they perform at work and in other daily activities.

"Paying attention to sleepiness in older adults could help doctors predict and prevent future medical conditions," said study author Maurice M. Ohayon, M.D., Ph.D., DSc, of Stanford University in Stanford, Calif., and a member of the American Academy of Neurology. "Older adults and their family members may want to take a closer look at sleeping habits to understand the potential risk for developing a more serious medical condition."

The study involved 10,930 people; 34% of participants were 65 years or older.

Researchers interviewed participants by phone twice, three years apart. In the first interview, 23% of people over 65 met the criteria for excessive sleepiness. In the second interview, 24% reported excessive sleepiness. Of those, 41% said the sleepiness was a chronic problem.

The study found that people who reported sleepiness in the first phone interview had a 2.3 times greater risk of developing diabetes or high blood pressure three years later than those who did not experience sleepiness. They were also twice as likely to develop cancer. Of the 840 people who reported sleepiness at the first interview, 52 people, or 6.2%, developed diabetes compared to 74 people, or 2.9% of those who were never sleepy during the day. Also, of the 840 people who reported sleepiness, 20 people, or 2.4%, developed cancer compared to 21 people, or 0.8% of those who were never sleepy during the day.
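As a quick sanity check of those figures (a back-of-the-envelope calculation, not the study's analysis; the published 2.3 times figure is adjusted for other factors, so the raw ratio differs slightly):

    # Unadjusted relative risk of diabetes, from the reported counts.
    sleepy_cases, sleepy_total = 52, 840
    # 74 cases made up 2.9% of the never-sleepy group, implying its size:
    never_sleepy_total = round(74 / 0.029)        # about 2552 people
    risk_sleepy = sleepy_cases / sleepy_total     # ~0.062, i.e. 6.2%
    risk_never = 74 / never_sleepy_total          # ~0.029, i.e. 2.9%
    print(risk_sleepy / risk_never)               # ~2.1 before adjustment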

The results remained the same after researchers adjusted for other factors that could affect daytime sleepiness, such as gender and sleep apnea.

People who reported daytime sleepiness during both interviews had a 2.5 times greater risk of developing heart disease.

People who reported sleepiness only in the second interview were 50% more likely to also have diseases of the musculoskeletal system and connective tissue, such as arthritis, tendinitis and lupus, than those who did not have daytime sleepiness.

Read more at Science Daily

Egg stem cells do not exist, new study shows

Researchers at Karolinska Institutet in Sweden have analysed all cell types in the human ovary and found that the hotly debated so-called egg stem cells do not exist. The results, published in Nature Communications, open the way for research on improved methods of treating involuntary childlessness.

The researchers used single-cell analysis to study more than 24,000 cells collected from ovarian cortex samples of 21 patients. They also analysed cells collected from the ovarian medulla, allowing them to present a complete cell map of the human ovary.

One of the aims of the study was to establish the existence or non-existence of egg stem cells.

"The question is controversial since some research has reported that such cells do exist, while other studies indicate the opposite," says Fredrik Lanner, researcher in obstetrics and gynaecology at the Department of Clinical Science, Intervention and Technology at Karolinska Institutet, and one of the study's authors.

The question of whether egg stem cells exist affects issues related to fertility treatment, since stem cells have properties that differ from other cells.

"Involuntary childlessness and female fertility are huge fields of research," says co-author Pauliina Damdimopoulou, researcher in obstetrics and gynaecology at the same department. "This has been a controversial issue involving the testing of experimental fertility treatments."

The new study substantiates previously reported findings from animal studies -- that egg stem cells do not exist. Instead, the cells previously identified as egg stem cells appear to be so-called perivascular cells.

The new comprehensive map of ovarian cells can contribute to the development of improved methods of treating female infertility, says Damdimopoulou.

"The lack of knowledge about what a normal ovary looks like has held back developments," she says. "This study now lays the ground on which to produce new methods that focus on the egg cells that already exist in the ovary. This could involve letting egg cells mature in test tubes or perhaps developing artificial ovaries in a lab."

The results of the new study show that the main cell types in the ovary are egg cells, granulosa cells, immune cells, endothelial cells, perivascular cells and stromal cells.

Read more at Science Daily

Length of pregnancy alters the child's DNA

Researchers from Karolinska Institutet in Sweden, together with an international team, have mapped the relationship between length of pregnancy and chemical DNA changes in more than 6,000 newborn babies. For each additional week of pregnancy, DNA methylation changes in thousands of genes were detected in the umbilical cord blood. The study is published in Genome Medicine.

Premature birth, that is before 37 completed weeks of pregnancy, is common: between 5 and 10% of all children in the world are born prematurely. Most children will develop and grow normally, but premature birth is also linked to respiratory and lung disease, eye problems and neurodevelopmental disorders. This is especially true for children who are born very or extremely prematurely. During the fetal period, epigenetic processes, i.e. chemical modifications of the DNA, are important for controlling development and growth. One such epigenetic factor is DNA methylation, which in turn affects the degree of gene activation and how much of a particular protein is formed.

"Our new findings indicate that these DNA changes may influence the development of fetal organs," says Simon Kebede Merid, first author of the study and PhD student at Karolinska Institutet, Department of Clinical Science and Education, Södersjukhuset.

The majority of the DNA methylation changes observed at birth did not persist into childhood, but at 17% of the affected sites the levels were completely stable from birth to adolescence. The methylation levels you are born with at certain genes thus track with age.

"Now we need to investigate whether the DNA changes are linked to the health problems of those born prematurely," says Professor Erik Melén, at the Department of Clinical Science and Education, Södersjukhuset.

Epigenetics is a hot research topic that links genes, the environment and health. This work was done within the international Pregnancy and Childhood Epigenetics (PACE) consortium and represents contributions from 26 studies. Professor Melén's group also contributed to the first PACE paper, which showed that maternal smoking during pregnancy changes DNA in newborns, and led two PACE studies showing the effects of air pollution. Links to diseases such as asthma, allergy, obesity and even aging have also been shown.

Read more at Science Daily

Mar 1, 2020

How cardiorespiratory function is related to genetics

How high altitudes affect people's breathing, and how that breathing coordinates with the heartbeat, comes down to genetic differences, say researchers.

Clear physiological differences have already been demonstrated between people living in the Himalayas and Andes compared with people living at sea level, revealing an evolutionary adaptation in the control of blood flow and oxygen delivery to the brain and the rest of the body.

Now an international team led by Professor Aneta Stefanovska of Lancaster University has identified genes that are related to cardiorespiratory function during so-called acute periodic breathing. Their report is published in the Journal of Physiology.

Periodic breathing (PB) occurs in most humans at high altitudes and is characterised by periodic alternations between hyperventilation (too-fast breathing) and apnoea (no breathing). The altered respiratory pattern due to PB is accompanied by changes in heart rate and blood flow.

Breathing, the heart's ECG and microvascular blood flow were simultaneously monitored for 30 minutes in 22 healthy male subjects, with the same measurements repeated under normal and low oxygen levels, at both real and simulated altitudes of up to 3800m.

As part of the experiment, the participants were also taken in a cable car to a high altitude laboratory at the top of Aiguille du Midi mountain in Chamonix in France and tested immediately on arrival and after six hours at this altitude of 3842m.

The researchers found that orchestration between the participants' hearts and lungs, as measured by phase coherence, responded differently to periodic breathing depending on whether they had one of two specific genetic variants affecting the cardiorespiratory response to insufficient oxygen.
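The paper's phase coherence analysis uses time-resolved methods, but the basic quantity can be sketched with a simple phase synchronization index. This is a hedged illustration, not the authors' pipeline: numpy and scipy are assumed, and the two signals below are synthetic stand-ins rather than real ECG or respiration traces.

    # Phase coherence between two signals via the Hilbert transform:
    # 1 means fully phase-locked, 0 means no consistent phase relation.
    import numpy as np
    from scipy.signal import hilbert

    def phase_coherence(x, y):
        phi_x = np.angle(hilbert(x - np.mean(x)))
        phi_y = np.angle(hilbert(y - np.mean(y)))
        return np.abs(np.mean(np.exp(1j * (phi_x - phi_y))))

    rng = np.random.default_rng(0)
    t = np.linspace(0, 60, 6000)
    a = np.sin(2 * np.pi * 0.25 * t) + 0.1 * rng.standard_normal(t.size)
    b = np.sin(2 * np.pi * 0.25 * t + 1.0) + 0.1 * rng.standard_normal(t.size)
    c = np.sin(2 * np.pi * 0.31 * t) + 0.1 * rng.standard_normal(t.size)

    print("locked:  ", phase_coherence(a, b))   # close to 1
    print("drifting:", phase_coherence(a, c))   # much lower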

Chronic periodic breathing is generally seen as an unfavourable state, being associated with increased mortality in chronic heart failure, but in healthy people it may be an indication of better alertness to oxygen insufficiency at high altitudes.

Hypoxia, as well as occurring during rapid ascents to high altitudes, can also be a significant problem at sea level, being a contributory factor in many health conditions including cancer, strokes and heart attacks.

Professor Stefanovska said: "The similarities between hypoxia-induced PB at altitude, and the breathing characteristics observed in certain pathological states, provide an opportunity to further our understanding of the physiological processes involved in chronic hypoxic states that occur even when oxygen is abundant."

Read more at Science Daily

Lessons learned from addressing myths about Zika and yellow fever outbreaks in Brazil

When disease epidemics and outbreaks occur, conspiracy theories often emerge that compete with the information provided by public health officials. A Dartmouth-led study in Science Advances finds that information used to counter myths about Zika in Brazil not only failed to reduce misperceptions but also reduced the accuracy of people's other beliefs about the disease.

The results provide important context as countries launch public information campaigns about the new coronavirus (COVID-19), including how to protect oneself and prevent the spread of the disease.

"It is essential to evaluate public health messaging and information campaigns," said co-author Brendan Nyhan, a professor of government at Dartmouth. "Our results indicate that efforts to correct misperceptions about emerging diseases like Zika may not be as effective as we might hope."

The study was based on a nationally representative survey conducted in Brazil in 2017 and online survey experiments conducted there in 2017 (not long after the 2015-2016 Zika epidemic) and in 2018 (just after an unusually severe yellow fever outbreak). Using survey data, the team first demonstrated the prevalence of misperceptions among Brazilians about whether Zika can be transmitted through sexual contact (true) or casual contact (false).

The researchers then conducted three preregistered experiments testing the effectiveness of information provided by public health officials to dispel myths about Zika and yellow fever. These studies, which were conducted online among Brazilian adults, showed that corrective information about Zika not only failed to reduce misperceptions but also frequently reduced the accuracy of other beliefs people held about the disease (a finding that was replicated in both the 2017 and 2018 data).

The researchers found that corrective information about yellow fever was more effective than the material about Zika. However, exposure to this information did not increase support for public policies aimed at preventing the spread of either disease nor did it change people's intentions to engage in preventive behaviors.

From Science Daily