Oct 16, 2021

Early modern human from Southeast Asia adapted to a rainforest environment

Tropical rainforests have traditionally been seen as a barrier to early Homo sapiens. However, growing evidence shows that humans adapted to and lived in the tropical rainforest habitats of Southeast Asia. Some researchers have even suggested that other human species, such as Homo erectus and Homo floresiensis, became extinct because they could not adapt to this environment as our species did. Yet we know very little about the ecological adaptation of fossil humans, including what they were eating.

Zinc isotopes reveal what kind of food was primarily eaten

In this study, researchers analysed the zinc stable isotope ratios from animal and human teeth from two sites in the Huà Pan Province of Laos: Tam Pà Ling and the nearby site of Nam Lot. "The site of Tam Pà Ling is particularly important for palaeoanthropology and archaeology of Southeast Asia because it holds the oldest and most abundant fossil record of our species in this region," explains Fabrice Demeter, researcher at the University of Copenhagen. However, Tam Pà Ling has yielded little archaeological evidence such as stone tools, hearth features, plant remains, or cut marks on bones: only teeth and bones. This makes isotopic approaches the only way to gain insight into past diets.

Nitrogen isotope analysis, in particular, can help scientists learn whether past humans were eating animals or plants. However, the collagen in bones and teeth needed for these analyses does not preserve well, and in tropical regions like the one at Tam Pà Ling this problem is even more acute. "New methods -- such as zinc isotope analysis of enamel -- can now overcome these limitations and allow us to investigate teeth from regions and periods we could not study before," says study leader Thomas Tütken, professor at the Johannes Gutenberg University's Institute of Geosciences. "With zinc stable isotope ratios, we can now study Tam Pà Ling and learn what kind of food our earliest ancestors in this region were eating."
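
For readers curious about the arithmetic behind the method, zinc isotope measurements are conventionally reported in per-mil "delta" notation relative to a reference standard, and diet is inferred from where a fossil falls between the local herbivores and carnivores. Here is a minimal Python sketch of that comparison; the enamel ratios below are invented for illustration, not measurements from the study.

# Minimal sketch: express zinc isotope measurements in delta notation
# and compare a fossil human to local fauna. All ratios are invented;
# real studies measure against certified reference standards.

def delta66zn(sample_ratio, standard_ratio=1.0):
    """Per-mil deviation of a sample's 66Zn/64Zn ratio from the standard."""
    return (sample_ratio / standard_ratio - 1.0) * 1000.0

# Hypothetical enamel ratios, normalised to the standard.
fauna = {
    "leopard (carnivore)": 0.99920,    # carnivores: lower delta-66Zn
    "deer (herbivore)": 1.00105,       # herbivores: higher delta-66Zn
    "wild boar (omnivore)": 1.00010,
}
human = 1.00005

for name, ratio in fauna.items():
    print(f"{name:>22}: d66Zn = {delta66zn(ratio):+.2f} per mil")
print(f"{'fossil human':>22}: d66Zn = {delta66zn(human):+.2f} per mil")
# A human value between the carnivore and herbivore endmembers is the
# kind of signal that supports an omnivorous diet.

The trophic offset (carnivores lower than herbivores in delta-66Zn) is the empirical pattern the method relies on; the specific numbers above are placeholders.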

First study that reveals the whole diet of fossil humans from Southeast Asia

The fossil human studied in this research dates from the Late Pleistocene, more precisely from 46,000 to 63,000 years ago. Alongside it, various mammals from both sites, including water buffaloes, rhinos, wild boars, deer, bears, orangutans, macaques, and leopards, were also analysed. All these animals show different eating behaviours, making for an ideal background against which to determine what exactly humans were eating at the time. The more diverse the animal remains found at a particular site, the more information the researchers can use to understand the diet of prehistoric humans.

Comparing the zinc isotope values of the fossil Homo sapiens from Tam Pà Ling with those of the animals strongly suggests that its diet contained both plants and animals. This omnivorous diet differs from most nitrogen isotope data for humans of that time period in other regions of the world, where a meat-rich diet is almost consistently discerned. "Another kind of analysis performed in this study -- stable carbon isotope analysis -- indicates that the food consumed came strictly from forested environments," says Élise Dufour, researcher at the National Natural History Museum of Paris. "The results are the oldest direct evidence of subsistence strategies for Late Pleistocene humans in tropical rainforests."

Read more at Science Daily

Mammals on the menu: Snake dietary diversity exploded after mass extinction 66 million years ago

Modern snakes evolved from ancestors that lived side by side with the dinosaurs and that likely fed mainly on insects and lizards.

Then a miles-wide asteroid wiped out nearly all the dinosaurs and roughly three-quarters of the planet's plant and animal species 66 million years ago, setting the stage for the spectacular diversification of mammals and birds that followed in the early Cenozoic Era.

A new University of Michigan study shows that early snakes capitalized on that ecological opportunity and the smorgasbord that it presented, rapidly and repeatedly evolving novel dietary adaptations and prey preferences.

The study, which combines genetic evidence with ecological information extracted from preserved museum specimens, is scheduled for online publication Oct. 14 in the journal PLOS Biology.

"We found a major burst of snake dietary diversification after the dinosaur extinction -- species were evolving quickly and rapidly acquiring the ability to eat new types of prey," said study lead author Michael Grundler, who did the work for his doctoral dissertation at U-M and who is now a postdoctoral researcher at UCLA.

Mammals and birds, which were also diversifying in the wake of the extinction, began to appear in snake diets at that time. Specialized diets also emerged, such as snakes that feed only on slugs or snails, or snakes that eat only lizard eggs.

Similar bursts of dietary diversification were also seen when snakes arrived in new places, as when they colonized the New World.

"What this suggests is that snakes are taking advantage of opportunities in ecosystems," said U-M evolutionary biologist and study co-author Daniel Rabosky, who was Grundler's doctoral adviser. "Sometimes those opportunities are created by extinctions and sometimes they are caused by an ancient snake dispersing to a new land mass."

Those repeated transformational shifts in dietary ecology were important drivers of what evolutionary biologists call adaptive radiation, the development of a variety of new forms adapted for different habitats and ways of life, according to Grundler and Rabosky.

Modern snakes are impressively diverse, with more than 3,700 species worldwide. And they display a stunning variety of diets, from tiny leaf-litter snakes that feed only on invertebrates such as ants and earthworms to giant constrictors like boas and pythons that eat mammals as big as antelope.

So, how did legless reptiles that can't chew come to be such important predators on land and sea? To find out, Grundler and Rabosky first assembled a dataset on the diets of 882 modern-day snake species.

The dataset includes more than 34,000 direct observations of snake diets, from published accounts of scientists' encounters with snakes in the field and from the analysis of the stomach contents of preserved museum specimens. Many of those specimens came from the U-M Museum of Zoology, home to the world's second-largest collection of reptiles and amphibians.

All species living today are descended from other species that lived in the past. But because snake fossils are rare, direct observation of the ancient ancestors of modern snakes -- and the evolutionary relationships among them -- is mostly hidden from view.

However, those relationships are preserved in the DNA of living snakes. Biologists can extract that genetic information and use it to construct family trees, which they call phylogenies.

Grundler and Rabosky merged their dietary dataset with previously published snake phylogenetic data in a new mathematical model that allowed them to infer what long-extinct snake species were like.

"You might think it would be impossible to know things about species that lived long ago and for which we have no fossil information," said Rabosky, an associate professor in the U-M Department of Ecology and Evolutionary Biology and an associate curator at the Museum of Zoology.

"But provided that we have information about evolutionary relationships and data about species that are now living, we can use these sophisticated models to estimate what their long-ago ancestors were like."

In addition to showing a major burst of snake dietary diversification following the demise of the dinosaurs in what's known as the K-Pg mass extinction, the new study revealed similar explosive dietary shifts when groups of snakes colonized new locations.

For example, some of the fastest rates of dietary change -- including an increase of roughly 200% for one subfamily -- occurred when the Colubroidea superfamily of snakes made it to the New World.

The colubroids account for most of the world's current snake diversity, with representatives found on every continent except Antarctica. They include all venomous snakes and most other familiar snakes; the group does not include boas, pythons and several obscure snakes such as blind snakes and pipe snakes.

Grundler and Rabosky also found a tremendous amount of variability in how fast snakes evolve new diets. Some groups, such as blind snakes, evolved more slowly and maintained similar diets -- mostly ants and termite larvae -- for tens of millions of years.

On the other extreme are the dipsadine snakes, a large subfamily of colubroid snakes that includes more than 700 species. Since arriving in the New World roughly 20 million years ago, they have experienced a sustained burst of dietary diversification, according to the new study.

The dipsadines include goo-eaters, false water cobras, forest flame snakes and hognose snakes. Many of them imitate deadly coral snakes to ward off predators and are known locally as false coral snakes.

"In a relatively short period of time, they've had species evolve to specialize on earthworms, on fishes, on frogs, on slugs, on snakelike eels -- even other snakes themselves," Grundlersaid.

"A lot of the stories of evolutionary success that make it into the textbooks -- such as Darwin's famous finches -- are nowhere near as impressive as some groups of snakes. The dipsadines of South and Central America have just exploded in all aspects of their diversity, and yet they are almost completely unknown outside the community of snake biologists."

Rabosky and Grundler stressed that their study could not have been done without the information gleaned from preserved museum specimens.

Read more at Science Daily

Plant-eating lizards on the cusp of tooth evolution

Researchers at the Universities of Helsinki and Lyon and the Geological Survey of Finland found that complex teeth, a hallmark of mammals, also evolved several times in reptiles, prompting the evolutionary success of plant-eating lizards. However, in contrast to mammals, their tooth evolution was not unidirectional.

The study, published in Nature Communications, reveals that several lizard groups evolved teeth with multiple tips ("cusps") that allowed new plant-based diets and higher speciation rates -- that is, how fast new species appear. Surprisingly, tooth evolution was more flexible in lizards and snakes than mammals, revealing a more nuanced view of tooth and dietary evolutionary adaptations in vertebrates.

Tooth shape is closely linked with diet

Scientists have richly documented the connection between tooth shape and diet in mammals, showing that very diverse teeth fuelled their evolutionary success. But what about other toothed animals? The authors chose to study squamates, the group including lizards and snakes. "The teeth of squamates have received limited attention, even though they outnumber mammals two to one in species numbers and span many habitats and geographic ranges," remarks Nicolas Di-Poï, Associate Professor at the Institute of Biotechnology, University of Helsinki.

The researchers performed comparative analyses on tooth shape and diet data for more than 500 living and fossil species. They found the ancestor to all snakes and lizards had simple peg-like teeth and fed on insects. Later, complex teeth bearing multiple cusps -- similar to those of early mammals -- evolved multiple times independently in different lizard lineages. The appearance of multiple-cusped teeth allowed some lizard groups to evolve more plant-rich diets, sometimes leading to even more complex teeth.

Lizards' teeth evolution took two directions

The team also found that complex teeth and plant consumption provided an evolutionary advantage, as both traits favoured the appearance of new species. However, many lizard lineages also lost complex teeth to re-evolve the ancestral simple tooth morphology. "This came as a complete surprise," says PhD candidate Fabien Lafuma from the University of Helsinki, "as complex teeth appear as a critical innovation for both squamates and mammals."

Read more at Science Daily

Filling the gaps: Connecting genes to diseases through proteins

An international research team led by scientists from the Medical Research Council (MRC) Epidemiology Unit at the University of Cambridge has uncovered hundreds of connections between different human diseases through their shared origins in our genome, challenging the categorisation of diseases by organ, symptom, or clinical specialty.

A new study published in Science today generated data on thousands of proteins circulating in our blood and combined this with genetic data to produce a map showing how genetic differences that affect these proteins link together seemingly diverse as well as related diseases.

Proteins are essential functional units of the human body that are composed of amino acids and coded for by our genes. Malfunctions of proteins cause diseases across most medical specialties and organ systems, and proteins are also the most common target of drugs that exist today.

The findings published today help explain why seemingly unrelated symptoms can occur at the same time in patients and suggest that we should reconsider how diverse diseases can be caused by the same underlying protein or mechanism. Where a protein is a drug target, this information can point to new strategies for treating a variety of conditions, as well as minimising adverse effects.

Using blood samples from over 10,000 participants in the Fenland study, the team led by senior author Dr Claudia Langenberg at the MRC Epidemiology Unit and the Berlin Institute of Health at Charité Universitätsmedizin, Germany, demonstrated that natural variation in 2,500 regions of the human genome is very robustly associated with differences in the abundance or function of 5,000 proteins circulating in the blood.

This approach addresses an important bottleneck in the translation of basic science into clinically actionable insights. While large-scale studies of the human genome have identified many thousands of variants in our DNA sequence that are associated with disease, the underlying mechanisms often remain poorly understood due to uncertainties in mapping those variants to genes. By linking such disease-related DNA variations to the abundance or function of an encoded protein, the team produced strong evidence for which genes are involved and identified novel mechanisms by which proteins mediate genetic risk into disease onset.
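
The statistical core of each such gene-protein link is a regression of measured protein abundance on genotype at a DNA variant (a protein quantitative trait locus test). The sketch below simulates that single step with invented numbers -- the allele frequency, effect size, and Fenland-scale sample size are illustrative only; real analyses adjust for covariates and repeat the test across thousands of protein-variant pairs.

# Minimal pQTL sketch: regress blood protein abundance on genotype
# dosage (0, 1, or 2 copies of the alternate allele). Simulated data.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000                                    # participants (illustrative)
dosage = rng.binomial(2, 0.3, size=n)         # allele frequency 0.3 (invented)
protein = 0.15 * dosage + rng.normal(0.0, 1.0, size=n)  # effect: 0.15 SD per allele

# Ordinary least squares with an intercept.
X = np.column_stack([np.ones(n), dosage])
beta, *_ = np.linalg.lstsq(X, protein, rcond=None)

resid = protein - X @ beta
se = np.sqrt(resid @ resid / (n - 2) / np.sum((dosage - dosage.mean()) ** 2))
print(f"effect per allele: {beta[1]:.3f} (SE {se:.3f})")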

For example, multiple genome-wide association studies (GWAS) have linked a region of the human genome known as KAT8 with Alzheimer's disease but failed to identify which gene in this region was involved. By combining data on both proteins and genes the team was able to identify a gene in the KAT8 region named PRSS8, which codes for the protein prostasin, as a novel candidate gene in Alzheimer's disease. Similarly, they identified a novel risk gene for endometrial cancer (RSPO3).

The authors used these new insights to systematically test which of these protein-encoding genes affected a large range of diseases. They discovered more than 1,800 examples in which more than one disease was driven by variations in an individual gene and its protein products. What emerged was a network-like structure of human diseases, because many of the genes connected a range of seemingly diverse as well as related conditions in different tissues. This provides strong evidence that the respective protein is the origin, and points to new potential strategies for treatment.

Dr Langenberg explained:

'An extreme example we discovered of how one protein can be connected to several diseases is the protein Fibulin-3, which we connected to 37 conditions, including hypermobility, hernias, varicose veins, and a lower risk of carpal tunnel syndrome. A likely explanation is atypical formation of elastic fibres covering our organs and joints, leading to differences in elasticity of soft and connective tissues. This is also in line with features that others have observed in mice where this gene was deleted.'

Dr Maik Pietzner, a researcher at the MRC Epidemiology Unit and co-lead author of the study, added:

'Using our genome as the basis was key to the success of this study. Because we know that most of the proteins detected in blood have their origin in cells from other tissues, we integrated different biological layers, like gene expression, to enable us to trace proteins back to disease-relevant tissues. For example, we found that higher activity of the enzyme bile salt sulfotransferase was associated with an increased risk of gall stones through a liver specific mechanism. We linked around 900 proteins to their tissue of origin in this way.'

In collaboration with colleagues at the Helmholtz Centre in Munich, Germany, the authors have developed a bespoke web application to enable immediate dissemination of the results, and allow researchers worldwide to dive deeply into information on genes, proteins and diseases they are most interested in.

Read more at Science Daily

Oct 15, 2021

Did Venus ever have oceans?

The planet Venus can be seen as the Earth's evil twin. At first sight, it is of comparable mass and size to our home planet, similarly consists mostly of rocky material, holds some water and has an atmosphere. Yet, a closer look reveals striking differences between them: Venus' thick CO2 atmosphere, extreme surface temperature and pressure, and sulphuric acid clouds are indeed a stark contrast to the conditions needed for life on Earth. This may, however, not always have been the case. Previous studies have suggested that Venus may have been a much more hospitable place in the past, with its own liquid water oceans. A team of astrophysicists led by the University of Geneva (UNIGE) and the National Centre of Competence in Research (NCCR) PlanetS, Switzerland, investigated whether our planet's twin did indeed have milder periods. The results, published in the journal Nature, suggest that this is not the case.

Venus has recently become an important research topic for astrophysicists. ESA and NASA have decided this year to send no less than three space exploration missions over the next decade to the second closest planet to the Sun. One of the key questions these missions aim to answer is whether or not Venus ever hosted early oceans. Astrophysicists led by Martin Turbet, researcher at the Department of Astronomy of the Faculty of Science of the UNIGE and member of the NCCR PlanetS, have tried to answer this question with the tools available on Earth. "We simulated the climate of the Earth and Venus at the very beginning of their evolution, more than four billion years ago, when the surface of the planets was still molten," explains Martin Turbet. "The associated high temperatures meant that any water would have been present in the form of steam, as in a gigantic pressure cooker." Using sophisticated three-dimensional models of the atmosphere, similar to those scientists use to simulate the Earth's current climate and future evolution, the team studied how the atmospheres of the two planets would evolve over time and whether oceans could form in the process.

"Thanks to our simulations, we were able to show that the climatic conditions did not allow water vapour to condense in the atmosphere of Venus," says Martin Turbet. This means that the temperatures never got low enough for the water in its atmosphere to form raindrops that could fall on its surface. Instead, water remained as a gas in the atmosphere and oceans never formed. "One of the main reasons for this is the clouds that form preferentially on the night side of the planet. These clouds cause a very powerful greenhouse effect that prevented Venus from cooling as quickly as previously thought," continues the Geneva researcher.

Small differences with serious consequences


Surprisingly, the astrophysicists' simulations also reveal that the Earth could easily have suffered the same fate as Venus. If the Earth had been just a little closer to the Sun, or if the Sun had shone as brightly in its 'youth' as it does nowadays, our home planet would look very different today. It is likely the relatively weak radiation of the young Sun that allowed the Earth to cool down enough to condense the water that forms our oceans. For Emeline Bolmont, professor at UNIGE, member of PlanetS and co-author of the study, "this is a complete reversal in the way we look at what has long been called the 'Faint Young Sun paradox'. It has always been considered as a major obstacle to the appearance of life on Earth!" The argument was that if the Sun's radiation was much weaker than today, it would have turned the Earth into a ball of ice hostile to life. "But it turns out that for the young, very hot Earth, this weak Sun may have in fact been an unhoped-for opportunity," continues the researcher.

Read more at Science Daily

The planet does not fall far from the star

A compositional link between planets and their respective host star has long been assumed in astronomy. For the first time now, a team of scientists, with the participation of researchers of the National Centre of Competence in Research (NCCR) PlanetS from the University of Bern and the University of Zürich, deliver empirical evidence to support the assumption -- and partly contradict it at the same time.

Stars and planets are formed from the same cosmic gas and dust. In the course of the formation process, some of the material condenses and forms rocky planets, the rest is either accumulated by the star or becomes part of gaseous planets. The assumption of a connection between the composition of stars and their planets is therefore reasonable and is confirmed, for example, in the solar system by most rocky planets (Mercury being the exception). Nevertheless, assumptions, especially in astrophysics, do not always prove to be true. A study led by the Instituto de Astrofísica e Ciências do Espaço (IA) in Portugal, which also involves researchers from the NCCR PlanetS at the University of Bern and the University of Zürich, published today in the journal Science, provides the first empirical evidence for this assumption -- and at the same time partially contradicts it.

Condensed star vs rocky planet

To determine whether the compositions of stars and their planets are related, the team compared very precise measurements of both. For the stars, the emitted light was measured, which bears the characteristic spectroscopic fingerprint of their composition. The composition of the rocky planets was determined indirectly: their densities were derived from their measured masses and radii. Only recently have enough planets been measured precisely enough for meaningful investigations of this kind.
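
The indirect step is simple geometry: a planet's mean density follows from its mass and the volume of a sphere. A minimal sketch, using rounded Earth values and a hypothetical planet rather than any system from the study:

# Bulk density of a rocky planet from its measured mass and radius.
# The numbers are rounded Earth values, not data from the study.
import math

M_EARTH = 5.972e24   # kg
R_EARTH = 6.371e6    # m

def bulk_density(mass_kg, radius_m):
    """Mean density in kg/m^3: mass divided by the volume of a sphere."""
    return mass_kg / (4.0 / 3.0 * math.pi * radius_m ** 3)

# A hypothetical planet of 1.5 Earth masses and 1.1 Earth radii:
rho = bulk_density(1.5 * M_EARTH, 1.1 * R_EARTH)
print(f"planet: {rho:.0f} kg/m^3; Earth: {bulk_density(M_EARTH, R_EARTH):.0f} kg/m^3")

Because several different compositions can produce the same density, this is exactly where the host star's composition helps narrow the options, as the study goes on to show.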

"But since stars and rocky planets are quite different in nature, the comparison of their composition is not straightforward," as Christoph Mordasini, co-author of the study, lecturer of astrophysics at the university of Bern and member of the NCCR PlanetS begins to explain. "Instead, we compared the composition of the planets with a theoretical, cooled-down version of their star. While most of the star's material -- mainly hydrogen and helium -- remains as a gas when it cools, a tiny fraction condenses, consisting of rock-forming material such as iron and silicate," explains Christoph Mordasini.

At the University of Bern, the "Bern Model of Planet Formation and Evolution" has been continuously developed since 2003 (see infobox). Christoph Mordasini says: "Insights into the manifold processes involved in the formation and evolution of planets are integrated into the model." Using this Bern model the researchers were able to calculate the composition of this rock-forming material of the cooled-down star. "We then compared that with the rocky planets," Christoph Mordasini says.

Indications of the habitability of planets

"Our results show that our assumptions regarding star and planet compositions were not fundamentally wrong: the composition of rocky planets is indeed intimately tied to the composition of their host star. However, the relationship is not as simple as we expected," lead author of the study and researcher at the IA, Vardan Adibekyan, says. What the scientists expected, was that the star's abundance of these elements sets the upper possible limit. "Yet for some of the planets, the iron abundance in the planet is even higher than in the star" as Caroline Dorn, who co-authored the study and is a member of the NCCR PlanetS as well as Ambizione Fellow at the University of Zurich, explains. "This could be due to giant impacts on these planets that break off some of the outer, lighter materials, while the dense iron core remains," according to the researcher. The results could therefore give the scientists clues about the history of the planets.

"The results of this study are also very useful to constrain planetary compositions that are assumed based on the calculated density from mass and radius measurements," Christoph Mordasini explains. "Since more than one composition can fit a certain density, the results of our study tell us that we can narrow potential compositions down, based on the host star's composition," Mordasini says. And since the exact composition of a planet influences, for example, how much radioactive material it contains or how strong its magnetic field is, it can determine whether the planet is life-friendly or not.

Read more at Science Daily

'Broken heart' syndrome is on the rise in women

The study, published in the Journal of the American Heart Association (JAHA), suggests middle-aged and older women are being diagnosed with broken heart syndrome more frequently -- up to 10 times more often -- than younger women or men of any age.

The research also suggests that the rare condition has become more common, and the incidence has been rising steadily since well before the COVID-19 pandemic.

"Although the global COVID-19 pandemic has posed many challenges and stressors for women, our research suggests the increase in Takotsubo diagnoses was rising well before the public health outbreak," said Susan Cheng, MD, MPH, MMSc, director of the Institute for Research on Healthy Aging in the Department of Cardiology at the Smidt Heart Institute and senior author of the study. "This study further validates the vital role the heart-brain connection plays in overall health, especially for women."

What the Data Shows

Cheng and her research team used national hospital data collected from more than 135,000 women and men who were diagnosed with Takotsubo syndrome between 2006 and 2017. While confirming that women are diagnosed more frequently than men, the results also revealed that diagnoses have been increasing at least six to 10 times more rapidly for women ages 50 to 74 than for any other demographic.

Additional findings include:
 

  • Of the 135,463 documented cases of Takotsubo cardiomyopathy, the annual incidence increased steadily in both sexes, with women contributing most cases (83.3%), especially those over 50.
  • In particular, researchers observed a significantly greater increase in incidence among middle-aged women and older women, compared to younger women. For every additional diagnosis of Takotsubo in younger women -- or men of all age groups -- there were 10 additional cases diagnosed for middle-aged women and six additional diagnoses for older women.


Prior to this study, researchers only knew that women are more prone than men to developing Takotsubo syndrome. This latest study is the first to ask whether there are age-based sex differences and if case rates may be changing over time.
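
A question like "are case rates changing over time, and differently across demographics?" comes down to comparing fitted trends in annual counts. The sketch below shows the shape of such a comparison; the counts are simulated placeholders, not the study's data.

# Sketch of a rate-trend comparison: fit a log-linear trend to annual
# case counts for two groups. Counts are simulated, not study data.
import numpy as np

years = np.arange(2006, 2018)
# Invented annual diagnoses, rising faster for middle-aged women:
mid_aged = np.array([600, 700, 820, 960, 1120, 1300, 1520, 1760, 2050, 2380, 2760, 3200])
younger = np.array([100, 104, 109, 113, 118, 123, 128, 133, 139, 145, 151, 157])

def annual_growth(counts):
    """Average percent growth per year from the slope of log(counts)."""
    slope = np.polyfit(years, np.log(counts), 1)[0]
    return (np.exp(slope) - 1.0) * 100.0

print(f"middle-aged women: {annual_growth(mid_aged):.1f}% per year")
print(f"younger women:     {annual_growth(younger):.1f}% per year")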

The Brain and Heart Connection

As Cheng, who also serves as professor of cardiology and the Erika J. Glazer Chair in Women's Cardiovascular Health and Population Science, explains, the way the brain and nervous system respond to different types of stressors is something that changes as women age.

"There is likely a tipping point, just beyond midlife, where an excess response to stress can impact the heart," said Cheng, director of Cardiovascular Population Sciences in the Barbra Streisand Women's Heart Center. "Women in this situation are at especially affected, and the risk seems to be increasing."

The researchers are next investigating the longer-term implications of a Takotsubo diagnosis, molecular markers of risk, and the factors that may be contributing to rising case rates.

The Smidt Heart Institute has played a leading role in identifying female-pattern heart disease and conditions, developing new diagnostic tools and advancing specialized care for women.

Although medical professionals understand that the connection between stress and heart disease risk is critically important, there is still a lot to learn.

Read more at Science Daily

How the brain ignores distracting information to coordinate movements

As you read this article, touch receptors in your skin are sensing your environment. Your clothes and jewelry, the chair you're sitting on, the computer keyboard or mobile device you're using, even your fingers as they brush one another unintentionally -- each touch activates collections of nerve cells. But, unless a stimulus is particularly unexpected or required to help you orient your own movements, your brain ignores many of these inputs.

Now, Salk researchers have discovered how neurons in a small area of the mammalian brain help filter distracting or disruptive signals -- specifically from the hands -- to coordinate dexterous movements. Their results, published in the journal Science on October 14, 2021, may hold lessons in how the brain filters other sensory information as well.

"These findings have implications not only for gaining a better understanding of how our nervous system interacts with the world, but also for teaching us how to build better prosthetics and robots, and how to more effectively repair neural circuitry after disease or injury," says Eiman Azim, assistant professor in Salk's Molecular Neurobiology Laboratory and the William Scandling Developmental Chair.

Scientists have long known that input from the hands is needed to coordinate dexterous movements, from throwing a ball to playing a musical instrument. In one classic experiment, volunteers with anesthetized, numb fingertips found it extremely difficult to pick up and light a match.

"There's a common misconception that the brain sends a signal and you just perform the resulting movement," says Azim. "But in reality, the brain is constantly incorporating feedback information about the state of your limbs and fingers and adjusting its output in response."

If the brain responded to every signal from the body, it would quickly become overwhelmed -- as happens with some sensory processing disorders. Azim and his colleagues wanted to identify exactly how a healthy brain manages to pick and choose which tactile signals to take into account to coordinate dexterous movements like manipulating objects.

They used a combination of tools in mice to study cells within a small area in the brainstem called the cuneate nucleus, the first area where signals from the hand enter the brain. While it was known that sensory information passes through the cuneate nucleus, the team discovered that a set of neurons in this region actually controls how much information from the hands eventually passes on to other parts of the brain. By manipulating those circuits to allow more or less tactile feedback through, Azim's team could influence how mice perform dexterous tasks -- such as pulling a rope or learning to distinguish textures -- to earn rewards.

"The cuneate nucleus is often referred to as a relay station, as if information was just passing through it," says Staff Researcher James Conner, first author of the new paper. "But it turns out that sensory information is actually being modulated in this structure."

Conner and Azim went on to show how different parts of the cortex in mice -- the region responsible for more complex, adaptive behavior -- can in turn control the neurons of the cuneate to dictate how strongly they're filtering sensory information from the hands.

Today, despite decades of work, most prosthetics and robots struggle to be nimble-fingered and carry out small, precise hand movements. Azim and Conner say their work could help inform the design of better processes to integrate sensory information from artificial fingers into these kinds of systems to improve their dexterity. It also could have implications for understanding sensory processing disorders or troubleshooting what goes wrong in the brain when the flow of sensory information is thrown out of balance.

"Sensory systems have evolved to have very high sensitivity in order to maximize protective responses to external threats. But our own actions can activate these sensory systems, thereby generating feedback signals that can be disruptive to our intended actions," says Conner.

"We're constantly bombarded with information from the world, and the brain needs ways to decide what comes through and what doesn't," says Azim. "It's not just tactile feedback, but visual and olfactory and auditory, temperature and pain -- the lessons we're learning about this circuitry likely apply in general ways to how the brain modulates these types of feedback as well."

Read more at Science Daily

Oct 14, 2021

Underwater gardens boost coral diversity to stave off ‘biodiversity meltdown’

Corals are the foundation species of tropical reefs worldwide, but stresses ranging from overfishing to pollution to warming oceans are killing corals and degrading the critical ecosystem services they provide. Because corals build structures that make living space for many other species, scientists have known that losses of corals result in losses of other reef species. But the importance of coral species diversity for corals themselves was less understood.

A new study from two researchers at the Georgia Institute of Technology provides both hope and a potentially grim future for damaged coral reefs. In the study, published October 13 in Science Advances, Cody Clements and Mark Hay found that increasing coral richness by 'outplanting' a diverse group of coral species together improves coral growth and survivorship. This finding may be especially important in the early stages of reef recovery following large-scale coral loss -- and in supporting healthy reefs that in turn support fisheries, tourism, and coastal protection from storm surges.

The scientists also call for additional research to better understand and harness the mechanisms producing these positive species interactions, with dual aims to improve reef conservation and promote more rapid and efficient recovery of degraded reefs.

But the ecological pendulum swings the other way, too. If more coral species are lost, the synergistic effects could threaten other species in what Clements and Hay term a "biodiversity meltdown."

"Yes, corals are the foundation species of these ecosystems -- providing habitat and food for numerous other reef species," said Clements, a Teasley Postdoctoral Fellow in the School of Biological Sciences. "Negative effects on corals often have cascading impacts on other species that call coral reefs home. If biodiversity is important for coral performance and resilience, then a 'biodiversity meltdown' could exacerbate the decline of reef ecosystems that we're observing worldwide."

Clements and Hay traveled to Mo'orea, French Polynesia, in the tropical Pacific Ocean, where they planted coral gardens differing in coral species diversity to evaluate the relative importance of mutualistic versus competitive interactions among corals as they grew and interacted through time.

"We've done the manipulations, and the corals should be competing with each other, but in fact they do better together than they do on their own," said Hay, Regents Professor and Teasley Chair in the School of Biological Sciences. Hay is also co-director of the Ocean Science and Engineering graduate program at Georgia Tech. "We are still investigating the mechanisms causing this surprising result, but our experiments consistently demonstrate that the positive interactions are overwhelming negative interactions in the reef settings where we conduct these experiments. That means when you take species out of the system, you're taking out some of those positive interactions, and if you take out critical ones, it may make a big difference."

Under the sea, in a coral-growing garden, in the shade

Coral reefs are under threat worldwide. Hay notes that according to the EPA, the Caribbean has lost 80 to 90 percent of its coral cover. The Indo-Pacific region has lost half of all its corals over the last 30 years. During the bleaching event of 2015-2016 alone, nearly half of the remaining corals along the Great Barrier Reef bleached and died.

"The frequency of these big bleaching and heating events that are killing off corals has increased fairly dramatically over the last 20 to 30 years," he said. "There are hot spots here and there where coral reefs are still good, but they're small and isolated in general."

In their coral gardens in French Polynesia, Hay and Clements manipulated the diversity of the coral species that they planted on platforms resembling underwater chess tables, to see whether species richness and density affected coral productivity and survival.

Hay noted many previous, similar experiments involved bringing corals into a lab to "pit species against each other." But he points out, "We do all of our experiments in the real world. We're not as interested in whether it can happen, but whether it does happen."

An experimental setup suggested by Clements involving Coke bottles helped the scientists arrange their garden. The end tables "have Coca-Cola bottlecaps embedded in the top of them," Hay said. "We can then cut off the necks of Coke bottles, glue corals into the upside-down necks of these things, and then screw them in and out of these plots. This allows us to not only arrange what species we want where, but every couple of months we can unscrew and weigh them so we can get accurate growth rates."

The researchers found that corals benefitted from increased biodiversity, "but only up to a point," Clements noted. "Corals planted in gardens with an intermediate number of species -- three to six species in most cases -- performed better than gardens with a low number (one species) or a high number (nine species). However, we still do not fully understand the processes that contributed to these observations."

Read more at Science Daily

To watch a comet form, a spacecraft could tag along for a journey toward the sun

Deep in the solar system, between Jupiter and Neptune, lurk thousands of small chunks of ice and rock. Occasionally, one of them will bump into Jupiter's orbit, get caught and flung into the inner solar system -- towards the sun, and us.

This is thought to be the source of many of the comets that eventually pass Earth. A new study lays out the dynamics of this little-understood system. Among the findings: it would be doable for a spacecraft to fly to Jupiter, wait in Jupiter's orbit until one of these objects gets caught in the planet's gravity well, and hitch a ride with the object to watch it become a comet in real time.

"This would be an amazing opportunity to see a pristine comet 'turn on' for the first time," said Darryl Seligman, a postdoctoral researcher with the University of Chicago and corresponding author of the paper, which is accepted to The Planetary Science Journal. "It would yield a treasure trove of information about how comets move and why, how the solar system formed, and even how Earth-like planets form."

Thanks in part to discoveries of several major asteroid belts, scientists over the last 50 years have revamped their theories of how our solar system came to be. Rather than big planets quietly evolving in place, they now envision a system that was much more dynamic and unstable -- chunks of ice and rock scattered and smashing into each other, re-forming and moving around within the solar system.

Many of these objects eventually coalesced into the eight major planets, but others remain loose and scattered in several regions of space. "These minor bodies show you the solar system is actually this very dynamic and almost living place that's constantly in a state of flux," said Seligman.

Scientists are very familiar with the asteroid belt near Mars, as well as the larger one out past Neptune called the Kuiper belt. But between Jupiter and Neptune, there lurks another, lesser-known population of objects called the centaurs (named after the mythical hybrid creatures due to their classification halfway between asteroids and comets).

Occasionally, these centaurs will get sucked into the inner solar system and become comets. "These objects are very old, containing ice from the early days of the solar system that has never been melted," said Seligman. "When an object gets closer to the sun, the ice sublimates and produces these beautiful long tails.

"Therefore comets are interesting not only because they're beautiful; they give you a way to probe the chemical composition of things from the distant solar system."

In this study, scientists examined the centaur population and the mechanisms by which these objects occasionally become comets bound for the sun. They estimate that about half of the centaurs-turned-comets are nudged into the inner solar system by interacting with both Jupiter and Saturn's orbits. The other half come too close to Jupiter, then get caught in its orbit and flung toward the center of the solar system.

The latter mechanism suggested a perfect way to get a better look at these soon-to-be comets: Space agencies, the scientists said, could send a spacecraft to Jupiter and have it sit in orbit until a centaur bumps into Jupiter's orbit. Then the spacecraft could hitch a ride alongside the centaur as it heads toward the sun, taking measurements all the way as it transforms into a comet.

This is a beautiful but destructive process: A comet's beautiful tail is produced as its ice burns off as the temperature rises. The ice in comets is made up of different kinds of molecules and gases, which each start to burn up at different points along the way to the sun. By taking measurements of that tail, a spacecraft could learn what the comet was made of. "You could figure out where typical comet ices turn on, and also what the detailed internal structure of a comet is, which you have very little hope of figuring out from ground-based telescopes," Seligman said.

Meanwhile, the surface of the comet erupts as it heats up, creating pockmarks and craters. "Charting all of this would help you understand the dynamics of the solar system, which is important for things like understanding how to form Earth-like planets in solar systems," he said.

While the idea sounds complicated, NASA and other space agencies already have the technology to pull it off, the scientists said. Spacecraft routinely go to the outer solar system; NASA's Juno mission, currently taking wild photos of Jupiter, only took about five years to get there. Other recent missions also show that it's possible to visit objects even as they're moving: OSIRIS-REx visited an asteroid 200 million miles away, and Japan's Hayabusa 2 spacecraft brought back a handful of rocks from another asteroid.

There's even a possible target: A year and a half ago, scientists discovered that one of the centaurs, called LD2, will likely be sucked into Jupiter's orbit in about the year 2063. And as telescopes become more powerful, scientists may soon discover many more of these objects, Seligman said: "It's very possible there would be 10 additional targets in the next 40 years, any of which would be attainable by a spacecraft parked at Jupiter."

Read more at Science Daily

Primates’ ancestors may have left trees to survive asteroid

When an asteroid struck 66 million years ago and wiped out dinosaurs not related to birds and three-quarters of life on Earth, early ancestors of primates and marsupials were among the only tree-dwelling (arboreal) mammals that survived, according to a new study.

Arboreal species were especially at risk of extinction due to global deforestation caused by wildfires from the asteroid's impact.

In the study, computer models, fossil records and information from living mammals revealed that most of the surviving mammals did not rely on trees, though the few arboreal mammals that lived on -- including human ancestors -- may have been versatile enough to adapt to the loss of trees.

The study points to the influence of this extinction event, known as the Cretaceous-Paleogene (K-Pg) boundary, on shaping the early evolution and diversification of mammals.

"One possible explanation for how primates survived across the K-Pg boundary, in spite of being arboreal, might be due to some behavioral flexibility, which may have been a critical factor that let them survive," said Jonathan Hughes, the paper's co-first author and a doctoral student in the lab of Jeremy Searle, professor of ecology and evolutionary biology in the College of Agriculture and Life Sciences. Co-first author Jacob Berv, Ph.D. '19, is currently a Life Sciences Fellow at the University of Michigan.

The study, "Ecological Selectivity and the Evolution of Mammalian Substrate Preference Across the K-Pg Boundary," published October 11 in the journal Ecology and Evolution.

The earliest mammals appeared roughly 300 million years ago and may have diversified in tandem with an expansion of flowering plants about 20 million years prior to the K-Pg event. When the asteroid struck, many of these mammal lineages died off, Hughes said.

"At the same time, the mammals that did survive diversified into all the new ecological niches that opened up when dinosaurs and other species became extinct," Hughes said.

In the study, the researchers used published phylogenies (branching, tree-like diagrams that show evolutionary relatedness among groups of organisms) for mammals. They then classified each living mammal on those phylogenies into three categories -- arboreal, semi-arboreal and non-arboreal -- based on their preferred habitats. They also designed computer models that reconstructed the evolutionary history of mammals.

Mammal fossils from around the K-Pg boundary are very rare and are difficult to use to interpret an animal's habitat preference. The researchers compared information known from living mammals against the available fossils to help provide additional context for their results.

Generally, the models showed that surviving species were predominantly non-arboreal through the K-Pg event, with two possible exceptions: ancestors of primates and marsupials. Primate ancestors and their closest relatives were found to be arboreal right before the K-Pg event in every model. Marsupial ancestors were found to be arboreal in half of the model reconstructions.

The researchers also examined how mammals as a group may have been changing over time.

"We were able to see that leading up to the K-Pg event, around that time frame, there was a big spike in transitions from arboreal and semi-arboreal to non-arboreal, so it's not just that we are seeing mostly non-arboreal [species], but things were rapidly transitioning away from arboreality," Hughes said.

Read more at Science Daily

Stress on mothers can influence biology of future generations

A mother's response to stress can even influence her grandchildren.

Biologists at the University of Iowa found that roundworm mothers subjected to heat stress passed, under certain conditions and through modifications to their genes, the legacy of that stress exposure not only to their offspring but even to their offspring's children.

The researchers, led by Veena Prahlad, associate professor in the Department of Biology and the Aging Mind and Brain Initiative, looked at how a mother roundworm reacts when she senses danger, such as a change in temperature, which can be harmful or even fatal to the animal. In a study published last year, the biologists discovered the mother roundworm releases serotonin when she senses danger. The serotonin travels from her central nervous system to warn her unfertilized eggs, where the warning is stored, so to speak, and then passed to offspring after conception.

Examples of such genetic cascades abound, even in humans. Studies have shown that pregnant women affected by famine in the Netherlands from 1944 to 1945, known as the Dutch Hunger Winter, gave birth to children who were influenced by that episode as adults -- with higher rates than average of obesity, diabetes, and schizophrenia.

In this study, the biologists wanted to find out how the memory of stress exposure was stored in the egg cell.

"Genes have 'memories' of past environmental conditions that, in turn, affect their expression even after these conditions have changed," Prahlad explains. "How this 'memory' is established and how it persists past fertilization, embryogenesis, and after the embryo develops into adults is not clear. "This is because during embryogenesis, most organisms typically reset any changes that have been made to genes because of the genes' past activity."

Prahlad and her team turned to the roundworm, a creature regularly studied by scientists, for clues. They exposed mother roundworms to unexpected stresses and found the stress memory was ingrained in the mother's eggs through the actions of a protein called the heat shock transcription factor, or HSF1. The HSF1 protein is present in all plants and animals and is activated by changes in temperature, salinity, and other stressors.

The team found that HSF1 recruits another protein, an enzyme called a histone 3 lysine 9 (H3K9) methyltransferase. The latter normally acts during embryogenesis to silence genes and erase the memory of their prior activity.

However, Prahlad's team observed something else entirely.

"We found that HSF1 collaborates with the mechanisms that normally act to 'reset' the memory of gene expression during embryogenesis to, instead, establish this stress memory," Prahlad says.

One of these newly silenced genes encodes the insulin receptor, which is central to metabolic changes with diabetes in humans, and which, when silenced, alters an animal's physiology, metabolism, and stress resilience. Because these silencing marks persisted in offspring, their stress-response strategy was switched from one that depended on the ability to be highly responsive to stress, to relying instead on mechanisms that decreased stress responsiveness but provided long-term protection from stressful environments.

"What we found all the more remarkable was that if the mother was exposed to stress for a short period of time, only progeny that developed from her germ cells that were subjected to this stress in utero had this memory," Prahlad says. "The progeny of these progeny (the mother's grandchildren) had lost this memory. However, if the mother was subjected to a longer period of stress, the grandchildren generation retained this memory. Somehow the 'dose' of maternal stress exposure is recorded in the population."

The researchers plan to investigate these changes further. HSF1 is not only required for stress resistance; increased levels of both HSF1 and the silencing mark are also associated with cancer and metastasis. Because HSF1 exists in many organisms, its newly discovered interaction with the H3K9 methyltransferase to drive gene silencing is likely to have larger repercussions.

Read more at Science Daily

Sense of smell is our most rapid warning system

The ability to detect and react to the smell of a potential threat is a precondition of our and other mammals' survival. Using a novel technique, researchers at Karolinska Institutet in Sweden have been able to study what happens in the brain when the central nervous system judges a smell to represent danger. The study, which is published in PNAS, indicates that negative smells associated with unpleasantness or unease are processed earlier than positive smells and trigger a physical avoidance response.

"The human avoidance response to unpleasant smells associated with danger has long been seen as a conscious cognitive process, but our study shows for the first time that it's unconscious and extremely rapid," says the study's first author Behzad Iravani, researcher at the Department of Clinical Neuroscience, Karolinska Institutet.

The olfactory organ takes up about five per cent of the human brain and enables us to distinguish between many million different smells. A large proportion of these smells are associated with a threat to our health and survival, such as that of chemicals and rotten food. Odour signals reach the brain within 100 to 150 milliseconds after being inhaled through the nose.

The survival of all living organisms depends on their ability to avoid danger and seek rewards. In humans, the olfactory sense seems particularly important for detecting and reacting to potentially harmful stimuli.

It has long been a mystery just which neural mechanisms are involved in the conversion of an unpleasant smell into avoidance behaviour in humans. One reason for this is the lack of non-invasive methods of measuring signals from the olfactory bulb, the first part of the rhinencephalon (literally "nose brain"), which has direct (monosynaptic) connections to the important central parts of the nervous system that help us detect and remember threatening and dangerous situations and substances.

Researchers at Karolinska Institutet have now developed a method that for the first time has made it possible to measure signals from the human olfactory bulb, which processes smells and can in turn transmit signals to parts of the brain that control movement and avoidance behaviour.

Their results are based on three experiments in which participants were asked to rate their experience of six different smells, some positive, some negative, while the electrophysiological activity of the olfactory bulb when responding to each of the smells was measured.

"It was clear that the bulb reacts specifically and rapidly to negative smells and sends a direct signal to the motor cortex within about 300 ms," says the study's last author Johan Lundström, associate professor at the Department of Clinical Neuroscience, Karolinska Institutet. "The signal causes the person to unconsciously lean back and away from the source of the smell."

He continues:

"The results suggest that our sense of smell is important to our ability to detect dangers in our vicinity, and much of this ability is more unconscious than our response to danger mediated by our senses of vision and hearing."

Read more at Science Daily

Oct 13, 2021

Immense set of mysterious fast radio bursts

An international team of astronomers recently observed more than 1,650 fast radio bursts (FRBs) detected from one source in deep space, which amounts to the largest set -- by far -- of the mysterious phenomena ever recorded.

More than a decade after the discovery of FRBs, astronomers are still baffled by the origins of the millisecond-long, cosmic explosions that each produce the energy equivalent to the sun's annual output.

In a study published in the Oct. 13 issue of the journal Nature, scientists -- including UNLV astrophysicist Bing Zhang -- report on the discovery of a total of 1,652 independent FRBs from one source over the course of 47 days in 2019. The source, dubbed FRB 121102, was observed using the Five-hundred-meter Aperture Spherical Telescope (FAST) in China, and represents more FRBs in one event than all previous reported occurrences combined.

"This was the first time that one FRB source was studied in such great detail," said Zhang, one of the study's corresponding authors. "The large burst set helped our team home in like never before on the characteristic energy and energy distribution of FRBs, which sheds new light on the engine that powers these mysterious phenomena."

Since FRBs were first discovered in 2007, astronomers worldwide have turned to powerful radio telescopes like FAST to trace the bursts and to look for clues on where they come from and how they're produced. The source that powers most FRBs is widely believed to be magnetars, incredibly dense, city-sized neutron stars that possess the strongest magnetic fields in the universe. And while scientists are gaining greater clarity on what produces FRBs, the exact location of where they occur is still a mystery.

A mystery that recent results may be starting to unravel.

According to Zhang, there are two active models for where FRBs come from. One holds that they come from magnetospheres, that is, from within a magnetar's strong magnetic field. The other is that FRBs form from relativistic shocks outside the magnetosphere, traveling at close to the speed of light.

"These results pose great challenges to the latter model," says Zhang. "The bursts are too frequent and -- given that this episode alone amounts to 3.8% of the energy available from a magnetar -- it adds up to too much energy for the second model to work."

The bursts were measured by FAST within a total of 59.5 hours over 47 days from Aug. 29 to Oct. 29, 2019.

"During its most active phase, FRB 121102 included 122 bursts measured within a one-hour period, the highest repeat rate ever observed for any FRB," said Pei Wang, one of the article's lead authors from the National Astronomical Observatories of the Chinese Academy of Sciences (NAOC).

Researchers expect that FAST will continue to systematically investigate a large number of repeating FRBs in the future.

"As the world's largest antenna, FAST's sensitivity proves to be conducive to revealing intricacies of cosmic transients, including FRBs," said Di Li, the study's lead researcher from NAOC.

Read more at Science Daily

Ancient feces shows people in present-day Austria drank beer and ate blue cheese up to 2,700 years ago

Human feces don't usually stick around for long -- and certainly not for thousands of years. But exceptions to this general rule are found in a few places in the world, including prehistoric salt mines of the Austrian UNESCO World Heritage area Hallstatt-Dachstein/Salzkammergut. Now, researchers who've studied ancient fecal samples (or paleofeces) from these mines have uncovered some surprising evidence: the presence of two fungal species used in the production of blue cheese and beer. The findings appear in the journal Current Biology on October 13.

"Genome-wide analysis indicates that both fungi were involved in food fermentation and provide the first molecular evidence for blue cheese and beer consumption during Iron Age Europe," says Frank Maixner (@FrankMaixner) of the Eurac Research Institute for Mummy Studies in Bolzano, Italy.

"These results shed substantial new light on the life of the prehistoric salt miners in Hallstatt and allow an understanding of ancient culinary practices in general on a whole new level," adds Kerstin Kowarik (@KowarikKerstin) of the Museum of Natural History Vienna. "It is becoming increasingly clear that not only were prehistoric culinary practices sophisticated, but also that complex processed foodstuffs as well as the technique of fermentation have held a prominent role in our early food history."

Earlier studies had already shown the potential of prehistoric paleofeces from salt mines to offer important insights into early human diet and health. In the new study, Maixner, Kowarik, and their colleagues added in-depth microscopic, metagenomic, and proteomic analyses to explore the microbes, DNA, and proteins that were present in those poop samples.

These comprehensive studies allowed them to reconstruct the diet of the people who once lived there. They also could get information about the ancient microbes that inhabited their guts. Gut microbes are collectively known as the gut microbiome and are now recognized to have an important role in human health.

Their dietary survey identified bran and glumes of different cereals as among the most prevalent plant fragments. They report that this highly fibrous, carbohydrate-rich diet was supplemented with proteins from broad beans and occasionally with fruits, nuts, or animal food products.

In keeping with their plant-heavy diet, the ancient miners up to the Baroque period also had gut microbiome structures more like those of modern non-Westernized individuals, whose diets are also mainly composed of unprocessed food, fresh fruits and vegetables. The findings suggest a more recent shift in the Western gut microbiome as eating habits and lifestyles changed.

When the researchers extended their microbial survey to include fungi, that's when they got their biggest surprise: an abundance in one of their Iron Age samples of Penicillium roqueforti and Saccharomyces cerevisiae DNA.

"The Hallstatt miners seem to have intentionally applied food fermentation technologies with microorganisms which are still nowadays used in the food industry," Maixner says.

Read more at Science Daily

Scientists discover a highly potent antibody against SARS-CoV-2

Scientists at Lausanne University Hospital (CHUV) and EPFL have discovered a highly potent monoclonal antibody that targets the SARS-CoV-2 spike protein and is effective at neutralizing all variants of concern identified to date, including the delta variant. Their findings are published in the journal Cell Reports.

The newly identified antibody was isolated using lymphocytes from COVID-19 patients enrolled in the ImmunoCoV study being carried out by CHUV's Service of Immunology and Allergy. This antibody is one of the most powerful identified so far against SARS-CoV-2. Structural characterization indicates that it binds to an area of the spike protein that is not subject to mutation. Through this tight interaction, the antibody blocks the spike protein from binding to cells expressing the ACE2 receptor, which is the receptor the virus uses to enter and infect lung cells. The antibody thereby halts the viral replication process, enabling a patient's immune system to eliminate SARS-CoV-2 from the body. This protective mechanism was demonstrated through in vivo tests on hamsters: animals given the antibody were protected against infection even after receiving a highly infectious dose.

In addition to its antiviral properties, the new antibody is designed to have a lasting effect in humans. A typical unaltered antibody provides protection for up to 3-4 weeks. But this new one can protect patients for 4-6 months. That makes it an interesting preventive-treatment option for unvaccinated at-risk individuals or for vaccinated individuals who are unable to produce an immune response. Immunocompromised patients, organ transplant recipients and those suffering from certain kinds of cancer could be protected against SARS-CoV-2 by receiving antibody injections two or three times a year.

CHUV and EPFL now plan to build on these promising results in association with a start-up company that, through cooperation and intellectual property agreements, will carry out the clinical development and production of the antibody-based drug. Clinical trials of the drug should begin in late 2022.

Treatment or prophylaxis

This research was conducted jointly by CHUV's Service of Immunology and Allergy, headed by Prof. Giuseppe Pantaleo and Dr. Craig Fenwick, and by EPFL's Laboratory of Virology and Genetics, headed by Prof. Didier Trono and Dr. Priscilla Turelli. The research team was able to respond to the pandemic and discover this neutralizing antibody so quickly thanks to the multi-year support of the Swiss Vaccine Research Institute. Prof. Pantaleo's department at CHUV also received support from the Corona Accelerated R&D in Europe (CARE) program, which is part of the Innovative Medicine Initiative (IMI) -- a public-private partnership that seeks to address bottlenecks in the drug discovery and development process in Europe.

Read more at Science Daily

Catching malaria evolution in the act

Understanding how malaria parasites evolve after a human is bitten by an infected mosquito is very difficult. There can be billions of individual parasites in a patient's bloodstream and traditional genetic sequencing techniques can't identify the raw material for evolution: new mutations.

"If you want to understand if the parasites are related to each other, if they are all from one mosquito or multiple mosquito bites, and what novel mutations are emerging in an infection, then you have to bring it down to the individual genome level," says Assistant Professor Ian Cheeseman, Ph.D., and Co-lead of the Host-Pathogen Interactions Program at Texas Biomedical Research Institute.

Thanks to a combination of advanced techniques, Cheeseman and his collaborators are now able to sequence the genomes of individual parasites found in the blood of infected patients. Notably, they can now do this even when the infection burden is very low, which can occur during asymptomatic infections. They describe their approach this month in the journal Cell Host & Microbe. Gaining this incredibly detailed view of malaria parasite genetics and evolution is expected to give researchers and drug companies ammunition to develop more effective treatments, vaccines or therapies.

Malaria infects more than 200 million people a year, killing more than 400,000 in 2019 -- most of them young children. Of the five malaria parasite species that infect humans, two are the most widespread: Plasmodium falciparum, which is the deadliest; and Plasmodium vivax, which is the leading cause of recurring malaria infections because it can lie dormant in the liver and reemerge later.

"We were really excited to understand how this dormant liver stage might impact genetic variation and evolution in a P. vivax infection," says co-first paper author Aliou Dia, Ph.D., a postdoctoral researcher in Cheeseman's lab who is now at the University of Maryland School of Medicine.

The challenge is that when P. vivax does emerge, it only infects very young red blood cells, so parasites are rare in the blood. Analyzing such low levels of infection is the microbiology equivalent of finding a needle in a haystack.

The scientists started with red blood cells, which become slightly magnetic when infected with malaria parasites. They used a high-powered magnet to separate the infected red blood cells from uninfected cells. The infected cells were then run through a machine called a flow cytometer, which uses a laser and fluorescent tags to detect whether parasite DNA is indeed present. Cells with parasite DNA are plopped one by one into test wells and ultimately run through a genetic sequencing machine to decode each individual parasite genome.

Single cell sequencing enables the scientists to precisely compare individual parasite genomes to one another to determine how related they are to each other. They can also really dig down and pinpoint single differences in the genetic code -- say an A is changed to a T -- to see what happened since the parasite infected that patient.
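
As a schematic of that comparison step, the sketch below uses invented toy sequences (real single-cell parasite genomes run to millions of bases and need alignment and quality filtering first) to show how a single-base difference, such as an A changed to a T, can be pinpointed once two genomes are lined up.

    # Toy illustration: find single-base differences between two aligned
    # parasite sequences. Both sequences here are hypothetical.
    reference = "ATGGCATTACGA"  # e.g., the dominant genome in the infection
    candidate = "ATGGCTTTACGA"  # a second parasite from the same patient

    mutations = [
        (pos, ref, alt)
        for pos, (ref, alt) in enumerate(zip(reference, candidate))
        if ref != alt
    ]
    for pos, ref, alt in mutations:
        print(f"position {pos}: {ref} -> {alt}")  # prints: position 5: A -> T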

"We would expect these brand-new mutations to be scattered randomly throughout the genome," Cheeseman says. "Instead, we find they are often targeting a gene family that controls transcription in malaria."

But that's not the only notable thing about the results. What really excites Cheeseman is that when the team compared single cell sequencing data for P. vivax and P. falciparum, the same transcription gene family contained the majority of new mutations for both species.

"We have two different species of malaria from two different parts of the world, Thailand and Malawi," he says. "When we see the same thing happening independently in different species, this is an example of convergent evolution."

In other words, similar processes might be shaping similar mutation patterns in both species, even though their last common ancestor was millions of years ago.

The team does not yet know what impact the mutations have on the parasite and its ability to persist and cause damage in human hosts. The mutations may prove critical for survival or for traits such as drug resistance, or they may turn out to show that those genes are unimportant.

Read more at Science Daily

Oct 12, 2021

Stellar 'fossils' in meteorites point to distant stars

Some pristine meteorites contain a record of the original building blocks of the solar system, including grains that formed in ancient stars that died before the sun formed. One of the biggest challenges in studying these presolar grains is to determine the type of star each grain came from.

Nan Liu, research assistant professor of physics in Arts & Sciences at Washington University in St. Louis, is first author of a new study in Astrophysical Journal Letters that analyzes a diverse set of presolar grains with the goal of identifying their true stellar origins.

Liu and her team used a state-of-the-art mass spectrometer called NanoSIMS to measure a suite of isotopes, including N and Mg-Al isotopes, in presolar silicon carbide (SiC) grains. By refining their analytical protocols and also utilizing a new-generation plasma ion source, the scientists were able to visualize their samples with better spatial resolution than previous studies could achieve.

"Presolar grains have been embedded in meteorites for 4.6 billion years and are sometimes coated with solar materials on the surface," Liu said. "Thanks to the improved spatial resolution, our team was able to see Al contamination attached on the surface of a grain and to obtain true stellar signatures by including signals only from the core of the grain during the data reduction."

The scientists sputtered the grains using an ion beam for extended periods of time to expose clean, interior grain surfaces for their isotopic analyses. The researchers found that the N isotope ratios of the same grain greatly increased after the grain was exposed to extended ion sputtering.

Isotope ratios can rarely be measured for stars, but C and N isotopes are two exceptions. The new C and N isotope data for the presolar grains reported in this study directly link the grains to different types of carbon stars based on these stars' observed isotopic ratios.

"The new isotopic data obtained in this study are exciting for stellar physicists and nuclear astrophysicists like me," said Maurizio Busso, a co-author of the study who is based at the University of Perugia, in Italy. "Indeed, the 'strange' N isotopic ratios of presolar SiC grains have been in the last two decades a remarkable source of concern. The new data explain the difference between what was originally present in the presolar stardust grains and what was attached later, thus solving a long-standing puzzle in the community."

The study also includes a significant exploration of radioactive isotope aluminum-26 (26Al), an important heat source during the evolution of young planetary bodies in the early solar system and also other extra-solar systems. The scientists inferred the initial presence of large amounts of 26Al in all measured grains, as predicted by current models. The study determined how much 26Al was produced by the "parent stars" of the grains they measured. Liu and her collaborators concluded that stellar model predictions for 26Al are too high by at least a factor of two, compared to the grain data.

The data-model offsets likely point to uncertainties in relevant nuclear reaction rates, Liu noted, and will motivate nuclear physicists to pursue better measurements of these reaction rates in the future.

The team's results link some of the presolar grains in this collection to poorly known carbon stars with peculiar chemical compositions.

The grains' isotopic data point to H-burning processes occurring in such carbon stars at higher-than-expected temperatures. This information will help astrophysicists to construct stellar models to better understand the evolution of these stellar objects.

Read more at Science Daily

Islands are cauldrons of evolution

Islands are hot spots of evolutionary adaptation that can also advantage species returning to the mainland, according to a study published the week of Oct. 11 in the Proceedings of the National Academy of Sciences.

Islands are well known locations of adaptive radiation, where species diversify to fill empty niches. In contrast, species that evolved on islands are thought to be evolutionarily disadvantaged when attempting to recolonize the mainland.

Jonathan B. Losos, the William H. Danforth Distinguished University Professor, professor of biology in Arts & Sciences and director of the Living Earth Collaborative at Washington University in St. Louis, is senior author of the new study.

Losos and his colleagues used a time-calibrated phylogeny and measurements of relevant ecological and morphological traits of neotropical anoles (Anolis spp.) to explore the collision of island and mainland adaptive radiations.

Anolis lizards originated in South America, colonized and radiated on various islands in the Caribbean and then returned and diversified on the Central American mainland. All of the Anolis groups exhibited significant adaptive radiations, but the results suggested that they followed different evolutionary paths.

The island Anolis species, and to a lesser extent the ancestral species, experienced higher initial rates of evolution as ecological niches were filled. In contrast, the Anolis species that recolonized the Central American mainland from the islands diversified ecologically without developing significant morphological differences between species.

When the Isthmus of Panama reconnected the two mainland groups, the recolonizing Central American Anolis species outcompeted the ancestral South American Anolis species, contrary to expectations.

According to Losos, rather than being evolutionary dead ends, islands are cauldrons of evolutionary innovation and diversification.

"The traditional thinking is that plant and animal groups that evolve on islands can't invade the mainland because the mainland has more species, and thus a more competitive biotic milieu due to higher rates of competition, predation, parasitism, etc.," Losos said. "So the idea is that species on islands aren't 'tough' enough to cut it on the mainland.

Read more at Science Daily

Greenland’s groundwater changes with thinning ice sheet

For more than a decade, a team of University of Montana researchers and students has studied the dynamics of the Greenland Ice Sheet as it responds to a warming climate. Department of Geosciences researchers Toby Meierbachtol and Joel Harper said water has always been central to their research.

"The water from melting of the ice can run off the surface to the ocean and contribute to sea level rise, it can refreeze in place and actually warm the ice, and it can even reach the bottom of the ice sheet and act as a sort of lubricant to make the ice slide quickly over its bed," Meierbachtol said. "The importance of water in controlling the response of Greenland to warming is hard to overstate."

But while much of their focus has been on the importance of water in controlling processes occurring on the ice sheet, their most recent research findings have flipped the order of their thinking.

As outlined in their recent article in Nature Geoscience, Meierbachtol, Harper and an international team of researchers discovered that changes to the ice sheet have an immediate impact on the groundwater underlying Greenland, an island larger than the state of Alaska.

"We have been focused on water's impacts on ice sheet change," said Harper. "But our most recent findings show that changes in the ice sheet have a real impact on Arctic hydrology -- specifically the massive groundwater system extending under the ice sheet."

This latest revelation occurred thanks to a marriage of drilling techniques, with international collaborators boring an angled hole 650 meters through bedrock underneath a Greenland glacier to measure groundwater conditions deep under the ice sheet. Meanwhile, UM and University of Wyoming researchers drilled 32 holes from atop the glacier, through nearly a kilometer of ice, to measure water conditions at the interface between ice and bedrock, which forms an important boundary controlling groundwater flow below.

The system that UM has perfected over the years involves drilling with very hot water under high pressure, typically for 12 or more hours at a time.

"We practice and rehearse to make the operation flow smoothly," Harper said, noting they always include one to two undergraduate students on an expedition. "Everyone on the team has an important and specific role to fill."

After drilling, the team installs sensors in the ice column and at the ice sheet bed to measure ice dynamics and water conditions as water flows under the ice to the margin. Time is always of the essence because the cold ice freezes the hole shut in as little as two hours.

The dual drilling approach facilitated the first-ever measurements of groundwater response to a changing ice sheet, and the eight-year data record yielded some unexpected results.

"By studying areas that were covered by ice 10,000 years ago during the last ice age, the field has known that the huge mass and vast amounts of water from melting ice can impact the underlying groundwater," Meierbachtol said, "but the paradigm has been that the groundwater response to ice sheet change is long: thousands of years. What we've shown here is that the groundwater response to Greenland's change is immediate."

This new understanding could have important downstream implications for how Greenland's thinning impacts the Arctic, Harper said. The thinning ice could reduce the rate of groundwater flow to the ocean, changing the water temperature and salinity balance that is important for ocean circulation patterns.

"In thinking about the complex feedbacks that occur from Greenland's ongoing change, we as a field have really neglected the groundwater component because we thought it was more or less dormant over the decade to century timescales that are important for us as a society," Harper said. "But now we recognize that the groundwater system actually changes quite rapidly, and there are some compelling reasons for why this could really matter for the broader Arctic."

Read more at Science Daily

Brain damage from long stays in space

Spending a long time in space appears to cause brain damage. This is shown by a study of five Russian cosmonauts who had stayed on the International Space Station (ISS). Researchers at the University of Gothenburg are among those now presenting the results.

The study is published in the scientific journal JAMA Neurology. Its co-authors at the University, scientists from the Institute of Neuroscience and Physiology at Sahlgrenska Academy, wrote it jointly with colleagues in Moscow and Munich.

The scientists followed five male Russian cosmonauts working on the permanently manned International Space Station (ISS), which orbits about 400 km above Earth's surface.

The adverse effects on the body of long periods in space have been known for some time. The negative changes include atrophic muscles, decreasing bone mass, deteriorating vision and altered bacterial flora in the gut.

Evidence of brain damage

Blood samples were taken from the cosmonauts 20 days before their departure to the ISS. On average, they then stayed in space for 169 days (approximately five and a half months). The participants' mean age was 49.

After their return to Earth, follow-up blood samples were taken on three occasions: one day, one week, and about three weeks after landing. Five biomarkers for brain damage were analyzed: neurofilament light (NFL), glial fibrillary acidic protein (GFAP), total tau (T-tau), and two amyloid beta proteins.

For three of the biomarkers -- NFL, GFAP and the amyloid beta protein Aβ40 -- the concentrations were significantly elevated after the space sojourn. The peak readings did not occur simultaneously after the men's return to Earth, but their biomarker trends nonetheless broadly tallied over time.

"This is the first time that concrete proof of brain-cell damage has been documented in blood tests following space flights. This must be explored further and prevented if space travel is to become more common in the future," says Henrik Zetterberg, professor of neuroscience and one of the study's two senior coauthors.

Several studies underway

"To get there, we must help one another to find out why the damage arises. Is it being weightless, changes in brain fluid, or stressors associated with launch and landing, or is it caused by something else? Here, loads of exciting experimental studies on humans can be done on Earth," he continues.

The notion that the changes concerned may have a bearing on brain function is substantiated by changes also seen in magnetic resonance imaging (MRI) of the brain after space travel. Further support is provided by clinical tests of the men's brain function that show deviations linked to their assignments in space. However, the present study was too small to investigate these associations in detail.

Zetterberg and his coauthors at the University, scientist Nicholas Ashton and Professor Kaj Blennow, are currently discussing follow-up studies with their other fellow researchers involved in the study, and also with national and international space research institutes.

Read more at Science Daily

When breezy, wear masks outdoors to prevent coronavirus exposure

As the highly infectious delta variant of the coronavirus continues to spread across the United States, guidelines from the Centers for Disease Control and Prevention recommend even the vaccinated wear masks indoors to prevent exposure and transmission.

However, it is less clear what people should do when outside.

In Physics of Fluids, by AIP Publishing, researchers from the Indian Institute of Technology Bombay found that when a person coughs outdoors, wind flowing in the same direction can propagate the virus faster and over longer distances than in calm conditions.

"The study is significant in that it points to the increased infection risk that coughing in the same direction as the wind could bring about," co-author Amit Agrawal said. "Based on the results, we recommend wearing masks outdoors, particularly in breezy conditions."

Other guidelines, such as coughing into an elbow or turning the face away while coughing, should be followed to reduce transmission when socializing outdoors.

Most studies model cough flow using puffs of air or a simple pulsating profile. But a real cough is more complicated, exhibiting turbulent flow with prominent vortical structures swirling like mini whirlpools.

To investigate these vortices, the researchers used a large eddy simulation, a numerical model in computational fluid dynamics that simulates turbulence. They modeled cough jets in breezy conditions and in calm conditions representing a typical indoor environment.

These simulations show that even a light breeze of about 5 mph extends effective social distancing by around 20%, from 3-6 feet to 3.6-7.2 feet, depending on cough strength. At 9-11 mph, the virus spreads over even greater distances and persists for longer.

The researchers found the vortices enable bigger droplets to persist in the air longer than has been typically assumed, increasing the time it takes to adequately dilute the viral load in fresh air. As the cough jet evolves and spreads, it interacts with the wind flowing in the same direction, and the bigger infected droplets become trapped in the jet's vortices instead of falling relatively quickly to the ground under gravity.
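
To see what the gravity-only baseline looks like, here is a minimal sketch (not from the paper) of the classic Stokes settling estimate for small droplets in still air; the release height and droplet sizes are illustrative assumptions. The point is that larger droplets settle in seconds under gravity alone, which is exactly the quick fallout the cough jet's vortices interrupt.

    # Stokes terminal velocity for small droplets in still air, and the
    # time to fall from mouth height. Ignores evaporation and assumes
    # laminar (Stokes) drag, which starts to break down near 100 microns.
    G = 9.81            # gravitational acceleration, m/s^2
    RHO_WATER = 1000.0  # droplet density, kg/m^3
    RHO_AIR = 1.2       # air density, kg/m^3
    MU_AIR = 1.8e-5     # dynamic viscosity of air, Pa*s
    HEIGHT = 1.5        # assumed release height, m

    def fall_time(diameter_m):
        """Seconds to fall HEIGHT at Stokes terminal velocity."""
        r = diameter_m / 2
        v = 2 * r**2 * (RHO_WATER - RHO_AIR) * G / (9 * MU_AIR)
        return HEIGHT / v

    for d_um in (10, 50, 100):
        print(f"{d_um} micron droplet: ~{fall_time(d_um * 1e-6):.0f} s to settle")
    # prints roughly 496 s, 20 s and 5 s respectively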

Read more at Science Daily

Oct 11, 2021

Italian sailors knew of America 150 years before Christopher Columbus, new analysis of ancient documents suggests

New analysis of ancient writings suggests that sailors from the Italian hometown of Christopher Columbus knew of America 150 years before its renowned 'discovery'.

Transcribing and detailing a circa-1345 document by a Milanese friar, Galvaneus Flamma, Medieval Latin literature expert Professor Paolo Chiesa has made an "astonishing" discovery of an "exceptional" passage referring to an area we know today as North America.

According to Chiesa, the ancient essay -- first discovered in 2013 -- suggests that sailors from Genoa were already aware of this land, recognizable as 'Markland'/ 'Marckalada' -- mentioned by some Icelandic sources and identified by scholars as part of the Atlantic coast of North America (usually assumed to be Labrador or Newfoundland).

Published in the peer-reviewed journal Terrae Incognitae, the discovery comes ahead of Columbus Day 2021, alternatively celebrated as Indigenous Peoples' Day across many states in the US. The findings add fuel to the continuing debate over what, exactly, Columbus expected to find when he set out across the ocean, and come after a period in which his statues have been beheaded, covered with red paint, lassoed around the head and pulled down, set on fire and thrown into a lake.

"We are in the presence of the first reference to the American continent, albeit in an embryonic form, in the Mediterranean area," states Professor Chiesa, from the Department of Literary Studies, Philology and Linguistics at the University of Milan.

Galvaneus was a Dominican friar who lived in Milan and was connected to a family which held the lordship of the city.

He wrote several literary works in Latin, mainly on historical subjects. His testimony is valuable for information on contemporary Milanese events, about which he had first-hand knowledge.

Cronica universalis, the work analyzed here by Chiesa, is thought to be one of his later works -- perhaps the last one -- and was left unfinished and unpolished. It aims to detail the history of the whole world, from 'Creation' to the author's own day.

In translating and analysing the document, Professor Chiesa demonstrates how Genoa would have been a "gateway" for news, and how Galvaneus appears to have picked up, informally, seafarers' rumours about lands to the extreme north-west, of potential commercial interest, as well as information about Greenland, which he details accurately (for the knowledge of the time).

"These rumours were too vague to find consistency in cartographic or scholarly representations," the professor states, as he explains why Marckalada wasn't classified as a new land at the time.

Regardless though, Chiesa states, Cronica universalis "brings unprecedented evidence to the speculation that news about the American continent, derived from Nordic sources, circulated in Italy one and half centuries before Columbus."

He adds: "What makes the passage (about Marckalada) exceptional is its geographical provenance: not the Nordic area, as in the case of the other mentions, but northern Italy.

"The Marckalada described by Galvaneus is 'rich in trees', not unlike the wooded Markland of the Grœnlendinga Saga, and animals live there.

"These details could be standard, as distinctive of any good land; but they are not trivial, because the common feature of northern regions is to be bleak and barren, as actually Greenland is in Galvaneus's account, or as Iceland is described by Adam of Bremen."

Overall, Professor Chiesa says, we should "trust" the Cronica universalis because, throughout the document, Galvaneus declares when he is relying on oral stories and backs his claims with elements drawn from accounts (legendary or real) belonging to earlier traditions about different lands, blended together and reassigned to a specific place.

"I do not see any reason to disbelieve him," states Professor Chiesa, who adds, "it has long been noticed that the fourteenth-century portolan (nautical) charts drawn in Genoa and in Catalonia offer a more advanced geographical representation of the north, which could be achieved through direct contacts with those regions.

"These notions about the north-west are likely to have come to Genoa through the shipping routes to the British Isles and to the continental coasts of the North Sea.

"We have no evidence that Italian or Catalan seafarers ever reached Iceland or Greenland at that time, but they were certainly able to acquire from northern European merchant goods of that origin to be transported to the Mediterranean area.

"The marinarii mentioned by Galvaneus can fit into this dynamic: the Genoese might have brought back to their city scattered news about these lands, some real and some fanciful, that they heard in the northern harbors from Scottish, British, Danish, Norwegian sailors with whom they were trading."

Read more at Science Daily

What makes us human? The answer may be found in overlooked DNA

Our DNA is very similar to that of the chimpanzee, which in evolutionary terms is our closest living relative. Stem cell researchers at Lund University in Sweden have now found a previously overlooked part of our DNA, so-called non-coding DNA, that appears to contribute to a difference which, despite all our similarities, may explain why our brains work differently. The study is published in the journal Cell Stem Cell.

The chimpanzee is our closest living relative in evolutionary terms and research suggests our kinship derives from a common ancestor. About five to six million years ago, our evolutionary paths separated, leading to the chimpanzee of today and to Homo sapiens, humankind in the 21st century.

In a new study, stem cell researchers at Lund examined what it is in our DNA that makes human and chimpanzee brains different -- and they have found answers.

"Instead of studying living humans and chimpanzees, we used stem cells grown in a lab. The stem cells were reprogrammed from skin cells by our partners in Germany, the USA and Japan. Then we examined the stem cells that we had developed into brain cells," explains Johan Jakobsson, professor of neuroscience at Lund University, who led the study.

Using the stem cells, the researchers specifically grew brain cells from humans and chimpanzees and compared the two cell types. The researchers then found that humans and chimpanzees use a part of their DNA in different ways, which appears to play a considerable role in the development of our brains.

"The part of our DNA identified as different was unexpected. It was a so-called structural variant of DNA that were previously called "junk DNA," a long repetitive DNA string which has long been deemed to have no function. Previously, researchers have looked for answers in the part of the DNA where the protein-producing genes are -- which only makes up about two per cent of our entire DNA -- and examined the proteins themselves to find examples of differences."

The new findings thus indicate that the differences appear to lie outside the protein-coding genes in what has been labelled as "junk DNA," which was thought to have no function and which constitutes the majority of our DNA.

"This suggests that the basis for the human brain's evolution are genetic mechanisms that are probably a lot more complex than previously thought, as it was supposed that the answer was in those two per cent of the genetic DNA. Our results indicate that what has been significant for the brain's development is instead perhaps hidden in the overlooked 98 per cent, which appears to be important. This is a surprising finding."

The stem cell technique used by the researchers in Lund is revolutionary and has enabled this type of research. The technique was recognised by the 2012 Nobel Prize in Physiology or Medicine. It was the Japanese researcher Shinya Yamanaka who discovered that specialised cells can be reprogrammed and developed into all types of body tissue. And in the Lund researchers' case, into brain cells. Without this technique, it would not have been possible to study the differences between humans and chimpanzees using ethically defensible methods.

Why did the researchers want to investigate the difference between humans and chimpanzees?

"I believe that the brain is the key to understanding what it is that makes humans human. How did it come about that humans can use their brain in such a way that they can build societies, educate their children and develop advanced technology? It is fascinating!"

Johan Jakobsson believes that in the future the new findings may also contribute to genetically-based answers to questions about psychiatric disorders, such as schizophrenia, a disorder that appears to be unique to humans.

Read more at Science Daily

Sleep loss does not impact ability to assess emotional information

It's no secret that going without sleep can affect people's mood, but a new study shows it does not interfere with their ability to evaluate emotional situations.

It is often assumed that feeling more negative will color people's experience of emotional images and events in the environment around them. However, Washington State University researchers found that while going 24 hours without sleep impacted study participants' mood, it did not change their performance on tests evaluating their ability to process emotional words and images.

"People do become less happy through sleep deprivation, but it's not affecting how they are processing emotional stimuli in their environment," said Anthony Stenson, a WSU psychology doctoral student and lead author of the study in Plos One.

The findings have implications for healthcare providers, law enforcement and people in other long-hour professions who need to be able to control their own emotions during stressful and emotionally trying situations. Sleep loss is not likely to make them numb to emotional situations, the researchers found, but it is likely to make them less able to control their own emotional responses.

For the study, about 60 adult participants spent four consecutive days in the Sleep and Performance Research Center at the WSU Elson S. Floyd College of Medicine. All participants were allowed to sleep normally the first night and then given a set of baseline tests to judge their mood as well as their emotional regulation and processing ability. Then, the researchers divided the participants into two groups: one group of 40 people spent the second night awake, while a control group of 20 were allowed a normal sleep period. The tests were then re-administered at different intervals.

The emotional regulation and processing tests both involved viewing a series of images with positive and negative emotional connotations. In the emotional regulation tests, participants were given a prompt to help them recontextualize negative images before seeing them and asked to control their feelings. The sleep-deprived group had greater difficulty reducing the emotion they felt when instructed to do so.

The processing tests involved responding to words and images with emotional content, for example rating the emotions conveyed by a smiling family, a growling dog or a crying child. All participants performed similarly on these tests whether they were sleep deprived or not.

The distinction between processing the emotional content of the world around you and being able to regulate your own emotional responses is an important one, especially for some professions, said co-author Paul Whitney, a WSU professor of psychology.

"I don't think we want our first responders being numb to the emotional nature of the situations they encounter, and it looks like they are not," he said. "On the other hand, reacting normally to emotional situations, but not being able to control your own emotions, could be one reason sleep loss sometimes produces catastrophic errors in stressful situations."

A lot of previous research has looked at how sleep deprivation impacts so-called "cold" cognitive tasks -- supposedly emotionally neutral tasks like recalling facts. These studies have also found that regulation, which is considered a "top-down" cognitive process, is a major problem with cold cognitive tasks. For instance, mental flexibility -- the ability an emergency room doctor might need to quickly change tactics if a patient isn't responding to a treatment -- is compromised by sleep deprivation.

The current study shows that top-down regulation is a problem as well with "hot" or emotional cognitive processes. Future research is needed to understand whether the effects of sleep loss on the two top-down processes are linked.

Read more at Science Daily