Jun 13, 2020

Our visual world of color is largely incorrect, study finds

Color awareness has long been a puzzle for researchers in neuroscience and psychology, who debate over how much color observers really perceive. A study from Dartmouth in collaboration with Amherst College finds that people are aware of surprisingly limited color in their peripheral vision; much of our sense of a colorful visual world is likely constructed by our brain. The findings are published in the Proceedings of the National Academy of Sciences.

To test people's visual awareness of color during naturalistic viewing, the researchers used head-mounted virtual-reality displays fitted with eye trackers to immerse participants in 360-degree real-world environments. The virtual environments included tours of historic sites, a street dance performance, a symphony rehearsal and more, and observers could explore their surroundings simply by turning their heads. With the eye tracker, researchers knew exactly where an observer was looking at all times and could systematically alter the visual environment so that only the area where the person was looking was in color; the rest of the scene, in the periphery, was desaturated to black and white. After a series of trials, observers answered questions gauging whether they had noticed the lack of color in their periphery. A supplemental video from the study illustrates how the peripheral color was removed from various scenes.

In your visual field, your periphery extends approximately 210 degrees, roughly the span of your outstretched arms. The study's results showed that most people's color awareness is limited to a small area around the center of their visual field. When the researchers removed most color from the periphery, most people did not notice. In the most extreme case, almost a third of observers did not notice when less than five percent of the entire visual field was presented in color (a radius of 10 degrees of visual angle).
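As a rough sanity check on that five-percent figure, one can compare the area of a 10-degree-radius disk to the whole visual field in a flat-angle approximation. This is a minimal sketch: the 130-degree vertical extent is an assumed typical value, not a number from the study.

```python
import math

color_radius_deg = 10.0    # radius of the colored central region
field_width_deg = 210.0    # horizontal extent cited in the article
field_height_deg = 130.0   # assumed vertical extent (illustrative)

# Treat the visual field as an ellipse and the colored region as a disk.
color_area = math.pi * color_radius_deg ** 2
field_area = math.pi * (field_width_deg / 2) * (field_height_deg / 2)
fraction = color_area / field_area

print(f"colored fraction of visual field: {fraction:.1%}")  # ~1.5%, under 5%
```

Even with a different assumed vertical extent, the colored disk stays well below five percent of the field, consistent with the study's most extreme condition.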

Participants were astonished to find out later that they hadn't noticed the desaturated periphery, after they were shown the changes that were made to a virtual scene that they had just explored.

A second study asked participants to identify when color was desaturated in the periphery. The results were similar: most people failed to notice when the peripheral color had been removed. Nearly 180 people participated across the two studies.

"We were amazed by how oblivious participants were when color was removed from up to 95 percent of their visual world," said senior author Caroline Robertson, an assistant professor of psychological and brain sciences at Dartmouth. "Our results show that our intuitive sense of a rich, colorful visual world is largely incorrect. Our brain is likely filling in much of our perceptual experience."

Read more at Science Daily

Scientists uncover immune cells that may lower airway allergy and asthma risk

The world is full of house dust mites. Do some cleaning, and you'll probably stir some up. While everyone has immune cells capable of reacting to common allergens like house dust mites, most of us have no allergic symptoms.

Still, many people do react with the typical allergic symptoms: sneezing, a runny nose, and itchy, swollen nasal passages. Others have a much more severe reaction: a life-threatening asthma attack.

To treat the root cause of allergies and asthma, researchers need to know exactly what sets these patients apart from healthy individuals.

In a new Science Immunology study, published on June 12, 2020, scientists at La Jolla Institute for Immunology (LJI) offer a clue to why non-allergic people don't have a strong reaction to house dust mites. They've uncovered a previously unknown subset of T cells that may keep allergic immune reactions and asthma from ever developing in response to house dust mites -- and other possible allergens.

"We discovered new immune cell subsets and new therapeutic opportunities," says Grégory Seumois, Ph.D., instructor and director of LJI's Sequencing Core and co-leader of the new study. "This new population of cells could be one, out of many unknown mechanisms, that explains why healthy people don't develop inflammation when they breathe in allergens."

"The study highlights the power of unbiased single-cell genomics approaches to uncover novel biology," says LJI Professor Pandurangan Vijayanand, M.D. Ph.D., senior author of the new study.

The study builds on the Vijayanand lab's expertise in linking gene expression to disease development. The team also took advantage of the Immune Epitope Database, an LJI-led resource that houses information on how the immune system interacts with allergens like house dust mites.

Why house dust mites? These microscopic critters are hard to avoid, which means nearly everyone has been exposed. Even in people without a house dust mite (HDM) allergy, the immune system is likely to react in some way as it learns to recognize HDM molecules. This makes HDM a useful model for studying what causes allergies and asthma attacks.

The LJI team used a technique from the "genomic revolution" arsenal of tools, called single-cell RNA-seq (or single-cell transcriptomics), to see exactly which genes and molecules specific T cells produce in response to HDM allergens. They tested cells from four groups of people: people with asthma and HDM allergy, people with asthma but no HDM allergy, people with only HDM allergy, and healthy subjects.

Their analysis suggests that a subset of helper T cells, interleukin-9 (IL-9)-expressing, HDM-reactive Th2 cells, is more prevalent in the blood of people with HDM-allergic asthma than in those who are only allergic to HDM. Further analysis suggested that these IL-9 Th2 cells are enriched for genes that increase their cytotoxic potential. In other words, these specific T cells could kill other cells and drive inflammation.

In contrast, another subset of T cells stood out in the non-allergic subjects. These T cells express an "interferon response signature" and were enriched for a gene that encodes a protein called TRAIL. The work by Seumois and his colleagues suggests that TRAIL could be important because it could dampen the activation of helper T cells.

This finding may mean that people with this specific cell population have less T-cell-driven inflammation in response to HDM allergens. In turn, this could provide a clue to why some people develop allergies and asthma while others do not.

"Now if functional studies confirm this dampening effect, we're curious if there is a way to boost the activation of these T cells or induce their proliferation in asthmatic or allergic populations," says Seumois. "Can we act on those cells very early on, before asthma has developed?"

For example, genomics studies like this one may someday help identify children at risk of developing asthma and allergies. Early detection could open the door to preemptively acting on immune cells before development of allergy and asthma.

Read more at Science Daily

Jun 12, 2020

What controls the height of mountains? Surprisingly, it is not erosion

Which forces and mechanisms determine the height of mountains? A group of researchers from Münster and Potsdam has now found a surprising answer: It is not erosion and weathering of rocks that determine the upper limit of mountain massifs, but rather an equilibrium of forces in the Earth's crust. This is a fundamentally new and important finding for the earth sciences. The researchers report on it in the scientific journal Nature.

The highest mountain ranges on Earth -- such as the Himalayas or the Andes -- arise along convergent plate boundaries. At such plate boundaries two tectonic plates move toward each other, and one of the plates is forced beneath the other into the Earth's mantle. During this process of subduction, strong earthquakes repeatedly occur on the plate interface, and over millions of years mountain ranges are built at the edges of the continents.

Whether the height of mountain ranges is mainly determined by tectonic processes in the Earth's interior or by erosional processes sculpturing the Earth's surface has long been debated in geosciences.

A new study led by Armin Dielforder of the GFZ German Research Centre for Geosciences now shows that erosion by rivers and glaciers has no significant influence on the height of mountain ranges. Together with scientists from the GFZ and the University of Münster (Germany), he resolved the longstanding debate by analysing the strength of various plate boundaries and calculating the forces acting along the plate interfaces.

The researchers arrived at this surprising result by calculating the forces along different plate boundaries on the Earth. They used data that provide information about the strength of plate boundaries. These data are derived, for example, from heat flow measurements in the subsurface. The heat flow at convergent plate boundaries is in turn influenced by the frictional energy at the interfaces of the continental plates.

One can imagine the formation of mountains using a tablecloth. If you place both hands under the cloth on the tabletop and push, the cloth folds, and at the same time it slides a little over the backs of your hands. The emerging folds would correspond, for instance, to the Andes, and the sliding over the backs of the hands to the friction in the subsurface. Depending on the characteristics of the rock, stresses also build up deep underground and are released in severe earthquakes, especially in subduction zones.

The researchers collected worldwide data from the literature on friction in the subsurface of mountain ranges of different heights (Himalayas, Andes, Sumatra, Japan) and calculated the resulting stress and thus the forces that lead to the uplift of the respective mountains. In this way they showed that in active mountains the force on the plate boundary and the forces resulting from the weight and height of the mountains are in balance.

Read more at Science Daily

Half the earth relatively intact from global human influence

Roughly half of Earth's ice-free land remains without significant human influence, according to a study from a team of international researchers led by the National Geographic Society and the University of California, Davis.

The study, published in the journal Global Change Biology, compared four recent global maps of the conversion of natural lands to anthropogenic land uses to reach its conclusions. The more impacted half of Earth's lands includes cities, croplands, and places intensively ranched or mined.

"The encouraging takeaway from this study is that if we act quickly and decisively, there is a slim window in which we can still conserve roughly half of Earth's land in a relatively intact state," said lead author Jason Riggio, a postdoctoral scholar at the UC Davis Museum of Wildlife and Fish Biology.

The study, published June 5 on World Environment Day, aims to inform the upcoming global Convention on Biological Diversity -- the Conference of Parties 15. The historic meeting was scheduled to occur in China this fall but was postponed due to the coronavirus pandemic. Among the meeting's goals is to establish specific, and higher, targets for land and water protection.

Approximately 15 percent of the Earth's land surface and 10 percent of the oceans are currently protected in some form. However, led by organizations including Nature Needs Half and the Half-Earth Project, there have been bold global calls for governments to commit to protecting 30 percent of the land and water by 2030 and 50 percent by 2050.

Intact natural lands across the globe can help purify air and water, recycle nutrients, enhance soil fertility and retention, pollinate plants, and break down waste products. The value of maintaining these vital ecosystem services to the human economy has been placed in the trillions of U.S. dollars annually.

CONSERVATION AND COVID-19

The coronavirus pandemic now shaking the globe illustrates the importance of maintaining natural lands to separate animal and human activity. The leading scientific evidence points to the likelihood that SARS-CoV-2, the virus that causes the disease COVID-19, is a zoonotic virus that jumped from animals to humans. Ebola, bird flu and SARS are other diseases known to have spilled over into the human population from nonhuman animals.

"Human risk to diseases like COVID-19 could be reduced by halting the trade and sale of wildlife, and minimizing human intrusion into wild areas," said senior author Andrew Jacobson, professor of GIS and conservation at Catawba College in North Carolina.

Jacobson said that regional and national land-use plans that identify and appropriately zone the locations best suited to urban growth and agriculture could help control the spread of human development. Establishing protections for other landscapes, particularly those currently experiencing low human impacts, would also be beneficial.

FROM THE TUNDRA TO THE DESERT

Among the largest low-impact areas are broad stretches of boreal forests and tundra across northern Asia and North America and vast deserts like the Sahara in Africa and the Australian Outback. These areas tend to be colder and/or drier and less fit for agriculture.

"Though human land uses are increasingly threatening Earth's remaining natural habitats, especially in warmer and more hospitable areas, nearly half of Earth still remains in areas without large-scale intensive use," said co-author Erle Ellis, professor of geography at the University of Maryland-Baltimore County.

Areas having low human influence do not necessarily exclude people, livestock or sustainable management of resources. A balanced conservation response that addresses land sovereignty and weighs agriculture, settlement or other resource needs with the protection of ecosystem services and biodiversity is essential, the authors note.

Read more at Science Daily

Ancient crocodiles walked on two legs like dinosaurs

An international research team has been stunned to discover that some species of ancient crocodiles walked on their two hind legs like dinosaurs and measured over three metres in length.

University of Queensland palaeontologist Dr Anthony Romilio said the researchers first thought the similar-shaped fossilised footprints were made by pterosaurs, another group of ancient animals.

"At one site, the footprints were initially thought to be made by a giant bipedal pterosaur walking on the mudflat; we now understand that these were bipedal crocodile prints," Dr Romilio said.

"The footprints measure around 24 centimetres, suggesting the track-makers had legs about the same height as human adult legs.

"These were long animals that we estimate were over three metres in length.

"And while footprints were everywhere on the site, there were no handprints."

The research team, led by Professor Kyung Soo Kim from Chinju National University of Education, soon found clues as to why there were no handprints.

"Typical crocodiles walk in a squat stance and create trackways that are wide," Professor Kim said.

"Oddly, our trackways are very narrow looking -- more like a crocodile balancing on a tight-rope.

"When combined with the lack of any tail-drag marks, it became clear that these creatures were moving bipedally.

"They were moving in the same way as many dinosaurs, but the footprints were not made by dinosaurs.

"Dinosaurs and their bird descendants walk on their toes.

"Crocodiles walk on the flat of their feet leaving clear heel impressions, like humans do."

The footprints date to between 110 and 120 million years ago and were discovered after analysing animal track sites in what is now South Korea.

Researchers initially questioned the absence of hand impressions from the trackways, given that today's typical crocodiles are 'four-legged' or quadrupedal.

"Fossil crocodile tracks are quite rare in Asia, so finding an abundance of nearly one hundred footprints was extraordinary," Dr Romilio said.

"As an animal walks, the hind feet have the potential of stepping into the impression made by the hand and 'over-printing' it, but we find no evidence of this at these Korean sites.

Read more at Science Daily

Volunteerism: Doing good does you good

A new study in the American Journal of Preventive Medicine, published by Elsevier, takes a closer look at the benefits of volunteering to the health and well-being of volunteers, both validating and refuting findings from previous research. The results verify that adults over 50 who volunteer for at least 100 hours a year (about two hours per week) have a substantially reduced risk of mortality and of developing physical limitations, higher levels of subsequent physical activity, and an improved sense of well-being compared with individuals who do not volunteer.

"Humans are social creatures by nature. Perhaps this is why our minds and bodies are rewarded when we give to others. Our results show that volunteerism among older adults doesn't just strengthen communities, but enriches our own lives by strengthening our bonds to others, helping us feel a sense of purpose and well-being, and protecting us from feelings of loneliness, depression, and hopelessness. Regular altruistic activity reduces our risk of death even though our study didn't show any direct impact on a wide array of chronic conditions," explained lead investigator Eric S. Kim, PhD, Department of Social and Behavioral Sciences and Lee Kum Sheung Center for Health and Happiness, Harvard T.H. Chan School of Public Health, Boston; and Human Flourishing Program, Institute for Quantitative Social Science, Harvard University, Cambridge, MA, USA.

A growing body of research has linked volunteering to many health and well-being benefits, but there is still insufficient evidence to demonstrate the consistent and specific positive outcomes that are needed to develop public health interventions based on volunteerism. This large-scale study helps address this gap by evaluating 34 physical-health and psychological/social well-being outcomes. This permitted direct comparisons of the potential effect size that volunteering might have on various outcomes, and also revealed which outcomes volunteering does not appear to influence.

The study did not confirm links between volunteering and improvements to chronic conditions such as diabetes, hypertension, stroke, cancer, heart disease, lung disease, arthritis, obesity, cognitive impairment, or chronic pain.

The analysis was based on data, face-to-face interviews, and survey responses from nearly 13,000 participants randomly selected from the Health and Retirement Study (HRS), a nationally representative sample of older adults in the United States. The participants were tracked over four years in two cohorts from 2010-2016.

The growing older adult population possesses a vast array of skills and experiences that can be leveraged for the greater good of society via volunteering. While proposing further research to better understand this phenomenon, the study recommends the adoption of policies that encourage more volunteerism. Such interventions could simultaneously enhance society and foster a trajectory of healthy aging in the rapidly growing population of older adults. Further study is also needed to learn the underlying reasons for the divergence of some results from previous research.

Read more at Science Daily

Jun 11, 2020

New distance measurements bolster challenge to basic model of universe

A new set of precision distance measurements made with an international collection of radio telescopes has greatly increased the likelihood that theorists need to revise the "standard model" that describes the fundamental nature of the Universe.

The new distance measurements allowed astronomers to refine their calculation of the Hubble Constant, the expansion rate of the Universe, a value important for testing the theoretical model describing the composition and evolution of the Universe. The problem is that the new measurements exacerbate a discrepancy between previously measured values of the Hubble Constant and the value predicted by the model when applied to measurements of the cosmic microwave background made by the Planck satellite.

"We find that galaxies are nearer than predicted by the standard model of cosmology, corroborating a problem identified in other types of distance measurements. There has been debate over whether this problem lies in the model itself or in the measurements used to test it. Our work uses a distance measurement technique completely independent of all others, and we reinforce the disparity between measured and predicted values. It is likely that the basic cosmological model involved in the predictions is the problem," said James Braatz, of the National Radio Astronomy Observatory (NRAO).

Braatz leads the Megamaser Cosmology Project, an international effort to measure the Hubble Constant by finding galaxies with specific properties that lend themselves to yielding precise geometric distances. The project has used the National Science Foundation's Very Long Baseline Array (VLBA), Karl G. Jansky Very Large Array (VLA), and Robert C. Byrd Green Bank Telescope (GBT), along with the Effelsberg telescope in Germany. The team reported their latest results in the Astrophysical Journal Letters.

Edwin Hubble, after whom the orbiting Hubble Space Telescope is named, first calculated the expansion rate of the universe (the Hubble Constant) in 1929 by measuring the distances to galaxies and their recession speeds. The more distant a galaxy is, the greater its recession speed from Earth. Today, the Hubble Constant remains a fundamental property of observational cosmology and a focus of many modern studies.

Measuring recession speeds of galaxies is relatively straightforward. Determining cosmic distances, however, has been a difficult task for astronomers. For objects in our own Milky Way Galaxy, astronomers can get distances by measuring the apparent shift in the object's position when viewed from opposite sides of Earth's orbit around the Sun, an effect called parallax. The first such measurement of a star's parallax distance came in 1838.
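The parallax method described above reduces to a one-line formula: by definition of the parsec, an object with a parallax of p arcseconds lies 1/p parsecs away. A minimal sketch (the 61 Cygni figure below is a commonly quoted modern value, not a number from the article):

```python
# Parallax distance: an object whose apparent position shifts by
# p arcseconds (as Earth moves across its orbit) lies at 1/p parsecs.
def parallax_distance_pc(parallax_arcsec: float) -> float:
    return 1.0 / parallax_arcsec

# 61 Cygni, the star whose parallax Bessel measured in 1838, has a
# modern parallax of roughly 0.286 arcseconds.
print(parallax_distance_pc(0.286))  # ~3.5 parsecs
```

The inverse relation also shows why the method fails beyond our Galaxy: at megaparsec distances the parallax shrinks to nanoarcseconds, far below what any instrument can measure.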

Beyond our own Galaxy, parallaxes are too small to measure, so astronomers have relied on objects called "standard candles," so named because their intrinsic brightness is presumed to be known. The distance to an object of known brightness can be calculated based on how dim the object appears from Earth. These standard candles include a class of stars called Cepheid variables and a specific type of stellar explosion called a Type Ia supernova.
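The inverse-square reasoning behind standard candles can be written out directly. This is a schematic sketch; the luminosity and flux arguments are placeholders in consistent units, not measured values.

```python
import math

# Inverse-square law: measured flux F = L / (4*pi*d^2), so a known
# intrinsic luminosity L and a measured flux F give the distance d.
def candle_distance(luminosity: float, flux: float) -> float:
    return math.sqrt(luminosity / (4 * math.pi * flux))

# Sanity check: a source of luminosity 4*pi observed at flux 1
# sits at distance 1 in the same unit system.
print(candle_distance(4 * math.pi, 1.0))  # 1.0
```

The practical difficulty, of course, is the premise: the method is only as good as the assumption that the candle's intrinsic brightness is truly known.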

Another method of estimating the expansion rate involves observing distant quasars whose light is bent by the gravitational effect of a foreground galaxy into multiple images. When the quasar varies in brightness, the change appears in the different images at different times. Measuring this time difference, along with calculations of the geometry of the light-bending, yields an estimate of the expansion rate.

Determinations of the Hubble Constant based on the standard candles and the gravitationally-lensed quasars have produced figures of 73-74 kilometers per second (the speed) per megaparsec (distance in units favored by astronomers).

However, predictions of the Hubble Constant from the standard cosmological model when applied to measurements of the cosmic microwave background (CMB) -- the leftover radiation from the Big Bang -- produce a value of 67.4, a significant and troubling difference. This difference, which astronomers say is beyond the experimental errors in the observations, has serious implications for the standard model.

The model is called Lambda Cold Dark Matter, or Lambda CDM, where "Lambda" refers to Einstein's cosmological constant and is a representation of dark energy. The model divides the composition of the Universe mainly between ordinary matter, dark matter, and dark energy, and describes how the Universe has evolved since the Big Bang.

The Megamaser Cosmology Project focuses on galaxies with disks of water-bearing molecular gas orbiting supermassive black holes at the galaxies' centers. If the orbiting disk is seen nearly edge-on from Earth, bright spots of radio emission, called masers -- radio analogs to visible-light lasers -- can be used to determine both the physical size of the disk and its angular extent, and therefore, through geometry, its distance. The project's team uses the worldwide collection of radio telescopes to make the precision measurements required for this technique.
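The geometry in that paragraph boils down to the small-angle relation: physical size equals distance times angular size, so dividing the disk's physical radius (inferred from the masers' orbital dynamics) by its angular radius (measured interferometrically) yields the distance. The numbers below are invented for illustration and are not the paper's data.

```python
import math

def geometric_distance(physical_radius: float, angular_radius_rad: float) -> float:
    # Small-angle approximation: physical size = distance * angle (radians).
    return physical_radius / angular_radius_rad

LY_PER_MPC = 3.26e6  # light-years per megaparsec (approximate)
MAS_TO_RAD = math.radians(1.0 / 3600.0) / 1000.0  # milliarcseconds -> radians

# A hypothetical maser disk of 0.5 light-year radius subtending 0.63 mas:
d_ly = geometric_distance(0.5, 0.63 * MAS_TO_RAD)
print(d_ly / LY_PER_MPC)  # ~50 megaparsecs
```

The example makes the observational challenge concrete: resolving sub-milliarcsecond structure is exactly what the worldwide VLBI network of radio telescopes provides.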

In their latest work, the team refined their distance measurements to four galaxies, at distances ranging from 168 million light-years to 431 million light-years. Combined with previous distance measurements of two other galaxies, their calculations produced a value for the Hubble Constant of 73.9 kilometers per second per megaparsec.
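Once a distance is in hand, the Hubble law v = H0 x d gives the constant directly. A minimal sketch; the galaxy's speed and distance below are illustrative numbers chosen to reproduce the combined value of 73.9, not figures from the paper.

```python
# Hubble law v = H0 * d, rearranged to estimate H0 from one galaxy's
# recession speed (km/s) and distance (megaparsecs).
def hubble_constant(recession_speed_km_s: float, distance_mpc: float) -> float:
    return recession_speed_km_s / distance_mpc

# Illustrative: a galaxy receding at 10,337 km/s at 139.86 Mpc.
print(hubble_constant(10337.0, 139.86))  # ~73.9 km/s per Mpc
```

In practice the team combines several such galaxies, since each maser distance carries its own uncertainty.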

"Testing the standard model of cosmology is a really challenging problem that requires the best-ever measurements of the Hubble Constant. The discrepancy between the predicted and measured values of the Hubble Constant points to one of the most fundamental problems in all of physics, so we would like to have multiple, independent measurements that corroborate the problem and test the model. Our method is geometric, and completely independent of all others, and it reinforces the discrepancy," said Dom Pesce, a researcher at the Center for Astrophysics | Harvard and Smithsonian, and lead author on the latest paper.

"The maser method of measuring the expansion rate of the universe is elegant, and, unlike the others, based on geometry. By measuring extremely precise positions and dynamics of maser spots in the accretion disk surrounding a distant black hole, we can determine the distance to the host galaxies and then the expansion rate. Our result from this unique technique strengthens the case for a key problem in observational cosmology," said Mark Reid of the Center for Astrophysics | Harvard and Smithsonian, and a member of the Megamaser Cosmology Project team.

"Our measurement of the Hubble Constant is very close to other recent measurements, and statistically very different from the predictions based on the CMB and the standard cosmological model. All indications are that the standard model needs revision," said Braatz.

Astronomers have various ways to adjust the model to resolve the discrepancy. Some of these include changing presumptions about the nature of dark energy, moving away from Einstein's cosmological constant. Others look at fundamental changes in particle physics, such as changing the numbers or types of neutrinos or the possibilities of interactions among them. There are other possibilities, even more exotic, and at the moment scientists have no clear evidence for discriminating among them.

"This is a classic case of the interplay between observation and theory. The Lambda CDM model has worked quite well for years, but now observations clearly are pointing to a problem that needs to be solved, and it appears the problem lies with the model," Pesce said.

Read more at Science Daily

After a century of searching, scientists find new liquid phase

Researchers at the University of Colorado Boulder's Soft Materials Research Center (SMRC) have discovered an elusive phase of matter, first proposed more than 100 years ago and sought after ever since.

The team describes the discovery of what scientists call a "ferroelectric nematic" phase of liquid crystal in a study published today in the Proceedings of the National Academy of Sciences. The discovery opens a door to a new universe of materials, said co-author Matt Glaser, a professor in the Department of Physics.

Nematic liquid crystals have been a hot topic in materials research since the 1970s. These materials exhibit a curious mix of fluid- and solid-like behaviors, which allow them to control light. Engineers have used them extensively to make the liquid crystal displays (LCDs) in many laptops, TVs and cellphones.

Think of nematic liquid crystals like dropping a handful of pins on a table. The pins in this case are rod-shaped molecules that are "polar" -- with heads (the blunt ends) that carry a positive charge and tails (the pointy ends) that are negatively charged. In a traditional nematic liquid crystal, half of the pins point left and the other half point right, with the direction chosen at random.

A ferroelectric nematic liquid crystal phase, however, is much more disciplined. In such a liquid crystal, patches or "domains" form in the sample in which the molecules all point in the same direction, either right or left. In physics parlance, these materials have polar ordering.

Noel Clark, a professor of physics and director of the SMRC, said that his team's discovery of one such liquid crystal could open up a wealth of technological innovations -- from new types of display screens to reimagined computer memory.

"There are 40,000 research papers on nematics, and in almost any one of them you see interesting new possibilities if the nematic had been ferroelectric," Clark said.

Under the microscope

The discovery is years in the making.

Nobel Laureates Peter Debye and Max Born first suggested in the 1910s that, if you designed a liquid crystal correctly, its molecules could spontaneously fall into a polar ordered state. Not long after that, researchers began to discover solid crystals that did something similar: Their molecules pointed in uniform directions. They could also be reversed, flipping from right to left or vice versa under an applied electric field. These solid crystals were called "ferroelectrics" because of their similarities to magnets. (Ferrum is Latin for "iron").

In the decades since, however, scientists struggled to find a liquid crystal phase that behaved in the same way. That is, until Clark and his colleagues began examining RM734, an organic molecule created by a group of British scientists several years ago.

That same British group, plus a second team of Slovenian scientists, reported that RM734 exhibited a conventional nematic liquid crystal phase at higher temperatures. At lower temperatures, another unusual phase appeared.

When Clark's team tried to observe that strange phase under the microscope they noticed something new. Under a weak electric field, a palette of striking colors developed toward the edges of the cell containing the liquid crystal.

"It was like connecting a light bulb to voltage to test it but finding the socket and hookup wires glowing much more brightly instead," Clark said.

Stunning results

So, what was happening?

The researchers ran more tests and discovered that this phase of RM734 was 100 to 1,000 times more responsive to electric fields than the usual nematic liquid crystals. This suggested that the molecules that make up the liquid crystal demonstrated strong polar order.

"When the molecules are all pointing to the left, and they all see a field that says, 'go right,' the response is dramatic," Clark said.

The team also discovered that distinct domains seemed to form spontaneously in the liquid crystal when it cooled from a higher temperature. There were, in other words, patches within their sample in which the molecules seemed to be aligned.

"That confirmed that this phase was, indeed, a ferroelectric nematic fluid," Clark said.

That alignment was also more uniform than the team was expecting.

"Entropy reigns in a fluid," said Joe MacLennan, a study coauthor and a professor of physics at CU Boulder. "Everything is wiggling around, so we expected a lot of disorder."

When the researchers examined how well aligned the molecules were inside a single domain, "we were stunned by the result," MacLennan said. The molecules were nearly all pointing in the same direction.

The team's next goal is to discover how RM734 achieves this rare feat. Glaser and SMRC researcher Dmitry Bedrov of the University of Utah are currently using computer simulations to tackle this question.

"This work suggests that there are other ferroelectric fluids hiding in plain sight," Clark said. "It is exciting that right now techniques like artificial intelligence are emerging that will enable an efficient search for them."

Read more at Science Daily

Tropical disease in medieval Europe revises the history of a pathogen related to syphilis

Mass burials are common remnants of the many plague outbreaks that ravaged Medieval Europe. A number of these graveyards are well documented in historical sources, but the locations of most, and the victims they contain, have been lost to the pages of time. In Vilnius, Lithuania, one such cemetery was found in a typical way: accidental discovery during a routine city construction project.

A new study published in the journal Scientific Reports details the findings of genomic analyses on these medieval skeletons, with important implications for the history of syphilis in Europe.

Just another plague pit?

"Historical information on this Vilnius graveyard is unavailable, but the burial context, along with its location outside of the medieval city limits, pointed to plague, or some other major infectious disease outbreak," comments Rimantas Jankauskas, Professor of the Faculty of Medicine, Vilnius University. "To be certain, we needed confirmation through DNA analysis."

Kirsten Bos, a group leader for Molecular Palaeopathology at the Max Planck Institute for the Science of Human History (MPI-SHH) in Jena, Germany, is frequently contacted by archaeologists requesting such analyses.

"Plague was a common disease at the time, and the information we get from all the ancient DNA work can tell us a lot about how it was spreading," says Bos, a specialist in ancient pathogen DNA recovery who led the current study.

Working in Bos's team, doctoral candidate Karen Giffin took on DNA analysis of the putative plague victims and quickly identified the pathogen's DNA in the teeth of several individuals.

"I was happy to have identified them as victims of medieval plague," says Giffin, "but we wanted to see if the new techniques we were developing in molecular detection of pathogens could allow us to learn anything more about the health of this population."

More than just plague

"The typical method for pathogen detection in archaeological bone requires that you have some idea of what you're looking for," explains Alexander Herbig, group leader of Computational Pathogenomics at the MPI-SHH. "In this case we applied a relatively new hypothesis-free DNA screening approach to search for any other pathogens we might be able to identify at the molecular level."

This process unlocked a second secret of the 15th century graveyard. One of the four plague victims, a young woman, also showed a weak signal of something that seemed related to modern syphilis.

"It was impressive to find traces of such a disease in a historical skeleton, because the molecular preservation of these pathogens in ancient bone is known to be problematic," comments Bos.

Diseases in the syphilis family, known as the treponemal diseases, are assumed to have had a long history with humans, though their inferred history in Europe is laden with controversy. The prevailing opinion holds that the first outbreak of syphilis in Europe coincided with Charles VIII's 1495 siege of Naples, where a debilitating disease erupted amongst his infantry and quickly spread around Europe. Since this outbreak happened just after the return of Columbus and his crew from their first trans-Atlantic voyage, most discussants believe syphilis was a newcomer to Europe that originated in the New World. But support is growing for a different theory. An increasing number of specialists in bone pathology believe they have properly identified examples of pre-1493 syphilis in Europe, which has ignited ongoing debates about models of its evolution.

"We were able to reconstruct an impressively well-preserved genome that, to our surprise, fell within the diversity of modern yaws," comments Giffin. Yaws is a lesser-known treponemal disease primarily of the skin that affects both humans and other primates in warm, tropical environments. "Finding it in northern Europe in the mid-15th century was unexpected," she adds.

Yaws seems much younger than we thought

Since yaws infects both humans and non-human primates, some believe it to be a very old disease, having been with humans before the massive Pleistocene migrations that spread us around the globe.

"To our surprise, the yaws genome we reconstructed was just a few genetic steps away from the ancestor of all yaws varieties known in humans and non-human primates," says Bos. "Given the age of our medieval skeletons, it seems that all strains of yaws that we know today appeared on the scene only about 1000 years ago."

"This has important implications for the history of treponemal disease in Europe," Bos adds. "We can now confirm that yaws was circulating in medieval Europe, and given its similarity to syphilis and its recent emergence, it's possible that yaws contributed in some way to the famous late 15th to 16th century outbreak that we normally ascribe only to syphilis."

One possibility is that yaws emerged in either humans or other primates in West Africa within the last millennium and made its way to Europe in the mid-15th century. European presence in West Africa increased in the 15th century, as did the forced relocations of Africans to Europe through establishment of the transatlantic slave trade. These activities would have rapidly disseminated a new and highly contagious disease such as yaws.

Read more at Science Daily

Newly synthesized fungal compound can switch on a self-destruct button for cancer

All human body cells have a certain lifespan, during which they perform their essential duties. At the end of this lifespan, they reach senescence and, no longer able to perform those duties, die. This death is programmed into their genes through a process called apoptosis, which causes them to self-destruct to make way for fresh, young, and healthy cells to replace them.

Mutations in a special gene called p53 can sometimes interfere with this process. Caused by aging, ultraviolet light, and various mutagenic compounds, these mutations can disable the apoptosis gene, resulting in "zombie" cells that refuse to die and continue to multiply, spreading the disabled gene and replacing healthy working cells with undying, rapidly growing tumors. This is the disease that we call cancer, and it takes many forms depending on which body cells develop the mutations.

Previously, scientists identified an anticancer compound called FE399 in a species of filamentous fungus called Ascochyta, which is often found afflicting common food crops such as cereals. The compound belongs to the depsipeptides, a class of peptides in which some of the amide bonds are replaced by ester bonds, and was shown to induce apoptosis in cancerous human cells in vitro, particularly colorectal cancer cells, proving its worth as an anticancer chemical.

Unfortunately, due to a variety of chemical complexities, the FE399 compound is not easy to purify, which hindered any plans for its widespread application in cancer treatment. It was thus clear that extracting FE399 from the fungus naturally would not be a commercially feasible method, and despite the promise of a powerful anticancer drug, research into this particular compound stalled.

The promise of a new anticancer treatment was tempting, however, and Prof Isamu Shiina, Dr Takayuki Tonoi, and their team from the Tokyo University of Science accepted the challenge. "We wanted to create a lead compound that could treat colon cancer, and we aimed to do this through the total synthesis of FE399," says Prof Shiina. Total synthesis is the complete chemical synthesis of a complex molecule from commercially available precursors, which allows mass production. The results of their extensive studies will be published in the European Journal of Organic Chemistry.

The team reasoned that the structure of the depsipeptide would first need to be established. This initial step was simple and could be performed using commercially available, inexpensive materials. The subsequent procedures, however, required many steps and brought some setbacks when isomers could not be isolated successfully.

However, the team's efforts were rewarded when, in a major breakthrough, mass spectrometry and nuclear magnetic resonance studies confirmed that a trio of spots on a plate showed chemical signatures identical to the known formula of FE399, meaning they had successfully recreated FE399 synthetically.

Their technique was found to have an overall yield of 20%, which is quite promising for future large-scale production plans. "We hope that this newly produced compound can provide an unprecedented treatment option for patients with colorectal cancer, and thus improve the overall outcomes of the disease and ultimately improve their quality of life," states Prof Shiina.
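For a linear multi-step synthesis, the overall yield is simply the product of the per-step yields, which is why a route with many individually good steps can still end near 20%. The step yields below are purely hypothetical illustrations, not figures from the FE399 work:

```python
# Overall yield of a linear synthesis = product of per-step yields.
# These step yields are hypothetical, chosen only to show how a
# ~20% overall yield can arise from reasonable individual steps.
from math import prod

step_yields = [0.90, 0.85, 0.80, 0.75, 0.70, 0.62]  # hypothetical

overall = prod(step_yields)
print(f"Overall yield: {overall:.1%}")  # roughly 20% for these values
```

The multiplicative structure is why shortening a route, or raising the worst step's yield, matters far more than polishing steps that are already efficient.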

Read more at Science Daily

Jun 10, 2020

Black hole's heart still beating

The first confirmed heartbeat of a supermassive black hole is still going strong more than ten years after first being observed.

X-ray satellite observations spotted the repeated beat after its signal had been blocked by our Sun for a number of years.

Astronomers say this is the longest-lived heartbeat ever seen in a black hole and tells us more about the size and structure of matter close to its event horizon -- the region around a black hole from which nothing, including light, can escape.

The research, by the National Astronomical Observatories, Chinese Academy of Sciences, China, and Durham University, UK, appears in the journal Monthly Notices of the Royal Astronomical Society.

The black hole's heartbeat was first detected in 2007 at the centre of a galaxy called RE J1034+396, which is approximately 600 million light years from Earth.

The signal from this galactic giant repeated every hour and this behaviour was seen in several snapshots taken before satellite observations were blocked by our Sun in 2011.

In 2018 the European Space Agency's XMM-Newton X-ray satellite was able to finally re-observe the black hole and to scientists' amazement the same repeated heartbeat could still be seen.

Matter falling on to a supermassive black hole as it feeds from the accretion disc of material surrounding it releases an enormous amount of power from a comparatively tiny region of space, but this is rarely seen as a specific repeatable pattern like a heartbeat.

The time between beats can tell us about the size and structure of the matter close to the black hole's event horizon.

Professor Chris Done of Durham University's Centre for Extragalactic Astronomy collaborated on the findings with colleague Professor Martin Ward, Temple Chevallier Chair of Astronomy.

Professor Done said: "The main idea for how this heartbeat is formed is that the inner parts of the accretion disc are expanding and contracting.

"The only other system we know which seems to do the same thing is a 100,000 times smaller stellar-mass black hole in our Milky Way, fed by a binary companion star, with correspondingly smaller luminosities and timescales.

"This shows us that simple scalings with black hole mass work even for the rarest types of behaviour."
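The linear scaling Done describes can be checked with back-of-envelope arithmetic. The one-hour period and the 100,000-times mass ratio come from the article; the stellar-mass period derived from them is only the value that scaling implies, not a reported measurement:

```python
# Accretion timescales scale roughly linearly with black hole mass,
# so a heartbeat period can be rescaled between systems.
period_smbh_s = 3600.0   # ~1 hour beat of RE J1034+396 (from the article)
mass_ratio = 100_000     # supermassive vs stellar-mass (from the article)

period_stellar_s = period_smbh_s / mass_ratio
print(f"Implied stellar-mass heartbeat: {period_stellar_s * 1000:.0f} ms")
# -> 36 ms, i.e. the sub-second variability seen in Galactic binaries
```

This is the sense in which "simple scalings with black hole mass work": divide the timescale by the mass ratio and an hour-long supermassive beat maps onto tens of milliseconds for a stellar-mass system.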

Lead author Dr Chichuan Jin of the National Astronomical Observatories, Chinese Academy of Sciences, said: "This heartbeat is amazing!

"It proves that such signals arising from a supermassive black hole can be very strong and persistent. It also provides the best opportunity for scientists to further investigate the nature and origin of this heartbeat signal."

The next step in the research is to perform a comprehensive analysis of this intriguing signal, and compare it with the behaviour of stellar-mass black holes in our Milky Way.

Read more at Science Daily

Physics principle explains order and disorder of swarms

Current experiments support the controversial hypothesis that a well-known concept in physics -- a "critical point" -- is behind the striking behaviour of collective animal systems. Physicists from the Cluster of Excellence "Centre for the Advanced Study of Collective Behaviour" at the University of Konstanz showed that light-controlled microswimming particles can be made to organize into different collective states such as swarms and swirls. By studying the particles fluctuating between these states, they provide evidence for critical behaviour -- and support for a physical principle underlying the complex behaviour of collectives. The research results were published in the scientific journal Nature Communications.

Animal groups exhibit the seemingly contradictory characteristics of being both robust and flexible. Imagine a school of fish: hundreds of individuals in perfect order and alignment can suddenly transition to a convulsing tornado dodging an attack. Animal groups benefit if they can strike this delicate balance between being stable in the face of "noise" like eddies or gusts of wind, yet responsive to important changes like the approach of a predator.

Critical transition

How they achieve this is not yet understood. But in recent years, a possible explanation has emerged: criticality. In physics, criticality describes systems in which a transition between states -- such as gas to liquid -- occurs at a critical point. Criticality has been argued to provide biological systems with the necessary balance between robustness and flexibility. "The combination of stability and high responsiveness is exactly what characterizes a critical point," says the study's lead author Clemens Bechinger, Principal Investigator in the Centre for the Advanced Study of Collective Behaviour and Professor in the Department of Physics at the University of Konstanz, "and so it made sense to test if this could explain some of the patterns we see in collective behaviour."

The hypothesis that collective states are hovering near critical points has been studied in the past largely through numerical simulations. In the new study published in Nature Communications, Bechinger and his colleagues have given rare experimental support to the mathematical prediction. "By demonstrating a close link between collectivity and critical behaviour, our findings not only add to our general understanding of collective states but also suggest that general physical concepts may apply to living systems," says Bechinger.

Experimental evidence

In experiments, the researchers used glass beads coated on one side by a carbon cap and placed in a viscous liquid. When illuminated by light, they swim much like bacteria, but with an important difference: every aspect of how the particles interact with others -- from how the individuals move to how many neighbours can be seen -- can be controlled. These microswimming particles allow the researchers to eschew the challenges of working with living systems in which rules of interaction cannot be easily controlled. "We design the rules in the computer, put them in an experiment, and watch the result of the interaction game," says Bechinger.

But to ensure that the physical system bore a resemblance to living systems, the researchers designed interactions that mirrored the behaviour of animals. For example, they controlled the direction that individuals moved in relation to their neighbours: particles were programmed either to swim straight towards others in the main group or to deviate away from them. Depending on this angle of movement, the particles organized into either swirls or disordered swarms. And incrementally adjusting this value elicited rapid transitions between a swirl and a disordered but still cohesive swarm. "What we observed is that the system can make sudden transitions from one state to the other, which demonstrates the flexibility needed to react to an external perturbation like a predator," says Bechinger, "and provides clear evidence for a critical behaviour."
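The Konstanz experiments tune real light-driven colloids, but the qualitative picture -- one alignment/noise parameter driving a sharp transition between ordered and disordered collective states -- can be illustrated with a minimal Vicsek-style simulation. This sketch is not the authors' model; the particle count, interaction radius, speed, and noise values are arbitrary choices for illustration:

```python
import numpy as np

def vicsek_order(noise, n=300, steps=200, box=10.0, radius=1.0,
                 speed=0.1, seed=0):
    """Polar order parameter (0 = disordered, 1 = fully aligned) of a
    minimal Vicsek-style swarm after `steps` updates."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(0, box, size=(n, 2))
    theta = rng.uniform(-np.pi, np.pi, size=n)
    for _ in range(steps):
        # Each particle adopts the mean heading of neighbours within
        # `radius` (periodic boundaries), plus uniform angular noise.
        dx = pos[:, None, :] - pos[None, :, :]
        dx -= box * np.round(dx / box)
        neigh = (dx ** 2).sum(-1) < radius ** 2
        mean_sin = (neigh * np.sin(theta)[None, :]).sum(1)
        mean_cos = (neigh * np.cos(theta)[None, :]).sum(1)
        theta = np.arctan2(mean_sin, mean_cos)
        theta += rng.uniform(-noise / 2, noise / 2, size=n)
        pos = (pos + speed * np.column_stack((np.cos(theta),
                                              np.sin(theta)))) % box
    return float(np.hypot(np.sin(theta).mean(), np.cos(theta).mean()))

# Low noise -> ordered, aligned state; high noise -> disordered swarm.
print(vicsek_order(noise=0.3), vicsek_order(noise=5.0))
```

Sweeping the noise parameter in such models produces the hallmark of criticality the article describes: the order parameter changes sharply near a critical value, with large fluctuations between the two states close to the transition.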

"Similar behaviour to animal groups and neural systems"

This result is "key to understanding how animal collectives have evolved," says Professor Iain Couzin, co-speaker of the Centre for the Advanced Study of Collective Behaviour and Director of the Department of Collective Behavior at the Konstanz Max Planck Institute of Animal Behavior. Although not involved with the study, Couzin has worked for decades to decipher how grouping may enhance sensing capabilities in animal collectives.

Says Couzin: "The particles in this study behave in a very similar way to what we see in animal groups, and even neural systems. We know that individuals in collectives benefit from being more responsive, but the big challenge in biology has been testing if criticality is what allows the individual to spontaneously become much more sensitive to their environment. This study has confirmed this can occur just via spontaneous emergent physical properties. Through very simple interactions they have shown that you can tune a physical system to a collective state -- criticality -- of balance between order and disorder."

Read more at Science Daily

How the brain controls our speech

Speaking requires both sides of the brain. Each hemisphere takes over a part of the complex task of forming sounds, modulating the voice and monitoring what has been said. However, the distribution of tasks is different than has been thought up to now, as an interdisciplinary team of neuroscientists and phoneticians at Goethe University Frankfurt and the Leibniz-Centre General Linguistics Berlin has discovered: it is not just the right hemisphere that analyses how we speak -- the left hemisphere also plays a role.

Until now, it has been assumed that the spoken word arises in the left side of the brain and is analysed by the right side. According to accepted doctrine, this means that when we learn to speak English and, for example, practice the sound equivalent to "th," the left side of the brain controls the motor function of articulators like the tongue, while the right side analyses whether the produced sound actually sounds as we intended.

The division of labour actually follows different principles, as Dr Christian Kell from the Department of Neurology at Goethe University explains: "While the left side of the brain controls temporal aspects such as the transition between speech sounds, the right hemisphere is responsible for the control of the sound spectrum. When you say 'mother', for example, the left hemisphere primarily controls the dynamic transitions between "th" and the vowels, while the right hemisphere primarily controls the sounds themselves." His team, together with the phonetician Dr Susanne Fuchs, was able to demonstrate this division of labour in temporal and spectral control of speech for the first time in studies in which speakers were required to talk while their brain activities were recorded using functional magnetic resonance imaging.

A possible explanation for this division of labour between the two sides of the brain is that the left hemisphere generally analyses fast processes, such as the transition between speech sounds, better than the right hemisphere, while the right hemisphere could be better at controlling the slower processes required for analysing the sound spectrum. A previous study on hand motor function, published in the scientific journal eLife, demonstrates that this is in fact the case. Kell and his team wanted to learn why the right hand is preferentially used for the control of fast actions and the left hand for slow actions. For example, when cutting bread, the right hand slices with the knife while the left hand holds the bread.

In the experiment, scientists had right-handed test persons tap with both hands to the rhythm of a metronome. In one version they were supposed to tap with each beat, and in another only with every fourth beat. As it turned out, the right hand was more precise during the quick tapping sequence and the left hemisphere, which controls the right side of the body, exhibited increased activity. Conversely, tapping with the left hand corresponded better with the slower rhythm and resulted in the right hemisphere exhibiting increased activity.

Read more at Science Daily

What makes a giant jellyfish's sting deadly?

With summer on the way, and some beaches reopening after COVID-19 shutdowns, people will be taking to the ocean to cool off on a hot day. But those unlucky enough to encounter the giant jellyfish Nemopilema nomurai (also known as Nomura's jellyfish) might wish they had stayed on shore. Now, researchers reporting in ACS' Journal of Proteome Research have identified the key toxins that make the creature's venom deadly to some swimmers.

Found in coastal waters of China, Korea and Japan, Nomura's jellyfish can grow up to 6.6 feet in diameter and weigh up to 440 pounds. This behemoth stings hundreds of thousands of people per year, causing severe pain, redness, swelling, and in some cases, even shock or death. The jellyfish's venom is a complex brew of numerous toxins, some of which resemble poisons found in other organisms, such as snakes, spiders, bees and bacteria. Rongfeng Li, Pengcheng Li and colleagues wanted to determine which of the many toxins in the jellyfish's venom actually cause death. The answer could help scientists develop drugs to counteract jellyfish stings.

The researchers captured N. nomurai jellyfish off the coast of Dalian, China, and collected their tentacles, which contain the venom. They extracted venom proteins and separated them into different fractions using chromatography. By injecting each protein fraction into mice, the team identified one that killed the animals. Autopsies revealed damage to the mice's heart, lungs, liver and kidneys. The researchers used mass spectrometry to identify 13 toxin-like proteins in this lethal fraction. Some of the jellyfish proteins were similar to harmful enzymes and proteins found in poisonous snakes, spiders and bees. Instead of any one toxin being lethal, it's likely that multiple poisons work in concert to cause death, the researchers say.

From Science Daily

Volcanic activity and changes in Earth's mantle were key to rise of atmospheric oxygen

Oxygen first accumulated in the Earth's atmosphere about 2.4 billion years ago, during the Great Oxidation Event. A long-standing puzzle has been that geologic clues suggest early bacteria were photosynthesizing and pumping out oxygen hundreds of millions of years before then. Where was it all going?

Something was holding back oxygen's rise. A new interpretation of rocks billions of years old finds volcanic gases are the likely culprits. The study led by the University of Washington was published in June in the open-access journal Nature Communications.

"This study revives a classic hypothesis for the evolution of atmospheric oxygen," said lead author Shintaro Kadoya, a UW postdoctoral researcher in Earth and space sciences. "The data demonstrates that an evolution of the mantle of the Earth could control an evolution of the atmosphere of the Earth, and possibly an evolution of life."

Multicellular life needs a concentrated supply of oxygen, so the accumulation of oxygen is key to the evolution of oxygen-breathing life on Earth.

"If changes in the mantle controlled atmospheric oxygen, as this study suggests, the mantle might ultimately set a tempo of the evolution of life," Kadoya said.

The new work builds on a 2019 paper that found the early Earth's mantle was far less oxidized than the modern mantle -- that is, it contained more substances that can react with oxygen. That study examined ancient volcanic rocks, up to 3.55 billion years old, collected from sites that included South Africa and Canada.

Robert Nicklas at Scripps Institution of Oceanography, Igor Puchtel at the University of Maryland, and Ariel Anbar at Arizona State University are among the authors of the 2019 study. They are also co-authors of the new paper, looking at how changes in the mantle influenced the volcanic gases that escaped to the surface.

The Archean Eon, when only microbial life was widespread on Earth, was more volcanically active than today. Volcanic eruptions are fed by magma -- a mixture of molten and semi-molten rock -- as well as gases that escape even when the volcano is not erupting.

Some of those gases react with oxygen, or oxidize, to form other compounds. This happens because oxygen tends to be hungry for electrons, so any atom with one or two loosely held electrons reacts with it. For instance, hydrogen released by a volcano combines with any free oxygen, removing that oxygen from the atmosphere.

The chemical makeup of Earth's mantle, or softer layer of rock below the Earth's crust, ultimately controls the types of molten rock and gases coming from volcanoes. A less-oxidized early mantle would produce more of the gases like hydrogen that combine with free oxygen. The 2019 paper shows that the mantle became gradually more oxidized from 3.5 billion years ago to today.

The new study combines that data with evidence from ancient sedimentary rocks to show a tipping point sometime after 2.5 billion years ago, when oxygen produced by microbes overcame its loss to volcanic gases and began to accumulate in the atmosphere.

"Basically, the supply of oxidizable volcanic gases was capable of gobbling up photosynthetic oxygen for hundreds of millions of years after photosynthesis evolved," said co-author David Catling, a UW professor of Earth and space sciences. "But as the mantle itself became more oxidized, fewer oxidizable volcanic gases were released. Then oxygen flooded the air when there was no longer enough volcanic gas to mop it all up."
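The tipping point Catling describes -- a roughly steady photosynthetic oxygen source against a volcanic sink that weakens as the mantle oxidizes -- can be caricatured as a one-line mass balance. Every number below is invented for illustration; none comes from the study:

```python
import numpy as np

# Toy mass balance: d[O2]/dt = source - sink(t), with the volcanic
# sink decaying as the mantle oxidizes. All values are arbitrary
# illustrative units, not the study's quantities.
t = np.linspace(0, 1000, 10001)        # hypothetical time axis
source = 1.0                           # constant photosynthetic O2 flux
sink = 2.0 * np.exp(-t / 400.0)        # volcanic sink weakening over time

net = source - sink
# O2 cannot go negative: while the sink exceeds the source, all
# photosynthetic oxygen is consumed and none accumulates.
o2 = np.cumsum(np.clip(net, 0, None)) * (t[1] - t[0])

tipping = t[np.argmax(net > 0)]
print(f"Sink falls below source at t = {tipping:.0f} (illustrative units)")
```

The qualitative behaviour is the point: atmospheric oxygen stays pinned near zero for as long as the sink outpaces the source, then rises once the curves cross -- a Great Oxidation-style threshold rather than a gradual ramp.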

This has implications for understanding the emergence of complex life on Earth and the possibility of life on other planets.

Read more at Science Daily

Jun 9, 2020

Ancient asteroid impacts created the ingredients of life on Earth and Mars

A new study reveals that asteroid impact sites in the ocean may hold a crucial link in explaining the formation of the essential molecules for life. The study discovered the emergence of amino acids that serve as the building blocks for proteins -- demonstrating the role of meteorites in bringing life's molecules to Earth, and potentially Mars.

There are two explanations for the origins of life's building molecules: extraterrestrial delivery, such as via meteorites; and endogenous formation. The presence of amino acids and other biomolecules in meteorites points to the former.

Researchers from Tohoku University, National Institute for Materials Science (NIMS), Center for High Pressure Science & Technology Advanced Research (HPSTAR), and Osaka University simulated the reactions involved when a meteorite crashes into the ocean. To do this, they investigated the reactions between carbon dioxide, nitrogen, water, and iron in a laboratory impact facility using a single stage propellant gun. Their simulation revealed the formation of amino acids such as glycine and alanine. These amino acids are direct constituents of proteins, which catalyze many biological reactions.

The team used carbon dioxide and nitrogen as the carbon and nitrogen sources because these gases are regarded as the two major components in the atmosphere on the Hadean Earth, which existed more than 4 billion years ago.

Corresponding author Yoshihiro Furukawa of Tohoku University explains, "Making organic molecules from reduced compounds like methane and ammonia is not difficult, but they are regarded as minor components of the atmosphere at that time." He adds, "The finding of amino acid formation from carbon dioxide and molecular nitrogen demonstrates the importance of making life's building blocks from these ubiquitous compounds."

The hypothesis that an ocean once existed on Mars also raises interesting avenues for exploration. Carbon dioxide and nitrogen are likely to have been the major constituent gases of the Martian atmosphere when the ocean existed. Therefore, impact-induced amino acid formation also provides a possible source of life's ingredients on ancient Mars.

Furukawa says, "Further investigations will reveal more about the role meteorites played in bringing more complex biomolecules to Earth and Mars."

From Science Daily

Water vapor in the atmosphere may be prime renewable energy source

The search for renewable energy sources, which include wind, solar, hydroelectric dams, geothermal, and biomass, has preoccupied scientists and policymakers alike, due to their enormous potential in the fight against climate change. A new Tel Aviv University study finds that water vapor in the atmosphere may serve as a potential renewable energy source in the future.

The research, led by Prof. Colin Price in collaboration with Prof. Hadas Saaroni and doctoral student Judi Lax, all of TAU's Porter School of the Environment and Earth Sciences, is based on the discovery that electricity materializes in the interaction between water molecules and metal surfaces. It was published in Scientific Reports on May 6, 2020.

"We sought to capitalize on a naturally occurring phenomenon: electricity from water," explains Prof. Price. "Electricity in thunderstorms is generated only by water in its different phases -- water vapor, water droplets, and ice. Twenty minutes of cloud development is how we get from water droplets to huge electric discharges -- lightning -- some half a mile in length."

The researchers set out to try to produce a tiny low-voltage battery that utilizes only humidity in the air, building on the findings of earlier discoveries. In the nineteenth century, for example, English physicist Michael Faraday discovered that water droplets could charge metal surfaces due to friction between the two. A much more recent study showed that certain metals spontaneously build up an electrical charge when exposed to humidity.

The scientists conducted a laboratory experiment to determine the voltage between two different metals exposed to high relative humidity, while one is grounded. "We found that there was no voltage between them when the air was dry," Prof. Price explains. "But once the relative humidity rose above 60%, a voltage began to develop between the two isolated metal surfaces. When we lowered the humidity level to below 60%, the voltage disappeared. When we carried out the experiment outside in natural conditions, we saw the same results.

"Water is a very special molecule. During molecular collisions, it can transfer an electrical charge from one molecule to the other. Through friction, it can build up a kind of static electricity," says Prof. Price. "We tried to reproduce electricity in the lab and found that different isolated metal surfaces will build up different amounts of charge from water vapor in the atmosphere, but only if the air relative humidity is above 60%. This occurs nearly every day in the summer in Israel and every day in most tropical countries."
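The reported behaviour -- no voltage below about 60% relative humidity, a voltage appearing above it -- amounts to a threshold response. The sketch below captures only that shape; the linear ramp and the ~1 V ceiling (a figure the article mentions later) are illustrative assumptions, not a fit to the team's measurements:

```python
def surface_voltage(rh_percent, threshold=60.0, v_max=1.0):
    """Illustrative threshold model of the reported effect: zero
    voltage below ~60% relative humidity, rising toward ~1 V above.
    The linear ramp is an assumption, not the measured response."""
    if rh_percent <= threshold:
        return 0.0
    # Ramp linearly from the threshold up to v_max at 100% RH.
    return v_max * (rh_percent - threshold) / (100.0 - threshold)

print(surface_voltage(50), surface_voltage(80))  # -> 0.0 0.5
```

A model like this makes the practical constraint explicit: any device exploiting the effect would only charge on days and in climates where relative humidity clears the threshold, which is why the researchers highlight humid summers and tropical regions.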

According to Prof. Price, this study challenges established ideas about humidity and its potential as an energy source. "People know that dry air results in static electricity and you sometimes get 'shocks' you when you touch a metal door handle. Water is normally thought of as a good conductor of electricity, not something that can build up charge on a surface. However, it seems that things are different once the relative humidity exceeds a certain threshold," he says.

The researchers, however, showed that humid air may be a source of charging surfaces to voltages of around one volt. "If a AA battery is 1.5V, there may be a practical application in the future: to develop batteries that can be charged from water vapor in the air," Prof. Price adds.

Read more at Science Daily

Physical activity in all of its forms may help maintain muscle mass in midlife

A large study of middle-aged women shows that age-related changes in skeletal muscle are part of everyday life for women in their fifties. During this time, women transition from perimenopause to postmenopause and the production of estrogen ceases. Loss of estrogen has an effect on muscles and leads to a decline in muscle mass. Physical activity in all of its forms may help maintain muscle mass in midlife.

"We already knew that estrogen has a role in the regulation of muscle properties," says doctoral student Hanna-Kaarina Juppi. "By following the hormonal status, measuring many aspects of muscles and by taking into consideration the simultaneous chronological aging of women going through menopausal transition, we were able to show that the decrease of muscle mass takes place already in early postmenopause."

In the current study, muscle size was measured in the perimenopausal state and right after entering postmenopause, when menstruation had permanently stopped. Women were on average 51-and-a-half years old at the beginning of the study and 53 years old at the final measurements, so the average duration of menopausal transition was one-and-a-half years. The time it takes a woman to go through menopause is unique: in this study it varied from less than six months to more than three years. During this time, the decrease in muscle mass was on average one percent.

Juppi continues: "The observed change does not seem like much, but what is meaningful is that the decline happens in a short period of time and can have an impact on metabolism, as muscles are important regulators of whole-body metabolism."

Physical activity was found to be positively associated with the maintenance of muscle mass during the menopausal transition. Women who were more active had higher muscle mass before and after menopause compared to the less active women. It seems that even though menopause alone decreases muscle mass, staying physically active throughout middle age can help women to slow the change.

The current study was conducted in the Gerontology Research Center and Faculty of Sport and Health Sciences, and is part of a larger study, Estrogenic Regulation of Muscle Apoptosis (ERMA), led by Academy Research Fellow Eija Laakkonen. More than a thousand women between the ages of 47 and 55 from the Jyväskylä region participated in the ERMA study. At the beginning of the study, 381 of them were perimenopausal, while 234 reached early postmenopause during the study. The research was funded by the Academy of Finland and the European Commission.

From Science Daily

Drug researcher develops 'fat burning' molecule

Obesity affects more than 40 percent of adults in the United States and 13 percent of the global population. With obesity comes a variety of other interconnected diseases, including cardiovascular disease, diabetes, and fatty liver disease, which makes obesity one of the most difficult -- and most crucial -- diseases to treat.

"Obesity is the biggest health problem in the United States. But, it is hard for people to lose weight and keep it off; being on a diet can be so difficult. So, a pharmacological approach, or a drug, could help out and would be beneficial for all of society," said Webster Santos, professor of chemistry and the Cliff and Agnes Lilly Faculty Fellow of Drug Discovery in the College of Science at Virginia Tech.

Santos and his colleagues have recently identified a small mitochondrial uncoupler, named BAM15, that decreases the body fat mass of mice without affecting food intake and muscle mass or increasing body temperature. Additionally, the molecule decreases insulin resistance and has beneficial effects on oxidative stress and inflammation.

The findings, published in Nature Communications on May 14, 2020, hold promise for future treatment and prevention of obesity, diabetes, and especially nonalcoholic steatohepatitis (NASH), a type of fatty liver disease that is characterized by inflammation and fat accumulation in the liver. In the next few years, the condition is expected to become the leading cause of liver transplants in the United States.

Mitochondria are commonly referred to as the powerhouses of the cell. These organelles generate ATP, a molecule that serves as the energy currency of the cell, powering body movement and the other biological processes that keep the body functioning properly.

In order to make ATP, nutrients need to be burned and a proton motive force (PMF) needs to be established within the mitochondria. The PMF is generated from a proton gradient, where there is a higher concentration of protons outside of the inner membrane and a lower concentration of protons in the matrix, or the space within the inner membrane. The cell creates ATP whenever protons pass through an enzyme called ATP synthase, which is embedded in the membrane. Hence, nutrient oxidation, or nutrient burning, is coupled to ATP synthesis.

"So anything that decreases the PMF has the potential to increase respiration. Mitochondrial uncouplers are small molecules that go to the mitochondria to help the cells respire more. Effectively, they change metabolism in the cell so that we burn more calories without doing any exercise," said Santos, an affiliated member of the Fralin Life Sciences Institute and the Virginia Tech Center for Drug Discovery.

Mitochondrial uncouplers transport protons into the matrix by bypassing ATP synthase, which throws off the PMF. To reestablish the gradient, protons must be exported out of the mitochondrial matrix. As a result, the cell begins to burn fuel at higher than necessary levels.
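The proton bookkeeping described above can be illustrated with a hedged toy calculation: if a fixed fraction of pumped protons leaks back through an uncoupler instead of ATP synthase, more fuel must be oxidized to meet the same ATP demand. All stoichiometries here are invented for illustration, not measured values.

```python
def fuel_burned(atp_demand, protons_per_atp=3.0, protons_per_fuel=10.0,
                uncoupled_fraction=0.0):
    """Toy proton bookkeeping for mitochondrial respiration.

    Protons pumped out of the matrix return either through ATP synthase
    (making ATP) or through an uncoupler (dissipated as heat). To meet a
    fixed ATP demand, the leakier the membrane, the more total protons
    must be pumped -- hence the more fuel is oxidized.
    """
    protons_through_synthase = atp_demand * protons_per_atp
    total_protons = protons_through_synthase / (1.0 - uncoupled_fraction)
    return total_protons / protons_per_fuel

coupled = fuel_burned(100)                             # no uncoupler
uncoupled = fuel_burned(100, uncoupled_fraction=0.25)  # 25% of protons leak
print(coupled, uncoupled)  # the uncoupled cell burns more fuel for the same ATP
```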

Knowing that these molecules can change a cell's metabolism, the researchers wanted to be sure that the drug was reaching its desired targets and that it was, above all, safe. Through a series of mouse studies, they found that BAM15 is not toxic even at high doses, and that it does not affect the satiety center in the brain, which tells the body whether it is hungry or full.

In the past, many anti-fat drugs would tell your body to stop eating. But as a result, patients would rebound and eat more. In the BAM15 mouse studies, animals ate the same amount as the control group -- and they still lost fat mass.

Another side effect of previous mitochondrial uncouplers was increased body temperature. Using a rectal probe, researchers measured the body temperature of mice who were fed BAM15. They found no change in body temperature.

But one issue arises concerning the half-life of BAM15. The half-life, or the time it takes for half of the drug to be eliminated from the body, is relatively short in the mouse model. For oral dosing in humans, the optimal half-life is much longer.
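As a rough illustration of why half-life matters for dosing, first-order elimination leaves 0.5^(t / half-life) of a dose after time t. The half-life figures below are hypothetical, chosen only to contrast a short mouse-like half-life with one long enough for once-daily dosing; they are not BAM15's actual pharmacokinetics.

```python
def remaining_fraction(t_hours, half_life_hours):
    """Fraction of a dose still present after t hours, assuming first-order decay."""
    return 0.5 ** (t_hours / half_life_hours)

# Hypothetical half-lives: a short one vs. one suited to once-daily oral dosing.
print(remaining_fraction(24, 1.0))   # short half-life: essentially gone in a day
print(remaining_fraction(24, 12.0))  # longer half-life: a quarter of the dose remains
```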

Even though BAM15 shows serious potential in mouse models, the drug won't necessarily be successful in humans -- at least not this exact molecule.

"We are essentially looking for roughly the same type of molecule, but it needs to stay in the body for longer to have an effect. We are tweaking the chemical structure of the compound. So far, we have made several hundred molecules related to this," said Santos.

The penultimate goal of the Santos lab is to transition the anti-fat treatment from animal models to a treatment for NASH in humans. The lab has tested its improved compounds in animal models of NASH, where they have proven effective as anti-NASH compounds in mice.

Working alongside Santos is Kyle Hoehn, an assistant professor of pharmacology from the University of Virginia and an associate professor of biotechnology and biomolecular sciences at the University of New South Wales in Australia. Hoehn is a metabolic physiology expert who is in charge of conducting the animal studies. Santos and Hoehn have been collaborating for several years now and they even founded a biotech company together.

Co-founded by Santos and Hoehn in 2017, Continuum Biosciences aims to improve the ways in which our bodies burn fuel and to fight back against our bodies' ability to store excess nutrients as we age. These promising NASH treatment compounds are licensed by the company and patented by Virginia Tech.

The company is looking to use mitochondrial uncouplers for more than just obesity and NASH. The molecules also have a unique antioxidant effect that can minimize the accumulation of reactive oxygen species, or oxidative stress, in our bodies, which contributes to neurodegeneration and aging.

Read more at Science Daily

Jun 8, 2020

Astronomers find elusive target hiding behind dust

Astronomers acting on a hunch have likely resolved a mystery about young, still-forming stars and regions rich in organic molecules closely surrounding some of them. They used the National Science Foundation's Karl G. Jansky Very Large Array (VLA) to reveal one such region that previously had eluded detection, and that revelation answered a longstanding question.

The regions around the young protostars contain complex organic molecules that can further combine into prebiotic molecules that are the first steps on the road to life. The regions, dubbed "hot corinos" by astronomers, are typically about the size of our Solar System and are much warmer than their surroundings, though still quite cold by terrestrial standards.

The first hot corino was discovered in 2003, and only about a dozen have been found so far. Most of these are in binary systems, with two protostars forming simultaneously.

Astronomers have been puzzled by the fact that, in some of these binary systems, they found evidence for a hot corino around one of the protostars but not the other.

"Since the two stars are forming from the same molecular cloud and at the same time, it seemed strange that one would be surrounded by a dense region of complex organic molecules, and the other wouldn't," said Cecilia Ceccarelli, of the Institute for Planetary Sciences and Astrophysics at the University of Grenoble (IPAG) in France.

The complex organic molecules were found by detecting specific radio frequencies, called spectral lines, emitted by the molecules. Those characteristic radio frequencies serve as "fingerprints" to identify the chemicals. The astronomers noted that all the chemicals found in hot corinos had been found by detecting these "fingerprints" at radio frequencies corresponding to wavelengths of only a few millimeters.

"We know that dust blocks those wavelengths, so we decided to look for evidence of these chemicals at longer wavelengths that can easily pass through dust," said Claire Chandler of the National Radio Astronomy Observatory, and principal investigator on the project. "It struck us that dust might be what was preventing us from detecting the molecules in one of the twin protostars."

The astronomers used the VLA to observe a pair of protostars called IRAS 4A, in a star-forming region about 1,000 light-years from Earth. They observed the pair at centimeter wavelengths, seeking radio emission from methanol, CH3OH (wood alcohol, not for drinking). This was a pair in which one protostar clearly had a hot corino and the other did not, as seen at the much shorter wavelengths.
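For readers keeping track of the bands involved, wavelength and frequency are related by c = λν. This quick conversion (standard physics, not taken from the paper) shows where the dust-penetrating centimeter observations sit relative to the dust-obscured millimeter lines mentioned earlier.

```python
C = 299_792_458.0  # speed of light, m/s

def freq_ghz(wavelength_m):
    """Frequency in GHz corresponding to a given wavelength."""
    return C / wavelength_m / 1e9

print(freq_ghz(0.01))   # 1 cm -> ~30 GHz: passes through dust (VLA regime)
print(freq_ghz(0.002))  # 2 mm -> ~150 GHz: the dust-obscured mm-wave regime
```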

The result confirmed their hunch.

"With the VLA, both protostars showed strong evidence of methanol surrounding them. This means that both protostars have hot corinos, and the reason we didn't see the one at shorter wavelengths was because of dust," said Marta de Simone, a graduate student at IPAG who led the data analysis for this object.

The astronomers caution that, while both hot corinos now are known to contain methanol, there still may be some chemical differences between them. That, they said, can be settled by looking for other molecules at wavelengths not obscured by dust.

Read more at Science Daily

Countries must work together on CO2 removal to avoid dangerous climate change

The Paris Agreement lays out national quotas on CO2 emissions but not removal, and that must be urgently addressed, say the authors of a new study.

The Paris Agreement aims to keep global temperature rise this century well below 2°C above pre-industrial levels and to pursue efforts to limit it to 1.5°C. Reaching these targets will require mitigation -- lowering the carbon dioxide (CO2) emitted through changes such as increased use of renewable energy sources, and removal of CO2 from the atmosphere through measures such as reforestation and carbon capture and storage.

However, while countries signed up to the Paris Agreement have individual quotas they need to meet in terms of mitigation and have individual plans for doing so, there are no agreed national quotas for CO2 removal.

Now, in a paper published today in Nature Climate Change, an international group of researchers has argued that, however CO2 removal quotas are allocated to meet the Paris Agreement's targets, few countries will be able to fulfil their obligations alone.

Cross-border cooperation

The team, from Imperial College London, the University of Girona, ETH Zürich and the University of Cambridge, say countries need to start working together now to make sure enough CO2 is removed in a fair and equitable way. This should involve deciding how quotas might be allocated fairly and devising a system where countries that cannot fulfil their obligations alone can trade with countries with greater capacity to remove CO2.

Co-author Dr Niall Mac Dowell, from the Centre for Environmental Policy and the Centre for Process Systems Engineering at Imperial, said: "Carbon dioxide removal is necessary to meet climate targets, since we have so far not done enough to mitigate our emissions. Both will be necessary going forward, but the longer we wait to start removing CO2 on a large scale, the more we will have to do.

"It is imperative that nations have these conversations now, to determine how quotas could be allocated fairly and how countries could meet those quotas via cross-border cooperation. It will work best if we all work together."

Co-author Dr David Reiner, from Judge Business School at the University of Cambridge, added: "Countries such as the UK and France have begun to adopt binding 'net-zero targets' and whereas there has been extensive focus on greenhouse gas emissions and emissions reductions, meeting these targets will require greater attention to the negative emissions or carbon dioxide removal side of the equation."

Allocating quotas

A critical element in any negotiations will be to determine the fairest way to allocate quotas to different nations. Different methods have been used for determining previous quotas, such as the ability of a country to pay and its historic culpability (how much CO2 it has emitted), with a blend of methods often used implicitly or explicitly in any final agreement.

The team modelled several of these different methods and applied them to countries across Europe. While the quotas varied significantly, they found that only a handful of countries could meet any of the quotas using only their own resources.
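A minimal sketch of the kind of blended allocation rule described above, assuming a simple weighted mix of historical-emissions share and GDP share (ability to pay). The weights and country figures are invented for illustration; this is not the paper's method or data.

```python
def allocate_quotas(total_removal, countries, w_history=0.5, w_wealth=0.5):
    """Split a total CO2-removal quota by a weighted blend of each country's
    share of historical emissions and share of GDP (ability to pay)."""
    hist_total = sum(c["historical_emissions"] for c in countries.values())
    gdp_total = sum(c["gdp"] for c in countries.values())
    return {
        name: total_removal * (w_history * c["historical_emissions"] / hist_total
                               + w_wealth * c["gdp"] / gdp_total)
        for name, c in countries.items()
    }

# Invented figures in arbitrary units, not real national statistics.
countries = {
    "A": {"historical_emissions": 300, "gdp": 200},
    "B": {"historical_emissions": 100, "gdp": 200},
}
print(allocate_quotas(50.0, countries))  # the heavier historical emitter bears more
```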

Co-lead author Dr Ángel Galán-Martín, from ETH Zürich, said: "The exercise of allocating CO2 removal quotas may help to break the current impasse, by incentivising countries to align their future national pledges with the expectations emerging from the fairness principles."

Carbon dioxide removal can be achieved in several ways. Reforestation uses trees as natural absorbers of atmospheric CO2, but takes time to reach its full potential as the trees grow. Carbon capture and storage (CCS) captures CO2 and stores it in underground geological formations.

CCS is usually coupled with a fossil fuel power station to take the CO2 out of the emissions before they reach the atmosphere. However, it can also be coupled to bioenergy -- growing crops to burn for fuel. These systems have the double benefit of the crops removing CO2 from the atmosphere, and the CCS capturing any CO2 from the power station before it is released.

Beginning the process

However, different countries have varying abilities to deploy these CO2 removal strategies. For example, small but rich countries like Luxembourg might incur a heavy CO2 removal burden but not have the geological capacity to implement large-scale CCS or have the space to plant enough trees or bioenergy crops.

The authors therefore suggest, after quotas have been determined, that a system of trading quotas could be established. For example, the UK has abundant space for CCS thanks to favourable geological formations in the North Sea, so could sell some of its capacity to other countries.

This system would take a while to set up, so the authors urge nations to begin the process now. Co-lead author Dr Carlos Pozo from the University of Girona, said: "By 2050, the world needs to be carbon neutral -- taking out of the atmosphere as much CO2 as it puts in. To this end, a CO2 removal industry needs to be rapidly scaled up, and that begins now, with countries looking at their responsibilities and their capacity to meet any quotas."

Read more at Science Daily

Virus DNA spread across surfaces in hospital ward over 10 hours

Virus DNA left on a hospital bed rail was found in nearly half of all sites sampled across a ward within 10 hours and persisted for at least five days, according to a new study by UCL and Great Ormond Street Hospital (GOSH).

The study, published as a letter in the Journal of Hospital Infection, aimed to safely simulate how SARS-CoV-2, the virus that causes Covid-19, may spread across surfaces in a hospital.

Instead of using the SARS-CoV-2 virus, researchers artificially replicated a section of DNA from a plant-infecting virus, which cannot infect humans, and added it to a millilitre of water at a similar concentration to SARS-CoV-2 copies found in infected patients' respiratory samples.

Researchers placed the water containing this DNA on the hand rail of a hospital bed in an isolation room -- that is, a room for higher-risk or infected patients -- and then sampled 44 sites across a hospital ward over the following five days.

They found that after 10 hours, the surrogate genetic material had spread to 41% of sites sampled across the hospital ward, from bed rails to door handles to arm rests in a waiting room to children's toys and books in a play area. This increased to 59% of sites after three days, falling to 41% on the fifth day.

Dr Lena Ciric (UCL Civil, Environmental & Geomatic Engineering), a senior author of the study, said: "Our study shows the important role that surfaces play in the transmission of a virus and how critical it is to adhere to good hand hygiene and cleaning.

"Our surrogate was inoculated once to a single site, and was spread through the touching of surfaces by staff, patients and visitors. A person with SARS-CoV-2, though, will shed the virus on more than one site, through coughing, sneezing and touching surfaces."

The highest proportion of sites that tested positive for the surrogate came from the immediate bedspace area -- including a nearby room with several other beds -- and clinical areas such as treatment rooms. On day three, 86% of sampled sites in clinical areas tested positive, while on day four, 60% of sampled sites in the immediate bedspace area tested positive.

Co-author Dr Elaine Cloutman-Green (UCL Civil, Environmental & Geomatic Engineering), Lead Healthcare Scientist at GOSH, said: "People can become infected with Covid-19 through respiratory droplets produced during coughing or sneezing. Equally, if these droplets land on a surface, a person may become infected after coming into contact with the surface and then touching their eyes, nose or mouth.

"Like SARS-CoV-2, the surrogate we used for the study could be removed with a disinfectant wipe or by washing hands with soap and water. Cleaning and handwashing represent our first line of defence against the virus and this study is a significant reminder that healthcare workers and all visitors to a clinical setting can help stop its spread through strict hand hygiene, cleaning of surfaces, and proper use of personal protective equipment (PPE)."

SARS-CoV-2 will likely be spread within bodily fluid such as cough droplets, whereas the study used virus DNA in water. More sticky fluid such as mucus would likely spread more easily.

Read more at Science Daily

Blood pressure medications help even the frailest elderly people live longer

Taking blood pressure medication as prescribed helped even the frailest elderly people (65 and older) live longer, and the healthiest older people had the biggest survival boost, according to a large study in northern Italy published today in Hypertension, an American Heart Association journal.

"We knew that high blood pressure medication was protective in general among older people, however, we focused on whether it is also protective in frail patients with many other medical conditions who are usually excluded from randomized trials," said Giuseppe Mancia, M.D., lead study author and professor emeritus at the University of Milano-Bicocca in Milan, Italy.

Researchers reviewed data on almost 1.3 million people aged 65 and older (average age 76) in the Lombardy region of northern Italy who had three or more high blood pressure medication prescriptions in 2011-2012. Examining the public health care database, researchers calculated the percentage of time over the next seven years (or until death) that each person continued to receive the medications. Because almost all medications are free or low-cost and dispensed by the public health service, the amount dispensed corresponds roughly to the amount people actually used, making it a reasonable proxy for adherence in Italy.

To look separately at outcomes among older people with various medical conditions, researchers used a previously developed score that accounts for 34 different health factors and has a close relationship with mortality.

Researchers compared roughly 255,000 people who died during the 7-year follow-up with age-, gender-, and health-status-matched controls who survived and divided them into four groups of health status: good, medium, poor or very poor.

The probability of death over seven years was 16% for people rated in good health at the beginning of the study, increasing progressively to 64% for those rated in very poor health.

Compared with people with very low adherence to blood pressure medications (dispensed pills covered less than 25% of the time period), people with high adherence to blood pressure medications (more than 75% of the time period covered) were:

    44% less likely to die if they started in good health; and

    33% less likely to die if they started in very poor health.

A similar pattern was seen with cardiovascular deaths. The greatest survival benefit was among the people who started in good health, and the most modest survival benefit was in those who started in very poor health.

"Our findings definitely suggest that even in very frail people, antihypertensive treatment reduces the risk of death; however, the benefits may be smaller in this group," Mancia said.

No matter what a person's initial health status, survival benefits were greatest in those who received blood pressure medication to cover more than 75% of the follow-up period, compared with those with intermediate (25-75%) or low levels (less than 25%) of coverage, highlighting the importance of consistent use of blood pressure medications.
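The coverage-based measure used in the study resembles the standard proportion-of-days-covered calculation; the sketch below assumes that framing (an assumption, since the paper's exact formula is not given here), and the cut-offs mirror the less-than-25%, 25-75%, and more-than-75% bands described above.

```python
def adherence(days_covered, follow_up_days):
    """Share of the follow-up period covered by dispensed medication."""
    return days_covered / follow_up_days

def adherence_band(pdc):
    """The three coverage bands compared in the study."""
    if pdc > 0.75:
        return "high"
    if pdc >= 0.25:
        return "intermediate"
    return "low"

# e.g. 2,000 covered days over a 7-year (~2,555-day) follow-up
print(adherence_band(adherence(2000, 2555)))  # -> "high"
```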

"Do your best to encourage and support patients to take their medications, because adherence is crucial to getting the benefits. Medications do nothing if people don't take them," Mancia said.

Read more at Science Daily

Jun 7, 2020

Chance of finding young Earth-like planets higher than previously thought

Research from the University of Sheffield has found that the chance of finding Earth-like planets in their early stages of formation is much higher than previously thought.

The team studied groups of young stars in the Milky Way to see if these groups were typical compared to theories and previous observations in other star-forming regions in space, and to study if the populations of stars in these groups affected the likelihood of finding forming Earth-like planets.

The research, published in The Astrophysical Journal, found that there are more stars like the Sun than expected in these groups, which would increase the chances of finding Earth-like planets in their early stages of formation.

In their early stages of formation these Earth-like planets, called magma ocean planets, are still being made from collisions with rocks and smaller planets, which causes them to heat up so much that their surfaces become molten rock.

The team, led by Dr Richard Parker, included undergraduate students from the University of Sheffield, giving them the opportunity to apply the skills learnt on their course to leading published research in their field.

Dr Richard Parker, from the University of Sheffield's Department of Physics and Astronomy, said: "These magma ocean planets are easier to detect near stars like the Sun, which are twice as heavy as the average mass star. These planets emit so much heat that we will be able to observe the glow from them using the next generation of infra-red telescopes.

"The locations where we would find these planets are so-called 'young moving groups' which are groups of young stars that are less than 100 million years old -- which is young for a star. However, they typically only contain a few tens of stars each and previously it was difficult to determine whether we had found all of the stars in each group because they blend into the background of the Milky Way galaxy.

"Observations from the Gaia telescope have helped us to find many more stars in these groups, which enabled us to carry out this study."

The findings from the research will help further understanding of whether star formation is universal and will be an important resource for studying how rocky, habitable planets like Earth form. The team now hopes to use computer simulations to explain the origin of these young moving groups of stars.

The research team included undergraduate students Amy Bottrill, Molly Haigh, Madeleine Hole and Sarah Theakston from the University of Sheffield's Department of Physics and Astronomy.

Molly Haigh said: "Being involved in this project was one of the highlights of our university experience and it was a great opportunity to work on an area of astronomy outside the typical course structure.

"It was rewarding to see a physical application of the computer coding we learnt in our degree by sampling the initial mass distribution of stars and how this can relate to the future of exoplanet detection."

Read more at Science Daily

Scientists discover that nicotine promotes spread of lung cancer to the brain

Among people who have the most common type of lung cancer, up to 40% develop metastatic brain tumors, with an average survival time of less than six months.

But why non-small-cell lung cancer so often spreads to the brain has been poorly understood.

Now scientists at Wake Forest School of Medicine have found that nicotine, a non-carcinogenic chemical found in tobacco, actually promotes the spread, or metastasis, of lung cancer cells into the brain.

"Based on our findings, we don't think that nicotine replacement products are the safest way for people with lung cancer to stop smoking," said Kounosuke Watabe, Ph.D., professor of cancer biology at Wake Forest School of Medicine and lead author of the study.

In the study, published in the June 4 edition of the Journal of Experimental Medicine, Watabe's team first examined 281 lung cancer patients and found that cigarette smokers exhibited a significantly higher incidence of brain cancer.

Then, using a mouse model, the researchers discovered that nicotine enhanced brain metastasis by crossing the blood-brain barrier to change the microglia -- a type of immune cell in the brain -- from being protective to supporting tumor growth.

Watabe and colleagues then looked for drugs that might reverse the effects of nicotine and identified parthenolide, a naturally occurring substance in the medicinal herb feverfew, which blocked nicotine-induced brain metastasis in the mice.

Because feverfew has been used for years and is considered safe, Watabe believes parthenolide may provide a new approach to fight brain metastasis, particularly for patients who have smoked or still smoke.

"Currently, the only treatment for this devastating illness is radiation therapy," Watabe said. "Traditional chemotherapy drugs can't cross the blood-brain barrier, but parthenolide can, and thus holds promise as a treatment or possibly even a way to prevent brain metastasis."

Read more at Science Daily