Nov 7, 2020

The birth of a tRNA gene

 Translation is the process by which genetic information is converted into proteins, the workhorses of the cell. Small molecules called transfer RNAs ("tRNAs") play a crucial role in translation; they are the adapter molecules that match codons (the building blocks of genetic information) with amino acids (the building blocks of proteins). Organisms carry many types of tRNAs, each encoded by one or more genes (the "tRNA gene set").

Broadly speaking, the function of the tRNA gene set -- to translate 61 types of codons into 20 different kinds of amino acids -- is conserved across organisms. Nevertheless, tRNA gene set composition can vary considerably between organisms. How and why these differences arise has been a question of long-standing interest among scientists.
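
To make the many-to-one nature of this mapping concrete, here is a minimal sketch using a few entries of the standard genetic code (well-established biology; the code below is illustrative and not taken from the study):

```python
# A few entries of the standard genetic code (well-established biology),
# illustrating the many-to-one mapping of 61 sense codons onto 20 amino
# acids that the tRNA gene set must implement.
CODON_TABLE = {
    "GGU": "Gly", "GGC": "Gly", "GGA": "Gly", "GGG": "Gly",  # 4 codons, 1 amino acid
    "GAU": "Asp", "GAC": "Asp",
    "UGG": "Trp",                # tryptophan: a single codon
}

# Because the map is many-to-one, losing the tRNA for one codon can
# sometimes be compensated by another tRNA that reads a related codon,
# as in the Pseudomonas fluorescens experiment described below.
for codon, aa in CODON_TABLE.items():
    print(codon, "->", aa)
```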

Evolution of a bacterial tRNA set in the lab

Jenna Gallie (Research Group Leader at the Max Planck Institute for Evolutionary Biology) and her team have investigated how the tRNA gene set of the bacterium Pseudomonas fluorescens can evolve, using a combination of mathematical modelling and lab-based experiments.

"We started by removing one type of tRNA from the bacterium's genome, resulting in a bacterial strain that grows slowly. We gave this slow-growing strain the opportunity to improve its growth during a real-time evolution experiment. We saw the strain improve repeatedly and rapidly. The improvement was due to the duplication of large chunks of bacterial genetic information, with each duplication containing a compensatory tRNA gene. Ultimately, the elimination of one tRNA type was compensated by an increase in the amount of a second, different tRNA type." Jenna Gallie said. The duplicated tRNA type can compensate because it is able to perform, at a lower rate, the codon-amino acid matching function of the eliminated tRNA type.

Read more at Science Daily

Seeing dark matter in a new light

A small team of astronomers has found a new way to 'see' the elusive dark matter haloes that surround galaxies, using a technique 10 times more precise than the previous best method. The work is published in Monthly Notices of the Royal Astronomical Society.

Scientists currently estimate that up to 85% of the mass in the universe is effectively invisible. This 'dark matter' cannot be observed directly, because it does not interact with light in the same way as the ordinary matter that makes up stars, planets, and life on Earth.

So how do we measure what cannot be seen? The key is to measure the effect of gravity that the dark matter produces.

Pol Gurri, the PhD student at Swinburne University of Technology who led the new research, explains: "It's like looking at a flag to try to know how much wind there is. You cannot see the wind, but the flag's motion tells you how strongly the wind is blowing."

The new research focuses on an effect called weak gravitational lensing, which is a feature of Einstein's general theory of relativity. "The dark matter will very slightly distort the image of anything behind it," says Associate Professor Edward Taylor, who was also involved in the research. "The effect is a bit like reading a newspaper through the base of a wine glass."

Weak gravitational lensing is already one of the most successful ways to map the dark matter content of the Universe. Now, the Swinburne team has used the ANU 2.3m Telescope in Australia to map how gravitationally lensed galaxies are rotating. "Because we know how stars and gas are supposed to move inside galaxies, we know roughly what that galaxy should look like," says Gurri. "By measuring how distorted the real galaxy images are, then we can figure out how much dark matter it would take to explain what we see."

The new research shows how this velocity information enables a much more precise measurement of the lensing effect than is possible using shape alone. "With our new way of seeing the dark matter," Gurri says, "we hope to get a clearer picture of where the dark matter is, and what role it plays in how galaxies form."
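
For a feel for the underlying geometry, the sketch below applies the standard weak-lensing distortion (the lensing Jacobian built from a convergence kappa and shear gamma) to a circular galaxy outline. The kappa and gamma values are arbitrary illustrative numbers; this is not the Swinburne team's analysis pipeline.

```python
import numpy as np

# Illustrative sketch only: how a weak gravitational lens distorts an image.
# kappa (convergence) and gamma1/gamma2 (shear) are made-up example values,
# not measurements from the Swinburne study.
kappa, gamma1, gamma2 = 0.02, 0.01, 0.005

# Lensing Jacobian: maps observed positions to true (source-plane) positions.
A = np.array([[1 - kappa - gamma1, -gamma2],
              [-gamma2, 1 - kappa + gamma1]])

# A circular galaxy isophote in the source plane...
theta = np.linspace(0, 2 * np.pi, 100)
circle = np.vstack([np.cos(theta), np.sin(theta)])

# ...appears as a slightly flattened, rotated ellipse after lensing.
image = np.linalg.inv(A) @ circle

# The axis ratio quantifies the distortion; for weak lensing it stays very
# close to 1, which is why shape-only measurements are noisy and why extra
# information, such as the galaxy's rotation, helps so much.
cov = np.cov(image)
eigvals = np.linalg.eigvalsh(cov)
print("axis ratio:", np.sqrt(eigvals.min() / eigvals.max()))
```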

Read more at Science Daily

Nov 6, 2020

Astronomers discover clues that unveil the mystery of fast radio bursts

 Fast radio bursts, or FRBs -- powerful, millisecond-duration radio waves coming from deep space outside the Milky Way Galaxy -- have been among the most mysterious astronomical phenomena ever observed. Since FRBs were first discovered in 2007, astronomers from around the world have used radio telescopes to trace the bursts and look for clues on where they come from and how they're produced.

UNLV astrophysicist Bing Zhang and international collaborators recently observed some of these mysterious sources, which led to a series of breakthrough discoveries reported in the journal Nature that may finally shed light into the physical mechanism of FRBs.

The first paper, for which Zhang is a corresponding author and leading theorist, was published in the Oct. 28 issue of Nature.

"There are two main questions regarding the origin of FRBs," said Zhang, whose team made the observation using the Five-hundred-meter Aperture Spherical Telescope (FAST) in Guizhou, China. "The first is what are the engines of FRBs and the second is what is the mechanism to produce FRBs. We found the answer to the second question in this paper."

Two competing theories have been proposed to interpret the mechanism of FRBs. One theory is that they're similar to gamma-ray bursts (GRBs), the most powerful explosions in the universe. The other theory likens them more to radio pulsars, which are spinning neutron stars that emit bright, coherent radio pulses. The GRB-like models predict a non-varying polarization angle within each burst whereas the pulsar-like models predict variations of the polarization angle.
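
To see why the polarization angle discriminates between the two classes of models, consider the rotating vector model, the textbook description of how a pulsar's polarization angle sweeps as the star rotates. The sketch below evaluates that standard formula for arbitrary example geometry; it is an illustration only, not the analysis performed with FAST.

```python
import numpy as np

# Rotating vector model (standard pulsar textbook formula): the
# polarization position angle (PA) swings as the magnetic pole sweeps
# past the line of sight. The geometry angles here are arbitrary examples.
alpha = np.deg2rad(30.0)   # magnetic inclination (assumed)
zeta = np.deg2rad(40.0)    # viewing angle to the spin axis (assumed)
phi = np.deg2rad(np.linspace(-30, 30, 61))  # rotation phase

pa = np.arctan2(np.sin(alpha) * np.sin(phi),
                np.sin(zeta) * np.cos(alpha)
                - np.cos(zeta) * np.sin(alpha) * np.cos(phi))

# A pulsar-like burst shows a PA that varies across the pulse;
# a GRB-like shock model would instead predict a flat PA curve.
print(np.round(np.rad2deg(pa[::10]), 1))
```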

The team used FAST to observe one repeating FRB source and discovered 11 bursts from it. Surprisingly, seven of the 11 bright bursts showed diverse polarization angle swings during each burst. The polarization angles not only varied in each burst, the variation patterns were also diverse among bursts.

"Our observations essentially rules out the GRB-like models and offers support to the pulsar-like models," said K.-J. Lee from the Kavli Institute for Astronomy and Astrophysics, Peking University, and corresponding author of the paper.

Four other papers on FRBs were published in Nature on Nov. 4. These include multiple research articles published by the FAST team led by Zhang and collaborators from the National Astronomical Observatories of China and Peking University. Researchers affiliated with the Canadian Hydrogen Intensity Mapping Experiment (CHIME) and the Survey for Transient Astronomical Radio Emission 2 (STARE2) group also partnered on the publications.

"Much like the first paper advanced our understanding of the mechanism behind FRBs, these papers solved the challenge of their mysterious origin," explained Zhang.

Magnetars are incredibly dense, city-sized neutron stars that possess the most powerful magnetic fields in the universe. Magnetars occasionally produce short X-ray or soft gamma-ray bursts through the dissipation of magnetic fields, so they have long been speculated to be plausible sources that power FRBs during high-energy bursts.

The first conclusive evidence of this came on April 28, 2020, when an extremely bright radio burst was detected from a magnetar sitting right in our backyard -- at a distance of about 30,000 light years from Earth in the Milky Way Galaxy. As expected, the FRB was associated with a bright X-ray burst.

"We now know that the most magnetized objects in the universe, the so-called magnetars, can produce at least some or possibly all FRBs in the universe," said Zhang.

The event was detected by CHIME and STARE2, two telescope arrays with many small radio telescopes that are suitable for detecting bright events from a large area of the sky.

Zhang's team has been using FAST to observe the magnetar source for some time. Unfortunately, when the FRB occurred, FAST was not looking at the source. Nonetheless, FAST made some intriguing "non-detection" discoveries and reported them in one of the Nov. 4 Nature articles. During the FAST observational campaign, there were another 29 X-ray bursts emitted from the magnetar. However, none of these bursts were accompanied by a radio burst.

"Our non-detections and the detections by the CHIME and STARE2 teams delineate a complete picture of FRB-magnetar associations," Zhang said.

To put it all into perspective, Zhang also worked with Nature to publish a single-author review of the various discoveries and their implications for the field of astronomy.

Read more at Science Daily

Has the hidden matter of the universe been discovered?

Astrophysicists estimate that around 40% of the ordinary matter that makes up stars, planets and galaxies remains undetected, concealed in the form of a hot gas in the complex cosmic web. Today, scientists at the Institut d'Astrophysique Spatiale (CNRS/Université Paris-Saclay) may have detected, for the first time, this hidden matter through an innovative statistical analysis of 20-year-old data. Their findings are published on November 6, 2020 in Astronomy & Astrophysics.

Galaxies are distributed throughout the Universe in the form of a complex network of nodes connected by filaments, which are in turn separated by voids. This is known as the cosmic web. The filaments are thought to contain almost all of the ordinary (so-called baryonic) matter of the Universe in the form of a diffuse, hot gas. However, the signal emitted by this diffuse gas is so weak that in reality 40 to 50% of the baryons go undetected.

It is these missing baryons, hidden in the filamentary structure of the cosmic web, that Nabila Aghanim, a researcher at the Institut d'Astrophysique Spatiale (CNRS/Université Paris-Saclay), and Hideki Tanimura, a post-doctoral researcher, together with their colleagues, are attempting to detect. In a new study, funded by the ERC ByoPiC project, they present a statistical analysis that reveals, for the first time, the X-ray emission from the hot baryons in filaments. This detection is based on the stacked X-ray signal, in the ROSAT survey data, from approximately 15,000 large-scale cosmic filaments identified in the SDSS galaxy survey. The team made use of the spatial correlation between the positions of the filaments and the associated X-ray emission to provide evidence of the presence of hot gas in the cosmic web, and, for the first time, to measure its temperature.
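
Stacking is a simple but powerful idea: the X-ray signal from any single filament is buried in noise, but averaging the map values at the positions of thousands of filaments suppresses the noise roughly as 1/sqrt(N). A toy sketch with synthetic data (not the ROSAT analysis):

```python
import numpy as np

# Toy stacking demo (synthetic data, not the ROSAT analysis): a faint
# per-filament signal hidden in noise becomes detectable when averaged
# over many positions, with noise shrinking roughly as 1/sqrt(N).
rng = np.random.default_rng(0)
true_signal = 0.05          # faint per-filament signal (arbitrary units)
noise_level = 1.0
n_filaments = 15000         # order of the number of filaments stacked

measurements = true_signal + noise_level * rng.standard_normal(n_filaments)
stacked = measurements.mean()
uncertainty = noise_level / np.sqrt(n_filaments)

print(f"stacked signal: {stacked:.4f} +/- {uncertainty:.4f}")
print(f"significance: {stacked / uncertainty:.1f} sigma")
```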

These findings confirm earlier analyses by the same research team, based on indirect detection of hot gas in the cosmic web through its effect on the cosmic microwave background. This paves the way for more detailed studies, using better quality data, to test the evolution of gas in the filamentary structure of the cosmic web.

Read more at Science Daily

Final dance of unequal black hole partners

 Solving the equations of general relativity for colliding black holes is no simple matter.

Physicists began using supercomputers to obtain solutions to this famously hard problem back in the 1960s. In 2000, with no solutions in sight, Kip Thorne, 2017 Nobel Laureate and one of the designers of LIGO, famously bet that there would be an observation of gravitational waves before a numerical solution was reached.

He lost that bet when, in 2005, Carlos Lousto, then at The University of Texas at Brownsville, and his team generated a solution using the Lonestar supercomputer at the Texas Advanced Computing Center. (Concurrently, groups at NASA and Caltech derived independent solutions.)

In 2015, when the Laser Interferometer Gravitational-Wave Observatory (LIGO) first observed such waves, Lousto was in shock.

"It took us two weeks to realize this was really from nature and not from inputting our simulation as a test," said Lousto, now a professor of mathematics at Rochester Institute of Technology (RIT). "The comparison with our simulations was so obvious. You could see with your bare eyes that it was the merger of two black holes."

Lousto is back again with a new numerical relativity milestone, this time simulating merging black holes where the ratio of the mass of the larger black hole to the smaller one is 128 to 1 -- a scientific problem at the very limit of what is computationally possible. His secret weapon: the Frontera supercomputer at TACC, the eighth most powerful supercomputer in the world and the fastest at any university.

His research with collaborator James Healy, supported by the National Science Foundation (NSF), was published in Physical Review Letters this week. It may require decades to confirm the results experimentally, but nonetheless it serves as a computational achievement that will help drive the field of astrophysics forward.

"Modeling pairs of black holes with very different masses is very computational demanding because of the need to maintain accuracy in a wide range of grid resolutions," said Pedro Marronetti, program director for gravitational physics at NSF. "The RIT group has performed the world's most advanced simulations in this area, and each of them takes us closer to understanding observations that gravitational-wave detectors will provide in the near future."

LIGO is only able to detect gravitational waves caused by small and intermediate mass black holes of roughly equal size. It will take observatories 100 times more sensitive to detect the type of mergers Lousto and Healy have modeled. Their findings show not only what the gravitational waves caused by a 128:1 merger would look like to an observer on Earth, but also characteristics of the ultimate merged black hole including its final mass, spin, and recoil velocity. These led to some surprises.

"These merged black holes can have speeds much larger than previously known," Lousto said. "They can travel at 5,000 kilometers per second. They kick out from a galaxy and wander around the universe. That's another interesting prediction."

The researchers also computed the gravitational waveforms -- the signal that would be perceived near Earth -- for such mergers, including their peak frequency, amplitude, and luminosity. Comparing those values with predictions from existing scientific models, their simulations were within 2 percent of the expected results.

Previously, the largest mass ratio that had ever been solved with high precision was 16 to 1 -- eight times less extreme than Lousto's simulation. The challenge of simulating larger mass ratios is that the dynamics of the interacting system must be resolved accurately across a much wider range of scales.

Like computational scientists in many fields, Lousto uses a method called adaptive mesh refinement to model the dynamics of the interacting black holes precisely. It involves placing the black holes, the space between them, and the distant observer (us) on a grid or mesh, and refining the areas of the mesh with greater detail where it is needed.
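
As a rough illustration of the idea, adaptive mesh refinement keeps a coarse grid everywhere and recursively subdivides only the cells where the solution changes rapidly. The sketch below is a generic 1D toy, not the RIT group's numerical-relativity code:

```python
import numpy as np

# Generic 1D adaptive-mesh-refinement sketch (illustration only, not the
# RIT group's code): refine cells where the field varies rapidly, so
# resolution concentrates near the sharp features ("black holes").
def refine(cells, field, tol, max_depth, depth=0):
    out = []
    for (a, b) in cells:
        mid = 0.5 * (a + b)
        # Estimate how much the field varies across this cell.
        if depth < max_depth and abs(field(b) - field(a)) > tol:
            # Split the cell in two and recurse on each half.
            out += refine([(a, mid), (mid, b)], field, tol, max_depth, depth + 1)
        else:
            out.append((a, b))
    return out

# A sharply peaked field standing in for strong curvature near a black hole.
field = lambda x: np.exp(-200.0 * (x - 0.3) ** 2)

coarse = [(i / 8, (i + 1) / 8) for i in range(8)]
mesh = refine(coarse, field, tol=0.05, max_depth=6)
print(len(mesh), "cells; smallest:", min(b - a for a, b in mesh))
```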

Lousto's team approached the problem with a methodology that he compares to Zeno's first paradox. By repeatedly halving the mass of the smaller black hole while adding internal grid refinement levels, they worked their way up from 32:1 mass ratios to 128:1 binary systems that undergo 13 orbits before merger. On Frontera, the simulation required seven months of constant computation.

"Frontera was the perfect tool for the job," Lousto said. "Our problem requires high performance processors, communication, and memory, and Frontera has all three."

The simulation isn't the end of the road. Black holes can have a variety of spins and configurations, which impact the amplitude and frequency of the gravitational waves their merger produces. Lousto would like to solve the equations 11 more times to get a good first range of possible "templates" to compare with future detections.

The results will help the designers of future Earth- and space-based gravitational wave detectors plan their instruments. These include advanced, third-generation ground-based gravitational wave detectors and the Laser Interferometer Space Antenna (LISA), which is targeted for launch in the mid-2030s.

The research may also help answer fundamental mysteries about black holes, such as how some can grow so big -- millions of times the mass of the Sun.

Read more at Science Daily

About half of Sun-like stars could host rocky, potentially habitable planets

Since astronomers confirmed the presence of planets beyond our solar system, called exoplanets, humanity has wondered how many could harbor life. Now, we're one step closer to finding an answer. According to new research using data from NASA's retired planet-hunting mission, the Kepler space telescope, about half the stars similar in temperature to our Sun could have a rocky planet capable of supporting liquid water on its surface.

Our galaxy holds at least an estimated 300 million of these potentially habitable worlds, based on even the most conservative interpretation of the results in a new study to be published in The Astronomical Journal. Some of these exoplanets could even be our interstellar neighbors, with at least four potentially within 30 light-years of our Sun and the closest likely to be at most about 20 light-years from us. These are the minimum numbers of such planets based on the most conservative estimate that 7% of Sun-like stars host such worlds. However, at the average expected rate of 50%, there could be many more.
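
The headline numbers can be reproduced with simple arithmetic. Taking the article's figures at face value, the conservative 7% rate implying at least 300 million worlds pins down the number of Sun-like stars behind the extrapolation, and the 50% rate then scales the planet count accordingly. A quick sketch (an illustration of the arithmetic only, not the paper's calculation):

```python
# Back-of-the-envelope check of the article's numbers (illustration only).
min_planets_conservative = 300e6  # "at least 300 million", at the 7% rate
conservative_rate = 0.07
expected_rate = 0.50

# Implied number of Sun-like stars behind the extrapolation (~4.3 billion).
sunlike_stars = min_planets_conservative / conservative_rate
print(f"implied Sun-like stars: {sunlike_stars:.2e}")

# At the average expected rate of 50%, the same stars would host far more
# rocky, potentially habitable worlds (~2.1 billion).
print(f"planets at 50% rate: {expected_rate * sunlike_stars:.2e}")
```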

This research helps us understand the potential for these planets to have the elements to support life. This is an essential part of astrobiology, the study of life's origins and future in our universe.

The study is authored by NASA scientists who worked on the Kepler mission alongside collaborators from around the world. NASA retired the space telescope in 2018 after it ran out of fuel. Nine years of the telescope's observations revealed that there are billions of planets in our galaxy -- more planets than stars.

"Kepler already told us there were billions of planets, but now we know a good chunk of those planets might be rocky and habitable," said the lead author Steve Bryson, a researcher at NASA's Ames Research Center in California's Silicon Valley. "Though this result is far from a final value, and water on a planet's surface is only one of many factors to support life, it's extremely exciting that we calculated these worlds are this common with such high confidence and precision."

For the purposes of calculating this occurrence rate, the team looked at exoplanets with radii between 0.5 and 1.5 times that of Earth, narrowing in on planets that are most likely rocky. They also focused on stars similar to our Sun in age and temperature, within about 1,500 degrees Fahrenheit either way.

That's a wide range of different stars, each with its own particular properties impacting whether the rocky planets in its orbit are capable of supporting liquid water. These complexities are partly why it is so difficult to calculate how many potentially habitable planets are out there, especially when even our most powerful telescopes can just barely detect these small planets. That's why the research team took a new approach.

Rethinking How to Identify Habitability

This new finding is a significant step forward in Kepler's original mission to understand how many potentially habitable worlds exist in our galaxy. Previous estimates of the frequency, also known as the occurrence rate, of such planets ignored the relationship between the star's temperature and the kinds of light given off by the star and absorbed by the planet.

The new analysis accounts for these relationships, and provides a more complete understanding of whether or not a given planet might be capable of supporting liquid water, and potentially life. That approach is made possible by combining Kepler's final dataset of planetary signals with data about each star's energy output from an extensive trove of data from the European Space Agency's Gaia mission.

"We always knew defining habitability simply in terms of a planet's physical distance from a star, so that it's not too hot or cold, left us making a lot of assumptions," said Ravi Kopparapu, an author on the paper and a scientist at NASA's Goddard Space Flight Center in Greenbelt, Maryland. "Gaia's data on stars allowed us to look at these planets and their stars in an entirely new way."

Gaia provided information about the amount of energy that falls on a planet from its host star, based on the star's flux -- the amount of energy it emits through a given area over a given time. This allowed the researchers to approach their analysis in a way that acknowledged the diversity of the stars and solar systems in our galaxy.

"Not every star is alike," said Kopparapu. "And neither is every planet."

Though the exact effect is still being researched, a planet's atmosphere figures into how much light is needed to allow liquid water on a planet's surface as well. Using a conservative estimate of the atmosphere's effect, the researchers estimated an occurrence rate of about 50% -- that is, about half of Sun-like stars have rocky planets capable of hosting liquid water on their surfaces. An alternative optimistic definition of the habitable zone estimates about 75%.

Kepler's Legacy Charts Future Research

This result builds upon a long legacy of work of analyzing Kepler data to obtain an occurrence rate and sets the stage for future exoplanet observations informed by how common we now expect these rocky, potentially habitable worlds to be. Future research will continue to refine the rate, informing the likelihood of finding these kinds of planets and feeding into plans for the next stages of exoplanet research, including future telescopes.

"Knowing how common different kinds of planets are is extremely valuable for the design of upcoming exoplanet-finding missions," said co-author Michelle Kunimoto, who worked on this paper after finishing her doctorate on exoplanet occurrence rates at the University of British Columbia, and recently joined the Transiting Exoplanet Survey Satellite, or TESS, team at the Massachusetts Institute of Technology in Cambridge, Massachusetts. "Surveys aimed at small, potentially habitable planets around Sun-like stars will depend on results like these to maximize their chance of success."

After revealing more than 2,800 confirmed planets outside our solar system, the data collected by the Kepler space telescope continues to yield important new discoveries about our place in the universe. Though Kepler's field of view covered only 0.25% of the sky, the area that would be covered by your hand if you held it up at arm's length towards the sky, its data has allowed scientists to extrapolate what the mission's data means for the rest of the galaxy. That work continues with TESS, NASA's current planet hunting telescope.

Read more at Science Daily

Nov 5, 2020

Clay subsoil at Earth's driest place may signal life on Mars

 Earth's most arid desert may hold a key to finding life on Mars.

Diverse microbes discovered in the clay-rich, shallow soil layers in Chile's dry Atacama Desert suggest that similar deposits below the Martian surface may contain microorganisms, which could be easily found by future rover missions or landing craft.

Led by Cornell University and Spain's Centro de Astrobiología, scientists now offer a planetary primer to identifying microbial markers on shallow rover digs in Martian clay, in their work published Nov. 5 in Scientific Reports.

In that dry Atacama environment, the scientists found layers of wet clay about a foot below the surface.

"The clays are inhabited by microorganisms," said corresponding author Alberto G. Fairén, a visiting scientist in the Department of Astronomy at Cornell University. "Our discovery suggests that something similar may have occurred billions of years ago -- or it still may be occurring -- on Mars."

If microbes existed on Mars in the past, their biomarkers likely would be preserved there, Fairén said. "If microbes still exist today," he said, "the latest possible Martian life still may be resting there."

The red planet will see rovers cruising across the surface there in the next few years. NASA's rover Perseverance will land on Mars in February 2021; Europe's Rosalind Franklin rover will arrive in 2023. Both of those missions will seek microbial biomarkers in the clay below the planet's surface.

"This paper helps guide the search," Fairén said, "to inform where we should look and which instruments to use on a search for life."

In the Yungay region of the Atacama Desert, the scientists found that the clay layer, a previously unreported habitat for microbial life, is inhabited by at least 30 species of salt-loving, metabolically active bacteria and archaea (single-celled organisms).

The researchers' Atacama discovery reinforces the notion that early Mars may have had a similar subsurface with protected habitable niches, particularly during the first billion years of its history.

"That's why clays are important," he said. "They preserve organic compounds and biomarkers extremely well and they are abundant on Mars."

Read more at Science Daily

Minor fluctuations in sound make it hard to identify in which concert hall music is played

 The volume and timbre of music have a significant impact on how people perceive the acoustics in a concert hall, according to two recent studies carried out by the research group of Aalto University Professor Tapio Lokki. Both have been published in the Journal of the Acoustical Society of America.

The first study demonstrated that, based on the music alone, it is difficult to distinguish which concert hall a piece of music is being played in. The test subjects listened to recordings of a single violin and of part of Beethoven's Seventh Symphony, all played in four different concert halls: the rectangular Concertgebouw in Amsterdam and Herkulessaal in Munich, the vineyard-shaped Berlin Philharmonic, and the fan-shaped Cologne Philharmonic. They first heard a sample from the reference location, after which they tried to identify the reference location from four music samples.

It was easier to identify the hall when the same music sample was played in each concert hall. If the reference sample was from a slightly different part of the same piece of music than the one used in the control locations, it was harder to identify the hall.

'Even small differences in the music listened to made it very difficult to identify concert halls of similar size and architecture. Halls that were very different to each other were clearly more easily identified regardless of the music,' explains postdoctoral researcher Antti Kuusinen.

Another study showed that the acoustics of a concert hall are experienced differently depending on the volume at which the orchestra is playing. The concert halls used in the study were Helsinki's Musiikkitalo, the Munich Herkulessaal, the Berlin Philharmonic and the Berlin Konzerthaus.

The subjects listened to the orchestra playing at different volume levels, from the quietest piano pianissimo to the strongest forte fortissimo, after which they placed the concert halls in order according to how loud and enveloping they experienced the music to be. The order of the concert halls changed in some cases depending on the volume of the music.

'Traditionally, the acoustics of concert halls are studied by using objective measurements to calculate acoustic parameters, such as reverberation time, which is independent of the characteristics or dynamics of the music. Our research clearly shows that this is insufficient for understanding the acoustics in its entirety, because both the timbre of the sound and the listeners' perceptions shift as the volume changes,' Lokki explains.
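
Reverberation time, the objective parameter Lokki mentions, is classically estimated with Sabine's formula, RT60 ≈ 0.161 V / A, where V is the hall volume in cubic metres and A the total absorption in square-metre sabins. Notice that the music itself appears nowhere in the formula, which is exactly the limitation the Aalto results highlight. A quick sketch with made-up hall figures:

```python
# Sabine's reverberation-time formula (standard room acoustics, not a
# result from the Aalto studies). The hall figures below are made up.
def rt60_sabine(volume_m3, absorption_m2_sabins):
    """Time for sound to decay by 60 dB, in seconds."""
    return 0.161 * volume_m3 / absorption_m2_sabins

# A large hall (~20,000 m^3) with ~1,600 m^2 sabins of absorption gives
# roughly 2 seconds, typical of a symphony hall -- independent of what
# the orchestra plays or how loudly it plays it.
print(f"RT60 = {rt60_sabine(20000, 1600):.2f} s")
```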

Lokki's research group has also previously studied how the acoustics of concert halls affect the emotional reactions evoked by the music. The studies indicated that halls with acoustics that support large dynamic fluctuations evoke the strongest emotional experiences in listeners.

Read more at Science Daily

Technique to regenerate optic nerve offers hope for future glaucoma treatment

 Scientists have used gene therapy to regenerate damaged nerve fibres in the eye, in a discovery that could aid the development of new treatments for glaucoma, one of the leading causes of blindness worldwide.

Axons -- nerve fibres -- in the adult central nervous system (CNS) do not normally regenerate after injury and disease, meaning that damage is often irreversible. However, over the past decade there have been a number of discoveries that suggest it may be possible to stimulate regeneration.

In a study published today in Nature Communications, scientists tested whether the gene responsible for the production of a protein known as Protrudin could stimulate the regeneration of nerve cells and protect them from cell death after an injury.

The team, led by Dr Richard Eva, Professor Keith Martin and Professor James Fawcett from the John van Geest Centre for Brain Repair at the University of Cambridge, used a cell culture system to grow brain cells in a dish. They then injured their axons using a laser and analysed the response to this injury using live-cell microscopy. The researchers found that increasing the amount or activity of Protrudin in these nerve cells vastly increased their ability to regenerate.

Nerve cells in the retina, known as retinal ganglion cells, extend their axons from the eye to the brain through the optic nerve in order to relay and process visual information. To investigate whether Protrudin might stimulate repair in the injured CNS in an intact organism, the researchers used a gene therapy technique to increase the amount and activity of Protrudin in the eye and optic nerve. When they measured the amount of regeneration a few weeks after a crush injury to the optic nerve, the team found that Protrudin had enabled the axons to regenerate over large distances. They also found that the retinal ganglion cells were protected from cell death.

The researchers showed that this technique may help protect against glaucoma, a common eye condition. In glaucoma, the optic nerve that connects the eye to the brain is progressively damaged, often in association with elevated pressure inside the eye. If not diagnosed early enough, glaucoma can lead to loss of vision. In the UK, around one in 50 people over the age of 40, and one in ten people over the age of 75, is affected by glaucoma.

To demonstrate this protective effect of Protrudin against glaucoma, the researchers took a whole retina from a mouse eye and grew it in a cell-culture dish. Usually around half of the retinal neurons die within three days of retinal removal, but the researchers found that increasing or activating Protrudin led to almost complete protection of retinal neurons.

Dr Veselina Petrova from the Department of Clinical Neurosciences at the University of Cambridge, the study's first author, said: "Glaucoma is one of the leading causes of blindness worldwide. The causes of glaucoma are not completely understood, but there is currently a large focus on identifying new treatments by preventing nerve cells in the retina from dying, as well as trying to repair vision loss through the regeneration of diseased axons through the optic nerve.

"Our strategy relies on using gene therapy -- an approach already in clinical use -- to deliver Protrudin into the eye. It's possible our treatment could be further developed as a way of protecting retinal neurons from death, as well as stimulating their axons to regrow. It's important to point out that these findings would need further research to see if they could be developed into effective treatments for humans."

Protrudin normally resides within the endoplasmic reticulum, a network of tiny structures within our cells. In this study, the team showed that the endoplasmic reticulum found in axons appears to provide materials and other cellular structures important for growth and survival in order to support the process of regeneration after injury. Protrudin stimulates the transport of these materials to the site of injury.

Dr Petrova added: "Nerve cells in the central nervous system lose the ability to regenerate their axons as they mature, so have very limited capacity for regrowth. This means that injuries to the brain, spinal cord and optic nerve have life-altering consequences.

"The optic nerve injury model is often used to investigate new treatments for stimulating CNS axon regeneration, and treatments identified this way often show promise in the injured spinal cord. It's possible that increased or activated Protrudin might be used to boost regeneration in the injured spinal cord."

Read more at Science Daily

Human intelligence just got less mysterious

 Neuroscience experts from the University of Leicester have released research that breaks with the past fifty years of neuroscientific opinion, arguing that the way we store memories is key to making human intelligence superior to that of animals.

It has previously been thought and copiously published that it is 'pattern separation' in the hippocampus, an area of the brain critical for memory, that enables memories to be stored by separate groups of neurons, so that memories don't get mixed up.

Now, after fifteen years of research, Leicester University's Director of Systems Neuroscience believes that in fact the opposite of pattern separation is present in the human hippocampus. He argues that, contrary to what has been described in animals, the same group of neurons stores all memories. The consequences of this are far-reaching, as such a neuronal representation, devoid of specific contextual details, explains the abstract thinking that characterizes human intelligence.

Leicester University's Director of Systems Neuroscience Professor Rodrigo Quian Quiroga explains,

"In contrast to what everybody expects, when recording the activity of individual neurons we have found that there is an alternative model to pattern separation storing our memories.

"Pattern separation is a basic principle of neuronal coding that precludes memory interference in the hippocampus. Its existence is supported by numerous theoretical, computational and experimental findings in different animal species but these findings have never been directly replicated in humans. Previous human studies have been mostly obtained using Functional Magnetic Resource Imagining (fMRI), which doesn't allow recording the activity of individual neurons. Shockingly, when we directly recorded the activity of individual neurons, we found something completely different to what has been described in other animals. This could well be a cornerstone of human's intelligence."

The study, 'No pattern separation in the human hippocampus', argues that the lack of pattern separation in memory coding is a key difference compared to other species, with profound implications that could explain cognitive abilities uniquely developed in humans, such as our powers of generalization and creative thought.

Read more at Science Daily

Early big-game hunters of the Americas were female, researchers suggest

For centuries, historians and scientists mostly agreed that when early human groups sought food, men hunted and women gathered. However, a 9,000-year-old female hunter burial in the Andes Mountains of South America reveals a different story, according to new research conducted at the University of California, Davis.

"An archaeological discovery and analysis of early burial practices overturns the long-held 'man-the-hunter' hypothesis," said Randy Haas, assistant professor of anthropology and the lead author of the study, "Female Hunters of the Early Americas." It was published today (Nov. 4) in Science Advances.

"We believe that these findings are particularly timely in light of contemporary conversations surrounding gendered labor practices and inequality," he added. "Labor practices among recent hunter-gatherer societies are highly gendered, which might lead some to believe that sexist inequalities in things like pay or rank are somehow 'natural.' But it's now clear that sexual division of labor was fundamentally different -- likely more equitable -- in our species' deep hunter-gatherer past."

In 2018, during archaeological excavations at a high-altitude site called Wilamaya Patjxa in what is now Peru, researchers found an early burial that contained a hunting toolkit with projectile points and animal-processing tools. The objects accompanying people in death tend to be those that accompanied them in life, researchers said. It was determined that the hunter was likely female based on findings by the team's osteologist, James Watson of The University of Arizona. Watson's sex estimate was later confirmed by dental protein analysis conducted by UC Davis postdoctoral researcher Tammy Buonasera and Glendon Parker, an adjunct associate professor.

Revealing a broader pattern

The surprising discovery of an early female hunter burial led the team to ask whether she was part of a broader pattern of female hunters or merely a one-off. Looking at published records of late Pleistocene and early Holocene burials throughout North and South America, the researchers identified 429 individuals from 107 sites. Of those, 27 individuals were associated with big-game hunting tools -- 11 were female and 15 were male. The sample was sufficient to "warrant the conclusion that female participation in early big-game hunting was likely nontrivial," researchers said. Moreover, the analysis identified the Wilamaya Patjxa female hunter as the earliest hunter burial in the Americas.

Statistical analysis shows that somewhere between 30 and 50 percent of hunters in these populations were female, the study said. This level of participation stands in stark contrast to recent hunter-gatherers, and even to farming and capitalist societies, where hunting is a decidedly male activity with low levels of female participation, certainly under 30 percent, Haas explained.
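
The 30-to-50-percent range is the kind of interval you get from treating the 11 female hunters out of 26 sexed individuals as a binomial sample. A minimal sketch using a standard beta credible interval follows; it illustrates the style of inference only, not the paper's exact statistical model:

```python
from scipy import stats

# Of the burials associated with big-game hunting tools, 11 were female
# and 15 male. Treating this as a binomial sample gives an interval for
# the underlying share of female hunters. Illustration only; the study's
# published model differs in its details.
females, males = 11, 15
n = females + males

# Posterior for the female fraction under a flat Beta(1, 1) prior.
posterior = stats.beta(1 + females, 1 + males)
low, high = posterior.interval(0.95)
print(f"point estimate: {females / n:.2f}")
print(f"95% credible interval: {low:.2f} to {high:.2f}")
```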

The study was conducted in collaboration with multiple UC Davis labs. Parker, a forensic expert in the Department of Environmental Toxicology, helped determine sex through a proteomic technique he recently developed. In Professor Jelmer Eerkens' lab, Jenny Chen, an undergraduate researcher at the time of the study, discovered the distinct isotopic signature of meat consumption in the bones, further supporting the conclusion that the Wilamaya Patjxa female was a hunter.

While the research answers an old question about sexual division of labor in human societies, it also raises some new ones. The team now wishes to understand how sexual division of labor and its consequences in different times and places changed among hunter-gatherer populations in the Americas.

Read more at Science Daily

Nov 4, 2020

Supersonic winds, rocky rains forecasted on lava planet

Among the most extreme planets discovered beyond the edges of our solar system are lava planets: fiery hot worlds that circle so close to their host star that some regions are likely oceans of molten lava. According to scientists from McGill University, York University, and the Indian Institute of Science Education and Research, the atmosphere and weather cycle of at least one such exoplanet is even stranger, featuring the evaporation and precipitation of rocks, supersonic winds that rage at over 5,000 km/hour, and a magma ocean 100 km deep.

In a study published in Monthly Notices of the Royal Astronomical Society, the scientists use computer simulations to predict the conditions on K2-141b, an Earth-size exoplanet with a surface, ocean, and atmosphere all made up of the same ingredients: rocks. The extreme weather forecasted by their analysis could permanently change the surface and atmosphere of K2-141b over time.

"The study is the first to make predictions about weather conditions on K2-141b that can be detected from hundreds of light years away with next-generation telescopes such as the James Webb Space Telescope," says lead author Giang Nguyen, a PhD student at York University who worked under the supervision of McGill University Professor Nicolas Cowan on the study.

Two-thirds of the exoplanet faces endless daylight

In analyzing the illumination pattern of the exoplanet, the team discovered that about two-thirds of K2-141b faces perpetual daylight -- rather than the illuminated hemisphere we are used to on Earth. K2-141b belongs to a subset of rocky planets that orbit very close to their star. This proximity keeps the exoplanet gravitationally locked in place, meaning the same side always faces the star.

The night side experiences frigid temperatures of below -200 C. The day side of the exoplanet, at an estimated 3000 C, is hot enough to not only melt rocks but vaporize them as well, ultimately creating a thin atmosphere in some areas. "Our finding likely means that the atmosphere extends a little beyond the shore of the magma ocean, making it easier to spot with space telescopes," says Nicolas Cowan, a professor in the Department of Earth & Planetary Sciences at McGill University.

Like Earth's water cycle, only with rocks

Remarkably, the rock vapour atmosphere created by the extreme heat undergoes precipitation. Just like the water cycle on Earth, where water evaporates, rises into the atmosphere, condenses, and falls back as rain, so too does the sodium, silicon monoxide, and silicon dioxide on K2-141b. On Earth, rain flows back into the oceans, where it will once more evaporate and the water cycle is repeated. On K2-141b, the mineral vapour formed by evaporated rock is swept to the frigid night side by supersonic winds and rocks "rain" back down into a magma ocean. The resulting currents flow back to the hot day side of the exoplanet, where rock evaporates once more.

Still, the cycle on K2-141b is not as stable as the one on Earth, say the scientists. The return flow of the magma ocean to the day side is slow, and as a result they predict that the mineral composition will change over time -- eventually changing the very surface and atmosphere of K2-141b.

"All rocky planets, including Earth, started off as molten worlds but then rapidly cooled and solidified. Lava planets give us a rare glimpse at this stage of planetary evolution," says Professor Cowan of the Department of Earth and Planetary Sciences.

Read more at Science Daily

Detection of a short, intense radio burst in Milky Way

 New data from a Canadian-led team of astronomers, including researchers from the McGill Space Institute and McGill University Department of Physics, strongly suggest that magnetars -- a type of neutron star believed to have an extremely powerful magnetic field -- could be the source of some fast radio bursts (FRBs). Though much research has been done to explain the mysterious phenomenon, their source has thus far remained elusive and the subject of some debate.

First detection of an intense radio burst from a Galactic magnetar

On 28 April 2020, a team of approximately 50 students, postdocs and professors from the Canadian Hydrogen Intensity Mapping Experiment (CHIME) Fast Radio Burst Collaboration detected an unusually intense radio burst emanating from a nearby magnetar located in the Milky Way. In a study published today in Nature, they show that the intensity of the radio burst was three thousand times greater than that of any magnetar measured thus far, lending weight to the theory that magnetars are at the origin of at least some FRBs.

"We calculated that such an intense burst coming from another galaxy would be indistinguishable from some fast radio bursts, so this really gives weight to the theory suggesting that magnetars could be behind at least some FRBs," said Pragya Chawla, one of the co-authors on the study and a senior PhD student in the Physics Department at McGill.

Competing theories about the origins of FRBs

FRBs were first discovered over a decade ago. Originally thought to be singular events, some of these high-intensity blasts of radio emission -- more intense than the energy generated by the Sun over millions to billions of years -- have since been found to repeat.

One theory hypothesized that FRBs come from extragalactic magnetars -- young, extremely magnetic neutron stars that occasionally flare to release enormous amounts of energy.

"So far, all of the FRBs that telescopes like CHIME have picked up were in other galaxies, which makes them quite hard to study in great detail," said Ziggy Pleunis, a senior PhD student in McGill's Physics department and one of the co-authors of the new study. "Moreover, the magnetar theory was not supported by observations of magnetars in our own galaxy as they were found to be far less intense than the energy released by extragalactic FRBs until now."

Magnetar origin for all FRBs remains to be confirmed

"However, given the large gaps in energetics and activity between the brightest and most active FRB sources and what is observed for magnetars, perhaps younger, more energetic and active magnetars are needed to explain all FRB observations," added Dr. Paul Scholz from the Dunlap Institute of Astronomy and Astrophysics at the University of Toronto.

Read more at Science Daily

Luminescent wood could light up homes of the future

 The right indoor lighting can help set the mood, from a soft romantic glow to bright, stimulating colors. But some materials used for lighting, such as plastics, are not eco-friendly. Now, researchers reporting in ACS Nano have developed a bio-based, luminescent, water-resistant wood film that could someday be used as cover panels for lamps, displays and laser devices.

Consumer demand for eco-friendly, renewable materials has driven researchers to investigate wood-based thin films for optical applications. However, many materials developed so far have drawbacks, such as poor mechanical properties, uneven lighting, a lack of water resistance or the need for a petroleum-based polymer matrix. Qiliang Fu, Ingo Burgert and colleagues wanted to develop a luminescent wood film that could overcome these limitations.

The researchers treated balsa wood with a solution to remove lignin and about half of the hemicelluloses, leaving behind a porous scaffold. The team then infused the delignified wood with a solution containing quantum dots -- semiconductor nanoparticles that glow in a particular color when struck by ultraviolet (UV) light. After compressing and drying, the researchers applied a hydrophobic coating. The result was a dense, water-resistant wood film with excellent mechanical properties. Under UV light, the quantum dots in the wood emitted and scattered an orange light that spread evenly throughout the film's surface. The team demonstrated the ability of a luminescent panel to light up the interior of a toy house. Different types of quantum dots could be incorporated into the wood film to create various colors of lighting products, the researchers say.

From Science Daily

Understanding the spread of infectious diseases

 Scientists worldwide have been working flat out on research into infectious diseases in the wake of the global outbreak of the COVID-19 disease, caused by the new coronavirus SARS-CoV-2. This concerns not only virologists, but also physicists, who are developing mathematical models to describe the spread of epidemics. Such models are important for testing the effects of various measures designed to contain the disease -- such as face masks, closing public buildings and businesses, and the familiar one of social distancing. These models often serve as a basis for political decisions and underline the justification for any measures taken.

Physicists Michael te Vrugt, Jens Bickmann and Prof. Raphael Wittkowski from the Institute of Theoretical Physics and the Center for Soft Nanoscience at the University of Münster have developed a new model showing the spread of infectious diseases. The working group led by Raphael Wittkowski is studying Statistical Physics, i.e. the description of systems consisting of a large number of particles. In their work, the physicists also use dynamical density functional theory (DDFT), a method developed in the 1990s which enables interacting particles to be described.

At the beginning of the corona pandemic, they realised that the same method is useful for describing the spread of diseases. "In principle, people who observe social distancing can be modelled as particles which repel one another because they have, for example, the same electrical charge," explains lead author Michael te Vrugt. "So perhaps theories describing particles which repel one another might be applicable to people keeping their distance from one another," he adds. Based on this idea, they developed the so-called SIR-DDFT model, which combines the SIR model (a well-known theory describing the spread of infectious diseases) with DDFT. The resulting theory describes people who can infect one another but who keep their distance. "The theory also makes it possible to describe hotspots with infected people, which improves our understanding of the dynamics of so-called super-spreader events earlier this year such as the carnival celebrations in Heinsberg or the après-ski in Ischgl," adds co-author Jens Bickmann. The results of the study have been published in the journal "Nature Communications".

The extent of the social distancing being practised is then defined by the strength of the repulsive interactions. "As a result," explains Raphael Wittkowski, the leader of the study, "this theory can also be used to test the effects of social distancing by simulating an epidemic and varying the values for the parameters defining the strength of the interactions." The simulations show that the infection rates do indeed show a marked decrease that is a result of social distancing. The model thus reproduces the familiar "flattening the curve" effect, in which the curve depicting the development of the number of infected people over time becomes much flatter as a result of social distancing. In comparison with existing theories, the new model has the advantage that the effects of social interactions can be explicitly modelled.
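
For readers unfamiliar with the SIR ingredient of the model, the sketch below integrates the classical well-mixed SIR equations and shows how lowering the effective contact rate flattens the epidemic curve. The published SIR-DDFT theory additionally resolves space and repulsive "social distancing" interactions, which this toy example does not attempt; all parameter values are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Plain (well-mixed) SIR model -- the classical ingredient of the
# SIR-DDFT theory. Parameter values are illustrative only.
def sir(t, y, beta, gamma):
    S, I, R = y
    return [-beta * S * I, beta * S * I - gamma * I, gamma * I]

y0 = [0.999, 0.001, 0.0]  # initial susceptible/infected/recovered fractions
t = np.linspace(0, 120, 241)

for beta in (0.35, 0.20):  # lower effective contact rate ~ more distancing
    sol = solve_ivp(sir, (0, 120), y0, args=(beta, 0.1), t_eval=t)
    peak = sol.y[1].max()
    print(f"beta={beta}: peak infected fraction = {peak:.2%}")
# The smaller beta yields a visibly lower, later peak: the
# "flattening the curve" effect the simulations reproduce.
```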

From Science Daily

'Monster tumors' could offer new glimpse at human development

 Finding just the right model to study human development -- from the early embryonic stage onward -- has been a challenge for scientists over the last decade. Now, bioengineers at the University of California San Diego have homed in on an unusual candidate: teratomas.

Teratomas -- from the Greek for 'monstrous tumors' -- are tumors made up of different tissues such as bone, brain, hair and muscle. They form when a mass of stem cells differentiates uncontrollably, forming all types of tissues found in the body. Teratomas are generally considered an undesired byproduct of stem cell research, but UC San Diego researchers saw an opportunity to study them as a model for human development.

Researchers report their work in a paper published Nov. 4 in Cell.

"We've been fascinated with the teratoma for quite a while," said Prashant Mali, a professor of bioengineering at the UC San Diego Jacobs School of Engineering. "Not only is the teratoma an intriguing tumor to look at in terms of the diversity of cell types, but it also has regions of organized tissue-like structures. This prompted us to explore its utility in both cell science and cell engineering contexts."

"There's no other model like it. In just one tumor, you can study all of these different lineages, all of these different organs, at the same time," said Daniella McDonald, an M.D/Ph.D. candidate in Mali's lab and co-first author of the study. "Plus, it's a vascularized model, it has a three-dimensional structure and it's human-specific tissue, making it the ideal model for recreating the context in which human development happens."

The team used teratomas grown from human stem cells injected under the skin of immunodeficient mice. They analyzed the teratomas with a technique called single-cell RNA sequencing, which profiles the gene expression of individual teratoma cells. The researchers were able to map 20 cell types, or "human lineages" (brain, gut, muscle, skin, etc.) that were consistently present in all the teratomas they analyzed.

The researchers then used the gene editing technology CRISPR-Cas9 to screen and knock out 24 genes known to regulate development. They found multiple genes that play roles in the development of multiple lineages.

"What's remarkable about this study is that we could use the teratoma to discover things in a much faster way. We can study all of these genes on all of these human lineages in a single experiment," said co-first author Yan Wu, who worked on this project as a Ph.D. student in the labs of Mali and UC San Diego bioengineering professor Kun Zhang. "With other models, like organoids, that separately model one lineage at a time, we would have had to run many different experiments to come up with the same results as we did here."

"Teratomas are a very unique type of human tissue. When examined through the lens of single-cell sequencing, we can see that they contain most major representative cell types in the human body. With that understanding, we suddenly have an extremely powerful platform to understand, manipulate and engineer human cells and tissues in a far more sophisticated way than what was previously possible," Zhang said.

The researchers also showed that they can "molecularly sculpt" the teratoma to be enriched in one lineage -- in this case, neural tissue. They accomplished this feat using a microRNA gene circuit, which acts like a molecular chisel by carving away unwanted tissues -- these are selectively killed off using a suicide gene -- and leaving behind the lineage of interest. The researchers say this has applications in tissue engineering.

Read more at Science Daily

Nov 3, 2020

The cement for coral reefs

Coral reefs are hotspots of biodiversity. As they can withstand heavy storms, they offer many species a safe home, and at the same time they protect densely populated coastal regions by levelling out storm-driven waves. But how can these reefs, made up of often very fragile corals, be so stable? A team of researchers from Friedrich-Alexander Universität Erlangen-Nürnberg (FAU) and the University of Bayreuth have now discovered that a very specific type of 'cement' is responsible for this -- by forming a hard calcareous skeleton, coralline red algae stabilise the reefs, and have been doing so for at least 150 million years.

The wide variety of life they support is immediately apparent on images of tropical coral reefs. Their three-dimensional scaffolding provides a habitat for a large number of species. However, the skeletons of the coral are often so fragile that they would not be able to withstand heavy storms by themselves. Even if scientists have long suspected that coralline red algae provide support to reefs with their calcareous skeletons, this is the first time that this link has been proven.

Coralline red algae have been supporting coral reefs for at least 150 million years

The researchers from FAU and the University of Bayreuth were able to prove this supporting function by analysing more than 700 fossilised reefs from 150 million years of the Earth's history. 'The coralline red algae form a calcareous skeleton and cement the coral reefs together,' explains Dr. Sebastian Teichert from the Chair of Palaeoenvironmental Research at FAU. 'However, several crises over the course of millions of years have limited their capacity to do so.'

Successful adaptive strategies against plant grazers

These crises include the evolution of plant grazing marine animals such as sea urchins and parrot fishes who have repeatedly decimated populations of coralline red algae over the course of time. The algae, however, developed defence mechanisms such as special growth forms in order to defend themselves against their attackers. 'The algae have adapted so well that they now even benefit from these plant grazers,' says Teichert. 'They rid the coralline red algae of damaging growth such as green algae, allowing it to grow unhindered.' This means coralline red algae are more successful at supporting coral reefs today than ever before in the Earth's history.

Read more at Science Daily

New mineral discovered in moon meteorite

 A team of European researchers discovered a new high-pressure mineral in the lunar meteorite Oued Awlitis 001, named donwilhelmsite [CaAl4Si2O11]. The team around Jörg Fritz from the Zentrum für Rieskrater und Impaktforschung Nördlingen, Germany and colleagues at the German Research Centre for Geoscience GFZ in Potsdam, Museum für Naturkunde Berlin, Natural History Museum Vienna, Institute of Physics of the Czech Academy of Science, Natural History Museum Oslo, University of Manchester, and Deutsches Zentrum für Luft und Raumfahrt Berlin published their findings in the scientific journal American Mineralogist.

Besides the roughly 382 kilograms of rocks and soils collected by the Apollo and Luna missions, lunar meteorites offer valuable insights into the formation of the Moon. They are ejected from the lunar surface by impacts and subsequently delivered to Earth.

Some of these meteorites experienced particularly high temperatures and pressures. These extreme physical conditions often led to shock melting of microscopic areas within the meteorites. Such shocked areas are of great relevance because they mirror pressure and temperature regimes similar to those prevailing in the Earth's mantle; the microscopic shock melt areas are therefore natural crucibles hosting minerals that are otherwise naturally inaccessible at the Earth's surface. Minerals like wadsleyite, ringwoodite, and bridgmanite constitute large parts of the Earth's mantle. These crystals were first synthesized in high-pressure laboratory experiments; as natural minerals, they were first described and named based on their occurrences in meteorites.

The new mineral donwilhelmsite is the first high-pressure mineral found in meteorites that is relevant to subducted terrestrial sediments. It is mainly composed of calcium, aluminum, silicon, and oxygen atoms. Donwilhelmsite was discovered within shock melt zones of the lunar meteorite Oued Awlitis 001, found in 2014 in the Western Sahara. This meteorite is compositionally similar to the rocks that make up the Earth's continents. Eroded sediments from these continents are carried by wind and rivers to the oceans, and subducted into the Earth's mantle as part of the dense oceanic crust. As they are dragged deeper into the Earth's mantle, pressure and temperature increase, and the minerals transform into denser phases. The newly discovered donwilhelmsite forms at depths of 460 to 700 kilometres. In the terrestrial rock cycle, donwilhelmsite is therefore an important agent for transporting crustal sediments through the transition zone separating the Earth's upper and lower mantle.

This pan-European collaboration was essential to obtain the lunar meteorite, recognize the new mineral, understand its scientific relevance, and determine with high accuracy the crystal structure of the tiny mineral crystal, just a thousandth of a millimetre thick. "At the GFZ, we used transmission electron microscopy to investigate microstructural aspects of the samples," says Richard Wirth from the section "Interface Geochemistry." "Our investigations and the crystal structure analyses by our colleagues from the Czech Republic once again underline the importance of transmission electron microscopy in the geosciences."

Read more at Science Daily

Vitamin D levels during pregnancy linked with child IQ

 Vitamin D is a critical nutrient with many important functions in the body. A mother's vitamin D supply is passed to her baby in utero and helps regulate processes including brain development. A study published today in The Journal of Nutrition showed that mothers' vitamin D levels during pregnancy were associated with their children's IQ, suggesting that higher vitamin D levels in pregnancy may lead to higher childhood IQ scores. The study also identified significantly lower vitamin D levels among Black pregnant women.

Melissa Melough, the lead author of the study and research scientist in the Department of Child Health, Behavior, and Development at Seattle Children's Research Institute, says vitamin D deficiency is common among the general population as well as pregnant women, but notes that Black women are at greater risk. Melough says she hopes the study will help health care providers address disparities among women of color and those who are at higher risk for vitamin D deficiency.

"Melanin pigment protects the skin against sun damage, but by blocking UV rays, melanin also reduces vitamin D production in the skin. Because of this, we weren't surprised to see high rates of vitamin D deficiency among Black pregnant women in our study. Even though many pregnant women take a prenatal vitamin, this may not correct an existing vitamin D deficiency," Melough said. "I hope our work brings greater awareness to this problem, shows the long-lasting implications of prenatal vitamin D for the child and their neurocognitive development, and highlights that there are certain groups providers should be paying closer attention to. Wide-spread testing of vitamin D levels is not generally recommended, but I think health care providers should be looking out for those who are at higher risk, including Black women."

Addressing disparities

According to Melough, as many as 80% of Black pregnant women in the U.S. may be deficient in vitamin D. Of the women who participated in the study, approximately 46% of the mothers were deficient in vitamin D during their pregnancy, and vitamin D levels were lower among Black women compared to White women.

Melough and her co-authors used data from a cohort in Tennessee called the Conditions Affecting Neurocognitive Development and Learning in Early Childhood (CANDLE) study. CANDLE researchers recruited pregnant women to join the study starting in 2006 and collected information over time about their children's health and development.

After the researchers controlled for several other factors related to IQ, higher vitamin D levels in pregnancy were associated with higher IQ in children ages 4 to 6 years old. Although observational studies like this one cannot prove causation, Melough believes her findings have important implications and warrant further research.

Vitamin D deficiency

"Vitamin D deficiency is quite prevalent," Melough said. "The good news is there is a relatively easy solution. It can be difficult to get adequate vitamin D through diet, and not everyone can make up for this gap through sun exposure, so a good solution is to take a supplement."

The recommended daily intake of vitamin D is 600 international units (IU). On average, Americans consume less than 200 IU per day from their diet, so, Melough says, people who aren't making up that gap through sun exposure or supplementation will probably become deficient. Foods that contain higher levels of vitamin D include fatty fish, eggs and fortified sources like cow's milk and breakfast cereals. However, Melough notes that vitamin D is one of the most difficult nutrients to get in adequate amounts from our diets.

Additional research is needed to determine the optimal levels of vitamin D in pregnancy, but Melough hopes this study will help to develop nutritional recommendations for pregnant women. Especially among Black women and those at high risk for vitamin D deficiency, nutritional supplementation and screening may be an impactful strategy for reducing health disparities.

Read more at Science Daily

New insight into how brain neurons influence choices

 When you are faced with a choice -- say, whether to have ice cream or chocolate cake for dessert -- sets of brain cells just above your eyes fire as you weigh your options. Animal studies have shown that each option activates a distinct set of neurons in the brain. The more enticing the offer, the faster the corresponding neurons fire.

Now, a study in monkeys by researchers at Washington University School of Medicine in St. Louis has shown that the activity of these neurons encodes the value of the options and determines the final decision. In the experiments, researchers let animals choose between different juice flavors. By changing the neurons' activity, the researchers changed how appealing the monkeys found each option, leading the animals to make different choices. The study is published Nov. 2 in the journal Nature.

A detailed understanding of how options are valued and choices are made in the brain will help us understand how decision-making goes wrong in people with conditions such as addiction, eating disorders, depression and schizophrenia.

"In a number of mental and neuropsychiatric disorders, patients consistently make poor choices, but we don't understand exactly why," said senior author Camillo Padoa-Schioppa, PhD, a professor of neuroscience, of economics and of biomedical engineering. "Now we have located one critical piece of this puzzle. As we shed light on the neural mechanisms underlying choices, we'll gain a deeper understanding of these disorders."

In the 18th century, economists Daniel Bernoulli, Adam Smith and Jeremy Bentham suggested that people choose among options by computing the subjective value of each offer, taking into consideration factors such as quantity, quality, cost and the probability of actually receiving the promised offer. Once computed, values would be compared to make a decision. It took nearly three centuries to find the first concrete evidence of such calculations and comparisons in the brain. In 2006, Padoa-Schioppa and John Assad, PhD, a professor of neurobiology at Harvard Medical School, published a groundbreaking paper in Nature describing the discovery of neurons that encode the subjective value of offered and chosen goods. The neurons were found in the orbitofrontal cortex, an area of the brain just above the eyes involved in goal-directed behavior.

At the time, though, they were unable to demonstrate that the values encoded in the brain led directly to choosing one option over another.

"We found neurons encoding subjective values, but value signals can guide all sorts of behaviors, not just choice," Padoa-Schioppa said. "They can guide learning, emotion, perceptual attention, and aspects of motor control. We needed to show that value signals in a particular brain region guide choices."

To examine the connection between values encoded by neurons and choice behavior, researchers performed two experiments. The study was conducted by first authors Sébastien Ballesta, PhD, then a postdoctoral researcher, and Weikang Shi, a graduate student, with the help of Katherine Conen, PhD, then a graduate student, who designed one of the experiments. Ballesta is now an associate professor at the University of Strasbourg in Strasbourg, France; Conen is now at Brown University.

In one experiment, the researchers repeatedly presented monkeys with two drinks and recorded the animals' selections. The drinks were offered in varying amounts and included lemonade, grape juice, cherry juice, peach juice, fruit punch, apple juice, cranberry juice, peppermint tea, kiwi punch, watermelon juice and salted water. The monkeys often preferred one flavor over another, but they also liked to get more rather than less, so their decisions were not always easy. Each monkey indicated its choice by glancing toward it, and the chosen drink was delivered.

Then, the researchers placed tiny electrodes in each monkey's orbitofrontal cortex. The electrodes painlessly stimulated the neurons that represent the value of each option. When the researchers delivered a low current through the electrodes while a monkey was offered two drinks, neurons dedicated to both options began to fire faster. From the perspective of the monkey, this meant that both options became more appealing, but, because of the way values are encoded in the brain, the appeal of one option increased more than that of the other. The upshot is that low-level stimulation made the animal more likely to choose one particular option, in a predictable way.

In another experiment, the monkeys saw first one option, then the other, before they made a choice. Delivering a higher current while the monkey was considering one option disrupted the computation of value taking place at that time, making the monkey more likely to choose whichever option was not disrupted. This result indicates that values computed in the orbitofrontal cortex are a necessary part of making a choice.

Read more at Science Daily

Nov 2, 2020

Intelligent cameras enhance human perception

 Intelligent cameras are the next milestone in image and video processing. A team of researchers at the Chair of Multimedia Communications and Signal Processing at Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU) has developed an intelligent camera that achieves not only high spatial and temporal resolution but also high spectral resolution. The camera has a wide range of applications that can improve environmental protection and resource conservation measures as well as autonomous driving and modern agriculture. The findings of the research have been published as an open-access publication.

'Research up to now has mainly focused on increasing spatial and temporal resolution, which means the number of megapixels or images per second,' explains lecturer Dr. Jürgen Seiler. 'Spectral resolution -- the wavelength and thus the perception of colours -- has largely been adjusted to match human sight during the development of cameras, which merely corresponds to measuring the colours red, green, and blue. However, much more information is hidden in the light spectrum that can be used for a wide range of tasks. For example, we know that some animals use additional light spectra for hunting and searching for food.'

Three resolutions in one camera

Seiler, an electrical engineer, and his team at the Chair of Multimedia Communications and Signal Processing (LMS), led by Prof. Dr. Kaup at FAU, have therefore developed a high-resolution multi-spectral camera that enhances human perception. It combines all three resolutions -- spatial, temporal and spectral -- in a cost-efficient solution. 'Up to now, there were only extremely expensive and complex methods for measuring the ultraviolet or infrared ranges of light or individual spectral bands for special industrial applications,' says Seiler. 'We looked for a cost-efficient model and were able to develop a very cost-effective multi-spectral camera.'

The researchers connected several inexpensive standard cameras with various spectral filters to form a multi-spectral camera array. 'We then calculated an image in order to combine the various spectral information from each sensor,' explains Nils Genser, research associate at LMS. 'This new concept enables us to precisely determine the materials of each object captured using just one single image.'
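
To give a feel for the reconstruction step, the Python sketch below stacks co-registered frames from several filtered cameras into one multi-spectral data cube. It is a minimal illustration under the simplifying assumption that the frames are already aligned; the function name and filter wavelengths are hypothetical, and the actual FAU system additionally has to reconcile the slightly different viewpoints of the individual cameras.

    import numpy as np

    def combine_spectral_bands(band_images, wavelengths_nm):
        # Stack co-registered frames (one per spectral filter) into a
        # single multi-spectral cube of shape (height, width, bands).
        if len(band_images) != len(wavelengths_nm):
            raise ValueError("one wavelength per band image required")
        order = np.argsort(wavelengths_nm)  # order bands by wavelength
        cube = np.stack([band_images[i] for i in order], axis=-1)
        return cube, np.asarray(wavelengths_nm)[order]

    # Toy example: five 480x640 frames taken through different filters.
    rng = np.random.default_rng(0)
    frames = [rng.random((480, 640)) for _ in range(5)]
    cube, wl = combine_spectral_bands(frames, [380, 460, 550, 680, 850])
    print(cube.shape)  # (480, 640, 5)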

At the same time, the new camera is greatly superior to existing systems in terms of its spatial, temporal and spectral resolution. As the surroundings are recorded by several 'eyes' as is the case with human sight, the system also provides a precise indication of depth. This means that the system not only precisely determines the colour and certain material properties of objects it captures, but also the distance between them and the camera.
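
The depth estimate rests on the same principle as binocular vision: an object appears shifted (the 'disparity') between two cameras separated by a known baseline, and the pinhole-camera relation Z = f * B / d converts that shift into distance. The snippet below is just this textbook relation with made-up numbers, not the calibration of the FAU array.

    def depth_from_disparity(focal_px, baseline_m, disparity_px):
        # Pinhole stereo: depth Z = focal length * baseline / disparity.
        return focal_px * baseline_m / disparity_px

    # Example: 1000 px focal length, 10 cm baseline, 25 px disparity.
    print(depth_from_disparity(1000, 0.10, 25))  # 4.0 metres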

Ideal for autonomous driving and environmental technology

Autonomous driving is a potential application for these new intelligent cameras. 'A whole range of solutions to various problems has now opened up thanks to our new technology,' says Seiler. 'In the infrared range, for example, we can differentiate between real people and signposts using the thermal signature. For night driving, we can detect animals crossing the road with sufficient warning.'

Read more at Science Daily

First light on a next-gen astronomical survey toward a new understanding of the cosmos

 The Sloan Digital Sky Survey's fifth generation collected its very first observations of the cosmos at 1:47 a.m. on October 24, 2020. This groundbreaking all-sky survey will bolster our understanding of the formation and evolution of galaxies -- including our own Milky Way -- and the supermassive black holes that lurk at their centers.

The newly-launched SDSS-V will continue the path-breaking tradition set by the survey's previous generations, with a focus on the ever-changing night sky and the physical processes that drive these changes, from flickers and flares of supermassive black holes to the back-and-forth shifts of stars being orbited by distant worlds. SDSS-V will provide the spectroscopic backbone needed to achieve the full science potential of satellites like NASA's TESS, ESA's Gaia, and the latest all-sky X-ray mission, eROSITA.

"In a year when humanity has been challenged across the globe, I am so proud of the worldwide SDSS team for demonstrating -- every day -- the very best of human creativity, ingenuity, improvisation, and resilience. It has been a challenging period for the team, but I'm happy to say that the pandemic may have slowed us, but it has not stopped us" said SDSS-V Director Juna Kollmeier.

As an international consortium, SDSS has always relied heavily on phone and digital communication. But adapting to exclusively virtual communication tactics was a challenge, as was tracking global supply chains and laboratory availability at various university partners while they shifted in and out of lockdown during the final ramp-up to the survey's start. Particularly inspiring were the project's expert observing staff, who worked in even-greater-than-usual isolation to shut down, and then reopen, operations at the survey's mountain-top observatories.

Funded primarily by member institutions, along with grants from the Alfred P. Sloan Foundation, the U.S. National Science Foundation, and the Heising-Simons Foundation, SDSS-V will focus on three primary areas of investigation, each exploring different aspects of the cosmos using different spectroscopic tools. Together these three project pillars -- called "Mappers" -- will observe more than six million objects in the sky, and monitor changes in more than a million of those objects over time.

The survey's Local Volume Mapper will enhance our understanding of galaxy formation and evolution by probing the interactions between the stars that make up galaxies and the interstellar gas and dust that is dispersed between them. The Milky Way Mapper will reveal the physics of stars in our Milky Way, the diverse architectures of its star and planetary systems, and the chemical enrichment of our galaxy since the early universe. The Black Hole Mapper will measure masses and growth over cosmic time of the supermassive black holes that reside in the hearts of galaxies as well as the smaller black holes left behind when stars die.

"We are thrilled to start taking the first data for two of our three Mappers," added SDSS-V Spokesperson Gail Zasowski of the University of Utah. "These early observations are already important for a wide range of science goals. Even these first targets cover goals from mapping the inner regions of supermassive black holes and searching for exotic multiple-black hole systems, to studying nearby stars and their dead cores, to tracing the chemistry of potential planet-hosting stars across the Milky Way."

"SDSS-V will continue to transform astronomy by building on a 20-year legacy of path-breaking science, shedding light on the most fundamental questions about the origins and nature of the universe. It demonstrates all the hallmark characteristics that have made SDSS so successful in the past: open sharing of data, inclusion of diverse scientists, and collaboration across numerous institutions," said Evan Michelson, program director at the Sloan Foundation. "We are so pleased to support Juna Kollmeier and the entire SDSS team, and we are excited for this next phase of discovery."

SDSS-V will operate out of both Apache Point Observatory in New Mexico, home of the survey's original 2.5-meter telescope, and Carnegie's Las Campanas Observatory in Chile, where it uses the 2.5-meter du Pont telescope.

"SDSS V is one of the most important astronomical projects of the decade. It will set new standards not only in astrophysics but also in robotics and big data," said the observatory's Director Leopoldo Infante. "Consequently, to ensure its success, the Las Campanas Observatory is prepared to carry out the project with all the human and technical resources available on the mountain."

SDSS-V's first observations were gathered in New Mexico with existing SDSS instruments, as a necessary change of plans due to the pandemic. As laboratories and workshops around the world navigate safe reopening, SDSS-V's own suite of new innovative hardware is on the horizon -- in particular, systems of automated robots to aim the fiber optic cables used to collect the light from the night sky. These will be installed at both observatories over the next year. New spectrographs and telescopes are also being constructed to enable the Local Volume Mapper observations.

Read more at Science Daily

How the immune system remembers viruses

 When a virus enters the body, it is picked up by certain cells of the immune system. They transport the virus to the lymph nodes, where they present its fragments, known as antigens, to CD8+ T cells, which are responsible for the control of viral infections. Each of these cells carries a unique T cell receptor on its surface that can recognize certain antigens. However, only very few T cell receptors match a given viral antigen.

To bring the infection under control and maximize the defenses against the virus, these few antigen-specific T cells start dividing rapidly and develop into effector T cells. These kill virus-infected host cells and then die off themselves once the infection is cleared. Some of these short-lived effector cells -- according to the generally accepted theory -- turn into memory T cells, which persist in the organism long term. In case the same pathogen enters the body again, memory T cells are already present and ready to fight the invader more swiftly and effectively than during the first encounter.

Memory cells and their origin

"Prevailing scientific opinion says that activated T cells first become effector cells and only then gradually develop into memory cells," says Dr. Veit Buchholz, a specialist in microbiology and working group leader at the Institute for Medical Microbiology, Immunology and Hygiene at TUM. "In our view, however, that isn't the case. It would mean that the more effector cells are formed after contact with the pathogen, the more numerous the memory cells would become." However, Buchholz and his colleagues observed a different course of events and have now published their results in the journal Nature Immunology.

"We investigated the antiviral immune responses resulting from individual activated T cells in mice and traced the lineage of the ensuing memory cells using single-cell fate mapping," reports first author Dr. Simon Grassmann. "Based on these experiments, we were able to show that certain 'T cell families' descended from individual cells form up to 1000 times more 'memory' than others. However, these long-term dominating T cell families only contributed little to the magnitude of the initial immune response, which was dominated by effector cells derived from other shorter-lived T cell families."

At the level of individual cells, it therefore became evident that the development of effector and memory cells segregates at a much earlier stage than previously believed: "Already in the first week after the confrontation with the pathogen, we saw major differences in the transcriptomes of the detected T cell families," says Lorenz Mihatsch, also a first author of the study. "Normally, at this point in the immune response, CD8+ T cells are enriched in molecules that help to kill virus-infected cells. However, we found no indication of these cytolytic molecules in the long-term dominating T cell families. Instead, they were already geared exclusively towards memory development at this early stage."

Read more at Science Daily

Follow your gut: How farms protect from childhood asthma

 We are born into an environment full of small organisms called microbiota. Within the first minutes and hours of our lives, they start challenging but also educating our immune system. The largest immune organ is our gut, where maturation of the immune system and maturation of the colonizing bacteria, the gut microbiome, go hand in hand. After profound perturbations during this maturation process in the first year of life, the composition of the gut microbiome gradually stabilizes and accompanies us throughout our lives. Previous research by the Munich scientists showed an asthma-protective effect of a diverse environmental microbiome, which was particularly pronounced in farm children. The question now was whether this effect could be attributed to the maturation process of the early gut microbiome.

Farm life boosts gut microbiome maturation in children

The researchers analyzed fecal samples, collected between the ages of 2 and 12 months, from more than 700 infants, some of whom grew up on traditional farms. The infants took part in PASTURE, a European birth cohort that has been running for almost 20 years with funding from the European Commission.

"We found that a comparatively large part of the protective farm effect on childhood asthma was mediated by the maturation of the gut microbiome in the first year of life" states Dr. Martin Depner, biostatistician at Helmholtz Zentrum München, and further concludes: "This suggests that farm children are in contact with environmental factors possibly environmental microbiota that interact with the gut microbiome and lead to this protective effect."

The researchers anticipated effects of nutrition on the gut microbiome maturation but were surprised to find strong effects of farm-related exposures such as stays in animal sheds. This emphasizes the importance of the environment for the protective effect. In addition, vaginal delivery and breastfeeding fostered a protective microbiome in the first two months of life.

Furthermore, the researchers discovered an inverse association of asthma with the measured level of fecal butyrate. Butyrate is a short-chain fatty acid known to have an asthma-protective effect in mice. The researchers concluded that gut bacteria such as Roseburia and Coprococcus, which have the potential to produce short-chain fatty acids, may contribute to asthma protection in humans as well. Children with a mature gut microbiome showed a higher amount of these bacteria compared to other children.

"Our study provides further evidence that the gut may have an influence on the health of the lung. A mature gut microbiome with a high level of short chain fatty acids had a protective effect on the respiratory health of the children in this study. This suggests the idea of a relevant gut-lung axis in humans," says Dr. Markus Ege, professor for clinical-respiratory epidemiology at the Dr. von Hauner Children's Hospital. "This also means, however, that an immature gut microbiome may contribute to the development of diseases. This emphasizes the need for prevention strategies in the first year of life, when the gut microbiome is highly plastic and amenable to modification."

Probiotic prevention strategies

The researchers demonstrated that the asthma-protective effect does not depend on a single bacterium, but on the maturation of the entire gut microbiome. This finding calls into question the approach of using single bacteria as probiotics for the prevention of asthma. Probiotics should instead be tested with respect to their sustained effect on the compositional structure of the gut microbiome and its maturation early in life.

Further studies on cow's milk

Nutritional factors analyzed in this study, such as the consumption of cow's milk, may serve as prevention strategies. Unprocessed raw milk, however, cannot be recommended because of the risk of life-threatening infections such as EHEC. Scientists at the Dr. von Hauner Children's Hospital are currently running a clinical trial on the effects of minimally processed but microbiologically safe milk for the prevention of asthma and allergies (the MARTHA trial).

Read more at Science Daily

Nov 1, 2020

Most isolated massive stars are kicked out of their clusters

 A pair of University of Michigan studies reveals how some massive stars -- stars eight or more times the mass of our sun -- become isolated in the universe: most often, their star clusters kick them out.

Massive stars typically reside in clusters. Isolated massive stars are called field massive stars. The papers published by U-M students examined most of these stars in the Small Magellanic Cloud, a dwarf galaxy near the Milky Way.

The studies, appearing in the same issue of The Astrophysical Journal, reveal how these field massive stars originate, or become so isolated. Understanding how field massive stars become isolated -- whether they form in isolation or whether they become isolated by being ejected from a star cluster -- will help astronomers probe the conditions in which massive stars are formed. Understanding this and cluster formation is critical for understanding how galaxies evolve.

"About a quarter of all massive stars appear to be isolated, and that's our big question," said recent undergraduate Johnny Dorigo Jones. "How they're found to be isolated, and how they got there."

Dorigo Jones shows in his paper that the vast majority of field massive stars are "runaways," or stars ejected from clusters. Graduate student Irene Vargas-Salazar searched for field massive stars that may have formed in relative isolation by looking for evidence of tiny clusters around them. That means these relatively isolated stars could have formed in conjunction with these smaller stars. But she found very few of these faint clusters.

"Because massive stars require a lot of material to form, there are usually a lot of smaller stars around them," Vargas-Salazar said. "My project asks specifically how many of these field massive stars could have formed in the field."

Dorigo Jones examined how field massive stars are ejected from clusters, looking at the two different mechanisms that produce runaways: dynamical ejection and binary supernova ejection. In the first, massive stars are ejected from their clusters -- at up to half a million miles per hour -- because of unstable orbital configurations of stellar groups. In the second, a massive star is ejected when one star of a binary pair explodes and shoots its companion out into space.

"By having the velocities and the masses of our stars, we're able to compare the distributions of those parameters to the model predictions to determine the certain contributions from each of the ejection mechanisms," Dorigo Jones said.

He found that dynamical ejections -- ejections caused by unstable orbital configurations -- were about 2 to 3 times more numerous than supernova ejections. But Dorigo Jones also found the first observational data that shows a large fraction of the field massive stars came from a combination of both dynamical and supernova ejections.

"These have been studied in the past but we have now set the first observational constraints on the numbers of these two-step runaways," he said. "The way we reach that conclusion is we're essentially seeing that the stars that trace the supernova ejections in our sample are a bit too numerous and too fast compared to the model predictions. You can imagine this being remedied by these stars being reaccelerated upon a supernova kick, having first been dynamically ejected."

The researchers found that potentially up to half of the stars first thought to be from supernova ejections were first dynamically ejected.

Vargas-Salazar's findings also support the idea that most field massive stars are runaways, but she approached the question from the opposite direction: she looked for field massive stars that formed in relative isolation, surrounded by tiny clusters of smaller stars in which the massive target star is the "tip of the iceberg" (TIB clusters). She did this using two algorithms, "friends-of-friends" and "nearest neighbors," to search for such clusters around 310 field massive stars in the Small Magellanic Cloud.

The "friends-of-friends" algorithm measures the number density of stars by counting how many stars there are at a specific distance from the target star and then doing the same for those stars in turn. The more tightly packed the stars are, the more likely it is to be a cluster. The "nearest neighbors" algorithm measures the number density of stars between the target star and its nearest 20 companions. The more compact and denser the group, the more likely they are to be clusters, Vargas-Salazar said.

Using statistical tests, Vargas-Salazar compared these observations with three random-field datasets and compared the known runaway massive stars to nonrunaways. She found that only a few of the field massive stars appeared to have TIB clusters around them, suggesting that very few actually formed in the field. The balance of the field stars must have originated as runaways.

"In the end, we showed that 5% or less of the stars had TIB clusters. Instead, our findings imply that the majority of stars in field samples could be runaways," Vargas-Salazar said. "Our findings are actually supporting the result that Johnny found, wrapped in a neat little bow."

Vargas-Salazar's findings provide part of the answer to the question of how massive stars form, says Sally Oey, senior author on both of the papers and professor of astronomy at U-M.

Read more at Science Daily

Asteroid's scars tell stories of its past

 By studying impact marks on the surface of asteroid Bennu -- the target of NASA's OSIRIS-REx mission -- a team of researchers led by the University of Arizona has uncovered the asteroid's past and revealed that despite forming hundreds of millions of years ago, Bennu wandered into Earth's neighborhood only very recently.

The study, published in the journal Nature, provides a new benchmark for understanding the evolution of asteroids, offers insights into a poorly understood population of space debris hazardous to spacecraft, and enhances scientists' understanding of the solar system.

The researchers used images and laser-based measurements taken during a two-year surveying phase in which the van-sized OSIRIS-REx spacecraft orbited Bennu, setting the record for the smallest body ever orbited by a spacecraft.

Presented at the opening day of the American Astronomical Society's Division of Planetary Science meeting on Oct. 26, the paper details the first observations and measurements of impact craters on individual boulders on an airless planetary surface since the Apollo missions to the moon 50 years ago, according to the authors.

The publication comes just a few days after a major milestone for NASA's University of Arizona-led OSIRIS-REx mission. On Oct. 20, the spacecraft successfully descended to asteroid Bennu to grab a sample from its boulder-scattered surface -- a first for NASA. The sample has now been successfully stowed and will be returned to Earth in 2023, where its study could give scientists insight into the earliest stages of the formation of our solar system.

Impact Craters on Rocks Tell a Story

Although Earth is being pelted with more than 100 tons of space debris each day, it is virtually impossible to find a rockface pitted by impacts from small objects at high velocities. Courtesy of our atmosphere, we get to enjoy any object smaller than a few meters as a shooting star rather than having to fear being struck by what essentially amounts to a bullet from outer space.

Planetary bodies lacking such a protective layer, however, bear the full brunt of a perpetual cosmic barrage, and they have the scars to show for it. High-resolution images taken by the OSIRIS-REx spacecraft during its two-year survey campaign allowed researchers to study even tiny craters, with diameters ranging from a centimeter to a meter, on Bennu's boulders.

On average, the team found boulders of 1 meter (3 feet) or larger to be scarred by anywhere from one to 60 pits -- impacted by space debris ranging in size from a few millimeters to tens of centimeters.

"I was surprised to see these features on the surface of Bennu," said the paper's lead author, Ronald Ballouz, a postdoctoral researcher in the UArizona Lunar and Planetary Laboratory and a scientist with the OSIRIS-REx regolith development working group. "The rocks tell their history through the craters they accumulated over time. We haven't observed anything like this since astronauts walked on the moon."

For Ballouz, who grew up during the 1990s in post-civil war Beirut, Lebanon, the image of a rock surface pitted with small impact craters evoked childhood memories of building walls riddled with bullet holes in his war-torn home country.

"Where I grew up, the buildings have bullet holes all over, and I never thought about it," he said. "It was just a fact of life. So, when I looked at the images from the asteroid, I was very curious, and I immediately thought these must be impact features."

The observations made by Ballouz and his team bridge a gap between previous studies of space debris larger than a few centimeters, based on impacts on the moon, and studies of objects smaller than a few millimeters, based on observations of meteors entering Earth's atmosphere and impacts on spacecraft.

"The objects that formed the craters on Bennu's boulders fall within this gap that we don't really know much about," Ballouz said, adding that rocks in that size range are an important field of study, mainly because they represent hazards for spacecraft in orbit around Earth. "An impact from one of these millimeter to centimeter-size objects at speeds of 45,000 miles per hour can be dangerous."

Ballouz and his team developed a technique to quantify the strength of solid objects using remote observations of craters on the surfaces of boulders -- a mathematical formula that allows researchers to calculate the maximum impact energy that a boulder of a given size and strength could endure before being smashed. In other words, the crater distribution found on Bennu today keeps a historical record of the frequency, size and velocity of impact events the asteroid has experienced throughout its history.

"The idea is actually pretty simple," Ballouz said, using a building exposed to artillery fire as an analogy to boulders on an asteroid. "We ask, 'What is the largest crater you can make on that wall before the wall disintegrates?' Based on observations of multiple walls of the same size, but with different sized craters, you can get some idea of the strength of that wall."

The same holds true for a boulder on an asteroid or other airless body, said Ballouz, who added that the approach could be used on any other asteroid or airless body that astronauts or spacecraft may visit in the future.

"If a boulder gets hit by something larger than an object that would leave a certain size cater, it would just disappear," he explained. In other words, the size distribution of boulders that have persisted on Bennu serve as silent witnesses to its geologic past.

A Newcomer to Earth's Neighborhood

Applying the technique to boulders ranging in size from pebbles to parking garages, the researchers were able to make inferences about the sizes and types of impactors to which the boulders were exposed, and for how long.

The authors conclude that the largest craters on Bennu's boulders were created while Bennu resided in the asteroid belt, where impact speeds are lower than in the near-Earth environment, but are more frequent and often near the limit of what the boulders could withstand. Smaller craters, on the other hand, were acquired more recently, during Bennu's time in near-Earth space, where impact speeds are higher but potentially disruptive impactors are much less common.

Based on these calculations, the authors determine that Bennu is a relative newcomer to Earth's neighborhood. Although it is thought to have formed in the main asteroid belt more than 100 million years ago, it is estimated that it was kicked out of the asteroid belt and migrated to its current territory only 1.75 million years ago. Extending the results to other near-Earth objects, or NEOs, the researchers also suggest that these objects likely come from parent bodies that fall in the category of asteroids, which are mostly rocky with little or no ice, rather than comets, which have more ice than rock.

Read more at Science Daily