Humans are not the only ones who have beliefs; animals do too, although beliefs are more difficult to demonstrate in animals than in humans. Dr. Tobias Starzak and Professor Albert Newen from the Institute of Philosophy II at Ruhr-Universität Bochum have proposed four criteria for understanding and empirically investigating animal beliefs in the journal Mind and Language. The article was published online on 16 June 2020.
Flexible use of information about the world
The first criterion for the existence of beliefs, worked out by the philosophers, is that an animal must have information about the world. However, this information must not simply trigger an automatic reaction, like a frog instinctively snapping at a passing insect.
Instead, and this is the second criterion, the animal must be able to use the information to behave in a flexible manner. "This is the case when one and the same piece of information can be combined with different motivations to produce different behaviours," explains Albert Newen. "For example, if the animal can use the information that there is food available at that moment for the purpose of eating or hiding the food."
Information can be relinked
The third criterion says that the information is internally structured in a belief; accordingly, individual aspects of that information can be processed separately. This has emerged, for example, in experiments with rats that can learn that a certain kind of food can be found at a certain time in a certain place. Their knowledge has a what-when-where structure.
Fourthly, animals with beliefs must be able to recombine the information components in novel ways. This reassembled belief should then lead to flexible behaviour. Rats can do this too, as the US researcher Jonathan Crystal demonstrated in experiments in an eight-armed labyrinth. The animals learned that if they received normal food in arm three of the maze in the morning, chocolate could be found in arm seven at noon.
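The third and fourth criteria lend themselves to a simple computational picture: a belief as a structured record whose components can be read out separately and recombined. The sketch below is our own illustration of that idea, not the authors' formalism; the field names and example values are invented.

    from dataclasses import dataclass, replace

    @dataclass(frozen=True)
    class Belief:
        """A what-when-where structured belief (third criterion)."""
        what: str   # e.g. kind of food
        when: str   # e.g. time of day
        where: str  # e.g. location

    # Beliefs learned from experience, as in the rat experiments
    morning_food = Belief(what="normal food", when="morning", where="arm 3")
    noon_treat = Belief(what="chocolate", when="noon", where="arm 7")

    # Fourth criterion: components can be recombined into a novel belief
    # that was never experienced as a whole.
    novel = replace(morning_food, what=noon_treat.what, where=noon_treat.where)
    print(novel)  # Belief(what='chocolate', when='morning', where='arm 7')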
Crows and scrub jays meet all criteria
The authors from Bochum also cite crows and scrub jays as examples of animals with beliefs. British researcher Nicola Clayton carried out conclusive experiments with scrub jays. When the birds are hungry, they initially tend to eat the food. When they are not hungry, they systematically hide the leftovers. In the process, they encode which food -- worm or peanut -- they have hidden where and when. If they are hungry in the following hours, they first look for the worms they prefer. Once enough time has passed for the worms to become inedible, they head for the peanut hiding places instead.
"What best explains this change in behaviour is the birds' belief about the worms being spoiled and their beliefs about the location of other food items," says Tobias Starzak. The animals also react flexibly in other situations, for example if they notice that they are being watched by rivals while hiding; if this is the case, they hide the food again later.
Read more at Science Daily
Jun 18, 2020
Viruses can steal our genetic code to create new human-virus genes
Like a scene out of "Invasion of the Body Snatchers," a virus infects a host and converts it into a factory for making more copies of itself. Now researchers have shown that a large group of viruses, including the influenza viruses and other serious pathogens, steal genetic signals from their hosts to expand their own genomes.
This finding is presented in a study published online today and in print June 25 in Cell. The cross-disciplinary collaborative study was led by researchers at the Global Health and Emerging Pathogens Institute at Icahn School of Medicine at Mount Sinai in New York, and at the MRC-University of Glasgow Centre for Virus Research in the UK.
The cross-disciplinary team of virologists looked at a large group of viruses known as segmented negative-strand RNA viruses (sNSVs), which include widespread and serious pathogens of humans, domesticated animals and plants, including the influenza viruses and Lassa virus (the cause of Lassa fever). They showed that, by stealing genetic signals from their hosts, viruses can produce a wealth of previously undetected proteins. The researchers labeled them UFO (Upstream Frankenstein Open reading frame) proteins, as they are encoded by stitching together host and viral sequences. The existence of these kinds of proteins was unknown prior to this study.
These UFO proteins can alter the course of viral infection and could be exploited for vaccine purposes.
"The capacity of a pathogen to overcome host barriers and establish infection is based on the expression of pathogen-derived proteins," said Ivan Marazzi, PhD, Associate Professor of Microbiology at Icahn School of Medicine and corresponding author on the study. "To understand how a pathogen antagonizes the host and establishes infection, we need to have a clear understanding of what proteins a pathogen encodes, how they function, and the manner in which they contribute to virulence."
Viruses cannot build their own proteins, so they need to feed suitable instructions to the machinery that builds proteins in their host's cells. Viruses are known to do this through a process called "cap-snatching," in which they cut the end from one of the cell's own protein-encoding messages (a messenger RNA, or mRNA) and then extend that sequence with a copy of one of their own genes. This gives a hybrid message to be read.
"For decades we thought that by the time the body encounters the signal to start translating that message into protein (a 'start codon') it is reading a message provided to it solely by the virus. Our work shows that the host sequence is not silent," said Dr. Marazzi.
The researchers show that, because they make hybrids of host mRNAs with their own genes, viruses (sNSVs) can produce messages with extra, host-derived start codons, a process they called "start snatching." This makes it possible to translate previously unsuspected proteins from the hybrid host-virus sequences. They further show that these novel genes are expressed by influenza viruses and potentially a vast number of other viruses. The product of these hybrid genes can be visible to the immune system, and they can modulate virulence. Further studies are needed to understand this new class of proteins and what the implications are of their pervasive expression by many of the RNA viruses that cause epidemics and pandemics.
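To make the "start snatching" idea concrete, the toy scan below fuses an invented host leader onto an invented viral sequence and looks for host-derived AUG start codons upstream of the canonical viral start; an in-frame hit would yield a hybrid UFO protein. This is a schematic illustration only, not the study's analysis pipeline.

    # Toy illustration of "start snatching" (sequences invented for the example):
    # a cap-snatched host leader fused onto a viral mRNA can carry its own AUG,
    # opening an upstream hybrid (host + viral) reading frame -- a "UFO" ORF.
    HOST_LEADER = "GCAAUGGCA"        # snatched host fragment; note the AUG at index 3
    VIRAL_MRNA = "GGAAUGCCUGGUUGA"   # viral segment; canonical AUG at index 3

    hybrid = HOST_LEADER + VIRAL_MRNA
    canonical_start = len(HOST_LEADER) + VIRAL_MRNA.find("AUG")

    # Any AUG upstream of the canonical viral start is host-derived.
    upstream_starts = [i for i in range(canonical_start)
                       if hybrid[i:i + 3] == "AUG"]

    for s in upstream_starts:
        in_frame = (canonical_start - s) % 3 == 0
        print(f"host-derived AUG at {s}, in frame with viral ORF: {in_frame}")
    # An in-frame upstream start would be translated into a hybrid protein
    # whose N-terminus is host-encoded -- the UFO proteins described above.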
Ed Hutchinson, PhD, corresponding author and a research fellow at MRC-University of Glasgow Centre for Virus Research, said, "Viruses take over their host at the molecular level, and this work identifies a new way in which some viruses can wring every last bit of potential out of the molecular machinery they are exploiting. While the work done here focusses on influenza viruses, it implies that a huge number of viral species can make previously unsuspected genes."
Researchers say the next part of their work is to understand the distinct roles the unsuspected genes play. "Now we know they exist, we can study them and use the knowledge to help disease eradication," said Dr. Marazzi. "A large global effort is required to stop viral epidemics and pandemics, and these new insights may lead to identifying novel ways to stop infection."
Read more at Science Daily
Origins of the beloved guinea pig
In research just published in the international science journal Scientific Reports, the researchers use ancient DNA from archaeological guinea pig remains to reveal the transition from the animals' use as a wild food source 10,000 years ago to their domestication and later roles as beloved pets and medical animal models.
It builds on previous research over many years by Professor of Biological Anthropology, Lisa Matisoo-Smith, tracing the DNA from plants and animals that Pacific settlers carried in their canoes and using that as a proxy for identifying human population origins and tracking their movement around the Pacific.
As part of her Otago Master's thesis research in Professor Matisoo-Smith's lab, Edana Lord, now at Stockholm University, Sweden, together with Dr Catherine Collins from Otago's Department of Anatomy and other international researchers, set about finding out where the guinea pigs introduced to the islands of the Caribbean came from.
Professor Matisoo-Smith explains it is generally accepted that modern guinea pigs were domesticated in the Andes region of what is now Peru. As an important food item that was also included in religious ceremonies, they were transported and traded around South America.
Sometime around AD500, guinea pigs were taken out to the islands of the Caribbean, through at least one of several established trade networks. The researchers expected that the guinea pigs found in the Caribbean would have come from Colombia, one of the closer locations in South America to the Caribbean.
Using ancient DNA from guinea pig remains excavated from several sites in the Caribbean, Peru, Colombia, Bolivia, Europe and North America, they found the guinea pigs on the islands did not originate in Colombia, but most likely originated in Peru.
What was a bigger surprise to the team was that the guinea pig remains found in the Colombian Highlands appeared to be from a totally different species. This suggests that guinea pig domestication likely took place independently in both Peru and Colombia.
The genetic information, along with archaeological contexts, also shows how the guinea pigs had different roles through time.
"They were and still are important food item in many parts of South America and cultures that derived from South America -- people took them live to introduce to new islands where they were not native or they traded them for other goods," Professor Matisoo-Smith explains.
"The guinea pig was brought to Europe in the late 1500s or early 1600s by the Spanish and to North America in the early 1800s as part of the exotic pet trade. In the 18th century guinea pigs began to be used by medical researchers as laboratory animals because they have many biological similarities to humans, thus the origin of the phrase 'being a guinea pig' in research.
"All guinea pigs today -- pets, those that are sold for meat in South America and Puerto Rico, and those used in medical research -- are derived from the Peruvian domesticated guinea pigs."
Why the guinea pig was viewed as a pet in some cultures and a food source in others can likely be attributed to long-established cultural notions of what is acceptable as food.
Professor Matisoo-Smith says the research demonstrates that the history of guinea pigs is more complex than previously known and has implications for other studies regarding mammal domestication, translocation and distribution.
"Identifying the origins of the guinea pig remains from the Caribbean helps us to understand how the human trade networks in the region moved in the past 1000 years or so.
Read more at Science Daily
Tomato's hidden mutations revealed in study of 100 varieties
Today, scientists are teasing out how these physical changes show up at the level of genes -- work that could guide modern efforts to tweak the tomato, says Howard Hughes Medical Institute Investigator Zachary Lippman.
He and colleagues have now identified long-concealed mutations within the genomes of 100 types of tomato, including an orange-berried wild plant from the Galapagos Islands and varieties typically processed into ketchup and sauce.
Their analysis, described June 17, 2020, in the journal Cell, is the most comprehensive assessment of such mutations -- which alter long sections of DNA -- for any plant. The research could lead to the creation of new tomato varieties and the improvement of existing ones, Lippman says. A handful of the mutations his team identified alter key characteristics, like flavor and weight, the researchers showed.
Studies have long shown that these mutations exist in plant genomes, says Lippman, a plant geneticist at Cold Spring Harbor Laboratory. "But until now, we didn't have an efficient way to find them and study their impact," he says.
A window into the genome
Mutations, or changes, in the four types of DNA letters carried within an organism's cells can alter its physical characteristics. Scientists studying plants have generally focused on a small, tractable kind of mutation, in which one DNA letter is swapped for another.
The mutations Lippman's team studied are much bigger -- they modify DNA's structure by copying, deleting, inserting, or moving long sections of DNA elsewhere in the genome. These mutations, also called structural variations, occur throughout the living world. Studies in humans, for example, have linked these variations to disorders such as schizophrenia and autism.
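The difference between the two kinds of mutation is easy to see on a toy sequence. A minimal sketch (the sequence and coordinates are invented for illustration):

    seq = "AAACCCGGGTTT"   # toy DNA sequence

    # Single-letter change (the small, tractable kind):
    snp = seq[:4] + "T" + seq[5:]                 # one base swapped

    # Structural variations rearrange long sections instead:
    duplication = seq[:6] + seq[3:6] + seq[6:]    # a section copied in place
    deletion = seq[:3] + seq[6:]                  # a section removed
    insertion = seq[:6] + "ACGTAC" + seq[6:]      # a new section inserted
    translocation = seq[6:9] + seq[:6] + seq[9:]  # a section moved elsewhere

    for name, s in [("SNP", snp), ("duplication", duplication),
                    ("deletion", deletion), ("insertion", insertion),
                    ("translocation", translocation)]:
        print(f"{name:14} {s}")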
Scientists can identify mutations by reading out the letters of DNA using a technique known as genetic sequencing. Limitations in this technology, however, have made it difficult to decode long sections of DNA, Lippman says. So researchers haven't been able to capture a complete picture of structural mutations in the genome.
Even so, plant geneticists have suspected that these mutations contribute significantly to plants' traits, says Michael Purugganan, who studies rice and date palms at New York University and was not involved in the new study. "That's why this paper is so exciting," he says. Lippman's team not only found these mutations in tomato and its wild relatives, but also determined how they function within the plants, he says.
A guide for future tomatoes
The new study, a collaboration with Michael Schatz at Johns Hopkins University and others, identified more than 200,000 structural mutations in tomatoes using a technique called long-read sequencing. Lippman likens it to looking through a panoramic window at large sections of the genome. By comparison, more conventional sequencing offered only a peephole, he says.
The majority of the mutations they found do not change genes that encode traits. But what's clear, Lippman says, is that many of these mutations alter mechanisms controlling genes' activity. One such gene, for instance, controls tomato fruit size. By modifying DNA structure -- in this case, the number of copies of the gene -- Lippman's team was able to alter fruit production. Plants lacking the gene never made fruit, while plants with three copies of the gene made fruit about 30 percent larger than those with just a single copy.
Lippman's team also demonstrated how DNA structure can influence traits in an example he calls "remarkably complex." They showed that four structural mutations together were needed for breeding a major harvesting trait into modern tomatoes.
These sorts of insights could help explain trait diversity in other crops and enable breeders to improve varieties, Lippman says. For instance, perhaps adding an extra copy of the size gene to tiny ground cherries, a close relative of the tomato, could increase their appeal by making them larger, he says.
Read more at Science Daily
Jun 17, 2020
Antarctic sea ice loss explained in new study
Scientists have discovered that the summer sea ice in the Weddell Sea sector of Antarctica has decreased by one million square kilometres -- an area twice the size of Spain -- in the last five years, with implications for the marine ecosystem. The findings are published this month (June 2020) in the journal Geophysical Research Letters.
Sea ice surrounding Antarctica provides an important habitat for many species including penguins and seals, which rely on it to access food and to breed.
An international team of researchers studied satellite records of sea ice extent and weather analyses starting in the late 1970s to understand why summer sea ice in the Weddell Sea area of Antarctica has reduced by a third over the last five years. They found that ice loss occurred due to a series of severe storms in the Antarctic summer of 2016/17, along with the re-appearance of an area of open water in the middle of the 'pack ice' (known as a polynya), which had not occurred since the mid-1970s.
Lead author Professor John Turner, a climate scientist at British Antarctic Survey, says:
"Antarctic sea ice continues to surprise us. In contrast to the Arctic, sea ice around the Antarctic had been increasing in extent since the 1970s, but then rapidly decreased to record low levels, with the greatest decline in the Weddell Sea. In summer, this area now has a third less sea ice, which will have implications for ocean circulation and the marine wildlife of the region that depend on it for their survival."
The ocean around Antarctica freezes and doubles the size of the continent in the austral winter, with the sea ice extent reaching over 18 million square kilometres by late September. Through the spring and summer, the sea ice almost completely melts in most parts of the Antarctic, with only the Weddell Sea retaining a significant amount of sea ice.
There are few storms around the Antarctic in the austral summer, but in December 2016, a number of intense and unseasonal storms developed in the Weddell Sea and drew warm air towards the Antarctic, melting a large amount of sea ice. The ice-free ocean absorbed energy from the Sun and then created a warm ocean temperature anomaly that still persists today.
The winter of 2016 also saw the development of a polynya in the Weddell Sea, a large area of open water within the sea ice, which also contributed to the overall decline in sea ice extent. This polynya was created by the strong winds associated with the storms and unprecedented warm ocean conditions.
This recent rapid sea ice loss is affecting both the Weddell Sea ecosystem and wider Antarctic wildlife, plants and animals alike. Many species, ranging from tiny ice algae and shrimp-like crustaceans called krill to seabirds, seals and whales, are highly adapted to the presence of sea ice. If the drastic changes observed continue, they will have repercussions throughout the food chain, from affecting nutrients to reducing the essential breeding and feeding habitat of vast numbers of animals, such as ice seals and some species of penguins.
Author and ecologist Professor Eugene Murphy from British Antarctic Survey says:
"The dramatic decline in sea ice observed in the Weddell Sea is likely to have significant impacts on the way the entire marine ecosystem functions. Understanding these wider consequences is of paramount importance, especially if the decline in ice extent continues."
Read more at Science Daily
Insight into the black hole at the center of our galaxy
Like most galaxies, the Milky Way hosts a supermassive black hole at its center. Called Sagittarius A*, the object has captured astronomers' curiosity for decades. And now there is an effort to image it directly.
Catching a good photo of the celestial beast will require a better understanding of what's going on around it, which has proved challenging due to the vastly different scales involved. "That's the biggest thing we had to overcome," said Sean Ressler, a postdoctoral researcher at UC Santa Barbara's Kavli Institute for Theoretical Physics (KITP), who just published a paper in the Astrophysical Journal Letters, investigating the magnetic properties of the accretion disk surrounding Sagittarius A*.
In the study, Ressler, fellow KITP postdoc Chris White and their colleagues, Eliot Quataert of UC Berkeley and James Stone at the Institute for Advanced Study, sought to determine whether the black hole's magnetic field, which is generated by in-falling matter, can build up to the point where it briefly chokes off this flow, a condition scientists call magnetically arrested. Answering this would require simulating the system all the way out to the closest orbiting stars.
The system in question spans seven orders of magnitude. The black hole's event horizon, or envelope of no return, reaches around 4 to 8 million miles from its center. Meanwhile, the stars orbit around 20 trillion miles away, or about as far as the sun's nearest neighboring star.
"So you have to track the matter falling in from this very large scale all the way down to this very small scale," said Ressler. "And doing that in a single simulation is incredibly challenging, to the point that it's impossible." The smallest events proceed on timescales of seconds while the largest phenomena play out over thousands of years.
This paper connects small-scale simulations, which are mostly theory-based, with large-scale simulations that can be constrained by actual observations. To achieve this, Ressler divided the task among models at three overlapping scales.
The first simulation relied on data from Sagittarius A*'s surrounding stars. Fortunately, the black hole's activity is dominated by just 30 or so Wolf-Rayet stars, which blow off tremendous amounts of material. "The mass loss from just one of the stars is larger than the total amount of stuff falling into the black hole during the same time," Ressler said. The stars spend only around 100,000 years in this dynamic phase before transitioning into a more stable stage of life.
Using observational data, Ressler simulated the orbits of these stars over the course of about a thousand years. He then used the results as the starting point for a simulation of medium-range distances, which evolve over shorter time scales. He repeated this for a simulation down to the very edge of the event horizon, where activity takes place in a matter of seconds. Rather than stitching together hard overlaps, this approach allowed Ressler to fade the results of the three simulations into one another.
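One way to picture that fading is a weighted blend across the overlap between two adjacent scales, rather than a hard cut at a boundary. The sketch below is our own schematic of the idea, not the code used in the study:

    import numpy as np

    def blend(outer, inner, r, r_in, r_out):
        """Fade an outer-scale field into an inner-scale field over the
        overlap region r_in <= r <= r_out, instead of a hard cut."""
        w = np.clip((r_out - r) / (r_out - r_in), 0.0, 1.0)  # 1 inside, 0 outside
        return w * inner + (1.0 - w) * outer

    # Toy example: two simulations of the same quantity on overlapping radii.
    r = np.linspace(1.0, 10.0, 50)
    outer_field = 1.0 / r    # large-scale run, trusted at large r
    inner_field = 1.2 / r    # small-scale run, trusted at small r
    combined = blend(outer_field, inner_field, r, r_in=3.0, r_out=6.0)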
"These are really the first models of the accretion at the smallest scales in [Sagittarius] A* that take into account the reality of the supply of matter coming from orbiting stars," said coauthor White.
And the technique worked splendidly. "It went beyond my expectations," Ressler remarked.
The results indicated that Sagittarius A* can become magnetically arrested. This came as a surprise to the team, since the Milky Way has a relatively quiet galactic center. Usually, magnetically arrested black holes have high-energy jets shooting particles away at relativistic speeds. But so far scientists have seen little evidence for jets around Sagittarius A*.
"The other ingredient that helps create jets is a rapidly spinning black hole," said White, "so this may be telling us something about the spin of Sagittarius A*."
Unfortunately, black hole spin is difficult to determine. Ressler modeled Sagittarius A* as a stationary object. "We don't know anything about the spin," he said. "There's a possibility that it's actually just not spinning."
Ressler and White next plan to model a spinning black hole, which is much more challenging. It immediately introduces a host of new variables, including spin rate, direction and tilt relative to the accretion disk. They will use data from the European Southern Observatory's GRAVITY interferometer to guide these decisions.
The team used the simulations to create images that can be compared to actual observations of the black hole. Scientists at the Event Horizon Telescope collaboration -- which made headlines in April 2019 with the first direct image of a black hole -- have already reached out requesting the simulation data in order to supplement their effort to photograph Sagittarius A*.
The Event Horizon Telescope effectively takes a time average of its observations, which results in a blurry image. This was less of an issue when the observatory had its sights on Messier 87*, because that black hole is around 1,000 times larger than Sagittarius A*, so it changes around 1,000 times more slowly.
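The factor of 1,000 follows from the fact that a black hole's shortest variability timescale scales with its mass, via the light-crossing-time scale GM/c^3. A quick check using commonly cited masses (our addition, not figures from the article):

    G, c, M_sun = 6.674e-11, 2.998e8, 1.989e30   # SI units

    def t_g(mass_in_suns):
        """Characteristic timescale GM/c^3 for a black hole of given mass."""
        return G * mass_in_suns * M_sun / c**3

    sgr_a_star = t_g(4.1e6)   # Sagittarius A*: ~4.1 million solar masses
    m87_star = t_g(6.5e9)     # Messier 87*: ~6.5 billion solar masses
    print(f"Sgr A*: {sgr_a_star:.0f} s, M87*: {m87_star / 3600:.1f} h, "
          f"ratio ~ {m87_star / sgr_a_star:.0f}")
    # Sgr A* flickers on timescales of seconds to minutes, M87* over hours
    # to days -- roughly the 1,000-fold difference described above.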
"It's like taking a picture of a sloth versus taking a picture of a hummingbird," Ressler explained. Their current and future results should help the consortium interpret their data on our own galactic center.
Read more at Science Daily
Study in Philadelphia links growth in tree canopy to decrease in human mortality
The first city-wide health impact assessment of the estimated effects of a tree canopy initiative on premature mortality in Philadelphia suggests that increased tree canopy could prevent between 271 and 400 premature deaths per year. The study by Michelle Kondo, a Philadelphia-based research social scientist with the U.S. Department of Agriculture Forest Service, and her partners suggests that increased tree canopy or green space could decrease morbidity and mortality for urban populations -- particularly in areas with lower socioeconomic status, where existing tree canopies tend to be the lowest.
The study, "Health impact assessment of Philadelphia's 2025 tree canopy cover goals," examined the potential impact of Greenworks Philadelphia, a plan to increase tree canopy to 30 percent across the city by 2025, on human mortality. The analysis is one of the first to estimate the number of preventable deaths based on physical activity, air pollution, noise, heat, and exposure to greenspaces using a tool developed by public health researchers in Spain and Switzerland called the Greenspace-Health Impact Assessment
Kondo and her partners estimated the annual number of preventable deaths associated with projected changes in tree canopy cover in Philadelphia between 2014 and 2025 under three scenarios of increased urban green space. They found that increasing urban tree canopy to the Greenworks Philadelphia goal of 30 percent in all neighborhoods could prevent 400 deaths annually, but lesser increases in tree canopy still resulted in reduced mortality. A 5 percentage point increase in tree canopy only in areas without trees could result in an annual reduction of 302 deaths citywide, researchers found, and a 10 percentage point increase in tree canopy cover across the city was associated with an estimated reduction of 376 deaths.
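Health impact assessments of this kind generally work by applying an exposure-response relative risk to baseline mortality and counting the deaths avoided under each scenario. The sketch below shows only that general logic, with invented inputs; it is not the Greenspace-HIA tool and does not use the study's parameters:

    def prevented_deaths(baseline_deaths, rr_per_unit, exposure_gain, unit):
        """Comparative-risk estimate of deaths avoided when exposure rises.

        rr_per_unit is a protective (<1) relative risk per `unit` of added
        exposure; the prevented fraction is 1 - RR**(exposure_gain / unit).
        """
        rr = rr_per_unit ** (exposure_gain / unit)
        return baseline_deaths * (1.0 - rr)

    # Invented illustrative inputs -- NOT the study's parameters.
    annual_deaths = 14000   # hypothetical citywide adult deaths per year
    rr = 0.96               # hypothetical relative risk per 10-point canopy gain
    print(prevented_deaths(annual_deaths, rr, exposure_gain=10, unit=10))
    # -> 560.0 with these made-up numbers; the study's estimates (271-400)
    # come from its own exposure-response functions and census-tract data.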
"This study supports the idea that increasing tree canopy and urban greening efforts are worthwhile, even at modest levels, as health-promoting and cost-saving measures," Kondo said.
Current tree canopy in Philadelphia ranges from 2 percent to 88 percent, with an average 20 percent urban tree canopy coverage based on 2014 data.
"In recent weeks, as residents of many cities experienced quarantine conditions, we experienced a heightened need for public green space," Kondo said. "While the COVID-19 pandemic has meant that we need to pay attention to our proximity to other people and take precautions to limit our contact, time outside in parks and forests has been critical to maintaining our mental and physical health."
Research partners included scientists from Universitat Pompeu Fabra, National Socio-Environmental Synthesis Center, Colorado State University, and Drexel University.
From Science Daily
The study, "Health impact assessment of Philadelphia's 2025 tree canopy cover goals," examined the potential impact of Greenworks Philadelphia, a plan to increase tree canopy to 30 percent across the city by 2025, on human mortality. The analysis is one of the first to estimate the number of preventable deaths based on physical activity, air pollution, noise, heat, and exposure to greenspaces using a tool developed by public health researchers in Spain and Switzerland called the Greenspace-Health Impact Assessment
Kondo and her partners estimated the annual number of preventable deaths associated with projected changes in tree canopy cover in Philadelphia between 2014 and 2025 under three scenarios of increased urban green space. They found that increasing urban tree canopy to the Greenworks Philadelphia goal of 30 percent in all neighborhoods could prevent 400 deaths annually, but lesser increases in tree canopy still resulted in reduced mortality. A 5 percentage point increase in tree canopy only in areas without trees could result in an annual reduction of 302 deaths citywide, researchers found, and a 10 percentage point increase in tree canopy cover across the city was associated with an estimated reduction of 376 deaths
"This study supports the idea that increasing tree canopy and urban greening efforts are worthwhile, even at modest levels, as health-promoting and cost-saving measures," Kondo said.
Current tree canopy in Philadelphia ranges from 2 percent to 88 percent, with an average 20 percent urban tree canopy coverage based on 2014 data.
"In recent weeks, as residents of many cities experienced quarantine conditions, we experienced a heightened need for public green space," Kondo said. "While the COVID-19 pandemic has meant that we need to pay attention to our proximity to other people and take precautions to limit our contact, time outside in parks and forests has been critical to maintaining our mental and physical health."
Research partners included scientists from Universitat Pompeu Fabra, National Socio-Environmental Synthesis Center, Colorado State University, and Drexel University.
From Science Daily
Hunting in savanna-like landscapes may have poured jet fuel on brain evolution
Northwestern University researchers recently discovered that complex landscapes -- dotted with trees, bushes, boulders and knolls -- might have helped land-dwelling animals evolve higher intelligence than their aquatic ancestors.
Compared to the vast emptiness of open water, land is rife with obstacles and occlusions. By providing prey with spaces to hide and predators with cover for sneak attacks, the habitats possible on land may have helped give rise to planning strategies -- rather than those based on habit -- for many of those animals.
But the researchers found that planning did not give our ancestors the upper hand in all landscapes. The researchers' simulations show there is a Goldilocks level of barriers -- not too few and not too many -- to a predator's perception, in which the advantage of planning really shines. In simple landscapes like open ground or packed landscapes like dense jungle, there was no advantage.
"All animals -- on land or in water -- had the same amount of time to evolve, so why do land animals have most of the smarts?" asked Northwestern's Malcolm MacIver, who led the study. "Our work shows that it's not just about what's in the head but also about what's in the environment."
And, no, dolphins and whales do not fall into the category of less intelligent sea creatures. Both descend from land mammals that recently (evolutionarily speaking) returned to the water.
The paper will be published June 16 in the journal Nature Communications.
It is the latest in a series of studies conducted by MacIver that advance a theory of how land animals evolved the ability to plan. In a follow-up study now underway with Dan Dombeck, a professor of neurobiology at Northwestern, MacIver will put the predictions generated by this computational study to the test through experiments with small animals in a robotic reconfigurable environment.
MacIver is a professor of biomedical and mechanical engineering in Northwestern's McCormick School of Engineering and a professor of neurobiology in the Weinberg College of Arts and Sciences. Ugurcan Mugan, a Ph.D. candidate in MacIver's laboratory, is the paper's first author.
Simulating survival
In previous work, MacIver showed that when animals started invading land 385 million years ago, they gained the ability to see around a hundred times farther than they could in water. MacIver hypothesized that being a predator or a prey in an environment where animals can see so much farther might require more brain power than hunting through empty, open water. However, the supercomputer simulations for the new study (the equivalent of 35 years of calculations on a single PC) revealed that although seeing farther is necessary for planning to be advantageous, it is not sufficient. Instead, only a combination of long-range vision and landscapes with a mix of open areas and more densely vegetated zones resulted in a clear win for planning.
"We speculated that moving onto land poured jet fuel on the evolution of the brain as it may have advantaged the hardest cognitive operation there is: Envisioning the future," MacIver said. "It could explain why we can go out for seafood, but seafood can't go out for us."
To test this hypothesis, MacIver and his team developed computational simulations to test the survival rates of prey being actively hunted by a predator under two different decision-making strategies: Habit-based (automatic, such as entering a password that you have memorized) and plan-based (imagining several scenarios and selecting the best one). The team created a simple, open world without visual barriers to simulate an aquatic world. Then, they added objects of varying densities to simulate land.
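A drastically simplified version of such a simulation fits in a few lines: prey on a grid either flees by habit (always the same direction) or "plans" by evaluating each candidate move against the predator's position. Everything below, from the grid size to the scoring rule, is our own toy construction rather than the study's model, which also accounts for line-of-sight through occlusions:

    import random

    MOVES = [(0, 1), (0, -1), (1, 0), (-1, 0)]

    def step(pos, move, blocked, size):
        """Apply a move, staying put if it would leave the grid or hit an obstacle."""
        x, y = pos[0] + move[0], pos[1] + move[1]
        return (x, y) if 0 <= x < size and 0 <= y < size and (x, y) not in blocked else pos

    def trial(density, planner, size=15, steps=40):
        blocked = {(x, y) for x in range(size) for y in range(size)
                   if random.random() < density}
        prey, pred = (1, 1), (size - 2, size - 2)
        blocked -= {prey, pred}
        for _ in range(steps):
            if planner:   # plan: evaluate each move, keep the one farthest from the predator
                prey = max((step(prey, m, blocked, size) for m in MOVES),
                           key=lambda p: abs(p[0] - pred[0]) + abs(p[1] - pred[1]))
            else:         # habit: always flee in the same fixed direction
                prey = step(prey, (1, 0), blocked, size)
            pred = min((step(pred, m, blocked, size) for m in MOVES),
                       key=lambda p: abs(p[0] - prey[0]) + abs(p[1] - prey[1]))
            if pred == prey:
                return False   # caught
        return True            # survived

    for density in (0.0, 0.2, 0.5):
        for planner in (False, True):
            rate = sum(trial(density, planner) for _ in range(200)) / 200
            print(f"density {density:.1f}, {'plan' if planner else 'habit'}: "
                  f"survival ~ {rate:.2f}")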
Survival of the smartest
"When defining complex cognition, we made a distinction between habit-based action and planning," MacIver said. "The important thing about habit is that it is inflexible and outcome independent. That's why you keep entering your old password for a while after changing it. In planning, you have to imagine different futures and choose the best potential outcome."
In the simple aquatic and terrestrial environments examined in the study, survival rate was low both for prey that used habit-based actions and those that had the capability to plan. The same was true of highly packed environments, such as coral reefs and dense rainforests.
"In those simple open or highly packed environments, there is no benefit to planning," MacIver said. "In the open aquatic environments, you just need to run in the opposite direction and hope for the best. While in the highly packed environments, there are only a few paths to take, and you are not able to strategize because you can't see far. In these environments, we found that planning does not improve your chances of survival."
The Goldilocks landscape
When patches of vegetation and topography are interspersed with wide open areas similar to a savanna, however, simulations showed that planning results in a huge survival payoff compared to habit-based movements. Because planning increases the chance of survival, evolution would have selected for the brain circuitry that allowed animals to imagine future scenarios, evaluate them and then enact one.
"With patchy landscapes, there is an interplay of transparent and opaque regions of space and long-range vision, which means that your movement can hide or reveal your presence to an adversary," MacIver said. "Terra firma becomes a chess board. With every movement, you have a chance to unfurl a strategy.
"Interestingly," he noted, "when we split off from life in the trees with chimpanzees nearly seven million years ago and quickly quadrupled in brain size, paleoecology studies point to our having invaded patchy landscapes, similar to those our study highlights, as giving the biggest payoff for strategic thinking."
Read more at Science Daily
Jun 16, 2020
As many as six billion Earth-like planets in our galaxy, according to new estimates
To be considered Earth-like, a planet must be rocky, roughly Earth-sized and orbit a Sun-like (G-type) star. It also has to orbit in the habitable zone of its star -- the range of distances from a star in which a rocky planet could host liquid water, and potentially life, on its surface.
"My calculations place an upper limit of 0.18 Earth-like planets per G-type star," says UBC researcher Michelle Kunimoto, co-author of the new study in The Astronomical Journal. "Estimating how common different kinds of planets are around different stars can provide important constraints on planet formation and evolution theories, and help optimize future missions dedicated to finding exoplanets."
According to UBC astronomer Jaymie Matthews: "Our Milky Way has as many as 400 billion stars, with seven per cent of them being G-type. That means less than six billion stars may have Earth-like planets in our Galaxy."
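The headline figure follows directly from the quoted numbers (a quick check):

    stars_in_galaxy = 400e9   # quoted number of Milky Way stars
    g_type_fraction = 0.07    # quoted share that are G-type
    per_g_star = 0.18         # Kunimoto's upper limit per G-type star

    estimate = stars_in_galaxy * g_type_fraction * per_g_star
    print(f"~{estimate:.1e} Earth-like planets")   # ~5.0e9, "as many as six billion"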
Previous estimates of the frequency of Earth-like planets range from roughly 0.02 potentially habitable planets per Sun-like star, to more than one per Sun-like star.
Typically, planets like Earth are more likely to be missed by a planet search than other types, as they are so small and orbit so far from their stars. That means that a planet catalogue represents only a small subset of the planets that are actually in orbit around the stars searched. Kunimoto used a technique known as 'forward modelling' to overcome these challenges.
"I started by simulating the full population of exoplanets around the stars Kepler searched," she explained. "I marked each planet as 'detected' or 'missed' depending on how likely it was my planet search algorithm would have found them. Then, I compared the detected planets to my actual catalogue of planets. If the simulation produced a close match, then the initial population was likely a good representation of the actual population of planets orbiting those stars."
Kunimoto's research also shed more light on one of the outstanding questions in exoplanet science today: the 'radius gap' of planets. The radius gap shows that it is uncommon for planets with orbital periods less than 100 days to have a size between 1.5 and two times that of Earth. She found that the radius gap exists over a much narrower range of orbital periods than previously thought. Her observational results can provide constraints on planet evolution models that explain the radius gap's characteristics.
Read more at Science Daily
Discovery of oldest bow and arrow technology in Eurasia
The origins of human innovation have traditionally been sought in the grasslands and coasts of Africa or the temperate environments of Europe. More extreme environments, such as the tropical rainforests of Asia, have been largely overlooked, despite their deep history of human occupation. A new study provides the earliest evidence for bow-and-arrow use, and perhaps the making of clothes, outside of Africa ~48-45,000 years ago -- in the tropics of Sri Lanka.
The island of Sri Lanka in the Indian Ocean, just south of the Indian subcontinent, is home to the earliest fossils of our species, Homo sapiens, in South Asia. It also preserves clear evidence for human occupation and the use of tropical rainforest environments outside of Africa from ~48,000 to 3,000 years ago -- refuting the idea that these supposedly resource-poor environments acted as barriers for migrating Pleistocene humans. The question as to exactly how humans obtained rainforest resources -- including fast-moving food sources like monkeys and squirrels -- remains unresolved.
In this new study, published in Science Advances, an international team of researchers from the Max Planck Institute for the Science of Human History (MPI-SHH) in Germany, Griffith University in Australia and the Department of Archaeology, Government of Sri Lanka, present evidence for the earliest use of bow-and-arrow technologies by humans anywhere outside of Africa. At ~48,000 years old, these tools are earlier than the first similar technology found in Europe. Clear evidence for use on the preserved bone arrowheads shows that they were likely used for hunting difficult-to-catch rainforest prey. Not only that, but the scientists show that other bone tools may have been used for making nets or clothing in tropical settings, dramatically altering traditional assumptions about how certain human innovations were linked with specific environmental requirements.
Hunting in the open and sheltering from the cold?
European cultural products in the form of cave art, amazingly detailed bone carvings, bone tool technologies, and tailored clothing have been frequently held up as the pinnacle of Late Pleistocene human cultural development. There, symbolic and technological innovations have been seen as key survival mechanisms equipping expanding populations to face cold northern climates. Meanwhile, discoveries of older bow-and-arrow technology and artistic or symbolic behaviors in open grassland or coastal settings in Africa have framed 'savannah' and marine environments, respectively, as key drivers behind early hunting and cultural experiments by Pleistocene humans in their evolutionary homeland.
As co-author of the new study, Patrick Roberts of the MPI-SHH argues that "this traditional focus has meant that other parts of Africa, Asia, Australasia, and the Americas have often been side-lined in discussions of the origins of material culture, such as novel projectile hunting methods or cultural innovations associated with our species." Nevertheless, the last twenty years have highlighted how Pleistocene humans occupied and adapted to a variety of extreme environments as they migrated beyond Africa, including deserts, high-altitude settings and tropical rainforests such as those of Sri Lanka.
A tropical home
The new study saw scientists turn to the beautifully preserved material culture from the cave of Fa-Hien Lena, deep in the heart of Sri Lanka's Wet Zone forests. As co-author Oshan Wedage, PhD at MPI-SHH, states, "Fa-Hien Lena has emerged as one of South Asia's most important archaeological sites since the 1980s, preserving remains of our species, their tools, and their prey in a tropical context." Some of the main finds from the site include remarkable single- and double-pointed bone tools that scientists had suspected were used in the exploitation of tropical resources. Direct proof had been lacking, however, in the absence of detailed high-powered microscopic analysis.
Michelle Langley of Griffith University, the lead author of the new study, is an expert in the study of microscopic traces of tool use and the creation of symbolic material culture in Pleistocene contexts. Applying cutting edge methods to the Fa-Hien Lena material confirmed the researchers' hypothesis. As Langley states, "the fractures on the points indicate damage through high-powered impact -- something usually seen in the use of bow-and-arrow hunting of animals. This evidence is earlier than similar findings in Southeast Asia 32,000 years ago and is currently the earliest clear evidence for bow-and-arrow use beyond the African continent."
The evidence for early human innovation did not stop there. Applying the same microscopic approach to other bone tools, the team identified implements which seem to have been associated with freshwater fishing in nearby tropical streams, as well as the working of fiber to make nets or clothing. "We also found clear evidence for the production of colored beads from mineral ochre and the refined making of shell beads traded from the coast, at a similar age to other 'social signaling' materials found in Eurasia and Southeast Asia, roughly 45,000 years ago," says Michelle Langley. Together, this reveals a complex, early human social network in the tropics of South Asia.
A flexible toolkit for new hunting grounds
The new study highlights that archaeologists can no longer link specific technological, symbolic, or cultural developments in Pleistocene humans to a single region or environment. "The Sri Lankan evidence shows that the invention of bows-and-arrows, clothing, and symbolic signaling occurred multiple times and in multiple different places, including within the tropical rainforests of Asia," says co-author Michael Petraglia of the MPI-SHH. In addition to insulation in cold environments, clothes may have also helped against tropical mosquitoes, "and instead of just hunting large grassland mammals," adds zooarchaeologist Noel Amano, another MPI-SHH co-author, "bows and arrows helped humans procure small, tree-dwelling primates and rodents."
Read more at Science Daily
The island of Sri Lanka in the Indian Ocean, just south of the Indian subcontinent, is home to the earliest fossils of our species, Homo sapiens, in South Asia. It also preserves clear evidence for human occupation and the use of tropical rainforest environments outside of Africa from ~48,000 to 3,000 years ago -- refuting the idea that these supposedly resource-poor environments acted as barriers for migrating Pleistocene humans. The question as to exactly how humans obtained rainforest resources -- including fast-moving food sources like monkeys and squirrels -- remains unresolved.
In this new study, published in Science Advances, an international team of researchers from the Max Planck Institute for the Science of Human History (MPI-SHH) in Germany, Griffith University in Australia and the Department of Archaeology, Government of Sri Lanka, present evidence for the earliest use of bow-and-arrow technologies by humans anywhere outside of Africa. At ~48,000 years old, these tools are earlier than the first similar technology found in Europe. Clear evidence for use on the preserved bone arrowheads shows that they were likely used for hunting difficult-to-catch rainforest prey. Not only that, but the scientists show that other bone tools may have been used for making nets or clothing in tropical settings, dramatically altering traditional assumptions about how certain human innovations were linked with specific environmental requirements.
Hunting in the open and sheltering from the cold?
European cultural products in the form of cave art, amazingly detailed bone carvings, bone tool technologies, and tailored clothing have been frequently held up as the pinnacle of Late Pleistocene human cultural development. There, symbolic and technological innovations have been seen as key survival mechanisms equipping expanding populations to face cold northern climates. Meanwhile, discoveries of older bow-and-arrow technology and artistic or symbolic behaviors in open grassland or coastal settings in Africa have framed 'savannah' and marine environments, respectively, as key drivers behind early hunting and cultural experiments by Pleistocene humans in their evolutionary homeland.
As Patrick Roberts of the MPI-SHH, a co-author of the new study, argues, "this traditional focus has meant that other parts of Africa, Asia, Australasia, and the Americas have often been side-lined in discussions of the origins of material culture, such as novel projectile hunting methods or cultural innovations associated with our species." Nevertheless, the last twenty years have highlighted how Pleistocene humans occupied and adapted to a variety of extreme environments as they migrated beyond Africa, including deserts, high-altitude settings and tropical rainforests such as those of Sri Lanka.
A tropical home
The new study saw scientists turn to the beautifully preserved material culture from the cave of Fa-Hien Lena, deep in the heart of Sri Lanka's Wet Zone forests. As co-author Oshan Wedage, PhD at MPI-SHH, states, "Fa-Hien Lena has emerged as one of South Asia's most important archaeological sites since the 1980s, preserving remains of our species, their tools, and their prey in a tropical context." Some of the main finds from the site include remarkable single- and double-pointed bone tools that scientists had suspected were used in the exploitation of tropical resources. Direct proof had been lacking, however, in the absence of detailed high-powered microscopic analysis.
Michelle Langley of Griffith University, the lead author of the new study, is an expert in the study of microscopic traces of tool use and the creation of symbolic material culture in Pleistocene contexts. Applying cutting edge methods to the Fa-Hien Lena material confirmed the researchers' hypothesis. As Langley states, "the fractures on the points indicate damage through high-powered impact -- something usually seen in the use of bow-and-arrow hunting of animals. This evidence is earlier than similar findings in Southeast Asia 32,000 years ago and is currently the earliest clear evidence for bow-and-arrow use beyond the African continent."
The evidence for early human innovation did not stop there. Applying the same microscopic approach to other bone tools, the team identified implements which seem to have been associated with freshwater fishing in nearby tropical streams, as well as the working of fiber to make nets or clothing. "We also found clear evidence for the production of colored beads from mineral ochre and the refined making of shell beads traded from the coast, at a similar age to other 'social signaling' materials found in Eurasia and Southeast Asia, roughly 45,000 years ago," says Michelle Langley. Together, this reveals a complex, early human social network in the tropics of South Asia.
A flexible toolkit for new hunting grounds
The new study highlights that archaeologists can no longer link specific technological, symbolic, or cultural developments in Pleistocene humans to a single region or environment. "The Sri Lankan evidence shows that the invention of bows-and-arrows, clothing, and symbolic signaling occurred multiple times and in multiple different places, including within the tropical rainforests of Asia," says co-author Michael Petraglia of the MPI-SHH. In addition to insulation in cold environments, clothes may have also helped against tropical mosquitoes, "and instead of just hunting large grassland mammals," adds zooarchaeologist Noel Amano, another MPI-SHH co-author, "bows and arrows helped humans procure small, tree-dwelling primates and rodents."
Read more at Science Daily
Spectacular bird's-eye view? Hummingbirds see diverse colors humans can only imagine
Hummingbird feeding on flowers.
"Humans are color-blind compared to birds and many other animals," said Mary Caswell Stoddard, an assistant professor in the Princeton University Department of Ecology and Evolutionary Biology. Humans have three types of color-sensitive cones in their eyes -- attuned to red, green and blue light -- but birds have a fourth type, sensitive to ultraviolet light. "Not only does having a fourth color cone type extend the range of bird-visible colors into the UV, it potentially allows birds to perceive combination colors like ultraviolet+green and ultraviolet+red -- but this has been hard to test," said Stoddard.
To investigate how birds perceive their colorful world, Stoddard and her research team established a new field system for exploring bird color vision in a natural setting. Working at the Rocky Mountain Biological Laboratory (RMBL) in Gothic, Colorado, the researchers trained wild broad-tailed hummingbirds (Selasphorus platycercus) to participate in color vision experiments.
"Most detailed perceptual experiments on birds are performed in the lab, but we risk missing the bigger picture of how birds really use color vision in their daily lives," Stoddard said. "Hummingbirds are perfect for studying color vision in the wild. These sugar fiends have evolved to respond to flower colors that advertise a nectar reward, so they can learn color associations rapidly and with little training."
Stoddard's team was particularly interested in "nonspectral" color combinations, which involve hues from widely separated parts of the color spectrum, as opposed to blends of neighboring colors like teal (blue-green) or yellow (green-red). For humans, purple is the clearest example of a nonspectral color. Technically, purple is not in the rainbow: it arises when our blue (short-wave) and red (long-wave) cones are stimulated, but not green (medium-wave) cones.
While humans have just one nonspectral color -- purple -- birds can theoretically see up to five: purple, ultraviolet+red, ultraviolet+green, ultraviolet+yellow and ultraviolet+purple.
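To make the distinction concrete, here is a minimal sketch -- our own illustration, not anything from the study -- that encodes a color as the stimulation of the four avian cone types, ordered from shortest to longest wavelength, and flags a color as nonspectral when it stimulates non-adjacent cone types while skipping the ones in between:

```python
def is_nonspectral(stim):
    """stim = cone stimulation ordered (UV, blue, green, red), shortest to
    longest wavelength. Nonspectral = active cones separated by an inactive one."""
    active = [i for i, s in enumerate(stim) if s > 0]
    return any(b - a > 1 for a, b in zip(active, active[1:]))

print(is_nonspectral((0.0, 0.8, 0.0, 0.9)))  # True: blue+red, i.e. purple
print(is_nonspectral((0.9, 0.0, 0.7, 0.0)))  # True: ultraviolet+green
print(is_nonspectral((0.0, 0.0, 0.6, 0.5)))  # False: green+red neighbors blend into yellow
```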
Stoddard and her colleagues designed a series of experiments to test whether hummingbirds can see these nonspectral colors. Their results appear June 15 in the Proceedings of the National Academy of Sciences.
The research team, which included scientists from Princeton, the University of British Columbia (UBC), Harvard University, University of Maryland and RMBL, performed outdoor experiments each summer for three years. First they built a pair of custom "bird vision" LED tubes programmed to display a broad range of colors, including nonspectral colors like ultraviolet+green. Next they performed experiments in an alpine meadow frequently visited by local broad-tailed hummingbirds, which breed at the high-altitude site.
Each morning, the researchers rose before dawn and set up two feeders: one containing sugar water and the other plain water. Beside each feeder, they placed an LED tube. The tube beside the sugar water emitted one color, while the one next to the plain water emitted a different color. The researchers periodically swapped the positions of the rewarding and unrewarding tubes, so the birds could not simply use location to pinpoint a sweet treat. They also performed control experiments to ensure that the tiny birds were not using smell or another inadvertent cue to find the reward. Over the course of several hours, wild hummingbirds learned to visit the rewarding color. Using this setup, the researchers recorded over 6,000 feeder visits in a series of 19 experiments.
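One plausible way to analyze such feeder-visit tallies -- a sketch of the general logic, not the study's actual analysis pipeline -- is to treat each visit as a trial against the 50 percent chance level with a one-sided binomial test. The function and the visit counts below are illustrative assumptions:

```python
from scipy.stats import binomtest

def discriminates(visits_to_reward: int, total_visits: int, alpha: float = 0.05) -> bool:
    """True if visits to the rewarding color are significantly above chance."""
    result = binomtest(visits_to_reward, total_visits, p=0.5, alternative="greater")
    return result.pvalue < alpha

# Hypothetical tallies for one experiment: 228 of 310 visits went to the
# feeder marked by the rewarding color.
print(discriminates(228, 310))  # True -> above-chance discrimination
```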
The experiments revealed that hummingbirds can see a variety of nonspectral colors, including purple, ultraviolet+green, ultraviolet+red and ultraviolet+yellow. For example, hummingbirds readily distinguished ultraviolet+green from pure ultraviolet or pure green, and they discriminated between two different mixtures of ultraviolet+red light -- one redder, one less so.
"It was amazing to watch," said Harold Eyster, a UBC Ph.D. student and a co-author of the study. "The ultraviolet+green light and green light looked identical to us, but the hummingbirds kept correctly choosing the ultraviolet+green light associated with sugar water. Our experiments enabled us to get a sneak peek into what the world looks like to a hummingbird."
Even though hummingbirds can perceive nonspectral colors, appreciating how these colors appear to birds can be difficult. "It is impossible to really know how the birds perceive these colors. Is ultraviolet+red a mix of those colors, or an entirely new color? We can only speculate," said Ben Hogan, a postdoctoral research associate at Princeton and a co-author of the study.
"To imagine an extra dimension of color vision -- that is the thrill and challenge of studying how avian perception works," said Stoddard. "Fortunately, the hummingbirds reveal that they can see things we cannot."
"The colors that we see in the fields of wildflowers at our study site, the wildflower capital of Colorado, are stunning to us, but just imagine what those flowers look like to birds with that extra sensory dimension," said co-author David Inouye, who is affiliated with the University of Maryland and RMBL.
Finally, the research team analyzed a data set of 3,315 feather and plant colors. They discovered that birds likely perceive many of these colors as nonspectral, while humans do not. That said, the researchers emphasize that nonspectral colors are probably not particularly special relative to other colors. The wide variety of nonspectral colors available to birds is the result of their ancient four color-cone visual system.
Read more at Science Daily
Super-potent human antibodies protect against COVID-19 in animal tests
Coronavirus illustration.
The research, published today in Science, offers a paradigm of swift reaction to an emergent and deadly viral pandemic, and sets the stage for clinical trials and additional tests of the antibodies, which are now being produced as potential treatments and preventives for COVID-19.
"The discovery of these very potent antibodies represents an extremely rapid response to a totally new pathogen," says study co-senior author Dennis Burton, PhD, the James and Jessie Minor Chair in Immunology in the Department of Immunology & Microbiology at Scripps Research.
In principle, injections of such antibodies could be given to patients in the early stage of COVID-19 to reduce the level of virus and protect against severe disease. The antibodies also may be used to provide temporary, vaccine-like protection against SARS-CoV-2 infection for healthcare workers, elderly people and others who respond poorly to traditional vaccines or are suspected of a recent exposure to the coronavirus.
The project was led by groups at Scripps Research; IAVI, a nonprofit scientific research organization dedicated to addressing urgent, unmet global health challenges; and University of California San Diego School of Medicine.
"It has been a tremendous collaborative effort, and we're now focused on making large quantities of these promising antibodies for clinical trials," says co-lead author Thomas Rogers, MD, PhD, an adjunct assistant professor in the Department of Immunology & Microbiology at Scripps Research, and assistant professor of Medicine at UC San Diego.
An approach that's worked for other deadly viruses
Developing a treatment or vaccine for severe COVID-19 is currently the world's top public health priority. Globally, almost 8 million people have tested positive for SARS-CoV-2 infection, and more than 400,000 have died of severe COVID-19. The daily toll of new infections is still rising.
One approach to new viral threats is to identify, in the blood of recovering patients, antibodies that neutralize the virus's ability to infect cells.
These antibodies can then be mass-produced, using biotech methods, as a treatment that blocks severe disease and as a vaccine-like preventive that circulates in the blood for several weeks to protect against infection. This approach already has been demonstrated successfully against Ebola virus and the pneumonia-causing respiratory syncytial virus, commonly known as RSV.
Potent patient antibodies block the virus
For the new project, Rogers and his UC San Diego colleagues took blood samples from patients who had recovered from mild-to-severe COVID-19. In parallel, scientists at Scripps Research and IAVI developed test cells that express ACE2, the receptor that SARS-CoV-2 uses to get into human cells. In a set of initial experiments, the team tested whether antibody-containing blood from the patients could bind to the virus and strongly block it from infecting the test cells.
The scientists were able to isolate more than 1,000 distinct antibody-producing immune cells, called B cells, each of which produced a distinct anti-SARS-CoV-2 antibody. The team obtained the antibody gene sequences from these B cells so that they could produce the antibodies in the laboratory. By screening these antibodies individually, the team identified several that, even in tiny quantities, could block the virus in test cells, and one that could also protect hamsters against heavy viral exposure.
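In spirit, the screening step is a filter over measured potencies. The toy sketch below uses invented antibody names, numbers, and a cutoff purely for illustration; the actual screen measured neutralization in cell-based assays, not from a table like this.

```python
from dataclasses import dataclass

@dataclass
class Antibody:
    name: str
    ic50_ng_ml: float  # concentration needed to block half of infection events

# Fabricated screening results for three candidate antibodies.
candidates = [
    Antibody("mAb-001", 5200.0),
    Antibody("mAb-002", 8.5),    # potent: active in tiny quantities
    Antibody("mAb-003", 950.0),
]

POTENCY_CUTOFF = 50.0  # assumed threshold, in ng/mL
potent = [ab for ab in candidates if ab.ic50_ng_ml <= POTENCY_CUTOFF]
print([ab.name for ab in potent])  # ['mAb-002']
```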
All of this work -- including the development of the cell and animal infection models, and studies to discover where the antibodies of interest bind the virus -- was completed in less than seven weeks.
"We leveraged our institution's decades of expertise in antibody isolation and quickly pivoted our focus to SARS-CoV-2 to identify these highly potent antibodies," says study co-author Elise Landais, PhD, an IAVI principal scientist.
If further safety tests in animals and clinical trials in people go well, then conceivably the antibodies could be used in clinical settings as early as next January, the researchers say.
"We intend to make them available to those who need them most, including people in low- and middle-income countries," Landais says.
In the course of their attempts to isolate anti-SARS-CoV-2 antibodies from the COVID-19 patients, the researchers found one that can also neutralize SARS-CoV, the related coronavirus that caused the 2002-2004 outbreak of severe acute respiratory syndrome (SARS) in Asia.
"That discovery gives us hope that we will eventually find broadly neutralizing antibodies that provide at least partial protection against all or most SARS coronaviruses, which should be useful if another one jumps to humans," Burton says.
Read more at Science Daily
Jun 15, 2020
New light shed on intelligent life existing across the galaxy
One of the biggest and longest-standing questions in the history of human thought is whether there are other intelligent life forms within our Universe. Obtaining good estimates of the number of possible extraterrestrial civilizations has, however, been very challenging.
A new study led by the University of Nottingham and published today in The Astrophysical Journal has taken a new approach to this problem. Using the assumption that intelligent life forms on other planets in a similar way as it does on Earth, researchers have obtained an estimate for the number of intelligent communicating civilizations within our own galaxy -- the Milky Way. They calculate that there could be over 30 active communicating intelligent civilizations in our home Galaxy.
Professor of Astrophysics at the University of Nottingham, Christopher Conselice who led the research, explains: "There should be at least a few dozen active civilizations in our Galaxy under the assumption that it takes 5 billion years for intelligent life to form on other planets, as on Earth." Conselice also explains that, "The idea is looking at evolution, but on a cosmic scale. We call this calculation the Astrobiological Copernican Limit."
First author Tom Westby explains: "The classic method for estimating the number of intelligent civilizations relies on making guesses of values relating to life, whereby opinions about such matters vary quite substantially. Our new study simplifies these assumptions using new data, giving us a solid estimate of the number of civilizations in our Galaxy.
"The two Astrobiological Copernican limits are that intelligent life forms in less than 5 billion years, or after about 5 billion years -- similar to on Earth where a communicating civilization formed after 4.5 billion years. In the strong criteria, whereby a metal content equal to that of the Sun is needed (the Sun is relatively speaking quite metal rich), we calculate that there should be around 36 active civilizations in our Galaxy."
The research shows that the number of civilizations depends strongly on how long they are actively sending out signals of their existence into space, such as radio transmissions from satellites, television, etc. If other technological civilizations last as long as ours, which is currently 100 years old, then there will be about 36 ongoing intelligent technical civilizations throughout our Galaxy.
However, the average distance to these civilizations would be 17,000 light-years away, making detection and communication very difficult with our present technology. It is also possible that we are the only civilization within our Galaxy unless the survival times of civilizations like our own are long.
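A rough, back-of-the-envelope version of this kind of estimate fits in a few lines. The functional form below is a deliberate simplification, and every parameter except the 100-year signaling lifetime and the ~36 result quoted above is an invented placeholder, tuned only so the toy output lands near the paper's figure:

```python
def estimate_active_civilizations(n_stars: float,
                                  f_old_enough: float,
                                  f_metal_rich: float,
                                  f_life_bearing: float,
                                  signaling_years: float,
                                  window_years: float) -> float:
    """Candidate systems multiplied by the fraction of the available time
    window during which a civilization is actively 'on air'."""
    candidates = n_stars * f_old_enough * f_metal_rich * f_life_bearing
    return candidates * (signaling_years / window_years)

print(round(estimate_active_civilizations(
    n_stars=2e11,           # stars in the Milky Way (order of magnitude)
    f_old_enough=0.5,       # assumed fraction older than ~5 billion years
    f_metal_rich=0.5,       # assumed fraction with Sun-like metal content
    f_life_bearing=7.2e-3,  # tuned so the toy result lands near the paper's ~36
    signaling_years=100,    # lifetime of a communicating civilization, per the text
    window_years=1e9)))     # assumed window over which civilizations could arise
```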
Professor Conselice continues: "Our new research suggests that searches for extraterrestrial intelligent civilizations not only reveals the existence of how life forms, but also gives us clues for how long our own civilization will last. If we find that intelligent life is common then this would reveal that our civilization could exist for much longer than a few hundred years, alternatively if we find that there are no active civilizations in our Galaxy it is a bad sign for our own long-term existence. By searching for extraterrestrial intelligent life -- even if we find nothing -- we are discovering our own future and fate."
From Science Daily
Multi-ethnic study suggests vitamin K may offer protective health benefits in older age
A new, multi-ethnic study found older adults with low vitamin K levels were more likely to die within 13 years compared to those whose vitamin K levels were adequate. The results suggest vitamin K, a nutrient found in leafy greens and vegetable oils, may have protective health benefits as we age, according to the researchers.
The meta-analysis, involving nearly 4,000 Americans aged 54-76, one-third of whom were non-white, was led by researchers at the Jean Mayer USDA Human Nutrition Research Center on Aging at Tufts University (USDA HNRCA) and Tufts Medical Center and is published in The American Journal of Clinical Nutrition.
The research team categorized participants according to their vitamin K blood levels. They then compared risk of heart disease and risk of death across the categories over approximately 13 years of follow-up.
The results showed no significant associations between vitamin K levels and heart disease. However, the people with the lowest vitamin K levels had a 19 percent higher risk of death, compared to those with vitamin K levels that reflected adequate vitamin K intake.
Vitamin K is a nutrient that is important for maintaining healthy blood vessels. It is found in leafy greens, such as lettuce, kale and spinach, and in some vegetable oils, especially soybean and canola.
"The possibility that vitamin K is linked to heart disease and mortality is based on our knowledge about proteins in vascular tissue that require vitamin K to function. These proteins help prevent calcium from building up in artery walls, and without enough vitamin K, they are less functional," said first author Kyla Shea.
Shea is a scientist on the HNRCA's vitamin K team, long renowned for its work on the role of vitamin K in the prevention of chronic disease. Sarah Booth, a co-author on the study and director of the USDA HNRCA, developed the methodology for measuring vitamin K in blood. Her research team measured the vitamin K levels in the study participants and continues to generate data about vitamin K status in population and clinic-based studies.
"Similar to when a rubber band dries out and loses its elasticity, when veins and arteries are calcified, blood pumps less efficiently, causing a variety of complications. That is why measuring risk of death, in a study such as this, may better capture the spectrum of events associated with worsening vascular health," said last author Daniel Weiner, M.D., nephrologist at Tufts Medical Center, whose research includes vascular disease in people with impaired kidney function.
While this study adds to existing evidence that vitamin K may have protective health benefits, it cannot establish a causal relationship between low vitamin K levels and risk of death because it is observational. Additional studies are also needed to clarify why circulating vitamin K was associated with risk for death but not heart disease.
Methodology
The study is a meta-analysis, which combined data from participants in three ongoing studies: the Health, Aging, and Body Composition Study, the Multi-Ethnic Study of Atherosclerosis, and the Framingham Heart Study (Offspring Cohort). Vitamin K levels for participants in all three studies were measured after fasting, with the same test, and processed at the same laboratory (the vitamin K laboratory at the USDA HNRCA), minimizing the potential for laboratory-based variation. The test showed levels of circulating phylloquinone, the compound known as vitamin K1.
Participants on the blood thinner warfarin were excluded because vitamin K counteracts the anti-clotting effects of warfarin. All participants were free of heart disease at baseline and had vitamin K levels measured during a single medical exam that was part of each study's regular protocol.
The statistical analysis adjusted for age, gender, race, ethnicity, BMI, triglycerides, cholesterol levels, smoking status, and use of medications for diabetes or high blood pressure.
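A schematic of what such an adjusted time-to-event analysis can look like, using the lifelines library in Python. The data frame, column names, and values are placeholders (and far too few rows for a real fit, so lifelines will warn about the tiny sample); the study's actual modeling choices may differ.

```python
import pandas as pd
from lifelines import CoxPHFitter

# One invented row per participant: follow-up time, death indicator, an
# indicator for the lowest circulating-phylloquinone category, and (here)
# just one of the adjustment covariates listed in the text.
df = pd.DataFrame({
    "followup_years": [13.0, 9.2, 13.0, 4.5, 13.0, 7.8, 11.3, 13.0],
    "died":           [0,    1,   0,    1,   1,    0,   1,    0],
    "low_vitamin_k":  [0,    1,   0,    1,   0,    1,   1,    0],
    "age":            [62,   71,  58,   75,  69,   72,  73,   60],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="died")
# A hazard ratio near 1.19 on low_vitamin_k would correspond to the
# reported 19 percent higher risk of death.
cph.print_summary()
```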
Read more at Science Daily
Human portraits reveal shift in culture, cognition
Throughout history, portraits featuring the human profile have evolved to reflect changing cultural norms. A new study led by Helena Miton, a Santa Fe Institute Omidyar Fellow, and co-authored by Dan Sperber of Central European University and Mikołaj Hernik of UiT The Arctic University of Norway, shows that human cognition plays a critical role in the evolution of human portraiture.
"These cognitive factors cause greater spontaneous attention to what is in front of -- rather than behind -- a subject, Miton says. "Scenes with more space in front of a directed object are both produced more often and judged as more aesthetically pleasant. This leads to the prediction that, in profile-oriented human portraits, compositions with more space in front of depicted subjects (a 'forward bias') should be over-represented."
To test their prediction, the research team looked at 1,831 paintings by 582 unique European painters from the 15th to the 20th century. They not only found evidence that this forward bias -- where painters put more open space in front of their sitters than behind them -- was widespread, they also found evidence that the bias became stronger when cultural norms of spatial composition favoring centering became less stringent.
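One simple way to operationalize such a forward-bias measure, under conventions we are inventing here purely for illustration, is to compare the free canvas space in front of the sitter with the space behind:

```python
def forward_bias(sitter_x: float, facing_right: bool) -> float:
    """sitter_x is the sitter's horizontal position in [0, 1] from the left
    edge of a unit-width canvas. Positive values mean more free space in
    front of the sitter than behind."""
    space_right = 1.0 - sitter_x
    space_left = sitter_x
    front, behind = (space_right, space_left) if facing_right else (space_left, space_right)
    return front - behind

# Hypothetical compositions: a centered sitter scores 0; a sitter placed
# left of center and facing right scores positive.
print(forward_bias(0.5, facing_right=True))   # 0.0 -> centered
print(forward_bias(0.35, facing_right=True))  # 0.3 -> forward bias
```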
In the accompanying image, the portrait on the left (Adrian Brouwer, 1630) is an example of a composition with the sitter centered. On the right, a portrait by Pierre Auguste Renoir (1905) shows the forward bias, with more free space in front of the sitter than behind her. The study showed this type of spatial composition increased over time.
"Culture and cognition are two interacting domains," Miton explains. "With most cultural phenomena, you're going to have some kind of influence from cognition. Our idea is to work out how we identify these factors and how we work with that type of causality."
The research team identified cultural norms that favored centering portraits, especially in the earlier periods. These preferences clearly loosened over time, resulting in more diverse portrait composition.
The widespread presence of a forward bias was robust. Previous studies found some evidence of a forward bias in the production of a handful of painters, but these results suggest that this bias in spatial composition was widespread -- particularly remarkable since it goes against a cultural norm that favors centering sitters.
According to Miton, this research approach can be extended to quantify in a more general way (and with a more general painting data set) how much artistic norms loosen and how much variation increases over time. Beyond the art world, the approach can also look at the role cognition plays in other cultural phenomena, from writing systems to medical practices.
Read more at Science Daily
"These cognitive factors cause greater spontaneous attention to what is in front of -- rather than behind -- a subject, Miton says. "Scenes with more space in front of a directed object are both produced more often and judged as more aesthetically pleasant. This leads to the prediction that, in profile-oriented human portraits, compositions with more space in front of depicted subjects (a 'forward bias') should be over-represented."
To test their prediction, the research team looked at 1831 paintings by 582 unique European painters from the 15th to the 20th century. They not only found evidence that this forward bias -- where painters put more open space in front of their sitters than behind them -- was widespread, they also found evidence that the bias became stronger when cultural norms of spatial composition favoring centering became less stringent.
In the accompanying image, the portrait on the left (Adrian Brouwer, 1630) is an example of a composition with the sitter centered. On the right, a portrait by Pierre Auguste Renoir (1905) shows the forward bias, with more free space in front of the sitter than behind her. The study showed this type of spatial composition increased over time.
"Culture and cognition are two interacting domains," Miton explains. "With most cultural phenomena, you're going to have some kind of influence from cognition. Our idea is to work out how we identify these factors and how we work with that type of causality."
The research team identified cultural norms that favored centering portraits, especially in the earlier periods. These preferences clearly loosened over time, resulting in more diverse portrait composition.
The widespread presence of a forward bias was robust. Previous studies found some evidence of a forward bias in the production of a handful of painters, but these results suggest that this bias in spatial composition was widespread -- particularly remarkable since it goes against a cultural norm that favors centering sitters.
According to Miton, this research approach can be extended to quantify in a more general way (and with a more general painting data set) how much artistic norms loosen and how much variation increases over time. Beyond the art world, the approach can also look at the role cognition plays in other cultural phenomena, from writing systems to medical practices.
Read more at Science Daily
Tuberculosis vaccine strengthens immune system
A tuberculosis vaccine developed 100 years ago also makes vaccinated persons less susceptible to other infections. While this effect has been recognized for a long time, it is not known what causes it. Together with colleagues from Australia and Denmark, researchers from Radboud university medical center in Nijmegen and the University of Bonn have now presented a possible answer to this question. Their results are also interesting against the background of the Covid-19 pandemic: several studies are currently testing the use of the vaccine in preventing severe disease progression in populations at risk, such as hospital staff and elderly individuals. The study is published in the journal "Cell Host & Microbe".
The BCG vaccine (the abbreviation stands for Bacillus Calmette-Guérin) is the only vaccine that provides effective protection against infections with the tuberculosis bacterium. Since its first medical application in 1921, it has been used billions of times. An unexpected side effect became apparent: vaccinated individuals not only contracted tuberculosis far less frequently, but also other infections. One example comes from Guinea-Bissau in West Africa: there, the mortality of vaccinated newborns was almost 40 percent lower than that of unvaccinated babies.
A similar effect has now been observed with other vaccines, almost exclusively with those based on live pathogens. Experts also speak of "trained immunity": the capacity of the innate immune response to become more efficient independently of the type of reinfection. However, it is still largely unknown why this training effect can persist for years, even long after the immune cells that were circulating in the blood at the time of vaccination have died. Detailed studies on this topic were lacking, especially in humans; the current study fills this gap to a certain extent: "We vaccinated 15 volunteers with the BCG vaccine and administered a placebo to five more people for comparison," explains Prof. Dr. Mihai Netea from the Radboud university medical center in Nijmegen, the Netherlands. "Three months later, we took both blood and bone marrow samples from these individuals."
Some striking differences were found between the two groups. For instance, the immune cells in the blood of vaccinated individuals released significantly more inflammatory messengers. These so-called cytokines strengthen the effectiveness of the immune defense; for example, they call on other immune cells for help and direct them to the site of infection. Moreover, the immune cells of vaccinated individuals showed activity of completely different genes than in the placebo group, especially those required for cytokine production.
Easier access to genes for infection defense
There are many different types of immune cells in the blood. All of them are produced in the bone marrow. This is where the so-called hematopoietic stem cells grow, the "mothers" of all immune cells. The BCG vaccination also causes long-term changes in their genetic program. "We have found that after vaccination, certain genetic material becomes more accessible, which means that it can be read by the cells more frequently," explains Prof. Dr. Andreas Schlitzer from the LIMES Institute at the University of Bonn.
Metaphorically speaking, every human cell contains in its nucleus a huge library of tens of thousands of books, the genes. When the cell wants to produce a certain molecule, for example a cytokine, it looks up its assembly instructions in the corresponding book. But not all of the books can be taken out so easily: some are usually under lock and key. The BCG vaccination now makes some of these books available, probably for many months or years. These include those that are needed for increased cytokine production. "This explains why the vaccination results in an enhanced immune response in the long term," said Netea. "This may well be the basis for the lasting impact of the training effect."
Another aspect is also interesting: most of the released books, i.e. the genes that become more accessible after the vaccine has been administered, are additionally controlled by a molecule called HNF. This "hepatic nuclear factor" ensures that the immune cells use their newly acquired power prudently, meaning that they only release cytokines when there is actually a pathogen that needs to be attacked. "It may be possible to use this finding therapeutically to specifically manipulate the trained immunity," explains LIMES researcher Prof. Schlitzer.
The results are also of interest against the background of the current Covid-19 pandemic: the researchers hope that a BCG vaccination might have a positive effect on the disease. Although the trained immune system probably cannot prevent infection with the virus, it may reduce the risk of a severe course. This might benefit especially the particularly vulnerable medical staff. Several large-scale medical studies are currently investigating this question, among others two at Radboud university medical center Nijmegen, and another at the University of Melbourne, which is also a partner in the current project.
Read more at Science Daily
Jun 14, 2020
Face masks critical in preventing spread of COVID-19
Crowd of people wearing medical masks.
Renyi Zhang, Texas A&M Distinguished Professor of Atmospheric Sciences and the Harold J. Haynes Chair in the College of Geosciences, and colleagues from the University of Texas, the University of California-San Diego and the California Institute of Technology have had their work published in the current issue of PNAS (Proceedings of the National Academy of Sciences).
The team examined the chances of COVID-19 infection and how the virus is easily passed from person to person. From trends and mitigation procedures in China, Italy and New York City, the researchers found that using a face mask reduced the number of infections by more than 78,000 in Italy from April 6-May 9 and by over 66,000 in New York City from April 17-May 9.
"Our results clearly show that airborne transmission via respiratory aerosols represents the dominant route for the spread of COVID-19," Zhang said. "By analyzing the pandemic trends without face-covering using the statistical method and by projecting the trend, we calculated that over 66,000 infections were prevented by using a face mask in little over a month in New York City. We conclude that wearing a face mask in public corresponds to the most effective means to prevent inter-human transmission.
"This inexpensive practice, in conjunction with social distancing and other procedures, is the most likely opportunity to stop the COVID-19 pandemic. Our work also highlights that sound science is essential in decision-making for the current and future public health pandemics."
One of the paper's co-authors, Mario Molina, is a professor at the University of California-San Diego and a co-recipient of the 1995 Nobel Prize in Chemistry for his role in understanding the threat to the Earth's ozone layer of human-made halocarbon gases.
"Our study establishes very clearly that using a face mask is not only useful to prevent infected coughing droplets from reaching uninfected persons, but is also crucial for these uninfected persons to avoid breathing the minute atmospheric particles (aerosols) that infected people emit when talking and that can remain in the atmosphere tens of minutes and can travel tens of feet," Molina said.
Zhang said that many people in China have worn face masks for years, mainly because of the bad air quality of the country.
"So people there are sort of used to this," he said. "Mandated face-covering helped China in containing the COVID-19 outbreak."
Zhang said the results should send a clear message to people worldwide -- wearing a face mask is essential in fighting the virus.
Read more at Science Daily
Up to 45 percent of SARS-CoV-2 infections may be asymptomatic
Coronavirus in crowd, photo concept.
The findings, published in Annals of Internal Medicine, suggest that asymptomatic infections may account for as much as 45 percent of all COVID-19 cases, playing a significant role in the early and ongoing spread of COVID-19. The report highlights the need for expansive testing and contact tracing to mitigate the pandemic.
"The silent spread of the virus makes it all the more challenging to control," says Eric Topol, MD, founder and director of the Scripps Research Translational Institute and professor of Molecular Medicine at Scripps Research. "Our review really highlights the importance of testing. It's clear that with such a high asymptomatic rate, we need to cast a very wide net, otherwise the virus will continue to evade us."
Together with behavioral scientist Daniel Oran, Topol collected information from testing studies on 16 diverse cohorts from around the world. These datasets -- gathered via keyword searches of PubMed, bioRxiv and medRxiv, as well as Google searches of relevant news reports -- included data on nursing home residents, cruise ship passengers, prison inmates and various other groups.
"What virtually all of them had in common was that a very large proportion of infected individuals had no symptoms," says Oran. "Among more than 3,000 prison inmates in four states who tested positive for the coronavirus, the figure was astronomical: 96 percent asymptomatic."
The review further suggests that asymptomatic individuals are able to transmit the virus for an extended period of time, perhaps longer than 14 days. Viral loads are very similar in people with or without symptoms, but it remains unclear whether their infectiousness is of the same magnitude; resolving that question will require large-scale studies that include sufficient numbers of asymptomatic people.
The authors also conclude that the absence of symptoms may not imply an absence of harm. CT scans of 76 asymptomatic individuals from the Diamond Princess cruise ship appeared to show significant subclinical lung abnormalities in 54 percent of them, raising the possibility that SARS-CoV-2 infection can affect lung function in ways that are not immediately apparent. The scientists say further research is needed to confirm the potential significance of this finding.
The authors also acknowledge that the lack of longitudinal data makes distinguishing between asymptomatic and presymptomatic individuals difficult. An asymptomatic individual is someone who is infected with SARS-CoV-2, but never develops symptoms of COVID-19, while a presymptomatic person is similarly infected, but will eventually develop symptoms. Longitudinal testing, which refers to repeated testing of individuals over time, would help differentiate between the two.
Read more at Science Daily