When picturing dinosaur tracks, most people imagine a perfectly preserved mold of a foot on a firm layer of earth. But what if that dinosaur was running through mud, sinking several inches -- or even up to its ankles -- into the ground as it moved?
Using sophisticated X-ray-based technology, a team of Brown University researchers tracked the movements of guineafowl to investigate how their feet move below ground through various substrates and what those findings could mean for understanding fossil records left behind by dinosaurs.
They found that regardless of variability in the substrate, or of the guineafowl moving at different speeds, sinking to different depths or engaging in different behaviors, the birds' overall foot movement remained the same: the toes spread as the foot stepped onto the substrate surface, remained spread as the foot sank, collapsed and drew back as the foot was lifted, and exited the substrate in front of the point of entry, creating a looping pattern as the birds walked.
And part of what that means is that fossilized dinosaur tracks that look distinct from each other, and appear to be from different species, might instead come from the same dinosaurs.
"This is the first study that's really shown how the bird foot is moving below ground, showing the patterns of this subsurface foot motion and allowing us to break down the patterns that we're seeing in a living animal that has feet similar to those of a dinosaur," said Morgan Turner, a Ph.D. candidate at Brown in ecology and evolutionary biology and lead author of the research. "Below ground, or even above ground, they're responding to these soft substrates in a very similar way, which has potentially important implications for our ability to study the movement of these animals that we can't observe directly anymore."
The findings were published on Wednesday, July 1, in the Royal Society journal Biology Letters.
To make the observations, Turner and her colleagues, Professor of Biology and Medical Science Stephen Gatesy and Peter Falkingham, now at Liverpool John Moores University, used a 3D-imaging technology developed at Brown called X-ray Reconstruction of Moving Morphology (XROMM). The technology combines CT scans of a skeleton with high-speed X-ray video, aided by tiny implanted metal markers, to create visualizations of how bones and muscles move inside humans and animals. In the study, the team used XROMM to watch guineafowl move through substrates of different hydration and compactness, analyzing how their feet moved underground and the tracks left behind.
Sand, typically a dense combination of quartz and silica, does not lend itself well to X-ray imaging, so the team used poppy seeds to emulate sand. Muds were made using small glass bubbles, adding various amounts of clay and water across 107 trials to achieve different consistencies and realistic tracks.
They added metal markers underneath the claws of the guineafowl to allow for tracking in 3D space. It's these claw tips that the researchers think are least disturbed by mud flow and other variables that can impact and distort the form of the track.
Despite the variation, the researchers observed a consistent looping pattern.
"The loops by themselves I don't think are that interesting," Gatesy said. "People are like, 'That's nice. Birds do this underground. So what?' It was only when [Turner] went back into it and said, 'What if we slice those motion trails at different depths as if they were footprints?' Then we made the nice connection to the fossils."
By "slicing" through the 3D images of the movement patterns at different depths, the researchers found similarities between the guineafowl tracks and fossilized dinosaur tracks.
"We don't know what these dinosaurs were doing, we don't know what they were walking through exactly, we don't know how big they were or how deep they were sinking, but we can make this really strong connection between how they were moving and some level of context for where this track is being sampled from within that movement," Turner said.
By recognizing the movement patterns, as well as the entry and exit point of the foot through various substrates, the team says they're able to gain a better understanding of what a dinosaur track could look like.
"You end up generating this big diversity of track shapes from a very simple foot shape because you're sampling at different depths and it's moving in complicated ways," Gatesy said. "Do we really have 40 different kinds of creatures, each with a differently shaped foot, or are we looking at some more complicated interaction that leaves behind these remnants that are partly anatomical and partly motion and partly depth?"
To further their research, the team spent time at the Beneski Museum of Natural History at Amherst College in Massachusetts, which is home to an expansive collection of penetrative tracks discovered in the 1800s by geologist Edward Hitchcock.
Hitchcock originally believed that his collection housed fossil tracks from over 100 distinct animals. Because of the team's work with XROMM, Gatesy now thinks it's possible that at least half of those tracks are actually from the same dinosaurs, just moving their feet in slightly different ways or sampled at slightly different depths.
"Going to museum together and being able to pick out these features and say, 'We think this track is low in the loop and we think this one is high,' that was the biggest moment of insight for me," Turner said.
Turner says she hopes their research can lead to a greater interest in penetrative tracks, even if they seem a little less pretty or polished than the tracks people are used to seeing in museums.
Read more at Science Daily
Jul 2, 2020
First confirmed underwater Aboriginal archaeological sites found off Australian coast
Ancient submerged Aboriginal archaeological sites await underwater rediscovery off the coast of Australia, according to a study published July 1, 2020 in the open-access journal PLOS ONE by Jonathan Benjamin of Flinders University, Adelaide, Australia and colleagues.
At the end of the Ice Age, sea level was much lower than today, and the Australian coastline was 160 kilometers farther offshore. When the ice receded and sea level rose to its current level, approximately two million square kilometers of Australian land became submerged where Aboriginal peoples had previously lived. Thus, it is likely that many ancient Aboriginal sites are currently underwater.
In this study, Benjamin and colleagues report the results of several field campaigns between 2017 and 2019 during which they applied a series of techniques for locating and investigating submerged archaeological sites, including aerial and underwater remote-sensing technologies as well as direct investigation by divers. They investigated two sites off the Murujuga coastline of northwest Australia. In Cape Bruguieres Channel, divers identified 269 artefacts dating back at least 7,000 years, and a single artefact dated to at least 8,500 years old was identified in a freshwater spring in Flying Foam Passage. These are the first confirmed underwater archaeological sites found on Australia's continental shelf.
These findings demonstrate the utility of these exploratory techniques for locating submerged archaeological sites. The authors hope that these techniques can be expanded upon in the future for systematic recovery and investigation of ancient Aboriginal cultural artefacts. They further urge that future exploration rely not only on careful and safe scientific procedures, but also on legislation to protect and manage Aboriginal cultural heritage along the Australian coastline.
Benjamin says, "Managing, investigating and understanding the archaelogy of the Australian continental shelf in partnership with Aboriginal and Torres Strait Islander traditional owners and custodians is one of the last frontiers in Australian archaeology." He adds, "Our results represent the first step in a journey of discovery to explore the potential of archaeology on the continental shelves which can fill a major gap in the human history of the continent."
Read more at Science Daily
Climate change threat to tropical plants
Tropical plants closer to the equator are most at risk from climate change because it is expected to become too hot for many species to germinate in the next 50 years, UNSW researchers have found.
Their study analysed almost 10,000 records for more than 1300 species from the Kew Gardens' global seed germination database.
The research, published recently in the journal Global Ecology and Biogeography, was the first to look at the big-picture impact of climate change on such a large number of plant species worldwide.
Lead author Alex Sentinella, UNSW PhD researcher, said past research had found that animal species closer to the equator would be more at risk from climate change.
"The thought was that because tropical species come from a stable climate where it's always warm, they can only cope with a narrow range of temperatures -- whereas species from higher latitudes can cope with a larger range of temperatures because they come from places where the weather varies widely," Mr Sentinella said.
"However, this idea had never been tested for plants.
"Because climate change is a huge issue globally, we wanted to understand these patterns on a global scale and build upon the many studies on plants at an individual level in their environment."
Seeds a key indicator of survival
The researchers examined seed germination data from the Millennium Seed Bank Partnership Data Warehouse, hosted by Kew Royal Botanic Gardens in London, to quantify global patterns in germination temperature.
They analysed 9737 records for 1312 plant species from every continent except Antarctica and excluded agricultural crops.
Mr Sentinella said they chose seed data because it was widely available and relevant to the ability of a species to cope with different temperatures.
"With seeds, you can experiment on them quickly, there are a lot of studies about them and importantly, germination directly relates to how a species will survive, because if the seed doesn't germinate the plant won't live," he said.
"So, we collated the data from the Kew Gardens database, examined all experiments on the same species from the same locations, and then determined the range of temperatures each species could tolerate in order to survive."
The researchers also examined climate data for the same locations as the plant species used in the study.
They looked at current temperature -- the average temperature of the warmest three months from 1970 to 2000 -- and predicted temperature for 2070.
The researchers then compared the temperatures the plants were experiencing now with the forecasted 2070 temperatures.
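The core of that comparison reduces to checking each species' projected warm-season temperature against its germination limits. Below is a minimal sketch of the logic with invented species and numbers; the actual analysis drew on 9,737 real germination records.

```python
# Toy version of the study's screening logic. All names and values are
# invented for illustration; only the three-way outcome mirrors the paper:
# above the germination ceiling, above the optimum, or within tolerance.
species = {
    # name: (max_germination_C, optimal_germination_C, projected_2070_C)
    "tropical_sp_A": (34.0, 28.0, 35.2),
    "tropical_sp_B": (36.0, 30.0, 34.8),
    "temperate_sp_C": (30.0, 22.0, 24.5),
}

for name, (t_max, t_opt, t_2070) in species.items():
    if t_2070 > t_max:
        status = "above ceiling -- unlikely to germinate"
    elif t_2070 > t_opt:
        status = "above optimum -- reduced germination expected"
    else:
        status = "within tolerance"
    print(f"{name}: {status}")
```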
Tropical plants to hit or exceed temperature limits
The study found that tropical plants do not have narrower temperature tolerances, but they are more at risk from global warming because it would bring them close to their maximum seed germination temperatures.
Mr Sentinella said, on average, the closer a plant was to the equator, the more at risk it would be of exceeding its temperature ceiling by 2070.
"These plants could be more at risk because they are near their upper limits. So, even a small increase in temperature from climate change could push them over the edge," he said.
"The figures are quite shocking because by 2070, more than 20 per cent of tropical plant species, we predict, will face temperatures above their upper limit, which means they won't germinate, and so can't survive."
Mr Sentinella said the researchers also found that more than half of tropical species are expected to experience temperatures exceeding their optimum germination temperatures.
"That's even worse because if those plants can survive it would be at a reduced rate of germination and therefore, they might not be as successful," he said.
"If a seed's germination rate is 100 per cent at its optimum temperature, then it might only manage 50 or 60 per cent, for example, if the temperature is higher than what's ideal."
Mr Sentinella said he was surprised to find that climate change would threaten so many tropical species.
"But our most unexpected discovery was that the hypothesis often used for animals -- that those near the equator would struggle to survive the impact of climate change because they have narrower temperature tolerances -- was not true for plants," he said.
"We found that regardless of latitude, plant species can germinate at roughly the same breadth of temperatures, which does not align with the animal studies."
The researchers also found 95 per cent of plant species at latitudes above 45 degrees are predicted to benefit from warming, because environmental temperatures are expected to shift closer to the species' optimal germination temperatures.
Findings to help target conservation efforts
Mr Sentinella said it was possible for some plants to slowly evolve to increasing temperatures, but it was difficult to predict which ones would survive.
"The problem with the quick change in temperatures forecasted, is that some species won't be able to adapt fast enough," he said.
"Sometimes plants can migrate by starting to grow further away from the equator or, up a mountain slope where it's cooler. But if a species can't do that it will become extinct.
"There are almost 400,000 plant species worldwide -- so, we would expect a number of them to fail to germinate between now and 2070."
Mr Sentinella hopes the researchers' findings will help to conserve plant species under threat from climate change.
"Ideally, we would be able to conserve all ecosystems, but the funding is simply not there. So, our findings could help conservation efforts target resources towards areas which are more vulnerable," he said.
"We also hope our findings further strengthen the global body of research about the risks of climate change.
Read more at Science Daily
Beacon from the early universe
Often described as cosmic lighthouses, quasars are luminous beacons that can be observed at the outskirts of the Universe, providing a rich topic of study for astronomers and cosmologists. Now scientists have announced the discovery of the second-most distant quasar ever found, more than 13 billion light-years from Earth.
UC Santa Barbara's Joe Hennawi, a professor in the Department of Physics, and former UCSB postdoctoral scholars Frederick Davies and Feige Wang provided crucial modeling and data analysis tools that enabled this discovery. The results are currently available as a preprint on arXiv and will appear in The Astrophysical Journal Letters.
The researchers have named the object Pōniuā'ena, which means "unseen spinning source of creation, surrounded with brilliance" in the Hawaiian language. It is the first quasar to receive an indigenous Hawaiian name.
Quasars are incredibly bright sources of radiation that lie at the centers of distant massive galaxies. Matter spiraling onto a supermassive black hole generates tremendous amounts of heat, making it glow at ultraviolet and optical wavelengths. "They are the most luminous objects in the Universe," Hennawi said, "outshining their host galaxies by factors of more than a hundred."
Since the discovery of the first quasar, astronomers have been keen to determine when they first appeared in our cosmic history. By systematically searching for these rare objects in wide-area sky surveys, astronomers discovered the most distant quasar (named J1342+0928) in 2018 and now the second-most distant, Pōniuā'ena (or J1007+2115).
The team first detected Pōniuā'ena as a possible quasar after combing through large area surveys. In 2019, the researchers observed the object using the W. M. Keck Observatory and Gemini Observatory on Mauna Kea, in Hawaii, confirming its existence and identity.
Pōniuā'ena is only the second quasar yet detected at a cosmological redshift greater than 7.5, corresponding to a distance of more than 13 billion light-years from Earth. It hosts a black hole twice as massive as that of the other known quasar from the same era. The existence of such massive black holes at such early times challenges current theories of how supermassive black holes formed and grew in the young universe.
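The quoted distance can be sanity-checked by converting the quasar's redshift into a lookback time under a standard cosmology. Here is a minimal sketch using astropy; the z = 7.52 value is the redshift reported for Pōniuā'ena (the article states only that it exceeds 7.5).

```python
# Convert redshift to lookback time with the Planck 2018 cosmology.
# Requires astropy; z = 7.52 is assumed from the published measurement.
from astropy.cosmology import Planck18

z = 7.52
print(f"lookback time: {Planck18.lookback_time(z):.2f}")   # ~13.1 Gyr
print(f"cosmic age at emission: {Planck18.age(z):.2f}")    # ~0.7 Gyr
```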
A cosmological puzzle
Spectroscopic observations from Gemini and Keck show the black hole powering Pōniuā'ena is 1.5 billion times more massive than our Sun. "Pōniuā'ena is the most distant object known in the universe hosting a black hole exceeding one billion solar masses," said lead author Jinyi Yang, a postdoctoral research associate at the University of Arizona.
Black holes grow by accreting matter. In the standard picture, supermassive black holes grow from a much smaller "seed" black hole, which could have been the remnant of a massive star that died. "So it is puzzling how such a massive black hole can exist so early in the universe's history because there does not appear to be enough time for them to grow given our current understanding," Davies explained.
For a black hole of this size to form this early in the universe, it would need to start as a 10,000-solar-mass seed black hole only 100 million years after the Big Bang -- as opposed to growing from a much smaller black hole formed by the collapse of a single star.
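That seed mass follows from the standard Eddington-limited growth model, in which a black hole's mass grows exponentially with an e-folding ("Salpeter") time of roughly 50 million years at 10 percent radiative efficiency. The back-of-envelope sketch below -- not the paper's calculation -- assumes the quasar is seen about 700 million years after the Big Bang and recovers a seed of order 10,000 solar masses.

```python
# Eddington-limited growth: M(t) = M_seed * exp((t - t_seed) / t_Salpeter).
# The 700 Myr cosmic age and 50 Myr Salpeter time are assumptions
# consistent with the article's figures, not values taken from the paper.
import math

T_SALPETER_MYR = 50.0    # e-folding time at ~10% radiative efficiency
t_seed_myr = 100.0       # seed forms 100 Myr after the Big Bang
t_quasar_myr = 700.0     # approximate cosmic age at z ~ 7.5
m_final = 1.5e9          # measured mass, in solar masses

e_folds = (t_quasar_myr - t_seed_myr) / T_SALPETER_MYR
m_seed = m_final / math.exp(e_folds)
print(f"{e_folds:.0f} e-folds -> required seed: {m_seed:,.0f} solar masses")
```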
"How can the universe produce such a massive black hole so early in its history?" said Xiaohui Fan, at the University of Arizona. "This discovery presents the biggest challenge yet for the theory of black hole formation and growth in the early universe." The discovery of a more exotic mechanism to form the seed black hole may be required to explain the mere existence of Pōniuā'ena.
The Epoch of Reionization
Current theory holds that the birth of stars and galaxies as we know them started during the Epoch of Reionization. Beginning about 400 million years after the Big Bang, the diffuse matter in between galaxies went from being neutral hydrogen to ionized hydrogen. The growth of the first giant black holes is thought to have occurred during this time.
The discovery of quasars like Pōniuā'ena, deep in the reionization epoch, is a big step towards understanding this process of reionization and the formation of early supermassive black holes and massive galaxies. Pōniuā'ena has placed new and important constraints on the evolution of the intergalactic medium in the reionization epoch.
"Pōniuā'ena acts like a cosmic lighthouse. As its light travels the long journey towards Earth, its spectrum is altered by diffuse gas in the intergalactic medium which allowed us to pinpoint when the Epoch of Reionization occurred," said Hennawi. The modeling and data analysis method used to infer information about the Epoch of Reionization from these distant quasar spectra was developed in his research group at UC Santa Barbara with Davies and Wang.
"Through University of California Observatories, we have privileged access to the Keck telescopes on the summit of Mauna Kea, which allowed us to obtain high quality data on this object shortly after it was discovered using the Gemini telescope," Hennawi said.
Finding these distant quasars is a needle-in-a-haystack problem. Astronomers must mine digital images of billions of celestial objects in order to find quasar candidates. "Even after you identify the candidates, the current success rate of finding them is about 1%, and this involves spending lots of expensive telescope time observing contaminants," Wang explained.
Read more at Science Daily
Stellar fireworks celebrate birth of giant cluster
Astronomers created a stunning new image showing celestial fireworks in star cluster G286.21+0.17.
Most stars in the universe, including our Sun, were born in massive star clusters. These clusters are the building blocks of galaxies, but their formation from dense molecular clouds is still largely a mystery.
The image of cluster G286.21+0.17, caught in the act of formation, is a multi-wavelength mosaic made out of more than 750 individual radio observations with the Atacama Large Millimeter/submillimeter Array (ALMA) and 9 infrared images from the NASA/ESA Hubble Space Telescope. The cluster is located in the Carina region of our galaxy, about 8000 light-years away.
Dense clouds made of molecular gas (purple 'fireworks streamers') are revealed by ALMA. The telescope observed the motions of turbulent gas falling into the cluster, forming dense cores that ultimately create individual stars.
The stars in the image are revealed by their infrared light, as seen by Hubble, including a large group of stars bursting out from one side of the cloud. The powerful winds and radiation from the most massive of these stars are blasting away the molecular clouds, leaving faint wisps of glowing, hot dust (shown in yellow and red).
"This image shows stars in various stages of formation within this single cluster," said Yu Cheng of the University of Virginia in Charlottesville, Virginia, and lead author of two papers published in The Astrophysical Journal.
Hubble revealed about a thousand newly-formed stars with a wide range of masses. Additionally, ALMA showed that there is a lot more mass present in dense gas that still has to undergo collapse. "Overall the process may take at least a million years to complete," Cheng added.
"This illustrates how dynamic and chaotic the process of star birth is," said co-author Jonathan Tan of Chalmers University in Sweden and the University of Virginia and principal investigator of the project. "We see competing forces in action: gravity and turbulence from the cloud on one side, and stellar winds and radiation pressure from the young stars on the other. This process sculpts the region. It is amazing to think that our own Sun and planets were once part of such a cosmic dance."
Read more at Science Daily
Jul 1, 2020
Light drinking may protect brain function
Light to moderate drinking may preserve brain function in older age, according to a new study from the University of Georgia.
The study examined the link between alcohol consumption and changes in cognitive function over time among middle-aged and older adults in the U.S.
"We know there are some older people who believe that drinking a little wine everyday could maintain a good cognitive condition," said lead author Ruiyuan Zhang, a doctoral student at UGA's College of Public Health.
"We wanted to know if drinking a small amount of alcohol actually correlates with a good cognitive function, or is it just a kind of survivor bias."
Regular, moderate alcohol consumption has been shown to promote heart health and some research points to a similar protective benefit for brain health. However, many of these studies were not designed to isolate the effects of alcohol on cognition or did not measure effects over time.
Zhang and his team developed a way to track cognition performance over 10 years using participant data from the nationally representative Health and Retirement Study.
During the study, a total of 19,887 participants completed surveys every two years about their health and lifestyle, including questions on drinking habits. Light to moderate drinking is defined as fewer than eight drinks per week for women and 15 drinks or fewer per week for men.
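Written out as a rule, those thresholds look like the hypothetical helper below; only the cut-offs themselves come from the article.

```python
# Classify a respondent's drinking level using the study's stated
# thresholds. The function and its name are illustrative, not from the
# paper; note the asymmetric wording: "fewer than eight" for women,
# "15 drinks or fewer" for men.
def drinking_category(drinks_per_week: float, sex: str) -> str:
    if drinks_per_week == 0:
        return "nondrinker"
    if sex == "female":
        return "light to moderate" if drinks_per_week < 8 else "heavier"
    return "light to moderate" if drinks_per_week <= 15 else "heavier"

print(drinking_category(10, "male"))    # light to moderate
print(drinking_category(10, "female"))  # heavier
```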
These participants also had their cognitive function measured in a series of tests looking at their overall mental status, word recall and vocabulary. Their test results were combined to form a total cognitive score.
Zhang and his colleagues looked at how participants performed on these cognitive tests over the course of the study and categorized their performance as high or low trajectories, meaning their cognitive function remained high over time or began to decline.
Compared to nondrinkers, they found that those who had a drink or two a day tended to perform better on cognitive tests over time.
Even when other important factors known to impact cognition such as age, smoking or education level were controlled for, they saw a pattern of light drinking associated with high cognitive trajectories.
The optimal number of drinks per week was between 10 and 14. But that doesn't mean those who drink less should start indulging more, says Zhang.
"It is hard to say this effect is causal," he said. "So, if some people don't drink alcoholic beverages, this study does not encourage them to drink to prevent cognitive function decline."
Read more at Science Daily
Study: 35 percent of excess deaths in pandemic's early months tied to causes other than COVID-19
Since COVID-19's spread to the United States earlier this year, death rates in the U.S. have risen significantly. But deaths attributed to COVID-19 only account for about two-thirds of the increase in March and April, according to a study published Wednesday in the Journal of the American Medical Association.
Researchers at Virginia Commonwealth University and Yale University found that, from March 1 to April 25, the U.S. saw 87,001 excess deaths -- or deaths above the number that would be expected based on averages from the previous five years. The study, "Excess Deaths from COVID-19 and Other Causes, March-April 2020," showed that only 65% of the excess deaths that occurred in March and April were attributed to COVID-19, meaning more than one-third were linked to other causes.
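The headline arithmetic is simple enough to restate directly; the figures below come from the article, while the underlying expected-deaths baseline (the five-year averages) is not reproduced here.

```python
# Split the reported excess deaths into COVID-19-attributed deaths and
# the remainder. Both inputs are quoted in the article.
excess_deaths = 87_001   # observed minus expected, March 1 - April 25
covid_share = 0.65       # fraction attributed to COVID-19

covid_deaths = covid_share * excess_deaths
other_deaths = excess_deaths - covid_deaths
print(f"attributed to COVID-19: {covid_deaths:,.0f}")   # ~56,551
print(f"linked to other causes: {other_deaths:,.0f}")   # ~30,450 (35%)
```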
In 14 states, including two of the most populated -- California and Texas -- more than half of the excess deaths were tied to an underlying cause other than COVID-19, said lead author Steven Woolf, M.D., director emeritus of VCU's Center on Society and Health.
This data, Woolf said, suggests the COVID-19 death counts reported to the public underestimate the true death toll of the pandemic in the U.S.
"There are several potential reasons for this under-count," said Woolf, a professor in the Department of Family Medicine and Population Health at VCU School of Medicine. "Some of it may reflect under-reporting; it takes awhile for some of these data to come in. Some cases might involve patients with COVID-19 who died from related complications, such as heart disease, and those complications may have been listed as the cause of death rather than COVID-19.
"But a third possibility, the one we're quite concerned about, is indirect mortality -- deaths caused by the response to the pandemic," Woolf said. "People who never had the virus may have died from other causes because of the spillover effects of the pandemic, such as delayed medical care, economic hardship or emotional distress."
Woolf and his team found that deaths from causes other than COVID-19 rose sharply in the states that had the most COVID-19 deaths in March and April. Those states were Massachusetts, Michigan, New Jersey, New York -- particularly New York City -- and Pennsylvania. At COVID-19's peak for March and April (the week ending April 11), diabetes deaths in those five states rose 96% above the expected number of deaths when compared to the weekly averages in January and February of 2020. Deaths from heart disease (89%), Alzheimer's disease (64%) and stroke (35%) in those states also spiked.
New York City's death rates alone rose a staggering 398% from heart disease and 356% from diabetes, the study stated.
Woolf said he and his team suspect that some of these were indirect deaths from the pandemic that occurred among people with acute emergencies, such as a heart attack or stroke, who may have been afraid to go to a hospital for fear of getting the virus. Those who did seek emergency care, particularly in the areas hardest hit by the virus, may not have been able to get the treatment they needed, such as ventilator support, if the hospital was overwhelmed by the surge.
Others may have died from a chronic health condition, such as diabetes or cancer, that was exacerbated by the effects of the pandemic, said Woolf, VCU's C. Kenneth and Dianne Wright Distinguished Chair in Population Health and Health Equity. Still others may have struggled to deal with the consequences of job loss or social isolation.
"We can't forget about mental health," Woolf said. "A number of people struggling with depression, addiction and very difficult economic conditions caused by lockdowns may have become increasingly desperate, and some may have died by suicide. People addicted to opioids and other drugs may have overdosed. All told, what we're seeing is a death count well beyond what we would normally expect for this time of year, and it's only partially explained by COVID-19."
Woolf and his co-authors, Derek Chapman, Ph.D., Roy Sabo, Ph.D., and Latoya Hill of VCU, and Daniel M. Weinberger, Ph.D., of Yale University, state that further investigation is needed to determine just how many deaths were from COVID-19 and how many were indirect deaths "caused by disruptions in society that diminished or delayed access to health care and the social determinants of health (e.g., jobs, income, food security)."
Woolf, also a family physician, said this paper's results underscore the need for health systems and public officials to make sure services are available not only for COVID-19 but for other health problems. His study showed what happened in the states that were overwhelmed by cases in March and April. Woolf worries that the same spikes in excess deaths may now be occurring in other states that are being overwhelmed.
"The findings from our VCU researchers' study confirm an alarming trend across the U.S., where community members experiencing a health emergency are staying home -- a decision that can have long-term, and sometimes fatal, consequences," said Peter Buckley, M.D., interim CEO of VCU Health System and interim senior vice president of VCU Health Sciences. "Health systems nationwide need to let patients know it is safe and important to seek care in a health emergency, whether it's through telehealth or in person."
Woolf, who serves in a community engagement role with the C. Kenneth and Dianne Wright Center for Clinical and Translational Research, said resources should be available for those facing unemployment, loss of income and food and housing insecurity, including help with the mental health challenges, such as depression, anxiety or addiction, that these hardships could present.
Read more at Science Daily
Beavers gnawing away at the permafrost
When it comes to completely transforming a landscape, beavers are hard to beat. Very few other animals are capable of changing their habitat as precisely as these brown-furred rodents, which can weigh up to 30 kilograms. Armed with sharp teeth, they fell trees and shrubs and build dams, causing small valleys to fill with water and forming new lakes, which can easily measure a few hectares. "Their methods are extremely effective," says Dr Ingmar Nitze from the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research (AWI) in Potsdam, Germany. They often build their dams at precisely those points where they can achieve major effects with minimal effort.
This is something that Ingmar Nitze has repeatedly seen in the Arctic regions of Alaska, where the North American beaver is active. The researcher is an expert on remote sensing, and is especially interested in those parts of the Earth where the soil is permanently frozen. Climate researchers fear that, as temperatures rise, this permafrost could increasingly thaw and become unstable. If that happens, it could release massive quantities of greenhouse gases, which would intensify climate change.
Accordingly, Nitze and his colleagues are monitoring the development of these landscapes with the aid of satellite images. One especially interesting aspect is how the lakes and other bodies of water are distributed. Because the water they contain is somewhat warmer than the surrounding soil, these lakes and ponds can further accelerate permafrost thawing. And beavers would seem to be actively contributing to the process.
Back in 2018, Ingmar Nitze and Guido Grosse from the AWI, together with colleagues from the USA, determined that the beavers living in an 18,000-square-kilometre section of northwest Alaska had created 56 new lakes in just five years. For their new study, the team from the AWI, the University of Alaska in Fairbanks, and the University of Minnesota in Minneapolis have now taken a closer look at this trend. Using detailed satellite data and extended time series, the experts tracked the beavers' activities in two other regions in Alaska -- and were surprised by what they found.
"Of course, we knew that the beavers there had spread substantially over the last few decades," says Nitze. This is partly due to climate change; thanks to rising temperatures, now more and more habitats offer the shrubs that the animals need for food and building material. Furthermore, the lakes, which used to freeze solid, now offer beaver-friendlier conditions, thanks to their thinner seasonal winter ice cover. Lastly, the rodents aren't hunted as intensively as in the past. As a result, it's a good time to be a beaver in the Arctic.
"But we never would have dreamed they would seize the opportunity so intensively," says Nitze. The high-resolution satellite images of the roughly 100-square-kilometre study area near the town of Kotzebue reveal the scale of the animals' activities there. From just two dams in 2002, the number had risen to 98 by 2019 -- a 5,000-percent increase, with more than 5 new dams being constructed per year. And the larger area surveyed, which covers the entire northern Baldwin Peninsula, also experienced a beaver dam boom. According to Nitze, "We're seeing exponential growth there. The number of these structures doubles roughly every four years."
This has already affected the water balance. Apparently, the rodents intentionally do their work in those parts of the landscape that they can most easily flood. To do so, sometimes they dam up small streams, and sometimes the outlets of existing lakes, which expand as a result. "But they especially prefer drained lake basins," report Benjamin Jones, the study's lead author, and Nitze. In many cases, the bottoms of these former lakes are prime locations for beaver activity. "The animals have intuitively found that damming the outlet drainage channels at the sites of former lakes is an efficient way to create habitat. So a new lake is formed which degrades ice-rich permafrost in the basin, adding to the effect of increasing the depth of the engineered waterbody," added Jones. These actions have consequences: in the course of the 17-year timeframe studied, the overall water area in the Kotzebue region grew by 8.3 percent. And roughly two-thirds of that growth was due to the beavers.
Read more at Science Daily
To find giant black holes, start with Jupiter
Illustration of black hole, warped spacetime.
In the search for previously undetected black holes that are billions of times more massive than the sun, Stephen Taylor, assistant professor of physics and astronomy and former astronomer at NASA's Jet Propulsion Laboratory (JPL), together with the North American Nanohertz Observatory for Gravitational Waves (NANOGrav) collaboration, has moved the field of research forward by finding the precise location -- the center of gravity of our solar system -- with which to measure the gravitational waves that signal the existence of these black holes.
The advance, described in a paper co-authored by Taylor, was published in The Astrophysical Journal in April 2020.
Black holes are regions of pure gravity formed from extremely warped spacetime. Finding the most titanic black holes in the Universe that lurk at the heart of galaxies will help us understand how such galaxies (including our own) have grown and evolved over the billions of years since their formation. These black holes are also unrivaled laboratories for testing fundamental assumptions about physics.
Gravitational waves are ripples in spacetime predicted by Einstein's general theory of relativity. When black holes orbit each other in pairs, they radiate gravitational waves that deform spacetime, stretching and squeezing space. Gravitational waves were first detected by the Laser Interferometer Gravitational-Wave Observatory (LIGO) in 2015, opening new vistas on the most extreme objects in the universe. Whereas LIGO observes relatively short gravitational waves by looking for changes in the shape of a 4-km long detector, NANOGrav, a National Science Foundation (NSF) Physics Frontiers Center, looks for changes in the shape of our entire galaxy.
Taylor and his team are searching for changes to the arrival rate of regular flashes of radio waves from pulsars. These pulsars are rapidly spinning neutron stars, some going as fast as a kitchen blender. They also send out beams of radio waves, appearing like interstellar lighthouses when these beams sweep over Earth. Over 15 years of data have shown that these pulsars are extremely reliable in their pulse arrival rates, acting as outstanding galactic clocks. Any timing deviations that are correlated across lots of these pulsars could signal the influence of gravitational waves warping our galaxy.
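The tell-tale "correlation across lots of pulsars" has a specific expected shape: for an isotropic gravitational-wave background, the correlation between two pulsars' timing deviations depends only on their angular separation on the sky, the so-called Hellings-Downs curve. The sketch below evaluates the standard textbook formula; it is an illustration of the idea, not code from the NANOGrav analysis pipeline.

```python
import math

def hellings_downs(zeta_rad: float) -> float:
    """Expected correlation of timing residuals between two pulsars
    separated by angle zeta, for an isotropic gravitational-wave
    background (standard normalization: 0.5 for nearly coincident
    sightlines)."""
    x = (1.0 - math.cos(zeta_rad)) / 2.0
    if x == 0.0:
        return 0.5
    return 1.5 * x * math.log(x) - x / 4.0 + 0.5

for deg in (0, 30, 60, 90, 120, 180):
    chi = hellings_downs(math.radians(deg))
    print(f"{deg:3d} deg separation -> expected correlation {chi:+.3f}")
# The dip to negative correlation near 90 degrees and partial recovery at
# 180 degrees is the fingerprint that distinguishes a gravitational-wave
# background from, say, clock errors, which would correlate all pulsars
# equally.
```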
"Using the pulsars we observe across the Milky Way galaxy, we are trying to be like a spider sitting in stillness in the middle of her web," explains Taylor. "How well we understand the solar system barycenter is critical as we attempt to sense even the smallest tingle to the web." The solar system barycenter, its center of gravity, is the location where the masses of all planets, moons, and asteroids balance out.
Where is the center of our web, the location of absolute stillness in our solar system? Not at the center of the sun, as many might assume; rather, it is closer to the surface of the star. This is due to Jupiter's mass and our imperfect knowledge of its orbit. It takes 12 years for Jupiter to orbit the sun, just shy of the 15 years that NANOGrav has been collecting data. JPL's Galileo probe (named for the famed scientist who used a telescope to observe the moons of Jupiter) studied Jupiter between 1995 and 2003, but experienced technical maladies that impacted the quality of the measurements taken during the mission.
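That the barycenter sits near the sun's surface follows from a simple two-body estimate dominated by Jupiter. The sketch below uses round textbook values (our numbers, not figures from the paper):

```python
# Two-body estimate of how far Jupiter pulls the sun-Jupiter barycenter
# away from the sun's center. Round textbook values, not study figures.
M_SUN_OVER_M_JUP = 1047.0     # solar-to-Jupiter mass ratio
A_JUPITER_KM = 7.78e8         # Jupiter's mean distance from the sun (km)
R_SUN_KM = 6.96e5             # solar radius (km)

# Barycenter offset r = a * m_jup / (m_sun + m_jup)
offset_km = A_JUPITER_KM / (M_SUN_OVER_M_JUP + 1.0)
print(f"Barycenter offset from sun's center: {offset_km:.3g} km")
print(f"Offset in solar radii: {offset_km / R_SUN_KM:.2f}")
# ~1.07 solar radii: essentially at the sun's surface, which is why
# imperfect knowledge of Jupiter's orbit blurs the barycenter's position.
```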
The center of the solar system's gravity has long been calculated using data from Doppler tracking to estimate the locations and trajectories of bodies orbiting the sun. "The catch is that errors in the masses and orbits will translate to pulsar-timing artifacts that may well look like gravitational waves," explains JPL astronomer and co-author Joe Simon.
Taylor and his collaborators were finding that working with existing solar system models to analyze NANOGrav data gave inconsistent results. "We weren't detecting anything significant in our gravitational wave searches between solar system models, but we were getting large systematic differences in our calculations," notes JPL astronomer and the paper's lead author Michele Vallisneri. "Typically, more data delivers a more precise result, but there was always an offset in our calculations."
The group decided to search for the center of gravity of the solar system at the same time as sleuthing for gravitational waves. The researchers got more robust answers in their gravitational wave searches and were able to more accurately localize the center of the solar system's gravity to within 100 meters. To understand that scale, if the sun were the size of a football field, 100 meters would be the diameter of a strand of hair. "Our precise observation of pulsars scattered across the galaxy has localized ourselves in the cosmos better than we ever could before," said Taylor. "By finding gravitational waves this way, in addition to other experiments, we gain a more holistic overview of all different kinds of black holes in the Universe."
Read more at Science Daily
Jun 30, 2020
Extreme warming of the South Pole
Illustration of Earth centered on Antarctica.
Ryan Fogt, professor of meteorology and director of the Scalia Laboratory for Atmospheric Analysis, and Clem co-authored a paper on the findings with an international team of scientists, published in the journal Nature Climate Change. According to the study, this warming period was mainly driven by natural tropical climate variability and was likely intensified by increases in greenhouse gases.
Clem, now a postdoctoral research fellow in climate science at Victoria University of Wellington in New Zealand, is the lead author of the study; he studied under Fogt for both his bachelor's and master's degrees at Ohio University.
"I've had a passion for understanding the weather and fascination of its power and unpredictability as far back as I can remember," Clem said. "Working with Ryan I learned all about Antarctic and Southern Hemisphere climate, specifically how West Antarctica was warming and its ice sheet was thinning and contributing to global sea level rise. I also learned that Antarctica experiences some of the most extreme weather and variability on the planet, and due to its remote location we actually know very little about the continent, so there are constant surprises and new things to learn about Antarctica every year."
The Antarctic climate exhibits some of the largest ranges in temperature during the course of the year, and some of the largest temperature trends on the planet, with strong regional contrasts. Most of West Antarctica and the Antarctic Peninsula experienced warming and ice-sheet thinning during the late 20th century. By contrast, the South Pole -- located in the remote and high-altitude continental interior -- cooled until the 1980s and has since warmed substantially. These trends are affected by natural and anthropogenic climate change, but the individual contribution of each factor is not well understood.
Clem and his team analyzed weather station data from the South Pole, as well as climate models, to examine the warming of the Antarctic interior. They found that between 1989 and 2018 the South Pole warmed by about 1.8 degrees Celsius, at a rate of +0.6 degrees Celsius per decade -- three times the global average.
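The numbers are internally consistent, as a quick check shows (our arithmetic; the implied global rate follows from the article's "three times" comparison, not from the study itself):

```python
rate_per_decade = 0.6   # deg C per decade at the South Pole
decades = 3             # 1989-2018, a 30-year record

print(f"Total warming: {rate_per_decade * decades:.1f} deg C")          # 1.8
print(f"Implied global rate: {rate_per_decade / 3:.1f} deg C/decade")   # 0.2
```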
The study also found that the strong warming over the Antarctic interior in the last 30 years was mainly driven by the tropics, especially warm ocean temperatures in the western tropical Pacific Ocean that changed the winds in the South Atlantic near Antarctica and increased the delivery of warm air to the South Pole. They suggest these atmospheric changes along Antarctica's coast are an important mechanism driving climate anomalies in its interior.
Clem and Fogt argue that these warming trends were unlikely to be the result of natural climate change alone: the effects of added anthropogenic warming, on top of the large tropical climate signal, worked in tandem to make this one of the strongest warming trends worldwide.
Read more at Science Daily
A cosmic mystery: ESO telescope captures the disappearance of a massive star
Very Large Telescope complex.
Between 2001 and 2011, various teams of astronomers studied the mysterious massive star, located in the Kinman Dwarf galaxy, and their observations indicated it was in a late stage of its evolution. Allan, of Trinity College Dublin, and his collaborators in Ireland, Chile and the US wanted to find out more about how very massive stars end their lives, and the object in the Kinman Dwarf seemed like the perfect target. But when they pointed ESO's VLT to the distant galaxy in 2019, they could no longer find the telltale signatures of the star. "Instead, we were surprised to find out that the star had disappeared!" says Allan, who led a study of the star published today in Monthly Notices of the Royal Astronomical Society.
Located some 75 million light-years away in the constellation of Aquarius, the Kinman Dwarf galaxy is too far away for astronomers to see its individual stars, but they can detect the signatures of some of them. From 2001 to 2011, the light from the galaxy consistently showed evidence that it hosted a 'luminous blue variable' star some 2.5 million times brighter than the Sun. Stars of this type are unstable, showing occasional dramatic shifts in their spectra and brightness. Even with those shifts, luminous blue variables leave specific traces scientists can identify, but they were absent from the data the team collected in 2019, leaving them to wonder what had happened to the star. "It would be highly unusual for such a massive star to disappear without producing a bright supernova explosion," says Allan.
The group first turned the ESPRESSO instrument toward the star in August 2019, using the VLT's four 8-metre telescopes simultaneously. But they were unable to find the signs that previously pointed to the presence of the luminous star. A few months later, the group tried the X-shooter instrument, also on ESO's VLT, and again found no traces of the star.
"We may have detected one of the most massive stars of the local Universe going gently into the night," says team-member Jose Groh, also of Trinity College Dublin. "Our discovery would not have been made without using the powerful ESO 8-metre telescopes, their unique instrumentation, and the prompt access to those capabilities following the recent agreement of Ireland to join ESO." Ireland became an ESO member state in September 2018.
The team then turned to older data collected using X-shooter and the UVES instrument on ESO's VLT, located in the Chilean Atacama Desert, and telescopes elsewhere. "The ESO Science Archive Facility enabled us to find and use data of the same object obtained in 2002 and 2009," says Andrea Mehner, a staff astronomer at ESO in Chile who participated in the study. "The comparison of the 2002 high-resolution UVES spectra with our observations obtained in 2019 with ESO's newest high-resolution spectrograph ESPRESSO was especially revealing, from both an astronomical and an instrumentation point of view."
The old data indicated that the star in the Kinman Dwarf could have been undergoing a strong outburst period that likely ended sometime after 2011. Luminous blue variable stars such as this one are prone to experiencing giant outbursts over the course of their life, causing the stars' rate of mass loss to spike and their luminosity to increase dramatically.
Based on their observations and models, the astronomers have suggested two explanations for the star's disappearance and lack of a supernova, related to this possible outburst. The outburst may have resulted in the luminous blue variable being transformed into a less luminous star, which could also be partly hidden by dust. Alternatively, the team says the star may have collapsed into a black hole, without producing a supernova explosion. This would be a rare event: our current understanding of how massive stars die points to most of them ending their lives in a supernova.
Read more at Science Daily
Major new paleoclimatology study shows global warming has upended 6,500 years of cooling
Glacier collapse.
Four researchers of Northern Arizona University's School of Earth and Sustainability (SES) led the study, with Regents' professor Darrell Kaufman as lead author and associate professor Nicholas McKay as co-author, along with assistant research professors Cody Routson and Michael Erb. The team worked in collaboration with scientists from research institutions all over the world to reconstruct the global average temperature over the Holocene Epoch -- the period following the Ice Age and beginning about 12,000 years ago.
"Before global warming, there was global cooling," said Kaufman. "Previous work has shown convincingly that the world naturally and slowly cooled for at least 1,000 years prior to the middle of the 19th century, when the global average temperature reversed course along with the build-up of greenhouse gases. This study, based on a major new compilation of previously published paleoclimate data, combined with new statistical analyses, shows more confidently than ever that the millennial-scale global cooling began approximately 6,500 years ago."
Earlier this year, an international group of 93 paleoclimate scientists from 23 countries -- also led by Kaufman, McKay, Routson and Erb -- published the most comprehensive set of paleoclimate data ever compiled for the past 12,000 years, comprising 1,319 data records based on samples taken from 679 sites globally. At each site, researchers analyzed ecological, geochemical and biophysical evidence from both marine and terrestrial archives, such as lake deposits, marine sediments, peat and glacier ice, to infer past temperature changes. Countless scientists working around the world over many decades conducted the basic research contributing to the global database.
"The rate of cooling that followed the peak warmth was subtle, only around 0.1°C per 1,000 years. This cooling seems to be driven by slow cycles in the Earth's orbit, which reduced the amount of summer sunlight in the Northern Hemisphere, culminating in the 'Little Ice Age' of recent centuries," said Erb, who analyzed the temperature reconstructions.
Since the mid-19th century, the global average temperature has climbed about 1°C, suggesting that the average temperature of the last decade (2010-2019) was warmer than at any time during the present post-glacial period.
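Comparing the two rates puts the reversal in perspective; the sketch below takes the mid-19th century as roughly 1850 (our assumption for the arithmetic):

```python
# Holocene cooling vs. modern warming, in comparable units.
cooling_per_century = 0.1 / 10                      # ~0.1 deg C per 1,000 years
warming_per_century = 1.0 / ((2020 - 1850) / 100)   # ~1 deg C over ~170 years

print(f"Holocene cooling: {cooling_per_century:.2f} deg C per century")
print(f"Modern warming:   {warming_per_century:.2f} deg C per century")
print(f"Warming is roughly {warming_per_century / cooling_per_century:.0f}x faster")
```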
McKay, who developed some of the statistical approaches to synthesizing data from around the world, notes that individual decades are not resolved in the 12,000-year-long temperature reconstruction, making it difficult to compare it with any recent decade. "On the other hand, this past decade was likely cooler than what the average temperatures will be for the rest of this century and beyond, which are very likely to continue to exceed 1°C above pre-industrial temperatures," McKay said.
"It's possible," Kaufman said, "that the last time the sustained average global temperature was 1°C above the 19th century was prior to the last Ice Age, back around 125,000 years ago when sea level was around 20 feet higher than today."
"Investigating the patterns of natural temperature changes over space and time helps us understand and quantify the processes that cause climate to change, which is important as we prepare for the full range of future climate changes due to both human and natural causes," said Routson. He used an earlier version of the database to link Arctic warming to a reduction in precipitation at mid latitudes (see related article).
"Our future climate will largely depend on the influence of human factors, especially the build-up of greenhouse gases. However, future climate will also be influenced by natural factors, and it will be complicated by the natural variability within the climate system. Future projections of climate change will be improved by better accounting for both anthropogenic and natural factors," he said.
Read more at Science Daily
Asteroid impact, not volcanoes, made the Earth uninhabitable for dinosaurs
Illustration of dinosaurs and asteroid.
The asteroid, which struck the Earth off the coast of Mexico at the end of the Cretaceous period 66 million years ago, has long been believed to be the cause of the demise of all dinosaur species except those that became birds.
However, some researchers have suggested that tens of thousands of years of large volcanic eruptions may have been the actual cause of the extinction event, which also killed off almost 75% of life on Earth.
Now, a research team from Imperial College London, the University of Bristol and University College London has shown that only the asteroid impact could have created conditions that were unfavourable for dinosaurs across the globe.
They also show that massive volcanism could have helped life recover from the asteroid strike in the long term. Their results are published today in Proceedings of the National Academy of Sciences.
Lead researcher Dr Alessandro Chiarenza, who conducted this work whilst studying for his PhD in the Department of Earth Science and Engineering at Imperial, said: "We show that the asteroid caused an impact winter for decades, and that these environmental effects decimated suitable environments for dinosaurs. In contrast, the effects of the intense volcanic eruptions were not strong enough to substantially disrupt global ecosystems.
"Our study confirms, for the first time quantitatively, that the only plausible explanation for the extinction is the impact winter that eradicated dinosaur habitats worldwide."
The asteroid strike would have released particles and gases high into the atmosphere, blocking out the Sun for years and causing prolonged winters. Volcanic eruptions also produce particles and gases with Sun-blocking effects, and around the time of the mass extinction there were tens of thousands of years of eruptions at the Deccan Traps, in present-day India.
To determine which factor, the asteroid or the volcanism, had more climate-changing power, researchers have traditionally used geological markers of climate and powerful mathematical models. In the new paper, the team combined these methods with information about what kinds of environmental factors, such as rainfall and temperature, each species of dinosaur needed to thrive.
They were then able to map where these conditions would still exist in a world after either an asteroid strike or massive volcanism. They found that only the asteroid strike wiped out all potential dinosaur habitats, while volcanism left some viable regions around the equator.
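The mapping step is essentially a climate-envelope (habitat suitability) model: take the range of temperature and rainfall a taxon tolerates, then test each grid cell of a simulated post-impact or post-eruption climate against it. Below is a toy version of that idea with invented tolerance limits and synthetic climate grids, not the study's actual model or data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic climate grids (deg C, mm/yr) for the two scenarios.
temp_impact = rng.normal(-5, 8, size=(40, 80))      # impact winter: cold
rain_impact = rng.normal(600, 300, size=(40, 80))
temp_volcanic = rng.normal(12, 10, size=(40, 80))   # volcanism: milder
rain_volcanic = rng.normal(900, 400, size=(40, 80))

# Hypothetical climate envelope for a dinosaur taxon.
T_MIN, T_MAX, R_MIN = 5.0, 35.0, 400.0

def habitable_fraction(temp, rain):
    """Fraction of grid cells falling inside the taxon's envelope."""
    ok = (temp >= T_MIN) & (temp <= T_MAX) & (rain >= R_MIN)
    return ok.mean()

print(f"Impact scenario:   {habitable_fraction(temp_impact, rain_impact):.1%}")
print(f"Volcanic scenario: {habitable_fraction(temp_volcanic, rain_volcanic):.1%}")
```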
Co-lead author of the study Dr Alex Farnsworth, from the University of Bristol, said: "Instead of only using the geologic record to model the effect on climate that the asteroid or volcanism might have caused worldwide, we pushed this approach a step forward, adding an ecological dimension to the study to reveal how these climatic fluctuations severely affected ecosystems."
Co-author Dr Philip Mannion, from University College London, added: "In this study we add a modelling approach to key geological and climate data that shows the devastating effect of the asteroid impact on global habitats. Essentially, it produces a blue screen of death for dinosaurs."
Although volcanoes release Sun-blocking gases and particles, they also release carbon dioxide, a greenhouse gas. In the short term after an eruption, the Sun-blockers have a larger effect, causing a 'volcanic winter'. However, in the longer term these particles and gases drop out of the atmosphere, while carbon dioxide stays around and builds up, warming the planet.
After the initial drastic global winter caused by the asteroid, the team's model suggests that in the longer term, volcanic warming could have helped restore many habitats, helping new life that evolved after the disaster to thrive.
Read more at Science Daily
Jun 29, 2020
New 3D model shows how the paradise tree snake uses aerial undulation to fly
When the paradise tree snake flies from one tall branch to another, its body ripples with waves like green cursive on a blank pad of blue sky. That movement, aerial undulation, happens in each glide made by members of the genus Chrysopelea, the only known limbless vertebrates capable of flight. Scientists have known this, but have yet to fully explain it.
For more than 20 years, Jake Socha, a professor in the Department of Biomedical Engineering and Mechanics at Virginia Tech, has sought to measure and model the biomechanics of snake flight and answer questions about them, like that of aerial undulation's functional role. For a study published by Nature Physics, Socha assembled an interdisciplinary team to develop the first continuous, anatomically-accurate 3D mathematical model of Chrysopelea paradisi in flight.
The team, which included Shane Ross, a professor in the Kevin T. Crofton Department of Aerospace and Ocean Engineering, and Isaac Yeaton, a recent mechanical engineering doctoral graduate and the paper's lead author, developed the 3D model after measuring more than 100 live snake glides. The model factors in frequencies of undulating waves, their direction, forces acting on the body, and mass distribution. With it, the researchers have run virtual experiments to investigate aerial undulation.
In one set of those experiments, to learn why undulation is a part of each glide, they simulated what would happen if it wasn't -- by turning it off. When their virtual flying snake could no longer aerially undulate, its body began to tumble. The test, paired with simulated glides that kept the waves of undulation going, confirmed the team's hypothesis: aerial undulation enhances rotational stability in flying snakes.
Questions of flight and movement fill Socha's lab. The group has fit their work on flying snakes between studies of how frogs leap from water and skitter across it, how blood flows through insects, and how ducks land on ponds. In part, it was important to Socha to probe undulation's functional role in snake glides because it would be easy to assume that it didn't really have one.
"We know that snakes undulate for all kinds of reasons and in all kinds of locomotor contexts," said Socha. "That's their basal program. By program, I mean their neural, muscular program -- they're receiving specific instructions: fire this muscle now, fire that muscle, fire this muscle. It's ancient. It goes beyond snakes. That pattern of creating undulations is an old one. It's quite possible that a snake gets into the air, then it goes, 'What do I do? I'm a snake. I undulate.'"
But Socha believed there was much more to it. Throughout the paradise tree snake's flight, so many things happen at once, it's difficult to untangle them with the naked eye. Socha described a few steps that take place with each glide -- steps that read as intentional.
First, the snake jumps, usually by curving its body into a "J-loop" and springing up and out. As it launches, the snake reconfigures its shape, its muscles shifting to flatten its body out everywhere but the tail. The body becomes a "morphing wing" that produces lift and drag forces when air flows over it, as it accelerates downward under gravity. Socha has examined these aerodynamic properties in multiple studies. With the flattening comes undulation, as the snake sends waves down its body.
At the outset of the study, Socha had a theory for aerial undulation he explained by comparing two types of aircraft: jumbo jets versus fighter jets. Jumbo jets are designed for stability and start to level back out on their own when perturbed, he said, whereas fighters roll out of control.
So which would the snake be?
"Is it like a big jumbo jet, or is it naturally unstable?" Socha said. "Is this undulation potentially a way of it dealing with stability?"
He believed the snake would be more like a fighter jet.
To run tests investigating undulation's importance to stability, the team set out to develop a 3D mathematical model that could produce simulated glides. But first, they needed to measure and analyze what real snakes do when gliding.
In 2015, the researchers collected motion capture data from 131 live glides made by paradise tree snakes. They turned The Cube, a four-story black-box theater at the Moss Arts Center, into an indoor glide arena and used its 23 high-speed cameras to capture the snakes' motion as they jumped from 27 feet up -- from an oak tree branch atop a scissor lift -- and glided down to an artificial tree below, or onto the surrounding soft foam padding the team set out in sheets to cushion their landings.
The cameras put out infrared light, so the snakes were marked with infrared-reflective tape on 11 to 17 points along their bodies, allowing the motion capture system to detect their changing position over time. Finding the number of measurement points has been key to the study; in past experiments, Socha marked the snake at three points, then five, but those numbers didn't provide enough information. The data from fewer video points only provided a coarse understanding, making for choppy and low-fidelity undulation in the resulting models.
The team found a sweet spot in 11 to 17 points, which gave high-resolution data. "With this number, we could get a smooth representation of the snake, and an accurate one," said Socha.
The researchers went on to build the 3D model by digitizing and reproducing the snake's motion while folding in measurements they had previously collected on mass distribution and aerodynamics. An expert in dynamic modeling, Ross guided Yeaton's work on a continuous model by drawing inspiration from work in spacecraft motion.
He had worked with Socha to model flying snakes since 2013, and their previous models treated the snake's body in parts -- first in three parts, as a trunk, a middle, and an end, and then as a bunch of links. "This is the first one that's continuous," said Ross. "It's like a ribbon. It's the most realistic to this point."
In virtual experiments, the model showed that aerial undulation not only kept the snake from tipping over during glides, but it increased the horizontal and vertical distances traveled.
Ross sees an analogy for the snake's undulation in a frisbee's spin: the reciprocating motion increases rotational stability and results in a better glide. By undulating, he said, the snake is able to balance out the lift and drag forces its flattened body produces, rather than being overwhelmed by them and toppling, and it's able to go further.
The experiments also revealed to the team details they hadn't previously been able to visualize. They saw that the snake employed two waves when undulating: a large-amplitude horizontal wave and a newly discovered, smaller-amplitude vertical wave. The waves went side to side and up and down at the same time, and the data showed that the vertical wave went at twice the rate of the horizontal one. "This is really, really freaky," said Socha. These double waves have only been discovered in one other snake, a sidewinder, but its waves go at the same frequency.
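The two-wave kinematics can be written down compactly: a large horizontal serpenoid wave plus a smaller vertical wave at twice its rate. The parameterization below is our illustrative reading of that description; the amplitudes, frequency, and wavelength are placeholders, not the paper's fitted values.

```python
import math

# Illustrative two-wave midline for a gliding snake: a large-amplitude
# horizontal wave (y) and a smaller vertical wave (z) at twice the rate.
A_H, A_V = 1.0, 0.2   # horizontal vs. smaller vertical amplitude (placeholders)
F = 1.5               # undulation frequency in Hz (placeholder)
N_WAVES = 1.2         # spatial waves along the body (placeholder)

def body_wave(s, t):
    """Lateral (y) and vertical (z) offsets at arc length s in [0, 1]
    (head to vent) and time t in seconds."""
    phase = 2 * math.pi * (N_WAVES * s - F * t)
    return A_H * math.sin(phase), A_V * math.sin(2 * phase)

# Sample the midline at one instant.
for s in (0.0, 0.25, 0.5, 0.75, 1.0):
    y, z = body_wave(s, t=0.1)
    print(f"s={s:.2f}: y={y:+.2f}, z={z:+.2f}")
```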
"What really makes this study powerful is that we were able to dramatically advance both our understanding of glide kinematics and our ability to model the system," said Yeaton. "Snake flight is complicated, and it's often tricky to get the snakes to cooperate. And there are many intricacies to make the computational model accurate. But it's satisfying to put all of the pieces together."
"In all these years, I think I've seen close to a thousand glides," said Socha. "It's still amazing to see every time. Seeing it in person, there's something a little different about it. It's shocking still. What exactly is this animal doing? Being able to answer the questions I've had since I was a graduate student, many, many years later, is incredibly satisfying."
Socha credits some of the elements that shaped the real and simulated glide experiments to forces out of his control. Chance led him to the indoor glide arena: a few years after the Moss Arts Center opened, Tanner Upthegrove, a media engineer for the Institute for Creativity, Arts, and Technology, or ICAT, asked him if he'd ever thought about working in the Cube.
"What's the Cube?" he asked. When Upthegrove showed him the space, he was floored. It seemed designed for Socha's experiments.
In some ways, it was. "Many projects at ICAT used the advanced technology of the Cube, a studio unlike any other in the world, to reveal that which could normally not be seen," said Ben Knapp, the founding director of ICAT. "Scientists, engineers, artists, and designers join forces here to build, create, and innovate new ways to approach the world's grandest challenges."
In one of the center's featured projects, "Body, Full of Time," media and visual artists used the space to motion capture the body movements of dancers for an immersive performance. Trading dancers for snakes, Socha was able to make the most of the Cube's motion capture system. The team could move cameras around, optimizing their position for the snake's path. They took advantage of latticework at the top of the space to position two cameras pointing down, providing an overhead view of the snake, which they'd never been able to do before.
Socha and Ross see potential for their 3D model to continue exploring snake flight. The team is planning outdoor experiments to gather motion data from longer glides. And one day, they hope to cross the boundaries of biological reality.
Right now, their virtual flying snake always glides down, like the real animal. But what if they could get it to move so that it would actually start to go up? To really fly? That ability could potentially be built into the algorithms of robotic snakes, which have exciting applications in search and rescue and disaster monitoring, Ross said.
Read more at Science Daily
Consumers can distinguish between bitter tastes in beer -- but it doesn't alter liking
Although most beer consumers can distinguish between different bitter tastes in beer, this does not appear to influence which beer they like. It seems they just like beer, regardless of the source of the bitterness.
That is the conclusion of Penn State sensory researchers who conducted multiple studies with more than 150 self-identified beer drinkers to see if they could differentiate bitterants in beer. But the question of whether humans can discriminate between types of bitterness remains controversial, according to researcher John Hayes, associate professor of food science.
"Given that countless craft breweries around the country have been very successful in selling a near-endless variety of India pale ales -- better known as IPAs -- we wanted to move past testing bitter chemicals in water to see if consumers could differentiate different bitters in a real food such as beer," he said.
To determine beer drinkers' ability to distinguish between bitter chemicals, study participants in blind taste tests were given commercially available nonalcoholic beer spiked with hop extract Isolone, quinine -- the ingredient that makes tonic water bitter -- and sucrose octaacetate, a food additive so bitter it has been used as a nail-biting and thumb-sucking deterrent.
Participants, about half men and half women, most in their 30s, took part in three experiments. In the first, researchers asked subjects to rate the amount of bitterness and other beer flavor attributes in samples using an intensity scale, to ensure the beer samples were equally bitter.
In the next experiment, beer consumers rated how samples differed from a reference on a seven-point scale. Then, to understand how each sample differed from others, participants checked attributes on a list of 13 descriptors to describe the samples.
In the final experiment, beer consumers tasted the beer samples, rated how much they liked each sample and provided a forced-choice ranking from best-liked to worst-liked.
According to Hayes, who is director of Penn State's Sensory Evaluation Center in the College of Agricultural Sciences, most participants were able to discern differences in bitterness -- even though the samples had been matched for bitterness intensity.
"But our results also show that, despite being able to differentiate between the different bitter chemicals, they were not able to verbally describe these differences, even when provided a list of attributes," he said. "Further, we found no consistent effect on liking or preference. The source of bitterness did not influence which beers they liked."
In the sampled beers, researchers attempted to match the flavor profile of a pale ale style beer, in which high bitterness is not only accepted but desired by consumers, noted lead researcher Molly Higgins, who will receive her doctoral degree in food science this August. Higgins explained that she recruited regular beer consumers because they are more likely to be aware of the various flavor profiles of beer and respond positively to the bitter qualities of samples during testing.
"What we found was unsurprising in hindsight -- beer consumers simply like beer," she said. "So, it seems that for consumers who drink IPAs, a beer just needs to have a bitter profile. For them, it's about bitterness in general, not the specific bitter quality -- if it's there, they will like it."
Higgins suggests that this finding may help in quality assurance at breweries. "Beer consumers may be more forgiving than previously believed when it comes to small variations across batches," she said.
Higgins noted that some breweries use highly trained expert tasters to evaluate each batch. If these experts detect any off notes or flaws in the final product, they may throw out an entire batch. "When breweries can establish an acceptable range for sensory attributes for their final products, they can make better decisions about how much variation is tolerable," she said.
However, there are many segments of beer consumers, Higgins added, and within the craft beer market there is a unique subgroup of consumers who are devoted to their IPAs. Those beer drinkers, she explained, doubtlessly pick up on more of the finer bitter notes created by novel blends of hops. Those consumers patronize craft breweries and are willing to try many different beers.
The bitter beer tasting study, recently published in Nutrients, was part of a larger research project conducted by Higgins at Penn State for her dissertation. Because of its sensory complexity and wide acceptance by many consumers, she contends, beer is a good model food to explore the capacity of people to perceive bitter taste.
Higgins said when people ask her why she would do this kind of a study, she points out that it's not about beer.
Read more at Science Daily
Responses to cyberbullying
It is well known that victims of bullying can face higher risks of future health and social problems. However, victims experience a broad range of responses, and some may not suffer at all. To the researchers, this implied there might be factors that protect against some consequences of bullying. In a study of over 6,000 adolescents in Japan, they found a strong candidate moderating factor in what is known as emotional competence.
Online bullying, or cyberbullying, is not a new phenomenon, but as the world becomes more dependent on online communication, it does become a greater threat. Lead author Yuhei Urano, Associate Professor Ryu Takizawa and Professor Haruhiko Shimoyama from the Department of Clinical Psychology at the University of Tokyo and their team investigated protective factors for the adverse effects of cyberbullying victimization. They analyzed data from 6,403 adolescents aged 12 to 18 (1,925 male, 4,478 female) for their study.
"We chose users of a social networking app as participants of the study, because they were likely to experience more online interactions than others," said Urano. "The surveys explored instances of cyberbullying victimization and a cross section of other personal and social information. These allowed us to investigate whether the ability to handle emotions, called emotional competence, correlated with the severity of the repercussions of cyberbullying."
What the researchers found may at first seem counterintuitive, but after careful analysis, their results showed that higher emotional skills were not always associated with better mental health; they may actually make things worse depending on the social context. It depends on the individual's specific emotional competence, defined as the ability to identify, understand, express, regulate and use emotions. There is intrapersonal emotional competence, the ability to handle one's own emotions, and interpersonal emotional competence, the ability to handle others' emotions.
"We thought that intrapersonal emotional competence showed buffering effects against cyberbullying, because the ability to handle one's own emotions is known to have a positive impact on our mental health," said Urano. "On the other hand, we thought interpersonal emotional competence showed the opposite effect, because the ability to understand emotional states in others may encourage individuals to dwell on the bully's intentions."
The researchers hope this study could pave the way to investigations about the different roles of intrapersonal and interpersonal emotional competence, both the positive and negative effects they may have. However, given the inherent complexity of the topic in question, they suggest that in order to correctly determine the causal relationships behind their results, more longitudinal studies should be conducted in the future.
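Buffering and aggravating effects of this kind are usually tested statistically as moderation: a regression of distress on victimization with interaction terms for each competence score. The sketch below illustrates that setup on synthetic data using the widely used statsmodels library; it is our illustration of the method, not the authors' analysis code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500

# Synthetic stand-ins: victimization plus intrapersonal (intra_ec) and
# interpersonal (inter_ec) emotional competence scores.
df = pd.DataFrame({
    "victimization": rng.normal(size=n),
    "intra_ec": rng.normal(size=n),
    "inter_ec": rng.normal(size=n),
})
# Built-in effects mirror the hypothesis: intra_ec buffers (negative
# interaction), inter_ec aggravates (positive interaction).
df["distress"] = (
    0.5 * df.victimization
    - 0.3 * df.victimization * df.intra_ec
    + 0.2 * df.victimization * df.inter_ec
    + rng.normal(size=n)
)

# Moderation model: main effects plus victimization-by-competence terms.
model = smf.ols(
    "distress ~ victimization * intra_ec + victimization * inter_ec",
    data=df,
).fit()
print(model.params.round(2))
```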
From Science Daily
Humans and monkeys show similar thinking patterns
Rhesus macaque.
In experiments on 100 study participants across age groups, cultures and species, researchers found that indigenous Tsimane' people in Bolivia's Amazon rainforest, American adults and preschoolers and macaque monkeys all show, to varying degrees, a knack for "recursion," a cognitive process of arranging words, phrases or symbols in a way that helps convey complex commands, sentiments and ideas.
The findings, published today (Friday, June 26) in the journal Science Advances, shed new light on our understanding of the evolution of language, researchers said.
"For the first time, we have strong empirical evidence about patterns of thinking that come naturally to probably all humans and, to a lesser extent, non-human primates," said study co-author Steven Piantadosi, a UC Berkeley assistant professor of psychology.
Indeed, the monkeys were found to perform far better in the tests than the researchers had predicted.
"Our data suggest that, with sufficient training, monkeys can learn to represent a recursive process, meaning that this ability may not be as unique to humans as is commonly thought," said Sam Cheyette, a Ph.D. student in Piantadosi's lab and co-author of the study.
Known in linguistics as "nested structures," recursive phrases within phrases are crucial to syntax and semantics in human language. A simple example is a British nursery rhyme that talks about "the dog that worried the cat that killed the rat that ate the malt that lay in the house that Jack built."
Researchers tested the recursive skills of 10 U.S. adults, 50 preschoolers and kindergarteners, 37 members of the Tsimane' and three male macaque monkeys.
First, all participants were trained to memorize different sequences of symbols in a particular order. Specifically, they learned sequences such as { ( ) } or { [ ] }, which are analogous to some linguistic nested structures.
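Sequences like { ( ) } are "nested" in the same last-in, first-out sense as center-embedded clauses: the most recently opened bracket must be closed first. A short sketch of that defining property follows (our illustration of the structure being tested, not the experiment's software):

```python
PAIRS = {"(": ")", "[": "]", "{": "}"}

def is_nested(sequence: str) -> bool:
    """True if every bracket closes in last-in, first-out order --
    the defining property of a recursive, center-embedded structure."""
    stack = []
    for symbol in sequence:
        if symbol in PAIRS:                # opener: remember it
            stack.append(symbol)
        elif symbol in PAIRS.values():     # closer: must match last opener
            if not stack or PAIRS[stack.pop()] != symbol:
                return False
    return not stack                       # nothing left unclosed

print(is_nested("{()}"))   # True  -- like { ( ) }
print(is_nested("{[]}"))   # True  -- like { [ ] }
print(is_nested("{(})"))   # False -- crossed, not nested
```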
Participants from the U.S. and monkeys used a large touchscreen monitor to memorize the sequences. They heard a ding if they got a symbol in the right place, a buzzer if they got it wrong and a chime if the whole sequence was correct. The monkeys received snacks or juice as positive feedback.
Meanwhile, the Tsimane' participants, who are less accustomed to interacting with computers, were tested with paper index cards and given verbal feedback.
Next, all participants were asked to place, in the right order, four images from different groupings shown in random order on the screen.
To varying degrees, the participants all arranged their new lists in recursive structures, which is remarkable given that "Tsimane' adults, preschool children and monkeys, who lack formal mathematics and reading training, had never been exposed to such stimuli before testing," the study noted.
"These results are convergent with recent findings that monkeys can learn other kinds of structures found in human grammar," Piantadosi said.
Read more at Science Daily
Jun 28, 2020
Sled dogs are closely related to 9,500-year-old 'ancient dog'
[Image: Sled dogs]
Dogs play an important role in human life all over the world -- whether as a family member or as a working animal. But where the dog comes from and how old various groups of dogs are is still a bit of a mystery.
Now, light has been shed on the origin of the sledge dog. In a new study published in SCIENCE, researchers from the Faculty of Health and Medical Sciences, University of Copenhagen, show that the sledge dog is older, and adapted to the Arctic much earlier, than previously thought. The research was conducted in collaboration with the University of Greenland and the Institute of Evolutionary Biology, Barcelona.
"We have extracted DNA from a 9,500-year-old dog from the Siberian island of Zhokhov, which the dog is named after. Based on that DNA we have sequenced the oldest complete dog genome to date, and the results show an extremely early diversification of dogs into types of sledge dogs," says one of the two first authors of the study, PhD student Mikkel Sinding, the Globe Institute.
Until now, it has been the common belief that the 9,500-year-old Siberian dog, Zhokhov, was a kind of ancient dog -- one of the earliest domesticated dogs and a version of the common origin of all dogs. But according to the new study, modern sledge dogs such as the Siberian Husky, the Alaskan Malamute and the Greenland sledge dog share the major part of their genome with Zhokhov.
"This means that modern sledge dogs and Zhokhov had the same common origin in Siberia more than 9,500 years ago. Until now, we have thought that sledge dogs were only 2-3,000 years old," says the other first author, Associate Professor Shyam Gopalakrishnan, Globe Institute.
The Original Sledge Dog
To learn more about the origins of the sledge dog, the researchers also sequenced the genomes of a 33,000-year-old Siberian wolf and ten modern Greenlandic sledge dogs, and compared them to the genomes of dogs and wolves from around the world.
"We can see that the modern sledge dogs have most of their genomes in common with Zhokhov. So, they are more closely related to this ancient dog than to other dogs and wolves. But not just that -- we can see traces of crossbreeding with wolves such as the 33,000-year-old Siberian wolf -- but not with modern wolves. It further emphasises that the origin of the modern sledge dog goes back much further than we had thought," says Mikkel Sinding.
The modern sledge dogs have more genetic overlap with other modern dog breeds than Zhokhov has, though the data do not show where or when this mixing occurred. Nevertheless, among modern sledge dogs, the Greenland sledge dog stands out with the least overlap with other dogs, meaning it is probably the most original sledge dog in the world.
Common Features with Inuit and Polar Bears
In addition to advancing the common understanding of the origin of sledge dogs, the new study also teaches the researchers more about the differences between sledge dogs and other dogs. Sledge dogs do not have the same genetic adaptations to a sugar and starch rich diet that other dogs have. On the other hand, they have adaptations to high-fat diets, with mechanisms that are similar to those described for polar bears and Arctic people.
Read more at Science Daily
Mystery of solar cycle illuminated
[Image: Sun's surface]
A team of scientists from the Max Planck Institute for Solar System Research, the University of Göttingen and New York University Abu Dhabi has now succeeded in drawing the most comprehensive picture to date of the plasma flows in the north-south direction. The researchers found a remarkably simple flow geometry: the plasma describes a single turnover in each solar hemisphere, lasting about 22 years. In addition, the equatorward flow at the bottom of the convection zone causes sunspots to form closer and closer to the equator as the solar cycle progresses.
The number of sunspots on the visible solar surface varies: sometimes there are more, sometimes fewer. The interval between two sunspot maxima is about eleven years, and after 22 years the sunspots are again magnetically polarized in the same way. During the maximum, not only large sunspots but also active regions appear, impressive arcs of hot plasma reach far into the solar atmosphere, and particles and radiation are hurled into space in violent eruptions. At the activity minimum, the sun calms down noticeably.
"Over the course of a solar cycle, the meridional flow acts as a conveyor belt that drags the magnetic field along and sets the period of the solar cycle," says Prof. Dr. Laurent Gizon, MPS Director and first author of the new study. "Seeing the geometry and the amplitude of motions in the solar interior is essential to understanding the Sun's magnetic field," he adds. To this end, Gizon and his team used helioseismology to map the plasma flow below the Sun's surface.
Helioseismology is to solar physics what seismology is to geophysics. Helioseismologists use sound waves to probe the Sun's interior, in much the same way geophysicists use earthquakes to probe the interior of the Earth. Solar sound waves have periods near five minutes and are continuously excited by near-surface convection. The motions associated with solar sound waves can be measured at the Sun's surface by telescopes on spacecraft or on the ground.
In this study, Gizon and his team used observations of sound waves at the surface that propagate in the north-south direction through the solar interior. These waves are perturbed by the meridional flow: they travel faster along the flow than against it. The resulting travel-time perturbations are very small (less than one second) and had to be measured very carefully and interpreted with mathematical models to infer the meridional flow.
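A one-dimensional toy model shows why these shifts are so tiny. A wave crossing a path of length L at sound speed c takes L/(c - u) seconds against a flow u and L/(c + u) seconds with it, a difference of roughly 2Lu/c^2. The values below are illustrative assumptions for a rough order-of-magnitude check, not the study's measured numbers:

# Toy estimate of a flow-induced travel-time shift (all values assumed).
L = 5.0e5   # path length through the interior, km
c = 100.0   # typical interior sound speed, km/s
u = 0.01    # meridional flow speed, km/s (about 36 km/h)

t_with = L / (c + u)             # travel time along the flow, s
t_against = L / (c - u)          # travel time against the flow, s
dt = t_against - t_with          # exact difference, s
dt_approx = 2 * L * u / c**2     # first-order approximation, s

print(f"dt = {dt:.3f} s, approx = {dt_approx:.3f} s")  # both ~1 s

A shift of about one second against a one-way travel time of roughly 5,000 seconds is why the measurements must be averaged over very long periods.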
Because it is so small, the meridional flow is extremely difficult to detect in the solar interior. "The meridional flow is much slower than other components of motion, such as the Sun's differential rotation," Gizon explains. Throughout the convection zone, the meridional flow never exceeds its maximum surface value of 50 kilometers per hour. "To reduce the noise level in the helioseismic measurements, it is necessary to average the measurements over very long periods of time," says Dr. Zhi-Chao Liang of MPS.
The team of scientists analyzed, for the first time, two independent very long time series of data. One was provided by SOHO, the oldest solar observatory in space, which is operated by ESA and NASA. The data taken by SOHO's Michelson Doppler Imager (MDI) cover the period from 1996 until 2011. A second independent data set was provided by the Global Oscillation Network Group (GONG), which combines six ground-based solar telescopes in the USA, Australia, India, Spain, and Chile to offer nearly continuous observations of the Sun since 1995.
"The international solar physics community is to be commended for delivering multiple datasets covering the last two solar cycles," says Dr. John Leibacher, a former director of the GONG project. "This makes it possible to average over long periods of time and to compare answers, which is absolutely essential to validate inferences," he adds.
Gizon and his team find that the flow is equatorward at the base of the convection zone, with a speed of only 15 kilometers per hour (about running speed). The flow at the solar surface is poleward and reaches up to 50 kilometers per hour. The overall picture is that the plasma goes around in one gigantic loop in each hemisphere. Remarkably, the time taken for the plasma to complete the loop is approximately 22 years -- providing the physical explanation for the Sun's eleven-year sunspot cycle.
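A back-of-the-envelope check makes a turnover time of this order plausible. The mean speeds below are assumptions chosen under the quoted peaks of 50 and 15 km/h, since the flow is slower away from mid-latitudes:

import math

R_sun = 6.96e5                        # solar radius, km
R_base = 0.7 * R_sun                  # base of the convection zone, km
surface_leg = (math.pi / 2) * R_sun   # equator to pole at the surface, km
base_leg = (math.pi / 2) * R_base     # pole to equator at the base, km

v_surface = 30.0   # assumed mean poleward speed, km/h (peak is 50)
v_base = 10.0      # assumed mean equatorward speed, km/h (peak is 15)

hours_per_year = 24 * 365.25
t_surface = surface_leg / v_surface / hours_per_year   # ~4 years
t_base = base_leg / v_base / hours_per_year            # ~9 years
print(f"surface leg ~ {t_surface:.1f} yr, base leg ~ {t_base:.1f} yr")

The two horizontal legs alone account for over a decade; the slow vertical legs near the poles and the equator plausibly make up the remainder of the roughly 22-year loop.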
Read more at Science Daily