Apr 19, 2024

Astronomers uncover methane emission on a cold brown dwarf

Using new observations from the James Webb Space Telescope (JWST), astronomers have discovered methane emission on a brown dwarf, an unexpected finding for such a cold and isolated world. Published in the journal Nature, the findings suggest that this brown dwarf might generate aurorae similar to those seen on our own planet as well as on Jupiter and Saturn.

More massive than planets but lighter than stars, brown dwarfs are ubiquitous in our solar neighborhood, with thousands identified. Last year, Jackie Faherty, a senior research scientist and senior education manager at the American Museum of Natural History, led a team of researchers who were awarded time on JWST to investigate 12 brown dwarfs. Among those was CWISEP J193518.59-154620.3 (or W1935 for short) -- a cold brown dwarf 47 light-years away that was co-discovered by Backyard Worlds: Planet 9 citizen science volunteer Dan Caselden and the NASA CatWISE team. W1935 has a surface temperature of about 400° Fahrenheit, or roughly the temperature at which you'd bake chocolate chip cookies. Its mass isn't well known, but it likely lies between 6 and 35 times the mass of Jupiter.

After looking at a number of brown dwarfs observed with JWST, Faherty's team noticed that W1935 looked similar but with one striking exception: it was emitting methane, something that's never been seen before on a brown dwarf.

"Methane gas is expected in giant planets and brown dwarfs but we usually see it absorbing light, not glowing," said Faherty, the lead author of the study. "We were confused about what we were seeing at first but ultimately that transformed into pure excitement at the discovery."

Computer modeling yielded another surprise: the brown dwarf likely has a temperature inversion, a phenomenon in which the atmosphere gets warmer with increasing altitude. Temperature inversions can easily happen to planets orbiting stars, but W1935 is isolated, with no obvious external heat source.

"We were pleasantly shocked when the model clearly predicted a temperature inversion," said co-author Ben Burningham from the University of Hertfordshire. "But we also had to figure out where that extra upper atmosphere heat was coming from."

To investigate, the researchers turned to our own solar system. In particular, they looked at studies of Jupiter and Saturn, which both show methane emission and have temperature inversions. The likely cause of this feature on the solar system giants is aurorae, so the research team surmised that they had uncovered the same phenomenon on W1935.

Planetary scientists know that one of the major drivers of aurorae on Jupiter and Saturn is high-energy particles from the Sun that interact with the planets' magnetic fields and atmospheres, heating the upper layers. This is also the reason for the aurorae we see on Earth, commonly referred to as the Northern or Southern Lights since they are most spectacular near the poles. But W1935 has no host star, so a solar wind cannot contribute to the explanation.

There is an enticing additional driver of aurorae in our solar system. Both Jupiter and Saturn have active moons that occasionally eject material into space, interact with the planets, and enhance the auroral footprint on those worlds. Jupiter's moon Io is the most volcanically active world in the solar system, spewing lava fountains dozens of miles high, and Saturn's moon Enceladus ejects water vapor from its geysers that simultaneously freezes and boils when it hits space. More observations are needed, but the researchers speculate that one explanation for the aurora on W1935 might be an active, yet-to-be-discovered moon.

"Every time an astronomer points JWST at an object, there's a chance of a new mind-blowing discovery," said Faherty. "Methane emission was not on my radar when we started this project but now that we know it can be there and the explanation for it so enticing I am constantly on the look-out for it. That's part of how science moves forward."

Read more at Science Daily

Ice age climate analysis reduces worst-case warming expected from rising CO2

As carbon dioxide accumulates in the atmosphere, the Earth will get hotter. But exactly how much warming will result from a given increase in CO2 is still an open question. The relationship between CO2 and warming, known as climate sensitivity, determines what future we should expect as CO2 levels continue to climb.

New research led by the University of Washington analyzes the most recent ice age, when a large swath of North America was covered in ice, to better understand the relationship between CO2 and global temperature. It finds that while most future warming estimates remain unchanged, the absolute worst-case scenario is unlikely.

The open-access study was published April 17 in Science Advances.

"The main contribution from our study is narrowing the estimate of climate sensitivity, improving our ability to make future warming projections," said lead author Vince Cooper, a UW doctoral student in atmospheric sciences. "By looking at how much colder Earth was in the ancient past with lower levels of greenhouse gases, we can estimate how much warmer the current climate will get with higher levels of greenhouse gases."

The new paper doesn't change the best-case warming scenario from doubling CO2 -- about 2 degrees Celsius average temperature increase worldwide -- or the most likely estimate, which is about 3 degrees Celsius. But it reduces the worst-case scenario for a doubling of CO2 by a full degree, from 5 degrees Celsius to 4 degrees Celsius. (For reference, CO2 is currently at 425 ppm, or about 1.5 times preindustrial levels, and, unless emissions drop, is headed toward double preindustrial levels before the end of this century.)
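A quick back-of-envelope check of that parenthetical, taking roughly 280 ppm as the commonly cited preindustrial CO2 level (an assumed baseline, not a figure stated in the article):

    \frac{425~\mathrm{ppm}}{280~\mathrm{ppm}} \approx 1.52, \qquad 2 \times 280~\mathrm{ppm} = 560~\mathrm{ppm}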

As our planet heads toward a doubling of CO2, the authors caution that the recent decades are not a good predictor of the future under global warming. Shorter-term climate cycles and atmospheric pollution's effects are just some reasons that recent trends can't reliably predict the rest of this century.

"The spatial pattern of global warming in the most recent 40 years doesn't look like the long-term pattern we expect in the future -- the recent past is a bad analog for future global warming," said senior author Kyle Armour, a UW associate professor of atmospheric sciences and of oceanography.

Instead, the new study focused on a period 21,000 years ago, known as the Last Glacial Maximum, when Earth was on average 6 degrees Celsius cooler than today. Ice core records show that atmospheric CO2 then was less than half of today's levels, at about 190 parts per million.

"The paleoclimate record includes long periods that were on average much warmer or colder than the current climate, and we know that there were big climate forcings from ice sheets and greenhouse gases during those periods," Cooper said. "If we know roughly what the past temperature changes were and what caused them, then we know what to expect in the future."

Researchers including co-author Gregory Hakim, a UW professor of atmospheric sciences, have created new statistical modeling techniques that allow paleoclimate records to be assimilated into computer models of Earth's climate, similar to today's weather forecasting models. The result is more realistic temperature maps from previous millennia.
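To give a flavour of what "assimilating" a proxy record into a climate model means, here is a minimal scalar Kalman-style update -- a generic textbook sketch rather than the authors' actual technique, with invented numbers:

    # Blend a model prior with a proxy-based observation, weighting each by its
    # uncertainty (illustrative values only, not results from the study).
    prior_mean, prior_var = -6.5, 1.0 ** 2   # model-simulated cooling (deg C) and its variance
    obs_mean, obs_var = -5.8, 0.7 ** 2       # proxy-derived estimate and its error variance

    gain = prior_var / (prior_var + obs_var)                # how much weight the observation gets
    post_mean = prior_mean + gain * (obs_mean - prior_mean)
    post_var = (1.0 - gain) * prior_var

    print(f"posterior cooling: {post_mean:.2f} deg C (sd {post_var ** 0.5:.2f})")

Repeating this kind of update across many proxy sites and model grid points is, in essence, how reanalysis-style temperature maps of past millennia are assembled.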

For the new study the authors combined prehistoric climate records -- including ocean sediments, ice cores, and preserved pollen -- with computer models of Earth's climate to simulate the weather of the Last Glacial Maximum. When much of North America was covered with ice, the ice sheet didn't just cool the planet by reflecting summer sunlight off the continents, as previous studies had considered.

By altering wind patterns and ocean currents, the ice sheet also caused the northern Pacific and Atlantic oceans to become especially cold and cloudy. Analysis in the new study shows that these cloud changes over the oceans compounded the glacier's global cooling effects by reflecting even more sunlight.

In short, the study shows that CO2 played a smaller role in setting ice age temperatures than previously estimated. The flipside is that the most dire predictions for warming from rising CO2 are less likely over coming decades.

"This paper allows us to produce more confident predictions because it really brings down the upper end of future warming, and says that the most extreme scenario is less likely," Armour said. "It doesn't really change the lower end, or the average estimate, which remain consistent with all the other lines of evidence."

Read more at Science Daily

Marine plankton behavior could predict future marine extinctions

Marine communities migrated to Antarctica during the Earth's warmest period of the past 66 million years, long before a mass-extinction event.

All but the most specialised sea plankton moved to higher latitudes during the Early Eocene Climatic Optimum, an interval of sustained high global temperatures comparable to worst-case global warming scenarios.

When the team -- researchers from the University of Bristol, Harvard University, the University of Texas Institute for Geophysics and the University of Victoria -- compared biodiversity and global community structure, they found that community structure often responds to climate change millions of years before losses of biodiversity.

The study, published today in Nature, suggests that plankton migrated to cooler regions to escape the tropical heat and that only the most highly specialised species were able to remain.

These findings imply that changes on the community scale will be evident long before extinctions in the modern world and that more effort must be placed on monitoring the structure of marine communities to potentially predict future marine extinctions.

Dr Adam Woodhouse from the University of Bristol's School of Earth Sciences, explained: "Considering three billion people live in the tropics, this is not great news.

"We knew that biodiversity amongst marine plankton groups has changed throughout the last 66 million years, but no one had ever explored it on a global, spatial, scale through the lens of a single database.

"We used the Triton dataset, that I created during my PhD, which offered new insights into how biodiversity responds spatially to global changes in climate, especially during intervals of global warmth which are relevant to future warming projections."

Dr Woodhouse teamed up with Dr Anshuman Swain, an ecologist and specialist in the application of networks to biological data. They applied networks to micropalaeontology for the first time to document the global spatial changes in community structure as climate has evolved over the Cenozoic, building on previous research showing how cooling restructured global marine plankton communities.

Dr Woodhouse continued: "The fossil record of marine plankton is the most complete and extensive archive of ancient biological changes available to science. By applying advanced computational analyses to this archive we were able to detail global community structure of the oceans since the death of the dinosaurs, revealing that community change often precedes the extinction of organisms.

"This exciting result suggests that monitoring of ocean community structure may represent an 'early warning system' which precedes the extinction of oceanic life."

Read more at Science Daily

Honey bees experience multiple health stressors out-in-the-field

It's not a single pesticide or virus that is stressing honey bees and affecting their health, but rather exposure to a complex web of multiple interacting stressors encountered while they are at work pollinating crops, new research out of York University has found.

Scientists have been unable to explain increasing colony mortality, even after decades of research examining the role of specific pesticides, parasitic mites, viruses or genetics. This led the research team to wonder if previous studies were missing something by focussing on one stressor at a time.

"Our study is the first to apply systems level or network analyses to honey bee stressors at a massive scale. I think this represents a paradigm shift in the field because we have been so focussed on finding the one big thing, the smoking gun," says corresponding author of the new paper York Faculty of Science Professor Amro Zayed, York Research Chair in Genomics. "But we are finding that bees are exposed to a very complicated network of stressors that change quickly over time and space. It's a level of complexity that we haven't thought about before. To me, that's the big surprise of this study."

The paper, Honey bee stressor networks are complex and dependent on crop and region, published today in Current Biology, takes a much broader look at the interplay of stressors and their effects. The study team also included researchers from the University of British Columbia, Agriculture and Agri-Food Canada, the University of Victoria, the University of Lethbridge, the University of Manitoba, l'Université Laval, the University of Guelph, and the Ontario Beekeepers' Association.

Not all stressors are the same, however. Some stressors are more influential than others -- what researchers call the social media influencers of the bee world -- having an outsized impact on the architecture of a highly complex network and their co-stressors. They also found that most of these influencer stressors are viruses and pesticides that regularly show up in combination with specific other stressors, compounding the negative effects through their interactions.
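As an illustration of the network framing described above (the stressor names and links below are hypothetical, not data from the study), simple centrality measures on a co-occurrence graph are one way to flag "influencer" stressors:

    import networkx as nx

    # Toy co-occurrence network: nodes are hypothetical stressors, edges mean two
    # stressors were detected together in the same colonies (illustrative only).
    edges = [
        ("virus_A", "mite"), ("virus_A", "pesticide_X"), ("virus_A", "virus_B"),
        ("pesticide_X", "pesticide_Y"), ("mite", "virus_B"),
        ("pesticide_Y", "fungicide_Z"),
    ]
    G = nx.Graph(edges)

    # "Influencer" stressors sit at the heart of the web; betweenness centrality
    # is one simple way to rank how much each node holds the network together.
    ranked = sorted(nx.betweenness_centrality(G).items(), key=lambda kv: -kv[1])
    for stressor, score in ranked:
        print(f"{stressor}: betweenness {score:.2f}")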

"Understanding which stressors co-occur and are likely to interact is profoundly important to unravelling how they are impacting the health and mortality of honey bee colonies," says lead author, York Postdoctoral Fellow Sarah French of the Faculty of Science.

"There have been a lot of studies about major pesticides, but in this research, we also saw a lot of minor pesticides that we don't usually think about or study. We also found a lot of viruses that beekeepers don't typically test for or manage. Seeing the influencer stressors interact with all these other stressors, whether it be mites, other pesticides or viruses, was not only interesting, but surprising."

French says the way influencer stressors co-occur with other stressors is similar to the way humans experience co-morbidities, such as when someone is diagnosed with heart disease. They are more likely to also have diabetes or high blood pressure or both, and each one impacts the other. "That's similar to the way we examine bee colonies. We look at everything that's going on in the colony and then compare or amalgamate all the colonies together to look at the broader patterns of what is happening and how everything is related. Two or multiple stressors can really synergize off each other leading to a much greater effect on bee health."

From Québec to British Columbia, honey bee colonies were given the job of pollinating some of Canada's most valuable crops -- apples, canola oil and seed, highbush and lowbush blueberry, soybean, cranberry and corn. The study covered multiple time scales, providing numerous snapshots, rather than the usual single snapshot in time. The research team found that honey bees were exposed to an average of 23 stressors at once that combined to create 307 interactions.

Honey bees are a billion-dollar industry. In 2021, honey bees contributed some $7 billion in economic value by pollinating orchards, vegetables, berries and oil seeds like canola, and produced 75 to 90 million pounds of honey. Figuring out which stressors would provide the most benefit if managed would go a long way toward developing the right tools to tackle them -- something beekeepers are often lacking.

The research is part of BEECSI: 'OMIC tools for assessing bee health, a project funded to the tune of $10 million by Genome Canada in 2018 to use genomic tools to develop a new health assessment and diagnosis platform powered by stressor-specific markers.

More research is needed to unravel how the stressors are interacting and impacting honey bee mortality and colony health going forward, says French. "It's really teasing apart which of these compounds might have that relationship and how can we build off this to study those specific relationships."

It can't come soon enough: honey bees are currently facing poor health, colony loss, parasites, pathogens and heightened stressors worldwide. Some beekeepers in Canada and the United States face the loss of up to 60 per cent of their colonies over winter.

Read more at Science Daily

Apr 18, 2024

Most massive stellar black hole in our galaxy found

Astronomers have identified the most massive stellar black hole yet discovered in the Milky Way galaxy. This black hole was spotted in data from the European Space Agency's Gaia mission because it imposes an odd 'wobbling' motion on the companion star orbiting it. Data from the European Southern Observatory's Very Large Telescope (ESO's VLT) and other ground-based observatories were used to verify the mass of the black hole, putting it at an impressive 33 times that of the Sun.

Stellar black holes are formed from the collapse of massive stars and the ones previously identified in the Milky Way are on average about 10 times as massive as the Sun. Even the next most massive stellar black hole known in our galaxy, Cygnus X-1, only reaches 21 solar masses, making this new 33-solar-mass observation exceptional.

Remarkably, this black hole is also extremely close to us -- at a mere 2000 light-years away in the constellation Aquila, it is the second-closest known black hole to Earth. Dubbed Gaia BH3 or BH3 for short, it was found while the team were reviewing Gaia observations in preparation for an upcoming data release. "No one was expecting to find a high-mass black hole lurking nearby, undetected so far," says Gaia collaboration member Pasquale Panuzzo, an astronomer at the Observatoire de Paris, part of France's National Centre for Scientific Research (CNRS). "This is the kind of discovery you make once in your research life."

To confirm their discovery, the Gaia collaboration used data from ground-based observatories, including from the Ultraviolet and Visual Echelle Spectrograph (UVES) instrument on ESO's VLT, located in Chile's Atacama Desert. These observations revealed key properties of the companion star, which, together with Gaia data, allowed astronomers to precisely measure the mass of BH3.

Astronomers have found similarly massive black holes outside our galaxy (using a different detection method), and have theorised that they may form from the collapse of stars with very few elements heavier than hydrogen and helium in their chemical composition. These so-called metal-poor stars are thought to lose less mass over their lifetimes and hence have more material left over to produce high-mass black holes after their death. But evidence directly linking metal-poor stars to high-mass black holes has been lacking until now.

Stars in pairs tend to have similar compositions, meaning that BH3's companion holds important clues about the star that collapsed to form this exceptional black hole. UVES data showed that the companion was a very metal-poor star, indicating that the star that collapsed to form BH3 was also metal-poor -- just as predicted.

The research study, led by Panuzzo, is published today in Astronomy & Astrophysics. "We took the exceptional step of publishing this paper based on preliminary data ahead of the forthcoming Gaia release because of the unique nature of the discovery," says co-author Elisabetta Caffau, also a Gaia collaboration member from the CNRS Observatoire de Paris. Making the data available early will let other astronomers start studying this black hole right now, without waiting for the full data release, planned for late 2025 at the earliest.

Further observations of this system could reveal more about its history and about the black hole itself. The GRAVITY instrument on ESO's VLT Interferometer, for example, could help astronomers find out whether this black hole is pulling in matter from its surroundings and better understand this exciting object.

Read more at Science Daily

38 trillion dollars in damages each year: World economy already committed to income reduction of 19% due to climate change

Even if CO2 emissions were to be drastically cut starting today, the world economy is already committed to an income reduction of 19% by 2050 due to climate change, a new study published in Nature finds. These damages are six times larger than the mitigation costs needed to limit global warming to two degrees. Based on empirical data from more than 1,600 regions worldwide over the past 40 years, scientists at the Potsdam Institute for Climate Impact Research (PIK) assessed the future impacts of changing climatic conditions on economic growth and their persistence.

"Strong income reductions are projected for the majority of regions, including North America and Europe, with South Asia and Africa being most strongly affected. These are caused by the impact of climate change on various aspects that are relevant for economic growth such as agricultural yields, labour productivity or infrastructure," says PIK scientist and first author of the study Maximilian Kotz. Overall, global annual damages are estimated to be at 38 trillion dollars, with a likely range of 19-59 trillion dollars in 2050. These damages mainly result from rising temperatures but also from changes in rainfall and temperature variability. Accounting for other weather extremes such as storms or wildfires could further raise them.

Huge economic costs also for the United States and European Union

"Our analysis shows that climate change will cause massive economic damages within the next 25 years in almost all countries around the world, also in highly-developed ones such as Germany, France and the United States," says PIK scientist Leonie Wenz who led the study. "These near-term damages are a result of our past emissions. We will need more adaptation efforts if we want to avoid at least some of them. And we have to cut down our emissions drastically and immediately -- if not, economic losses will become even bigger in the second half of the century, amounting to up to 60% on global average by 2100. This clearly shows that protecting our climate is much cheaper than not doing so, and that is without even considering non-economic impacts such as loss of life or biodiversity."

To date, global projections of economic damages caused by climate change have typically focused on national impacts from average annual temperatures over long time horizons. By including the latest empirical findings on climate impacts on economic growth in more than 1,600 subnational regions worldwide over the past 40 years, and by focusing on the next 26 years, the researchers were able to project sub-national damages from temperature and rainfall changes in great detail across time and space, all the while reducing the large uncertainties associated with long-term projections. The scientists combined empirical models with state-of-the-art climate simulations (CMIP-6). Importantly, they also assessed how persistently climate impacts have affected the economy in the past and took this into account as well.

Read more at Science Daily

Interspecies competition led to even more forms of ancient human -- defying evolutionary trends in vertebrates

Competition between species played a major role in the rise and fall of hominins -- and produced a "bizarre" evolutionary pattern for the Homo lineage -- according to a new University of Cambridge study that revises the start and end dates for many of our early ancestors.

Conventionally, climate is held responsible for the emergence and extinction of hominin species. In most vertebrates, however, interspecies competition is known to play an important role.

Now, research shows for the first time that competition was fundamental to "speciation" -- the rate at which new species emerge -- across five million years of hominin evolution.

The study, published today in Nature Ecology & Evolution, also suggests that the species formation pattern of our own lineage was unlike almost anything else.

"We have been ignoring the way competition between species has shaped our own evolutionary tree," said lead author Dr Laura van Holstein, a University of Cambridge biological anthropologist from Clare College. "The effect of climate on hominin species is only part of the story."

In other vertebrates, species form to fill ecological "niches" says van Holstein. Take Darwin's finches: some evolved large beaks for nut-cracking, while others evolved small beaks for feeding on certain insects. When each resource niche gets filled, competition kicks in, so no new finches emerge and extinctions take over.

Van Holstein used Bayesian modelling and phylogenetic analyses to show that, like other vertebrates, most hominin species formed when competition for resources or space was low.

"The pattern we see across many early hominins is similar to all other mammals. Speciation rates increase and then flatline, at which point extinction rates start to increase. This suggests that interspecies competition was a major evolutionary factor."

However, when van Holstein analysed our own group, Homo, the findings were "bizarre."

For the Homo lineage that led to modern humans, evolutionary patterns suggest that competition between species actually resulted in the appearance of even more new species -- a complete reversal of the trend seen in almost all other vertebrates.

"The more species of Homo there were, the higher the rate of speciation. So when those niches got filled, something drove even more species to emerge. This is almost unparalleled in evolutionary science."

The closest comparison she could find was in beetle species that live on islands, where contained ecosystems can produce unusual evolutionary trends.

"The patterns of evolution we see across species of Homo that led directly to modern humans is closer to those of island-dwelling beetles than other primates, or even any other mammal."

Recent decades have seen the discovery of several new hominin species, from Australopithecus sediba to Homo floresiensis. Van Holstein created a new database of "occurrences" in the hominin fossil record: each time an example of a species was found and dated, around 385 in total.

Fossils can be an unreliable measure of species' lifetimes. "The earliest fossil we find will not be the earliest members of a species," said van Holstein.

"How well an organism fossilises depends on geology, and on climatic conditions: whether it is hot or dry or damp. With research efforts concentrated in certain parts of the world, and we might well have missed younger or older fossils of a species as a result."

Van Holstein used data modelling to address this problem, factoring in likely numbers of each species at the beginning and end of their existence, as well as environmental effects on fossilisation, to generate new start and end dates for most known hominin species (17 in total).
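One classical way to extend an observed fossil range -- shown here purely to illustrate the general idea, not necessarily the model van Holstein used -- is the Strauss-Sadler confidence interval. If a taxon is known from H fossil horizons spread over an observed stratigraphic range R, the range extension r needed at confidence level C is

    r_C = R\left[(1 - C)^{-1/(H-1)} - 1\right]

so, for example, a species known from 10 horizons spanning 1 million years would need its range stretched by roughly 0.4 million years to be 95% confident of capturing its true first (or last) appearance.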

She found that some species thought to have evolved through "anagenesis" -- when one slowly turns into another, but the lineage doesn't split -- may actually have "budded": when a new species branches off from an existing one.

This meant that several more hominin species than previously assumed were co-existing, and so possibly competing.

While early species of hominins, such as Paranthropus, probably evolved physiologically to expand their niche -- adapting teeth to exploit new types of food, for example -- the driver of the very different pattern in our own genus Homo may well have been technology.

"Adoption of stone tools or fire, or intensive hunting techniques, are extremely flexible behaviours. A species that can harness them can quickly carve out new niches, and doesn't have to survive vast tracts of time while evolving new body plans," said van Holstein

She argues that an ability to use technology to generalise, and rapidly go beyond ecological niches that force other species to compete for habitat and resources, may be behind the exponential increase in the number of Homo species detected by the latest study.

But it also led to Homo sapiens -- the ultimate generalisers. And competition with an extremely flexible generalist in almost every ecological niche may be what contributed to the extinction of all other Homo species.

Added van Holstein: "These results show that, although it has been conventionally ignored, competition played an important role in human evolution overall. Perhaps most interestingly, in our own genus it played a role unlike that across any other vertebrate lineage known so far."

Read more at Science Daily

Paleontologists unearth what may be the largest known marine reptile

The fossilised remains of a second gigantic jawbone measuring more than two metres long have been found on a beach in Somerset, UK.

Experts have identified the bones as belonging to the jaws of a new species of enormous ichthyosaur, a type of prehistoric marine reptile. Estimates suggest the oceanic titan would have been more than 25 metres long.

Father and daughter Justin and Ruby Reynolds, from Braunton, Devon, found the first pieces of the second jawbone in May 2020, while searching for fossils on the beach at Blue Anchor, Somerset. Ruby, then aged 11, found the first chunk of giant bone, and the pair then searched together for additional pieces.

Realising they had discovered something significant, they contacted leading ichthyosaur expert Dr Dean Lomax, a palaeontologist at The University of Manchester. Dr Lomax, who is also an 1851 Research Fellow at the University of Bristol, contacted Paul de la Salle, a seasoned fossil collector who had found the first giant jawbone in May 2016 further along the coast at Lilstock.

Dr Dean Lomax said: "I was amazed by the find. In 2018, my team (including Paul de la Salle) studied and described Paul's giant jawbone and we had hoped that one day another would come to light. This new specimen is more complete, better preserved, and shows that we now have two of these giant bones -- called a surangular -- that have a unique shape and structure. I became very excited, to say the least."

Justin and Ruby, together with Paul, Dr Lomax, and several family members, visited the site to hunt for more pieces of this rare discovery. Over time, the team found additional pieces of the same jaw which fit together perfectly, like a multimillion-year-old jigsaw.

Justin said: "When Ruby and I found the first two pieces we were very excited as we realised that this was something important and unusual. When I found the back part of the jaw, I was thrilled because that is one of the defining parts of Paul's earlier discovery."

The last piece of bone was recovered in October 2022.

The research team, led by Dr Lomax, revealed that the jaw bones belong to a new species of giant ichthyosaur that would have been about the size of a blue whale. Comparing the two examples of the same bone, with the same unique features and from the same geologic time period, supports their identification.

The team have called the new genus and species Ichthyotitan severnensis, meaning "giant fish lizard of the Severn."

The bones are around 202 million years old, dating to the end of the Triassic Period, in a time known as the Rhaetian. During this time, gigantic ichthyosaurs swam the seas while the dinosaurs walked on land. It was the titans' final chapter, however, as the story told in the rocks above these fossils records a cataclysm known as the Late Triassic global mass extinction event. After this time, the giant ichthyosaurs of the family known as Shastasauridae went extinct. Today, these bones represent the very last of their kind.

Ichthyotitan is not the world's first giant ichthyosaur, but the discoveries by de la Salle and the Reynolds family are unique among those known to science. These two bones appear roughly 13 million years after their latest geologic relatives, including Shonisaurus sikanniensis from British Columbia, Canada, and Himalayasaurus tibetensis from Tibet, China.

Dr Lomax added: "I was highly impressed that Ruby and Justin correctly identified the discovery as another enormous jawbone from an ichthyosaur. They recognised that it matched the one we described in 2018. I asked them whether they would like to join my team to study and describe this fossil, including naming it. They jumped at the chance. For Ruby, especially, she is now a published scientist who not only found but also helped to name a type of gigantic prehistoric reptile. There are probably not many 15-year-olds who can say that! A Mary Anning in the making, perhaps."

Ruby said: "It was so cool to discover part of this gigantic ichthyosaur. I am very proud to have played a part in a scientific discovery like this."

Further examinations of the bones' internal structures have been carried out by master's student, Marcello Perillo, from the University of Bonn, Germany. His work confirmed the ichthyosaur origin of the bones and revealed that the animal was still growing at the time of death.

He said: "We could confirm the unique set of histological characters typical of giant ichthyosaur lower jaws: the anomalous periosteal growth of these bones hints at yet-to-be-understood bone developmental strategies, now lost in deep time, that likely allowed late Triassic ichthyosaurs to reach the known biological limits of vertebrates in terms of size. So much about these giants is still shrouded in mystery, but one fossil at a time we will be able to unravel their secret."

Concluding the work, Paul de la Salle added: "To think that my discovery in 2016 would spark so much interest in these enormous creatures fills me with joy. When I found the first jawbone, I knew it was something special. To have a second that confirms our findings is incredible. I am overjoyed."

Read more at Science Daily

Apr 17, 2024

No gamma rays seen coming from nearby supernova

A nearby supernova in 2023 offered astrophysicists an excellent opportunity to test ideas about how these types of explosions boost particles, called cosmic rays, to near light-speed. But surprisingly, NASA's Fermi Gamma-ray Space Telescope detected none of the high-energy gamma-ray light those particles should produce.

On May 18, 2023, a supernova erupted in the nearby Pinwheel galaxy (Messier 101), located about 22 million light-years away in the constellation Ursa Major. The event, named SN 2023ixf, is the most luminous nearby supernova discovered since Fermi launched in 2008.

"Astrophysicists previously estimated that supernovae convert about 10% of their total energy into cosmic ray acceleration," said Guillem Martí-Devesa, a researcher at the University of Trieste in Italy. "But we have never observed this process directly. With the new observations of SN 2023ixf, our calculations result in an energy conversion as low as 1% within a few days after the explosion. This doesn't rule out supernovae as cosmic ray factories, but it does mean we have more to learn about their production."

The paper, led by Martí-Devesa while at the University of Innsbruck in Austria, will appear in a future edition of Astronomy and Astrophysics.

Trillions of trillions of cosmic rays collide with Earth's atmosphere every day. Roughly 90% of them are hydrogen nuclei -- or protons -- and the remainder are electrons or the nuclei of heavier elements.

Scientists have been investigating cosmic ray origins since the early 1900s, but the particles can't be traced back to their sources. Because they're electrically charged, cosmic rays change course as they travel to Earth thanks to magnetic fields they encounter.

"Gamma rays, however, travel directly to us," said Elizabeth Hays, the Fermi project scientist at NASA's Goddard Space Flight Center in Greenbelt, Maryland. "Cosmic rays produce gamma rays when they interact with matter in their environment. Fermi is the most sensitive gamma-ray telescope in orbit, so when it doesn't detect an expected signal, scientists must explain the absence. Solving that mystery will build a more accurate picture of cosmic ray origins."

Astrophysicists have long suspected supernovae of being top cosmic ray contributors.

These explosions occur when a star at least eight times the Sun's mass runs out of fuel. The core collapses and then rebounds, propelling a shock wave outward through the star. The shock wave accelerates particles, creating cosmic rays. When cosmic rays collide with other matter and light surrounding the star, they generate gamma rays.

Supernovae greatly impact a galaxy's interstellar environment. Their blast waves and expanding clouds of debris may persist for more than 50,000 years. In 2013, Fermi measurements showed that supernova remnants in our own Milky Way galaxy were accelerating cosmic rays, which generated gamma-ray light when they struck interstellar matter. But astronomers say the remnants aren't producing enough high-energy particles to match scientists' measurements on Earth.

One theory proposes that supernovae may accelerate the most energetic cosmic rays in our galaxy in the first few days and weeks after the initial explosion.

But supernovae are rare, occurring only a few times a century in a galaxy like the Milky Way. Out to distances of around 32 million light-years, a supernova occurs, on average, just once a year.

After a month of observations, starting when visible light telescopes first saw SN 2023ixf, Fermi had not detected gamma rays.

"Unfortunately, seeing no gamma rays doesn't mean there are no cosmic rays," said co-author Matthieu Renaud, an astrophysicist at the Montpellier Universe and Particles Laboratory, part of the National Center for Scientific Research in France. "We have to go through all the underlying hypotheses regarding acceleration mechanisms and environmental conditions in order to convert the absence of gamma rays into an upper limit for cosmic ray production."

The researchers propose a few scenarios that may have affected Fermi's ability to see gamma rays from the event, like the way the explosion distributed debris and the density of material surrounding the star.

Read more at Science Daily

CO2 worsens wildfires by helping plants grow

By fueling the growth of plants that become kindling, carbon dioxide is driving an increase in the severity and frequency of wildfires, according to a UC Riverside study.

The worldwide surge in wildfires over the past decade is often attributed to the hotter, drier conditions of climate change. However, the study found that the effect of increasing levels of carbon dioxide (CO2) on plants may be a bigger factor.

"It's not because it's hotter that things are burning, it's because there's more fuel, in the form of plants," said UCR doctoral student in Earth and planetary sciences and study author James Gomez.

This conclusion, and a description of the eight model experiments that produced it, have been published in Communications Earth & Environment.

To convert light into food in a process called photosynthesis, plants require CO2. Burning fossil fuels for heat, electricity, and transportation is adding increasing levels of CO2 into the atmosphere. Plants use the extra CO2 to make carbohydrates that help them grow, leading to an increase in biomass that burns.

Certainly, heat waves and drought occur more frequently in today's climate than they did 50 years ago. These are conditions that cause plants to wither and die. As they dry out and die, they burn more easily. The models accounted for these effects on plants, as well as for different types of plants, and for the increase in atmospheric CO2.

"Warming and drying are still important fire factors. These are the conditions that make the extra plant mass more flammable," said UCR professor of Earth sciences Robert Allen.

The models analyzed by the research team all assumed an idealized 1% per year increase in atmospheric CO2 concentrations since 1850. The idealized increase is meant to isolate the effects of the greenhouse gas on wildfire activity.
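For context on that idealized forcing: a steady 1% per year rise compounds, so the concentration doubles after

    t_{2\times} = \frac{\ln 2}{\ln 1.01} \approx 70~\text{years}

and quadruples after roughly 140 years -- generic compound-growth arithmetic rather than a figure from the study itself, but it is why such runs are a standard benchmark for isolating the CO2 effect.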

"These experiments are mainly looking at the contribution of CO2 to changes in wildfire activity," Gomez said. "That's the only thing that's changing in these models. Other drivers of climate change and wildfire activity do not change through time," Gomez said. "This includes, for example, changes in other greenhouse gases like methane, as well as changes in land use."

Seasons are still important factors in promoting wildfires, and fires still occur more often during "fire seasons." Dry, windy conditions help spread the flames faster, increasing the size of the burned area. "However, our study shows the increase in fires during hotter seasons is driven by fuel load rather than an increase in the number of what some consider 'fire weather' days," Gomez said.

This means megafires can often happen outside of what is considered fire season. As an example, the biggest wildfire on record in Texas, with more than a million acres burned, occurred this past February.

The researchers hope that their results inspire others to conduct additional studies of the factors driving the increase in wildfires. In addition, they hope that policymakers recognize the urgent need to decrease the amount of CO2 that people release into the atmosphere.

Read more at Science Daily

Microplastics make their way from the gut to other organs

It's happening every day. From our water, our food and even the air we breathe, tiny plastic particles are finding their way into many parts of our body.

But what happens once those particles are inside? What do they do to our digestive system?

In a recent paper published in the journal Environmental Health Perspectives, University of New Mexico researchers found that those tiny particles -- microplastics -- are having a significant impact on our digestive pathways, making their way from the gut and into the tissues of the kidney, liver and brain.

Eliseo Castillo, PhD, an associate professor in the Division of Gastroenterology & Hepatology in the UNM School of Medicine's Department of Internal Medicine and an expert in mucosal immunology, is leading the charge at UNM on microplastic research.

"Over the past few decades, microplastics have been found in the ocean, in animals and plants, in tap water and bottled water," Castillo, says. "They appear to be everywhere."

Scientists estimate that people ingest 5 grams of microplastic particles each week on average -- equivalent to the weight of a credit card.

While other researchers are helping to identify and quantify ingested microplastics, Castillo and his team focus on what the microplastics are doing inside the body, specifically to the gastrointestinal (GI) tract and to the gut immune system.

Over a four-week period, Castillo, postdoctoral fellow Marcus Garcia, PharmD, and other UNM researchers exposed mice to microplastics in their drinking water. The amount was equivalent to the quantity of microplastics humans are believed to ingest each week.

Microplastics had migrated out of the gut into the tissues of the liver, kidney and even the brain, the team found. The study also showed the microplastics changed metabolic pathways in the affected tissues.

"We could detect microplastics in certain tissues after the exposure," Castillo says. "That tells us it can cross the intestinal barrier and infiltrate into other tissues."

Castillo says he's also concerned about the accumulation of the plastic particles in the human body. "These mice were exposed for four weeks," he says. "Now, think about how that equates to humans, if we're exposed from birth to old age."

The healthy laboratory animals used in this study showed changes after brief microplastic exposure, Castillo says. "Now imagine if someone has an underlying condition, and these changes occur, could microplastic exposure exacerbate an underlying condition?"

He has previously found that microplastics are also impacting macrophages -- the immune cells that work to protect the body from foreign particles.

In a paper published in the journal Cell Biology & Toxicology in 2021, Castillo and other UNM researchers found that when macrophages encountered and ingested microplastics, their function was altered and they released inflammatory molecules.

"It is changing the metabolism of the cells, which can alter inflammatory responses," Castillo says. "During intestinal inflammation -- states of chronic illness such as ulcerative colitis and Crohn's disease, which are both forms of inflammatory bowel disease -- these macrophages become more inflammatory and they're more abundant in the gut."

The next phase of Castillo's research, which is being led by postdoctoral fellow Sumira Phatak, PhD, will explore how diet is involved in microplastic uptake.

"Everyone's diet is different," he says. "So, what we're going to do is give these laboratory animals a high-cholesterol/high-fat diet, or high-fiber diet, and they will be either exposed or not exposed to microplastics. The goal is to try to understand if diet affects the uptake of microplastics into our body."

Castillo says one of his PhD students, Aaron Romero, is also working to understand why there is a change in the gut microbiota. "Multiple groups have shown microplastics change the microbiota, but how it changes the microbiota hasn't been addressed."

Castillo hopes that his research will help uncover the potential impacts microplastics are having on human health, and that it will help spur changes to how society produces and filters plastics.

Read more at Science Daily

Can animals count?

Research co-led by neuroscientists Professor Yung Wing-ho from City University of Hong Kong (CityUHK) and Professor Ke Ya from The Chinese University of Hong Kong (CUHK) Faculty of Medicine (CU Medicine) has made a groundbreaking discovery about number sense in animals, confirming the existence of discrete number sense in rats and offering a crucial animal model for investigating the neural basis of numerical ability and disability in humans.

The research team has developed an innovative approach that employs a novel numerical learning task, brain manipulation techniques and artificial intelligence modelling, and that resolves an ongoing argument about whether rats have a sense of numbers. The study sheds light on the mechanisms underlying numerical ability. The findings have been published in the renowned multidisciplinary scientific journal Science Advances.

Number sense closely linked to survival and intelligence

Number sense is a fundamental ability in animals' perception of the world and increases their chances of survival. It is also an important cognitive ability that is fundamental to mathematical aptitude, a hallmark of human intelligence. About 3% to 7% of people suffer from dyscalculia, a learning disability that affects the ability of people of normal intelligence to learn arithmetic and mathematics; a deficit in number sense is one of its major symptoms.

Number sense refers to the capability to compare, estimate and manipulate nonsymbolic numerical quantities, rather than associated magnitudes -- continuous dimensions inherent in a group of items, such as the area of visual objects or the duration of sound pulses. It has been challenging to assess number sense in isolation from the influence of such continuous magnitudes, and there has been a lively ongoing debate about whether the sense of magnitude or the sense of number is more fundamental.

Study confirms that the rat brain has a specific area for dealing with numbers

The research team minimised the influence of continuous magnitudes in numerical tests and conducted meticulous quantitative analyses in the study to determine the respective contributions of numbers and magnitudes. They developed an algorithm to generate stimuli that enable animals to focus only on numbers, minimising other distracting factors. This will help scientists better understand how animals perceive and quantify numbers.

The study found that rats without any previous knowledge of numbers were able to develop a sense of numbers when trained with sounds representing two or three numbers. Despite the influence of continuous magnitudes, the rats consistently focused on the number of sounds when making choices for food rewards.

Professor Yung, Associate Dean of the Jockey Club College of Veterinary Medicine and Life Sciences and Chair Professor of Cognitive Neuroscience at CityUHK, said, "Our study helps dissect the relationship between magnitude and numerosity processing. We discovered that when we blocked a specific part of the rats' brain, called the posterior parietal cortex, their ability to understand numbers was affected but not their sense of magnitude. This suggests that the brain has a specific area for dealing with numbers. In fact, this is the first time scientists have demonstrated that rats have the ability to discriminate and categorise three different numbers in a single test, surpassing a simple quantity comparison."

Professor Ke from the School of Biomedical Sciences at CU Medicine expressed excitement about the findings. "The study not only solves a long-standing mystery about how brains handle numbers, but also offers new insights into studying the specific neural circuits involved in number processing in animals and how genes are associated with mathematical ability," she said. "Furthermore, the findings from neural network modelling could have practical applications in the field of artificial intelligence. In the future, our increased understanding of the brain mechanisms underlying the processing of numbers may contribute to the development of interventions for individuals with numerical difficulties."

Read more at Science Daily

Apr 16, 2024

Physicists solve puzzle about ancient galaxy found by Webb telescope

Last September, the James Webb Space Telescope, or JWST, discovered JWST-ER1g, a massive ancient galaxy that formed when the universe was just a quarter of its current age. Surprisingly, an Einstein ring is associated with this galaxy. That's because JWST-ER1g acts as a lens and bends light from a distant source, which then appears as a ring -- a phenomenon called strong gravitational lensing, predicted in Einstein's theory of general relativity.

The total mass enclosed within the Einstein radius -- the radius of the Einstein ring -- has two components: a stellar component and a dark matter component.

"If we subtract the stellar mass from the total mass, we get the dark matter mass within the Einstein radius," said Hai-Bo Yu, a professor of physics and astronomy at the University of California, Riverside, whose team has published new work about JWST-ER1g in the journal The Astrophysical Journal Letters. "But the value for the dark matter mass seems higher than expected. This is puzzling. In our paper, we offer an explanation."

A dark matter halo is the halo of invisible matter that permeates and surrounds a galaxy like JWST-ER1g. Although dark matter has never been detected in laboratories, physicists are confident dark matter, which makes up 85% of the universe's matter, exists.

"When ordinary matter -- pristine gas and stars -- collapses and condenses into the dark matter halo of JWST-ER1g, it may be compressing the halo, leading to a high density," said Demao Kong, a second-year graduate student at UCR, who led the analysis. "Our numerical studies show that this mechanism can explain the high dark matter density of JWST-ER1g -- more dark matter mass in the same volume, resulting in higher density."

According to Daneng Yang, a postdoctoral researcher at UCR and co-author on the paper, JWST-ER1g, formed 3.4 billion years after the Big Bang, provides "a great chance to learn about dark matter."

"This strong lensing object is unique because it has a perfect Einstein ring, from which we can obtain valuable information about the total mass within the Einstein radius, a critical step for testing dark matter properties," he said.

Launched on Christmas Day in 2021, NASA's JWST is an orbiting infrared observatory. Also called Webb, it is designed to answer questions about the universe. It is the largest, most complex and most powerful space telescope ever built.

"JWST provides an unprecedented opportunity for us to observe ancient galaxies formed when the universe was young," Yu said. "We expect to see more surprises from JWST and learn more about dark matter soon."

Read more at Science Daily

GeoAI technologies for sustainable urban development

From heatwaves to pandemic diseases, the urban environments of the world face numerous challenges. Researchers at the Hong Kong Polytechnic University are harnessing artificial intelligence (AI) and informatics to address emerging concerns related to environmental changes and urban growth.

Innovative geospatial and AI technologies offer ground-breaking solutions and insights into the dynamic changes occurring in our natural and social surroundings. The applications of GeoAI are rapidly expanding across various fields, encompassing transportation, urban and public safety, planning, climate change and natural disasters.

Prof. Qihao WENG, Chair Professor of Geomatics and Artificial Intelligence in the Department of Land Surveying and Geo-Informatics and Global STEM Professor, established the PolyU Research Centre for Artificial Intelligence in Geomatics (RCAIG) to focus on the development of original and innovative AI methodologies and technologies for geomatics and their applications in urban areas, with the goal of making it a global R&D hub in GeoAI. Prof. Weng has recently been honoured with the 2024 American Association of Geographers (AAG) Wilbanks Prize for Transformational Research in Geography and the 2024 AAG Remote Sensing Specialty Group Lifetime Achievement Honor Award for his ground-breaking contributions in geography.

Earth observations

Prof. WENG said, "By leveraging the latest geospatial technology and AI, we stand at the forefront of addressing global environmental and societal challenges. Our research encompasses a wide spectrum of subjects in the fields of earth observations and geoinformatics."

Satellite observations are invaluable tools for the community: satellite imagery, videos and data are crucial for informed decision-making in urban resilience and public health. For instance, satellite observations help us understand the impact of extreme heatwaves on population exposure and aid in the development of urban flood monitoring algorithms. Real-time data acquisition also facilitates applications in traffic conditions, air quality, natural disasters, population movement and urban land use.

Prof. WENG said, "Earth observation is important as a guiding compass for understanding changes in the environment and society. Our research focuses on diverse fields including Geospatial big data and AI, remote sensing, ground-based sensors, navigation and positioning, surveying and geodesy, laser scanning and photogrammetry. These technologies play a crucial role in addressing and resolving key issues."

In particular, GeoAI has revolutionised building monitoring by utilising thousands of learnable parameters. An illustration of this is its ability to automatically learn and identify general patterns of buildings such as colour and shape. This technology is crucially applied to detect disaster-damaged buildings, retrieve building height, identify structural changes, and estimate building energy consumption. As a result, GeoAI has emerged as a mainstream solution for more efficient and insightful building monitoring.

Environmental monitoring

As the world rapidly urbanises, cities become the focal point of diverse aspects of human development, including building and environmental monitoring, conservation efforts, urban safety, and the impacts of climate change.

By leveraging AI techniques such as deep neural networks alongside remote sensing methods, these technologies can detect and track changes in habitats, urbanisation and deforestation patterns. Additionally, monitoring the uptake of carbon by vegetation plays a crucial role in combating climate change and developing effective mitigation strategies.

For urban resilience and public health, these technologies aim to enhance the ability of urban areas to withstand and recover from challenges such as extreme heatwaves, while promoting the well-being and sustainable development of urban populations.

In the field of urbanisation monitoring, the RCAIG research team has developed an impervious surface area (ISA)-based urban cellular automata (CA) model that can simulate the fractional change of urban areas within each grid cell, using annual urban extent time series data obtained from satellite observations. By characterising the historical pathways of urban area growth under different levels of urbanisation, the model offers more detailed insights than traditional binary CA models, demonstrating its potential to support sustainable development.

The research, conducted by Ms Wanru HE, an RCAIG doctoral research assistant, and the team, and titled "Modeling gridded urban fractional change using the temporal context information in the urban cellular automata model," was published in Cities. The model effectively captures the dynamics of urban sprawl with significantly improved computational efficiency and performance, and it enables the modelling of urban growth at regional and even global levels under diverse future urbanisation scenarios.
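
The core idea of a fractional (rather than binary) cellular automaton can be illustrated with a short, hypothetical sketch: each grid cell holds an impervious-surface fraction between 0 and 1, and its annual growth depends on its neighbourhood and its own recent trend. The update rule and parameters below are assumptions for illustration, not the published model.

```python
# Minimal sketch of a fractional urban cellular-automata step, inspired by the
# idea described above but NOT the published model: each grid cell stores an
# impervious-surface fraction in [0, 1], and growth depends on its own
# historical trend plus the urbanisation of its neighbourhood.
import numpy as np

def ca_step(isa: np.ndarray, prev_isa: np.ndarray,
            neighbour_weight: float = 0.5, trend_weight: float = 0.5) -> np.ndarray:
    """One annual update of the fractional urban CA (illustrative rule)."""
    # Mean impervious fraction of the 8-cell Moore neighbourhood.
    padded = np.pad(isa, 1, mode="edge")
    neigh = sum(np.roll(np.roll(padded, dy, 0), dx, 1)[1:-1, 1:-1]
                for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0)) / 8.0
    # Historical growth observed in the satellite time series (temporal context).
    trend = np.clip(isa - prev_isa, 0.0, None)
    # Cells grow faster when neighbours are urbanised and when they grew recently,
    # but saturate as they approach full imperviousness.
    growth = (neighbour_weight * neigh + trend_weight * trend) * (1.0 - isa)
    return np.clip(isa + growth, 0.0, 1.0)

# Toy example: a 5x5 region with one urban seed, run for three "years".
prev = np.zeros((5, 5)); prev[2, 2] = 0.2
curr = prev.copy();      curr[2, 2] = 0.4
for _ in range(3):
    curr, prev = ca_step(curr, prev), curr
print(curr.round(2))
```

Because each cell carries a continuous fraction rather than a built/unbuilt flag, such a model can represent gradual densification within a grid cell, which is the advantage over binary CA models highlighted above.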

GeoAI for traffic management

GeoAI utilises machine learning and deep learning to effectively analyse intricate information, offering applications like real-time traffic management. Through the integration of diverse data modalities, such as text, images, and knowledge graphs, GeoAI enables accurate traffic flow prediction, route optimisation, accident warnings, and the planning of an efficient traffic network. Consequently, this contributes to the advancement of smart traffic management.

To enhance the efficiency of ride-hailing platforms and achieve intelligent management of their services, the RCAIG research team has developed a multi-agent order matching and vehicle repositioning (MAMR) approach. This technology focuses on coordinating the supply and demand of ride-hailing services to improve their overall efficiency.

This approach provides a ground-breaking solution to two critical aspects of efficient ride-hailing services. First, it addresses order matching by efficiently assigning orders to available vehicles. Second, it incorporates proactive vehicle repositioning, strategically deploying idle vehicles to regions with potentially high demand. Based on multi-agent deep reinforcement learning (MARL), the approach tackles complex planning problems in transportation and offers a new perspective on long-term spatiotemporal planning. The research, conducted by Ms Mingyue XU, another RCAIG researcher, and the team, and titled "Multi-agent reinforcement learning to unify order-matching and vehicle-repositioning in ride-hailing services," was published in the International Journal of Geographical Information Science. The study demonstrated strong results, including reduced passenger rejection rates and driver idle time.

With a focus on geospatial artificial intelligence (GeoAI), the RCAIG and the POLEIS at PolyU are dedicated to conducting research in diverse fields, including urban building and energy, urban safety and security, environmental monitoring and conservation, and urban resilience and public health. This aligns with the 11th United Nations Sustainable Development Goal (SDG11), which aims to create inclusive, safe, resilient and sustainable cities and human settlements.
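
To make the two stages concrete, here is a toy sketch of a single dispatch round with order matching followed by proactive repositioning. Simple hand-written heuristics stand in for the learned multi-agent policies, and the grid, vehicle and order details are illustrative assumptions rather than the published MAMR system.

```python
# Toy sketch (not the published MAMR system): each idle vehicle is an "agent"
# that either accepts the nearest open order or repositions toward the cell
# with the highest forecast demand. In the multi-agent RL approach these
# choices would be driven by learned value functions, not these heuristics.
import random
from dataclasses import dataclass

@dataclass
class Order:
    cell: int          # grid cell where the passenger is waiting

@dataclass
class Vehicle:
    cell: int
    busy: bool = False

def step(vehicles, orders, demand_forecast, grid_width=10):
    """One dispatch round: match vehicles to orders, then reposition idle ones."""
    # 1) Order matching: assign each open order to the closest idle vehicle.
    for order in list(orders):
        idle = [v for v in vehicles if not v.busy]
        if not idle:
            break
        nearest = min(idle, key=lambda u: abs(u.cell % grid_width - order.cell % grid_width)
                                          + abs(u.cell // grid_width - order.cell // grid_width))
        nearest.cell, nearest.busy = order.cell, True
        orders.remove(order)
    # 2) Proactive repositioning: remaining idle vehicles drift toward the cell
    #    with the highest forecast demand (a crude stand-in for a learned policy).
    hot_cell = max(range(len(demand_forecast)), key=lambda c: demand_forecast[c])
    for v in vehicles:
        if not v.busy:
            v.cell += (hot_cell > v.cell) - (hot_cell < v.cell)  # one step in flattened-grid index space
    return vehicles, orders

vehicles = [Vehicle(cell=random.randrange(100)) for _ in range(5)]
orders = [Order(cell=random.randrange(100)) for _ in range(3)]
demand_forecast = [random.random() for _ in range(100)]
vehicles, orders = step(vehicles, orders, demand_forecast)
print([v.cell for v in vehicles], len(orders), "orders unmatched")
```

In the MARL formulation described above, each vehicle agent would instead choose actions that maximise a learned long-term value, allowing matching and repositioning to be optimised jointly rather than by separate heuristics.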

Read more at Science Daily

Evolution's recipe book: How 'copy paste' errors cooked up the animal kingdom

A series of whole genome and gene duplication events that go back hundreds of millions of years have laid the foundations for tissue-specific gene expression, according to a new study in the journal Nature Ecology and Evolution. The 'copy paste' errors allowed animals to keep one copy of their genome or genes for fundamental functions, while the second copy could be used as raw material for evolutionary innovation. Events like these, at varying degrees of scale, occurred constantly throughout the bilaterian evolutionary tree and enabled traits and behaviours as diverse as insect flight, octopus camouflage and human cognition.

Around 700 million years ago, a remarkable creature emerged for the first time. Though it may not have been much to look at by today's standards, the animal had a front and a back, a top and a bottom. This was a groundbreaking adaptation at the time, and one which laid down the basic body plan that most complex animals, including humans, would eventually inherit.

The inconspicuous animal resided in the ancient seas of Earth, likely crawling along the seafloor. This was the last common ancestor of bilaterians, a vast supergroup of animals including vertebrates (fish, amphibians, reptiles, birds, and mammals), and invertebrates (insects, arthropods, molluscs, worms, echinoderms and many more).

To this day, more than 7,000 groups of genes can be traced back to the last common ancestor of bilaterians, according to a study of 20 different bilaterian species including humans, sharks, mayflies, centipedes and octopuses. The findings were made by researchers at the Centre for Genomic Regulation (CRG) in Barcelona and are published today in the journal Nature Ecology and Evolution.

Remarkably, the study found that around half of these ancestral genes have since been repurposed by animals for use in specific parts of the body, particularly in the brain and reproductive tissues. The findings are surprising because ancient, conserved genes usually have fundamental, important jobs that are needed in many parts of the body.

When the researchers took a closer look, they found that a series of serendipitous 'copy paste' errors during bilaterian evolution were to blame. For example, there was a significant moment early in the history of vertebrates, when a bunch of tissue-specific genes first appeared, coinciding with two whole genome duplication events. Animals could keep one copy for fundamental functions, while the second copy could be used as raw material for evolutionary innovation. Events like these, at varying degrees of scale, occurred constantly throughout the bilaterian evolutionary tree.

"Our genes are like a vast library of recipes that can be cooked up differently to create or change tissues and organs. Imagine you end up with two copies of a recipe for paella by accident. You can keep and enjoy the original recipe while evolution tweaks the extra copy so that it makes risotto instead. Now imagine the entire recipe book is copied -- twice -- and the possibilities it opens for evolution. The legacy of these events, which took place hundreds of millions of years ago, lives on in most complex animals today," explains Federica Mantica, author of the paper and researcher at the Centre for Genomic Regulation (CRG) in Barcelona.

The authors of the study found many examples of new, tissue-specific functions made possible by the specialisation of these ancestral genes. For example, the TESMIN and tomb genes, which originated from the same ancestor, ended up independently playing a specialised role in the testis both in vertebrates and insects. Their importance is highlighted by the fact that problems with these genes can disrupt sperm production, affecting fertility in both mice and fruit flies.

The specialisation of ancestral genes also laid some foundations for the development of complex nervous systems. For example, in vertebrates, the researchers found genes critical for the formation of myelin sheaths around nerve cells, which are essential for fast nerve signal transmission. In humans they also identified FGF17, which is thought to play an important role in maintaining cognitive functions into old age.

In insects, specific genes became specialised in muscles and in the epidermis for cuticle formation, contributing to their ability to fly. In the skin of octopuses, other genes became specialised to perceive light stimuli, contributing to their ability to change colour, camouflage themselves and communicate with other octopuses.

By studying the evolution of species at the tissue level, the study demonstrates that changes in the way genes are used in different parts of the body have played a big role in creating new and unique features in animals. In other words, when genes start acting in specific tissues, it can lead to the development of new physical traits or abilities, which ultimately contributes to animal evolution.

Read more at Science Daily

Physical activity reduces stress-related brain activity to lower cardiovascular disease risk

New research indicates that physical activity lowers cardiovascular disease risk in part by reducing stress-related signaling in the brain.

In the study, which was led by investigators at Massachusetts General Hospital (MGH), a founding member of the Mass General Brigham healthcare system, and published in the Journal of the American College of Cardiology, people with stress-related conditions such as depression experienced the most cardiovascular benefits from physical activity.

To assess the mechanisms underlying the psychological and cardiovascular disease benefits of physical activity, Ahmed Tawakol, MD, an investigator and cardiologist in the Cardiovascular Imaging Research Center at Massachusetts General Hospital, and his colleagues analyzed medical records and other information of 50,359 participants from the Mass General Brigham Biobank who completed a physical activity survey.

A subset of 774 participants also underwent brain imaging tests and measurements of stress-related brain activity.

Over a median follow-up of 10 years, 12.9% of participants developed cardiovascular disease. Participants who met physical activity recommendations had a 23% lower risk of developing cardiovascular disease compared with those not meeting these recommendations.

Individuals with higher levels of physical activity also tended to have lower stress-related brain activity. Notably, reductions in stress-associated brain activity were driven by gains in function in the prefrontal cortex, a part of the brain involved in executive function (i.e., decision-making, impulse control) that is known to restrain the brain's stress centers. The analyses accounted for other lifestyle variables and risk factors for coronary disease.

Moreover, reductions in stress-related brain signaling partially accounted for physical activity's cardiovascular benefit.

As an extension of this finding, the researchers found in a cohort of 50,359 participants that the cardiovascular benefit of exercise was substantially greater among participants who would be expected to have higher stress-related brain activity, such as those with pre-existing depression.

"Physical activity was roughly twice as effective in lowering cardiovascular disease risk among those with depression. Effects on the brain's stress-related activity may explain this novel observation," says Tawakol, who is the senior author of the study.

Read more at Science Daily

Apr 14, 2024

Beautiful nebula, violent history: Clash of stars solves stellar mystery

When astronomers looked at a stellar pair at the heart of a stunning cloud of gas and dust, they were in for a surprise. Star pairs are typically very similar, like twins, but in HD 148937, one star appears younger and, unlike the other, is magnetic. New data from the European Southern Observatory (ESO) suggest there were originally three stars in the system, until two of them clashed and merged. This violent event created the surrounding cloud and forever altered the system's fate.

"When doing background reading, I was struck by how special this system seemed," says Abigail Frost, an astronomer at ESO in Chile and lead author of the study published today in Science. The system, HD 148937, is located about 3800 light-years away from Earth in the direction of the Norma constellation.

It is made up of two stars much more massive than the Sun and surrounded by a beautiful nebula, a cloud of gas and dust.

"A nebula surrounding two massive stars is a rarity, and it really made us feel like something cool had to have happened in this system. When looking at the data, the coolness only increased."

"After a detailed analysis, we could determine that the more massive star appears much younger than its companion, which doesn't make any sense since they should have formed at the same time!" Frost says.

The age difference -- one star appears to be at least 1.5 million years younger than the other -- suggests something must have rejuvenated the more massive star.

Another piece of the puzzle is the nebula surrounding the stars, known as NGC 6164/6165. It is 7500 years old, hundreds of times younger than both stars. The nebula also shows very high amounts of nitrogen, carbon and oxygen. This is surprising, as these elements are normally expected deep inside a star, not outside; it is as if some violent event had set them free.

To unravel the mystery, the team assembled nine years' worth of data from the PIONIER and GRAVITY instruments, both on ESO's Very Large Telescope Interferometer (VLTI), located in Chile's Atacama Desert. They also used archival data from the FEROS instrument at ESO's La Silla Observatory.

"We think this system had at least three stars originally; two of them had to be close together at one point in the orbit whilst another star was much more distant," explains Hugues Sana, a professor at KU Leuven in Belgium and the principal investigator of the observations.

"The two inner stars merged in a violent manner, creating a magnetic star and throwing out some material, which created the nebula. The more distant star formed a new orbit with the newly merged, now-magnetic star, creating the binary we see today at the centre of the nebula."

"The merger scenario was already in my head back in 2017 when I studied nebula observations obtained with the European Space Agency's Herschel Space Telescope," adds co-author Laurent Mahy, currently a senior researcher at the Royal Observatory of Belgium.

"Finding an age discrepancy between the stars suggests that this scenario is the most plausible one and it was only possible to show it with the new ESO data."

This scenario also explains why one of the stars in the system is magnetic and the other is not -- another peculiar feature of HD 148937 spotted in the VLTI data. At the same time, it helps solve a long-standing mystery in astronomy: how massive stars get their magnetic fields. While magnetic fields are a common feature of low-mass stars like our Sun, more massive stars cannot sustain magnetic fields in the same way. Yet some massive stars are indeed magnetic.

Read more at Science Daily

New approach needed to save Australia's non-perennial rivers

Non-perennial rivers, which stop flowing at some point each year, dominate surface water movement across Australia, yet monitoring the continued health of these vital waterways demands a new type of research attention.

More than 70% of this nation's rivers are non-perennial due to a combination of ancient landscape, dry climates, highly variable rainfall regimes, and human interventions that have altered riverine environments.

An extensive review of current research incorporating geomorphology, hydrology, biogeochemistry, ecology and Indigenous knowledges identifies prevailing factors that shape water and energy flows in Australia's non-perennial rivers -- but the review also points to research deficiencies that must be addressed if these river systems are to be preserved and protected.

"Australia relies on our rivers, and has a strong history of research to understand river flows and ecosystems and the human impacts on them. Now, we must address emerging threats to river systems due to climate change and other anthropogenic impacts," says lead author of the review, Dr Margaret Shanafield, from Flinders University's College of Science and Engineering.

"We have to work together to tackle emerging threats to our rivers. If we are going to plug gaps in existing knowledge, which this review identifies, then a new style of inter-disciplinary scientific research is necessary to achieve the required outcomes."

While dominant research themes in Australia focus on drought, floods, salinity, dryland ecology and water management, four other areas of research attention are urgently needed, namely:

  •     Integrating Indigenous and western scientific knowledge;
  •     Quantifying climate change impacts on hydrological and biological function;
  •     Clarifying the meaning and measurement of "restoration" of non-perennial systems;
  •     Understanding the role of groundwater.


Addressing these areas through multi-disciplinary efforts supported by technological advances will provide a map for improved water research outcomes that the rest of the world can follow.

"Australia is globally unique in its spread and diversity of non-perennial rivers spanning climates and landforms -- but most, if not all, of the classes of non-perennial rivers found in Australia also occur in other regions of the world with similar climates and geology," says Dr Shanafield.

"Therefore, the evolving body of knowledge about Australian rivers provides a foundation for comparison with other dryland areas globally where recognition of the importance of non-perennial rivers is expanding."

The review authors are concerned that Australian non-perennial river research has been driven by the needs of its inhabitants for survival, agriculture, resource economics, environmental concern and politics.

"Considering the continent's ancient geological history and its harsh, arid climate, it comes as no surprise that significant attention has been directed toward water resource management during drought periods, the reduction of salinisation, and gaining insights into the intricate dynamics of the transient rivers that are a defining feature of central Australia," says the review.

"The prevalence of prolonged drought periods has had a marked impact on driving research -- so it is critical to address the knowledge gaps this review has identified, given that increasing trends in hydrological droughts are projected to negatively impact streamflow not just in Australia, but also in South America, southern Africa, and the Mediterranean."

The review authors -- a multi-disciplinary collective of scientists from across more than two dozen institutions and government departments -- say more investment in long-term hydrological monitoring is desperately needed to increase water management knowledge that can address the competing water needs of communities, agriculture, mining and ecosystems in a dry environment -- not only in Australia, but throughout the world.

"We anticipate that changing global water fluxes and continued groundwater pumping will cause more of the world's rivers to become non-perennial, accelerating our need to understand these systems across many disciplines," says Dr Shanafield.

Read more at Science Daily

Star Trek's Holodeck recreated using ChatGPT and video game assets

In Star Trek: The Next Generation, Captain Picard and the crew of the U.S.S. Enterprise leverage the holodeck, an empty room capable of generating 3D environments, to prepare for missions and to entertain themselves, simulating everything from lush jungles to the London of Sherlock Holmes. Deeply immersive and fully interactive, holodeck-created environments are infinitely customizable, using nothing but language: the crew has only to ask the computer to generate an environment, and that space appears in the holodeck.

Today, virtual interactive environments are also used to train robots prior to real-world deployment in a process called "Sim2Real." However, virtual interactive environments have been in surprisingly short supply. "Artists manually create these environments," says Yue Yang, a doctoral student in the labs of Mark Yatskar and Chris Callison-Burch, Assistant and Associate Professors in Computer and Information Science (CIS), respectively. "Those artists could spend a week building a single environment," Yang adds, noting all the decisions involved, from the layout of the space to the placement of objects to the colors employed in rendering.

That paucity of virtual environments is a problem if you want to train robots to navigate the real world with all its complexities. Neural networks, the systems powering today's AI revolution, require massive amounts of data, which in this case means simulations of the physical world. "Generative AI systems like ChatGPT are trained on trillions of words, and image generators like Midjourney and DALLE are trained on billions of images," says Callison-Burch. "We only have a fraction of that amount of 3D environments for training so-called 'embodied AI.' If we want to use generative AI techniques to develop robots that can safely navigate in real-world environments, then we will need to create millions or billions of simulated environments."

Enter Holodeck, a system for generating interactive 3D environments co-created by Callison-Burch, Yatskar, Yang and Lingjie Liu, Aravind K. Joshi Assistant Professor in CIS, along with collaborators at Stanford, the University of Washington, and the Allen Institute for Artificial Intelligence (AI2). Named for its Star Trek forebear, Holodeck generates a virtually limitless range of indoor environments, using AI to interpret users' requests. "We can use language to control it," says Yang. "You can easily describe whatever environments you want and train the embodied AI agents."

Holodeck leverages the knowledge embedded in large language models (LLMs), the systems underlying ChatGPT and other chatbots. "Language is a very concise representation of the entire world," says Yang. Indeed, LLMs turn out to have a surprisingly high degree of knowledge about the design of spaces, thanks to the vast amounts of text they ingest during training. In essence, Holodeck works by engaging an LLM in conversation, using a carefully structured series of hidden queries to break down user requests into specific parameters.

Just like Captain Picard might ask Star Trek's Holodeck to simulate a speakeasy, researchers can ask Penn's Holodeck to create "a 1b1b apartment of a researcher who has a cat." The system executes this query by dividing it into multiple steps: first, the floor and walls are created, then the doorway and windows. Next, Holodeck searches Objaverse, a vast library of premade digital objects, for the sort of furnishings you might expect in such a space: a coffee table, a cat tower, and so on. Finally, Holodeck queries a layout module, which the researchers designed to constrain the placement of objects, so that you don't wind up with a toilet extending horizontally from the wall.
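
A rough sketch of how such a staged, LLM-driven pipeline might be organised is shown below. It is not the released Holodeck code: ask_llm and search_objaverse are hypothetical placeholders for a chat-model call and an asset-library lookup, and the prompts are illustrative.

```python
# Minimal sketch of a staged scene-generation pipeline like the one described
# above, NOT the released Holodeck code. `ask_llm` stands in for a chat-model
# call and `search_objaverse` for an asset lookup; both are placeholders.
import json

def ask_llm(prompt: str) -> str:
    """Placeholder for a structured query to a large language model."""
    raise NotImplementedError("wire up your preferred LLM client here")

def search_objaverse(query: str) -> list[str]:
    """Placeholder for retrieving candidate 3D assets by text description."""
    raise NotImplementedError("wire up an asset-library search here")

def generate_scene(request: str) -> dict:
    """Break a natural-language request into the stages described in the article."""
    # 1) Floor plan: rooms, walls and their dimensions.
    floorplan = json.loads(ask_llm(
        f"Return JSON with rooms, wall positions and sizes for: {request}"))
    # 2) Doorways and windows connecting and lighting the rooms.
    openings = json.loads(ask_llm(
        f"Given this floor plan {floorplan}, return doors and windows as JSON."))
    # 3) Furnishings: ask the LLM what belongs in the space, then fetch assets.
    object_list = json.loads(ask_llm(
        f"List, as a JSON array, the objects you would expect in: {request}"))
    assets = {name: search_objaverse(name) for name in object_list}
    # 4) Layout: place each asset subject to constraints (on the floor,
    #    no overlaps, against walls where appropriate).
    layout = json.loads(ask_llm(
        f"Place these assets {list(assets)} in {floorplan} as JSON poses, "
        "keeping objects on the floor and free of collisions."))
    return {"floorplan": floorplan, "openings": openings,
            "assets": assets, "layout": layout}

# Example request, echoing the one in the article:
# scene = generate_scene("a 1b1b apartment of a researcher who has a cat")
```

Keeping each stage as a separate structured query mirrors the article's description of hidden queries that break a user's request into specific parameters before objects are retrieved and placed.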

To evaluate the realism and accuracy of the generated scenes, the researchers produced 120 scenes using both Holodeck and ProcTHOR, an earlier tool created by AI2, and asked several hundred Penn Engineering students to indicate their preferred version without knowing which scenes were created by which tool. For every criterion -- asset selection, layout coherence and overall preference -- the students consistently rated the environments generated by Holodeck more favorably.

The researchers also tested Holodeck's ability to generate scenes that are less typical in robotics research and more difficult to manually create than apartment interiors, like stores, public spaces and offices. Comparing Holodeck's outputs to those of ProcTHOR, which were generated using human-created rules rather than AI-generated text, the researchers found once again that human evaluators preferred the scenes created by Holodeck. That preference held across a wide range of indoor environments, from science labs to art studios, locker rooms to wine cellars.

Finally, the researchers used scenes generated by Holodeck to "fine-tune" an embodied AI agent. "The ultimate test of Holodeck," says Yatskar, "is using it to help robots interact with their environment more safely by preparing them to inhabit places they've never been before."

Across multiple types of virtual spaces, including offices, daycares, gyms and arcades, Holodeck had a pronounced and positive effect on the agent's ability to navigate new spaces.

For instance, whereas the agent successfully found a piano in a music room only about 6% of the time when pre-trained using ProcTHOR (which involved the agent taking about 400 million virtual steps), the agent succeeded over 30% of the time when fine-tuned using 100 music rooms generated by Holodeck.

Read more at Science Daily