Feb 23, 2024

Brightest and fastest-growing: Astronomers identify record-breaking quasar

Using the European Southern Observatory's (ESO) Very Large Telescope (VLT), astronomers have characterised a bright quasar, finding it to be not only the brightest of its kind, but also the most luminous object ever observed. Quasars are the bright cores of distant galaxies and they are powered by supermassive black holes. The black hole in this record-breaking quasar is growing in mass by the equivalent of one Sun per day, making it the fastest-growing black hole to date.

The black holes powering quasars collect matter from their surroundings in a process so energetic that it emits vast amounts of light.

So much so that quasars are some of the brightest objects in our sky, meaning even distant ones are visible from Earth.

As a general rule, the most luminous quasars indicate the fastest-growing supermassive black holes.

"We have discovered the fastest-growing black hole known to date. It has a mass of 17 billion Suns, and eats just over a Sun per day. This makes it the most luminous object in the known Universe," says Christian Wolf, an astronomer at the Australian National University (ANU) and lead author of the study published today in Nature Astronomy. The quasar, called J0529-4351, is so far away from Earth that its light took over 12 billion years to reach us.

The matter being pulled in toward this black hole, in the form of a disc, emits so much energy that J0529-4351 is over 500 trillion times more luminous than the Sun.

"All this light comes from a hot accretion disc that measures seven light-years in diameter -- this must be the largest accretion disc in the Universe," says ANU PhD student and co-author Samuel Lai.

Seven light-years is about 15,000 times the distance from the Sun to the orbit of Neptune.
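
As a rough, back-of-the-envelope check of that comparison (standard astronomical constants, not figures from the study), the ratio can be reproduced in a few lines:

```python
# Back-of-the-envelope check of the "about 15,000 times" comparison above.
# Constants are standard approximate values, not numbers from the study.
KM_PER_LIGHT_YEAR = 9.461e12
KM_PER_AU = 1.496e8
NEPTUNE_ORBIT_AU = 30.1          # mean Sun-Neptune distance in astronomical units

disc_km = 7 * KM_PER_LIGHT_YEAR            # seven light-years in kilometres
neptune_km = NEPTUNE_ORBIT_AU * KM_PER_AU  # Sun-Neptune distance in kilometres

print(disc_km / neptune_km)  # ~14,700, i.e. roughly 15,000
```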

And, remarkably, this record-breaking quasar was hiding in plain sight.

"It is a surprise that it has remained unknown until today, when we already know about a million less impressive quasars. It has literally been staring us in the face until now," says co-author Christopher Onken, an astronomer at ANU.

He added that this object showed up in images from the ESO Schmidt Southern Sky Survey dating back to 1980, but it was not recognised as a quasar until decades later.

Finding quasars requires precise observational data from large areas of the sky.

The resulting datasets are so large that researchers often use machine-learning models to analyse them and tell quasars apart from other celestial objects.

However, these models are trained on existing data, which limits the potential candidates to objects similar to those already known.

If a new quasar is more luminous than any other previously observed, the programme might reject it and classify it instead as a star not too distant from Earth.

An automated analysis of data from the European Space Agency's Gaia satellite passed over J0529-4351 for being too bright to be a quasar, suggesting it to be a star instead.
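
The selection effect described above can be illustrated with a deliberately simplified, hypothetical sketch: a classifier trained only on previously catalogued objects learns a brightness range for "quasar" and pushes anything far brighter into the "star" class. The feature, thresholds, and numbers below are invented for illustration and have nothing to do with the actual Gaia pipeline.

```python
# Hypothetical illustration of how a classifier trained only on known objects
# can misclassify an unusually bright quasar as a nearby star.
# All numbers and features here are invented; this is not the Gaia pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Training set: apparent brightness (arbitrary units).
# Known quasars look faint to moderate; nearby stars tend to look brighter.
quasar_brightness = rng.uniform(1.0, 5.0, size=500)
star_brightness = rng.uniform(4.0, 12.0, size=500)

X = np.concatenate([quasar_brightness, star_brightness]).reshape(-1, 1)
y = np.array(["quasar"] * 500 + ["star"] * 500)

model = LogisticRegression().fit(X, y)

# A record-breaking quasar far brighter than anything in the training set
# falls outside the learned "quasar" range and is labelled a star.
print(model.predict([[9.0]]))  # -> ['star']
```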

The researchers identified it as a distant quasar last year using observations from the ANU 2.3-metre telescope at the Siding Spring Observatory in Australia.

Discovering that it was the most luminous quasar ever observed, however, required a larger telescope and measurements from a more precise instrument.

The X-shooter spectrograph on ESO's VLT in the Chilean Atacama Desert provided the crucial data.

The fastest-growing black hole ever observed will also be a perfect target for the GRAVITY+ upgrade on ESO's VLT Interferometer (VLTI), which is designed to accurately measure the mass of black holes, including those far away from Earth.

Additionally, ESO's Extremely Large Telescope (ELT), a 39-metre telescope under construction in the Chilean Atacama Desert, will make identifying and characterising such elusive objects even more feasible.

Read more at Science Daily

Cooler, wetter parts of Pacific Northwest likely to see more fires, new simulations predict

Forests in the coolest, wettest parts of the western Pacific Northwest are likely to see the biggest increases in burn probability, fire size and number of blazes as the climate continues to get warmer and drier, according to new modeling led by an Oregon State University scientist.

Understanding how fire regimes may change under future climate scenarios is critical for developing adaptation strategies, said the study's lead author, Alex Dye.

Findings were published today in JGR Biogeosciences.

Dye, a faculty research associate in the OSU College of Forestry, and collaborators with the U.S. Forest Service conducted novel, comprehensive wildfire simulations for more than 23 million acres of forest land west of the Cascade Range crest in Oregon and Washington.

The simulations showed that by the 30-year period beginning in 2035, Washington's North Cascades region, the Olympic Mountains, the Puget Lowlands and the western Oregon Cascades could see at least twice as much fire activity as was observed during the prior 30 years, Dye said.

To a lesser degree, that trend holds for the western Washington Cascades and the Oregon Coast Range, he added.

Forests in all of the affected areas are linchpins of multiple socio-ecological systems in the Northwest, Dye said, meaning more fire will likely put pressure on everything from drinking water sources and timber resources to biodiversity and carbon stocks.

"The moist, highly productive forests of the Northwest don't get fire as often as other parts of the West, like California or eastern Oregon," Dye said.

"But fire does naturally occur in the PNW 'Westside' as we call it -- the fire regimes are actually quite complex in this region. It can be challenging to assess fire probability in an environment where there isn't a lot of empirical information about the fire history to build models."

The comparative infrequency of fire makes it easy for the general public to think of the Westside as a low-risk area, and it also means the region is generally not a focal point of studies such as the one he just completed, Dye said.

But recent big blazes such as those that occurred in the Northwest around Labor Day 2020 showed what can happen when severe fire strikes Westside areas.

"And what if fires like that were to start happening more frequently in the near future?" Dye said.

"What if that once every 200 years became once every 50 years, or once every 25 years as climate change brings hotter and drier conditions to the region?"

Climate is just one factor influencing wildfire, he noted, but it is an important one.

He sees the findings as a crucial planning tool to help the Northwest prepare for a rapid acceleration of fire over the next few decades.

"Describing the possibilities of how, when and where climate change could affect fire regimes helps bracket everyone's expectations," he said.

"Particularly important among our findings are new insights into the possibility of shifts towards more frequent and large fires, especially those greater than 40,000 hectares as well as shifts toward more fires burning at the beginning of fall when extreme weather has the potential to increase fire spread."

Forty thousand hectares is just under 99,000 acres.
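
For readers who want to verify the conversion (standard unit arithmetic, not a number from the study):

```python
# 1 hectare is about 2.47105 acres (standard conversion factor).
print(f"{40_000 * 2.47105:,.0f} acres")  # ~98,842 acres, i.e. just under 99,000
```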

Read more at Science Daily

Baleen whales evolved a unique larynx to communicate but cannot escape human noise

Baleen whales are the largest animals to have ever roamed our planet and as top predators play a vital role in marine ecosystems. To communicate across vast distances and find each other, baleen whales depend critically on producing sounds that travel far in murky and dark oceans.

However, since whale songs were first discovered more than 50 years ago, it remained unknown how baleen whales produce their complex vocalizations -- until now.

A new study in the journal Nature reports that baleen whales evolved unique structures in their larynx that enable their low-frequency vocalizations, but also limit their communication range.

The study was led by voice scientists Professor Coen Elemans, at the Department of Biology, University of Southern Denmark and Professor Tecumseh Fitch at the Department of Behavioral and Cognitive Biology, University of Vienna in Austria.

"The toothed and baleen whales evolved from land mammals that had a larynx serving two functions: protecting the airways and sound production. However, their transition to aquatic life placed new and strict demands on the larynx to prevent choking underwater," says Tecumseh Fitch.

The study shows that baleen whales can nevertheless still produce sound with their larynx, but to do so they have evolved novel structures that exist only in baleen whales. First, the tiny cartilages in the human larynx -- called the arytenoids -- that change the position of our vocal folds have changed dramatically in whales.

"The arytenoids changed into large, long cylinders fused at the base to form a large U-shaped rigid structure that extends nearly the full length of the larynx," Elemans says.

"This is probably to keep a rigid open airway when they have to move huge amounts of air in and out during explosive surface breathing," states Fitch.

"We found that this U-shaped structure pushes against a big fatty cushion on the inside of the larynx. When the whales push air from their lungs past this cushion, it starts to vibrate and this generates very low frequency underwater sounds," says Elemans.

Trying to work on the biology and particularly physiology of whales is very challenging.

"Even though humans hunted whales close to the brink of extinction, they made very little effort in trying to learn about their physiology," says Magnus Wahlberg, whale expert at University of Southern Denmark and co-author on the study.

"Strandings are unique and rare opportunities to learn about these amazing animals, but even then, it is very hard to study physiology, because the tissue decays so fast. Whales are known to explode on the beach," adds Wahlberg.

Thanks to Danish and Scottish Marine Mammal Stranding Networks, the researchers could quickly extract the larynx of a sei, minke and humpback whale for close investigation in the lab.

"Our experiments showed for the first time how the whales make their very low frequency vocalizations," says Elemans.

To understand how muscle activity could change the calls, the researchers built a computational model of the entire whale larynx.

"Our model includes accurate 3D shapes of the larynx and its muscles, which made it possible to simulate, for example, how the frequency is controlled through muscle modulation," say Qian Xue and Xudong Zheng, professors at the Mechanical Engineering Department at Rochester Institute of Technology, USA, co-authors on the study.

"Our model accurately predicted the results of our experiments, but we could also calculate acoustic features we could not measure in the lab, such as the frequency range," says Weili Jiang, postdoc at Rochester Institute of Technology, USA, co-author on the study.

The models predicted the natural vocalizations of the whales very well.

However, these newly discovered anatomical features, which allowed whales to communicate successfully across the vast oceans, also pose insurmountable physiological limits for many baleen whales.

Combining experiments and models, the researchers provide the first evidence that baleen whales are physiologically incapable of escaping anthropogenic noise, because it masks their voices, and thus limits their communication range.

"Regrettably, the frequency range and maximum communication depth of 100 meters we predict, overlaps completely with the dominant frequency range and depth of human-made noise caused by shipping traffic," Elemans says.

"The first acoustic recordings of humpback whale song by Roger and Katy Payne in 1970 resonated with humanity profoundly, started the flourishing field of marine bioacoustics, and sparked global interest in marine conservation efforts." says Coen Elemans.

"These recordings were so politically important then that they are aboard the Voyager space missions," he continues.

The Paynes made people aware how quiet the seas were before humans started the widespread use of propeller ships and continuously running shipboard generators. Those were the seas whales evolved in.

Read more at Science Daily

Giant new snake species identified in the Amazon

A team of scientists on location with a film crew in the remote Amazon has uncovered a previously undocumented species of giant anaconda.

Professor Bryan Fry from The University of Queensland led a team which captured and studied several specimens of the newly named northern green anaconda (Eunectes akayima), located in the Bameno region of Baihuaeri Waorani Territory in the Ecuadorian Amazon.

"Our team received a rare invitation from the Waorani people to explore the region and collect samples from a population of anacondas, rumoured to be the largest in existence," Professor Fry said.

"The indigenous hunters took us into the jungle on a 10-day expedition to search for these snakes, which they consider sacred.

"We paddled canoes down the river system and were lucky enough to find several anacondas lurking in the shallows, lying in wait for prey.

"The size of these magnificent creatures was incredible -- one female anaconda we encountered measured an astounding 6.3 metres long.

"There are anecdotal reports from the Waorani people of other anacondas in the area measuring more than 7.5 metres long and weighing around 500 kilograms."

Professor Fry said the northern green anaconda species diverged from the southern green anaconda almost 10 million years ago, and they differ genetically by 5.5 per cent.

"It's quite significant -- to put it in perspective, humans differ from chimpanzees by only about 2 per cent," he said.

"This discovery is the highlight of my career."

The new anaconda species was found while filming with National Geographic for their upcoming Disney+ series Pole to Pole with Will Smith, on which Professor Fry, a National Geographic Explorer, was the expedition's scientific leader.

"Our journey into the heart of the Amazon, facilitated by the invitation of Waorani Chief Penti Baihua, was a true cross-cultural endeavour," he said.

"The importance of our Waorani collaborators is recognised with them being co-authors on the paper."

The scientists also set out to compare the genetics of the green anaconda with specimens collected elsewhere by world-leading anaconda expert Dr Jesus Rivas from New Mexico Highlands University, and use them as an indicator species for ecosystem health.

Professor Fry said the Amazon continues to face alarming ecological threats.

"Deforestation of the Amazon basin from agricultural expansion has resulted in an estimated 20-31 per cent habitat loss, which may impact up to 40 per cent of its forests by 2050," he said.

"Another increasing problem is habitat degradation from land fragmentation, led by industrialised agriculture and heavy metal pollution associated with spills from oil extraction activities.

"Forest fires, drought and climate change are also notable threats.

"These rare anacondas, and the other species that share this remote ecosystem, face significant challenges."

Professor Fry said his next research project would focus on heavy metal pollution in the Amazon.

"It's not only these gigantic snakes that are facing environmental threats, but almost all living things in the region," he said.

"The discovery of a new species of anaconda is exciting, but it is critical to highlight the urgent need to further research these threatened species and ecosystems.

Read more at Science Daily

Feb 22, 2024

Black hole at center of the Milky Way resembles a football

The supermassive black hole in the center of the Milky Way is spinning so quickly it is warping the spacetime surrounding it into a shape that can look like a football, according to a new study using data from NASA's Chandra X-ray Observatory and the U.S. National Science Foundation's Karl G. Jansky Very Large Array (VLA). That football shape suggests the black hole is spinning at a substantial speed, which researchers estimated to be about 60% of its potential limit.

The work, led by Penn State Berks Professor of Physics Ruth Daly, was published in the Monthly Notices of the Royal Astronomical Society.

Astronomers call this giant black hole Sagittarius A* (Sgr A*). It is located about 26,000 light-years away from Earth in the center of the galaxy. To determine how quickly Sgr A* is spinning -- one of its fundamental properties, along with mass -- the researchers applied a method that uses X-ray and radio data to assess how material is flowing towards and away from the black hole. The method was developed and published by Daly in 2019 in The Astrophysical Journal.

"Our work may help settle the question of how fast our galaxy's supermassive black hole is spinning," Daly said. "Our results indicate that Sgr A* is spinning very rapidly, which is interesting and has far-reaching implications."

The team found the angular velocity -- the number of revolutions per second -- of Sgr A*'s spin is about 60% of the maximum possible value, a limit set because material cannot travel faster than the speed of light.

Past estimations of Sgr A*'s speed have been made with different techniques and by other astronomers, with results ranging from no rotation at all to spinning at almost the maximum rate.

"This work, however, shows that this could change if the amount of material in the vicinity of Sgr A* increases," Daly said.

As a black hole rotates, it drags "spacetime" -- the combination of time and the three dimensions of space -- and nearby matter along with it. The gravitational pull also squashes the spacetime, altering its shape depending on how it's observed. Spacetime appears circular if the black hole is viewed from the top. From the side, however, the spacetime is shaped like a football. The faster the spin, the flatter the football.

The spin can also serve as an energy source, Daly said, if matter -- such as gas or the remnants of a star that wanders too close -- exists in the vicinity of the black hole. As the black hole spins, matter can escape in the form of narrow jets called collimated outflows. However, Sgr A* currently has limited nearby matter, so the black hole has been relatively quiet, with weakly collimated outflows, in recent millennia.

"A spinning black hole is like a rocket on the launch pad," said Biny Sebastian, a co-author from the University of Manitoba in Winnipeg, Canada. "Once material gets close enough, it's like someone has fueled the rocket and hit the 'launch' button."

This means that in the future, if the properties of the matter and the magnetic field strength close to the black hole change, part of the enormous energy of the black hole's spin could drive more powerful outflows. This source material could come from gas or from the remnants of a star torn apart by the black hole's gravity if that star wanders too close to Sgr A*.

"Jets powered and collimated by a galaxy's spinning central black hole can profoundly affect the gas supply for an entire galaxy, which affects how quickly and even whether stars can form," said co-author Megan Donahue from Michigan State University. "The 'Fermi bubbles' seen in X-rays and gamma rays around our Milky Way's black hole show the black hole was probably active in the past. Measuring the spin of our black hole is an important test of this scenario."

Fermi bubbles are structures above and below the black hole that emit gamma rays and that researchers have theorized resulted from prior massive outflows.

The researchers used the outflow method to determine the spin of Sgr A*. Daly's approach incorporates consideration of the relationship between the spin of the black hole and its mass, the properties of the matter near the black hole and the outflow properties. The collimated outflow produces the radio waves, while the disk of gas surrounding the black hole emits X-rays. The researchers combined observational data from Chandra and the VLA with an independent estimate of the black hole's mass from other telescopes to inform the outflow method and determine the black hole's spin.

"We have a special view of Sgr A* because it is the nearest supermassive black hole to us," said co-author Anan Lu from McGill University in Montreal, Canada. "Although it's quiet right now, our work shows that in the future it will give an incredibly powerful kick to surrounding matter. That might happen in a thousand or a million years, or it could happen in our lifetimes."

Read more at Science Daily

Decline in microbial genetic richness in the western Arctic Ocean

The Arctic region is experiencing climate change at a much faster rate than the rest of the world. Melting ice sheets, runoff from thawing permafrost and other factors are rapidly changing the composition of the Arctic Ocean's water. And that change is being experienced all the way down to the microbial level.

In a Concordia-led study published in the journal ISME Communications, researchers analyzed archival samples of bacteria and archaea populations taken from the Beaufort Sea, bordering northwest Canada and Alaska.

The samples were collected between 2004 and 2012, a period that included two years -- 2007 and 2012 -- in which the sea ice coverage was historically low.

The researchers looked at samples taken from three levels of water: the summer mixed layer, the upper Arctic water below it and the Pacific-origin water at the deepest level.

The study examined the microbes' genetic composition using bioinformatics and statistical analysis across the nine-year time span.

Using this data, the researchers were able to see how changing environmental conditions were influencing the organisms' structure and function.

The researchers found subtle but statistically significant changes in the communities they studied.

"We observed a general overall loss in diversity of species across all the different water masses," says David Walsh, a professor in the Department of Biology and the paper's corresponding author.

"We also saw changes in the composition of the microbial community, meaning there were different species after the 2007 sea ice minimum than before."

However, the timing of the declines in population richness differed between the ocean's layers of water.

A sudden decline in the fresher summer mixed layer, between 3 and 9 metres deep, was observed in 2005-2007.

The upper Arctic water, between 16 and 78 metres, saw declines in 2010-2012, while the deeper Pacific-origin water layer, between 49 and 154 metres, experienced a two-step decline -- once in 2005-2007 and again in 2010-2012.

Small beginnings

The researchers are taking care not to overemphasize the results of their findings, saying the changes, while significant, remain slight.

But with the summer Arctic ice cover shrinking steadily year over year, the data does hint at possible trends that may be visible in upcoming population studies in more recent years.

"With the warming and freshening of the Arctic Ocean comes a decrease of nutrients that are important for photosynthesis, which produces the organic matter that serves as energy and carbon sources for the marine food web," Walsh explains.

"This shift risks strengthening what is known as the microbial loop, in which the energy and carbon that would normally go into higher trophic levels -- meaning zooplankton and then fish -- is rapidly recycled by microorganisms. This ecosystem is already dominated by microbial processes, which will only get stronger as this system continues."

"This study provides us with a baseline idea of what is happening in the Arctic," says co-author Arthi Ramachandran, PhD 23. "The Arctic is warming four times faster than the rest of the world, which makes it a fascinating ecosystem to study. The oceans are all interconnected, and the physical barriers of these oceans are becoming much less defined."

Looking into the ocean's warmer, fresher future

The researchers are now planning a metagenomic study that extends the time series to cover periods of even more intense sea ice minima.

They hope to fully sequence the organisms' genomes to further understand the microbial communities' diversity and function in the environment.

Read more at Science Daily

Science fiction meets reality: New technique to overcome obstructed views

After a recent car crash, John Murray-Bruce wished he could have seen the other car coming. The crash reaffirmed the University of South Florida assistant professor of computer science and engineering's mission to create a technology that could do just that: See around obstacles and ultimately expand one's line of vision.

Using a single photograph, Murray-Bruce and his doctoral student, Robinson Czajkowski, created an algorithm that computes highly accurate, full-color three-dimensional reconstructions of areas behind obstacles -- a concept that can not only help prevent car crashes, but help law enforcement experts in hostage situations, search-and-rescue and strategic military efforts.

"We're turning ordinary surfaces into mirrors to reveal regions, objects and rooms that are outside our line of vision," Murray-Bruce said.

"We live in a 3D world, so obtaining a more complete 3D picture of a scenario can be critical in a number of situations and applications."

As published in Nature Communications, Czajkowski and Murray-Bruce's research is the first of its kind to successfully reconstruct a hidden scene in 3D using an ordinary digital camera.

The algorithm works by using information from the photo of faint shadows cast on nearby surfaces to create a high-quality reconstruction of the scene.
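
The published method is considerably more sophisticated, but the core idea of recovering a hidden scene from faint shadows can be sketched as a regularized linear inverse problem: the photographed shadow pattern is modelled as a known light-transport matrix applied to the unknown scene, which is then recovered by least squares. The matrix, dimensions, and noise level below are invented placeholders, not the authors' model.

```python
# Minimal sketch of hidden-scene recovery posed as a regularized linear
# inverse problem: observed = transport_matrix @ hidden_scene + noise.
# The transport matrix here is random; in the real method it would encode
# how light from each hidden-scene patch contributes to each camera pixel.
import numpy as np

rng = np.random.default_rng(1)

n_pixels, n_patches = 400, 100          # camera pixels, hidden-scene patches
A = rng.random((n_pixels, n_patches))   # placeholder light-transport matrix
x_true = rng.random(n_patches)          # unknown hidden scene (ground truth)

y = A @ x_true + 0.01 * rng.standard_normal(n_pixels)  # faint-shadow photo

# Tikhonov-regularized least squares: minimize ||A x - y||^2 + lam * ||x||^2
lam = 1e-2
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(n_patches), A.T @ y)

print(np.corrcoef(x_true, x_hat)[0, 1])  # close to 1 for this easy example
```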

While the technique is too technical for the average person to use directly, it could have broad applications.

"These shadows are all around us," Czajkowski said. "The fact we can't see them with our naked eye doesn't mean they're not there."

The idea of seeing around obstacles has been a topic of science-fiction movies and books for decades.

Murray-Bruce says this research takes significant strides in bringing that concept to life.

Prior to this work, researchers had only used ordinary cameras to create rough 2D reconstructions of small spaces.

The most successful demonstrations of 3D imaging of hidden scenes all required specialized, expensive equipment.

"Our work achieves a similar result using far less," Czajkowski said.

"You don't need to spend a million dollars on equipment for this anymore."

Czajkowski and Murray-Bruce expect it will be 10 to 20 years before the technology is robust enough to be adopted by law enforcement and car manufacturers.

Right now, they plan to continue their research to further improve the technology's speed and accuracy, expanding its future applications to include self-driving cars, where it could improve safety and situational awareness.

"In just over a decade since the idea of seeing around corners emerged, there has been remarkable progress and there is accelerating interest and research activity in the area," Murray-Bruce said.

"This increased activity, along with access to better, more sensitive cameras and faster computing power form the basis for my optimism on how soon this technology will become practical for a wide range of scenarios."

Read more at Science Daily

Did Neanderthals use glue? Researchers find evidence that sticks

Neanderthals created stone tools held together by a multi-component adhesive, a team of scientists has discovered. Its findings, which are the earliest evidence of a complex adhesive in Europe, suggest these predecessors to modern humans had a higher level of cognition and cultural development than previously thought.

The work, reported in the journal Science Advances, included researchers from New York University, the University of Tübingen, and the National Museums in Berlin.

"These astonishingly well-preserved tools showcase a technical solution broadly similar to examples of tools made by early modern humans in Africa, but the exact recipe reflects a Neanderthal 'spin,' which is the production of grips for handheld tools," says Radu Iovita, an associate professor at New York University's Center for the Study of Human Origins.

The research team, led by Patrick Schmidt from the University of Tübingen's Early Prehistory and Quaternary Ecology section and Ewa Dutkiewicz from the Museum of Prehistory and Early History at the National Museums in Berlin, re-examined previous finds from Le Moustier, an archaeological site in France that was discovered in the early 20th century.

The stone tools from Le Moustier -- used by Neanderthals during the Middle Palaeolithic period of the Mousterian between 120,000 and 40,000 years ago -- are kept in the collection of Berlin's Museum of Prehistory and Early History and had not previously been examined in detail.

The tools were rediscovered during an internal review of the collection and their scientific value was recognized.

"The items had been individually wrapped and untouched since the 1960s," says Dutkiewicz.

"As a result, the adhering remains of organic substances were very well preserved."

The researchers discovered traces of a mixture of ochre and bitumen on several stone tools, such as scrapers, flakes, and blades.

Ochre is a naturally occurring earth pigment; bitumen is a component of asphalt and can be produced from crude oil, but also occurs naturally in the soil.

"We were surprised that the ochre content was more than 50 percent," says Schmidt.

"This is because air-dried bitumen can be used unaltered as an adhesive, but loses its adhesive properties when such large proportions of ochre are added."

He and his team examined these materials in tensile tests -- used to determine strength -- and other measures.

"It was different when we used liquid bitumen, which is not really suitable for gluing. If 55 percent ochre is added, a malleable mass is formed," Schmidt says.

The mixture was just sticky enough for a stone tool to remain stuck in it, but without adhering to hands, making it a suitable material for a handle.

In fact, a microscopic examination of the use-wear traces on these stone tools revealed that the adhesives on the tools from Le Moustier were used in this way.

"The tools showed two kinds of microscopic wear: one is the typical polish on the sharp edges that is generally caused by working other materials," explains Iovita, who conducted this analysis.

"The other is a bright polish distributed all over the presumed hand-held part, but not elsewhere, which we interpreted as the results of abrasion from the ochre due to movement of the tool within the grip."

The use of adhesives with several components, including various sticky substances such as tree resins and ochre, was previously known from early modern humans, Homo sapiens, in Africa but not from earlier Neanderthals in Europe.

Overall, the development of adhesives and their use in the manufacture of tools are considered some of the best material evidence of the cultural evolution and cognitive abilities of early humans.

"Compound adhesives are considered to be among the first expressions of the modern cognitive processes that are still active today," says Schmidt.

In the Le Moustier region, ochre and bitumen had to be collected from distant locations, which meant a great deal of effort, planning, and a targeted approach, the authors note.

"Taking into account the overall context of the finds, we assume that this adhesive material was made by Neanderthals," concludes Dutkiewicz.

"What our study shows is that early Homo sapiens in Africa and Neanderthals in Europe had similar thought patterns," adds Schmidt.

Read more at Science Daily

Feb 21, 2024

Spy-satellite images offer insights into historical ecosystem changes

A large number of historical spy-satellite photographs from the Cold War Era were declassified decades ago. This valuable remote sensing data has been utilised by scientists across a wide range of disciplines from archaeology to civil engineering. However, its use in ecology and conservation remains limited. A new study led by Dr. Catalina Munteanu from the Faculty of Environment and Natural Resources at the University of Freiburg, Germany, aims to advance the application of declassified satellite data in the fields of ecology and conservation. Leveraging recent progress in image processing and analysis, these globally available black-and-white images can offer better insights into the historical changes of ecosystems, species populations or changes in human influences on the environment dating back to the 1960s, the researchers suggest.

Historical satellite images cover nearly the entire globe across all seasons

In their study, the researchers initially evaluated the spatial, temporal, and seasonal coverage of over one million declassified images from four historical US spy-satellite programmes, showing that this data spans nearly the entire globe and is available across all seasons.

Upon reviewing how spy-satellite imagery is currently employed in ecology-related fields, the team then identified potential future applications.

Crucially, the broad spatial-temporal scale of the satellite images could enhance the understanding of ecological concepts such as shifting baselines, lag effects, and legacy effects.

This improved understanding could lead to better mapping of the historical extent and structure of ecosystems, aid in the reconstruction of past habitats and species distributions as well as offer new insights into historical human impacts on present ecosystem conditions.

Going forward, this knowledge can also be helpful for conservation planning and ecosystem restoration efforts by helping identify, for example, meaningful ecological baselines, the researchers explain.

Read more at Science Daily

Viruses that can help 'dial up' carbon capture in the sea

Armed with a catalog of hundreds of thousands of DNA and RNA virus species in the world's oceans, scientists are now zeroing in on the viruses most likely to combat climate change by helping trap carbon dioxide in seawater or, using similar techniques, different viruses that may prevent methane's escape from thawing Arctic soil.

By combining genomic sequencing data with artificial intelligence analysis, researchers have identified ocean-based viruses and assessed their genomes to find that they "steal" genes from other microbes or cells that process carbon in the sea. Mapping microbial metabolism genes, including those for underwater carbon metabolism, revealed 340 known metabolic pathways throughout the global oceans. Of these, 128 were also found in the genomes of ocean viruses.

"I was shocked that the number was that high," said Matthew Sullivan, professor of microbiology and director of the Center of Microbiome Science at The Ohio State University.

Having mined this massive trove of data via advances in computation, the team has now revealed which viruses have a role in carbon metabolism and is using this information in newly developed community metabolic models to help predict what using viruses to engineer the ocean microbiome toward better carbon capture would look like.

"The modeling is about how viruses may dial up or dial down microbial activity in the system," Sullivan said. "Community metabolic modeling is telling me the dream data point: which viruses are targeting the most important metabolic pathways, and that matters because it means they're good levers to pull on."

Sullivan presented the research today (Feb. 17, 2024) at the annual meeting of the American Association for the Advancement of Science in Denver.

Sullivan was the virus coordinator for the Tara Oceans Consortium, a three-year global study of the impact of climate change on the world's oceans and the source of 35,000 water samples containing the microbial bounty. His lab focuses on phages, viruses that infect bacteria, and their potential to be scaled up in an engineering framework to manipulate marine microbes into converting carbon into the heaviest organic form that will sink to the ocean floor.

"Oceans soak up carbon, and that buffers us against climate change. CO2 is absorbed as a gas, and its conversion into organic carbon is dictated by microbes," Sullivan said. "What we're seeing now is that viruses target the most important reactions in these microbial community metabolisms. This means we can start investigating which viruses could be used to convert carbon toward the kind we want.

"In other words, can we strengthen this massive ocean buffer to be a carbon sink to buy time against climate change, as opposed to that carbon being released back into the atmosphere to accelerate it?"

In 2016, the Tara team determined that carbon sinking in the ocean was related to the presence of viruses. It is thought that viruses help sink carbon when virus-infected carbon-processing cells cluster into larger, sticky aggregates that drop to the ocean floor. The researchers developed AI-based analytics to identify from thousands of viruses which few are "VIP" viruses to culture in the lab and work with as model systems for ocean geoengineering.

This new community metabolic modeling, developed by collaborator Professor Damien Eveillard of the Tara Oceans Consortium, helps them understand what the unintended consequences of such an approach might be. Sullivan's lab is taking these oceanic lessons learned and applying them to using viruses to engineer microbiomes in human settings to aid recovery from spinal cord injury, improve outcomes for infants born to mothers with HIV, combat infection in burn wounds, and more.

"The conversation we're having is, 'How much of this is transferable?'" said Sullivan, also a professor of civil, environmental and geodetic engineering. "The overall goal is engineering microbiomes toward what we think is something useful."

He also reported on early efforts to use phages as geoengineering tools in an entirely different ecosystem: the permafrost in northern Sweden, where microbes both change the climate and respond to climate change as the frozen soil thaws. Virginia Rich, associate professor of microbiology at Ohio State, is co-director of the National Science Foundation-funded EMERGE Biology Integration Institute based at Ohio State that organizes the microbiome science at the Sweden field site. Rich also co-led previous research that identified a lineage of single-cell organisms in the thawing permafrost soil as a significant producer of methane, a potent greenhouse gas.

Rich co-organized the AAAS session with Ruth Varner of the University of New Hampshire, who co-directs the EMERGE Institute, which is focusing on better understanding how microbiomes respond to permafrost thaw and the resulting climate interactions.

Sullivan's talk was titled "From ecosystems biology to managing microbiomes with viruses," and was presented at the session titled "Microbiome-Targeted Ecosystem Management: Small Players, Big Roles."

Read more at Science Daily

Ancient DNA reveals Down syndrome in past human societies

By analysing ancient DNA, an international team of researchers have uncovered cases of chromosomal disorders, including what could be the first case of Edwards syndrome ever identified from prehistoric remains.

The team identified six cases of Down syndrome and one case of Edwards syndrome in human populations that were living in Spain, Bulgaria, Finland, and Greece from as long ago as 4,500 years before today.

The research indicated that these individuals were buried with care, and often with special grave goods, showing that they were appreciated as members of their ancient societies.

The global collaborative study, led by first author Dr Adam "Ben" Rohrlach of the University of Adelaide, and senior author Dr Kay Prüfer of the Max Planck Institute for Evolutionary Anthropology, involved screening DNA from approximately 10,000 ancient and pre-modern humans for evidence of autosomal trisomies, a condition where people carry an extra (third) copy of one of the first 22 chromosomes.

"Using a new statistical model, we screened the DNA extracted from human remains from the Mesolithic, Neolithic, Bronze and Iron Ages all the way up to the mid-1800s. We identified six cases of Down syndrome," says Dr Rohrlach, a statistician from the University of Adelaide's School of Mathematical Sciences.

"While we expected that people with Down syndrome certainly existed in the past, this is the first time we've been able to reliably detect cases in ancient remains, as they can't be confidently diagnosed by looking at the skeletal remains alone."

Down syndrome occurs when an individual carries an extra copy of chromosome 21. The researchers were able to find these six cases using a novel Bayesian approach to accurately and efficiently screen tens of thousands of ancient DNA samples.

"The statistical model identifies when an individual has approximately 50 per cent too much DNA that comes from one specific chromosome," says Dr Patxuka de-Miguel-Ibáñez of the University of Alicante, and lead osteologist for the Spanish sites.

"We then compared the remains of the individuals with Down syndrome for common skeletal abnormalities such as irregular bone growth, or porosity of the skull bones, which may help to identify future cases of Down syndrome when ancient DNA can't be recovered."

The study also uncovered one case of Edwards syndrome, a rare condition caused by three copies of chromosome 18 that comes with far more severe symptoms than Down syndrome.

The remains indicated severe abnormalities in bone growth, and an age of death of approximately 40 weeks gestation.

All of the cases were detected in perinatal or infant burials, but from different cultures and time periods.

"These individuals were buried according to either the standard practices of their time or were in some way treated specially. This indicates that they were acknowledged as members of their community and were not treated differently in death," says Dr Rohrlach.

"Interestingly, we discovered the only case of Edwards syndrome, and a noticeable increase in cases of Down syndrome, in individuals from the Early Iron Age in Spain. The remains could not confirm that these babies survived to birth, but they were among the infants buried within homes at the settlement, or within other important buildings," says Professor Roberto Risch, co-author and archaeologist from The Autonomous University of Barcelona.

"We don't know why this happened, as most people were cremated during this time, but it appears as if they were purposefully choosing these infants for special burials."

Read more at Science Daily

Study identifies distinct brain organization patterns in women and men

A new study by Stanford Medicine investigators unveils an artificial intelligence model that was more than 90% successful at determining whether scans of brain activity came from a woman or a man.

The findings, to be published Feb. 19 in the Proceedings of the National Academy of Sciences, help resolve a long-term controversy about whether reliable sex differences exist in the human brain and suggest that understanding these differences may be critical to addressing neuropsychiatric conditions that affect women and men differently.

"A key motivation for this study is that sex plays a crucial role in human brain development, in aging, and in the manifestation of psychiatric and neurological disorders," said Vinod Menon, PhD, professor of psychiatry and behavioral sciences and director of the Stanford Cognitive and Systems Neuroscience Laboratory. "Identifying consistent and replicable sex differences in the healthy adult brain is a critical step toward a deeper understanding of sex-specific vulnerabilities in psychiatric and neurological disorders."

Menon is the study's senior author. The lead authors are senior research scientist Srikanth Ryali, PhD, and academic staff researcher Yuan Zhang, PhD.

"Hotspots" that most helped the model distinguish male brains from female ones include the default mode network, a brain system that helps us process self-referential information, and the striatum and limbic network, which are involved in learning and how we respond to rewards.

The investigators noted that this work does not weigh in on whether sex-related differences arise early in life or may be driven by hormonal differences or the different societal circumstances that men and women may be more likely to encounter.

Uncovering brain differences

The extent to which a person's sex affects how their brain is organized and operates has long been a point of dispute among scientists. While we know the sex chromosomes we are born with help determine the cocktail of hormones our brains are exposed to -- particularly during early development, puberty and aging -- researchers have long struggled to connect sex to concrete differences in the human brain. Brain structures tend to look much the same in men and women, and previous research examining how brain regions work together has also largely failed to turn up consistent brain indicators of sex.

In their current study, Menon and his team took advantage of recent advances in artificial intelligence, as well as access to multiple large datasets, to pursue a more powerful analysis than has previously been employed. First, they created a deep neural network model, which learns to classify brain imaging data: As the researchers showed brain scans to the model and told it that it was looking at a male or female brain, the model started to "notice" what subtle patterns could help it tell the difference.

This model demonstrated superior performance compared with those in previous studies, in part because it used a deep neural network that analyzes dynamic MRI scans. This approach captures the intricate interplay among different brain regions. When the researchers tested the model on around 1,500 brain scans, it could almost always tell if the scan came from a woman or a man.
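
As a rough, hypothetical sketch of this kind of classifier (not the Stanford group's architecture), a small convolutional network can be trained on region-by-time activity matrices to predict a binary label. The synthetic data, layer sizes, and training settings below are placeholders chosen only to make the example run.

```python
# Hypothetical sketch: a small 1D-convolutional network classifying synthetic
# "dynamic scans" (regions x timepoints) into two classes. Placeholder data
# and architecture only; not the model used in the study.
import torch
import torch.nn as nn

torch.manual_seed(0)

n_subjects, n_regions, n_timepoints = 200, 16, 100
X = torch.randn(n_subjects, n_regions, n_timepoints)
y = torch.randint(0, 2, (n_subjects,))
# Inject a weak class-dependent signal so there is something to learn.
X[y == 1, 0, :] += 0.5

model = nn.Sequential(
    nn.Conv1d(n_regions, 8, kernel_size=5),  # mixes regional signals over time
    nn.ReLU(),
    nn.AdaptiveAvgPool1d(1),                 # pool over the time axis
    nn.Flatten(),
    nn.Linear(8, 2),                         # two output classes
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(50):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

accuracy = (model(X).argmax(dim=1) == y).float().mean()
print(f"training accuracy: {accuracy:.2f}")
```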

The model's success suggests that detectable sex differences do exist in the brain but just haven't been picked up reliably before. The fact that it worked so well in different datasets, including brain scans from multiple sites in the U.S. and Europe, makes the findings especially convincing, as it controls for many confounds that can plague studies of this kind.

"This is a very strong piece of evidence that sex is a robust determinant of human brain organization," Menon said.

Making predictions

Until recently, a model like the one Menon's team employed would help researchers sort brains into different groups but wouldn't provide information about how the sorting happened. Today, however, researchers have access to a tool called "explainable AI," which can sift through vast amounts of data to explain how a model's decisions are made.

Using explainable AI, Menon and his team identified the brain networks that were most important to the model's judgment of whether a brain scan came from a man or a woman. They found the model was most often looking to the default mode network, striatum, and the limbic network to make the call.

The team then wondered if they could create another model that could predict how well participants would do on certain cognitive tasks based on functional brain features that differ between women and men. They developed sex-specific models of cognitive abilities: One model effectively predicted cognitive performance in men but not women, and another in women but not men. The findings indicate that functional brain characteristics varying between sexes have significant behavioral implications.

"These models worked really well because we successfully separated brain patterns between sexes," Menon said. "That tells me that overlooking sex differences in brain organization could lead us to miss key factors underlying neuropsychiatric disorders."

While the team applied their deep neural network model to questions about sex differences, Menon says the model can be applied to answer questions regarding how just about any aspect of brain connectivity might relate to any kind of cognitive ability or behavior. He and his team plan to make their model publicly available for any researcher to use.

"Our AI models have very broad applicability," Menon said. "A researcher could use our models to look for brain differences linked to learning impairments or social functioning differences, for instance -- aspects we are keen to understand better to aid individuals in adapting to and surmounting these challenges."

Read more at Science Daily

Feb 20, 2024

Researchers shed light on river resiliency to flooding

Researchers at the University of Nevada, Reno have completed one of the most extensive river resilience studies, examining how river ecosystems recover following floods. They developed a novel modeling approach that used data from oxygen sensors placed in rivers to estimate daily growth in aquatic plants and algae. The researchers then modeled the algal and plant biomass in 143 rivers across the contiguous U.S. to quantify what magnitude of flooding disturbs the biomass and how long the rivers take to recover from floods. Increased understanding of rivers' resiliency is important to maintaining healthy rivers, as human actions can affect flood regimes and change the conditions in rivers for other aquatic life that may rely on algae and plants as a food source.

Assistant Professor Joanna Blaszczak and Postdoctoral Scholar Heili Lowman, both in the University's College of Agriculture, Biotechnology & Natural Resources and Global Water Center, led the research, which was published in two separate journal articles. The preliminary work, led by Blaszczak and published in Ecology Letters last June, first studied six rivers and laid the groundwork and methodology for the second study, which Blaszczak hired Lowman to conduct, examining the 143 rivers. The results of that research were just published last month in PNAS (the Proceedings of the National Academy of Sciences).

The research is unique because it estimates changes in biomass in rivers more frequently than ever before without needing to directly sample rivers. This is done by using both data from oxygen sensors placed in the rivers by the U.S. Geological Survey and a population model of algal and plant biomass -- similar to a human population model that models change in the number of people over time, but instead modeling the change in the amount of algae and plants. The oxygen sensors began collecting data in 2007, and the most recent Nevada-led study of 143 rivers includes some data that are for nine years running, among the longest such records on file for rivers across the globe.

"Previously, you would have to go to a river and scrub rocks to measure the algae, and do that several times for an extended period of time in order to estimate changes in biomass growth and loss," Blaszczak said. "This is very time consuming, so the data have been extremely limited relative to how extensive our sensor networks are."

Blaszczak said that with the oxygen sensors that take data as frequently as every five minutes, the team found that they could use statistical models to extract the amount of photosynthesis that occurs daily and estimate daily changes in the amount of biomass in a river over time.

"The dissolved oxygen sensors show the peak during the day, and the low during the night, and from those patterns, you can estimate how much new algae and other biomass grew that day," she said. "With the sensors measuring data continuously in hundreds of rivers for years now, we can get a much bigger, clearer picture. The data is there, and we can use it to model the size of flood needed to disturb the biomass in a river, as well as the rate at which a river recovers from flood disturbances, which can help us manage rivers more effectively."

Getting started

In the first study, Blaszczak used two years of data from oxygen sensors placed in six rivers. She found that she could successfully use this data to model the flood threshold specific to a river that disturbed the underlying biomass, and that generally, the magnitude of flood necessary to disturb biomass and reduce ecosystem productivity was lower than the disturbance flow threshold necessary to mobilize river bed sediment, a metric of disturbance commonly used by those studying rivers. In other words, instead of estimating the disturbance of the river by the movement of the rocks on the river bed, this study used the biology -- the changes in algae and plant growth -- to quantify disturbance to the river and found that the biological disturbance threshold was lower.

"The amount of biomass is important for water quality and a food source for everything that lives in a river," Blaszczak explained, "so it is more important than rock movement, in terms how a river ecosystem is affected by a disturbance."

Blaszczak, a freshwater ecologist, began this work with Robert O. Hall Jr. of the Flathead Lake Biological Station at the University of Montana and enlisted the help of her colleague Assistant Professor Robert Shriver, a plant ecologist, for both research projects to complete the biomass growth modeling. Blaszczak, Shriver, and Lowman all conduct research as part of the College's Department of Natural Resources & Environmental Science, as well as the College's Experiment Station research unit. The College's faculty often take interdisciplinary approaches to meet research challenges, Blaszczak said.

Expanding out to a continental scale

Blaszczak wanted to delve further by applying this approach to more rivers over a longer period of time to shed light on how various factors may be influencing both a river's thresholds for flood disturbance and its resilience to floods. Thus, she recruited Lowman to embark on the second, more extensive study. Lowman's research examined landscape and river characteristics that affected the rivers' resiliency to floods.

"We've never had such great insight into the resilience of rivers, and because of the amount of data and our modeling, we now understand the natural variation in resilience, and that the widest rivers without dams upstream recover the most quickly," Lowman said.

The fact that wide rivers without dams recover more quickly than wide rivers with dams upstream was not immediately obvious, she said, and is one example of how rivers can be affected and/or managed by our actions. Most of the rivers Lowman researched had three to four years of data, with some having as much as nine years, and a handful having less than a year.

"Having three to four years' worth of data is way more than we've ever been able to use before," Lowman said. "And, we used rivers of various sizes with various climates and land characteristics."

Besides wider rivers without dams being more resilient, Lowman said those rivers that had more frequent floods also tended to recover more quickly.

"It could be that they have had a long history of frequent flooding, so their algae and plant communities have developed the ability to adapt to more frequent disturbance," she said.

Overall, Lowman said the new model results are consistent with other previous approaches. But, she said that some sites took much longer, a month or more, to recover from floods than other sites, regardless of river size.

"It might be the composition of the algal and plant communities, the structure of the river bed, or other factors," she said. "The thresholds and recovery times are very likely partially dependent on the slope, the grain size of the sediment, and possibly other factors that aren't as well documented. Those are some next steps for future research."

Read more at Science Daily

Diving deeper into our oceans: Underwater drones open new doors for global coral reef research

At the Okinawa Institute of Science and Technology (OIST), scientists at the Marine Genomics Unit, in collaboration with the Japanese telecommunications company NTT Communications, have identified the genera of mesophotic corals using eDNA collected by underwater drones for the first time. Their groundbreaking research has been published in the journal Royal Society Open Science. Now, with the help of submersible robots, large-scale eDNA monitoring of corals can be conducted without relying on direct observations during scientific scuba diving or snorkeling.

Mesophotic ('middle-light') coral ecosystems are light-dependent tropical or subtropical habitats found at depths of 30 to 150 meters.

They are unique because they host more native species compared to shallow-water coral ecosystems.

Despite this, they are largely unexplored, and more research is needed to understand their basic biology.

Researchers studying corals access these invertebrate reef builders by snorkeling and scuba diving, but these methods have limitations, especially when identifying corals at deeper depths.

Using genetic material that organisms shed from their bodies into their environment -- environmental DNA or eDNA -- scientists can identify types of corals and other organisms living in a particular habitat, providing a powerful tool for biodiversity assessment.

Importantly, studying the eDNA of corals offers unique advantages.

First, unlike fish, corals are stationary, eliminating uncertainties about their location.

Second, they constantly secrete mucus into the sea, providing plenty of coral eDNA for sampling.

For this study, the researchers analyzed mitochondrial DNA, which is more abundant and of higher quality compared to nuclear DNA, improving the accuracy of their findings.

To learn more about the coral eDNA metabarcoding analysis methods used in this study, see here.

Faster and easier monitoring of coral reefs

Mesophotic coral ecosystems (MCEs) in Japan have some of the highest diversity of stony corals (Scleractinia) in the world, making them particularly important for researchers, but difficult to monitor because they are often located at deeper depths.

Additionally, to accurately monitor corals, scientists require both scuba diving and taxonomy skills, which can be challenging.

Existing methods for monitoring MCEs therefore impose limitations on conducting thorough surveys, and new methods are needed.

In October 2022, Prof. Noriyuki Satoh, leader of the Marine Genomics Unit, was approached by Mr. Shinichiro Nagahama of NTT Communications who had read about his research on coral eDNA methods.

Mr. Nagahama suggested using their underwater drones to collect samples from deeper coral reefs for eDNA analysis.

Prof. Satoh then put forward the idea of using the drones to conduct extensive surveys of mesophotic corals at greater depths.

Kerama National Park in Japan, about 30 km west of Okinawa Island, boasts some of the most transparent water in the Okinawa Archipelago.

Often referred to as 'Kerama blue', these waters provided an excellent opportunity for the researchers to test this new sampling technique.

They collected seawater samples -- each measuring 0.5 liters -- from 1 to 2 meters above the coral reefs (between 20 and 80 meters deep). The sampling sites were chosen across 24 locations within 6 different areas around the picturesque Zamami Island.

The next step involved subjecting these samples to coral metabarcoding analyses, which use Scleractinian-specific genetic markers to identify the different genera of corals present in each sample.
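
As a rough illustration of how such metabarcoding output is summarized (the study's actual bioinformatic pipeline is not described here, and the genus names and read counts below are hypothetical), per-sample genus proportions can be tallied from reads assigned to each stony-coral genus, sketched here in Python:

# Illustrative sketch, not the study's pipeline: once metabarcoding reads have been
# assigned to stony-coral genera, per-sample proportions can be tallied to compare
# community composition across sites and depths.
from collections import Counter

def genus_proportions(assigned_reads):
    """assigned_reads: list of genus names, one per sequencing read (hypothetical)."""
    counts = Counter(assigned_reads)
    total = sum(counts.values())
    return {genus: n / total for genus, n in counts.items()}

# Hypothetical example for one 0.5-liter seawater sample:
sample_reads = ["Acropora"] * 620 + ["Porites"] * 240 + ["Pocillopora"] * 140
print(genus_proportions(sample_reads))
# {'Acropora': 0.62, 'Porites': 0.24, 'Pocillopora': 0.14}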

From the eDNA analysis results, the researchers successfully identified corals at the genus level.

The presence and absence of certain genera of stony corals shown by this method indicated that reefs around the Kerama Islands exhibited different compositions of stony corals depending on location and depth.

For example, the genus Acropora had the highest ratios at 11 sites, indicating that these corals are common at Zamami Island reefs.

The researchers also found that the proportion of Acropora eDNA was higher at shallow reefs and upper ridges of slopes, while the proportion of the genus Porites increased at mesophotic sites.

Regarding depth, Acropora was readily detected at shallow reefs (≤15 meters), while other genera were more frequently found at deeper reefs (>20 meters).

To study corals using eDNA metabarcoding methods, further sequencing of mitochondrial genomes of stony corals is needed, and this study suggests that it may be possible to more efficiently monitor mesophotic corals at the generic level using eDNA collected by underwater drones.

Collaborative innovation ahead

NTT Communications has developed a new version of the original drone used for this study.

In response to a request from Prof. Satoh, an additional sampler was added so that two samples can be collected during a single dive.

Additionally, the cable length between the controller and drone was extended from 150 meters to 300 meters and the battery is now changeable, so researchers can continue their survey work for an entire day.

Read more at Science Daily

Vittrup Man crossed over from forager to farmer before being sacrificed in Denmark

Vittrup Man was born along the Scandinavian coast before moving to Denmark, where he was later sacrificed, according to a study published February 14, 2024 in the open-access journal PLOS ONE by Anders Fischer of the University of Gothenburg, Sweden and colleagues.

Vittrup Man is the nickname of a Stone Age skeleton recovered from a peat bog in Northwest Denmark, dating to between 3300 and 3100 BC. The fragmented nature of the remains, including a smashed skull, indicates that he was killed in a ritualistic sacrifice, a common practice in this region at the time.

After a DNA study found Vittrup Man's genetic signature to be distinct from contemporary, local skeletons, Fischer and colleagues were inspired to combine additional evidence to reconstruct the life history of this Stone Age individual at an unprecedented resolution.

Strontium, carbon and oxygen isotopes from Vittrup Man's tooth enamel indicate a childhood spent along the coast of the Scandinavian Peninsula.

Corroborating this, genetic analysis found a close relationship between Vittrup Man and Mesolithic people from Norway and Sweden.

Additional isotope and protein analyses of the teeth and bones indicate a shift in diet from coastal foods (marine mammals and fish) in early life to farm foods (including sheep or goat) later in life, a transition that happened in his late teens.

These results suggest that Vittrup Man spent his early years in a northern foraging society before relocating to a farming society in Denmark.

It isn't clear why this individual moved, though the authors suggest he might have been a trader or captive who became integrated into local society.

Mysteries remain about Vittrup Man, but this detailed understanding of his geographic and dietary life history provides new insights into interactions between Mesolithic and Neolithic societies in Europe.

Read more at Science Daily

A lighthouse in the Gobi desert

A new study published in the journal PLOS ONE explores the influence great fossil sites have on our understanding of evolutionary relationships between fossil groups -- the lagerstätten effect -- and, for the first time, quantifies the power these sites have over our picture of evolutionary history. Surprisingly, the authors discovered that the extraordinarily diverse and well-preserved fossil lizard record from the wind-swept sand deposits of the Late Cretaceous Gobi Desert shapes our understanding of lizard evolutionary history more than any other site on the planet.

While famous as the region where Velociraptor was discovered, China and Mongolia's Late Cretaceous Gobi Desert might have more of an impact on our understanding of ancient -- and modern -- life thanks to its rich record of fossil lizards.

"What's so cool about these Late Cretaceous Gobi Desert deposits is that you're getting extremely diverse, exceptionally complete, three-dimensionally-preserved lizard skeletons," said Dr. Hank Woolley, lead author and NSF Postdoctoral Research Fellow at the Dinosaur Institute.

"You're getting many lineages on the squamate Tree of Life represented from this single unit, giving us this remarkable fossil signal of biodiversity in the rock record, something that stands out as a lighthouse in the deep dark chasms of squamate evolutionary history."

More complete skeletons make it easier to trace relationships through time by making it easier to compare similarities and differences.

The more complete a skeleton is, the more traits are preserved, and those traits translate into phylogenetic data -- data that are used to construct the tree of life.
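
A simple, hypothetical sketch of that idea (the taxa and character codings below are invented for illustration, not taken from the study): a morphological character matrix scores each taxon for a set of traits, with traits that cannot be observed on an incomplete specimen coded as unknown.

# Minimal sketch of a morphological character matrix (taxa x characters).
# Taxa and character states are hypothetical, not from the study.
matrix = {
    "Gobi_lizard_A": ["1", "0", "1", "1", "0"],      # complete 3D skeleton: every character scored
    "Gobi_lizard_B": ["1", "1", "0", "1", "1"],
    "Fragmentary_taxon": ["1", "?", "?", "?", "0"],  # e.g. an isolated jaw: most characters unknowable
}

def completeness(states):
    # fraction of characters that could actually be scored for this taxon
    return 1 - states.count("?") / len(states)

for taxon, states in matrix.items():
    print(f"{taxon}: {completeness(states):.0%} of characters scored")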

"Where there's exceptional preservation -- hundreds of species from one part of the world at one period of very specific time -- that doesn't necessarily give you a good idea of global signals," said Woolley.

"It's putting its thumb on the scale."

To measure how impactful deposits of exceptional fossil preservation (known in the paleontology community by the German term "lagerstätten") are on the broader understanding of evolutionary relationships through time, Woolley and co-authors including Dr. Nathan Smith, Curator of the Dinosaur Institute, combed through published records of 1,327 species of non-avian theropod dinosaurs, Mesozoic birds, and fossil squamates (the group of reptiles that includes mosasaurs, snakes, and lizards).

The Fossil Meta Narrative


When it came to squamates, the researchers found no correlation between the intensity of sampling and whether any given site impacted phylogenetic data on a global scale.

Instead, they found a signal from depositional environments: the different kinds of sites where sediments accumulated preserved markedly different groups.

Because the squamate record from the Gobi Desert is so complete, it shapes our understanding of squamate evolution around the world and across time, a prime example of the "lagerstätten effect" -- despite not being a typical lagerstätte.

Traditional lagerstätten deposits come from marine chalks, salty lagoons, and ancient lake environments -- not from arid sand dunes.

The ancient environment shapes what gets preserved in the fossil record.

"We were not expecting to find this detailed record from lizards in a desert sand dune deposit," said Woolley.

"We often think of lagerstätten deposits as preserving soft tissues and organisms that rarely fossilize, or especially rich concentrations of fossils. What makes the Gobi squamate record unique, is that it includes both exceptionally complete skeletons, and a high diversity of species from across the group's family tree," said Smith.

Read more at Science Daily

Feb 19, 2024

Under pressure -- space exploration in our time

In the past decade, humanity has seen the birth and expansion of a commercial space sector with new, private players addressing technological challenges -- from space launch to communication and satellite imagery of Earth. Last year, the global space industry skyrocketed, launching more than 2,660 satellites into orbit along with interplanetary probes, landers, and much more; in the United States, SpaceX was responsible for almost 90% of these launches. In parallel with this progression, more than 70 countries now demonstrate space capabilities. This affirms the general consensus that humanity will continue to rely on space activities to better the human experience. These developments create a novel landscape of both competition and collaboration for scientists, offering both challenges and opportunities.

In an increasingly fragmented world, the scientific community stands as an example of successful international collaboration and diplomacy.

Science is based in a long-standing tradition of knowledge exchange that often transcends political boundaries for the benefit of all humanity.

Cost-effective commercial space technologies can enable novel research or reduce the cost of investigations, opening new possibilities for the scientific community.

At the same time, international partnerships can further broaden engagement, diversity, and collaboration in science and space exploration.

While this might seem like a "win-win" scenario, the interest of the scientific community lies in openly sharing data and analysis, an ideal that commercial and government partners do not always hold.

Differing principles and ideals present potential areas of conflict.

As governments and private entities fund bold, new projects, leaders, academics, and legal experts are contemplating both the larger consequences, as well as potential prospects.

High stakes leadership and landing on Mars

"When every mission is a first for humanity, the stakes are very high," says Thomas Zurbuchen who led 54 missions as NASA's longest serving Associate Administrator for the Science Mission Directorate.

"When the clock is ticking, and the world is watching, a leader's most powerful asset is a highly diverse team," he says.

Within this context, diversity can emerge from international and commercial partnerships, and can give rise to new missions.

In fact, an estimated two-thirds of science missions have international partnerships.

Now, as the Director for ETH Zurich Space, in Switzerland, Zurbuchen reflects on the value of partnerships as a leadership tool.

He uses examples from recent international missions, such as Mars InSight, to discuss how diversity creates opportunities for new and different ideas to come forward -- even if some ideas pose an element of risk.

He also addresses some of the challenges arising from partnerships.

For example, some companies and countries prefer not to share their science data gathered in space, making reproducibility challenging for scientific analysis.

Cross-border diffusion -- learning from lightning on Venus

"Scientific ideas move like space plasmas," says David Malaspina, a space plasma physicist at University of Colorado, Boulder.

"When they encounter a border, they find a way across." Malaspina describes international academic collaborations as vital engines of discovery and attributes the language of science for fostering a sense of awe and wonder for the universe that transcends cultures.

In science, and in plasmas, the most interesting physics happens at the boundaries.

Malaspina engages in international and generationally diverse research teams, including a team building a sounding rocket to explore the interface between Earth and space.

He is also a member of a team that uses data from the Parker Solar Probe mission to explore Venus, seeking to understand the importance of a planetary magnetic field for habitability of Earth-like planets.

He discusses how teams that foster inclusion of diverse perspectives create new opportunities for scientific progress.

Why protect bootprints on the Moon?

Unlike ancient footprints, cave drawings, and stone-age tools found here on Earth, the first traces of human activity on the Moon, including Neil Armstrong's bootprint, are not protected under any existing laws or regulations.

Michelle Hanlon, space lawyer and Executive Director of the Center for Air and Space Law at the University of Mississippi School of Law, thinks this is a travesty.

Read more at Science Daily

Neolithic groups from the south of the Iberian Peninsula first settled permanently in San Fernando (Cadiz) 6,200 years ago

A new study by ICTA-UAB and the University of Cádiz reveals that the first farmers and herdsmen to settle in Andalusia collected and consumed shellfish throughout the year, especially in winter.

The first Neolithic farmers and shepherds in Andalusia settled permanently on the island of San Fernando, Cadiz, 6,200 years ago, where they continued to collect and consume shellfish throughout the year, preferentially in winter.

This is the conclusion of an archaeological study led by Asier García-Escárzaga, researcher at the Institute of Environmental Science and Technology (ICTA-UAB) and the Department of Prehistory of the Universitat Autònoma de Barcelona (UAB), which shows that these populations occupied the island throughout the year.

The research carried out in recent decades in the south of the Iberian Peninsula has revealed many aspects of the life of the first Neolithic groups in Andalusia.

These populations were the first to base their subsistence mainly on agriculture and livestock, rather than hunting and gathering.

However, there were still questions to be answered about the patterns of occupation of sites (annual or seasonal) and the exploitation of marine resources after the adoption of a new economic model.

In a new study, published in the international journal Archaeological and Anthropological Sciences, oxygen stable isotope analysis was applied to marine shells to address both questions.

The shells analysed were recovered from the sites of Campo de Hockey (San Fernando, Cadiz).

The necropolis of Campo de Hockey, excavated in 2008, is located on the ancient island of San Fernando, just 150 metres from the ancient coastline.

The excavations, directed by Eduardo Vijande from the University of Cadiz, documented 53 graves (45 single, 7 double and 1 quadruple). Most of them were plain (simple graves in which the individual is buried), but what stood out most was the existence of 4 graves of greater complexity and monumentality, made with medium and large stones and considered to be proto-megalithic.

The Campo de Hockey II site, annexed to the first site and whose excavation and research was conducted by María Sánchez and Eduardo Vijande in 2018, allowed for the identification of 28 archaeological structures (17 hearths, two shell heaps, four tombs and five stone structures).

The high presence of hearths and mollusk and fish remains in the middens suggests that the area was used for the processing and consumption of marine resources.

Analysis of stable oxygen isotopes in marine shells makes it possible to reconstruct the time of year when the mollusks died, and therefore when they were consumed by prehistoric populations.
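
One hedged way to picture this inference (the temperatures and monthly climatology below are illustrative placeholders, not the study's calibration or data): the water temperature recorded by a shell's final growth increment, estimated from its oxygen-isotope value, can be matched against the local monthly sea-surface-temperature cycle to find the most plausible season of death.

# Illustrative sketch only: assign a likely month of death by comparing the
# temperature estimated for the shell's final growth increment with a local
# monthly sea-surface-temperature climatology (all values hypothetical).
monthly_sst = {  # degrees Celsius, placeholder values
    "Jan": 15.0, "Feb": 14.5, "Mar": 15.5, "Apr": 17.0, "May": 19.0, "Jun": 21.5,
    "Jul": 23.5, "Aug": 24.0, "Sep": 23.0, "Oct": 21.0, "Nov": 18.0, "Dec": 16.0,
}

def likely_month(edge_temperature_c):
    """Return the month whose climatological SST is closest to the shell-edge estimate."""
    return min(monthly_sst, key=lambda m: abs(monthly_sst[m] - edge_temperature_c))

print(likely_month(15.2))  # -> 'Jan', a cold-season death consistent with winter shellfish collection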

The results of this research indicate that the first farmers occupying the island of San Fernando collected shellfish all year round, but more in the colder months of autumn, winter, and early spring, that is, from November to April.

This information allowed the scientific team to conclude that these populations occupied the island throughout the year.

"The size of the necropolis already led us to believe that it was an annual habitat, but these studies confirm the existence of a permanent settlement 6,200 years ago," said Eduardo Vijande, researcher at the University of Cadiz and co-author of the study.

The greatest exploitation of shellfish during the coldest months of the year coincides with the annual period of maximum profitability of this food resource due to the formation of gametes.

This seasonal pattern of shellfish consumption, based on energetic cost-benefit principles, is similar to that developed by the last hunter-gatherer populations of the Iberian Peninsula.

"That is to say, there is a greater exploitation of these topshells in the winter months, since this is the time when these animals present a greater quantity of meat," points out Asier García-Escárzaga.

Read more at Science Daily

Mystery solved: The oldest fossil reptile from the Alps is a historical forgery

A 280-million-year-old fossil that has baffled researchers for decades has been shown to be, in part, a forgery following new examination of the remnants.

The discovery has led the team led by Dr Valentina Rossi of University College Cork, Ireland (UCC) to urge caution in how the fossil is used in future research.

Tridentinosaurus antiquus was discovered in the Italian Alps in 1931 and was thought to be an important specimen for understanding early reptile evolution.

Its body outline, appearing dark against the surrounding rock, was initially interpreted as preserved soft tissues.

This led to its classification as a member of the reptile group Protorosauria.

However, this new research, published in the scientific journal Palaeontology, reveals that the fossil renowned for its remarkable preservation is mostly just black paint on a carved lizard-shaped rock surface.

The purported fossilised skin had been celebrated in articles and books but never studied in detail.

The somewhat strange preservation of the fossil had left many experts uncertain about which group of reptiles this strange lizard-like animal belonged to, and more generally about its geological history.

Dr Rossi, of UCC's School of Biological, Earth and Environmental Sciences, said:

"Fossil soft tissues are rare, but when found in a fossil they can reveal important biological information, for instance, the external colouration, internal anatomy and physiology.

"The answer to all our questions was right in front of us, we had to study this fossil specimen in details to reveal its secrets -- even those that perhaps we did not want to know."

The microscopic analysis showed that the texture and composition of the material did not match that of genuine fossilised soft tissues.

Preliminary investigation using UV photography revealed that the entirety of the specimen was treated with some sort of coating material.

Coating fossils with varnishes and/or lacquers was the norm in the past and sometimes is still necessary to preserve a fossil specimen in museum cabinets and exhibits.

The team had hoped that, beneath the coating layer, the original soft tissues were still well enough preserved to extract meaningful palaeobiological information.

The findings indicate that the body outline of Tridentinosaurus antiquus was artificially created, likely to enhance the appearance of the fossil.

This deception misled previous researchers, and now caution is being urged when using this specimen in future studies.

The team behind this research includes contributors based in Italy at the University of Padua, Museum of Nature South Tyrol, and the Museo delle Scienze in Trento.

Co-author Prof Evelyn Kustatscher, coordinator of the project "Living with the supervolcano," funded by the Autonomous Province of Bolzano, said:

"The peculiar preservation of Tridentinosaurus had puzzled experts for decades. Now, it all makes sense. What it was described as carbonized skin, is just paint."

However, not all is lost, and the fossil is not a complete fake.

The bones of the hindlimbs, in particular the femurs, seem genuine, although poorly preserved.

Moreover, the new analyses have shown the presence of tiny bony scales called osteoderms -- like the scales of crocodiles -- on what perhaps was the back of the animal.

Read more at Science Daily

Advanced artificial photosynthesis catalyst uses CO2 more efficiently to create biodegradable plastics

A research team from Osaka Metropolitan University, which had previously succeeded in synthesizing fumaric acid from bicarbonate, pyruvic acid, and carbon dioxide collected directly from the gas phase, has now created a new photosensitizer and developed a new artificial photosynthesis technology, effectively doubling the yield of fumaric acid production compared to the previous method. The results of this research are expected to reduce carbon dioxide emissions and provide an innovative way to produce biodegradable plastics while reusing waste resources.

Amid growing global concern over climate change and plastic pollution, researchers at Osaka Metropolitan University are making great strides in the sustainable production of fumaric acid -- a component of biodegradable plastics such as polybutylene succinate, which is commonly used for food packaging.

The researchers have managed to efficiently produce fumaric acid, which is traditionally derived from petroleum, using renewable resources, carbon dioxide, and biomass-derived compounds.

In a previous study, a research team led by Professor Yutaka Amao of the Research Center for Artificial Photosynthesis at Osaka Metropolitan University demonstrated the synthesis of fumaric acid from bicarbonate and pyruvic acid, a biomass-derived compound, using solar energy.

They also succeeded in producing fumaric acid using carbon dioxide obtained directly from the gas phase as a raw material.

However, the yield in the production of fumaric acid remained low.

In their latest research, published in Dalton Transactions, the researchers have now developed a new photosensitizer and further advanced an artificial photosynthesis technique that doubles the yield of fumaric acid compared to conventional methods.

Read more at Science Daily

Feb 18, 2024

First-ever atomic freeze-frame of liquid water

In an experiment akin to stop-motion photography, scientists have isolated the energetic movement of an electron while "freezing" the motion of the much larger atom it orbits in a sample of liquid water.

The findings, reported today in the journal Science, provide a new window into the electronic structure of molecules in the liquid phase on a timescale previously unattainable with X-rays. The new technique reveals the immediate electronic response when a target is hit with an X-ray, an important step in understanding the effects of radiation exposure on objects and people.

"The chemical reactions induced by radiation that we want to study are the result of the electronic response of the target that happens on the attosecond timescale," said Linda Young, a senior author of the research and Distinguished Fellow at Argonne National Laboratory. "Until now radiation chemists could only resolve events at the picosecond timescale, a million times slower than an attosecond. It's kind of like saying 'I was born and then I died.' You'd like to know what happens in between. That's what we are now able to do."

A multi-institutional group of scientists from several Department of Energy national laboratories and universities in the U.S. and Germany combined experiments and theory to reveal in real-time the consequences when ionizing radiation from an X-ray source hits matter.

Working on the time scales where the action happens will allow the research team to understand complex radiation-induced chemistry more deeply. Indeed, these researchers initially came together to develop the tools needed to understand the effect of prolonged exposure to ionizing radiation on the chemicals found in nuclear waste. The research is supported by the Interfacial Dynamics in Radioactive Environments and Materials (IDREAM) Energy Frontier Research Center sponsored by the Department of Energy and headquartered at Pacific Northwest National Laboratory (PNNL).

"Members of our early-career network participated in the experiment, and then joined our full experimental and theoretical teams to analyze and understand the data," said Carolyn Pearce, IDREAM EFRC director and a PNNL chemist. "We couldn't have done this without the IDREAM partnerships."

From the Nobel Prize to the field

Subatomic particles move so fast that capturing their actions requires a probe capable of measuring time in attoseconds, a time frame so small that there are more attoseconds in a second than there have been seconds in the history of the universe.
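
A quick back-of-the-envelope check of that comparison:

# One second contains 1e18 attoseconds, while the universe's age (~13.8 billion years)
# amounts to only a few times 1e17 seconds.
attoseconds_per_second = 1e18                     # 1 attosecond = 1e-18 s
age_of_universe_s = 13.8e9 * 365.25 * 24 * 3600   # ~13.8 billion years in seconds
print(f"{age_of_universe_s:.2e}")                 # ~4.35e+17 s, indeed less than 1e18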

The current investigation builds upon the new science of attosecond physics, recognized with the 2023 Nobel Prize in Physics. Attosecond X-ray pulses are only available in a handful of specialized facilities worldwide. This research team conducted their experimental work at the Linac Coherent Light Source (LCLS), located at SLAC National Accelerator Laboratory, in Menlo Park, Calif, where the local team pioneered the development of attosecond X-ray free-electron lasers.

"Attosecond time-resolved experiments are one of the flagship R&D developments at the Linac Coherent Light Source," said Ago Marinelli from the SLAC National Accelerator Laboratory, who, together with James Cryan, led the development of the synchronized pair of X-ray attosecond pump/probe pulses that this experiment used. "It's exciting to see these developments being applied to new kinds of experiments and taking attosecond science into new directions."

The technique developed in this study, all X-ray attosecond transient absorption spectroscopy in liquids, allowed them to "watch" electrons energized by X-rays as they move into an excited state, all before the bulkier atomic nucleus has time to move. They chose liquid water as the test case for the experiment.

"We now have a tool where, in principle, you can follow the movement of electrons and see newly ionized molecules as they're formed in real-time," said Young, who is also a professor in the Department of Physics and James Franck Institute at the University of Chicago.

These newly reported findings resolve a long-standing scientific debate about whether X-ray signals seen in previous experiments are the result of different structural shapes, or "motifs," of water or hydrogen atom dynamics. These experiments demonstrate conclusively that those signals are not evidence for two structural motifs in ambient liquid water.

"Basically, what people were seeing in previous experiments was the blur caused by moving hydrogen atoms," said Young. "We were able to eliminate that movement by doing all of our recording before the atoms had time to move."

From simple to complex reactions

The researchers envision the current study as the beginning of a whole new direction for attosecond science.

To make the discovery, PNNL experimental chemists teamed with physicists at Argonne and the University of Chicago, X-ray spectroscopy specialists and accelerator physicists at SLAC, theoretical chemists at the University of Washington, and attosecond science theoreticians from the Hamburg Centre for Ultrafast Imaging and the Center for Free-Electron Laser Science (CFEL), Deutsches Elektronen-Synchrotron (DESY), in Hamburg, Germany.

During the global pandemic, in 2021 and into 2022, the PNNL team used techniques developed at SLAC to spray an ultra-thin sheet of pure water across the X-ray pump pulse path.

"We needed a nice, flat, thin sheet of water where we could focus the X-rays," said Emily Nienhuis, an early-career chemist at PNNL, who started the project as a post-doctoral research associate. "This capability was developed at the LCLS." At PNNL, Nienhuis demonstrated that this technique can also be used to study the specific concentrated solutions that are central to the IDREAM EFRC and will be investigated at the next stage of the research.

From experiment to theory

Once the X-ray data had been collected, theoretical chemist Xiaosong Li and graduate student Lixin Lu from the University of Washington applied their knowledge of interpreting the X-ray signals to reproduce the signals observed at SLAC. The CFEL team, led by theoretician Robin Santra, modelled the liquid water response to attosecond X-rays to verify that the observed signal was indeed confined to the attosecond timescale.

"Using the Hyak supercomputer at the University of Washington, we developed a cutting-edge computational chemistry technique that enabled detailed characterization of the transient high-energy quantum states in water," said Li, the Larry R. Dalton Endowed Chair in Chemistry at the University of Washington and a Laboratory Fellow at PNNL. "This methodological breakthrough yielded a pivotal advancement in the quantum-level understanding of ultrafast chemical transformation, with exceptional accuracy and atomic-level detail."

Principal Investigator Young originated the study and supervised its execution, which was led on-site by first author and postdoc Shuai Li. Physicist Gilles Doumy, also of Argonne, and graduate student Kai Li of the University of Chicago were part of the team that conducted the experiments and analyzed the data. Argonne's Center for Nanoscale Materials, a DOE Office of Science user facility, helped characterize the water sheet jet target.

Together, the research team got a peek at the real-time motion of electrons in liquid water while the rest of the world stood still.

"The methodology we developed permits the study of the origin and evolution of reactive species produced by radiation-induced processes, such as encountered in space travel, cancer treatments, nuclear reactors and legacy waste," said Young.

Read more at Science Daily

Frequent marine heatwaves in the Arctic Ocean will be the norm

Marine heatwaves will become a regular occurrence in the Arctic in the near future and are a product of higher anthropogenic greenhouse-gas emissions -- as shown in a study just released by Dr. Armineh Barkhordarian from Universität Hamburg's Cluster of Excellence for climate research CLICCS.

Since 2007, conditions in the Arctic have shifted, as confirmed by data recently published in the journal Communications Earth & Environment. Between 2007 and 2021, the marginal zones of the Arctic Ocean experienced 11 marine heatwaves, producing an average temperature rise of 2.2 degrees Celsius above the seasonal norm and lasting an average of 37 days.

Since 2015, there have been Arctic marine heatwaves every year.

The most powerful heatwave to date in the Arctic Ocean occurred in 2020; it continued for 103 days, with peak temperatures four degrees Celsius above the long-term average.

The probability of such a heatwave occurring without the influence of anthropogenic greenhouse gases is less than one percent, as calculated by Barkhordarian's team at the Cluster of Excellence CLICCS.

By doing so, they have narrowed down the number of plausible climate scenarios in the Arctic.

According to the study, annual marine heatwaves will be the norm.

The Arctic entered a new phase

In the study, Barkhordarian also proves for the first time that heatwaves are produced when sea ice melts early and rapidly after the winter.

When this happens, considerable heat energy can accumulate in the water by the time maximum solar radiation is reached in July.

"In 2007, a new phase began in the Arctic," says Barkhordarian, an expert on climate statistics.

"There is less and less of the thicker, several-year-old ice, while the percentage of thin, seasonal ice is consistently increasing." However, the thin ice is less durable and melts more quickly, allowing incoming solar radiation to warm the water's surface.

Officially, it is considered to be a marine heatwave when temperatures at the water's surface are higher than 95 percent of the values from the past 30 years for at least five consecutive days.
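
A minimal sketch of that definition (the data below are synthetic, and real analyses use a day-of-year climatological threshold rather than the single fixed value used here): flag days whose surface temperature exceeds the 95th-percentile baseline threshold, then keep only runs of at least five consecutive days.

# Sketch of the heatwave criterion quoted above, on synthetic data.
def marine_heatwave_events(daily_sst, threshold, min_days=5):
    events, run_start = [], None
    for i, temp in enumerate(daily_sst + [float("-inf")]):  # sentinel closes any open run
        if temp > threshold:
            if run_start is None:
                run_start = i
        else:
            if run_start is not None and i - run_start >= min_days:
                events.append((run_start, i - 1))            # (first day, last day) of the event
            run_start = None
    return events

sst = [1.0, 1.2, 3.1, 3.4, 3.2, 3.6, 3.3, 1.1, 3.5, 3.4, 1.0]
print(marine_heatwave_events(sst, threshold=3.0))  # [(2, 6)] -- the later two-day run is too short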

"Not just the constant loss of sea ice but also warmer waters can have dramatic negative effects on the Arctic ecosystem," says Barkhordarian.

Food chains could collapse, fish stocks could be reduced, and overall biodiversity could decline.

Read more at Science Daily

Salt substitutes help to maintain healthy blood pressure in older adults

The replacement of regular salt with a salt substitute can reduce incidence of hypertension, or high blood pressure, in older adults without increasing their risk of low blood pressure episodes, according to a recent study in the Journal of the American College of Cardiology. People who used a salt substitute had a 40% lower incidence and likelihood of experiencing hypertension compared to those who used regular salt.

According to the World Health Organization, hypertension is the leading risk factor for cardiovascular disease and mortality.

It affects over 1.4 billion adults and results in 10.8 million deaths per year worldwide.

One of the most effective ways to reduce hypertension risk is to reduce sodium intake.

This study looks at salt substitutes as a better solution to control and maintain healthy blood pressure than reducing salt alone.

"Adults frequently fall into the trap of consuming excess salt through easily accessible and budget-friendly processed foods," said Yangfeng Wu, MD, PhD, lead author of the study and Executive Director of Peking University Clinical Research Institute in Beijing, China.

"It's crucial to recognize the impact of our dietary choices on heart health and increase the public's awareness of lower-sodium options."

Researchers in this study evaluated the impact of sodium reduction strategies on blood pressure in elderly adults residing in care facilities in China.

While previous studies prove that reducing salt intake can prevent or delay new-onset hypertension, long-term salt reduction and avoidance can be challenging.

The DECIDE-Salt study included 611 participants 55 years or older from 48 care facilities split into two groups: 24 facilities (313 participants) replacing usual salt with the salt substitute and 24 facilities (298 participants) continuing the use of usual salt.

All participants had blood pressure <140/90mmHg and were not using anti-hypertension medications at baseline.

The primary outcome was incident hypertension, initiation of anti-hypertension medication, or a major adverse cardiovascular event during follow-up.

At two years, the incidence of hypertension was 11.7 per 100 person-years in participants using the salt substitute and 24.3 per 100 person-years in participants using regular salt.
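
For readers unfamiliar with the unit, an incidence of 11.7 per 100 person-years means 11.7 new cases for every 100 years of participant follow-up accumulated; a brief illustration with hypothetical counts (these are not the trial's raw data):

# Illustration of the "per 100 person-years" unit, with hypothetical numbers.
def incidence_per_100_person_years(events, total_person_years):
    return 100 * events / total_person_years

# e.g. 70 new hypertension diagnoses accrued over 600 person-years of follow-up:
print(incidence_per_100_person_years(70, 600))  # ~11.7 per 100 person-years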

People using the salt substitute were 40% less likely to develop hypertension compared to those using regular salt.

Furthermore, the salt substitutes did not cause hypotension, which can be a common issue in older adults.

"Our results showcase an exciting breakthrough in maintaining blood pressure that offers a way for people to safeguard their health and minimize the potential for cardiovascular risks, all while being able to enjoy the perks of adding delicious flavor to their favorite meals," Wu said.

"Considering its blood pressure -- lowering effect, proven in previous studies, the salt substitute shows beneficial to all people, either hypertensive or normotensive, thus a desirable population strategy for prevention and control of hypertension and cardiovascular disease."

Limitations of the study include that it is a post-hoc analysis, study outcomes were not pre-specified and there was a loss of follow-up visits in many patients.

Analyses indicated that these missing values occurred at random, and multiple sensitivity analyses support the robustness of the results.

Read more at Science Daily