Jan 21, 2023

Tumultuous migration on the edge of the Hot Neptune Desert

All kinds of exoplanets orbit very close to their star. Some look like the Earth, others like Jupiter. Very few, however, are similar to Neptune. Why this anomaly in the distribution of exoplanets? Researchers from the University of Geneva (UNIGE) and the National Centre of Competence in Research (NCCR) PlanetS have observed a sample of planets located at the edge of this Hot Neptune Desert to understand its creation. Using a technique combining the two main methods of studying exoplanets (radial velocities and transits), they were able to establish that some of these exoplanets migrated toward their star in a disruptive way, a process that pushed them out of the orbital plane in which they formed. These results are published in the journal Astronomy & Astrophysics.

Since the discovery of the first exoplanet in 1995, researchers have detected more than 5,000 planets in our galactic neighborhood, most of them orbiting very close to their star. While the diversity of these new worlds ranges from gas giants the size of Jupiter or Saturn to smaller planets the size of Mercury, including rocky planets larger than the Earth, gas planets the size of Neptune seem to be missing. Astronomers call this empty "box" in the distribution of close-in planets the Hot Neptune Desert.

"The distribution of planets close to their star is shaped by a complex interaction between atmospheric and dynamical processes, i.e. the motions of the planets over time," comments Vincent Bourrier, assistant professor in the Department of Astronomy at the UNIGE Faculty of Science. "Today we have several hypotheses to explain this desert, but nothing is certain yet and the mystery remains." Did these planets lose their atmosphere entirely, eroded by the intense radiation of their star? Did they migrate inward from their birthplace in the outer parts of the system by a different mechanism than other types of planets, one that prevented them from reaching the same close orbits?

Disrupted migration

In a recent work, a team of scientists from the UNIGE provides some answers by looking at the orbital architecture of the planets located at the edge of this desert. By surveying fourteen planets in and around this region, ranging from small planets to gas giants, the astronomers examined how their orbits are oriented with respect to the axis of rotation of their star. This information makes it possible to distinguish processes of soft migration (the planets move within the equatorial plane of their star, where they formed) from processes of disruptive migration (the planets migrate and are pushed out of the plane where they formed).

The researchers were able to show that most of the planets in their sample have an orbit misaligned with the stellar equator. "We found that three-quarters of these planets have a polar orbit (they pass over the poles of their star), which is a larger fraction than for planets further away from the desert. This reflects the role of disruptive migration processes in the formation of the desert," summarizes Vincent Bourrier, first author of the study.

Two methods combined

To achieve these results, the scientists combined the radial velocity method and the transit method, the two main techniques used to study exoplanets. "Analyzing the radial velocities during the transit of a planet allows us to determine if it orbits around the stellar equator, around the poles, or if the system is in an intermediate configuration, because different architectures will produce different signatures," explains Omar Attia, a doctoral student in the Department of Astronomy at the UNIGE Faculty of Science and second author of the study. The two methods were applied to data obtained with the HARPS and HARPS-North spectrographs, built at UNIGE and installed on the 3.6-meter telescope of the European Southern Observatory (ESO) and on the Telescopio Nazionale Galileo (TNG), respectively.

The path to understanding all of the mechanisms involved in the formation of the Hot Neptune Desert is still long. In particular, it will be necessary to explore with this technique the smallest planets at the edge of the desert, which today remain difficult to access even with latest-generation instruments such as the ESPRESSO spectrograph, built by the UNIGE and installed on the largest European telescopes. That will have to wait for the commissioning of the ELT, ESO's 39-meter super telescope, planned for 2027.

Read more at Science Daily

17-pound meteorite discovered in Antarctica

Antarctica is a tough place to work, for obvious reasons -- it's bitterly cold, remote, and wild. However, it's one of the best places in the world to hunt for meteorites. That's partly because Antarctica is a desert, and its dry climate limits the degree of weathering the meteorites experience. On top of the dry conditions, the landscape is ideal for meteorite hunting: the black space rocks stand out clearly against snowy fields. Even when meteorites sink into the ice, the glaciers' churning motion against the rock below helps re-expose the meteorites near the surface of the continent's blue ice fields.

An international team of researchers who just got back from Antarctica can attest to the continent's meteorite-hunter-friendliness: they returned with five new meteorites, including one that weighs 16.7 pounds (7.6 kg).

Maria Valdes, a research scientist at the Field Museum and the University of Chicago, estimates that of the roughly 45,000 meteorites retrieved from Antarctica over the past century, only about a hundred or so are this size or larger. "Size doesn't necessarily matter when it comes to meteorites, and even tiny micrometeorites can be incredibly scientifically valuable," says Valdes, "but of course, finding a big meteorite like this one is rare, and really exciting."

Valdes was one of four scientists on the mission, led by Vinciane Debaille of the Université Libre de Bruxelles (FNRS-ULB); the research team was rounded out by Maria Schönbächler (ETH-Zurich) and Ryoga Maeda (VUB-ULB). The researchers were the first to explore potential new meteorite sites mapped using satellite imagery by Veronica Tollenaar, a thesis student in glaciology at the ULB.

"Going on an adventure exploring unknown areas is exciting," says Debaille, "but we also had to deal with the fact that the reality on the ground is much more difficult than the beauty of satellite images." Despite timing their trip for Antarctica's summertime in late December, temperatures hovered around 14° F (-10° C). Valdes notes that some days during their trip, it was actually colder in Chicago than it was in Antarctica, but spending days riding snowmobiles and trekking through ice fields and then sleeping in a tent made the Antarctic weather feel more extreme.

The five meteorites recovered by the team will be analyzed at the Royal Belgian Institute of Natural Sciences; meanwhile, sediment potentially containing tiny micrometeorites was divided among the researchers for study at their institutions.

Valdes says she's eager to see what the analyses of the meteorites reveal, because "studying meteorites helps us better understand our place in the universe. The bigger a sample size we have of meteorites, the better we can understand our Solar System, and the better we can understand ourselves."

Read more at Science Daily

Jan 20, 2023

Stars disappear before our eyes

A startling analysis from Globe at Night -- a citizen science program run by NSF's NOIRLab -- concludes that stars are disappearing from human sight at an astonishing rate. The study finds that, to human eyes, artificial lighting has dulled the night sky more rapidly than indicated by satellite measurements. The study published in the journal Science showcases the unique contributions that citizen scientists can make in essential fields of research.

From the glowing arc of the Milky Way to dozens of intricate constellations, the unaided human eye should be able to perceive several thousand stars on a clear, dark night. Unfortunately, growing light pollution has robbed about 30% of people around the globe and approximately 80% of people in the United States of the nightly view of their home galaxy. A new paper published in the journal Science concludes that the problem is getting rapidly worse.

New citizen-science-based research sheds alarming light on the problem of 'skyglow' -- the diffuse illumination of the night sky that is a form of light pollution. The data for this study came from crowd-sourced observations collected from around the world as part of Globe at Night, a program run by NSF's NOIRLab and developed by NOIRLab astronomer Connie Walker. The research reveals that skyglow is increasing more rapidly than shown in satellite measurements of Earth's surface brightness at night.

"At this rate of change, a child born in a location where 250 stars were visible would be able to see only abound100 by the time they turned 18," said Christopher Kyba, a researcher at the German Research Centre for Geosciences and lead author of the paper detailing these results.

Light pollution is a familiar problem that has many detrimental effects, not only on the practice of astronomy. It also has an impact on human health and wildlife, since it disrupts the cyclical transition from sunlight to starlight that biological systems have evolved alongside. Furthermore, the loss of visible stars is a poignant loss of human cultural heritage. Until relatively recently, humans throughout history had an impressive view of the starry night sky, and the effect of this nightly spectacle is evident in ancient cultures, from the myths it inspired to the structures that were built in alignment with celestial bodies.

Despite being a well-recognized issue, however, the changes in sky brightness over time are not well documented, particularly on a global scale.

Globe at Night has been gathering data on stellar visibility every year since 2006. Anyone can submit observations through the Globe at Night web application on a desktop or smartphone. After entering the relevant date, time and location, participants are shown a number of star maps. They then record which one best matches what they can see in the sky without any telescopes or other instruments.

This gives an estimate of what is called the naked eye limiting magnitude, which is a measure of how bright an object must be in order to be seen. This can be used to estimate the brightness of skyglow, because as the sky brightens, the fainter objects disappear from sight.

The authors of the paper analyzed more than 50,000 observations submitted to Globe at Night between 2011 and 2022, ensuring consistency by omitting entries that were affected by factors including cloud cover and moonlight. They focused on data from Europe and North America, since these regions had a sufficient distribution of observations across the land area as well as throughout the decade studied. The paper notes that the sky is likely brightening more quickly in developing countries, where satellite observations indicate the prevalence of artificial lighting is growing at a higher rate.

After devising a new method to convert these observations into estimates of the change in skyglow, the authors found that the loss of visible stars reported by Globe at Night indicates an increase in sky brightness of 9.6% per year over the past decade. This is much greater than the roughly 2% per year global increase in surface brightness measured by satellites.
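That compounding rate adds up quickly. As a rough illustration (simple arithmetic, not taken from the paper), a 9.6% annual increase sustained over the 18 years of the childhood Kyba describes brightens the sky roughly fivefold:

```python
# Compound a 9.6% annual increase in sky brightness over 18 years.
# Illustrative arithmetic only; mapping brightness to star counts is
# more involved than this.
ANNUAL_INCREASE = 0.096
YEARS = 18

brightness_factor = (1 + ANNUAL_INCREASE) ** YEARS
print(f"Sky roughly {brightness_factor:.1f}x brighter after {YEARS} years")
```

A sky about five times brighter swamps the faintest stars first, which is how 250 visible stars can shrink to around 100.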

"This shows that existing satellites aren't sufficient to study how Earth's night is changing," said Kyba. "We've developed a way to 'translate' Globe at Night observations of star visibility made at different locations from year to year into continent-wide trends of sky brightness change. That shows that Globe at Night isn't just an interesting outreach activity, it's an essential measurement of one of Earth's environmental variables."

Existing satellites are not well suited to measuring skyglow as it appears to humans, because there are no current instruments monitoring the whole Earth that can detect wavelengths shorter than 500 nanometers, which corresponds to the color cyan, or greenish blue. Shorter wavelengths, however, contribute disproportionately to skyglow, because they scatter more effectively in the atmosphere. White LEDs, now increasingly common in high-efficiency outdoor lighting, have a peak in emission between 400 and 500 nanometers.

"Since human eyes are more sensitive to these shorter wavelengths at nighttime, LED lights have a strong effect on our perception of sky brightness," said Kyba. "This could be one of the reasons behind the discrepancy between satellite measurements and the sky conditions reported by Globe at Night participants."

Beyond wavelength differences, space-based instruments do not measure light emitted horizontally very well, such as from illuminated signs or windows, but these sources are significant contributors to skyglow as seen from the ground. Crowd-sourced observations will therefore always be invaluable for investigating the direct human effects of sky brightness.

Read more at Science Daily

Researchers uncover 92 fossil nests belonging to some of India's largest dinosaurs

The discovery of more than 250 fossilized eggs reveals intimate details about the lives of titanosaurs in the Indian subcontinent, according to a study published January 18, 2023 in the open-access journal PLOS ONE by Harsha Dhiman of the University of Delhi, New Delhi and colleagues.

The Lameta Formation, located in the Narmada Valley of central India, is well-known for fossils of dinosaur skeletons and eggs of the Late Cretaceous Period. Recent work in the area uncovered 92 nesting sites containing a total of 256 fossil eggs belonging to titanosaurs, which were among the largest dinosaurs to have ever lived. Detailed examination of these nests has allowed Dhiman and colleagues to make inferences about the life habits of these dinosaurs.

The authors identified six different egg-species (oospecies), suggesting a higher diversity of titanosaurs than is represented by skeletal remains from this region. Based on the layout of the nests, the team inferred that these dinosaurs buried their eggs in shallow pits like modern-day crocodiles. Certain pathologies found in the eggs, such as a rare case of an "egg-in-egg," indicate that titanosaur sauropods had a reproductive physiology that parallels that of birds and possibly laid their eggs in a sequential manner as seen in modern birds. The presence of many nests in the same area suggests these dinosaurs exhibited colonial nesting behavior like many modern birds. But the close spacing of the nests left little room for adult dinosaurs, supporting the idea that adults left the hatchlings (newborns) to fend for themselves.

Details of dinosaur reproductive habits can be difficult to determine. These fossil nests provide a wealth of data about some of the largest dinosaurs in history, and they come from a time shortly before the age of dinosaurs came to an end. The insights gleaned from this study contribute significantly to paleontologists' understanding of how dinosaurs lived and evolved.

Harsha Dhiman, lead author of the research, adds: "Our research has revealed the presence of an extensive hatchery of titanosaur sauropod dinosaurs in the study area and offers new insights into the conditions of nest preservation and reproductive strategies of titanosaur sauropod dinosaurs just before they went extinct."

Guntupalli V.R. Prasad, co-author and leader of the research team, adds: "Together with dinosaur nests from Jabalpur in the upper Narmada valley in the east and those from Balasinor in the west, the new nesting sites from Dhar District in Madhya Pradesh (Central India), covering an east-west stretch of about 1000 km, constitute one of the largest dinosaur hatcheries in the world."

From Science Daily

Light-based tech could inspire Moon navigation and next-gen farming

Super-thin chips made from lithium niobate are set to overtake silicon chips in light-based technologies, according to world-leading scientists in the field, with potential applications ranging from remote ripening-fruit detection on Earth to navigation on the Moon.

They say the artificial crystal offers the platform of choice for these technologies due to its superior performance and recent advances in manufacturing capabilities.

RMIT University's Distinguished Professor Arnan Mitchell and University of Adelaide's Dr Andy Boes led this team of global experts to review lithium niobate's capabilities and potential applications in the journal Science.

The international team, including scientists from Peking University in China and Harvard University in the United States, is working with industry to make navigation systems that are planned to help rovers drive on the Moon later this decade.

As it is impossible to use global positioning system (GPS) technology on the Moon, navigation systems in lunar rovers will need to use an alternative system, which is where the team's innovation comes in.

By detecting tiny changes in laser light, the lithium-niobate chip can be used to measure movement without needing external signals, according to Mitchell.

"This is not science fiction -- this artificial crystal is being used to develop a range of exciting applications. And competition to harness the potential of this versatile technology is heating up," said Mitchell, Director of the Integrated Photonics and Applications Centre.

He said while the lunar navigation device was in the early stages of development, the lithium niobate chip technology was "mature enough to be used in space applications."

"Our lithium niobate chip technology is also flexible enough to be rapidly adapted to almost any application that uses light," Mitchell said.

"We are focused on navigation now, but the same technology could also be used for linking internet on the Moon to the internet on Earth."

What is lithium niobate and how can it be used?

Lithium niobate is an artificial crystal that was first discovered in 1949 but is "back in vogue," according to Boes.

"Lithium niobate has new uses in the field of photonics -- the science and technology of light -- because unlike other materials it can generate and manipulate electro-magnetic waves across the full spectrum of light, from microwave to UV frequencies," he said.

"Silicon was the material of choice for electronic circuits, but its limitations have become increasingly apparent in photonics.

"Lithium niobate has come back into vogue because of its superior capabilities, and advances in manufacturing mean that it is now readily available as thin films on semiconductor wafers."

A layer of lithium niobate about 1,000 times thinner than a human hair is placed on a semiconductor wafer, Boes said.

"Photonic circuits are printed into the lithium niobate layer, which are tailored according to the chip's intended use. A fingernail-sized chip may contain hundreds of different circuits," he said.

How does the lunar navigation tech work?

The team is working with the Australian company Advanced Navigation to create optical gyroscopes, where laser light is launched in both clockwise and anticlockwise directions in a coil of fibre, Mitchell said.

"As the coil is moved the fibre is slightly shorter in one direction than the other, according to Albert Einstein's theory of relativity," he said.

"Our photonic chips are sensitive enough to measure this tiny difference and use it to determine how the coil is moving. If you can keep track of your movements, then you know where you are relative to where you started. This is called inertial navigation."

Potential applications closer to home

This technology can also be used to remotely detect the ripeness of fruit.

"Gas emitted by ripe fruit is absorbed by light in the mid-infrared part of the spectrum," Mitchell said.

"A drone hovering in an orchard would transmit light to another which would sense the degree to which the light is absorbed and when fruit is ready for harvesting.

"Our microchip technology is much smaller, cheaper and more accurate than current technology and can be used with very small drones that won't damage fruit trees."

Read more at Science Daily

Plague trackers: Researchers cover thousands of years in a quest to understand the elusive origins of the Black Death

Seeking to better understand the origins and movement of bubonic plague, in ancient and contemporary times, researchers at McMaster University, the University of Sydney and the University of Melbourne have completed a painstaking granular examination of hundreds of modern and ancient genome sequences, creating the largest analysis of its kind.

Despite massive advances in DNA technology and analysis, the origin, evolution and dissemination of the plague remain notoriously difficult to pinpoint.

The plague is responsible for the two largest and most deadly pandemics in human history. However, the ebb and flow of these pandemics, including why some outbreaks die out while others persist for years, has confounded scientists.

In a paper published today in the journal Communications Biology, McMaster researchers use comprehensive data and analysis to chart what they can about the highly complex history of Y. pestis, the bacterium that causes plague.

The research features an analysis of more than 600 genome sequences from around the globe, spanning the plague's first emergence in humans 5,000 years ago, the plague of Justinian, the medieval Black Death and the current (or third) Pandemic, which began in the early 20th century.

"The plague was the largest pandemic and biggest mortality event in human history. When it emerged and from what host may shed light on where it came from, why it continually erupted over hundreds of years and died out in some locales but persisted in others. And ultimately, why it killed so many people," explains evolutionary geneticist Hendrik Poinar, director of McMaster's Ancient DNA Centre.

Poinar is a principal investigator with the Michael G. DeGroote Institute for Infectious Disease Research and McMaster's Global Nexus for Pandemics & Biological Threats.

The team studied genomes from strains with a worldwide distribution and of different ages and determined that Y. pestis has an unstable molecular clock. This makes it particularly difficult to measure the rate at which mutations accumulate in its genome over time, which are then used to calculate dates of emergence.

Because Y. pestis evolves at a very slow pace, it is almost impossible to determine exactly where it originated.

Humans and rodents have carried the pathogen around the globe through travel and trade, allowing it to spread faster than its genome evolved. For example, genomic sequences found in Russia, Spain, England, Italy and Turkey are all identical despite being separated by years, creating enormous challenges in determining the route of transmission.

To address the problem, researchers developed a new method for distinguishing specific populations of Y. pestis, enabling them to identify and date five populations throughout history, including the most famous ancient pandemic lineages which they now estimate had emerged decades or even centuries before the pandemic was historically documented in Europe.

"You can't think of the plague as just a single bacterium," explains Poinar. "Context is hugely important, which is shown by our data and analysis."

To properly reconstruct the pandemics of our past, present and future, historical, ecological, environmental, social and cultural contexts are all equally significant.

Read more at Science Daily

Jan 19, 2023

Billions of celestial objects revealed in gargantuan survey of the Milky Way

Astronomers have released a gargantuan survey of the galactic plane of the Milky Way. The new dataset contains a staggering 3.32 billion celestial objects -- arguably the largest such catalog so far. The data for this unprecedented survey were taken with the Dark Energy Camera, built by the US Department of Energy, at the NSF's Cerro Tololo Inter-American Observatory in Chile, a Program of NOIRLab.

The Milky Way Galaxy contains hundreds of billions of stars, glimmering star-forming regions, and towering dark clouds of dust and gas. Imaging and cataloging these objects for study is a herculean task, but a newly released astronomical dataset known as the second data release of the Dark Energy Camera Plane Survey (DECaPS2) reveals a staggering number of these objects in unprecedented detail. The DECaPS2 survey, which took two years to complete and produced more than 10 terabytes of data from 21,400 individual exposures, identified approximately 3.32 billion objects -- arguably the largest such catalog compiled to date. Astronomers and the public can explore the dataset online.

This unprecedented collection was captured by the Dark Energy Camera (DECam) instrument on the Víctor M. Blanco 4-meter Telescope at Cerro Tololo Inter-American Observatory (CTIO), a Program of NSF's NOIRLab. CTIO is a constellation of international astronomical telescopes perched atop Cerro Tololo in Chile at an altitude of 2200 meters (7200 feet). CTIO's lofty vantage point gives astronomers an unrivaled view of the southern celestial hemisphere, which allowed DECam to capture the southern Galactic plane in such detail.

DECaPS2 is a survey of the plane of the Milky Way as seen from the southern sky taken at optical and near-infrared wavelengths. The first trove of data from DECaPS was released in 2017, and with the addition of the new data release, the survey now covers 6.5% of the night sky and spans a staggering 130 degrees in length. While it might sound modest, this equates to 13,000 times the angular area of the full Moon.
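The full-Moon comparison checks out with simple spherical-area arithmetic (the 0.5-degree Moon diameter below is a rounded assumed value):

```python
import math

# Total sky area: 4*pi steradians = 4 * 180^2 / pi ~ 41,253 square degrees.
FULL_SKY_SQ_DEG = 129600 / math.pi
survey_area = 0.065 * FULL_SKY_SQ_DEG  # DECaPS2 covers 6.5% of the sky

moon_diameter_deg = 0.5  # assumed round value for the Moon's angular diameter
moon_area = math.pi * (moon_diameter_deg / 2) ** 2

n_moons = survey_area / moon_area
print(f"Survey area: {survey_area:.0f} sq deg, about {n_moons:,.0f} full Moons")
```

The result lands close to the quoted figure of 13,000 full Moons.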

The DECaPS2 dataset is available to the entire scientific community and is hosted by NOIRLab's Astro Data Lab, which is part of the Community Science and Data Center. Interactive access to the imaging with panning/zooming inside of a web-browser is available from the Legacy Survey Viewer, the World Wide Telescope and Aladin.

Most of the stars and dust in the Milky Way are located in its disk -- the bright band stretching across this image -- in which the spiral arms lie. While this profusion of stars and dust makes for beautiful images, it also makes the Galactic plane challenging to observe. The dark tendrils of dust seen threading through this image absorb starlight and blot out fainter stars entirely, and the light from diffuse nebulae interferes with any attempts to measure the brightness of individual objects. Another challenge arises from the sheer number of stars, which can overlap in the image and make it difficult to disentangle individual stars from their neighbors.

Despite the challenges, astronomers delved into the Galactic plane to gain a better understanding of our Milky Way. By observing at near-infrared wavelengths, they were able to peer past much of the light-absorbing dust. The researchers also used an innovative data-processing approach, which allowed them to better predict the background behind each star. This helped to mitigate the effects of nebulae and crowded star fields on such large astronomical images, ensuring that the final catalog of processed data is more accurate.

"One of the main reasons for the success of DECaPS2 is that we simply pointed at a region with an extraordinarily high density of stars and were careful about identifying sources that appear nearly on top of each other," said Andrew Saydjari, a graduate student at Harvard University, researcher at the Center for Astrophysics | Harvard & Smithsonian and lead author of the paper. "Doing so allowed us to produce the largest such catalog ever from a single camera, in terms of the number of objects observed."

"When combined with images from Pan-STARRS 1, DECaPS2 completes a 360-degree panoramic view of the Milky Way's disk and additionally reaches much fainter stars," said Edward Schlafly, a researcher at the AURA-managed Space Telescope Science Institute and a co-author of the paper describing DECaPS2 published in the Astrophysical Journal Supplement. "With this new survey, we can map the three-dimensional structure of the Milky Way's stars and dust in unprecedented detail."

"Since my work on the Sloan Digital Sky Survey two decades ago, I have been looking for a way to make better measurements on top of complex backgrounds," said Douglas Finkbeiner, a professor at the Center for Astrophysics, co-author of the paper, and principal investigator behind the project. "This work has achieved that and more!"

"This is quite a technical feat. Imagine a group photo of over three billion people and every single individual is recognizable!" says Debra Fischer, division director of Astronomical Sciences at NSF. "Astronomers will be poring over this detailed portrait of more than three billion stars in the Milky Way for decades to come. This is a fantastic example of what partnerships across federal agencies can achieve."

Read more at Science Daily

Underlying assumptions of air quality need to be redefined

The 40-meter-high monitoring tower of the Innsbruck Atmospheric Observatory near the city center of Innsbruck, Austria, continuously provides data on the composition of the atmosphere near the surface. Every hour, 36,000 data points are recorded. Using a special measuring method -- the so-called eddy covariance method -- the concentration of air components can be continuously monitored. An international team led by Thomas Karl from the Department of Atmospheric and Cryospheric Sciences at the University of Innsbruck has now used these data to study the chemistry of ozone, nitrogen monoxide and nitrogen dioxide in urban areas in detail. The high proportion of diesel vehicles in European cities leads to strong concentrations of nitrogen monoxide. This reacts with ozone to produce nitrogen dioxide. In the atmosphere, nitrogen dioxide decomposes again to nitrogen monoxide and atomic oxygen, which immediately combines with atmospheric oxygen to form ozone.

Common assumption needs to be refined

This chemical cycle was described mathematically over 60 years ago in the first air pollution textbook by Philip Leighton. The relationship between the two processes has since been referred to as the Leighton ratio. Computer models of atmospheric chemistry use the Leighton ratio to minimize complexity by deriving the concentration of ozone, nitric oxide, and nitrogen dioxide from the concentrations of the other two. In practice, this has been used, for example, to derive ozone concentrations in areas polluted by nitrogen oxides. The Innsbruck atmospheric researchers' data now show that in the presence of high nitrogen monoxide emissions, the computational simplifications made by Leighton lead to incorrect results. Thomas Karl points out that "in cities with high nitrogen monoxide emissions, this ratio can be overestimated by up to 50 percent, which can lead to model calculations overestimating ground-level ozone concentrations in urban areas." The effect of chemistry-turbulence interactions plays a significant role in the lowest layer of the atmosphere, up to 200 meters above the ground.
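A minimal sketch of how models use the photostationary-state (Leighton) relation to derive one concentration from the other two. The rate values below are typical midday magnitudes, assumed here for illustration; the Innsbruck result is precisely that this shortcut misleads under strong turbulence and high NO emissions:

```python
# Photostationary-state (Leighton) relation: j_NO2 * [NO2] = k * [NO] * [O3],
# so any one of the three concentrations follows from the other two.
# Rate values are typical midday magnitudes, assumed for illustration.
J_NO2 = 8.0e-3     # NO2 photolysis frequency, 1/s
K_NO_O3 = 1.8e-14  # NO + O3 reaction rate constant, cm^3/(molecule*s)

def ozone_from_leighton(no2, no):
    """Derive [O3] (molecules/cm^3) from [NO2] and [NO]."""
    return (J_NO2 * no2) / (K_NO_O3 * no)

# With equal NO2 and NO concentrations, [O3] reduces to J_NO2 / K_NO_O3:
o3 = ozone_from_leighton(2.5e10, 2.5e10)
print(f"[O3] ~ {o3:.2e} molecules/cm^3")
```

The paper's argument is that turbulent mixing violates the steady-state assumption baked into this relation, so urban models relying on it can overestimate ground-level ozone.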

The effect studied in Innsbruck arises from the combination of strong turbulence in urban areas with high nitrogen monoxide emissions. The mixing of the gases, combined with the relatively rapid chemical processes, leads to more ozone being converted into nitrogen dioxide. The researchers' data also show that direct emissions of nitrogen dioxide from urban traffic are largely negligible compared to secondary formation. "It remains important to note that environmental regulations do not rely on model calculations but come into effect depending on actual measured pollutant concentrations," Thomas Karl emphasized.

Read more at Science Daily

At least half of Africa's rhinos are now in private hands; New paths for rhino conservation are needed

African rhino numbers are declining at unsustainable rates in core state-run parks, which is why more than half of the continent's remaining rhinos are now on private land.

Until the past decade, the largest population of rhinos was found in South Africa's Kruger National Park. This state-run park has, however, lost 76% and 68% of its white and black rhinos over the past decade, respectively. By contrast, the number of white rhinos on private land has steadily increased over the same decade, particularly in South Africa.

Private rhino owners now conserve at least half of the continent's remaining rhinos, and communal lands conserve a growing proportion as well.

In a new article published in Frontiers in Ecology and the Environment, scientists from the University of Helsinki in Finland and the Universities of Stellenbosch and Nelson Mandela in South Africa have compiled publicly available rhino population data for African countries where rhinos occur, disaggregated by state, private, and communal land types where possible. They consider the implications of an emerging shift in rhino conservation from state to private and communal lands, and chart a new path for rhino conservation.

"Private and communal landowners in several southern and East African countries can generate revenues from wildlife tourism, trophy hunting and trade in live animals, making it financially viable to use their land to conserve wildlife rather than for farming livestock," explains paper author Dr Hayley Clements. "The result has been that hundreds of landowners conserve rhinos on their properties."

But the cost-benefit ratio of conserving rhinos is changing, explains study co-author Dr Dave Balfour. "Accelerating poaching has meant private rhino owners now spend on average US$150,000 per year on security measures. This is far more than state parks are able to spend per rhino or per unit area conserved. Combined with the generally smaller areas occupied by private rhino populations (averaging 100 km2), which likely makes them easier to protect than places like Kruger (20,000 km2), this spending on security means private rhino populations have suffered lower poaching rates than some core state-run parks. But these rising security costs mean many landowners are not willing or able to continue conserving rhinos, with some choosing to sell their rhinos, often at a loss."

"It is important that future policy enables new incentives that compensate for rising security costs, encouraging rhino conservation on private and communal land," explains senior author Prof. Enrico Di Minin. "For example, could landowners that conserve rhinos in extensive systems qualify for a more favourable tax structure? Could they be eligible for carbon or the emerging biodiversity credits or rhino bonds, given the role of rhinos in carbon cycling? Could they receive certifications for extensive management that increase the value of their wildlife-based tourism and hunting offerings? These are crucial questions that need to be addressed in order to support more sustainable conservation strategies for rhinos," he continues.

"If additional incentives are not enabled, we risk losing private and communal rhino custodians, and with them, half of the remaining African rhinos," concludes Dr Clements.

Read more at Science Daily

Low-impact human recreation changes wildlife behavior

Even without hunting rifles, humans appear to have a strong negative influence on the movement of wildlife. A study of Glacier National Park hiking trails during and after a COVID-19 closure adds evidence to the theory that humans can create a "landscape of fear" like other apex predators, changing how species use an area simply with their presence.

Washington State University and National Park Service researchers found that when human hikers were present, 16 out of 22 mammal species, including predators and prey alike, changed where and when they accessed areas. Some completely abandoned places they previously used, others used them less frequently, and some shifted to more nocturnal activities to avoid humans.

"When the park was open to the public, and there were a lot of hikers and recreators using the area, we saw a bunch of changes in how animals were using that same area," said Daniel Thornton, WSU wildlife ecologist and senior author on the study published in the journal Scientific Reports. "The surprising thing is that there's no other real human disturbance out there because Glacier is such a highly protected national park, so these responses really are being driven by human presence and human noise."

The researchers had also expected to find an effect known as "human shielding," when human presence causes large predators to avoid an area, providing opportunity for smaller predators and perhaps some prey species to use an area more frequently. In this case, they found this potential effect for only one species, the red fox. The foxes were more present on and near trails when the park was open, perhaps because their competitors, coyotes, avoided those areas when humans were around.

Several species showed a decline in use of trail areas when the park was open, including black bear, elk and white-tailed deer. Many decreased their daytime activities, including mule deer, snowshoe hare, grizzly bears and coyotes. A few, including cougars, seemed indifferent to human presence.

While the influence of low-impact recreation is concerning, the researchers emphasized that more research is needed to determine if it has negative effects on the species' survival.

"This study does not say that hiking is necessarily bad for wildlife, but it does have some impacts on spatiotemporal ecology, or how wildlife uses a landscape and when," said Alissa Anderson, a recent WSU master's graduate and first author on the study. "Maybe they are not on the trails as much, but they're using different places, and how much does that actually impact species' ability to survive and thrive in a place, or not? There are a lot of questions about how this actually plays into population survival."

The study came about in part because of the pandemic. Both humans and wildlife like to use trails, so the researchers had set up an array of camera traps near several trails to study lynx populations in Glacier National Park when COVID-19 hit. In an effort to keep the virus from spreading to the nearby Blackfeet Indian Reservation, the eastern portion of the park was closed in 2020 with only minimal access allowed to administrators and researchers.

This allowed Anderson, Thornton and co-author John Waller of Glacier National Park to conduct a natural experiment. They captured images in summer of 2020 when the park was closed as well as in 2021 when it opened again.

Glacier, which covers nearly 1,600 square miles of northwestern Montana, sees more than 3 million human visitors a year. It is also home to a diverse range of animals, with almost the full complement of mammal species that have existed in the region historically.

Thornton said park managers are faced with a balancing act between conservation and public use missions.

Read more at Science Daily

Jan 18, 2023

The rich meteorology of Mars studied in detail from the Perseverance rover

Perseverance is a NASA autonomous vehicle that arrived at the Jezero Crater (the bed of an ancient, now dried-up lake on Mars) on 18 February 2021. The rover is equipped with seven novel, complex scientific instruments dedicated to exploring the planet's surface in search of signs of possible past life, collecting and depositing samples to be brought back to Earth, testing new technologies for use in human exploration, and studying the planet's atmosphere in detail. On the atmospheric front, the MEDA (Mars Environmental Dynamics Analyzer) instrument has been obtaining novel results. MEDA's lead researcher is José Antonio Rodríguez-Manfredi of the Centre for Astrobiology (CAB) in Madrid, and a team from the UPV/EHU's Planetary Sciences Research Group participated in its development. The instrument comprises a set of sensors that measure temperature, pressure, wind, humidity and the properties of the dust that is always present in suspension in the Mars atmosphere.

Perseverance has now completed its investigation of the atmosphere throughout the first Martian year (which lasts approximately two Earth years). A preview of the results, which appears on the cover, is published today in the January issue of the journal Nature Geoscience. Specifically, the UPV/EHU team, formed by Agustín Sánchez-Lavega, Ricardo Hueso, Teresa del Río-Gaztelurrutia and the PhD student Asier Munguira, has led the study of the seasonal and daily cycles of temperature and pressure, as well as their significant variations on other time scales resulting from very different processes.

Throughout the seasons, the average air temperature at the Jezero Crater, located near the planet's equator, is around minus 55 degrees Celsius, but varies greatly between day and night, with typical differences of around 50 to 60 degrees. In the middle of the day, the heating of the surface generates turbulent movements in the air as a result of the rise and fall of air masses (convection) which cease in the evening, when the air settles.

Pressure sensors, on the other hand, show in detail the seasonal change of the tenuous Martian atmosphere, produced by the melting and freezing of atmospheric carbon dioxide at the polar caps, as well as a complex, variable daily cycle modulated by thermal tides in the atmosphere. "The pressure and temperature of the Mars atmosphere oscillate with periods of the Martian solar day (somewhat longer than Earth's, averaging 24 hr 39.5 min) and with its submultiples, following the daily cycle of sunshine, greatly influenced by the amount of dust and the presence of clouds in the atmosphere," says Agustín Sánchez-Lavega, professor at the Faculty of Engineering -- Bilbao (EIB) and co-researcher on the Mars 2020 mission.
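The oscillations at the sol period and its submultiples can be extracted from a pressure record by projecting it onto harmonics of the solar day, which is how thermal tides are typically quantified. A minimal sketch on a synthetic pressure record; the mean pressure and tide amplitudes are illustrative assumptions, not MEDA measurements.

```python
import numpy as np

# Sketch of decomposing a daily pressure record into "thermal tide"
# harmonics: oscillations at the sol period and its submultiples
# (sol, sol/2, sol/3, ...). The data here are synthetic, not MEDA's.

SOL = 24 * 3600 + 39.5 * 60          # Martian solar day in seconds
t = np.linspace(0.0, SOL, 2000, endpoint=False)

# Synthetic pressure: mean + diurnal + semidiurnal tide (amplitudes in Pa)
p = (700.0
     + 12.0 * np.cos(2 * np.pi * t / SOL - 0.4)
     + 6.0 * np.cos(4 * np.pi * t / SOL + 1.1))

def harmonic_amplitude(signal, t, period, n):
    """Amplitude of the n-th harmonic (period/n) via Fourier projection."""
    w = 2 * np.pi * n / period
    a = 2 * np.mean(signal * np.cos(w * t))
    b = 2 * np.mean(signal * np.sin(w * t))
    return np.hypot(a, b)

a1 = harmonic_amplitude(p, t, SOL, 1)  # diurnal tide
a2 = harmonic_amplitude(p, t, SOL, 2)  # semidiurnal tide
print(f"diurnal: {a1:.2f} Pa, semidiurnal: {a2:.2f} Pa")
```

Fitting each harmonic independently like this is what lets researchers track how the relative strength of the diurnal and semidiurnal tides changes with dust load and cloud cover.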

Both sensors are also detecting dynamic phenomena in the atmosphere that occur in the vicinity of the rover, for example, those produced by the passage of whirlwinds known as "dust devils" because of the dust they sometimes kick up, or the generation of gravity waves whose origin is not yet well understood. "The dust devils are more abundant at Jezero than elsewhere on Mars, and can be very large, forming whirlwinds more than 100 metres in diameter. With MEDA we have been able to characterise not only their general aspects (size and abundance) but also to unravel how these whirlwinds function," says Ricardo Hueso, lecturer at the Faculty of Engineering -- Bilbao (EIB).

MEDA has also detected the presence of storms thousands of kilometres away, very similar in origin to terrestrial storms, as shown by images from orbiting satellites, which move along the edge of the north polar cap, formed by the deposition of carbon dioxide snow.

Within the rich variety of phenomena studied, MEDA has been able to characterise in detail the changes produced in the atmosphere by one of the dreaded dust storms, such as the one that developed in early January 2022. Its passage over the rover led to abrupt changes in temperature and pressure, accompanied by strong gusts of wind that kicked up dust, struck the instruments and damaged one of the wind sensors.

Read more at Science Daily

The dark cost of being toxic

An international research team including scientists from the Max Planck Institute for Chemical Ecology in Jena has discovered that the striking orange and black wings of monarch butterflies not only send the message to predators that these butterflies are highly toxic, but that the storage of toxins and development of the colourful wings come at a cost to the butterfly's body. The team reared monarch caterpillars on milkweed food plants with different levels of toxins. Monarchs that had ingested high levels of toxins from their food plants as caterpillars experienced high levels of oxidative damage after storing these toxins in their bodies, and were less conspicuous in their coloration. The study demonstrated experimentally that storing toxins is costly for insects that are highly specialized on their food plants.

Monarch butterflies (Danaus plexippus) feed on milkweeds of the genus Asclepias when they are caterpillars, storing the plants' cardenolide heart poisons in their bodies for their own defence. The combination of the toxins with the striking orange and black wings of the monarch is called aposematism (derived from the Greek terms apo = away and sema = signal). Hannah Rowland, head of the Max Planck Research Group on Predators and Toxic Prey at the Max Planck Institute for Chemical Ecology, explains: "Aposematism works because predators learn that eye-catching prey are best avoided. Predators learn faster when the visual signal is always the same. Bright orange means 'don't eat me'. But other scientists and I have repeatedly found that aposematic animals can have varying degrees of warning signal strength, and we wondered: what about pale orange, or deep orange? What does this mean, and what causes the difference?"

Rowland, together with her colleague Jonathan Blount from the University of Exeter, along with their international team of scientists, tested whether the storage of the plant's toxins is costly to the butterfly's body condition -- specifically, whether the storage of toxins causes oxidative stress, which happens when antioxidant levels are low. Because antioxidants can be used to make colourful pigments, they tested whether the amount of toxins in the monarch is related to its conspicuousness and its oxidative state.

The researchers reared monarch caterpillars on four different milkweeds of the genus Asclepias that have different toxin levels. With this, they were able to manipulate the amount of toxins ingested to subsequently measure concentrations of cardenolides, determine oxidative state, and compare the resulting wing coloration.

"Monarch butterflies that sequestered higher concentrations of cardenolides experienced higher levels of oxidative damage than those that sequestered lower concentrations. Our results are among the first to show a potential physiological mechanism of oxidative damage as a cost of sequestration for these insects," says Hannah Rowland. The scientists also found that the colour of the wings of male monarchs depended on how much cardenolides they sequestered, and how much oxidative damage this had resulted in. Males with the highest levels of oxidative damage showed decreasing colour intensity with increased toxin uptake, while males with the least oxidative damage were the most toxic and colour intense.

Read more at Science Daily

Inner ear has a need for speed

The sensory organs that allow us to walk, dance and turn our heads without dizziness or loss of balance contain specialized synapses that process signals faster than any other in the human body.

In a discovery more than 15 years in the making, a small group of neuroscientists, physicists and engineers from several institutions has unlocked the mechanism of the synapses, paving the way for research that could improve treatments for vertigo and balance disorders that affect as many as 1 in 3 Americans over age 40.

The new study in the Proceedings of the National Academy of Sciences describes the workings of "vestibular hair cell-calyx synapses," which are found in organs of the inner ear that sense head position and movements in different directions.

"Nobody fully understood how this synapse can be so fast, but we have shed light on the mystery," said Rob Raphael, a Rice University bioengineer who co-authored the study with the University of Chicago's Ruth Anne Eatock, the University of Illinois Chicago's Anna Lysakowski, current Rice graduate student Aravind Chenrayan Govindaraju and former Rice graduate student Imran Quraishi, now an assistant professor at Yale University.

Synapses are biological junctions where neurons can relay information to one another and other parts of the body. The human body contains hundreds of trillions of synapses, and almost all of them share information via quantal transmission, a form of chemical signaling via neurotransmitters that requires at least 0.5 milliseconds to send information across a synapse.

Prior experiments had shown a faster, "nonquantal" form of transmission occurs in vestibular hair cell-calyx synapses, the points where motion-sensing vestibular hair cells meet afferent neurons that connect directly to the brain. The new research explains how these synapses operate so quickly.

In each, a signal-receiving neuron surrounds the end of its partner hair cell with a large cuplike structure called a calyx. The calyx and hair cell remain separated by a tiny gap, or cleft, measuring just a few billionths of a meter.

"The vestibular calyx is a wonder of nature," Lysakowski said. "Its large cup-shaped structure is the only one of its kind in the entire nervous system. Structure and function are intimately related, and nature obviously devoted a great deal of energy to produce this structure. We've been trying to figure out its special purpose for a long time."

From the ion channels expressed in hair cells and their associated calyces, the authors created the first computational model capable of quantitatively describing the nonquantal transmission of signals across this nanoscale gap. Simulating nonquantal transmission allowed the team to investigate what happens throughout the synaptic cleft, which is more extensive in vestibular synapses than other synapses.

"The mechanism turns out to be quite subtle, with dynamic interactions giving rise to fast and slow forms of nonquantal transmission," Raphael said. "To understand all this, we made a biophysical model of the synapse based on its detailed anatomy and physiology."

The model simulates the voltage response of the calyx to mechanical and electrical stimuli, tracking the flow of potassium ions through low-voltage-activated ion channels from pre-synaptic hair cells to the post-synaptic calyx.

Raphael said the model accurately predicted changes in potassium in the synaptic cleft, providing key new insights about changes in electrical potential that are responsible for the fast component of nonquantal transmission; explained how nonquantal transmission alone could trigger action potentials in the post-synaptic neuron; and showed how both fast and slow transmission depend on the close and extensive cup formed by the calyx on the hair cell.

Eatock said, "The key capability was the ability to predict the potassium level and electrical potential at every location within the cleft. This allowed the team to illustrate that the size and speed of nonquantal transmission depend on the novel structure of the calyx. The study demonstrates the power of engineering approaches to elucidate fundamental biological mechanisms, one of the important but sometimes overlooked goals of bioengineering research."

Quraishi began constructing the model and collaborating with Eatock in the mid-2000s when he was a graduate student in Raphael's research group and she was on the faculty of Baylor College of Medicine, just a few blocks from Rice in Houston's Texas Medical Center.

His first version of the model captured important features of the synapse, but he said "our knowledge of the specific potassium channels and other components that make up the model was too limited to claim it was entirely accurate."

Since then, Eatock, Lysakowski and others discovered ion channels in the calyx that transformed scientists' understanding of how ionic currents flow across hair cell and calyx membranes.

Quraishi said, "The unfinished work had weighed on me," and he was both relieved and excited when Govindaraju, a Ph.D. student in applied physics, joined Raphael's lab and resumed work on the model in 2018.

"By the time I started on the project, more data supported nonquantal transmission," Govindaraju said. "But the mechanism, especially that of fast transmission, was unclear. Building the model has given us a better understanding of the interplay and purpose of different ion channels, the calyx structure and dynamic changes in potassium and electric potential in the synaptic cleft."

Raphael said, "One of my very first grants was to develop a model of ion transport in the inner ear. It is always satisfying to achieve a unified mathematical model of a complex physiological process. For the past 30 years -- since the original observation of nonquantal transmission -- scientists have wondered, 'Why is this synapse so fast?' and, 'Is the transmission speed related to the unique calyx structure?' We have provided answers to both questions."

He said the link between the structure and function of the calyx "is an example of how evolution drives morphological specialization. A compelling argument can be made that once animals emerged from the sea and began to move on land, swing in trees and fly, there were increased demands on the vestibular system to rapidly inform the brain about the position of the head in space. And at this point the calyx appeared."

Raphael said the model opens the door for a deeper exploration of information processing in vestibular synapses, including research into the unique interactions between quantal and nonquantal transmission.

He said the model could also be a powerful tool for researchers who study electrical transmission in other parts of the nervous system, and he hopes it will aid those who design vestibular implants, neuroprosthetic devices that can restore function to those who have lost their balance.

Read more at Science Daily

Global warming reaches central Greenland

At high elevations of the Greenland Ice Sheet, the years 2001 to 2011 were 1.5 °C warmer than in the 20th century and represent the warmest decade in the last thousand years.

A temperature reconstruction from ice cores of the past 1,000 years reveals that today's warming in central-north Greenland is surprisingly pronounced. The most recent decade surveyed in a study, the years 2001 to 2011, was the warmest in the past 1,000 years, and the region is now 1.5 °C warmer than during the 20th century, as researchers led by the Alfred Wegener Institute just report in the journal Nature. Using a set of ice cores unprecedented in length and quality, they reconstructed past temperatures in central-north Greenland and melting rates of the ice sheet.

The Greenland Ice Sheet plays a pivotal part in the global climate system. With enormous amounts of water stored in the ice (about 3 million cubic kilometres), melting and the resulting sea-level rise are considered a potential tipping point. For unmitigated global emissions rates ('business as usual'), the ice sheet is projected to contribute up to 50 centimetres to global mean sea-level by 2100. Weather stations along the coast have been recording rising temperatures for many years. But the influence of global warming on the parts of the ice sheet at elevations of up to 3,000 m has remained unclear due to the lack of long-term observations. In a study now published in Nature, experts from the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research (AWI) present clear evidence that the effects of global warming have reached the remote, high-elevation areas of central-north Greenland.

"The time series we recovered from ice cores now continuously covers more than 1,000 years, from year 1000 to 2011. This data shows that the warming in 2001 to 2011 clearly differs from natural variations during the past 1,000 years. Although grimly expected in the light of global warming, we were surprised by how evident this difference really was," says AWI glaciologist Dr Maria Hörhold, lead author of the study. Together with colleagues from AWI and the University of Copenhagen's Niels Bohr Institute, she analysed the isotope composition in shallow ice cores gathered in central-north Greenland during dedicated AWI expeditions.

Previous ice cores obtained at co-located sites starting in the 1990s did not indicate clear warming in central-north Greenland, despite rising global mean temperatures. Part of the reason is substantial natural climate variability in the region.

The AWI researchers have now extended the previous datasets up to winter 2011/2012 through a dedicated redrilling effort, recovering time series of unprecedented length and quality. The temperatures were reconstructed using a single consistent method for the entire record: measuring the concentrations of stable oxygen isotopes within the ice, which vary with the temperatures prevailing at the time of ice formation. Previous studies had to draw on a range of different climate archives and combine the results to reconstruct temperature, introducing much larger uncertainties in the assessment of natural variability.
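The principle behind isotope thermometry is a roughly linear relationship between the ice's stable oxygen isotope ratio (delta-18O) and the temperature at which the snow formed. A minimal sketch of that conversion; the calibration slope and delta-18O values below are illustrative assumptions (a commonly quoted Greenland magnitude), not the calibration or measurements from the study.

```python
# Sketch of isotope thermometry: the delta-18O of ice (in permil) varies
# roughly linearly with the temperature at which the snow formed, so a
# delta-18O anomaly maps to a temperature anomaly via a calibration slope.
# The slope and sample values here are illustrative, not from the study.

SLOPE = 0.67  # permil per deg C, an often-quoted Greenland magnitude

def temp_anomaly(d18o_sample: float, d18o_reference: float) -> float:
    """Temperature anomaly (deg C) implied by a delta-18O anomaly."""
    return (d18o_sample - d18o_reference) / SLOPE

# A +1.0 permil enrichment relative to a reference mean would imply
# roughly +1.5 deg C of warming under this illustrative calibration.
dt = temp_anomaly(-34.0, -35.0)
print(f"implied warming: {dt:.2f} deg C")
```

Using one such calibration consistently across the whole record, as the AWI team did, avoids the extra uncertainty introduced when temperatures must be stitched together from several different climate archives.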

In addition to the temperature, the team reconstructed the melt production of the ice sheet. Melting has increased substantially in Greenland since the 2000s and now significantly contributes to global sea-level rise. "We were amazed to see how closely temperatures inland are connected to Greenland-wide meltwater drainage -- which, after all, occurs in low-elevation areas along the rim of the ice sheet near the coast," says Maria Hörhold.

In order to quantify this connection between temperatures in high-elevation parts and melting along the edges of the ice sheet, the authors used data from a regional climate model for the years 1871 to 2011 and satellite observations of ice-mass changes for the years 2002 to 2021 from the GRACE/GRACE-FO gravimetry missions. This allowed them to convert the temperature variations identified in the ice cores into melting rates and provide estimates for the past 1,000 years. This represents an important dataset for climate research: a better understanding of the ice sheet's past melt dynamics improves projections of the related future sea-level rise, and reducing uncertainties in those projections is one step toward optimizing adaptation measures.

Read more at Science Daily

Jan 17, 2023

Hubble finds hungry black hole twisting captured star into donut shape

Black holes are gatherers, not hunters. They lie in wait until a hapless star wanders by. When the star gets close enough, the black hole's gravitational grasp violently rips it apart and sloppily devours its gases while belching out intense radiation.

Astronomers using NASA's Hubble Space Telescope have recorded a star's final moments in detail as it gets gobbled up by a black hole.

These are termed "tidal disruption events." But the wording belies the complex, raw violence of a black hole encounter. There is a balance between the black hole's gravity pulling in star stuff, and radiation blowing material out. In other words, black holes are messy eaters. Astronomers are using Hubble to find out the details of what happens when a wayward star plunges into the gravitational abyss.

Hubble can't photograph the AT2022dsb tidal event's mayhem up close, since the munched-up star is nearly 300 million light-years away at the core of the galaxy ESO 583-G004. But astronomers used Hubble's powerful ultraviolet sensitivity to study the light from the shredded star, which includes hydrogen, carbon, and more. The spectroscopy provides forensic clues to the black hole homicide.

About 100 tidal disruption events around black holes have been detected by astronomers using various telescopes. NASA recently reported that several of its high-energy space observatories spotted another black hole tidal disruption event on March 1, 2021, in another galaxy. Unlike the Hubble observations, the data were collected in X-ray light from an extremely hot corona around the black hole that formed after the star was already torn apart.

"However, there are still very few tidal events that are observed in ultraviolet light given the observing time. This is really unfortunate because there's a lot of information that you can get from the ultraviolet spectra," said Emily Engelthaler of the Center for Astrophysics | Harvard & Smithsonian (CfA) in Cambridge, Massachusetts. "We're excited because we can get these details about what the debris is doing. The tidal event can tell us a lot about a black hole." Changes in the doomed star's condition are taking place on the order of days or months.

For any given galaxy with a quiescent supermassive black hole at the center, it's estimated that the stellar shredding happens only a few times in every 100,000 years.

This AT2022dsb stellar snacking event was first caught on March 1, 2022 by the All-Sky Automated Survey for Supernovae (ASAS-SN or "Assassin"), a network of ground-based telescopes that surveys the extragalactic sky roughly once a week for violent, variable, and transient events that are shaping our universe. This energetic collision was close enough to Earth and bright enough for the Hubble astronomers to do ultraviolet spectroscopy over a longer than normal period of time.

"Typically, these events are hard to observe. You get maybe a few observations at the beginning of the disruption when it's really bright. Our program is different in that it is designed to look at a few tidal events over a year to see what happens," said Peter Maksym of the CfA. "We saw this early enough that we could observe it at these very intense black hole accretion stages. We saw the accretion rate drop as it turned to a trickle over time."

The Hubble spectroscopic data are interpreted as coming from a very bright, hot, donut-shaped area of gas that was once the star. This area, known as a torus, is the size of the solar system and is swirling around a black hole in the middle.

Read more at Science Daily

Salmonella exposure a risk for colon cancer

A new study published in the journal Cell Reports Medicine links exposure to salmonella bacteria to colon cancer risk.

The researchers, including a team led by Jun Sun from the University of Illinois Chicago, studied human colon cancer tissue samples and animal models and found that exposure to salmonella was linked with colon cancers that developed earlier and grew larger.

The study authors first looked at data from a Netherlands-based retrospective study of colon cancer patients, which found that tissue samples taken during routine colon cancer surgery that contained salmonella antibodies tended to come from people who had worse colon cancer outcomes.

Using salmonella strains isolated from these tissue samples, Sun and her U.S.-based team studied mice with colon cancer that had been exposed to the bacteria. They observed accelerated tumor growth and larger tumors in mice with salmonella exposure. They also saw increased translocation of salmonella to the tumors.

"During infection, salmonella hijacks essential host signaling pathways, and these molecular manipulations may cause oncogenic transformation. The current study tells us that more research is needed into the connection between salmonella exposure and colon cancer risk in the USA, and that simply by practicing safe food preparation, we can potentially help to protect ourselves," said Sun, UIC professor of medicine.

Sun's collaborators in the Netherlands also studied the bacteria in vitro. They combined human cancer cells and pre-cancer cells with the salmonella strain in the lab and measured any growth or changes in the tumor. They saw that even one infection caused transformation and that each salmonella infection exponentially increased the rate of cell transformation.

"The mouse and tissue culture experiments show that salmonella infection had a chronic effect, accelerating tumor growth," said Sun, who also is a member of the University of Illinois Cancer Center at UIC. "This evidence tells us that we need to look closer at salmonella exposure as an environmental risk factor for chronic diseases, such as colon cancer."

Read more at Science Daily

Climate change likely to uproot more Amazon trees

Tropical forests are crucial for sucking up carbon dioxide from the atmosphere. But they're also subject to intense storms that can cause "windthrow" -- the uprooting or breaking of trees. These downed trees decompose, potentially turning a forest from a carbon sink into a carbon source.

A new study finds that more extreme thunderstorms driven by climate change will likely cause a greater number of large windthrow events in the Amazon rainforest. It is one of the first studies to establish a link between storm conditions in the atmosphere and forest mortality on land, helping fill a major gap in models.

"Building this link between atmospheric dynamics and damage at the surface is very important across the board," said Jeff Chambers, a senior faculty scientist at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab), and director of the Next Generation Ecosystem Experiments (NGEE)-Tropics project, which performed the research. "It's not just for the tropics. It's high-latitude, low-latitude, temperate-latitude, here in the U.S."

Researchers found that the Amazon will likely experience 43% more large blowdown events (of 25,000 square meters or more) by the end of the century. The area of the Amazon likely to see extreme storms that trigger large windthrows will also increase by about 50%. The study was published in the journal Nature Communications on Jan. 6.

"We want to know what these extreme storms and windthrows mean in terms of the carbon budget and carbon dynamics, and for carbon sinks in the forests," Chambers said. While downed trees slowly release carbon as they decompose, the open forest becomes host to new plants that pull carbon dioxide from the air. "It's a complicated system, and there are still a lot of pieces of the puzzle that we're working on. In order to answer the question more quantitatively, we need to build out the land-atmosphere links in Earth system models."

To find the link between air and land, researchers compared a map of more than 1,000 large windthrows with atmospheric data. They found that a measurement known as CAPE, the "convective available potential energy," was a good predictor of major blowdowns. CAPE measures the amount of energy available to move parcels of air vertically, and a high value of CAPE often leads to thunderstorms. More extreme storms can come with intense vertical winds, heavy rains or hail, and lightning, which interact with trees from the canopy down to the soil.
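The idea of CAPE as a predictor can be illustrated with a toy threshold check. The cutoff below is a made-up placeholder, not a value from the study, which fits its own relationship between CAPE and observed blowdowns:

```python
# Hypothetical sketch: flag grid cells whose convective available potential
# energy (CAPE, in J/kg) exceeds a threshold, as a crude proxy for
# thunderstorms strong enough to trigger windthrow. The 1000 J/kg cutoff
# is an illustrative assumption, not the study's fitted value.

CAPE_THRESHOLD = 1000.0  # J/kg, assumed

def windthrow_prone_fraction(cape_values, threshold=CAPE_THRESHOLD):
    """Fraction of grid cells whose CAPE exceeds the threshold."""
    flagged = sum(1 for c in cape_values if c > threshold)
    return flagged / len(cape_values)

# Toy CAPE values for four grid cells:
print(windthrow_prone_fraction([250.0, 800.0, 1500.0, 2200.0]))  # 0.5
```

In the actual study the relationship is statistical rather than a hard cutoff, but the sketch shows why a projected rise in high-CAPE conditions translates into more area at risk of large blowdowns.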

"Storms account for over half of the forest mortality in the Amazon," said Yanlei Feng, first author on the paper. "Climate change has a lot of impact on Amazon forests, but so far, a large fraction of the research focus has been on drought and fire. We hope our research brings more attention to extreme storms and improves our models to work under a changing environment from climate change."

While this study looked at a future with high carbon emissions (a scenario known as SSP-585), scientists could use projected CAPE data to explore windthrow impacts in different emissions scenarios. Researchers are now working to integrate the new forest-storm relationship into Earth system models. Better models will help scientists explore how forests will respond to a warmer future -- and whether they can continue to siphon carbon out of the atmosphere or will instead become a contributor.

"This was a very impactful climate change study for me," said Feng, who completed the research as a graduate student researcher in the NGEE-Tropics project at Berkeley Lab. She now studies carbon capture and storage at the Carnegie Institution for Science at Stanford University. "I'm worried about the projected increase in forest disturbances in our study and I hope I can help limit climate change. So now I'm working on climate change solutions."

Read more at Science Daily

Vitamin D benefits and metabolism may depend on body weight

Researchers from Brigham and Women's Hospital, a founding member of the Mass General Brigham healthcare system, have found new evidence that vitamin D may be metabolized differently in people with an elevated body mass index (BMI). The study, appearing in JAMA Network Open, is a new analysis of data from the VITAL trial, a large nationwide clinical trial led by Brigham researchers that investigated whether taking vitamin D or marine omega-3 supplements could reduce the risk of developing cancer, heart disease, or stroke.

"The analysis of the original VITAL data found that vitamin D supplementation correlated with positive effects on several health outcomes, but only among people with a BMI under 25," said first author Deirdre K. Tobias, ScD, an associate epidemiologist in Brigham's Division of Preventive Medicine. "There seems to be something different happening with vitamin D metabolism at higher body weights, and this study may help explain diminished outcomes of supplementation for individuals with an elevated BMI."

Vitamin D is an essential nutrient involved in many biological processes, most notably helping our body absorb minerals, such as calcium and magnesium. While some of the vitamin D we need is made in the body from sunlight, vitamin D deficiencies are often treated with supplementation. Evidence from laboratory studies, epidemiologic research and clinical research has also suggested that vitamin D may play a role in the incidence and progression of cancer and cardiovascular disease, and it was this evidence that prompted the original VITAL trial.

The VITAL trial was a randomized, double-blind, placebo-controlled trial in 25,871 U.S. participants, which included men over the age of 50 and women over the age of 55. All participants were free of cancer and cardiovascular disease at the time of enrollment. While the trial found little benefit of vitamin D supplementation for preventing cancer, heart attack, or stroke in the overall cohort, the effect of supplementation on cancer incidence, cancer mortality, and autoimmune disease incidence appeared to vary with BMI. Other studies suggest similar results for type 2 diabetes.

The new study aimed to investigate this correlation. The researchers analyzed data from 16,515 participants from the original trial who provided blood samples at baseline (before randomization to vitamin D), as well as 2,742 with a follow-up blood sample taken after two years. The researchers measured the levels of total and free vitamin D, as well as many other novel biomarkers for vitamin D, such as its metabolites, calcium, and parathyroid hormone, which helps the body utilize vitamin D.

"Most studies like this focus on the total vitamin D blood level," said senior author JoAnn E. Manson, MD, DrPH, chief of the Division of Preventive Medicine at the Brigham and principal investigator of VITAL. "The fact that we were able to look at this expanded profile of vitamin D metabolites and novel biomarkers gave us unique insights into vitamin D availability and activity, and whether vitamin D metabolism might be disrupted in some people but not in others."

The researchers found that vitamin D supplementation increased most of the biomarkers associated with vitamin D metabolism in people, regardless of their weight. However, these increases were significantly smaller in people with elevated BMIs.

"We observed striking differences after two years, indicating a blunted response to vitamin D supplementation with higher BMI," Tobias said. "This may have implications clinically and potentially explain some of the observed differences in the effectiveness of vitamin D supplementation by obesity status."

"This study sheds light on why we're seeing 30-40 percent reductions in cancer deaths, autoimmune diseases, and other outcomes with vitamin D supplementation among those with lower BMIs but minimal benefit in those with higher BMIs, suggesting it may be possible to achieve benefits across the population with more personalized dosing of vitamin D," said Manson. "These nuances make it clear that there's more to the vitamin D story."

Read more at Science Daily

Jan 16, 2023

A star's unexpected survival

Hundreds of millions of light-years away in a distant galaxy, a star orbiting a supermassive black hole is being violently ripped apart under the black hole's immense gravitational pull. As the star is shredded, its remnants are transformed into a stream of debris that rains back down onto the black hole to form a very hot, very bright disk of material swirling around the black hole, called an accretion disk. This phenomenon -- where a star is destroyed by a supermassive black hole and fuels a luminous accretion flare -- is known as a tidal disruption event (TDE), and it is predicted that TDEs occur roughly once every 10,000 to 100,000 years in a given galaxy.

With luminosities exceeding entire galaxies (i.e., billions of times brighter than our Sun) for brief periods of time (months to years), accretion events enable astrophysicists to study supermassive black holes (SMBHs) from cosmological distances, providing a window into the central regions of otherwise-quiescent -- or dormant -- galaxies. By probing these "strong-gravity" events, where Einstein's general theory of relativity is critical for determining how matter behaves, TDEs yield information about one of the most extreme environments in the universe: the event horizon -- the point of no return -- of a black hole.

TDEs are usually "one-and-done" because the extreme gravitational field of the SMBH destroys the star, meaning that the SMBH fades back into darkness following the accretion flare. In some instances, however, the high-density core of the star can survive the gravitational interaction with the SMBH, allowing it to orbit the black hole more than once. Researchers call this a repeating partial TDE.

A team of physicists, including lead author Thomas Wevers, Fellow of the European Southern Observatory, and co-authors Eric Coughlin, assistant professor of physics at Syracuse University, and Dheeraj R. "DJ" Pasham, research scientist at MIT's Kavli Institute for Astrophysics and Space Research, have proposed a model for a repeating partial TDE. Their findings, published in Astrophysical Journal Letters, describe the capture of the star by a SMBH, the stripping of the material each time the star comes close to the black hole, and the delay between when the material is stripped and when it feeds the black hole again. The team's work is the first to develop and use a detailed model of a repeating partial TDE to explain the observations, make predictions about the orbital properties of a star in a distant galaxy, and understand the partial tidal disruption process.

The team is studying a TDE known as AT2018fyk (AT stands for "Astrophysical Transient"). The star was captured by the SMBH through an exchange process known as "Hills capture," in which the star was originally part of a binary system (two stars that orbit one another under their mutual gravitational attraction) that was ripped apart by the gravitational field of the black hole. The other (non-captured) star was ejected from the center of the galaxy at a speed of roughly 1,000 km/s, becoming what is known as a hypervelocity star.

Once bound to the SMBH, the star powering the emission from AT2018fyk has been repeatedly stripped of its outer envelope each time it passes through its point of closest approach with the black hole. The stripped outer layers of the star form the bright accretion disk, which researchers can study using X-ray and ultraviolet/optical telescopes that observe light from distant galaxies.

According to Wevers, having the opportunity to study a partial TDE gives unprecedented insight into the existence of supermassive black holes and the orbital dynamics of stars in the centers of galaxies.

"Until now, the assumption has been that when we see the aftermath of a close encounter between a star and a supermassive black hole, the outcome will be fatal for the star, that is, the star is completely destroyed," he says. "But contrary to all other TDEs we know of, when we pointed our telescopes to the same location again several years later, we found that it had re-brightened again. This led us to propose that rather than being fatal, part of the star survived the initial encounter and returned to the same location to be stripped of material once more, explaining the re-brightening phase."

First detected in 2018, AT2018fyk was initially perceived as an ordinary TDE. For approximately 600 days the source stayed bright in the X-ray, but then abruptly went dark and was undetectable -- a result of the stellar remnant core returning to the black hole, explains MIT physicist Dheeraj R. Pasham.

"When the core returns to the black hole it essentially steals all the gas away from the black hole via gravity and as a result there is no matter to accrete and hence the system goes dark," Pasham says.

It wasn't immediately clear what caused the precipitous decline in the luminosity of AT2018fyk, because TDEs normally decay smoothly and gradually -- not abruptly -- in their emission. But around 600 days after the drop, the source was again found to be X-ray bright. This led the researchers to propose that the star survived its close encounter with the SMBH the first time and was in orbit about the black hole.

Using detailed modeling, the team's findings suggest that the orbital period of the star about the black hole is roughly 1,200 days, and it takes approximately 600 days for the material that is shed from the star to return to the black hole and start accreting. Their model also constrained the size of the captured star, which they believe was about the size of the sun. As for the original binary, the team believes the two stars were extremely close to one another before being ripped apart by the black hole, likely orbiting each other every few days.

So how could a star survive its brush with death? It all comes down to a matter of proximity and trajectory. If the star collided head-on with the black hole and passed the event horizon -- the threshold where the speed needed to escape the black hole surpasses the speed of light -- the star would be consumed by the black hole. If the star passed very close to the black hole and crossed the so-called "tidal radius" -- where the tidal force of the hole is stronger than the gravitational force that keeps the star together -- it would be destroyed. In the model they have proposed, the star's orbit reaches a point of closest approach that is just outside of the tidal radius, but doesn't cross it completely: some of the material at the stellar surface is stripped by the black hole, but the material at its center remains intact.

How, or if, the process of the star orbiting the SMBH can occur over many repeated passages is a theoretical question that the team plans to investigate with future simulations. Syracuse physicist Eric Coughlin explains that they estimate between 1 to 10% of the mass of the star is lost each time it passes the black hole, with the large range due to uncertainty in modeling the emission from the TDE.

"If the mass loss is only at the 1% level, then we expect the star to survive for many more encounters, whereas if it is closer to 10%, the star may have already been destroyed," notes Coughlin.

The team will keep their eyes to the sky in the coming years to test their predictions. Based on their model, they forecast that the source will abruptly disappear around March 2023 and brighten again when the freshly stripped material accretes onto the black hole in 2025.
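The forecast follows from simple bookkeeping of the two timescales reported in the article. The reference pericenter date below is a hypothetical placeholder, not the paper's fitted epoch; only the ~1,200-day period and ~600-day fallback delay come from the text above.

```python
from datetime import date, timedelta

# Sketch of the repeating-flare timeline implied by the model. Each pericenter
# passage strips material that begins accreting ~600 days later; the star
# returns every ~1,200 days. The epoch below is an assumed placeholder near
# the 2018 discovery.
pericenter0 = date(2018, 9, 1)   # hypothetical reference epoch
period = timedelta(days=1200)    # star's orbital period about the SMBH
fallback = timedelta(days=600)   # delay before stripped gas starts accreting

for k in range(3):
    peri = pericenter0 + k * period
    print(f"passage {k}: pericenter {peri}, accretion begins {peri + fallback}")
```

With any plausible 2018 epoch, the third accretion window lands in 2025, matching the team's prediction of a re-brightening after the March 2023 shutoff.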

The team says their study offers a new way forward for tracking and monitoring follow-up sources that have been detected in the past. The work also suggests a new paradigm for the origin of repeating flares from the centers of external galaxies.

"In the future, it is likely that more systems will be checked for late-time flares, especially now that this project puts forth a theoretical picture of the capture of the star through a dynamical exchange process and the ensuing repeated partial tidal disruption," says Coughlin. "We're hopeful this model can be used to infer the properties of distant supermassive black holes and gain an understanding of their "demographics," being the number of black holes within a given mass range, which is otherwise difficult to achieve directly."

The team says the model also makes several testable predictions about the tidal disruption process, and with more observations of systems like AT2018fyk, it should give insight into the physics of partial tidal disruption events and the extreme environments around supermassive black holes.

Read more at Science Daily

20,000 premature US deaths caused by human-ignited fires each year

Over 80% of premature deaths caused by small smoke particles in the United States result directly from human-ignited fires. This is the outcome of a study published today in IOP Publishing's journal Environmental Research Letters.

The new study, led by researchers at the Massachusetts Institute of Technology, analyses the impact of smoke particles on air quality in the United States. Their research shows that human-ignited fires account for more than 67% of small smoke particles called PM2.5 in the United States. These particles are known to degrade air quality, causing respiratory illnesses and premature death.

The level of fire activity in the US is on the rise. The research team estimates that smoke from human-ignited fires -- such as agricultural burning and other human-lit fires -- was responsible for 20,000 premature deaths in 2018 alone, a year with a high frequency of fire events. This is 270% more than in 2003, when fire activity was low. The research highlights that during high fire activity years, there are much higher concentrations of smoke PM2.5 in the air.

Dr Therese Carter, lead author of the study, said: "Fires not only threaten human lives, infrastructure, and ecosystems, but they are also a major cause for concern in terms of air quality. High levels of smoke exposure can negatively impact human health resulting in conditions such as respiratory infections, lung cancer, heart disease and even premature births. Our results show that a large and significant portion of harmful smoke particles result directly from human-lit fires."

The team used the Global Fire Emissions Database to quantify agricultural fire emissions and to classify fires into two categories: human versus natural ignition. Applying a chemical transport model, they then simulated the concentration of smoke particles across the United States, concluding that a significant portion of PM2.5 in the US results from human-ignited fires and thus has the potential to be managed.

To limit the devastating effects of pollution from small smoke particles, the team recommends an ignition-focused approach. State agencies can implement management plans to restrict the ignition of agricultural fires to periods when weather conditions would minimise health impacts. However, human-ignited wildfires are much harder to manage due to their sporadic and unplanned nature.

Read more at Science Daily

NASA says 2022 fifth warmest year on record, warming trend continues

Earth's average surface temperature in 2022 tied with 2015 as the fifth warmest on record, according to an analysis by NASA. Continuing the planet's long-term warming trend, global temperatures in 2022 were 1.6 degrees Fahrenheit (0.89 degrees Celsius) above the average for NASA's baseline period (1951-1980), scientists from NASA's Goddard Institute for Space Studies (GISS) in New York reported.

"This warming trend is alarming," said NASA Administrator Bill Nelson. "Our warming climate is already making a mark: Forest fires are intensifying; hurricanes are getting stronger; droughts are wreaking havoc and sea levels are rising. NASA is deepening our commitment to do our part in addressing climate change. Our Earth System Observatory will provide state-of-the-art data to support our climate modeling, analysis and predictions to help humanity confront our planet's changing climate."

The past nine years have been the warmest since modern recordkeeping began in 1880, and Earth in 2022 was about 2 degrees Fahrenheit (about 1.11 degrees Celsius) warmer than the late-19th-century average.
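The two unit systems in those figures are consistent: temperature differences convert by a factor of 5/9, with no 32-degree offset (that offset applies only to absolute temperatures). A quick check:

```python
def delta_f_to_c(delta_f):
    """Convert a temperature *difference* from Fahrenheit to Celsius.

    Unlike absolute temperatures, differences need no 32-degree offset.
    """
    return delta_f * 5.0 / 9.0

print(round(delta_f_to_c(1.6), 2))  # 0.89 C: the 2022 anomaly vs. 1951-1980
print(round(delta_f_to_c(2.0), 2))  # 1.11 C: vs. the late-19th-century average
```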

"The reason for the warming trend is that human activities continue to pump enormous amounts of greenhouse gases into the atmosphere, and the long-term planetary impacts will also continue," said Gavin Schmidt, director of GISS, NASA's leading center for climate modeling.

Human-driven greenhouse gas emissions have rebounded following a short-lived dip in 2020 due to the COVID-19 pandemic. Recently, NASA scientists, as well as international scientists, determined carbon dioxide emissions were the highest on record in 2022. NASA also identified some super-emitters of methane -- another powerful greenhouse gas -- using the Earth Surface Mineral Dust Source Investigation instrument that launched to the International Space Station last year.

The Arctic region continues to experience the strongest warming trends -- close to four times the global average -- according to GISS research presented at the 2022 annual meeting of the American Geophysical Union, as well as a separate study.

Communities around the world are experiencing impacts scientists see as connected to the warming atmosphere and ocean. Climate change has intensified rainfall and tropical storms, deepened the severity of droughts, and increased the impact of storm surges. Last year brought torrential monsoon rains that devastated Pakistan and a persistent megadrought in the U.S. Southwest. In September, Hurricane Ian became one of the strongest and costliest hurricanes to strike the continental U.S.

Tracking Our Changing Planet

NASA's global temperature analysis is drawn from data collected by weather stations and Antarctic research stations, as well as instruments mounted on ships and ocean buoys. NASA scientists analyze these measurements to account for uncertainties in the data and to maintain consistent methods for calculating global average surface temperature differences for every year. These ground-based measurements of surface temperature are consistent with satellite data collected since 2002 by the Atmospheric Infrared Sounder on NASA's Aqua satellite and with other estimates.

NASA uses the period from 1951-1980 as a baseline to understand how global temperatures change over time. That baseline includes climate patterns such as La Niña and El Niño, as well as unusually hot or cold years due to other factors, ensuring it encompasses natural variations in Earth's temperature.

Many factors can affect the average temperature in any given year. For example, 2022 was one of the warmest on record despite a third consecutive year of La Niña conditions in the tropical Pacific Ocean. NASA scientists estimate that La Niña's cooling influence may have lowered global temperatures slightly (about 0.11 degrees Fahrenheit or 0.06 degrees Celsius) from what the average would have been under more typical ocean conditions.

A separate, independent analysis by the National Oceanic and Atmospheric Administration (NOAA) concluded that the global surface temperature for 2022 was the sixth highest since 1880. NOAA scientists use much of the same raw temperature data in their analysis but a different baseline period (1901-2000) and methodology. Although rankings for specific years can differ slightly between the records, they are in broad agreement and both reflect ongoing long-term warming.

Read more at Science Daily

Ten-minute scan enables detection and cure of the commonest cause of high blood pressure

Doctors at Queen Mary University of London and Barts Hospital, and Cambridge University Hospital, have led research using a new type of CT scan to light up tiny nodules in a hormone gland and cure high blood pressure by their removal. Such nodules are found in one in twenty people with high blood pressure.

Published today in Nature Medicine, the research solves a 60-year problem of how to detect the hormone producing nodules without a difficult catheter study that is available in only a handful of hospitals, and often fails. The research also found that, when combined with a urine test, the scan detects a group of patients who come off all their blood pressure medicines after treatment.

In total, 128 people participated in the study of the new scan after doctors found that their hypertension (high blood pressure) was caused by a steroid hormone, aldosterone. The scan found that in two thirds of patients with elevated aldosterone secretion, the excess hormone was coming from a benign nodule in just one of the adrenal glands, which could then be safely removed. The scan uses a very short-acting dose of metomidate, a radioactive dye that sticks only to the aldosterone-producing nodule. It was as accurate as the old catheter test, but quick, painless and technically successful in every patient. Until now, the catheter test was unable to predict which patients would be completely cured of hypertension by surgical removal of the gland. By contrast, the combination of a 'hot nodule' on the scan and a urine steroid test detected 18 of the 24 patients who achieved a normal blood pressure off all their drugs.

The research, conducted on patients at Barts Hospital, Cambridge University Hospital, and Guy's and St Thomas's, and Universities of Glasgow and Birmingham, was funded by the National Institute for Health and Care Research (NIHR) and Medical Research Council (MRC) partnership, Barts Charity, and the British Heart Foundation.

Professor Morris Brown, co-senior author of the study and Professor of Endocrine Hypertension at Queen Mary University of London, said: "These aldosterone-producing nodules are very small and easily overlooked on a regular CT scan. When they glow for a few minutes after our injection, they are revealed as the obvious cause of hypertension, which can often then be cured. Until now, 99% are never diagnosed because of the difficulty and unavailability of tests. Hopefully this is about to change."

Professor William Drake, co-senior author of the study and Professor of Clinical Endocrinology at Queen Mary University of London, said: "This study was the result of years of hard work and collaboration between centres across the UK. Much of the 'on the ground' energy and drive came from the talented research fellows who, in addition to doing this innovative work, gave selflessly of their time and energy during the national pandemic emergency. The future of research in this area is in very safe hands."

Read more at Science Daily

Jan 15, 2023

Madagascar mouse lemur retroviruses are diverse and surprisingly similar to ones found in polar bears or domestic sheep

Madagascar is home to a unique biodiversity with a large number of endemic species, among those many lemur species, including the mouse lemurs. This diversity is also found in their retroviruses, a team led by scientists from the Leibniz Institute of Zoo and Wildlife Research (Leibniz-IZW) and the University of Stirling reports in the journal "Virus Evolution." They analysed the mouse lemur genome and identified viruses of two classes that represent ancient infections of the mouse lemur germline. The viruses now behave similarly to lemur genes and are thus called endogenous retroviruses (ERVs). It was surprising that some of the identified retroviruses are closely related to viruses found in other, very different mammals such as polar bears or domestic sheep. This suggests an intriguing and complex pattern of host switching of retroviruses, much more complex than previously thought.

For their analysis, the team collected blood samples from four species of Malagasy mouse lemurs and screened them using high throughput sequencing. The scientists identified two gamma and three beta retrovirus sequences in the lemurs' genomes, representing ancient infections of the mouse lemur germlines. Since then, the virus DNA has been incorporated in the host genomes and the viruses are no longer active or infectious. "We were surprised to find that one of the two identified gamma retroviruses was related to an ERV described in polar bears," states Dr Sharon Kessler, a German Academic Exchange Service (DAAD) supported scientist and Assistant Professor at the University of Stirling. The polar bear virus is young from an evolutionary point of view whereas the lemur virus is old. "How these related viruses infected such geographically separated species is unclear," Kessler says.

There were further surprises among the beta retroviruses. A virulent retrovirus that infects domestic sheep called Jaagsiekte sheep retrovirus (JSRV), which also forms ERVs in domestic sheep, is thought to be a virus confined to domestic sheep, goats and their relatives -- the first cloned sheep "Dolly" had to be euthanised after a JSRV infection and subsequent illness. The mouse lemurs have a closely related JSRV-like virus in their genome. "This suggests that JSRV-like viruses have been more widespread among mammals and are considerably older than previously thought. Why they only show up in such disparate species and in such a punctuated way is curious," says Prof Alex Greenwood, head of the Leibniz-IZW Department of Wildlife Diseases, where the sample screening was conducted. Similarly, the team also identified a virus in the mouse lemurs related to retroviruses found in squirrel monkeys, vampire bats and marsupials. "This group of viruses is becoming more interesting over time as more and more examples of similar viruses are being found in many places including very young ones that may still have currently infectious exogenous counterparts in nature," says Greenwood.

Much of the mouse lemur retroviral diversity observed is associated with non-primate viruses, suggesting a complex pattern of viral host switching around the time the ancestors of lemurs colonized Madagascar. Further studies of viral diversity will help to clarify the complex history of retroviral transmission among mammals.

Read more at Science Daily

Martian meteorite contains large diversity of organic compounds

The Martian meteorite Tissint contains a huge diversity of organic compounds, found an international team of researchers led by Technical University of Munich and Helmholtz Munich's Philippe Schmitt-Kopplin and including Carnegie's Andrew Steele. Their work is published in Science Advances.

Tissint, which crash-landed in Morocco more than 11 years ago, is one of only five Martian meteorites that have been observed as they fell to Earth. Pieces of it were found scattered in the desert about 30 miles from the town after which it is named.

This sample of Martian rock was formed hundreds of millions of years ago on our next-door planetary neighbor and was launched into space by a violent event. Unraveling the origin stories of the Tissint meteorite's organic compounds can help scientists understand whether the Red Planet ever hosted life, as well as Earth's geologic history.

"Mars and Earth share many aspects of their evolution," said lead author Schmitt-Kopplin. "And while life arose and thrived on our home planet, the question of whether it ever existed on Mars is a very hot research topic that requires deeper knowledge of our neighboring planet's water, organic molecules, and reactive surfaces."

Organic molecules contain carbon, hydrogen, oxygen, nitrogen, sulfur, and sometimes other elements. Organic compounds are commonly associated with life, although previous Martian meteorite research demonstrated that they can be created by non-biological processes, referred to as abiotic organic chemistry.

"Understanding the processes and sequence of events that shaped this rich organic bounty will reveal new details about Mars' habitability and potentially about the reactions that could lead to the formation of life," added Steele, who has done extensive research on organic material in Martian meteorites, including Tissint, and is a member of both the Perseverance and Curiosity rovers' science teams.

The researchers were able to thoroughly analyze the meteorite's organic inventory, revealing a link between the type and diversity of organic molecules and specific mineralogy. Their efforts resulted in the most comprehensive catalog ever made of the diversity of organic compounds found in a Martian meteorite or in a sample collected and analyzed by a rover. This work uncovered details about how the processes occurring in Mars' mantle and crust evolved, especially with regard to abiotic organics that formed from water-rock interactions.

Of particular interest was the abundance of organic magnesium compounds, a suite of organic molecules not previously seen on Mars, which offer new insights about the high-pressure, high-temperature geochemistry that shaped the Red Planet's deep interior and indicate a connection between its carbon cycle and its mineral evolution.

The researchers say that samples returned from Mars by future missions should provide an unprecedented amount of information about the formation, stability and dynamics of organic compounds in real Martian environments.

Read more at Science Daily