Sep 17, 2022

Astronomers risk misinterpreting planetary signals in James Webb data

NASA's James Webb Space Telescope is revealing the universe with spectacular, unprecedented clarity. The observatory's ultrasharp infrared vision has cut through the cosmic dust to illuminate some of the earliest structures in the universe, along with previously obscured stellar nurseries and spinning galaxies lying hundreds of millions of light years away.

In addition to seeing farther into the universe than ever before, Webb will capture the most comprehensive view of objects in our own galaxy -- namely, some of the 5,000 planets that have been discovered in the Milky Way. Astronomers are harnessing the telescope's light-parsing precision to decode the atmospheres surrounding some of these nearby worlds. The properties of their atmospheres could give clues to how a planet formed and whether it harbors signs of life.

But a new MIT study suggests that the tools astronomers typically use to decode light-based signals may not be good enough to accurately interpret the new telescope's data. Specifically, opacity models -- the tools that model how light interacts with matter as a function of the matter's properties -- may need significant retuning in order to match the precision of Webb's data, the researchers say.

If these models are not refined, the researchers predict that properties of planetary atmospheres, such as their temperature, pressure, and elemental composition, could be off by an order of magnitude.

"There is a scientifically significant difference between a compound like water being present at 5 percent versus 25 percent, which current models cannot differentiate," says study co-leader Julien de Wit, assistant professor in MIT's Department of Earth, Atmospheric, and Planetary Sciences (EAPS).

"Currently, the model we use to decrypt spectral information is not up to par with the precision and quality of data we have from the James Webb telescope," adds EAPS graduate student Prajwal Niraula. "We need to up our game and tackle together the opacity problem."

De Wit, Niraula, and their colleagues have published their study in Nature Astronomy. Co-authors include spectroscopy experts Iouli Gordon, Robert Hargreaves, Clara Sousa-Silva, and Roman Kochanov of the Harvard-Smithsonian Center for Astrophysics.

Leveling up

Opacity is a measure of how easily photons pass through a material. Photons of certain wavelengths can pass straight through a material, be absorbed, or be reflected back out depending on whether and how they interact with certain molecules within a material. This interaction also depends on a material's temperature and pressure.

An opacity model works on the basis of various assumptions of how light interacts with matter. Astronomers use opacity models to derive certain properties of a material, given the spectrum of light that the material emits. In the context of exoplanets, an opacity model can decode the type and abundance of chemicals in a planet's atmosphere, based on the light from the planet that a telescope captures.

De Wit says that the current state-of-the-art opacity model, which he likens to a classical language translation tool, has done a decent job of decoding spectral data taken by instruments such as those on the Hubble Space Telescope.

"So far, this Rosetta Stone has been doing OK," de Wit says. "But now that we're going to the next level with Webb's precision, our translation process will prevent us from catching important subtleties, such as those making the difference between a planet being habitable or not."

Light, perturbed

He and his colleagues make this point in their study, in which they put the most commonly used opacity model to the test. The team looked to see what atmospheric properties the model would derive if it were tweaked to assume certain limitations in our understanding of how light and matter interact. The researchers created eight such "perturbed" models. They then fed each model, including the original, unperturbed version, "synthetic spectra" -- patterns of light simulated by the group at a precision comparable to what the James Webb telescope will deliver.

They found that, based on the same light spectra, each perturbed model produced wide-ranging predictions for the properties of a planet's atmosphere. Based on their analysis, the team concludes that, if existing opacity models are applied to light spectra taken by the Webb telescope, they will hit an "accuracy wall." That is, they won't be sensitive enough to tell whether a planet has an atmospheric temperature of 300 Kelvin or 600 Kelvin, or whether a certain gas takes up 5 percent or 25 percent of an atmospheric layer.

"That difference matters in order for us to constrain planetary formation mechanisms and reliably identify biosignatures," Niraula says.

The team also found that every model, perturbed or not, produced a "good fit" with the data: even when a perturbed model returned a chemical composition the researchers knew to be incorrect, the light spectrum it generated from that composition was still close enough to the original spectrum to count as a fit.

"We found that there are enough parameters to tweak, even with a wrong model, to still get a good fit, meaning you wouldn't know that your model is wrong and what it's telling you is wrong," de Wit explains.

He and his colleagues offer several ideas for improving existing opacity models, including more laboratory measurements and theoretical calculations to refine the models' assumptions about how light interacts with various molecules, as well as closer collaboration across disciplines, particularly between astronomy and spectroscopy.

Read more at Science Daily

Even smartest AI models don't match human visual processing

Deep convolutional neural networks (DCNNs) don't see objects the way humans do -- using configural shape perception -- and that could be dangerous in real-world AI applications, says Professor James Elder, co-author of a York University study published today.

Published in the Cell Press journal iScience, the study, "Deep learning models fail to capture the configural nature of human shape perception," is a collaboration between Elder, who holds the York Research Chair in Human and Computer Vision and is co-director of York's Centre for AI & Society, and Nicholas Baker, an assistant professor of psychology at Loyola University Chicago and a former VISTA postdoctoral fellow at York.

The study employed novel visual stimuli called "Frankensteins" to explore how the human brain and DCNNs process holistic, configural object properties.

"Frankensteins are simply objects that have been taken apart and put back together the wrong way around," says Elder. "As a result, they have all the right local features, but in the wrong places."

The investigators found that while the human visual system is confused by Frankensteins, DCNNs are not -- revealing an insensitivity to configural object properties.
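As a rough illustration of this kind of test (not the study's actual stimuli or models: the published experiments used carefully constructed "Frankenstein" objects and a range of DCNNs, whereas this sketch simply shuffles image quadrants and queries one pretrained ImageNet network, with a placeholder file path), one can compare a network's predictions on an intact image and on a configurally scrambled version of it:

```python
# Rough proxy for a configural-sensitivity test: compare a pretrained CNN's
# prediction on an intact image with its prediction on a version whose
# quadrants have been swapped, so local features survive but the global
# arrangement is broken. Illustrative only; not the study's methodology.
import torch
from torchvision import models, transforms
from PIL import Image

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def scramble_quadrants(img):
    """Swap the four quadrants of a CHW image tensor."""
    c, h, w = img.shape
    hh, hw = h // 2, w // 2
    tl, tr = img[:, :hh, :hw].clone(), img[:, :hh, hw:].clone()
    bl, br = img[:, hh:, :hw].clone(), img[:, hh:, hw:].clone()
    out = img.clone()
    out[:, :hh, :hw], out[:, :hh, hw:] = br, bl   # put the parts back "wrong"
    out[:, hh:, :hw], out[:, hh:, hw:] = tr, tl
    return out

net = models.resnet50(weights="DEFAULT").eval()
img = preprocess(Image.open("example.jpg").convert("RGB"))  # placeholder path

with torch.no_grad():
    for name, x in [("intact", img), ("scrambled", scramble_quadrants(img))]:
        probs = torch.softmax(net(x.unsqueeze(0)), dim=1)
        conf, cls = probs.max(dim=1)
        print(f"{name:>9}: class index {cls.item()}, confidence {conf.item():.2f}")
```

If the network keeps assigning the scrambled image to the same class with high confidence, that is the kind of insensitivity to configuration the study reports; human observers, by contrast, are thrown off by such rearrangements.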

"Our results explain why deep AI models fail under certain conditions and point to the need to consider tasks beyond object recognition in order to understand visual processing in the brain," Elder says. "These deep models tend to take 'shortcuts' when solving complex recognition tasks. While these shortcuts may work in many cases, they can be dangerous in some of the real-world AI applications we are currently working on with our industry and government partners," Elder points out.

One such application is traffic video safety systems: "The objects in a busy traffic scene -- the vehicles, bicycles and pedestrians -- obstruct each other and arrive at the eye of a driver as a jumble of disconnected fragments," explains Elder. "The brain needs to correctly group those fragments to identify the correct categories and locations of the objects. An AI system for traffic safety monitoring that is only able to perceive the fragments individually will fail at this task, potentially misunderstanding risks to vulnerable road users."

Read more at Science Daily

Sep 16, 2022

Researchers go 'outside the box' to delineate major ocean currents

For the first time, University of Rochester researchers have quantified the energy of ocean currents larger than 1,000 kilometers across. In the process, they and their collaborators have discovered that the most energetic is the Antarctic Circumpolar Current, some 9,000 kilometers in diameter.

The team, led by Hussein Aluie, associate professor of mechanical engineering, used the same coarse-graining technique his lab had previously developed to document energy transfer at the other end of the scale, during the "eddy-killing" that occurs when wind interacts with temporary, circular currents of water less than 260 kilometers in size.

These new results, reported in Nature Communications, show how the coarse-graining technique can provide a new window for understanding oceanic circulation in all its multiscale complexity, says lead author Benjamin Storer, a research associate in Aluie's Turbulence and Complex Flow Group. This gives researchers an opportunity to better understand how ocean currents function as a key moderator of the Earth's climate system.

The team also includes researchers from the University of Rome Tor Vergata, University of Liverpool, and Princeton University.

Traditionally, researchers interested in climate and oceanography have picked boxes in the ocean 500 to 1,000 kilometers on a side. These box regions, which were assumed to represent the global ocean, were then analyzed using a technique called Fourier analysis, Aluie says.

"The problem is, when you pick a box, you are already limiting yourself to analyzing what's in that box," Aluie says. "You miss everything at a larger scale.

"What we are saying is, we don't need a box; we can think outside the box."

When the researchers use the coarse-graining technique to "blur" satellite images of global circulation patterns, for example, they find that "we gain more by settling for less," Aluie says. "It allows us to disentangle different-sized structures of ocean currents in a systematic way."

He draws an analogy to removing your eyeglasses and then looking at a very crisp, detailed image: it will appear blurred. But as you look through a succession of increasingly strong eyeglasses, you will often be able to detect various patterns at each step that would otherwise be hidden in the details.

In essence, that is what coarse graining allows the researchers to do: quantify various structures in ocean currents and their energy "from the smallest, finest scales to the largest," Aluie says.
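Here is a minimal sketch of that blurring idea (a stand-in only: it uses a plain Gaussian filter rather than the group's published coarse-graining code, and a synthetic random velocity field rather than satellite data). Filtering the field at a succession of length scales and computing the kinetic energy that survives each filter separates the contribution of large-scale currents from smaller ones:

```python
# Minimal coarse-graining sketch: low-pass filter a 2D velocity field at a
# succession of length scales and report the kinetic energy retained at each.
# Synthetic data and a simple Gaussian kernel stand in for the real method.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(1)
n, grid_km = 512, 10.0                                # grid size and spacing (made up)
u = gaussian_filter(rng.standard_normal((n, n)), 2)   # synthetic eastward velocity
v = gaussian_filter(rng.standard_normal((n, n)), 2)   # synthetic northward velocity

def coarse_grain(field, scale_km):
    """Blur the field so features smaller than roughly scale_km are removed."""
    return gaussian_filter(field, sigma=scale_km / grid_km)

total_ke = 0.5 * np.mean(u**2 + v**2)
for scale in (50, 250, 1000, 5000):                   # filter scales in kilometers
    u_l, v_l = coarse_grain(u, scale), coarse_grain(v, scale)
    ke_large = 0.5 * np.mean(u_l**2 + v_l**2)         # energy in motions larger than `scale`
    print(f"scales > {scale:>4} km hold {100 * ke_large / total_ke:5.1f}% of the energy")
```

Differencing the energy retained at successive filter scales gives the energy contained in each band of scales, which is how a spectrum of ocean motions can be built without ever picking a box.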

Aluie credits Storer for further developing and refining the code; it has been published so other researchers can use it.

Read more at Science Daily

Saturn's rings and tilt could be the product of an ancient, missing moon

Swirling around the planet's equator, the rings of Saturn are a dead giveaway that the planet is spinning at a tilt. The belted giant rotates at a 26.7-degree angle relative to the plane in which it orbits the sun. Astronomers have long suspected that this tilt comes from gravitational interactions with its neighbor Neptune, as Saturn's tilt precesses, like a spinning top, at nearly the same rate as the orbit of Neptune.

But a new modeling study by astronomers at MIT and elsewhere has found that, while the two planets may have once been in sync, Saturn has since escaped Neptune's pull. What was responsible for this planetary realignment? The team has one meticulously tested hypothesis: a missing moon.

In a study appearing in Science, the team proposes that Saturn, which today hosts 83 moons, once harbored at least one more, an extra satellite that they name Chrysalis. Together with its siblings, the researchers suggest, Chrysalis orbited Saturn for several billion years, pulling and tugging on the planet in a way that kept its tilt, or "obliquity," in resonance with Neptune.

But around 160 million years ago, the team estimates, Chrysalis became unstable and came too close to its planet in a grazing encounter that pulled the satellite apart. The loss of the moon was enough to remove Saturn from Neptune's grasp and leave it with the present-day tilt.

What's more, the researchers surmise, while most of Chrysalis' shattered body may have made impact with Saturn, a fraction of its fragments could have remained suspended in orbit, eventually breaking into small icy chunks to form the planet's signature rings.

The missing satellite, therefore, could explain two longstanding mysteries: Saturn's present-day tilt and the age of its rings, which were previously estimated to be about 100 million years old -- much younger than the planet itself.

"Just like a butterfly's chrysalis, this satellite was long dormant and suddenly became active, and the rings emerged," says Jack Wisdom, professor of planetary sciences at MIT and lead author of the new study.

The study's co-authors include Rola Dbouk at MIT, Burkhard Militzer of the University of California at Berkeley, William Hubbard at the University of Arizona, Francis Nimmo and Brynna Downey of the University of California at Santa Cruz, and Richard French of Wellesley College.

A moment of progress


In the early 2000s, scientists put forward the idea that Saturn's tilted axis is a result of the planet being trapped in a resonance, or gravitational association, with Neptune. But observations taken by NASA's Cassini spacecraft, which orbited Saturn from 2004 to 2017, put a new twist on the problem. Scientists found that Titan, Saturn's largest satellite, was migrating away from Saturn at a faster clip than expected, at a rate of about 11 centimeters per year. Titan's fast migration, and its gravitational pull, led scientists to conclude that the moon was likely responsible for tilting and keeping Saturn in resonance with Neptune.

But this explanation hinges on one major unknown: Saturn's moment of inertia, which is how mass is distributed in the planet's interior. Saturn's tilt could behave differently, depending on whether matter is more concentrated at its core or toward the surface.

"To make progress on the problem, we had to determine the moment of inertia of Saturn," Wisdom says.

The lost element

In their new study, Wisdom and his colleagues looked to pin down Saturn's moment of inertia using some of the last observations taken by Cassini in its "Grand Finale," a phase of the mission during which the spacecraft made an extremely close approach to precisely map the gravitational field around the entire planet. The gravitational field can be used to determine the distribution of mass in the planet.

Wisdom and his colleagues modeled the interior of Saturn and identified a distribution of mass that matched the gravitational field that Cassini observed. Surprisingly, they found that this newly identified moment of inertia placed Saturn close to, but just outside the resonance with Neptune. The planets may have once been in sync, but are no longer.

"Then we went hunting for ways of getting Saturn out of Neptune's resonance," Wisdom says.

The team first carried out simulations to evolve the orbital dynamics of Saturn and its moons backward in time, to see whether any natural instabilities among the existing satellites could have influenced the planet's tilt. This search came up empty.

So, the researchers reexamined the mathematical equations that describe a planet's precession, which is how a planet's axis of rotation changes over time. One term in this equation has contributions from all the satellites. The team reasoned that if one satellite were removed from this sum, it could affect the planet's precession.
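Schematically, in the standard textbook form of that relation (illustrative notation; the paper's exact formulation may differ), the spin axis precesses at a rate

```latex
% Schematic spin-axis precession of an oblate planet with satellites
\dot{\psi} = -\,\alpha\cos\epsilon,
\qquad
\alpha = \frac{3}{2}\,\frac{n^{2}}{\omega}\,\frac{J_{2} + q}{\lambda + \ell},
```

where n is Saturn's orbital mean motion, ω its spin rate, ε its obliquity, J2 its gravitational oblateness, λ its normalized moment of inertia, and q and ℓ the effective quadrupole moment and normalized orbital angular momentum contributed by the satellites. The satellites enter only through q and ℓ, so removing one changes α and can pull the spin-axis precession rate out of step with the precession of Neptune's orbit.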

The question was, how massive would that satellite have to be, and what dynamics would it have to undergo to take Saturn out of Neptune's resonance?

Wisdom and his colleagues ran simulations to determine the properties of a satellite, such as its mass and orbital radius, and the orbital dynamics that would be required to knock Saturn out of the resonance.

They conclude that Saturn's present tilt is a relic of its former resonance with Neptune, and that the loss of the satellite Chrysalis, which was about the size of Iapetus, Saturn's third-largest moon, allowed the planet to escape that resonance.

Sometime between 200 and 100 million years ago, Chrysalis entered a chaotic orbital zone, experienced a number of close encounters with Iapetus and Titan, and eventually came too close to Saturn, in a grazing encounter that ripped the satellite to bits, leaving a small fraction to circle the planet as a debris-strewn ring.

The loss of Chrysalis, they found, explains Saturn's precession, and its present-day tilt, as well as the late formation of its rings.

Read more at Science Daily

Pushing the boundaries of chemistry: Properties of heaviest element studied so far measured at GSI/FAIR

An international research team has succeeded in gaining new insights into the chemical properties of the superheavy element flerovium -- element 114 -- at the accelerator facilities of the GSI Helmholtzzentrum für Schwerionenforschung in Darmstadt. The measurements show that flerovium is the most volatile metal in the periodic table. Flerovium is thus the heaviest element in the periodic table that has been chemically studied. With the results, published in the journal Frontiers in Chemistry, GSI confirms its leading position in the study of the chemistry of superheavy elements and opens new perspectives for the international facility FAIR (Facility for Antiproton and Ion Research), which is currently under construction.

Under the leadership of groups from Darmstadt and Mainz, the two longest-lived flerovium isotopes currently known, flerovium-288 and flerovium-289, were produced using the accelerator facilities at GSI/FAIR and were chemically investigated at the TASCA experimental setup. In the periodic table, flerovium is placed below the heavy metal lead. However, early predictions had postulated that relativistic effects of the high charge in the nucleus of the superheavy element on its valence electrons would lead to noble gas-like behavior, while more recent ones had rather suggested a weakly metallic behavior. Two previously conducted chemistry experiments, one of them at GSI in Darmstadt in 2009, led to contradictory interpretations. While the three atoms observed in the first experiment were used to infer noble gas-like behavior, the data obtained at GSI indicated metallic character based on two atoms. The two experiments were unable to clearly establish the character. The new results show that, as expected, flerovium is inert but capable of forming stronger chemical bonds than noble gases, if conditions are suitable. Flerovium is consequently the most volatile metal in the periodic table.

Flerovium is thus the heaviest chemical element whose character has been studied experimentally. With the determination of the chemical properties, GSI/FAIR confirm their leading position in the research of superheavy elements. "Exploring the boundaries of the periodic table has been a pillar of the research program at GSI since the beginning and will be so at FAIR in the future. The fact that a few atoms can already be used to explore the first fundamental chemical properties, giving an indication of how larger quantities of these substances would behave, is fascinating and possible thanks to the powerful accelerator facility and the expertise of the worldwide collaboration," elaborates Professor Paolo Giubellino, Scientific Managing Director of GSI and FAIR. "With FAIR, we are bringing the universe into the laboratory and explore the limits of matter, also of the chemical elements."

Six weeks of experimentation

The experiments conducted at GSI/FAIR to clarify the chemical nature of flerovium lasted a total of six weeks. For this purpose, four trillion calcium-48 ions were accelerated to ten percent of the speed of light every second by the GSI linear accelerator UNILAC and fired at a target containing plutonium-244, resulting in the formation of a few flerovium atoms per day.

The formed flerovium atoms recoiled from the target into the gas-filled separator TASCA. In its magnetic field, the formed isotopes, flerovium-288 and flerovium-289, which have lifetimes on the order of a second, were separated from the intense calcium ion beam and from byproducts of the nuclear reaction. They penetrated a thin film, thus entering the chemistry apparatus, where they were stopped in a helium/argon gas mixture. This gas mixture flushed the atoms into the COMPACT gas chromatography apparatus, where they first came into contact with silicon oxide surfaces. If the bond to silicon oxide was too weak, the atoms were transported further, over gold surfaces -- first those kept at room temperature, and then over increasingly colder ones, down to about -160 °C. The surfaces were deposited as a thin coating on special nuclear radiation detectors, which registered individual atoms by spatially resolved detection of the radioactive decay. Since the decay products undergo further radioactive decay after a short lifetime, each atom leaves a characteristic signature of several events from which the presence of a flerovium atom can unambiguously be inferred.

One atom per week for chemistry

"Thanks to the combination of the TASCA separator, the chemical separation and the detection of the radioactive decays, as well as the technical development of the gas chromatography apparatus since the first experiment, we have succeeded in increasing the efficiency and reducing the time required for the chemical separation to such an extent that we were able to observe one flerovium atom every week," explains Dr. Alexander Yakushev of GSI/FAIR, the spokesperson for the international experiment collaboration.

Six such decay chains were found in the data analysis. Since the setup is similar to that of the first GSI experiment, the newly obtained data could be combined with the two atoms observed at that time and analyzed together. None of the decay chains appeared within the range of the silicon oxide-coated detector, indicating that flerovium does not form a substantial bond with silicon oxide. Instead, all were transported with the gas into the gold-coated portion of the apparatus within less than a tenth of a second. The eight events formed two zones: a first in the region of the gold surface at room temperature, and a second in the later part of the chromatograph, at temperatures so low that a very thin layer of ice covered the gold, so that adsorption occurred on ice.

From experiments with lead, mercury, and radon atoms, which served as representatives of heavy metals, weakly reactive metals, and noble gases, respectively, it was known that lead forms a strong bond with silicon oxide, while mercury reaches the gold detector. Radon even flies over the first part of the gold detector at room temperature and is only partially retained at the lowest temperatures. The flerovium results could then be compared against this reference behavior.

Apparently, two types of interaction of a flerovium species with the gold surface were observed. The deposition on gold at room temperature indicates the formation of a relatively strong chemical bond, which does not occur in noble gases. On the other hand, some of the atoms appear never to have had the opportunity to form such bonds and have been transported over long distances of the gold surface, down to the lowest temperatures. This detector range represents a trap for all elemental species. This complicated behavior can be explained by the morphology of the gold surface: it consists of small gold clusters, at the boundaries of which very reactive sites occur, apparently allowing the flerovium to bond. The fact that some of the flerovium atoms were able to reach the cold region indicates that only the atoms that encountered such sites formed a bond, unlike mercury, which was retained on gold in any case. Thus, the chemical reactivity of flerovium is weaker than that of the volatile metal mercury. The current data cannot completely rule out the possibility that the first deposition zone on gold at room temperature is due to the formation of flerovium molecules. It also follows from this hypothesis, though, that flerovium is chemically more reactive than a noble gas element.

International and interdisciplinary collaboration as the key to understanding

The exotic plutonium target material for the production of the flerovium was provided in part by Lawrence Livermore National Laboratory (LLNL), USA. In the Department of Chemistry's TRIGA site at Johannes Gutenberg University Mainz (JGU), the material was electrolytically deposited onto thin titanium foils fabricated at GSI/FAIR. "There is not much of this material available in the world, and we are fortunate to have been able to use it for these experiments that would not otherwise be possible," said Dr. Dawn Shaughnessy, head of the Nuclear and Chemical Sciences Division at LLNL. "This international collaboration brings together skills and expertise from around the world to solve difficult scientific problems and answer long-standing questions, such as the chemical properties of flerovium."

"Our accelerator experiment was complemented by a detailed study of the detector surface in collaboration with several GSI departments as well as the Department of Chemistry and the Institute of Physics at JGU. This has proven to be key to understanding the chemical character of flerovium. As a result, the data from the two earlier experiments are now understandable and compatible with our new conclusions," says Christoph Düllmann, professor of nuclear chemistry at JGU and head of the research groups at GSI and at the Helmholtz Institute Mainz (HIM), a collaboration between GSI and JGU.

How the relativistic effects affect its neighbors, the elements nihonium (element 113) and moscovium (element 115), which have also only been officially recognized in recent years, is the subject of subsequent experiments. Initial data have already been obtained as part of the FAIR Phase 0 program at GSI. Furthermore, the researchers expect that significantly more stable isotopes of flerovium exist, but these have not yet been found. However, the researchers now already know that they can expect to find a metallic element.

Read more at Science Daily

Machine learning gives glimpse of how a dog's brain represents what it sees

Scientists have decoded visual images from a dog's brain, offering a first look at how the canine mind reconstructs what it sees. The Journal of Visualized Experiments published the research done at Emory University.

The results suggest that dogs are more attuned to actions in their environment rather than to who or what is doing the action.

The researchers recorded the fMRI neural data for two awake, unrestrained dogs as they watched videos in three 30-minute sessions, for a total of 90 minutes. They then used a machine-learning algorithm to analyze the patterns in the neural data.

"We showed that we can monitor the activity in a dog's brain while it is watching a video and, to at least a limited degree, reconstruct what it is looking at," says Gregory Berns, Emory professor of psychology and corresponding author of the paper. "The fact that we are able to do that is remarkable."

The project was inspired by recent advancements in machine learning and fMRI to decode visual stimuli from the human brain, providing new insights into the nature of perception. Beyond humans, the technique has been applied to only a handful of other species, including some primates.

"While our work is based on just two dogs it offers proof of concept that these methods work on canines," says Erin Phillips, first author of the paper, who did the work as a research specialist in Berns' Canine Cognitive Neuroscience Lab. "I hope this paper helps pave the way for other researchers to apply these methods on dogs, as well as on other species, so we can get more data and bigger insights into how the minds of different animals work."

Phillips, a native of Scotland, came to Emory as a Bobby Jones Scholar, an exchange program between Emory and the University of St Andrews. She is currently a graduate student in ecology and evolutionary biology at Princeton University.

Berns and colleagues pioneered training techniques for getting dogs to walk into an fMRI scanner and hold completely still and unrestrained while their neural activity is measured. A decade ago, his team published the first fMRI brain images of a fully awake, unrestrained dog. That opened the door to what Berns calls The Dog Project -- a series of experiments exploring the mind of the oldest domesticated species.

Over the years, his lab has published research into how the canine brain processes vision, words, smells and rewards such as receiving praise or food.

Meanwhile, the technology behind machine-learning computer algorithms kept improving. The technology has allowed scientists to decode some human brain-activity patterns. It "reads minds" by detecting, within patterns of brain data, the different objects or actions that an individual is seeing while watching a video.

"I began to wonder, 'Can we apply similar techniques to dogs?'" Berns recalls.

The first challenge was to come up with video content that a dog might find interesting enough to watch for an extended period. The Emory research team affixed a video recorder to a gimbal and selfie stick that allowed them to shoot steady footage from a dog's perspective, at about waist high to a human or a little bit lower.

They used the device to create a half-hour video of scenes relating to the lives of most dogs. Activities included dogs being petted by people and receiving treats from people. Scenes with dogs also showed them sniffing, playing, eating or walking on a leash. Activity scenes showed cars, bikes or a scooter going by on a road; a cat walking in a house; a deer crossing a path; people sitting; people hugging or kissing; people offering a rubber bone or a ball to the camera; and people eating.

The video data was segmented by time stamps into various classifiers, including object-based classifiers (such as dog, car, human, cat) and action-based classifiers (such as sniffing, playing or eating).

Only two of the dogs that had been trained for experiments in an fMRI had the focus and temperament to lie perfectly still and watch the 30-minute video without a break, across three sessions for a total of 90 minutes. These two "super star" canines were Daisy, a mixed breed who may be part Boston terrier, and Bhubo, a mixed breed who may be part boxer.

"They didn't even need treats," says Phillips, who monitored the animals during the fMRI sessions and watched their eyes tracking on the video. "It was amusing because it's serious science, and a lot of time and effort went into it, but it came down to these dogs watching videos of other dogs and humans acting kind of silly."

Two humans also underwent the same experiment, watching the same 30-minute video in three separate sessions, while lying in an fMRI.

The brain data could be mapped onto the video classifiers using time stamps.

A machine-learning algorithm, a neural net known as Ivis, was applied to the data. A neural net is a method of doing machine learning by having a computer analyze training examples. In this case, the neural net was trained to classify the brain-data content.
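A rough sketch of that decoding step, for orientation only: the study used the Ivis neural-net embedding, but the overall shape of the analysis (a feature vector per brain volume, a class label from the video time stamps, a train/test split, an accuracy score) can be shown with a generic scikit-learn classifier on placeholder data. The array sizes and labels below are invented.

```python
# Generic stand-in for the decoding analysis: predict which video class was on
# screen from the fMRI signal at each time point. The real study used the Ivis
# neural network; random placeholder data here will decode only at chance.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_volumes, n_features = 1800, 500                         # ~90 minutes of volumes; placeholder sizes
X = rng.standard_normal((n_volumes, n_features))          # brain data per time point
actions = ["sniffing", "playing", "eating", "walking"]    # action-based classes
y = rng.choice(actions, size=n_volumes)                   # labels from video time stamps

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("decoding accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

With real recordings in place of the placeholder arrays, the held-out accuracy from this kind of procedure is the figure reported below for the human and dog subjects.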

The results for the two human subjects found that the model developed using the neural net showed 99% accuracy in mapping the brain data onto both the object- and action-based classifiers.

In the case of decoding video content from the dogs, the model did not work for the object classifiers. It was 75% to 88% accurate, however, at decoding the action classifications for the dogs.

The results suggest major differences in how the brains of humans and dogs work.

"We humans are very object oriented," Berns says. "There are 10 times as many nouns as there are verbs in the English language because we have a particular obsession with naming objects. Dogs appear to be less concerned with who or what they are seeing and more concerned with the action itself."

Dogs and humans also have major differences in their visual systems, Berns notes. Dogs see only in shades of blue and yellow but have a slightly higher density of vision receptors designed to detect motion.

"It makes perfect sense that dogs' brains are going to be highly attuned to actions first and foremost," he says. "Animals have to be very concerned with things happening in their environment to avoid being eaten or to monitor animals they might want to hunt. Action and movement are paramount."

For Phillips, understanding how different animals perceive the world is important to her current field research into how predator reintroduction in Mozambique may impact ecosystems. "Historically, there hasn't been much overlap in computer science and ecology," she says. "But machine learning is a growing field that is starting to find broader applications, including in ecology."

Read more at Science Daily

Heart of our evolution discovered: 380-million-year-old heart

Researchers have discovered a 380-million-year-old heart -- the oldest ever found -- alongside a separate fossilised stomach, intestine and liver in an ancient jawed fish, shedding new light on the evolution of our own bodies.

The new research, published today in Science, found that the position of the organs in the body of arthrodires -- an extinct class of armoured fishes that flourished through the Devonian period from 419.2 million years ago to 358.9 million years ago -- is similar to modern shark anatomy, offering vital new evolutionary clues.

Lead researcher John Curtin Distinguished Professor Kate Trinajstic, from Curtin's School of Molecular and Life Sciences and the Western Australian Museum, said the discovery was remarkable given that soft tissues of ancient species were rarely preserved and it was even rarer to find 3D preservation.

"As a palaeontologist who has studied fossils for more than 20 years, I was truly amazed to find a 3D and beautifully preserved heart in a 380-million-year-old ancestor," Professor Trinajstic said.

"Evolution is often thought of as a series of small steps, but these ancient fossils suggest there was a larger leap between jawless and jawed vertebrates. These fish literally have their hearts in their mouths and under their gills -- just like sharks today."

This research presents -- for the first time -- the 3D model of a complex s-shaped heart in an arthrodire that is made up of two chambers with the smaller chamber sitting on top.

Professor Trinajstic said these features were advanced in such early vertebrates, offering a unique window into how the head and neck region began to change to accommodate jaws, a critical stage in the evolution of our own bodies.

"For the first time, we can see all the organs together in a primitive jawed fish, and we were especially surprised to learn that they were not so different from us," Professor Trinajstic said.

"However, there was one critical difference -- the liver was large and enabled the fish to remain buoyant, just like sharks today. Some of today's bony fish such as lungfish and birchers have lungs that evolved from swim bladders but it was significant that we found no evidence of lungs in any of the extinct armoured fishes we examined, which suggests that they evolved independently in the bony fishes at a later date."

The Gogo Formation, in the Kimberley region of Western Australia where the fossils were collected, was originally a large reef.

Enlisting the help of scientists at the Australian Nuclear Science and Technology Organisation in Sydney and the European Synchrotron Radiation Facility in France, researchers used neutron beams and synchrotron X-rays to scan the specimens while still embedded in their limestone concretions. They then constructed three-dimensional images of the soft tissues inside, based on the different densities of the minerals deposited by bacteria and of the surrounding rock matrix.

This new discovery of mineralised organs, in addition to previous finds of muscles and embryos, makes the Gogo arthrodires the most fully understood of all jawed stem vertebrates and clarifies an evolutionary transition on the line to living jawed vertebrates, which includes the mammals and humans.

Co-author Professor John Long, from Flinders University, said: "These new discoveries of soft organs in these ancient fishes are truly the stuff of palaeontologists' dreams, for without doubt these fossils are the best preserved in the world for this age. They show the value of the Gogo fossils for understanding the big steps in our distant evolution. Gogo has given us world firsts, from the origins of sex to the oldest vertebrate heart, and is now one of the most significant fossil sites in the world. It's time the site was seriously considered for world heritage status."

Co-author Professor Per Ahlberg, from Uppsala University, said: "What's really exceptional about the Gogo fishes is that their soft tissues are preserved in three dimensions. Most cases of soft-tissue preservation are found in flattened fossils, where the soft anatomy is little more than a stain on the rock. We are also very fortunate in that modern scanning techniques allow us to study these fragile soft tissues without destroying them. A couple of decades ago, the project would have been impossible."

Read more at Science Daily

Sep 15, 2022

It's a planet: New evidence of baby planet in the making

Astronomers agree that planets are born in protoplanetary disks -- rings of dust and gas that surround young, newborn stars. While hundreds of these disks have been spotted throughout the universe, observations of actual planetary birth and formation have proved difficult within these environments.

Now, astronomers at the Center for Astrophysics | Harvard & Smithsonian have developed a new way to detect these elusive newborn planets -- and with it, "smoking gun" evidence of a small Neptune or Saturn-like planet lurking in a disk. The results are described today in The Astrophysical Journal Letters.

"Directly detecting young planets is very challenging and has so far only been successful in one or two cases," says Feng Long, a postdoctoral fellow at the Center for Astrophysics who led the new study. "The planets are always too faint for us to see because they're embedded in thick layers of gas and dust."

Scientists instead must hunt for clues to infer a planet is developing beneath the dust.

"In the past few years, we've seen many structures pop up on disks that we think are caused by a planet's presence, but it could be caused by something else, too" Long says. "We need new techniques to look at and support that a planet is there."

For her study, Long decided to re-examine a protoplanetary disk known as LkCa 15. Located 518 light years away, the disk sits in the constellation Taurus. Scientists previously reported evidence for planet formation in the disk using observations with the ALMA Observatory.

Long dove into new high-resolution ALMA data on LkCa 15, obtained primarily in 2019, and discovered two faint features that had not previously been detected.

About 42 astronomical units out from the star -- or 42 times the distance Earth is from the Sun -- Long discovered a dusty ring with two separate and bright bunches of material orbiting within it. The material took the shape of a small clump and a larger arc, separated by 120 degrees.

Long examined the scenario with computer models to figure out what was causing the buildup of material and found that the size and locations of the two features matched model predictions for the presence of a planet.

"This arc and clump are separated by about 120 degrees," she says. "That degree of separation doesn't just happen -- it's important mathematically."

Long points to positions in space known as Lagrange points, where two bodies in motion -- such as a star and orbiting planet -- produce enhanced regions of attraction around them where matter may accumulate.

"We're seeing that this material is not just floating around freely, it's stable and has a preference where it wants to be located based on physics and the objects involved," Long explains.

In this case, the arc and clump of material Long detected are located at the L4 and L5 Lagrange points. Hidden 60 degrees between them is a small planet causing the accumulation of dust at points L4 and L5.
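The geometry behind that statement is the standard restricted three-body result (stated here with illustrative notation): L4 leads the planet along its shared orbit by 60 degrees and L5 trails it by 60 degrees, so

```latex
% Angular positions along the shared orbit, measured from the star
\theta_{L4} = \theta_{p} + 60^{\circ}, \qquad
\theta_{L5} = \theta_{p} - 60^{\circ}
\quad\Longrightarrow\quad
\theta_{L4} - \theta_{L5} = 120^{\circ},
```

which matches the observed separation of the clump and the arc, with the unseen planet sitting midway between them, 60 degrees from each.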

The results show the planet is roughly the size of Neptune or Saturn, and around one to three million years old. (That's relatively young when it comes to planets.)

Directly imaging the small, newborn planet may not be possible any time soon due to technology constraints, but Long believes further ALMA observations of LkCa 15 can provide additional evidence supporting her planetary discovery.

She also hopes her new approach for detecting planets -- with material preferentially accumulating at Lagrange points -- will be utilized in the future by astronomers.

"I do hope this method can be widely adopted in the future," she says. "The only caveat is that this requires very deep data as the signal is weak."

Long recently completed her postdoctoral fellowship at the Center for Astrophysics and will join the University of Arizona as a NASA Hubble Fellow this September.

Read more at Science Daily

Refreezing poles feasible and cheap, new study finds

The poles are warming several times faster than the global average, causing the record-smashing heatwaves that were reported earlier this year in both the Arctic and Antarctic. Melting ice and collapsing glaciers at high latitudes would accelerate sea level rise around the planet. Fortunately, refreezing the poles by reducing incoming sunlight would be both feasible and remarkably cheap, according to new research published today in IOP Publishing's Environmental Research Communications.

Scientists laid out a possible future program whereby high-flying jets would spray microscopic aerosol particles into the atmosphere at latitudes of 60 degrees north and south -- roughly Anchorage and the southern tip of Patagonia. If injected at a height of 43,000 feet (above airliner cruising altitudes), these aerosols would slowly drift poleward, slightly shading the surface beneath. "There is widespread and sensible trepidation about deploying aerosols to cool the planet," notes lead author Wake Smith, "but if the risk/benefit equation were to pay off anywhere, it would be at the poles."

Particle injections would be performed seasonally in the long days of the local spring and early summer. The same fleet of jets could service both hemispheres, ferrying to the opposite pole with the change of seasons.

Pre-existing military air-to-air refuelling tankers such as the aged KC-135 and the A330 MRTT don't have enough payload at the required altitudes, whereas newly designed high-altitude tankers would prove much more efficient. A fleet of roughly 125 such tankers could loft a payload sufficient to cool the regions poleward of 60°N/S by 2°C per year, which would return them close to their pre-industrial average temperatures. Costs are estimated at $11 billion annually -- less than one-third the cost of cooling the entire planet by the same 2°C magnitude and a tiny fraction of the cost of reaching net zero emissions.

"Game changing though this could be in a rapidly warming world, stratospheric aerosol injections merely treat a symptom of climate change but not the underlying disease. It's aspirin, not penicillin. It's not a substitute for decarbonization," says Smith.

Cooling at the poles would provide direct protection for only a small fraction of the planet, though the mid-latitudes should also experience some temperature reduction. Since less than 1% of the global human population lives in the target deployment zones, a polar deployment would entail much less direct risk to most of humanity than a global program. "Nonetheless, any intentional turning of the global thermostat would be of common interest to all of humanity and not merely the province of Arctic and Patagonian nations," adds Smith.

Read more at Science Daily

Early gibbon fossil found in southwest China: Discovery fills evolutionary history gap of apes

A team of scientists has discovered the earliest gibbon fossil, a find that helps fill a long-elusive evolutionary gap in the history of apes.

The work, reported in the Journal of Human Evolution, centers on hylobatids, a family of apes that includes 20 species of living gibbons, which are found throughout tropical Asia from northeastern India to Indonesia.

"Hylobatids fossil remains are very rare, and most specimens are isolated teeth and fragmentary jaw bones found in cave sites in southern China and southeast Asia dating back no more than 2 million years ago," explains Terry Harrison, a professor of anthropology at New York University and one of the paper's authors. "This new find extends the fossil record of hylobatids back to 7 to 8 million years ago and, more specifically, enhances our understanding of the evolution of this family of apes."

The fossil, discovered in the Yuanmou area of Yunnan Province in southwestern China, is of a small ape called Yuanmoupithecus xiaoyuan. The analysis, which included Xueping Ji of the Kunming Institute of Zoology and the lead author of the study, focused on the teeth and cranial specimens of Yuanmoupithecus, including an upper jaw of an infant that was less than 2 years old when it died.

Using the size of the molar teeth as a guide, the scientists estimate that Yuanmoupithecus was similar in size to today's gibbons, with a body weight of about 6 kilograms -- or about 13 pounds.

"The teeth and the lower face of Yuanmoupithecus are very similar to those of modern-day gibbons, but in a few features the fossil species was more primitive and points to it being the ancestor of all the living species," observes Harrison, part of NYU's Center for the Study of Human Origins.

Ji found the infant upper jaw during his field survey and identified it as a hylobatid by comparing it with modern gibbon skulls in the Kunming Institute of Zoology. In 2018, he invited Harrison and other colleagues to work on the specimens stored in the Yunnan Institute of Cultural Relics and Archaeology and the Yuanmou Man Museum that had been collected over the past 30 years.

"The remains of Yuanmoupithecus are extremely rare, but with diligence it has been possible to recover enough specimens to establish that the Yuanmou fossil ape is indeed a close relative of the living hylobatids," notes Harrison.

The Journal of Human Evolution study also found that Kapi ramnagarensis, which has been claimed to be an earlier species of hylobatid, based on a single isolated fossil molar from India, is not a hylobatid after all, but a member of a more primitive group of primates that are not closely related to modern-day apes.

"Genetic studies indicate that the hylobatids diverged from the lineage leading to the great apes and humans about 17 to 22 million years ago, so there is still a 10-million-year gap in the fossil record that needs to be filled," Harrison cautions. "With continued exploration of promising fossil sites in China and elsewhere in Asia, it is hoped that additional discoveries will help fill these critical gaps in the evolutionary history of hylobatids."

Read more at Science Daily

Pioneering research using bacteria brings scientists a step closer to creating artificial cells with lifelike functionality

Scientists have harnessed the potential of bacteria to help build advanced synthetic cells which mimic real life functionality.

The research, led by the University of Bristol and published today in Nature, makes important progress in deploying synthetic cells, known as protocells, to more accurately represent the complex compositions, structure, and function of living cells.

Establishing true-to-life functionality in protocells is a global grand challenge spanning multiple fields, ranging from bottom-up synthetic biology and bioengineering to origin of life research. Previous attempts to model protocells using microcapsules have fallen short, so the team of researchers turned to bacteria to build complex synthetic cells using a living material assembly process.

Professor Stephen Mann from the University of Bristol's School of Chemistry and the Max Planck Bristol Centre for Minimal Biology, together with colleagues Drs Can Xu, Nicolas Martin (currently at the University of Bordeaux) and Mei Li in the Bristol Centre for Protolife Research, has demonstrated an approach to the construction of highly complex protocells using viscous micro-droplets filled with living bacteria as a microscopic building site.

In the first step, the team exposed the empty droplets to two types of bacteria. One population was spontaneously captured within the droplets, while the other was trapped at the droplet surface.

Then, both types of bacteria were destroyed so that the released cellular components remained trapped inside or on the surface of the droplets to produce membrane-coated bacteriogenic protocells containing thousands of biological molecules, parts and machinery.

The researchers discovered that the protocells were able to produce energy-rich molecules (ATP) via glycolysis and synthesize RNA and proteins by in vitro gene expression, indicating that the inherited bacterial components remained active in the synthetic cells.

Further testing the capacity of this technique, the team employed a series of chemical steps to remodel the bacteriogenic protocells structurally and morphologically. The released bacterial DNA was condensed into a single nucleus-like structure, and the droplet interior infiltrated with a cytoskeletal-like network of protein filaments and membrane-bounded water vacuoles.

As a step towards the construction of a synthetic/living cell entity, the researchers implanted living bacteria into the protocells to generate self-sustainable ATP production and long-term energization for glycolysis, gene expression and cytoskeletal assembly. Curiously, the protoliving constructs adopted an amoeba-like external morphology due to on-site bacterial metabolism and growth to produce a cellular bionic system with integrated life-like properties.

Corresponding author Professor Stephen Mann said: "Achieving high organisational and functional complexity in synthetic cells is difficult especially under close-to-equilibrium conditions. Hopefully, our current bacteriogenic approach will help to increase the complexity of current protocell models, facilitate the integration of myriad biological components and enable the development of energised cytomimetic systems."

Read more at Science Daily

Sep 13, 2022

Through the quantum looking glass

An ultrathin invention could make future computing, sensing and encryption technologies remarkably smaller and more powerful by helping scientists control a strange but useful phenomenon of quantum mechanics, according to new research recently published in the journal Science.

Scientists at Sandia National Laboratories and the Max Planck Institute for the Science of Light have reported on a device that could replace a roomful of equipment to link photons in a bizarre quantum effect called entanglement. This device -- a kind of nano-engineered material called a metasurface -- paves the way for entangling photons in complex ways that have not been possible with compact technologies.

When scientists say photons are entangled, they mean they are linked in such a way that actions on one affect the other, no matter where or how far apart the photons are in the universe. It is an effect of quantum mechanics, the laws of physics that govern particles and other very tiny things.

Although the phenomenon might seem odd, scientists have harnessed it to process information in new ways. For example, entanglement helps protect delicate quantum information and correct errors in quantum computing, a field that could someday have sweeping impacts in national security, science and finance. Entanglement is also enabling new, advanced encryption methods for secure communication.

Research for the groundbreaking device, which is a hundred times thinner than a sheet of paper, was performed, in part, at the Center for Integrated Nanotechnologies, a Department of Energy Office of Science user facility operated by Sandia and Los Alamos national laboratories. Sandia's team received funding from the Office of Science, Basic Energy Sciences program.

Light goes in, entangled photons come out

The new metasurface acts as a doorway to this unusual quantum phenomenon. In some ways, it's like the mirror in Lewis Carroll's "Through the Looking-Glass," through which the young protagonist Alice experiences a strange, new world.

Instead of walking through their new device, scientists shine a laser through it. The beam of light passes through an ultrathin sample of glass covered in nanoscale structures made of a common semiconductor material called gallium arsenide.

"It scrambles all the optical fields," said Sandia senior scientist Igal Brener, an expert in a field called nonlinear optics who led the Sandia team. Occasionally, he said, a pair of entangled photons at different wavelengths emerge from the sample in the same direction as the incoming laser beam.

Brener said he is excited about this device because it is designed to produce complex webs of entangled photons -- not just one pair at a time, but several pairs all entangled together, and some that can be indistinguishable from each other. Some technologies need these complex varieties of so-called multi-entanglement for sophisticated information processing schemes.

Other miniature technologies based on silicon photonics can also entangle photons but without the much-needed level of complex, multi-entanglement. Until now, the only way to produce such results was with multiple tables full of lasers, specialized crystals and other optical equipment.

"It is quite complicated and kind of intractable when this multi-entanglement needs more than two or three pairs," Brener said. "These nonlinear metasurfaces essentially achieve this task in one sample when before it would have required incredibly complex optical setups."

The Science paper outlines how the team successfully tuned their metasurface to produce entangled photons with varying wavelengths, a critical precursor to generating several pairs of intricately entangled photons simultaneously.

However, the researchers note in their paper that the efficiency of their device -- the rate at which they can generate groups of entangled photons -- is lower than that of other techniques and needs to be improved.

What is a metasurface?


A metasurface is a synthetic material that interacts with light and other electromagnetic waves in ways conventional materials can't. Commercial industries, said Brener, are busy developing metasurfaces because they take up less space and can do more with light than, for instance, a traditional lens.

"You now can replace lenses and thick optical elements with metasurfaces," Brener said. "Those types of metasurfaces will revolutionize consumer products."

Sandia is one of the leading institutions in the world performing research in metasurfaces and metamaterials. Between its Microsystems Engineering, Science and Applications complex, which manufactures compound semiconductors, and the nearby Center for Integrated Nanotechnologies, researchers have access to all the specialized tools they need to design, fabricate and analyze these ambitious new materials.

"The work was challenging as it required precise nanofabrication technology to obtain the sharp, narrowband optical resonances that seeds the quantum process of the work," said Sylvain Gennaro, a former postdoctoral researcher at Sandia who worked on several aspects of the project.

The device was designed, fabricated and tested through a partnership between Sandia and a research group led by physicist Maria Chekhova, an expert in the quantum entanglement of photons at the Max Planck Institute for the Science of Light.

"Metasurfaces are leading to a paradigm shift in quantum optics, combining ultrasmall sources of quantum light with far reaching possibilities for quantum state engineering," said Tomás Santiago-Cruz, a member of the Max Plank team and first author on the paper.

Brener, who has studied metamaterials for more than a decade, said this newest research could possibly spark a second revolution -- one that sees these materials developed not just as a new kind of lens, but as a technology for quantum information processing and other new applications.

"There was one wave with metasurfaces that is already well established and on its way. Maybe there is a second wave of innovative applications coming," he said.

Read more at Science Daily

Pace as important as 10,000 steps for health

10,000 steps a day is the 'sweet spot' for lowered risk of disease and death, but how fast you walk could be just as important, according to new research.

The studies, published in the journals JAMA Internal Medicine and JAMA Neurology, monitored 78,500 adults with wearable trackers -- making these the largest studies to objectively track step count in relation to health outcomes.

The researchers from the University of Sydney, Australia, and the University of Southern Denmark found that lower risks of dementia, heart disease, cancer and death are associated with achieving 10,000 steps a day. However, a faster stepping pace, like a power walk, showed benefits above and beyond the number of steps achieved.

"The take-home message here is that for protective health benefits people could not only ideally aim for 10,000 steps a day but also aim to walk faster," said co-lead author Dr Matthew Ahmadi, Research Fellow at the University of Sydney's Charles Perkins Centre and Faculty of Medicine and Health.

"For less active individuals, our study also demonstrates that as low as 3,800 steps a day can cut the risk of dementia by 25 percent," said co-lead author Associate Professor Borja del Pozo Cruz from the University of Southern Denmark and senior researcher in health at the University of Cadiz.

Key points:
 

  • Every 2,000 steps lowered the risk of premature death incrementally by 8 to 11 percent, up to approximately 10,000 steps a day (see the illustrative calculation after this list).
  • Similar associations were seen for cardiovascular disease and cancer incidence.
  • A higher number of steps per day was associated with a lower risk of all-cause dementia.
  • 9,800 steps a day was the optimal dose, linked to a 50 percent lower risk of dementia; risk was still reduced by 25 percent at as low as 3,800 steps a day.
  • Stepping intensity or a faster pace showed beneficial associations for all outcomes (dementia, heart disease, cancer and death) over and above total daily steps.
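
To make the first bullet point concrete, here is a toy Python calculation (not taken from the studies themselves) that compounds an 8 to 11 percent reduction for each additional 2,000 steps, assuming the reductions multiply and apply from a near-sedentary baseline, an assumption the press release does not spell out.

def cumulative_risk_reduction(steps, reduction_per_2000):
    # Compound one 'reduction_per_2000' for every full 2,000-step increment,
    # capping the assumed benefit near 10,000 steps as described above.
    increments = min(steps, 10_000) // 2_000
    hazard_ratio = (1 - reduction_per_2000) ** increments
    return 1 - hazard_ratio

for reduction in (0.08, 0.11):  # the 8-11 percent range quoted in the key points
    print(f"{reduction:.0%} per 2,000 steps -> "
          f"{cumulative_risk_reduction(10_000, reduction):.0%} lower risk at 10,000 steps")

Under these assumptions the implied reduction at 10,000 steps works out to roughly 34 to 44 percent relative to the baseline; the studies' actual dose-response estimates may of course differ.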


"Step count is easily understood and widely used by the public to track activity levels thanks to the growing popularity of fitness trackers and apps, but rarely do people think about the pace of their steps," said senior author Emmanuel Stamatakis, Professor of Physical Activity, Lifestyle and Population Health at the University of Sydney.

"Findings from these studies could inform the first formal step-based physical activity guidelines and help develop effective public health programs aimed at preventing chronic disease."

How was the study conducted?

The study drew on data from UK Biobank to link up step count data from 78,500 UK adults aged 40 to 79 years with health outcomes 7 years on. Participants wore a wrist accelerometer to measure physical activity over a period of 7 days (minimum 3 days, including a weekend day and monitoring during sleep periods).

With ethics consent, this information was linked with participants' health records through several data sources and registries including inpatient hospital, primary care records, and cancer and death registries.

Only those who were free of cardiovascular disease, cancer or dementia at baseline and disease-free in the first two years of the study were included in the final assessment. Statistical adjustments were also made for confounders, such as the fact that people who do more steps generally walk faster.
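
For readers curious what this kind of analysis looks like in code, the sketch below is a minimal example, not the authors' pipeline. It fits a Cox proportional-hazards model with the open-source lifelines library, assuming a hypothetical, already-linked table whose column names (daily steps, peak cadence, follow-up years, event flag and basic confounders) are placeholders rather than the actual UK Biobank field names.

import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical pre-linked dataset; the file name and columns are placeholders,
# and all columns are assumed to be numeric.
df = pd.read_csv("biobank_steps_cohort.csv")

# Rough proxy for the exclusions described above: drop participants with
# disease at baseline or with less than two years of follow-up.
df = df[(df["prevalent_disease"] == 0) & (df["followup_years"] > 2)]

cph = CoxPHFitter()
cph.fit(
    df[["daily_steps", "peak_cadence", "age", "sex", "followup_years", "event"]],
    duration_col="followup_years",  # time to diagnosis/death or censoring, in years
    event_col="event",              # 1 = outcome occurred, 0 = censored
)
cph.print_summary()                 # hazard ratios for steps, pace and the confounders

The published analyses were more elaborate, but the overall structure, accelerometer-derived exposures linked to registry outcomes with adjustment for confounders, follows this pattern.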

The researchers note that the studies are observational, meaning they cannot show direct cause and effect; however, they point to the strong and consistent associations seen across both studies at the population level.

"The size and scope of these studies using wrist-worn trackers makes it the most robust evidence to date suggesting that 10,000 steps a day is the sweet spot for health benefits and walking faster is associated with additional benefits," said Dr Matthew Ahmadi.

Read more at Science Daily

The gene to which we owe our big brain

ARHGAP11B -- this complex name is given to a gene that is unique to humans and plays an essential role in the development of the neocortex. The neocortex is the part of the brain to which we owe our high mental abilities. A team of researchers from the German Primate Center (DPZ) -- Leibniz Institute for Primate Research in Göttingen, the Max Planck Institute for Molecular Cell Biology and Genetics (MPI-CBG) in Dresden, and the Hector Institute for Translational Brain Research (HITBR) in Mannheim has investigated the importance of ARHGAP11B in neocortex development during human evolution.

To do this, the team introduced for the first time a gene that exists only in humans into laboratory-grown brain organoids from our closest living relatives, chimpanzees. In the chimpanzee brain organoid, the ARHGAP11B gene led to an increase in brain stem cells relevant to brain growth and an increase in those neurons that play a critical role in the extraordinary mental abilities of humans. If, on the other hand, the ARHGAP11B gene was switched off in human brain organoids, the quantity of these brain stem cells fell to the level of a chimpanzee. Thus, the research team was able to show that the ARHGAP11B gene played a crucial role in the evolution of the brain from our ancestors to modern humans.

Animal studies on great apes have long been banned in Europe for ethical reasons. For the question pursued here, so-called organoids, i.e. three-dimensional cell structures a few millimeters in size that are grown in the laboratory, are an alternative to animal experiments. These organoids can be produced from pluripotent stem cells, which then differentiate into specific cell types, such as nerve cells. In this way, the research team was able to produce both chimpanzee brain organoids and human brain organoids. "These brain organoids allowed us to investigate a central question concerning ARHGAP11B," says Wieland Huttner of the MPI-CBG, one of the three lead authors of the study.

"In a previous study we were able to show that ARHGAP11B can enlarge a primate brain. However, it was previously unclear whether ARHGAP11B had a major or minor role in the evolutionary enlargement of the human neocortex," says Wieland Huttner. To clarify this, the ARGHAP11B gene was first inserted into brain ventricle-like structures of chimpanzee organoids. Would the ARGHAP11B gene lead to the proliferation of those brain stem cells in the chimpanzee brain that are necessary for the enlargement of the neocortex? "Our study shows that the gene in chimpanzee organoids causes an increase in relevant brain stem cells and an increase in those neurons that play a crucial role in the extraordinary mental abilities of humans," said Michael Heide, the study's lead author, who is head of the Junior Research Group Brain Development and Evolution at the DPZ and employee at the MPI-CBG. When the ARGHAP11B gene was knocked out in human brain organoids or the function of the ARHGAP11B protein was inhibited, the amount of these brain stem cells decreased to the level of a chimpanzee. "We were thus able to show that ARHGAP11B plays a crucial role in neocortex development during human evolution," says Michael Heide. Julia Ladewig of HITBR, the third of the lead authors, adds: "Given this important role of ARHGAP11B, it is furthermore conceivable that certain maldevelopments of the neocortex may be caused by mutations in this gene."

From Science Daily

What killed dinosaurs and other life on Earth?

Determining what killed the dinosaurs 66 million years ago at the end of the Cretaceous Period has long been the topic of debate, as scientists set out to determine what caused the five mass extinction events that reshaped life on planet Earth in a geological instant. Some scientists argue that comets or asteroids that crashed into Earth were the most likely agents of mass destruction, while others argue that large volcanic eruptions were the cause. A new Dartmouth-led study published in the Proceedings of the National Academy of Sciences (PNAS) reports that volcanic activity appears to have been the key driver of mass extinctions.

The findings provide the most compelling quantitative evidence so far that the link between major volcanic eruptions and wholesale species turnover is not simply a matter of chance.

Four of the five mass extinctions are contemporaneous with a type of volcanic outpouring called a flood basalt, the researchers say. These eruptions flood vast areas -- even an entire continent -- with lava in the blink of a geological eye, a mere million years. They leave behind giant fingerprints as evidence -- extensive regions of step-like, igneous rock (solidified from the erupted lava) that geologists call "large igneous provinces."

To count as "large," a large igneous province must contain at least 100,000 cubic kilometers of magma. For context, the 1980 eruption of Mount St. Helens involved less than one cubic kilometer of magma. The researchers say that most of the volcanoes represented in the study erupted on the order of a million times more lava than that.

The team drew on three well-established datasets on the geologic time scale, paleobiology, and large igneous provinces to examine the temporal connection between mass extinctions and large igneous provinces.

"The large step-like areas of igneous rock from these big volcanic eruptions seem to line up in time with mass extinctions and other significant climactic and environmental events,"says lead author Theodore Green '21, who conducted this research as part of the Senior Fellowship program at Dartmouth and is now a graduate student at Princeton.

In fact, a series of eruptions in present-day Siberia triggered the most destructive of the mass extinctions about 252 million years ago, releasing a gigantic pulse of carbon dioxide into the atmosphere and nearly choking off all life. Bearing witness are the Siberian Traps, a large region of volcanic rock roughly the size of Australia.

Volcanic eruptions also rocked the Indian subcontinent around the time of the great dinosaur die-off, creating what is known today as the Deccan plateau. This, much like the asteroid strike, would have had far-reaching global effects, blanketing the atmosphere in dust and toxic fumes, asphyxiating dinosaurs and other life in addition to altering the climate on long time scales.

On the other hand, the researchers say, the theories in favor of annihilation by asteroid impact hinge upon the Chicxulub impactor, a space rock that crash-landed into Mexico's Yucatan Peninsula around the same time that the dinosaurs went extinct.

"All other theories that attempted to explain what killed the dinosaurs, including volcanism, got steamrolled when the Chicxulub impact crater was discovered," says co-author Brenhin Keller, an assistant professor of earth sciences at Dartmouth. But there's very little evidence of similar impact events that coincide with the other mass extinctions despite decades of exploration, he points out.

At Dartmouth, Green set out to find a way to quantify the apparent link between eruptions and extinctions and test whether the coincidence was just chance or whether there was evidence of a causal relationship between the two. Working with Keller and co-author Paul Renne, professor-in-residence of earth and planetary science at University of California, Berkeley and director of the Berkeley Geochronology Center, Green recruited the supercomputers at the Dartmouth Discovery Cluster to crunch the numbers.

The researchers compared the best available estimates of flood basalt eruptions with periods of drastic species kill-off in the geological timescale, including but not limited to the five mass extinctions. To test whether the timing was more than random chance, they examined whether the eruptions would line up just as well with a randomly generated pattern, and repeated the exercise with 100 million such patterns. They found that the agreement with extinction periods was far greater than random chance.
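
The logic of that test can be illustrated with a short Monte Carlo sketch. This is not the Dartmouth code: the extinction and eruption ages below are approximate round values used only for illustration, and the trial count is far smaller than the 100 million used in the study.

import numpy as np

rng = np.random.default_rng(0)

extinctions = np.array([444, 372, 252, 201, 66])     # the "Big Five," approximate ages in Ma
eruptions = np.array([510, 374, 252, 201, 66, 16])   # a handful of flood basalt provinces, approximate Ma

def mismatch(eruption_ages, extinction_ages):
    # Sum of each extinction's gap to its nearest eruption (smaller means a closer match).
    return sum(np.min(np.abs(eruption_ages - t)) for t in extinction_ages)

observed = mismatch(eruptions, extinctions)

# Null model: the same number of eruptions scattered at random over the last 540 Myr.
n_trials = 100_000
random_scores = np.array([
    mismatch(rng.uniform(0, 540, size=eruptions.size), extinctions)
    for _ in range(n_trials)
])

p_value = np.mean(random_scores <= observed)
print(f"observed mismatch: {observed:.0f} Myr; "
      f"fraction of random patterns matching at least as well: {p_value:.5f}")

If very few random eruption patterns match the extinction record as closely as the real one does, chance becomes an unconvincing explanation, which is the essence of the result reported above.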

"While it is difficult to determine if a particular volcanic outburst caused one particular mass extinction, our results make it hard to ignore the role of volcanism in extinction," says Keller. If a causal link were to be found between volcanic flood basalts and mass extinctions, scientists expect that larger eruptions would entail more severe extinctions, but such a correlation has not been observed.

Rather than considering the absolute magnitude of eruptions, the research team ordered the volcanic events by the rate at which they spewed lava. They found that the volcanic events with the highest eruptive rates did indeed cause the most destruction, producing more severe extinctions, up to and including the mass extinctions.

"Our results indicate that in all likelihood there would have been a mass extinction at the Cretaceous tertiary boundary of some significant magnitude, regardless of whether there was an impact or not, which can be shown more quantitatively now," says Renne. "The fact that there was an impact undoubtedly made things worse."

The researchers ran the numbers for asteroids too. The coincidence of impacts with periods of species turnover was significantly weaker, and dramatically worsened when the Chicxulub impactor was not considered, suggesting that other smaller known impactors did not cause significant extinctions.

The eruption rate of the Deccan Traps in India suggests that the stage was set for widespread extinction even without the asteroid, says Green. The impact was the double whammy that loudly sounded the death knell for the dinosaurs, he adds.

Flood basalt eruptions aren't common in the geologic record, says Green. The last one of comparable but significantly smaller scale happened about 16 million years ago in the Pacific Northwest.

Read more at Science Daily

Sep 12, 2022

Could more of Earth's surface host life?

Of all known planets, Earth is as friendly to life as any planet could possibly be -- or is it? A new study shows that if Jupiter's orbit were to change, Earth could be more hospitable than it is today.

When a planet has a perfectly circular orbit around its star, the distance between the star and the planet never changes. Most planets, however, have "eccentric" orbits around their stars, meaning the orbit is oval-shaped. When the planet gets closer to its star, it receives more heat, affecting the climate.

Using detailed models based on data from the solar system as it is known today, UC Riverside researchers created an alternative solar system. In this theoretical system, they found that if gigantic Jupiter's orbit were to become more eccentric, it would in turn induce big changes in the shape of Earth's orbit.

"If Jupiter's position remained the same, but the shape of its orbit changed, it could actually increase this planet's habitability," said Pam Vervoort, UCR Earth and planetary scientist and lead study author.

Between zero and 100 degrees Celsius, the Earth's surface is habitable for multiple known life forms. If Jupiter pushed Earth's orbit to become more eccentric, parts of the Earth would sometimes get closer to the sun. Parts of the Earth's surface that are now sub-freezing would get warmer, increasing temperatures in the habitable range.
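
A small worked example, separate from the UC Riverside models, shows why eccentricity matters: starlight falls off with the square of distance, so a planet on an orbit with semi-major axis a and eccentricity e swings between a distance of a(1 - e) at perihelion and a(1 + e) at aphelion, and the flux it receives swings accordingly.

def flux_range(a_au, e, flux_at_1au=1361.0):  # about 1361 W/m^2 reaches 1 AU from the Sun
    # Return the stellar flux at perihelion and aphelion for the given orbit.
    r_perihelion = a_au * (1 - e)
    r_aphelion = a_au * (1 + e)
    return flux_at_1au / r_perihelion**2, flux_at_1au / r_aphelion**2

for e in (0.017, 0.1, 0.3):  # Earth's orbit today, then two more eccentric cases
    peak, low = flux_range(1.0, e)
    print(f"e = {e:.3f}: {peak:.0f} W/m^2 at perihelion, {low:.0f} W/m^2 at aphelion")

Even a modest eccentricity therefore produces a sizeable swing in the energy the planet receives over each orbit, which is the lever a more eccentric Jupiter could pull on Earth's climate.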

This result, now published in the Astronomical Journal, upends two long-held scientific assumptions about our solar system.

"Many are convinced that Earth is the epitome of a habitable planet and that any change in Jupiter's orbit, being the massive planet it is, could only be bad for Earth," Vervoort said. "We show that both assumptions are wrong."

The researchers are interested in applying this finding to the search for habitable planets around other stars, called exoplanets.

"The first thing people look for in an exoplanet search is the habitable zone, the distance between a star and a planet to see if there's enough energy for liquid water on the planet's surface," said Stephen Kane, UCR astrophysicist and study co-author.

During its orbit, different parts of a planet receive more or fewer direct rays, resulting in the planet having seasons. Parts of the planet may be pleasant during one season, and extremely hot or cold in another.

"Having water on its surface a very simple first metric, and it doesn't account for the shape of a planet's orbit, or seasonal variations a planet might experience," Kane said.

Existing telescopes are capable of measuring a planet's orbit. However, there are additional factors that could affect habitability, such as the degree to which a planet is tilted toward or away from a star. The part of the planet tilted away from the star would get less energy, causing it to be colder.

This same study found that if Jupiter were positioned much closer to the sun, it would induce extreme tilting on Earth, which would make large sections of the Earth's surface sub-freezing.

It is more difficult to measure tilt, or a planet's mass, so the researchers would like to work toward methods that help them estimate those factors as well.

Ultimately, the movement of a giant planet is important in the quest to make predictions about the habitability of planets in other systems as well as the quest to understand its influence in this solar system.

Read more at Science Daily

Surprising discovery shows a slowing of continental plate movement controlled the timing of Earth's largest volcanic events

Scientists have shed new light on the timing and likely cause of major volcanic events that occurred millions of years ago and caused such climatic and biological upheaval that they drove some of the most devastating extinction events in Earth's history.

Surprisingly, the new research, published today in the journal Science Advances, suggests a slowing of continental plate movement was the critical event that enabled magma to rise to the Earth's surface and deliver the devastating knock-on impacts.

Earth's history has been marked by major volcanic events, called Large Igneous Provinces (LIPs) -- the largest of which have caused major increases in atmospheric carbon emissions that warmed Earth's climate, drove unprecedented changes to ecosystems, and resulted in mass extinctions on land and in the oceans.

Using chemical data from ancient mudstone deposits obtained from a 1.5 km-deep borehole in Wales, an international team led by scientists from Trinity College Dublin's School of Natural Sciences was able to link two key events from around 183 million years ago (the Toarcian period).

The team discovered that this time period, which was characterised by some of the most severe climatic and environmental changes ever, directly coincided with the occurrence of major volcanic activity and associated greenhouse gas release in the southern hemisphere, in what is nowadays known as southern Africa, Antarctica and Australia.

On further investigation -- and more importantly -- the team's plate reconstruction models helped them discover the key fundamental geological process that seemed to control the timing and onset of this volcanic event and others of great magnitude.

Micha Ruhl, Assistant Professor in Trinity's School of Natural Sciences, led the team. He said:

"Scientists have long thought that the onset of upwelling of molten volcanic rock, or magma, from deep in Earth's interior, as mantle plumes, was the instigator of such volcanic activity but the new evidence shows that the normal rate of continental plate movement of several centimetres per year effectively prevents magma from penetrating Earth's continental crust.

"It seems it is only when the speed of continental plate movement slows down to near zero that magmas from mantle plumes can effectively make their way to the surface, causing major large igneous province volcanic eruptions and their associated climatic perturbations and mass extinctions.

"Crucially, further assessment shows that a reduction in continental plate movement likely controlled the onset and duration of many of the major volcanic events throughout Earth's history, making it a fundamental process in controlling the evolution of climate and life at Earth's surface throughout the history of this planet."

The study of past global change events, such as in the Toarcian, allows scientists to disentangle the different processes that control the causes and consequences of global carbon cycle change and constrain fundamental Earth system processes that control tipping points in Earth's climate system.

Read more at Science Daily

Synapse-related genes in microglia are changed by contextual fear conditioning

Microglia act as the first line of defense in the central nervous system, constantly scanning for pathogens and abnormalities and releasing small proteins called cytokines to help ward off infections. Previous research has shown that, in mice conditioned to fear a particular environment (contextual fear conditioning), microglia play a pivotal role in transferring traumatic memories from short-term to long-term memory (fear memory consolidation), and in the memories' subsequent extinction.

Now, Tohoku University scientists have demonstrated that microglial genes associated with the synapse -- structures that allow neurons to pass signals to one another -- undergo changes in response to the consolidation and extinction of contextual fear conditioning. This suggests that microglia and neurons crosstalk via 'non-immune' functions and clarifies the mechanisms linking microglia and neuronal activity related to fear conditioning.

Details of their research were published in the journal Brain Research Bulletin on August 18, 2022.

Dr Zhiqian Yu and Professor Hiroaki Tomita from Tohoku University's Graduate School of Medicine and their team had previously revealed that when mice were subjected to chronic and acute stress, their microglia released a type of cytokine known as TNF-α, which is used by the immune system for cell signaling. TNF-α increased during fear memory consolidation but returned to baseline levels after extinction. Hippocampal TNF-α, furthermore, blocks the retrieval and reconsolidation of contextual fear and spatial memories.

Building on the previous study, the team applied microarray analysis to microglia from mice exposed to contextual fear conditioning. They showed that synapse-related genes in microglia are changed by contextual fear conditioning. However, they also discovered that consolidating the fear memory induced immune dysfunction in microglia that did not recover even during the process of extinction.

Among the plethora of synaptic function-related genes in microglia, gamma-aminobutyric acid (GABA) and its receptors (GABARs) form one of the earliest neurotransmitter systems to emerge during development. The GABRB3 gene encodes the β3 subunit of the GABAA receptor and is implicated in neurological disorders such as epilepsy and autism.

"Using real-time PCR and immune stain technologies, we found that GABRB3 was expressed in microglial cytoplasm and the long branching processes of the hippocampus," said Yu. "The mRNA and protein levels of GABRB3 changed significantly after fear memory consolidation but recovered after extinction."

Additionally, the researchers investigated a family of proteins called synapsins, which regulate the release of neurotransmitters at the synapse. Microglial synapsin transcripts were detected in the MG-6 cell line and in primary microglia, and their levels increased during fear memory consolidation but recovered after fear memory extinction.

Read more at Science Daily

Stone age surgery: Earliest evidence of amputation found

A team of Indonesian and Australian researchers has uncovered in Borneo the oldest known case of surgical amputation. The find represents a remarkable feat in human prehistory.

The discovery, published in Nature, describes the skeletal remains of a young adult found in a cave in Borneo, who had part of the left lower leg and left foot amputated, probably as a child, at least 31,000 years ago. The person survived the surgical procedure, living for at least another six to nine years.

It is notoriously difficult to prevent infections in surgical amputations, even to this day. Yet 30,000 years ago a community was able to successfully navigate veins, arteries, nerves, and tissue, and keep the wound clean so that it healed successfully. The individual went on to live into adulthood, until an unknown cause eventually led to their death.

Dr Melandri Vlok, a bioarchaeologist and expert in ancient skeletons at the University of Sydney, said the find is "incredibly exciting and unexpected."

"The discovery implies that at least some modern human foraging groups in tropical Asia had developed sophisticated medical knowledge and skills long before the Neolithic farming transition," said Dr Vlok, who is co-lead author of the paper and a postdoctoral research associate in Sydney Southeast Asia Centre.

Studying bones

The skeleton of the young adult, possibly in their 20s when they died, was carefully buried within Liang Tebo cave -- located in East Kalimantan, Borneo, in a limestone karst area that harbours some of the world's earliest dated rock art.

The bones were uncovered by archaeologists from Griffith University and University of Western Australia (UWA) just days before borders closed for the COVID-19 pandemic in March 2020. The team was led by Professor Maxime Aubert and Dr Tim Maloney (Griffith University), Dr India Dilkes-Hall (UWA) and Mr Andika Priyatno from the Kalimantan Timur Cultural Heritage Preservation Centre.

The University of Sydney's Dr Vlok was invited to study the bones when they were brought back to Australia.

"No one told me they had not found the left foot in the grave," Dr Vlok said. "They kept it hidden from me to see what I would find."

As Dr Vlok laid the bones out, the left leg looked withered, and was the size of a child's, but the individual was an adult. She unwrapped the part of the leg that contained the stump and noticed the cut was clean, well healed and had no evidence of any infection. "The chances the amputation was an accident was so infinitely small," Dr Vlok said. "The only conclusion was this was stone age surgery."

Dr Vlok ran to the office to tell her research colleagues what she had found. "I told them I thought it looked like a surgical amputation," she said. "It wasn't until then that they said they already knew the foot was missing." Dr Vlok had just confirmed their suspicions. The foot was never placed in the grave to begin with.

An accident

While it is not entirely clear what led to the amputation, the individual also had a very well healed neck fracture and trauma to their collar bone that may have occurred during the same event, said Dr Vlok.

"An accident, such as a rock fall may have caused the injuries, and it was clearly recognised by the community that the foot had to be taken off for the child to survive," she said.

"It is an extremely rugged environment with steep mountains dotted with caves containing some of the oldest paintings created by our species," said Professor Aubert.

Archaeologists including excavation lead Dr Tim Maloney had to kayak into the valley and scale the enormous cliff to get into the cave, proving just how remarkable it was for someone with only one leg to have survived in such challenging terrain.

Read more at Science Daily