Sep 17, 2024

A wobble from Mars could be sign of dark matter

In a new study, MIT physicists propose that if most of the dark matter in the universe is made up of microscopic primordial black holes -- an idea first proposed in the 1970s -- then these gravitational dwarfs should zoom through our solar system at least once per decade. A flyby like this, the researchers predict, would introduce a wobble into Mars' orbit, to a degree that today's technology could actually detect.

Such a detection could lend support to the idea that primordial black holes are a primary source of dark matter throughout the universe.

"Given decades of precision telemetry, scientists know the distance between Earth and Mars to an accuracy of about 10 centimeters," says study author David Kaiser, professor of physics and the Germeshausen Professor of the History of Science at MIT. "We're taking advantage of this highly instrumented region of space to try and look for a small effect. If we see it, that would count as a real reason to keep pursuing this delightful idea that all of dark matter consists of black holes that were spawned in less than a second after the Big Bang and have been streaming around the universe for 14 billion years."

Kaiser and his colleagues report their findings today in the journal Physical Review D. The study's co-authors are lead author Tung Tran '24, who is now a graduate student at Stanford University; Sarah Geller '12, SM '17, PhD '23, who is now a postdoc at the University of California at Santa Cruz; and MIT Pappalardo Fellow Benjamin Lehmann.

Beyond particles

Less than 20 percent of all physical matter is made of visible stuff, from stars and planets to the kitchen sink. The rest is composed of dark matter, a hypothetical form of matter that is invisible across the entire electromagnetic spectrum yet is thought to pervade the universe and exert a gravitational force large enough to affect the motion of stars and galaxies.

Physicists have erected detectors on Earth to try to spot dark matter and pin down its properties. For the most part, these experiments assume that dark matter exists as a form of exotic particle that might scatter and decay into observable particles as it passes through a given experiment. But so far, such particle-based searches have come up empty.

In recent years, another possibility, first introduced in the 1970s, has regained traction: Rather than taking on a particle form, dark matter could exist as microscopic, primordial black holes that formed in the first moments following the Big Bang. Unlike the astrophysical black holes that form from the collapse of old stars, primordial black holes would have formed from the collapse of dense pockets of gas in the very early universe and would have scattered across the cosmos as the universe expanded and cooled.

These primordial black holes would have packed an enormous amount of mass into a tiny space. Most of these primordial black holes could be as small as a single atom yet as heavy as the largest asteroids. It is conceivable, then, that such tiny giants could collectively account for at least a portion of dark matter. For the MIT team, this possibility raised an initially frivolous question.

"I think someone asked me what would happen if a primordial black hole passed through a human body," recalls Tung, who did a quick pencil-and-paper calculation to find that if such a black hole zinged within 1 meter of a person, the force of the black hole would push the person 6 meters, or about 20 feet away in a single second. Tung also found that the odds were astronomically unlikely that a primordial black hole would pass anywhere near a person on Earth.

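That ballpark figure follows from a simple gravitational impulse estimate. The sketch below is a minimal reconstruction using assumed, illustrative numbers for the black hole mass and flyby speed, not the authors' actual calculation:

```python
# Back-of-the-envelope impulse estimate (assumed numbers, not the study's own):
# a mass M passing at impact parameter b with speed v delivers a velocity kick
# of roughly delta_v ~ 2*G*M / (b*v).
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M = 1e16        # assumed primordial black hole mass, kg (asteroid-like)
b = 1.0         # assumed closest approach to the person, m
v = 2.4e5       # assumed flyby speed, m/s (typical galactic dark matter speed)

delta_v = 2 * G * M / (b * v)
print(f"velocity kick ~ {delta_v:.1f} m/s, i.e. roughly {delta_v:.0f} m of displacement in the first second")
```

With these assumptions the kick comes out to a few metres per second, consistent with the several-metre shove described above.
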
Their interest piqued, the researchers took Tung's calculations a step further, to estimate how a black hole flyby might affect much larger bodies such as the Earth and the moon.

"We extrapolated to see what would happen if a black hole flew by Earth and caused the moon to wobble by a little bit," Tung says. "The numbers we got were not very clear. There are many other dynamics in the solar system that could act as some sort of friction to cause the wobble to dampen out."

Close encounters

To get a clearer picture, the team generated a relatively simple simulation of the solar system that incorporates the orbits of, and gravitational interactions among, all the planets and some of the largest moons.

"State-of-the-art simulations of the solar system include more than a million objects, each of which has a tiny residual effect," Lehmann notes. "But even modeling two dozen objects in a careful simulation, we could see there was a real effect that we could dig into."

The team worked out the rate at which a primordial black hole should pass through the solar system, based on the amount of dark matter that is estimated to reside in a given region of space and the mass of a passing black hole, which in this case, they assumed to be as massive as the largest asteroids in the solar system, consistent with other astrophysical constraints.

"Primordial black holes do not live in the solar system. Rather, they're streaming through the universe, doing their own thing," says co-author Sarah Geller. "And the probability is, they're going through the inner solar system at some angle once every 10 years or so."

Given this rate, the researchers simulated various asteroid-mass black holes flying through the solar system, from various angles, and at velocities of about 150 miles per second. (The directions and speeds come from other studies of the distribution of dark matter throughout our galaxy.) They zeroed in on those flybys that appeared to be "close encounters," or instances that caused some sort of effect in surrounding objects. They quickly found that any effect in the Earth or the moon was too uncertain to pin to a particular black hole. But Mars seemed to offer a clearer picture.
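
For orientation, the once-per-decade figure can be reproduced with a simple flux-times-area estimate. The following sketch uses assumed, rounded numbers for the dark matter density, black hole mass and crossing region, not the values from the paper:

```python
import math

# Order-of-magnitude flyby-rate estimate (all numbers are illustrative assumptions):
# if primordial black holes (PBHs) make up the local dark matter density, the rate of
# crossings of a region is (number density) x (speed) x (cross-sectional area).
rho_dm = 6e-22          # assumed local dark matter density, kg/m^3 (~0.35 GeV/cm^3)
m_pbh  = 3e16           # assumed PBH mass, kg (within the "asteroid-mass" window)
v      = 2.4e5          # assumed typical speed, m/s (~150 miles per second)
r      = 3 * 1.496e11   # assumed radius of the inner solar system region, m (~3 AU)

n_pbh = rho_dm / m_pbh                    # PBH number density, 1/m^3
rate  = n_pbh * v * math.pi * r**2        # crossings per second
print(f"~{rate * 3.156e7 * 10:.1f} crossings of the inner solar system per decade")
```

With these assumptions the estimate lands at roughly one crossing per decade; heavier black holes would be correspondingly rarer, lighter ones more frequent.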

The researchers found that if a primordial black hole were to pass within a few hundred million miles of Mars, the encounter would set off a "wobble," or a slight deviation in Mars' orbit. Within a few years of such an encounter, Mars' orbit should shift by about a meter -- an incredibly small wobble, given the planet is more than 140 million miles from Earth. And yet, this wobble could be detected by the various high-precision instruments that are monitoring Mars today.

If such a wobble were detected in the next couple of decades, the researchers acknowledge there would still be much work needed to confirm that the push came from a passing black hole rather than a run-of-the-mill asteroid.

"We need as much clarity as we can of the expected backgrounds, such as the typical speeds and distributions of boring space rocks, versus these primordial black holes," Kaiser notes. "Luckily for us, astronomers have been tracking ordinary space rocks for decades as they have flown through our solar system, so we could calculate typical properties of their trajectories and begin to compare them with the very different types of paths and speeds that primordial black holes should follow."

To help with this, the researchers are exploring the possibility of a new collaboration with a group that has extensive expertise simulating many more objects in the solar system.

"We are now working to simulate a huge number of objects, from planets to moons and rocks, and how they're all moving over long time scales," Geller says. "We want to inject close encounter scenarios, and look at their effects with higher precision."

Read more at Science Daily

Critical crops' alternative way to succeed in heat and drought

Scientists have discovered that certain plants can survive stressful, dry conditions by controlling water loss through their leaves without relying on their usual mechanism -- tiny pores known as 'stomata'.

Nonstomatal control of transpiration in maize, sorghum, and proso millet -- all C4 crops, which are critical for global food security -- gives these plants an advantage in maintaining a beneficial microclimate for photosynthesis within their leaves.

This allows the plants to absorb carbon dioxide for photosynthesis and growth, despite raised temperatures and increased atmospheric demand for water, without increasing their water expenditure.

Publishing their findings in PNAS, researchers from the University of Birmingham, Australian National University, Canberra, and James Cook University, Cairns, challenge traditional understanding of plant transpiration and photosynthesis under stressful and dry growing conditions -- namely that stomata alone control leaf water loss.

Co-author Dr Diego Márquez, from the University of Birmingham, commented: "This revolutionised our understanding of plant-water relations by showing that nonstomatal control of transpiration limits water loss without compromising carbon gain -- challenging what is typically accepted as an unavoidable trade-off.

"Our findings have significant implications for plant adaptation to climate change and how crops might be grown in arid environments. Understanding this mechanism could open new avenues for improving water-use efficiency in C4 crops, which are vital for global food security."

The study confirms that C4 plants maintain reduced relative humidities in the substomatal cavity, down to 80% under vapour pressure deficit (VPD) stress, reducing water loss and highlighting a critical role of nonstomatal control in water-use efficiency.

This mechanism helps plants sustain photosynthesis by reducing water loss without significantly lowering intercellular CO2 levels for photosynthesis. This is crucial for maintaining growth and ensuring that the crops thrive.
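
To see why an unsaturated substomatal cavity matters, consider the standard leaf transpiration relation, in which water loss scales with the vapour pressure difference between the leaf interior and the air. The sketch below uses illustrative values for leaf temperature, air humidity and stomatal conductance; these are assumptions, not measurements from the study:

```python
import math

# Transpiration E = g_w * (e_i - e_a) / P, where e_i is the vapour pressure inside
# the leaf and e_a that of the surrounding air. Lowering substomatal relative
# humidity from 100% to 80% shrinks the gradient, and hence water loss, at fixed
# stomatal conductance. All numbers below are assumed for illustration.
def e_sat(T_c):
    """Saturation vapour pressure (kPa), Tetens approximation."""
    return 0.6108 * math.exp(17.27 * T_c / (T_c + 237.3))

T_leaf, rh_air = 35.0, 0.30   # assumed hot, dry conditions
g_w, P = 0.2, 101.3           # assumed conductance (mol m^-2 s^-1) and pressure (kPa)
e_a = rh_air * e_sat(T_leaf)  # vapour pressure of the surrounding air

for rh_sub in (1.00, 0.80):   # saturated vs. unsaturated substomatal cavity
    e_i = rh_sub * e_sat(T_leaf)
    E = g_w * (e_i - e_a) / P
    print(f"substomatal RH {rh_sub:.0%}: E = {E * 1e3:.1f} mmol m^-2 s^-1")
```

Under these assumed conditions, dropping the internal humidity to 80% cuts transpiration by nearly 30 percent without the stomata having to close.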

The findings also suggest that nonstomatal control mechanisms may have evolved before the divergence of C3 and C4 photosynthetic pathways, indicating a shared evolutionary trait.

"Our research reframes understanding of water-use efficiency in C4 plants and reveals that this alternative mechanism helps plants continue to grow and capture carbon dioxide, even when atmospheric water demand is high, challenging traditional assumptions about how these plants survive droughts," added Dr Márquez.

Photosynthesis is how plants use light and carbon dioxide to make sugars for growth, using an enzyme called Rubisco. Plants use the carbon dioxide that enters through open stomata to produce sugar, whilst open stomata also let water vapour out.

Read more at Science Daily

Beneath the brushstrokes, van Gogh's sky is alive with real-world physics

Vincent van Gogh's painting "The Starry Night" depicts a swirling blue sky with a yellow moon and stars. The sky is an explosion of colors and shapes, each star encapsulated in ripples of yellow, gleaming with light like reflections on water.

Van Gogh's brushstrokes create an illusion of sky movement so convincing it led atmospheric scientists to wonder how closely it aligns with the physics of real skies. While the atmospheric motion in the painting cannot be measured, the brushstrokes can.

In an article published this week in Physics of Fluids, by AIP Publishing, researchers specializing in marine sciences and fluid dynamics in China and France analyzed van Gogh's painting to uncover what they call the hidden turbulence in the painter's depiction of the sky.

"The scale of the paint strokes played a crucial role," author Yongxiang Huang said. "With a high-resolution digital picture, we were able to measure precisely the typical size of the brushstrokes and compare these to the scales expected from turbulence theories."

To reveal the hidden turbulence, the authors treated the brushstrokes in the painting like leaves swirling in a funnel of wind, using them to examine the shape, energy, and scaling of the otherwise invisible atmospheric motion. They used the relative brightness, or luminance, of the varying paint colors as a stand-in for the kinetic energy of physical movement.

"It reveals a deep and intuitive understanding of natural phenomena," Huang said. "Van Gogh's precise representation of turbulence might be from studying the movement of clouds and the atmosphere or an innate sense of how to capture the dynamism of the sky."

Their study examined the spatial scale of the painting's 14 main whirling shapes to find out if they align with the cascading energy theory that describes the kinetic energy transfer from large- to small-scale turbulent flows in the atmosphere.

They discovered the overall picture aligns with Kolmogorov's law, which describes how kinetic energy cascades from large to small scales in the inertial range of turbulent flow. Drilling down to the microcosm within the paint strokes themselves, where relative brightness is diffused throughout the canvas, the researchers discovered an alignment with Batchelor's scaling, which describes how a passive scalar, carried along by the turbulent motion, is mixed at small scales.
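
The Kolmogorov check is, in essence, a spectral-slope measurement: compute the power spectrum of the luminance field and ask whether it falls off as k^(-5/3) over intermediate scales. The sketch below is not the authors' pipeline; it builds a synthetic field with a known spectrum so the recovery step can be verified, and a real grayscale image array could be substituted for it:

```python
import numpy as np

# Build a synthetic 2D "luminance" field with a Kolmogorov-like spectrum, then
# recover the spectral slope by radially binning its power spectrum.
rng = np.random.default_rng(0)
n = 512
kx = np.fft.fftfreq(n) * n
ky = np.fft.fftfreq(n) * n
k = np.sqrt(kx[None, :] ** 2 + ky[:, None] ** 2)
k[0, 0] = 1.0  # avoid dividing by zero at the mean component

# Amplitude chosen so the shell-summed spectrum E(k) ~ k^(-5/3)
amplitude = k ** (-(5 / 3 + 1) / 2)
phases = np.exp(2j * np.pi * rng.random((n, n)))
field = np.real(np.fft.ifft2(amplitude * phases))  # replace with an image array to analyse a painting

power = np.abs(np.fft.fft2(field)) ** 2
k_bins = np.arange(1, n // 2)
E = np.array([power[(k >= kb) & (k < kb + 1)].sum() for kb in k_bins])

# Fit a log-log slope over an intermediate ("inertial") range of scales
sel = (k_bins > 5) & (k_bins < 100)
slope = np.polyfit(np.log(k_bins[sel]), np.log(E[sel]), 1)[0]
print(f"fitted spectral slope: {slope:.2f} (Kolmogorov prediction: -5/3 ~ -1.67)")
```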

Finding both scalings in one atmospheric system is rare, and it was a big driver for their research.

"Turbulence is believed to be one of the intrinsic properties of high Reynolds flows dominated by inertia, but recently, turbulence-like phenomena have been reported for different types of flow systems at a wide range of spatial scales, with low Reynolds numbers where viscosity is more dominant," Huang said.

Read more at Science Daily

Moderate coffee and caffeine consumption is associated with lower risk of developing multiple cardiometabolic diseases, new study finds

Consuming moderate amounts of coffee and caffeine regularly may offer a protective effect against developing multiple cardiometabolic diseases, including type 2 diabetes, coronary heart disease and stroke, according to new research published in the Endocrine Society’s Journal of Clinical Endocrinology & Metabolism.

Researchers found that regular coffee or caffeine intake, especially at moderate levels, was associated with a lower risk of new-onset cardiometabolic multimorbidity (CM), which refers to the coexistence of at least two cardiometabolic diseases.

The prevalence of individuals with multiple cardiometabolic diseases, or CM, is becoming an increasing public health concern as populations age around the world, notes the study.

Coffee and caffeine consumption could play an important protective role in almost all phases of CM development, researchers found.

“Consuming three cups of coffee, or 200-300 mg caffeine, per day might help to reduce the risk of developing cardiometabolic multimorbidity in individuals without any cardiometabolic disease,” said the study’s lead author Chaofu Ke, M.D., Ph.D., of the Department of Epidemiology and Biostatistics, School of Public Health at Suzhou Medical College of Soochow University, in Suzhou, China.

The study found that compared with non-consumers or consumers of less than 100 mg of caffeine per day, consumers of moderate amounts of coffee (three drinks per day) or caffeine (200-300 mg per day) had a 48.1% or 40.7% reduced risk for new-onset CM, respectively.

Ke and his colleagues based their findings on data from the UK Biobank, a large and detailed longitudinal dietary study with over 500,000 participants aged 37-73 years. The study excluded individuals who had ambiguous information on caffeine intake. The resulting pool of participants included a total of 172,315 individuals who were free of any cardiometabolic diseases at baseline for the analyses of caffeine, and a corresponding 188,091 individuals for the analyses of coffee and tea consumption.

The participants’ cardiometabolic diseases outcomes were identified from self-reported medical conditions, primary care data, linked inpatient hospital data and death registry records linked to the UK Biobank.

Coffee and caffeine intake at all levels were inversely associated with the risk of new-onset CM in participants without cardiometabolic diseases. Those who reported moderate coffee or caffeine intake had the lowest risk, the study found. Moderate coffee or caffeine intake was inversely associated with almost all developmental stages of CM.

“The findings highlight that promoting moderate amounts of coffee or caffeine intake as a dietary habit to healthy people might have far-reaching benefits for the prevention of CM,” Ke said.

Addressing a Research Gap

Numerous epidemiological studies have revealed the protective effects of coffee, tea and caffeine consumption on morbidity of single cardiometabolic diseases. However, the potential effects of these beverages on the development of CM were largely unknown.

The authors reviewed the available research on this topic and found that people with a single cardiometabolic disease may have a two-fold higher all-cause mortality risk than those free of any cardiometabolic diseases. By contrast, individuals with CM may have an almost 4- to 7-fold higher risk of all-cause mortality. The researchers also noted that people with CM may face greater risks of losing physical function and of mental stress than those with a single disease.

Read more at Science Daily

Sep 15, 2024

Invisibility cloaks? Wave scattering simulation unlocks potential for advanced metamaterials

A new software package developed by researchers at Macquarie University can accurately model the way waves -- sound, water or light -- are scattered when they meet complex configurations of particles.

This will vastly improve the ability to rapidly design metamaterials -- exciting artificial materials used to amplify, block or deflect waves.

The findings, published in the journal Proceedings of the Royal Society A on 19 June 2024, demonstrated the use of TMATSOLVER -- a multipole-based tool that models interactions between waves and particles of various shapes and properties.

The TMATSOLVER software makes it very easy to simulate arrangements of up to several hundred scatterers, even when they have complex shapes.

Lead author Dr Stuart Hawkins from Macquarie University's Department of Mathematics and Statistics says the software uses the transition matrix (T-matrix) -- a grid of numbers that fully describes how a certain object scatters waves.

"The T-matrix has been used since the 1960s, but we've made a big step forward in accurately computing the T-matrix for particles much larger than the wavelength, and with complex shapes," says Dr Hawkins.

"Using TMATSOLVER, we have been able to model configurations of particles that could previously not be addressed."

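To give a flavour of what a T-matrix is, the sketch below works through the simplest textbook case: a single sound-soft circular scatterer in two dimensions, whose T-matrix is diagonal in a cylindrical-wave basis. This illustrates the general idea only, not TMATSOLVER's interface or algorithms, and the wavenumber and radius are assumed values:

```python
import numpy as np
from scipy.special import jv, hankel1

# T-matrix of a single sound-soft circular cylinder of radius a: diagonal entries
# T_n = -J_n(ka) / H_n^(1)(ka). Multiplying the incident-wave coefficients by T
# gives the scattered-wave coefficients.
k = 2 * np.pi              # assumed wavenumber (wavelength = 1 in arbitrary units)
a = 0.3                    # assumed cylinder radius
orders = np.arange(-20, 21)

T = -jv(orders, k * a) / hankel1(orders, k * a)   # diagonal T-matrix
inc = 1j ** orders                                # plane wave exp(ikx) in the J_n basis
scat = T * inc                                    # scattered-field coefficients

def total_field(x, y):
    """Incident plane wave plus scattered field at a point outside the cylinder."""
    r, theta = np.hypot(x, y), np.arctan2(y, x)
    u_scat = np.sum(scat * hankel1(orders, k * r) * np.exp(1j * orders * theta))
    return np.exp(1j * k * x) + u_scat

# For a sound-soft scatterer the total field should vanish on the boundary,
# up to truncation error in the number of retained orders.
print(abs(total_field(a, 0.0)))
```

Packages like TMATSOLVER generalise this idea to particles of complex shape, where the T-matrix must be computed numerically, and to interactions among many scatterers.
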
Dr Hawkins worked with other mathematicians from the University of Adelaide, as well as the University of Manchester and Imperial College London, both in the UK, and from the University of Augsburg and University of Bonn, both in Germany.

"It was fantastic to work on this project and incorporate the TMATSOLVER software into my research on metamaterials," says Dr Luke Bennetts, a researcher at the University of Adelaide and co-author of the article.

"It meant I could avoid the bottleneck of producing numerical computations to test metamaterial theories and allowed me to easily generalise my test cases to far more complicated geometries."

Applications in metamaterials

The researchers demonstrated the software's capabilities through four example problems in metamaterial design. These problems included arrays of anisotropic particles, high-contrast square particles, and tuneable periodic structures that slow down waves.

Metamaterials are designed to have unique properties not found in nature, letting them interact with electromagnetic, sound or other waves by controlling the size, shape and arrangement of their nanoscale structures.

Examples include super-lenses to view objects at the molecular scale; invisibility cloaks, which refract all visible light; and perfect wave absorption for energy harvesting or noise reduction.

The findings from this research, and the development of the TMATSOLVER tool, will have wide application in accelerating research and development in the growing global market for metamaterials, which can be designed for precise wave control.

"We have shown that our software can compute the T-matrix for a very wide range of particles, using the techniques most appropriate for the type of particle," Dr Hawkins says.

"This will enable rapid prototyping and validation of new metamaterial designs."

Professor Lucy Marshall, Executive Dean, Faculty of Science and Engineering at Macquarie University, says the software could accelerate new breakthroughs.

Read more at Science Daily

Microbe dietary preferences influence the effectiveness of carbon sequestration in the deep ocean

The movement of carbon dioxide (CO2) from the surface of the ocean, where it is in active contact with the atmosphere, to the deep ocean, where it can be sequestered away for decades, centuries, or longer, depends on a number of seemingly small processes.

One of these key microscale processes is the dietary preferences of bacteria that feed on organic molecules called lipids, according to a journal article, "Microbial dietary preference and interactions affect the export of lipids to the deep ocean," published in Science.

"In our study, we found incredible variation in what the different microbes preferred to digest. Bacteria seem to have very distinct diet preferences for different lipid molecules. This has real implications for understanding carbon sequestration and the biological carbon pump," said journal article co-author Benjamin Van Mooy, a senior scientist in the Marine Chemistry and Geochemistry Department at the Woods Hole Oceanographic Institution (WHOI). "This study used state-of-the-art methods to link the molecular composition of the sinking biomass with its rates of degradation, which we were able to link to the dietary preferences of bacteria." The biological carbon pump is a process where biomass sinks from the ocean surface to the deep ocean.

About 5 to 30% of surface ocean particulate organic matter is composed of lipids, which are carbon-rich fatty acid biomolecules that microbes use for energy storage and cellular functions. As the organic matter sinks to the deep sea, diverse communities of resident microbes degrade and make use of the lipids, exerting an important control on global CO2 concentrations. Understanding this process is vital to improve our ability to forecast global carbon fluxes in changing ocean regimes. Geographic areas where more lipids reach the deep ocean undegraded could be hotspots for natural carbon sequestration.

"Bacteria isolated from marine particles exhibited distinct dietary preferences, ranging from selective to promiscuous degraders," the article states. "Using synthetic communities composed of isolates with distinct dietary preferences, we showed that lipid degradation is modulated by microbial interactions. A particle export model incorporating these dynamics indicates that metabolic specialization and community dynamics may influence lipid transport efficiency in the ocean's mesopelagic zone." The mesopelagic zone extends about 200-1000 meters below the ocean surface.

"I was thrilled to see how much there is to learn about the functioning of the ocean by combining two technologies, high-end chemical analysis and microscale imaging, that have historically never been used together," said co-author Roman Stocker, professor at the Institute of Environmental Engineering, Department of Civil, Environmental and Geomatic Engineering, ETH Zurich, Switzerland. "I believe that work at the interface between the exciting technologies we now have available in microbial oceanography will continue to yield important insights into how microbes shape our oceans, now and into the future."

"Scientists are starting to understand that lipids in the ocean can vary significantly depending on different environments, such as the coast versus the open ocean, and the season," said Van Mooy. "With this information, researchers can start to consider whether there are places in the ocean where lipids sink and are sequestered very efficiently, while there may be other locations where lipids are barely sequestered at all or are very inefficiently sequestered."

"What excites me about this paper is that it shows bacteria are not just eating any type of lipid, but are very specialized and, like us, have specific food preferences," said article co-author Lars Behrendt, associate professor and SciLifeLab fellow at the Science for Life Laboratory, Department of Organismal Biology, Uppsala University, Sweden. "This changes how we think about how microorganisms consume food in their natural environment and how they might help each other or compete for the same resource. It also supports the idea that combinations of bacteria can better break down specific compounds, including lipids, or achieve other desired functions."

In addition to studying specific bacteria species in isolation, the researchers also looked at how dietary preference affects degradation rates by multispecies communities of bacteria, which they stated is ecologically more relevant than species in isolation. The researchers found that simple synthetic co-cultures exhibited different degradation rates and delay times when compared to monocultures. The researchers also noted that the degradation of particulate organic matter in the natural environment is even more complex than what is described in the study.

"Phytoplankton are the main reason the ocean is one of the biggest carbon sinks. These microscopic organisms play a huge role in the world's carbon cycle -- absorbing about as much carbon as all the plants on land combined," said co-author Uria Alcolombri, senior lecturer, Alexander Silberman Institute of Life Sciences, Department of Plant and Environmental Sciences, The Hebrew University of Jerusalem, Israel. "It's fascinating that we can study tiny microbial processes under the microscope while uncovering the biological factors that regulate this massive 'digestive system' of the ocean."

Read more at Science Daily

Ancient DNA from Rapa Nui (Easter Island) refutes best-selling population collapse theory

Rapa Nui or Te Pito o Te Henua (the navel of the world), also known as Easter Island, is one of the most isolated inhabited places in the world. Located in the Pacific, it lies over 1,900 km east of the closest inhabited Polynesian island and 3,700 km west of South America. Although the island, its inhabitants and their rich culture have been extensively studied by archaeologists, anthropologists and geneticists, two key elements of Rapanui history remain very controversial to this day. One of these is the theory of population collapse through "ecocide" in the 1600s, thought to be the result of overpopulation and resource mismanagement. The other major contention is whether the Polynesian ancestors of the Rapanui interacted with Indigenous Americans before contact with Europeans in 1722.

This week's issue of Nature features a genetic study that sheds light on these two debates related to Rapanui history by examining the genomes of 15 Rapanui individuals who lived between 1670 and 1950. The remains of these 15 individuals are currently hosted at the Musée de l'Homme, in Paris. The new study was carried out by an international team of scientists and was spearheaded by Assistant Professor Víctor Moreno-Mayar from the Globe Institute at the University of Copenhagen (Denmark), and PhD student Bárbara Sousa da Mota and Associate Prof. Anna-Sapfo Malaspinas from the Faculty of Biology and Medicine at the University of Lausanne (Switzerland), in close collaboration with colleagues in Rapa Nui as well as in Austria, France, Chile, Australia and U.S.A.

The collapse that never happened

The story of the Rapanui has often been presented as a warning tale against humanity's over-exploitation of resources. After Polynesians from the west peopled the island by 1250, the landscape on Rapa Nui changed drastically. Towering stone statues -- the moai -- were carved and placed in all corners of the island, while its original forest of millions of palm trees dwindled and, by the 1600s, was all but gone. According to the "ecocide" theory, a population of over 15,000 Rapanui individuals triggered these changes that led to a period of resource scarcity, famine, warfare and even cannibalism culminating in a catastrophic population collapse.

"While it is well established that the environment of Rapa Nui was affected by anthropogenic activity, such as deforestation, we did not know if or how these changes led to a population collapse," comments Anna-Sapfo Malaspinas, Assoc. Professor at the University of Lausanne and group leader at the SIB Swiss Institute of Bioinformatics, Switzerland, last author of the study.

The researchers looked into the genomes of the Ancient Rapanui individuals expecting to find a genetic signature of a population collapse such as a sudden drop in genetic diversity. But surprisingly, the data did not contain any evidence of a population collapse in the 1600s.

"Our genetic analysis shows a stably growing population from the 13th century through to European contact in the 18th century. This stability is critical because it directly contradicts the idea of a dramatic pre-contact population collapse," says Bárbara Sousa da Mota, a researcher at the Faculty of Biology and Medicine at University of Lausanne and first author of the study.

Through their genetic analysis, Moreno-Mayar, Sousa da Mota, Malaspinas and their colleagues have not only provided evidence against the collapse theory but also stressed the resilience of the Rapanui population in the face of environmental challenges over several centuries, until the colonial disruptions that European contact brought after 1722.

Did Polynesians reach the Americas?

Another debate that has tantalized researchers for decades is whether Polynesians ever reached the Americas. Although long-distance maritime navigation using wooden watercraft likely halted after the Rapa Nui forest disappeared, archaeological and genetic evidence from contemporary individuals hints that voyages to the Americas did occur. However, previous studies looking at small amounts of DNA from ancient Polynesians had rejected the hypothesis that transpacific voyages took place. Thus, these findings have put into question whether Polynesians reached the Americas and have suggested that the inferred contact based on present-day genetic data was mediated by European colonial activity after 1722.

By generating high-quality ancient genomes from the 15 Rapanui individuals, the team substantially increased the amount of genomic data from the island and found that about ten percent of the Rapanui gene pool has an Indigenous American origin. More importantly, they were able to infer that the two populations met before Europeans arrived on the island or in the Americas.

"We looked into how the Indigenous American DNA was distributed across the Polynesian genetic background of the Rapanui. This distribution is consistent with a contact occurring between the 13th and the 15th centuries, " says first author Víctor Moreno-Mayar, Asst. Professor at the Globe Institute's Section for Geogenetics, University of Copenhagen.

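The intuition behind dating admixture this way is that recombination chops inherited segments into shorter pieces every generation, so the lengths of surviving Indigenous American tracts indicate how long ago the mixing happened. The sketch below is a deliberately crude illustration with hypothetical numbers, not the study's model or its estimates:

```python
# Single-pulse admixture approximation (hypothetical numbers throughout): ancestry
# tract lengths are roughly exponentially distributed with mean ~100 / g centimorgans,
# where g is the number of generations since admixture.
mean_tract_cM = 6.0        # hypothetical mean Indigenous American tract length
generation_years = 28      # assumed years per generation
sample_age_CE = 1750       # hypothetical date the sampled individual lived

g = 100.0 / mean_tract_cM                         # generations back from the individual
contact_CE = sample_age_CE - g * generation_years
print(f"~{g:.0f} generations before the individual, i.e. roughly {contact_CE:.0f} CE")
```

With these made-up inputs the date lands in the 13th century; the published analysis models tract-length distributions across all individuals and is considerably more careful.
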
"While our study cannot tell us where this contact occurred, this might mean that the Rapanui ancestors reached the Americas before Christopher Columbus," says Malaspinas.

Altogether, the results from the new study help settle longstanding debates that have led to years of speculation surrounding Rapanui history.

"Personally, I believe the idea of the ecocide is put together as part of a colonial narrative. That is this idea that these supposedly primitive people could not manage their culture or resources, and that almost destroyed them. But the genetic evidence shows the opposite. Although we have to acknowledge that the arrival of humans dramatically changed the ecosystem, there is no evidence of a population collapse before the Europeans arrived on the island. So we can put those ideas to rest now," says Moreno-Mayar.

"Many thought that present-day Rapanui carry Indigenous American genetic ancestry due to European colonial activity. But instead, the data strongly suggests that Rapanui and Indigenous Americans met and admixed centuries before Europeans made it to Rapa Nui or the Americas. We believe this means that Rapanui were capable of even more formidable voyages across the Pacific than previously established, " adds Sousa da Mota.

Future repatriation efforts

Importantly, the scientists held face-to-face discussions with members of the Rapanui community and the "Comision Asesora de Monumentos Nacionales" in Rapa Nui (CAMN). These discussions helped steer the research and define a set of research questions that were of equally high interest to the scientists and the community. For instance, the team was able to show that the populations closest to the ancient Rapanui are indeed those currently living on the island.

"We have seen that museum archives contain mistakes and mislabels. Now that we have established that these 15 individuals were in fact Rapanui we know that they belong back in the island," says Moana Gorman Edmunds, an archaeologist in Rapa Nui and co-author of the study.

Furthermore, when ongoing results were presented to representatives of the Rapanui community, the need to repatriate their ancestors was discussed as a central goal for immediate future efforts.

Read more at Science Daily

Sep 14, 2024

Path to prosperity for planet and people shrinking rapidly, scientists warn

Our planet will only remain able to provide even the most basic standard of living for everyone in the future if economic systems and technologies are dramatically transformed and critical resources are more fairly used, managed and shared, a new report shows. The report is co-authored by over sixty leading natural and social scientists from the Earth Commission, led by the UvA's Joyeeta Gupta, as well as Prof. Xuemei Bai and Prof. Diana Liverman. The report was published today in The Lancet Planetary Health.

The new research builds on the 'Safe and Just Earth System Boundaries' published in Nature last year, which found that most of the vital limits within which people and the planet can thrive have been surpassed. The new paper identifies the 'Safe and Just Space' -- within which harm to humans and nature can be minimised while ensuring everyone can be provided for -- and sets out the paths to reach and stay in such a space.

Already shrinking

But future projections to 2050 show that this space will shrink over time, driven by inequality, unless urgent transformations take place. The only way to provide for everyone and ensure societies, businesses and economies thrive without destabilising the planet is to reduce inequalities in how critical Earth system resources, such as freshwater and nutrients, are accessed and used -- alongside economic and technological transformation.

The new work found that inequalities and overconsumption of finite resources by a minority are key drivers of this shrinking. Providing minimum resources for those who don't currently have enough would add much less pressure on the Earth system than that currently caused by the minority who use far greater resources.

Joyeeta Gupta: 'We're beginning to realise the damage that inequality is doing to the Earth. Increasing pollution and poor management of natural resources is causing significant harm to people and nature. The longer we continue to widen the gap between those who have too much and those who don't have enough, the more extreme the consequences for all, as the support systems which underpin our way of life, our markets and our economies begin to collapse.'

A life free from poverty

If the Earth System Boundaries represent the "ceiling," above which Earth systems can no longer remain stable and resilient, and significant damage could be caused to people and nature, the Safe and Just Space represents a "foundation," showing us the minimum the global population needs from the Earth system, in order to live a life free from poverty. The space in between is full of opportunities that we can use to ensure a better future for people and planet.

To reach this space, the paper calls for change in three areas. Firstly, a push for changes to how we run the economy, finding new policies and funding mechanisms that can address inequality whilst reducing pressure on nature and climate. Secondly, more efficient and effective management, sharing and usage of resources at every level of society -- including addressing the excess consumption of some communities which is limiting access to basic resources for those who need them the most. Thirdly, investment in sustainable and affordable technologies, which will be essential to help us use fewer resources and to reopen the Safe and Just Space for all -- particularly where there is little or no space left.

Read more at Science Daily

Climate-change-triggered 2023 mega-landslide caused Earth to vibrate for nine days

A landslide in a remote part of Greenland caused a mega-tsunami that sloshed back and forth across a fjord for nine days, generating vibrations throughout Earth, according to a new study involving UCL researchers.

The study, published in the journal Science, concluded that this movement of water was the cause of a mysterious, global seismic signal that lasted for nine days and puzzled seismologists in September 2023.

The initial event, not observed by any human eye, was the collapse of a 1.2 km-high mountain peak into the remote Dickson Fjord beneath, causing a backsplash of water 200 metres in the air, with a wave up to 110 metres high. This wave, extending across 10 km of fjord, reduced to seven metres within a few minutes, the researchers calculated, and would have fallen to a few centimetres in the days after.

The team used a detailed mathematical model, recreating the angle of the landslide and the uniquely narrow and bendy fjord, to demonstrate how the sloshing of water would have continued for nine days, with little energy able to escape.

The model predicted that the mass of water would have moved back and forth every 90 seconds, matching the recordings of vibrations travelling in the Earth's crust all around the globe.
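
That 90-second rhythm is roughly what one would expect for water sloshing across a basin of this size. A quick check uses Merian's formula for the fundamental seiche period of a closed rectangular basin, T = 2L / sqrt(g*h); the fjord dimensions below are assumed, rounded values rather than the study's bathymetry:

```python
import math

# Merian's formula for the fundamental seiche period of a closed basin:
# T = 2 * L / sqrt(g * h). Dimensions are illustrative assumptions.
g = 9.81       # gravitational acceleration, m/s^2
L = 2800.0     # assumed width of the fjord across which the water sloshes, m
h = 500.0      # assumed mean water depth, m

T = 2 * L / math.sqrt(g * h)
print(f"fundamental seiche period ~ {T:.0f} s")
```

With these assumed dimensions the period comes out near 80 seconds, the same order as the observed 90-second oscillation; the study's detailed model accounts for the fjord's actual depth profile and bends.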

The landslide, the researchers wrote, was a result of the glacier at the foot of the mountain thinning, becoming unable to hold up the rock-face above it. This was ultimately due to climate change. The landslide and tsunami were the first observed in eastern Greenland.

Co-author Dr Stephen Hicks, of UCL Earth Sciences, said: "When I first saw the seismic signal, I was completely baffled. Even though we know seismometers can record a variety of sources happening on Earth's surface, never before has such a long-lasting, globally travelling seismic wave, containing only a single frequency of oscillation, been recorded. This inspired me to co-lead a large team of scientists to figure out the puzzle.

"Our study of this event amazingly highlights the intricate interconnections between climate change in the atmosphere, destabilisation of glacier ice in the cryosphere, movements of water bodies in the hydrosphere, and Earth's solid crust in the lithosphere.

"This is the first time that water sloshing has been recorded as vibrations through the Earth's crust, travelling the world over and lasting several days."

The mysterious seismic signal -- coming from a vibration through the Earth's crust -- was detected by seismometers all over the globe, from the Arctic to Antarctica. It looked completely different to frequency-rich 'rumbles' and 'pings' from earthquake recordings, as it contained only a single vibration frequency, like a monotonous-sounding hum.

When the study's authors first discovered the signal, they made a note of it as a "USO": unidentified seismic object.

At the same time, news of a large tsunami in a remote northeast Greenland fjord reached authorities and researchers working in the area.

The researchers joined forces in a unique multidisciplinary group involving 68 scientists from 40 institutions in 15 countries, combining seismometer and infrasound data, field measurements, on-the-ground and satellite imagery, and simulations of tsunami waves.

The team also used imagery captured by the Danish military who sailed into the fjord just days after the event to inspect the collapsed mountain-face and glacier front along with the dramatic scars left by the tsunami.

It was this combination of local field data and remote, global-scale observations that allowed the team to solve the puzzle and reconstruct the extraordinary cascading sequence of events.

Lead author Dr Kristian Svennevig, from the Geological Survey of Denmark and Greenland (GEUS), said: "When we set out on this scientific adventure, everybody was puzzled and no one had the faintest idea what caused this signal. All we knew was that it was somehow associated with the landslide. We only managed to solve this enigma through a huge interdisciplinary and international effort."

He added: "As a landslide scientist, an additional interesting aspect of this study is that this is the first-ever landslide and tsunami observed from eastern Greenland, showing how climate change already has major impacts there."

The team estimated that 25 million cubic metres of rock and ice crashed into the fjord (enough to fill 10,000 Olympic-sized swimming pools).

They confirmed the size of the tsunami, one of the largest seen in recent history, using numerical simulations as well as local data and imagery.

Seventy kilometres away from the landslide, four-metre-high tsunami waves damaged a research base at Ella Ø (island) and destroyed cultural and archaeological heritage sites across the fjord system.

The fjord is on a route commonly used by tourist cruise ships visiting the Greenland fjords. Fortunately, no cruise ships were close to Dickson Fjord on the day of the landslide and tsunami, but if they had been, the consequences of a tsunami wave of that magnitude could have been devastating.

Mathematical models recreating the width and depth of the fjord at very high resolution demonstrated how the distinct rhythm of a mass of water moving back and forth matched the seismic signal.

The study concluded that with rapidly accelerating climate change, it will become more important than ever to characterise and monitor regions previously considered stable and provide early warning of these massive landslide and tsunami events.

Co-author Thomas Forbriger, from Karlsruhe Institute of Technology, said: "We wouldn't have discovered or been able to analyse this amazing event without networks of high-fidelity broadband seismic stations around the world, which are the only sensors that can truly capture such a unique signal."

Read more at Science Daily

How El Nino and mega ocean warming caused the greatest-ever mass extinction

Mega ocean warming El Niño events were key in driving the largest extinction of life on planet Earth some 252 million years ago, according to new research.

The study, published today in Science and co-led by the University of Bristol and China University of Geosciences (Wuhan), has shed new light on why the effects of rapid climate change in the Permian-Triassic warming were so devastating for all forms of life in the sea and on land.

Scientists have long linked this mass extinction to vast volcanic eruptions in what is now Siberia. The resulting carbon dioxide emissions rapidly accelerated climate warming, resulting in widespread stagnation and the collapse of marine and terrestrial ecosystems.

But what caused life on land, including plants and usually resilient insects, to suffer just as badly has remained a source of mystery.

Co-lead author Dr Alexander Farnsworth, Senior Research Associate at the University of Bristol, said: “Climate warming alone cannot drive such devastating extinctions because, as we are seeing today, when the tropics become too hot, species migrate to the cooler, higher latitudes. Our research has revealed that increased greenhouse gases don’t just make the majority of the planet warmer, they also increase weather and climate variability making it even more ‘wild’ and difficult for life to survive.”

The Permian-Triassic catastrophe shows the problem of global warming is not just a matter of it becoming unbearably hot, but also a case of conditions swinging wildly over decades.

“Most life failed to adapt to these conditions, but thankfully a few things survived, without which we wouldn’t be here today. It was nearly, but not quite, the end of the life on Earth,” said co-lead author Professor Yadong Sun at China University of Geosciences, Wuhan.

The scale of Permian-Triassic warming was revealed by studying oxygen isotopes in the fossilised tooth material of tiny extinct swimming organisms called conodonts. By studying the temperature record of conodonts from around the world, the researchers were able to show a remarkable collapse of temperature gradients in the low and mid latitudes.

Dr Farnsworth, who used pioneering climate modelling to evaluate the findings, said: “Essentially, it got too hot everywhere. The changes responsible for the climate patterns identified were profound because there were much more intense and prolonged El Niño events than witnessed today. Species were simply not equipped to adapt or evolve quickly enough.”

In recent years El Niño events have caused major changes in rainfall patterns and temperature. One example is the June 2024 North American heatwave, when temperatures were around 15°C hotter than normal. The 2023-2024 period was also one of the hottest on record globally, due to a strong El Niño in the Pacific that was further exacerbated by human-induced CO2, driving catastrophic drought and fires around the world.

“Fortunately such events so far have only lasted one to two years at a time. During the Permian-Triassic crisis, El Niño persisted for much longer, resulting in a decade of widespread drought, followed by years of flooding. Basically, the climate was all over the place and that makes it very hard for any species to adapt,” said co-author Paul Wignall, Professor of Palaeoenvironments at the University of Leeds.

The results of the climate modelling also help explain the abundant charcoal found in rock layers of that age.

“Wildfires become very common if you have a drought-prone climate. Earth got stuck in a crisis state where the land was burning and the oceans stagnating. There was nowhere to hide,” added co-author Professor David Bond, a palaeontologist at the University of Hull.

The researchers observed that throughout Earth’s history there have been many volcanic events similar to those in Siberia, and many caused extinctions, but none led to a crisis of the scale of the Permian-Triassic event.

They found the Permian-Triassic extinction was so different because these mega-El Niños created a positive feedback on the climate, leading to incredibly warm conditions that began in the tropics and then spread beyond, resulting in the dieback of vegetation. Plants are essential for removing CO2 from the atmosphere, as well as being the foundation of the food web, and if they die, so does one of the Earth's mechanisms for stopping CO2 from building up in the atmosphere as a result of continued volcanism.

This also helps explain the conundrum regarding the Permian-Triassic mass extinction whereby the extinction on land occurred tens of thousands of years before extinction in the oceans.

“Whilst the oceans were initially shielded from the temperature rises, the mega-El Niños caused temperatures on land to exceed most species’ thermal tolerances at rates so rapid that they could not adapt in time,” explained Dr Sun.

“Only species that could migrate quickly could survive, and there weren’t many plants or animals that could do that.”

Mass extinctions, although rare, are the heartbeat of the Earth’s natural system resetting life and evolution along different paths.

Read more at Science Daily

Sep 1, 2024

Bubbling, frothing and sloshing: Long-hypothesized plasma instabilities finally observed

Whether between galaxies or within doughnut-shaped fusion devices known as tokamaks, the electrically charged fourth state of matter known as plasma regularly encounters powerful magnetic fields, changing shape and sloshing in space. Now, a new measurement technique using protons, subatomic particles that form the nuclei of atoms, has captured details of this sloshing for the first time, potentially providing insight into the formation of enormous plasma jets that stretch between the stars.

Scientists at the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL) created detailed pictures of a magnetic field bending outward because of the pressure created by expanding plasma. As the plasma pushed on the magnetic field, bubbling and frothing known as magneto-Rayleigh Taylor instabilities arose at the boundaries, creating structures resembling columns and mushrooms.

Then, as the plasma's energy diminished, the magnetic field lines snapped back into their original positions. As a result, the plasma was compressed into a straight structure resembling the jets of plasma that can stream from ultra-dense dead stars known as black holes and extend for distances many times the size of a galaxy. The results suggest that those jets, whose causes remain a mystery, could be formed by the same compressing magnetic fields observed in this research.

"When we did the experiment and analyzed the data, we discovered we had something big," said Sophia Malko, a PPPL staff research physicist and lead scientist on the paper. "Observing magneto-Rayleigh Taylor instabilities arising from the interaction of plasma and magnetic fields had long been thought to occur but had never been directly observed until now. This observation helps confirm that this instability occurs when expanding plasma meets magnetic fields. We didn't know that our diagnostics would have that kind of precision. Our whole team is thrilled!"

"These experiments show that magnetic fields are very important for the formation of plasma jets," said Will Fox, a PPPL research physicist and principal investigator of the research reported in Physical Review Research. "Now that we might have insight into what generates these jets, we could, in theory, study giant astrophysical jets and learn something about black holes."

PPPL has world-renowned expertise in developing and building diagnostics, sensors that measure properties like density and temperature in plasma in a range of conditions. This achievement is one of several in recent years that illustrates how the Lab is advancing measurement innovation in plasma physics.

Using a new technique to produce unprecedented detail

The team improved a measurement technique known as proton radiography by creating a new variation for this experiment that would allow for extremely precise measurements. To create the plasma, the team shone a powerful laser at a small disk of plastic. To produce protons, they shone 20 lasers at a capsule containing fuel made of varieties of hydrogen and helium atoms. As the fuel heated up, fusion reactions occurred and produced a burst of both protons and intense light known as X-rays.

The team also installed a sheet of mesh with tiny holes near the capsule. As the protons flowed through the mesh, the outpouring was separated into small, separate beams that were bent because of the surrounding magnetic fields. By comparing the distorted mesh image to an undistorted image produced by X-rays, the team could understand how the magnetic fields were pushed around by the expanding plasma, leading to whirl-like instabilities at the edges.
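
The inversion underlying this mesh technique is simple in outline: each proton picks up a small magnetic deflection proportional to the path-integrated field it traverses, so the measured shift of a beamlet at the detector can be converted back into an estimate of that integral. The sketch below uses an assumed geometry and a hypothetical displacement, not the experiment's actual numbers:

```python
import math

# Small-angle proton radiography estimate (illustrative assumptions throughout):
# a proton of speed v crossing a magnetized region is deflected by
# alpha ~ e * integral(B dl) / (m_p * v), so a beamlet displaced by delta_x at a
# detector a distance d away implies integral(B dl) ~ delta_x * m_p * v / (e * d).
e = 1.602e-19            # proton charge, C
m_p = 1.673e-27          # proton mass, kg
E_kin = 14.7e6 * e       # assumed proton energy, J (~14.7 MeV, D-3He fusion protons)
v = math.sqrt(2 * E_kin / m_p)   # non-relativistic speed, ~5e7 m/s

d = 0.10                 # assumed plasma-to-detector distance, m
delta_x = 1.0e-3         # hypothetical measured beamlet displacement, m

B_dl = delta_x * m_p * v / (e * d)   # path-integrated magnetic field, T*m
print(f"path-integrated B ~ {B_dl * 1e3:.1f} mT*m")
```

Repeating this for every beamlet in the mesh, frame by frame, yields the kind of time-resolved field maps described above.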

"Our experiment was unique because we could directly see the magnetic field changing over time," Fox said. "We could directly observe how the field gets pushed out and responds to the plasma in a type of tug of war."

Diversifying a research portfolio

The findings exemplify how PPPL is expanding its focus to include research focused on high energy density (HED) plasma. Such plasmas, like the one created in this experiment's fuel capsule, are hotter and denser than those used in fusion experiments. "HED plasma is an exciting area of growth for plasma physics," Fox said. "This work is part of PPPL's efforts to advance this field. The results show how the Laboratory can create advanced diagnostics to give us new insights into this type of plasma, which can be used in laser fusion devices, as well as in techniques that use HED plasma to create radiation for microelectronics manufacturing."

"PPPL has an enormous amount of knowledge and experience in magnetized plasmas that can contribute to the field of laser-produced HED plasmas and help make significant contributions," Fox said.

"HED science is complex, fascinating and key to understanding a wide range of phenomena," said Laura Berzak Hopkins, PPPL's associate laboratory director for strategy and partnerships and deputy chief research officer. "It's incredibly challenging to both generate these conditions in a controlled manner and develop advanced diagnostics for precision measurements. These exciting results demonstrate the impact of integrating PPPL's breadth of technical expertise with innovative approaches."

More experiments and better simulations

The researchers plan to work on future experiments that will help improve models of expanding plasma. "Scientists have assumed that in these situations, density and magnetism vary directly, but it turns out that that's not true," Malko said.

"Now that we have measured these instabilities very accurately, we have the information we need to improve our models and potentially simulate and understand astrophysical jets to a higher degree than before," Malko said. "It's interesting that humans can make something in a laboratory that usually exists in space."

Read more at Science Daily

Agricultural impact of flooding

I can barely hear Esther Ngumbi over the roar of greenhouse fans as she shows me around her rooftop laboratory in Morrill Hall. The benches are full of tomato plants, and the tomatoes don't look good. Half of the plants are submerged in bins of water. Their leaves are yellow and withering. Some of the dying tomatoes have flowered. I see one or two baby tomatoes on a couple of spindly plants.

This isn't the only torture inflicted on the tomatoes. Someone has tied little baggies to their stems. Inside the bags, fat green caterpillars are chowing down on the tomato leaves.

Entomology professor Ngumbi has questions -- lots of them -- and this is how she's set out to answer some of them. She is purposely flooding the tomatoes to see how they might respond to flooded conditions in farmers' fields -- a scenario that is becoming more common as a result of climate change.

"In nature, there are many stressors on plants during flooding," Ngumbi says. "Once the tomatoes get flooded, they're already weak, so most likely they will be attracting insects, which like to eat weaker plants. We're investigating how the plants deal with the combined stress of flooding and herbivory."

This explains the caterpillars. They are the larval form of Manduca sexta, the tobacco hornworm. They are feasting on one of the two heirloom tomato varieties Ngumbi is using in the experiment: Cherokee purple and striped German.

Half of the tomato plants in the greenhouse are not flooded, allowing the team to compare the stressed plants with those grown in more common conditions. But there are more investigations going on here.

"Also, within this experiment, we're looking at the microbes," Ngumbi says. "We want to understand how the microbial community changes in flooded conditions."

One of Ngumbi's key focuses is how soil microbes influence plant health and productivity. She's fascinated by mycorrhizal fungi, which form intimate associations with plant roots, offering essential elements like nitrogen to the plants in exchange for glucose supplied by the roots.

The tomato plants are all growing in soil from an Illinois farm, but half were also inoculated with mulch from a local farmer who has developed his own recipe for nurturing mycorrhizal fungi in the soil. Ngumbi wants to see if this inoculation makes any difference to the plants' ability to defend themselves from the fat caterpillars.

To measure plant defenses, Ngumbi's team collects samples of gases emitted by the plants and screens them for volatile organic compounds, the chemicals plants use to ward off bugs that would eat them.

Two years later, Ngumbi publishes the results of these and other laboratory experiments. She found that the two tomato varieties differed in gene expression and in the volatile compounds they emitted -- before any intervention. And when flooded, both varieties of tomatoes had very different chemical emission profiles than when grown in normal conditions. Herbivory influenced the production of these volatile compounds, but not as much as flooding did.

Today, the experiments continue, and Ngumbi's interest in the effects of flooding has only intensified. In a new review published in the journal Trends in Plant Research, she spells out the many changes that occur when plants are inundated with water for days or weeks at a time.

"Flooding is different from other climate-related stressors because it deprives plants of oxygen, an essential and indispensable element and substrate for plant growth and development," Ngumbi writes. Flooding disrupts plant metabolism and energy generation. It interferes with photosynthesis. Flooding kills beneficial bacteria and promotes pathogenic microbes in the soil. It also can compromise plants' ability to defend themselves from disease and harmful insects like the tobacco hornworm.

Ngumbi also warns that increased flooding can undermine decades of research aimed at making plants more resilient to climate change. Flooding may thwart efforts to build soil quality and microbial health to make crops more resilient to stressors such as heat and drought. Flooding also may eliminate gains derived from genetic engineering or plant breeding.

With flooding intensity and frequency predicted to increase by roughly 7% for every 1°C increase in global average temperatures, Ngumbi writes, scientists must consider the impacts of floods to "protect the monumental gains made in building climate-resilient crops."
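
To put that scaling in perspective, here is a minimal, purely illustrative calculation (not from Ngumbi's review) showing what "roughly 7% per 1°C" implies at a few warming levels, under the simple assumption that the increase is read either linearly or as compounding with each degree:

```python
# Illustrative only: two simple readings of "roughly 7% more flooding per 1 degree C of warming".
# The 7% figure is quoted from the review; the linear vs. compounded treatment is an assumption here.
per_degree = 0.07

for warming_c in (1, 2, 3):
    linear = per_degree * warming_c                  # additive reading
    compounded = (1 + per_degree) ** warming_c - 1   # multiplicative reading
    print(f"+{warming_c} deg C: linear ~{linear:.0%}, compounded ~{compounded:.0%}")
```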

Read more at Science Daily

Drug may stop migraines before headache starts

When taken at the first signs of a migraine, before headache pain begins, a drug called ubrogepant may be effective in helping people with migraine go about their daily lives with little or no symptoms, according to a new study published in the August 28, 2024, online issue of Neurology®, the medical journal of the American Academy of Neurology. The study focused on people with migraine who could tell when an attack was about to happen, due to early symptoms such as sensitivity to light and sound, fatigue, neck pain or stiffness, or dizziness.

Ubrogepant is a calcitonin gene-related peptide receptor antagonist, or CGRP inhibitor. CGRP is a protein that plays a key role in the migraine process.

"Migraine is one of the most prevalent diseases worldwide, yet so many people who suffer from this condition do not receive treatment or report that they are not satisfied with their treatment," said study author Richard B. Lipton, MD, of Albert Einstein College of Medicine in Bronx, New York, and Fellow of the American Academy of Neurology. "Improving care at the first signs of migraine, even before headache pain begins, can be a key to improved outcomes. Our findings are encouraging, suggesting that ubrogepant may help people with migraine function normally and go about their day."

The study involved 518 participants who had migraine for at least one year and two to eight migraine attacks per month in the three months before the study. All of the participants regularly experienced signs that a migraine would be starting within the next few hours. Participants were asked to treat two attacks during a two-month period.

Researchers divided participants into two groups. The first group received a placebo for their first set of pre-headache symptoms, then took 100 milligrams (mg) of ubrogepant for their second instance of symptoms. The second group took ubrogepant for the first instance and a placebo for the second.

Participants evaluated limitations on their activity in their diary using a five-point scale ranging from zero to four, with 0 meaning "not at all limited -- I could do everything"; 1, "a little limited"; 2, "somewhat limited"; 3, "very limited"; and 4, "extremely limited."

Twenty-four hours after taking the drug or a placebo, 65% of people who took ubrogepant reported themselves as "not at all limited -- I could do everything," or "a little limited," compared to 48% of those who took the placebo.

Researchers found that as early as two hours post-medication, people who took the drug were 73% more likely to report that they had "no disability, able to function normally," than those who took the placebo.
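
As a rough guide to reading such figures, the sketch below converts the 24-hour proportions quoted above into a risk ratio and an odds ratio. This is illustrative arithmetic only, not the study's own statistical model, which analyzed within-person crossover data:

```python
# Illustrative arithmetic using the 24-hour proportions quoted above (65% vs 48%).
# The trial itself used a within-person crossover design and more formal statistics.
p_drug, p_placebo = 0.65, 0.48

risk_ratio = p_drug / p_placebo
odds_ratio = (p_drug / (1 - p_drug)) / (p_placebo / (1 - p_placebo))

print(f"risk ratio: {risk_ratio:.2f} (~{risk_ratio - 1:.0%} more likely to report little or no limitation)")
print(f"odds ratio: {odds_ratio:.2f}")
```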

"Based on our findings, treatment with ubrogepant may allow people with migraine who experience early warning signs before a migraine occurs to quickly treat migraine attacks in their earliest stages and go about their daily lives with little discomfort and disruption," said Lipton. "This could lead to an improved quality of life for those living with migraine."

Lipton noted that, based on their warning symptoms, participants could reliably predict when a migraine headache was about to begin. The findings apply only to people who experience such reliable warning signs.

A limitation of the study was that participants recorded their symptoms and medication use in electronic diaries, so it is possible some people may not have recorded all information accurately.

Read more at Science Daily

Aug 31, 2024

Dancing galaxies make a monster at the cosmic dawn

Astronomers have spotted a pair of galaxies in the act of merging 12.8 billion years ago. The characteristics of these galaxies indicate that the merger will form a monster galaxy, one of the brightest types of objects in the Universe. These results are important for understanding the evolution of galaxies and black holes in the early Universe.

Quasars are bright objects powered by matter falling onto a supermassive black hole at the center of a galaxy. The most widely accepted theory holds that when two gas-rich galaxies merge to form a single larger galaxy, their gravitational interaction drives gas toward the supermassive black hole in one or both of them, triggering quasar activity.

To test this theory, an international team of researchers led by Takuma Izumi used the ALMA (Atacama Large Millimeter/submillimeter Array) radio telescope to study the earliest known pair of close quasars. The pair was discovered by Yoshiki Matsuoka of Ehime University in Japan in images taken by the Subaru Telescope. Located in the direction of the constellation Virgo, the quasars existed during the first 900 million years of the Universe, and their dimness indicates that they are still in the early stages of their evolution.

The ALMA observations mapped the quasars' host galaxies and showed that the galaxies are linked by a "bridge" of gas and dust, indicating that the two are in fact merging.

Read more at Science Daily

How a salt giant radically reshaped Mediterranean marine biodiversity

A new study paves the way to understanding biotic recovery after an ecological crisis in the Mediterranean Sea about 5.5 million years ago. An international team led by Konstantina Agiadi from the University of Vienna has now been able to quantify how marine biota was impacted by the salinization of the Mediterranean: Only 11 percent of the endemic species survived the crisis, and the biodiversity did not recover for at least another 1.7 million years. The study was just published in the journal Science.

Lithospheric movements throughout Earth's history have repeatedly isolated regional seas from the world ocean, leading to massive accumulations of salt. Geologists have found salt giants of thousands of cubic kilometers in Europe, Australia, Siberia, the Middle East, and elsewhere. These salt accumulations are valuable natural resources and have been exploited from antiquity until today in mines around the world (e.g. at the Hallstatt mine in Austria or the Khewra Salt Mine in Pakistan).

The Mediterranean salt giant is a kilometer-thick layer of salt beneath the Mediterranean Sea, first discovered in the early 1970s. It formed about 5.5 million years ago, when the basin was disconnected from the Atlantic during the Messinian Salinity Crisis. In the new Science study, an international team of 29 scientists from 25 institutes across Europe, led by Konstantina Agiadi of the University of Vienna, was able to quantify the loss of biodiversity in the Mediterranean Sea due to the Messinian crisis and the biotic recovery afterwards.

Huge impact on marine biodiversity


After several decades of painstaking research on fossils dated from 12 to 3.6 million years ago, found on land in the peri-Mediterranean countries and in deep-sea sediment cores, the team found that almost 67% of the marine species present in the Mediterranean Sea after the crisis were different from those before it. Only 86 of 779 endemic species (those living exclusively in the Mediterranean before the crisis) survived the enormous change in living conditions after the separation from the Atlantic. The change in the configuration of the gateways, which led to the formation of the salt giant itself, caused abrupt salinity and temperature fluctuations, altered the migration pathways of marine organisms and the flow of larvae and plankton, and disrupted central ecosystem processes. Due to these changes, a large proportion of the Mediterranean inhabitants of the time, such as tropical reef-building corals, died out.

After the reconnection to the Atlantic and the invasion of new species such as the great white shark and oceanic dolphins, Mediterranean marine biodiversity settled into a novel pattern, with the number of species decreasing from west to east, as it does today.
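
The headline survival figure follows directly from those counts; as a quick check, using only the numbers quoted above:

```python
# Quick check of the survival figure quoted above.
endemic_before = 779   # endemic species before the crisis
survivors = 86         # endemic species that survived it

print(f"survival rate: {survivors / endemic_before:.1%}")   # roughly 11%, as reported
```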

Recovery took longer than expected

Because peripheral seas like the Mediterranean are important biodiversity hotspots, the formation of salt giants throughout geologic history very likely had a great impact, but that impact had not been quantified until now. "Our study now provides the first statistical analysis of such a major ecological crisis," explains Konstantina Agiadi from the Department of Geology. It also quantifies for the first time the timescale of recovery after a marine environmental crisis, which is much longer than expected: "The biodiversity in terms of number of species only recovered after more than 1.7 million years," says the geoscientist. The methods used in the study also provide a model connecting plate tectonics, the birth and death of oceans, salt, and marine life that could be applied to other regions of the world.

Read more at Science Daily

This tiny backyard bug does the fastest backflips on earth

Move over, Sonic. There's a new spin-jumping champion in town -- the globular springtail (Dicyrtomina minuta). This diminutive hexapod backflips into the air, flipping to heights of more than 60 times its body length in the blink of an eye, and a new study offers the first in-depth look at its jumping prowess.

Globular springtails are tiny, usually only a couple millimeters in body length. They don't fly, bite or sting. But they can jump. In fact, jumping is their go-to (and only) plan for avoiding predators. And they excel at it -- to the naked eye it seems as though they vanish entirely when they take off.

"When globular springtails jump, they don't just leap up and down, they flip through the air -- it's the closest you can get to a Sonic the Hedgehog jump in real life," says Adrian Smith, research assistant professor of biology at North Carolina State University and head of the evolutionary biology and behavior research lab at the North Carolina Museum of Natural Sciences. "So naturally I wanted to see how they do it."

Finding the globular springtails was easy enough -- they're all around us. The ones in this study are usually out from December through March. Smith "recruited" his research subjects by sifting through leaf litter from his own backyard. But the next part proved to be the most challenging.

"Globular springtails jump so fast that you can't see it in real time," Smith says. "If you try to film the jump with a regular camera, the springtail will appear in one frame, then vanish. When you look at the picture closely, you can see faint vapor trail curlicues left behind where it flipped through the one frame."

Smith solved that problem by using cameras that shoot 40,000 frames per second. He coaxed the springtails into jumping by shining a light on them or lightly prodding them with an artist's paintbrush. Then he looked at how they took off, how fast and far they went, and how they landed.

Globular springtails don't use their legs to jump. Instead, they have an appendage called a furca that folds up underneath their abdomen and has a tiny, forked structure at its tip. When the springtails jump, the furca flips down and the forked tip pushes against the ground, launching them into a series of insanely fast backflips.

What do we mean by insanely fast?

"It only takes a globular springtail one thousandth of a second to backflip off the ground and they can reach a peak rate of 368 rotations per second," Smith says. "They accelerate their bodies into a jump at about the same rate as a flea, but on top of that they spin. No other animal on earth does a backflip faster than a globular springtail."

The springtails were also able to launch themselves over 60 millimeters into the air -- more than 60 times their own height. And in most cases, they went backward.
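
For a sense of scale, the back-of-the-envelope sketch below combines the figures quoted above (the roughly 1-millisecond takeoff, the 368 rotations per second, and the 60-millimeter jump height), with an assumed body length of about 1 mm:

```python
# Back-of-the-envelope numbers from the figures quoted above; the ~1 mm body length is an assumption.
rotations_per_s = 368      # peak spin rate
takeoff_time_s = 1e-3      # roughly one millisecond to leave the ground
jump_height_mm = 60        # peak jump height
body_length_mm = 1.0       # assumed typical body length

print(f"peak spin rate: {rotations_per_s * 360:,.0f} degrees per second")
print(f"rotation completed during takeoff alone: ~{rotations_per_s * takeoff_time_s:.2f} of a turn")
print(f"jump height relative to body length: ~{jump_height_mm / body_length_mm:.0f}x")
```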

"They can lean into a jump and go slightly sideways, but when launching from a flat surface, they mostly travel up and backward, never forward," says Jacob Harrison, a postdoctoral researcher at the Georgia Institute of Technology and paper co-author. "Their inability to jump forward was an indication to us that jumping is primarily a means to escape danger, rather than a form of general locomotion."

Landings came in two styles: uncontrolled and anchored. Globular springtails do have a sticky forked tube they can evert -- or push out of their bodies -- to grapple a surface and halt their momentum, but Smith observed that bouncing and tumbling to a stop was just as common as an anchored landing.

Read more at Science Daily

Aug 30, 2024

Highest-resolution observations yet from the surface of Earth

The Event Horizon Telescope (EHT) Collaboration has conducted test observations, using the Atacama Large Millimeter/submillimeter Array (ALMA) and other facilities, that achieved the highest resolution ever obtained from the surface of Earth. They managed this feat by detecting light from distant galaxies at a frequency of around 345 GHz, equivalent to a wavelength of 0.87 mm. The Collaboration estimates that in future they will be able to make black hole images that are 50% more detailed than was possible before, bringing the region immediately outside the boundary of nearby supermassive black holes into sharper focus. They will also be able to image more black holes than they have done so far. The new detections, part of a pilot experiment, were published today in The Astronomical Journal.

The EHT Collaboration released images of M87*, the supermassive black hole at the centre of the M87 galaxy, in 2019, and of Sgr A*, the black hole at the heart of our Milky Way galaxy, in 2022. These images were obtained by linking together multiple radio observatories across the planet, using a technique called very long baseline interferometry (VLBI), to form a single 'Earth-sized' virtual telescope.

To get higher-resolution images, astronomers typically rely on bigger telescopes -- or a larger separation between observatories working as part of an interferometer. But since the EHT was already the size of Earth, increasing the resolution of their ground-based observations called for a different approach. Another way to increase the resolution of a telescope is to observe light of a shorter wavelength -- and that's what the EHT Collaboration has now done.

"With the EHT, we saw the first images of black holes using the 1.3-mm wavelength observations, but the bright ring we saw, formed by light bending in the black hole's gravity, still looked blurry because we were at the absolute limits of how sharp we could make the images," said the study's co-lead Alexander Raymond, previously a postdoctoral scholar at the Center for Astrophysics | Harvard & Smithsonian (CfA), and now at the Jet Propulsion Laboratory, both in the United States. "At 0.87 mm, our images will be sharper and more detailed, which in turn will likely reveal new properties, both those that were previously predicted and maybe some that weren't."

To show that they could make detections at 0.87 mm, the Collaboration conducted test observations of distant, bright galaxies at this wavelength. Rather than using the full EHT array, they employed two smaller subarrays, both of which included ALMA and the Atacama Pathfinder EXperiment (APEX) in the Atacama Desert in Chile. The European Southern Observatory (ESO) is a partner in ALMA and co-hosts and co-operates APEX. Other facilities used include the IRAM 30-meter telescope in Spain and the NOrthern Extended Millimeter Array (NOEMA) in France, as well as the Greenland Telescope and the Submillimeter Array in Hawai'i.

In this pilot experiment, the Collaboration achieved observations with detail as fine as 19 microarcseconds, meaning they observed at the highest-ever resolution from the surface of Earth. They have not been able to obtain images yet, though: while they made robust detections of light from several distant galaxies, not enough antennas were used to be able to accurately reconstruct an image from the data.

This technical test has opened up a new window to study black holes. With the full array, the EHT could see details as small as 13 microarcseconds, equivalent to seeing a bottle cap on the Moon from Earth. This means that, at 0.87 mm, they will be able to get images with a resolution about 50% higher than that of the previously released 1.3-mm images of M87* and Sgr A*. In addition, there's potential to observe more distant, smaller and fainter black holes than the two the Collaboration has imaged thus far.
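
Those numbers follow from the standard diffraction limit for an interferometer, where the finest resolvable angle scales as the observing wavelength divided by the longest baseline. The sketch below applies that relation (theta roughly equal to lambda over D) using an Earth-diameter baseline, which is an idealization of the real array geometry:

```python
import math

# Diffraction-limited resolution theta ~ lambda / D, expressed in microarcseconds.
# Using Earth's diameter as the baseline is an idealization of the actual EHT geometry.
ARCSEC_PER_RAD = 180 / math.pi * 3600      # ~206,265 arcseconds per radian
EARTH_DIAMETER_M = 1.2742e7                # ~12,742 km, the longest possible ground baseline

def resolution_microarcsec(wavelength_m, baseline_m=EARTH_DIAMETER_M):
    return wavelength_m / baseline_m * ARCSEC_PER_RAD * 1e6

for wavelength_mm in (1.3, 0.87):
    theta = resolution_microarcsec(wavelength_mm * 1e-3)
    print(f"{wavelength_mm} mm -> ~{theta:.0f} microarcseconds")

# Moving from 1.3 mm to 0.87 mm sharpens the resolution by a factor of 1.3/0.87, about 1.5,
# consistent with the roughly 50% improvement described above.
```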

EHT Founding Director Sheperd "Shep" Doeleman, an astrophysicist at the CfA and study co-lead, says: "Looking at changes in the surrounding gas at different wavelengths will help us solve the mystery of how black holes attract and accrete matter, and how they can launch powerful jets that stream over galactic distances."

This is the first time that the VLBI technique has been successfully used at the 0.87 mm wavelength. While the ability to observe the night sky at 0.87 mm existed before the new detections, using the VLBI technique at this wavelength has always presented challenges that took time and technological advances to overcome. For example, water vapour in the atmosphere absorbs waves at 0.87 mm much more than it does at 1.3 mm, making it more difficult for radio telescopes to receive signals from black holes at the shorter wavelength. Combined with increasingly pronounced atmospheric turbulence and noise buildup at shorter wavelengths, and an inability to control global weather conditions during atmospherically sensitive observations, progress to shorter wavelengths for VLBI -- especially those that cross the barrier into the submillimetre regime -- has been slow. But with these new detections, that's all changed.

Read more at Science Daily

Number of fish species at risk of extinction fivefold higher than previous estimates, according to a new prediction

Researchers predict that 12.7% of marine teleost fish species are at risk of extinction, up fivefold from the International Union for Conservation of Nature's prior estimate of 2.5%. Nicolas Loiseau and Nicolas Mouquet from the MARBEC Unit (the Marine Biodiversity, Exploitation and Conservation Unit) in Montpellier, France, and colleagues report these findings in a study published August 29th in the open-access journal PLOS Biology. Their report includes nearly 5,000 species that did not receive an IUCN conservation status due to insufficient data.

The IUCN's Red List of Threatened Species tracks more than 150,000 species to guide global conservation efforts on behalf of the most threatened. However, 38% of marine fish species (4,992 species at the time of this research) are considered Data-Deficient and do not receive an official conservation status or the associated protections.

To better direct conservation efforts toward the species that need them, Loiseau and colleagues combined a machine learning model with an artificial neural network to predict the extinction risks of Data-Deficient species. The models were trained on occurrence data, biological traits, taxonomy and human uses from 13,195 species.
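
As a rough illustration of that approach, the sketch below trains a simple classifier on species that already have an IUCN status and then predicts a Threatened or Non-Threatened label for Data-Deficient species. It is a minimal stand-in (a random forest on a hypothetical trait table), not the authors' actual model pipeline, and the file and column names are invented for the example:

```python
# Minimal, illustrative sketch of status prediction for Data-Deficient species.
# This is NOT the study's pipeline; the CSV file and column names are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

species = pd.read_csv("marine_teleost_traits.csv")   # hypothetical table of traits and occurrences
features = ["range_size_km2", "max_length_cm", "growth_rate", "median_depth_m", "fisheries_use"]

threatened = {"Critically Endangered", "Endangered", "Vulnerable"}
labelled = species[species["iucn_status"] != "Data Deficient"]
unlabelled = species[species["iucn_status"] == "Data Deficient"]

X, y = labelled[features], labelled["iucn_status"].isin(threatened)
model = RandomForestClassifier(n_estimators=500, random_state=0)
print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())

model.fit(X, y)
predictions = unlabelled.assign(predicted_threatened=model.predict(unlabelled[features]))
print(predictions["predicted_threatened"].value_counts())
```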

The models assigned 78.5% of the 4,992 Data-Deficient species to either the Non-Threatened or the Threatened class (the latter including the Critically Endangered, Endangered and Vulnerable IUCN categories). Predicted Threatened species increased fivefold (from 334 to 1,671), and predicted Non-Threatened species increased by a third (from 7,869 to 10,451).

Predicted Threatened species tended to have a small geographic range, large body size and low growth rate, and extinction risk was also correlated with shallow habitats. The South China Sea, the Philippine and Celebes Seas and the west coasts of Australia and North America emerged as hotspots for predicted Threatened species, and the researchers recommend increased research and conservation efforts in these areas.

The researchers observed "a marked change in conservation priority ranking after species IUCN predictions," recommending that the Pacific Islands and the Southern Hemisphere's polar and subpolar regions be prioritized to account for emerging at-risk species. Many species that remained Data-Deficient occur in the Coral Triangle, indicating that additional research is needed there. The researchers note that models cannot replace direct evaluations of at-risk species, but AI offers a unique opportunity to provide rapid, extensive and cost-effective evaluations of species' extinction risk.

Read more at Science Daily

Researchers map 50,000 of DNA's mysterious 'knots' in the human genome

Innovative study of DNA's hidden structures may open up new approaches for treatment and diagnosis of diseases, including cancer.

DNA is well-known for its double helix shape. But the human genome also contains more than 50,000 unusual 'knot'-like DNA structures called i-motifs, researchers at the Garvan Institute of Medical Research have discovered.

Published today in The EMBO Journal, the research provides the first comprehensive map of these unique DNA structures, shedding light on their potential roles in gene regulation and disease.

In a landmark 2018 study, Garvan scientists were the first to directly visualise i-motifs inside living human cells using a new antibody tool they developed to recognise and attach to i-motifs. The current research builds on those findings by deploying this antibody to identify i-motif locations across the entire genome.

"In this study, we mapped more than 50,000 i-motif sites in the human genome that occur in all three of the cell types we examined," says senior author Professor Daniel Christ, Head of the Antibody Therapeutics Lab and Director of the Centre for Targeted Therapy at Garvan. "That's a remarkably high number for a DNA structure whose existence in cells was once considered controversial. Our findings confirm that i-motifs are not just laboratory curiosities but widespread -- and likely to play key roles in genomic function."

Curious DNA i-motifs could play a dynamic role in gene activity

I-motifs are DNA structures that differ from the iconic double helix shape. They form when stretches of cytosine letters on the same DNA strand pair with each other, creating a four-stranded, twisted structure protruding from the double helix.

The researchers found that i-motifs are not randomly scattered but concentrated in key functional areas of the genome, including regions that control gene activity.

"We discovered that i-motifs are associated with genes that are highly active during specific times in the cell cycle. This suggests they play a dynamic role in regulating gene activity," says Cristian David Peña Martinez, a research officer in the Antibody Therapeutics Lab and first author of the study.

"We also found that i-motifs form in the promoter region of oncogenes, for instance the MYC oncogene, which encodes one of cancer's most notorious 'undruggable' targets. This presents an exciting opportunity to target disease-linked genes through the i-motif structure," he says.

I-motifs hold promise for new types of therapies and diagnostics


"The widespread presence of i-motifs near these 'holy grail' sequences involved in hard-to-treat cancers opens up new possibilities for new diagnostic and therapeutic approaches. It might be possible to design drugs that target i-motifs to influence gene expression, which could expand current treatment options," says Associate Professor Sarah Kummerfeld, Chief Scientific Officer at Garvan and co-author of the study.

Professor Christ adds that mapping i-motifs was only possible thanks to Garvan's world-leading expertise in antibody development and genomics. "This study is an example of how fundamental research and technological innovation can come together to make paradigm-shifting discoveries," he says.

Read more at Science Daily

Gene therapy gets a turbo boost

For decades, scientists have dreamt of a future where genetic diseases, such as the blood clotting disorder hemophilia, could be a thing of the past. Gene therapy, the idea of fixing faulty genes with healthy ones, has held immense promise. But a major hurdle has been finding a safe and efficient way to deliver those genes.

Now, researchers at the University of Hawai'i's John A. Burns School of Medicine (JABSOM) have made a significant breakthrough in gene editing technology that could revolutionize how we treat genetic diseases. Their new method offers a faster, safer, and more efficient way to deliver healthy genes into the body, potentially leading to treatments for hundreds of conditions. This research was recently published in Nucleic Acids Research.

Here's how it works.

Current methods can fix errors in genes, but they can also cause unintended damage by creating breaks in the DNA. Additionally, they struggle to insert large chunks of genetic material such as whole genes.

The new technique, developed by Dr. Jesse Owens and his team members Dr. Brian Hew, Dr. Ryuei Sato and Sabranth Gupta of JABSOM's Institute for Biogenesis Research and Cell and Molecular Biology Department, addresses these limitations. They used laboratory evolution to generate a new super-active integrase capable of inserting therapeutic genes into the genome at record-breaking efficiencies.

"It's like having a "paste" function for the human genome," said Dr. Owens. "It uses specially engineered 'integrases' to carefully insert healthy genes into the exact location needed, without causing breaks in the DNA. This method is much more efficient, with success rates of up to 96% in some cases."

"This could lead to faster and more affordable treatments for a wide range of diseases, potentially impacting hundreds of conditions with a single faulty gene," said Dr. Owens.

Faster Development of Treatments and a Broader Range of Applications

The implications of this research extend beyond gene therapy. The ability to efficiently insert large pieces of DNA has applications in other areas of medicine.

When making cell lines to produce therapeutic proteins, the gene encoding the protein is usually randomly inserted into the genome, and it rarely lands in a location in the genome that is good for production. This is like searching for a needle in a haystack. Additionally, finding a cell with the gene inserted correctly and producing the desired protein can take many months.

Instead of searching for a needle in a haystack, Dr. Owens' technique makes a stack of needles. It delivers the gene directly to the desired location, significantly speeding up the development process.

"JABSOM takes pride in nurturing talented researchers like Jesse Owens, whose work has the power to create a global impact," said Sam Shomaker, dean of the University of Hawai'i John A. Burns School of Medicine. "This research, conducted in our lab in the middle of the Pacific, has the potential to significantly improve the way we treat genetic diseases."

Dr. Owens' team is exploring how this technique could accelerate the development and manufacture of biologics and advanced therapies such as antibodies. Currently, finding the right cell line for efficient production can be a time-consuming process. However, Dr. Owens' new genome engineering tool can reduce the cell line development timeline and accelerate the manufacture of life-saving therapeutics.

Read more at Science Daily

Aug 29, 2024

Dark matter could have helped make supermassive black holes in the early universe

Supermassive black holes typically take billions of years to form. But the James Webb Space Telescope is finding them not that long after the Big Bang -- before they should have had time to form.

It takes a long time for supermassive black holes, like the one at the center of our Milky Way galaxy, to form. Typically, the birth of a black hole requires a giant star with the mass of at least 50 of our suns to burn out -- a process that can take a billion years -- and its core to collapse in on itself.

Even so, at only about 10 solar masses, the resulting black hole is a far cry from the 4-million-solar-mass black hole Sagittarius A* found in our Milky Way galaxy, or the billion-solar-mass supermassive black holes found in other galaxies. Such gigantic black holes can form from smaller black holes by accretion of gas and stars, and by mergers with other black holes, which take billions of years.

Why, then, is the James Webb Space Telescope discovering supermassive black holes near the beginning of time itself, eons before they should have been able to form? UCLA astrophysicists have an answer as mysterious as the black holes themselves: Dark matter kept hydrogen from cooling long enough for gravity to condense it into clouds big and dense enough to turn into black holes instead of stars. The finding is published in the journal Physical Review Letters.

"How surprising it has been to find a supermassive black hole with a billion solar mass when the universe itself is only half a billion years old," said senior author Alexander Kusenko, a professor of physics and astronomy at UCLA. "It's like finding a modern car among dinosaur bones and wondering who built that car in the prehistoric times."

Some astrophysicists have posited that a large cloud of gas could collapse to make a supermassive black hole directly, bypassing the long history of stellar burning, accretion and mergers. But there's a catch: Gravity will, indeed, pull a large cloud of gas together, but not into a single large object. Instead, it gathers sections of the gas into little halos that float near each other but don't form a black hole.

The reason is that the gas cloud cools too quickly. As long as the gas is hot, its pressure can counter gravity. However, if the gas cools, pressure decreases, and gravity can prevail in many small regions, which collapse into dense objects before gravity has a chance to pull the entire cloud into a single black hole.

"How quickly the gas cools has a lot to do with the amount of molecular hydrogen," said first author and doctoral student Yifan Lu. "Hydrogen atoms bonded together in a molecule dissipate energy when they encounter a loose hydrogen atom. The hydrogen molecules become cooling agents as they absorb thermal energy and radiate it away. Hydrogen clouds in the early universe had too much molecular hydrogen, and the gas cooled quickly and formed small halos instead of large clouds."

Lu and postdoctoral researcher Zachary Picker wrote code to calculate all possible processes of this scenario and discovered that additional radiation can heat the gas and dissociate the hydrogen molecules, altering how the gas cools.

"If you add radiation in a certain energy range, it destroys molecular hydrogen and creates conditions that prevent fragmentation of large clouds," Lu said.

But where does the radiation come from?

Only a very tiny portion of matter in the universe is the kind that makes up our bodies, our planet, the stars and everything else we can observe. The vast majority of matter, detected by its gravitational effects on stellar objects and by the bending of light rays from distant sources, is made of some new particles, which scientists have not yet identified.

The forms and properties of dark matter are therefore a mystery that remains to be solved. While we don't know what dark matter is, particle theorists have long speculated that it could contain unstable particles which can decay into photons, the particles of light. Including such dark matter in the simulations provided the radiation needed for the gas to remain in a large cloud while it is collapsing into a black hole.

Dark matter could be made of particles that slowly decay, or it could be made of more than one particle species: some stable and some that decay at early times. In either case, the product of decay could be radiation in the form of photons, which break up molecular hydrogen and prevent hydrogen clouds from cooling too quickly. Even very mild decay of dark matter yielded enough radiation to prevent cooling, forming large clouds and, eventually, supermassive black holes.

Read more at Science Daily

Engineers develop all-in-one solution to catch and destroy 'forever chemicals'

Chemical engineers at the University of British Columbia have developed a new treatment that traps and treats PFAS substances -- widely known as "forever chemicals" -- in a single, integrated system.

Per- and polyfluoroalkyl substances (PFAS) are widely used in manufacturing consumer goods like waterproof clothing due to their resistance to heat, water and stains. However, they are also pollutants, often ending up in surface and groundwater worldwide, where they have been linked to cancer, liver damage and other health issues.

"PFAS are notoriously difficult to break down, whether they're in the environment or in the human body," explained lead researcher Dr. Johan Foster, an associate professor of chemical and biological engineering in the faculty of applied science. "Our system will make it possible to remove and destroy these substances in the water supply before they can harm our health."

Catch and destroy

The UBC system combines an activated carbon filter with a special, patented catalyst that traps harmful chemicals and breaks them down into harmless components on the filter material. Scientists refer to this trapping of chemical components as adsorption.

"The whole process is fairly quick, depending on how much water you're treating," said Dr. Foster. "We can put huge volumes of water through this catalyst, and it will adsorb the PFAS and destroy it in a quick two-step process. Many existing solutions can only adsorb while others are designed to destroy the chemicals. Our catalyst system can do both, making it a long-term solution to the PFAS problem instead of just kicking the can down the road."

No light? No problem


Like other water treatments, the UBC system requires ultraviolet light to work, but it does not need as much UV light as other methods.

During testing, the UBC catalyst consistently removed more than 85 per cent of PFOA (perfluorooctanoic acid, a type of forever chemical) even under low light conditions.

"Our catalyst is not limited by ideal conditions. Its effectiveness under varying UV light intensities ensures its applicability in diverse settings, including regions with limited sunlight exposure," said Dr. Raphaell Moreira, a professor at Universität Bremen who conducted the research while working at UBC.

For example, a northern municipality that gets little sun could still benefit from this type of PFAS solution.

"While the initial experiments focused on PFAS compounds, the catalyst's versatility suggests its potential for removing other types of persistent contaminants, offering a promising solution to the pressing issues of water pollution," explained Dr. Moreira.

From municipal water to industry cleanups

The team believes the catalyst could be a low-cost, effective solution for municipal water systems as well as specialized industrial projects like waste stream cleanup.

They have set up a company, ReAct Materials, to explore commercial options for their technology.

"Our catalyst can eliminate up to 90 per cent of forever chemicals in water in as little as three hours -- significantly faster than comparable solutions on the market. And because it can be produced from forest or farm waste, it's more economical and sustainable compared to the more complex and costly methods currently in use," said Dr. Foster.

Read more at Science Daily

Neuroscientists explore the intersection of music and memory

The soundtrack of this story begins with a vaguely recognizable and pleasant groove. But if I stop writing and just listen for a second, the music reveals itself completely. In Freddie Hubbard's comfortable, lilting trumpet solo over Herbie Hancock's melodic, repetitive piano vamping, I recognize "Cantaloupe Island." Then, with my fingers again poised at the keyboard, Freddie and Herbie fade into the background, followed by other instrumental music: captivating -- but not distracting -- sonic nutrition, feeding my concentration and productivity.

Somewhere, I think, Yiren Ren is studying, focused on her research that demonstrates how music impacts learning and memory. Possibly, she's listening to Norah Jones, or another musician she's comfortable with. Because that's how it works: The music we know and might love, music that feels predictable or even safe -- that music can help us study and learn. Meanwhile, Ren has also discovered, other kinds of music can influence our emotions and reshape old memories.

Ren, a sixth-year Ph.D. student in Georgia Tech's School of Psychology, explores these concepts as the lead author of two new research papers in the journals PLOS One and Cognitive, Affective, & Behavioral Neuroscience (CABN).

"These studies are connected because they both explore innovative applications of music in memory modulation, offering insights for both every day and clinical use," says Ren.

But the collective research explores music's impacts in very different ways, explains Ren's faculty advisor and co-author of the study, Thackery Brown.

"One paper looks at how music changes the quality of your memory when you're first forming it -- it's about learning," says Brown, a cognitive neuroscientist who runs the MAP (Memory, Affect, and Planning) Lab at Tech. "But the other study focuses on memories we already have and asks if we can change the emotions attached to them using music."

Making Moods With Music


When we watch a movie with a robust score -- music created to induce emotions -- what we're hearing guides us exactly where the composer wants us to go. In their CABN study, Ren, Brown, and their collaborators from the University of Colorado (including former Georgia Tech Assistant Professor Grace Leslie) report that this kind of "mood music" can also be powerful enough to change how we remember our past.

Their study included 44 Georgia Tech students who listened to film soundtracks while recalling a difficult memory. Ren is quick to point out that this was not a clinical trial, so these participants were not identified as people suffering from mood disorders: "We wanted to start off with a random group of people and see if music has the power to modulate the emotional level of their memories."

Turns out, it does. The participants listened to movie soundtracks and incorporated new emotions into their memories that matched the mood of the music. And the effect was lasting. A day later, when the participants recalled these same memories -- but without musical accompaniment -- their emotional tone still matched the tone of the music played the day before.

The researchers could watch all this happening with fMRI (functional magnetic resonance imaging). They could see the altered brain activity in the study participants, the increased connectivity between the amygdala, where emotions are processed, and other areas of the brain associated with memory and integrating information.

"This sheds light on the malleability of memory in response to music, and the powerful role music can play in altering our existing memories," says Ren.

Ren is herself a multi-instrumentalist who originally planned on being a professional musician. As an undergraduate at Boston University, she pursued a dual major in film production and sound design, and psychology.

She found a way to combine her interests in music and neuroscience and is interested in how music therapy can be designed to help people with mood disorders like post-traumatic stress disorder (PTSD) or depression, "particularly in cases where someone might overexaggerate the negative components of a memory," Ren says.

There is no time machine that will allow us to go back and insert happy music into the mix while a bad event is happening and a memory is being formed, "but we can retrieve old memories while listening to affective music," says Brown. "And perhaps we can help people shift their feelings and reshape the emotional tone attached to certain memories."

Embracing the Familiar


The second study asks a couple of old questions: Should we listen to music while we work or study? And if so, are there more beneficial types of music than others? The answer to both questions might lie, at least partially, within the expansive parameters of personal taste. But even so, there are limits.

Think back to my description of "Cantaloupe Island" at the beginning of this story and how a familiar old jazz standard helped keep this writer's brain and fingers moving. In the same way, Norah Jones helps Ren when she's working on new research around music and memory. But if, for some reason, I wanted to test my concentration, I'd play a different kind of jazz, maybe 1950s bebop with its frenetic pace and off-center tone, or possibly a chorus of screeching cats. Same effect. It would demand my attention, and no work would get done.

For this study, Ren combined her gifts as a musician and composer with her research interests in examining whether music can improve -- or impair -- our ability to learn or remember new information. "We wanted to probe music's potential as a mnemonic device that helps us remember information more easily," she says. (An example of a mnemonic device is "Every Good Boy Does Fine," which stands for E-G-B-D-F and helps new piano players learn the order of notes on a keyboard.)

This study's 48 participants were asked to learn sequences of abstract shapes while listening to different types of music. Ren played a piece of music with a traditional, familiar pattern of tone, rhythm, and melody. She then played the exact same set of notes, but out of order, giving the piece an atonal structure.

When they listened to familiar, predictable music, participants learned and remembered the sequences of shapes quicker as their brains created a structured framework, or scaffold, for the new information. Meanwhile, music that was familiar but irregular (think of this writer and the bebop example) made it harder for participants to learn.

"Depending its familiarity and structure, music can help or hinder our memory," says Ren, who wants to deepen her focus on the neural mechanisms through which music influences human behavior.

She plans to finish her Ph.D. studies this December and is seeking postdoctoral research positions that will allow her to continue the work she's started at Georgia Tech. Building on that, Ren wants to develop music-based therapies for conditions like depression or PTSD, while also exploring new rehabilitation strategies for aging populations and individuals with dementia.

Read more at Science Daily