Nov 25, 2023

NASA's Webb reveals new features in heart of Milky Way

The latest image from NASA's James Webb Space Telescope shows a portion of the dense center of our galaxy in unprecedented detail, including never-before-seen features astronomers have yet to explain. The star-forming region, named Sagittarius C (Sgr C), is about 300 light-years from the Milky Way's central supermassive black hole, Sagittarius A*.

"There's never been any infrared data on this region with the level of resolution and sensitivity we get with Webb, so we are seeing lots of features here for the first time," said the observation team's principal investigator Samuel Crowe, an undergraduate student at the University of Virginia in Charlottesville.

"Webb reveals an incredible amount of detail, allowing us to study star formation in this sort of environment in a way that wasn't possible previously."

"The galactic center is the most extreme environment in our Milky Way galaxy, where current theories of star formation can be put to their most rigorous test," added professor Jonathan Tan, one of Crowe's advisors at the University of Virginia.

Protostars

Amid the estimated 500,000 stars in the image is a cluster of protostars -- stars that are still forming and gaining mass -- producing outflows that glow like a bonfire in the midst of an infrared-dark cloud.

At the heart of this young cluster is a previously known, massive protostar over 30 times the mass of our Sun.

The cloud the protostars are emerging from is so dense that the light from stars behind it cannot reach Webb, making it appear less crowded when in fact it is one of the most densely packed areas of the image.

Smaller infrared-dark clouds dot the image, looking like holes in the starfield.

That's where future stars are forming.

Webb's NIRCam (Near-Infrared Camera) instrument also captured large-scale emission from ionized hydrogen surrounding the lower side of the dark cloud, shown in cyan in the image.

Typically, Crowe says, this is the result of energetic photons being emitted by young massive stars, but the vast extent of the region shown by Webb is something of a surprise that bears further investigation.

Another feature of the region that Crowe plans to examine further is the needle-like structures in the ionized hydrogen, which appear oriented chaotically in many directions.

"The galactic center is a crowded, tumultuous place. There are turbulent, magnetized gas clouds that are forming stars, which then impact the surrounding gas with their outflowing winds, jets, and radiation," said Rubén Fedriani, a co-investigator of the project at the Instituto Astrofísica de Andalucía in Spain.

"Webb has provided us with a ton of data on this extreme environment, and we are just starting to dig into it."

Around 25,000 light-years from Earth, the galactic center is close enough to study individual stars with the Webb telescope, allowing astronomers to gather unprecedented information on how stars form, and how this process may depend on the cosmic environment, especially compared to other regions of the galaxy.

For example, are more massive stars formed in the center of the Milky Way, as opposed to the edges of its spiral arms?

Read more at Science Daily

First comprehensive look at effects of 2020-2021 California megafires on terrestrial wildlife habitat

The only thing constant is change -- isn't that how the saying goes? We know that wildlife in western forests evolved with changing habitat and disturbances like wildfire. Each species responds differently, some benefiting from openings, others losing critical habitat. What we don't know is how increasing fire severity at large scales is impacting their habitat and survival, because many species are not adapted to these types of "megafires." Researchers at the Rocky Mountain Research Station set about finding some answers. They summarize their findings in "The 2020-2021 California megafires and their impacts to wildlife habitat," a paper published today in the Proceedings of the National Academy of Sciences.

Why California and why this time period? In 2020 and 2021, California experienced fire activity unlike anything in the modern record.

When the smoke cleared, the total area of burned forest was ten times the annual average going back to the late 1800s.

Nearly half of the forests that burned experienced high-severity fire, killing 75-100% of the vegetation, and much of this fire covered large continuous areas, rather than a patchy mosaic.

California's Department of Fish and Wildlife curates a comprehensive wildlife database, mapping habitat suitability of hundreds of species across the state.

Coupling that with Forest Service records of wildfires and some fancy computer footwork gave researchers an opportunity to take a broad look at how these types of "megafires" are shaping wildlife habitat within the state.

Jessalyn Ayars, the lead author, said, "Our intent was to take a broad look to gain a better understanding of the impacts of these kinds of fires on wildlife habitat as a whole. And since each species is different, this study provides a good jumping-off point for others to focus on a single species of interest or a small group of species that share similar habitats."

The fires and habitat studied were mostly located in the Sierra Nevada, southern Cascades, and Klamath mountain regions of California.

Researchers looked at more than 600 wildlife species and found that for 50 species, fires spanned 15-30% of habitat within their range in the state.

One hundred species experienced high-severity fire over more than 10% of their geographic range within California.

Sixteen of those species are considered species of management concern, such as the great gray owl, wolverine, Pacific marten, and northern rubber boa.

Previous research shows that some species such as great gray owls may benefit from fire in terms of foraging habitat and can be somewhat resilient, but again, the unknown is whether that benefit holds true with this magnitude of habitat change in such a short time.

Some good news: a closer look at the details of habitat change by species showed that these fires are not disproportionately impacting habitats of species of conservation concern compared with wildlife species in general, a finding that suggests the areas where these species live may serve as refugia.

Read more at Science Daily

From the first bite, our sense of taste helps pace our eating

When you eagerly dig into a long-awaited dinner, signals from your stomach to your brain keep you from eating so much you'll regret it -- or so it's been thought. That theory had never really been directly tested until a team of scientists at UC San Francisco recently took up the question.

The picture, it turns out, is a little different.

The team, led by Zachary Knight, PhD, a UCSF professor of physiology in the Kavli Institute for Fundamental Neuroscience, discovered that it's our sense of taste that pulls us back from the brink of food inhalation on a hungry day. Stimulated by the perception of flavor, a set of neurons -- a type of brain cell -- leaps to attention almost immediately to curtail our food intake.

"We've uncovered a logic the brainstem uses to control how fast and how much we eat, using two different kinds of signals, one coming from the mouth, and one coming much later from the gut," said Knight, who is also an investigator with the Howard Hughes Medical Institute and a member of the UCSF Weill Institute for Neurosciences. "This discovery gives us a new framework to understand how we control our eating."

The study, which appears Nov. 22, 2023 in Nature, could help reveal exactly how weight-loss drugs like Ozempic work, and how to make them more effective.

New views into the brainstem

Pavlov proposed over a century ago that the sight, smell and taste of food are important for regulating digestion. Studies in the 1970s and 1980s also suggested that the taste of food may restrain how fast we eat, but it had been impossible to study the relevant brain activity during eating because the brain cells that control this process are located deep in the brainstem, making them hard to access or record in an awake animal.

Over the years, the idea had been forgotten, Knight said.

New techniques developed by lead author Truong Ly, PhD, a graduate student in Knight's lab, allowed for the first-ever imaging and recording of a brainstem structure critical for feeling full, called the nucleus of the solitary tract, or NTS, in an awake, active mouse. He used those techniques to look at two types of neurons that have been known for decades to have a role in food intake.

The team found that when they put food directly into the mouse's stomach, brain cells called PRLH (for prolactin-releasing hormone) were activated by nutrient signals sent from the GI tract, in line with traditional thinking and the results of prior studies.

However, when they allowed the mice to eat the food as they normally would, those signals from the gut didn't show up. Instead, the PRLH brain cells switched to a new activity pattern that was entirely controlled by signals from the mouth.

"It was a total surprise that these cells were activated by the perception of taste," said Ly. "It shows that there are other components of the appetite-control system that we should be thinking about."

While it may seem counterintuitive for our brains to slow eating when we're hungry, the brain is actually using the taste of food in two different ways at the same time. One part is saying, "This tastes good, eat more," and another part is watching how fast you're eating and saying, "Slow down or you're going to be sick."

"The balance between those is how fast you eat," said Knight.

The activity of the PRLH neurons seems to affect how palatable the mice found the food, Ly said. That meshes with our human experience that food is less appetizing once you've had your fill of it.

Brain cells that inspire weight-loss drugs

The PRLH-neuron-induced slowdown also makes sense in terms of timing. The taste of food triggers these neurons to switch their activity in seconds, from keeping tabs on the gut to responding to signals from the mouth.

Meanwhile, it takes many minutes for a different group of brain cells, called CGC neurons, to begin responding to signals from the stomach and intestines. These cells act over much slower time scales -- tens of minutes -- and can hold back hunger for a much longer period of time.

"Together, these two sets of neurons create a feed-forward, feed-back loop," said Knight. "One is using taste to slow things down and anticipate what's coming. The other is using a gut signal to say, 'This is how much I really ate. Ok, I'm full now!'"

The CGC brain cells' response to stretch signals from the gut is to release GLP-1, the hormone mimicked by Ozempic, Wegovy and other new weight-loss drugs.

These drugs act on the same region of the brainstem that Ly's technology has finally allowed researchers to study. "Now we have a way of teasing apart what's happening in the brain that makes these drugs work," he said.

A deeper understanding of how signals from different parts of the body control appetite could open the door to weight-loss regimens tailored to the individual ways people eat, by optimizing how the signals from the two sets of brain cells interact, the researchers said.

Read more at Science Daily

Nov 24, 2023

Telescope Array detects second highest-energy cosmic ray ever

In 1991, the University of Utah Fly's Eye experiment detected the highest-energy cosmic ray ever observed. Later dubbed the Oh-My-God particle, the cosmic ray's energy shocked astrophysicists. Nothing in our galaxy had the power to produce it, and the particle had more energy than was theoretically possible for cosmic rays traveling to Earth from other galaxies. Simply put, the particle should not exist.

The Telescope Array has since observed more than 30 ultra-high-energy cosmic rays, though none approaching the Oh-My-God-level energy. No observations have yet revealed their origin or how they are able to travel to the Earth.

On May 27, 2021, the Telescope Array experiment detected the second-highest-energy cosmic ray ever observed. At 2.4 x 10^20 eV, the energy of this single subatomic particle is equivalent to dropping a brick on your toe from waist height. Led by the University of Utah (the U) and the University of Tokyo, the Telescope Array consists of 507 surface detector stations arranged in a square grid that covers 700 km² (~270 mi²) outside of Delta, Utah, in the state's West Desert. The event triggered 23 detectors in the northwest region of the Telescope Array, splashing across 48 km² (18.5 mi²). Its arrival direction appeared to be from the Local Void, an empty area of space bordering the Milky Way galaxy.
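As a sanity check on the brick comparison, the particle's energy can be converted to joules and set against the potential energy of a falling brick. The brick's mass and drop height below are illustrative assumptions, not figures from the article; the point is only that the two energies are of the same order:

```python
# Order-of-magnitude check of the brick comparison.
EV_TO_J = 1.602176634e-19  # joules per electron-volt (CODATA value)

amaterasu_eV = 2.4e20
energy_J = amaterasu_eV * EV_TO_J
print(f"Amaterasu particle: {energy_J:.1f} J")  # ~38 J

# Potential energy of a brick dropped from roughly waist height.
# (3.0 kg and 1.3 m are assumed illustrative values.)
m, g, h = 3.0, 9.81, 1.3
print(f"Brick drop: {m * g * h:.1f} J")  # ~38 J
```

A few tens of joules is an everyday amount of energy; what is extraordinary is that here it is carried by a single subatomic particle.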

"The particles are so high energy, they shouldn't be affected by galactic and extra-galactic magnetic fields. You should be able to point to where they come from in the sky," said John Matthews, Telescope Array co-spokesperson at the U and co-author of the study. "But in the case of the Oh-My-God particle and this new particle, you trace its trajectory to its source and there's nothing high energy enough to have produced it. That's the mystery of this -- what the heck is going on?"

In their observation, published on Nov. 24, 2023, in the journal Science, an international collaboration of researchers describes the ultra-high-energy cosmic ray, evaluates its characteristics, and concludes that the rare phenomenon might follow particle physics unknown to science. The researchers named it the Amaterasu particle after the sun goddess in Japanese mythology. The Oh-My-God and Amaterasu particles were detected using different observation techniques, confirming that while rare, these ultra-high-energy events are real.

"These events seem like they're coming from completely different places in the sky. It's not like there's one mysterious source," said John Belz, professor at the U and co-author of the study. "It could be defects in the structure of spacetime, colliding cosmic strings. I mean, I'm just spit-balling crazy ideas that people are coming up with because there's not a conventional explanation."

Natural particle accelerators

Cosmic rays are echoes of violent celestial events that have stripped matter to its subatomic structures and hurled it through the universe at nearly the speed of light. Essentially, cosmic rays are charged particles with a wide range of energies -- positive protons, negative electrons, or entire atomic nuclei -- that travel through space and rain down onto Earth nearly constantly.

A cosmic ray hits Earth's upper atmosphere and blasts apart the nuclei of oxygen and nitrogen atoms, generating many secondary particles. These travel a short distance in the atmosphere and repeat the process, building a shower of billions of secondary particles that scatter to the surface. The footprint of this secondary shower is massive and requires detectors covering an area as large as the Telescope Array. The surface detectors utilize a suite of instrumentation that gives researchers information about each cosmic ray: the timing of the signal shows its trajectory, and the number of charged particles hitting each detector reveals the primary particle's energy.
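The timing-based direction reconstruction can be illustrated with a toy model: if the shower front is treated as a plane moving at the speed of light, each detector's trigger time is a linear function of its ground position, so a least-squares fit to the times recovers the arrival direction. The detector layout and shower direction below are made up for illustration; this is not the collaboration's actual reconstruction code:

```python
# Toy plane-front fit: recover a shower's arrival direction from
# detector trigger times (illustrative geometry, not real data).
import numpy as np

C = 299_792_458.0  # speed of light, m/s
rng = np.random.default_rng(0)

# Assumed grid of 23 ground detectors (positions in metres).
xy = rng.uniform(-3000, 3000, size=(23, 2))

# Assumed true propagation direction of the shower (unit vector,
# pointing downward toward the ground).
true = np.array([0.3, -0.2, -np.sqrt(1 - 0.3**2 - 0.2**2)])

# Plane-front model: t_i = t0 + (d_x * x_i + d_y * y_i) / c
t0 = 1e-3
t = t0 + (xy @ true[:2]) / C

# Linear least-squares fit for t0 and the horizontal direction components.
A = np.column_stack([np.ones(len(xy)), xy / C])
coef, *_ = np.linalg.lstsq(A, t, rcond=None)
d_fit = coef[1:]
print("recovered d_x, d_y:", d_fit)  # ≈ (0.3, -0.2)
```

With noiseless synthetic times the fit recovers the assumed direction exactly; real reconstructions also model the curvature and thickness of the shower front.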

Because the particles are charged, their flight path resembles a ball in a pinball machine as they zigzag through electromagnetic fields and the cosmic microwave background. It is nearly impossible to trace the trajectory of most cosmic rays, which lie on the low to middle end of the energy spectrum, and even high-energy cosmic rays are distorted by the microwave background. Particles with Oh-My-God- and Amaterasu-level energies blast through intergalactic space relatively unbent. Only the most powerful of celestial events can produce them.

"Things that people think of as energetic, like supernova, are nowhere near energetic enough for this. You need huge amounts of energy, really high magnetic fields to confine the particle while it gets accelerated," said Matthews.

Ultra-high-energy cosmic rays must exceed 5 x 10^19 eV. This means that a single subatomic particle carries the same kinetic energy as a major league pitcher's fastball, and has tens of millions of times more energy than any human-made particle accelerator can achieve. Astrophysicists calculated this theoretical limit, known as the Greisen-Zatsepin-Kuzmin (GZK) cutoff, as the maximum energy a proton can hold while traveling over long distances before interactions with the microwave background radiation sap its energy. Known source candidates, such as active galactic nuclei or black holes with accretion disks emitting particle jets, tend to be more than 160 million light-years away from Earth. The new particle's 2.4 x 10^20 eV and the Oh-My-God particle's 3.2 x 10^20 eV easily surpass the cutoff.

Researchers also analyze a cosmic ray's composition for clues to its origin. A heavier particle, such as an iron nucleus, has more charge and is more susceptible to bending in a magnetic field than a lighter particle, such as a proton, the nucleus of a hydrogen atom. The new particle is likely a proton. Particle physics dictates that a cosmic ray with energy beyond the GZK cutoff is too powerful for the microwave background to distort its path, yet back-tracing its trajectory points toward empty space.

"Maybe magnetic fields are stronger than we thought, but that disagrees with other observations that show they're not strong enough to produce significant curvature at these ten-to-the-twentieth electron volt energies," said Belz. "It's a real mystery."

Expanding the footprint

The Telescope Array is uniquely positioned to detect ultra-high-energy cosmic rays. It sits at about 1,200 m (4,000 ft), the elevation sweet spot at which secondary particles develop maximally before they begin to decay. Its location in Utah's West Desert provides ideal atmospheric conditions in two ways: the dry air is crucial because humidity absorbs the ultraviolet light necessary for detection, and the region's dark skies are essential because light pollution would create too much noise and obscure the cosmic rays.

Read more at Science Daily

Protect delicate polar ecosystems by mapping biodiversity

Polar regions contain vast, undiscovered biodiversity but are among the most threatened and least understood areas of the world.

Now scientists led by the University of East Anglia (UEA) and the British Antarctic Survey (BAS) are calling for a roadmap of polar ecosystems to fill that knowledge gap, preserve polar life and even protect "our everyday life and our planet's health." The study would map all biodiversity in those regions, from the atmosphere to the deep sea and from land to the oceans.

The authors said concerted action is required to mitigate the impact of warming on polar ecosystems via conservation efforts, to sustainably manage these unique habitats and their ecosystem services, and for the sustainable bioprospecting of novel genes and compounds for societal gain.

'Multi-omics for studying and understanding polar life', is published today in Nature Communications. The paper is co-authored by UEA, BAS and the University of Bielefeld, Germany.

Polar ecosystems are the most threatened because they are the most sensitive to global warming. They are being lost at a rapid pace and with them all the biology that provides ecosystem services and biology-driven regulation of the climate, including the carbon cycle.

Prof Thomas Mock, Professor of Marine Microbiology in UEA's School of Environmental Sciences, is the joint lead author with Prof Melody Clark, Project Leader for the British Antarctic Survey.

Prof Thomas Mock said: "Biodiversity projections for the polar regions can only be reliably constructed if we have a sufficiently profound understanding of the diversity, ecological functions, and interrelations of polar organisms, as well as their resilience to climate change.

"These remote regions play substantial, often underappreciated, roles in the carbon cycle and drive global nutrient and dissolved organic matter fluxes. Consequently, polar environmental and ecological processes are intimately connected with our everyday life and our planet's health, much of which is underpinned by the endemic biota, from viruses to large animals.

"There is strong evidence that climate-induced changes in the polar regions are already altering species distributions on land and in the sea, with major impacts on ecosystem function."

Some species have shifted poleward, which has a knock-on effect on the food chain. Polar life, from microbes to seals, whales and polar bears, largely depends on overall low temperature and a substantial snow and ice cover, which are experiencing the impacts of global warming.

In the Arctic, temperatures are rising at least four times faster than elsewhere, destabilising the Arctic jet stream and increasing the likelihood of extreme weather events including heat waves, drought and flooding in temperate regions.

On land, permafrost melting and collapsing Arctic coastlines are dramatically altering ecological interactions and biogeochemistry due to the release of millennia-old carbon stores, trace elements, nutrients and potentially even deep-frozen ancient viruses and pathogenic bacteria.

In the oceans, increased seasonal melting of sea ice is over-stabilizing surface waters, which reduces the supply of nutrients required for primary production.

The situation in the Southern Ocean and on the Antarctic continent is equally bleak, particularly for the Antarctic Peninsula, which has already experienced substantial warming and increased loss of sea ice and glaciers.

The Southern Ocean is responsible for the uptake of three-quarters of the anthropogenic heat absorbed by the ocean and up to half of the carbon drawdown. It accounts for around 40 per cent of the global oceanic uptake of anthropogenic CO2 and around 50 per cent of the total atmospheric uptake. Furthermore, carbon sequestration by the organisms living in polar seas is probably the largest natural negative feedback against climate change.

The climate impacts on biodiversity and ecosystem functioning in both the Arctic and Antarctic serve as a bellwether for the consequences of global warming, including the persistence of biodiversity on Earth.

Prof Clark said: "Sequencing technologies have massively changed our abilities to decipher how organisms work. However, the uptake in polar biology has been relatively low, especially when considering the tens of thousands of species that reside at the poles and are under threat in our warming world.

"Understanding how lots of very strange organisms living in extreme cold can help answer globally questions and provide real benefits for society. Failure to act now will result in a substantial loss of knowledge regarding evolutionary adaptation to the cold."

Genomic screening not only offers the possibility of identifying populations under stress, but it can also be used for the monitoring of invasive species, thereby facilitating early interventions.

Prof Mock said: "With the cold regions of our planet diminishing, there is a real imperative to obtain full genome sequences for diverse organisms inhabiting polar ecosystems, from the deep oceans to the permafrost on land, for both the Arctic and Antarctic. This will enable the wider application of omics technologies to polar species, which will revolutionise our understanding of evolution in the cold and adaptive responses to a warming world."

Read more at Science Daily

'Strange metal' is strangely quiet in noise experiment

True to form, a "strange metal" quantum material proved strangely quiet in recent quantum noise experiments at Rice University. Published this week in Science, the measurements of quantum charge fluctuations known as "shot noise" provide the first direct evidence that electricity seems to flow through strange metals in an unusual liquidlike form that cannot be readily explained in terms of quantized packets of charge known as quasiparticles.

"The noise is greatly suppressed compared to ordinary wires," said Rice's Douglas Natelson, the study's corresponding author.

"Maybe this is evidence that quasiparticles are not well-defined things or that they're just not there and charge moves in more complicated ways. We have to find the right vocabulary to talk about how charge can move collectively."

The experiments were performed on nanoscale wires of a quantum critical material with a precise 1-2-2 ratio of ytterbium, rhodium and silicon (YbRh2Si2), which has been studied in great depth over the past two decades by Silke Paschen, a solid-state physicist at the Vienna University of Technology (TU Wien). The material contains a high degree of quantum entanglement that produces an unusual ("strange") temperature-dependent behavior very different from that of normal metals such as silver or gold.

In such normal metals, each quasiparticle, or discrete unit of charge, is the product of incalculable tiny interactions between countless electrons.

First put forward 67 years ago, the quasiparticle is a concept physicists use to represent the combined effect of those interactions as a single quantum object for the purposes of quantum mechanical calculations.

Some prior theoretical studies have suggested that the charge in a strange metal might not be carried by such quasiparticles, and shot noise experiments allowed Natelson, study lead author Liyang Chen, a former student in Natelson's lab, and other Rice and TU Wien co-authors to gather the first direct empirical evidence to test the idea.

"The shot noise measurement is basically a way of seeing how granular the charge is as it goes through something," Natelson said.

"The idea is that if I'm driving a current, it consists of a bunch of discrete charge carriers. Those arrive at an average rate, but sometimes they happen to be closer together in time, and sometimes they're farther apart."

Applying the technique in YbRh2Si2 crystals presented significant technical challenges.

Shot noise experiments cannot be performed on macroscopic crystals but instead require samples of nanoscopic dimensions.

Thus, the growth of extremely thin but nevertheless perfectly crystalline films had to be achieved, something that Paschen, Maxwell Andrews and their collaborators at TU Wien managed after almost a decade of hard work.

Next, Chen had to find a way to maintain that level of perfection while fashioning wires from these thin films that were about 5,000 times narrower than a human hair.

Rice co-author Qimiao Si, the lead theorist on the study and the Harry C. and Olga K. Wiess Professor of Physics and Astronomy, said he, Natelson and Paschen first discussed the idea for the experiments while Paschen was a visiting scholar at Rice in 2016.

Si said the results are consistent with a theory of quantum criticality he published in 2001 that he has continued to explore in a nearly two-decade collaboration with Paschen.

"The low shot noise brought about fresh new insights into how the charge-current carriers entwine with the other agents of the quantum criticality that underlies the strange metallicity," said Si, whose group performed calculations that ruled out the quasiparticle picture.

"In this theory of quantum criticality, the electrons are pushed to the verge of localization, and the quasiparticles are lost everywhere on the Fermi surface."

Natelson said the larger question is whether similar behavior might arise in any or all of the dozens of other compounds that exhibit strange metal behavior.

"Sometimes you kind of feel like nature is telling you something," Natelson said.

"This 'strange metallicity' shows up in many different physical systems, despite the fact that the microscopic, underlying physics is very different. In copper-oxide superconductors, for example, the microscopic physics is very, very different than in the heavy-fermion system we're looking at. They all seem to have this linear-in-temperature resistivity that's characteristic of strange metals, and you have to wonder is there something generic going on that is independent of whatever the microscopic building blocks are inside them."

Read more at Science Daily

'Woman the hunter': Studies aim to correct history

When Cara Ocobock was a young child, she often wondered at the images in movies, books, comics and cartoons portraying prehistoric men and women: "man the hunter" with spear in hand, accompanied by "woman the gatherer" with a baby strapped to her back and a basket of crop seeds in hand.

"This was what everyone was used to seeing," Ocobock said. "This was the assumption that we've all just had in our minds and that was carried through in our museums of natural history."

Many years later, Ocobock, an assistant professor in the Department of Anthropology and director of the Human Energetics Laboratory at the University of Notre Dame, found herself as a human biologist studying physiology and prehistoric evidence and discovering that many of these conceptions about early women and men weren't quite accurate. The accepted reconstruction of human evolution assumed males were biologically superior, but that interpretation wasn't telling the whole story.

Relying on both physiological and archaeological evidence, Ocobock and her research partner, Sarah Lacy, an anthropologist with expertise in biological archaeology at the University of Delaware, recently published two studies simultaneously in the journal American Anthropologist. Their joint research, coming from these two angles, found that not only did prehistoric women engage in the practice of hunting, but their female anatomy and biology would have made them intrinsically better suited for it.

Of their dual-pronged research, which was the cover story for the November issue of Scientific American, Ocobock said, "Rather than viewing it as a way of erasing or rewriting history, our studies are trying to correct the history that erased women from it."

Female physiology and estrogen, the 'unsung hero of life'

In their physiological study, the two researchers explained that prehistoric females were quite capable of performing the arduous physical task of hunting prey and were likely able to hunt successfully over prolonged periods of time. From a metabolic standpoint, Ocobock explained, the female body is better suited for endurance activity, "which would have been critical in early hunting because they would have had to run the animals down into exhaustion before actually going in for the kill."

Two huge contributors to that enhanced metabolism are hormones -- in this case, estrogen and adiponectin, which are typically present in higher quantities in female bodies than in male. These two hormones play a critical role in enabling the female body to modulate glucose and fat, a function that is key in athletic performance.

Estrogen, in particular, helps regulate fat metabolism by encouraging the body to use its stored fat for energy before using up its carbohydrate stores. "Since fat contains more calories than carbs do, it's a longer, slower burn," Ocobock explained, "which means that the same sustained energy can keep you going longer and can delay fatigue."

Estrogen also protects the body's cells from damage during heat exposure due to extreme physical activity. "Estrogen is really the unsung hero of life, in my mind," Ocobock said. "It is so important for cardiovascular and metabolic health, brain development and injury recovery."

Adiponectin also amplifies fat metabolism while sparing carbohydrate and/or protein metabolism, allowing the body to stay the course during extended periods, especially over great distances. In this way, adiponectin is able to protect the muscles from breaking down and keeps them in better condition for sustained exercise, Ocobock explained.

The female body structure itself is another element Ocobock and Lacy found to be of advantage in terms of endurance and effectiveness for prehistoric hunters. "With the typically wider hip structure of the female, they are able to rotate their hips, lengthening their steps," Ocobock detailed. "The longer steps you can take, the 'cheaper' they are metabolically, and the farther you can get, faster.

"When you look at human physiology this way, you can think of women as the marathon runners versus men as the powerlifters."

Archaeology tells more of the story of 'woman the hunter'

Several archaeological findings indicate that prehistoric women not only sustained the injuries that come with the dangerous business of close-contact hunting, but that hunting was an activity they held in high esteem and valued. "We have constructed Neandertal hunting as an up-close-and-personal style of hunting," Ocobock said, "meaning that hunters would often have to get up underneath their prey in order to kill them. As such, we find that both males and females have the same resulting injuries when we look at their fossil records."

Ocobock described those traumatic injuries as being similar to those received by modern-day rodeo clowns -- injuries to the head and chest where they were kicked by the animal, or to the limbs where they were bitten or received a fracture. "We find these patterns and rates of wear and tear equally in both women and men," she said. "So they were both participating in ambush-style hunting of large game animals."

Second, Ocobock said, there is evidence of early female hunters in the Holocene period in Peru, where females were buried with hunting weapons. "You don't often get buried with something unless it was important to you or was something that you used frequently in your life.

"Furthermore, we have no reason to believe that prehistoric women abandoned their hunting while pregnant, breastfeeding or carrying children," Ocobock added, "nor do we see in the deep past any indication that a strict sexual division of labor existed."

The bottom line, Ocobock noted, was that "hunting belonged to everyone, not just to males," especially in prehistoric societies where survival was an all-hands-on-deck activity. "There weren't enough people living in groups to be specialized in different tasks. Everyone had to be a generalist to survive."

Fighting bias

"This revelation is especially important in the current political moment of our society where sex and gender are in a spotlight," Ocobock said. "And I want people to be able to change these ideas of female physical inferiority that have been around for so long."

When talking about reconstructing the past in order to better understand it -- and to conduct "good science" -- Ocobock said scientists have to be extremely careful about how modern-day bias can seep into interpretations of the past. She cautioned that researchers must be aware of their own biases and ask the proper questions, so that they are not led down the road of looking for what they want to see.

Read more at Science Daily

Nov 23, 2023

'Triple star' discovery could revolutionize understanding of stellar evolution

A ground-breaking new discovery by University of Leeds scientists could transform the way astronomers understand some of the biggest and most common stars in the Universe.

Research by PhD student Jonathan Dodd and Professor René Oudmaijer, from the University's School of Physics and Astronomy, points to intriguing new evidence that massive Be stars -- until now mainly thought to exist in double stars -- could in fact be "triples."

The remarkable discovery could revolutionise our understanding of the objects -- a subset of B stars -- which are considered an important "test bed" for developing theories on how stars evolve more generally.

These Be stars are surrounded by a characteristic disc made of gas -- similar to the rings of Saturn in our own Solar System.

And although Be stars have been known for about 150 years -- having first been identified by renowned Italian astronomer Angelo Secchi in 1866 -- until now, no one has known how they were formed.

The consensus among astronomers so far has been that the discs are formed by the rapid rotation of the Be stars, which can itself be caused by the stars interacting with another star in a binary system.

Triple systems

Mr Dodd, corresponding author of the research, said: "The best point of reference for that is if you've watched Star Wars, there are planets where they have two Suns."

But now, by analysing data from the European Space Agency's Gaia satellite, the scientists say they have found evidence these stars actually exist in triple systems -- with three bodies interacting instead of just two.

Mr Dodd added: "We observed the way the stars move across the night sky, over longer periods like 10 years, and shorter periods of around six months. If a star moves in a straight line, we know there's just one star, but if there is more than one, we will see a slight wobble or, in the best case, a spiral.

"We applied this across the two groups of stars that we are looking at -- the B stars and the Be stars -- and what we found, confusingly, is that at first it looks like the Be stars have a lower rate of companions than the B stars. This is interesting because we'd expect them to have a higher rate."

However, Principal Investigator Prof Oudmaijer said: "The fact that we do not see them might be because they are now too faint to be detected."
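The wobble test Dodd describes can be sketched in a few lines: fit straight-line proper motion to a star's positions over time, and treat large residuals as the signature of a companion. This is a toy illustration, not the team's Gaia pipeline; the motions, noise level, and wobble amplitude below are invented for the example.

```python
import math
import random

random.seed(1)

def residual_rms(t, pos):
    """Least-squares fit pos ~ a*t + b and return the RMS of the residuals.
    A lone star leaves only measurement noise; a companion leaves a wobble."""
    n = len(t)
    tbar = sum(t) / n
    pbar = sum(pos) / n
    a = sum((ti - tbar) * (p - pbar) for ti, p in zip(t, pos)) / \
        sum((ti - tbar) ** 2 for ti in t)
    b = pbar - a * tbar
    return math.sqrt(sum((p - (a * ti + b)) ** 2 for ti, p in zip(t, pos)) / n)

t = [i * 10 / 119 for i in range(120)]  # 10 years of regular observations

# Lone star: linear proper motion plus 0.05 mas of measurement noise.
lone = [3.0 * ti + random.gauss(0, 0.05) for ti in t]

# Star with a companion: the same motion plus a 0.5 mas orbital wobble.
binary = [3.0 * ti + 0.5 * math.sin(2 * math.pi * ti / 4) + random.gauss(0, 0.05)
          for ti in t]

print(residual_rms(t, lone))    # near the 0.05 mas noise floor: single star
print(residual_rms(t, binary))  # far above it: excess wobble hints at a companion
```

In practice the fit is done in two sky coordinates and compared against the measurement uncertainty, but the logic is the same: motion that cannot be explained by a straight line points to unseen mass.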

Mass transfer

The researchers then looked at a different set of data, searching for companion stars at larger separations, and found that at these separations the rate of companion stars is very similar between the B and Be stars.

From this, they were able to infer that in many cases a third star is coming into play, forcing the companion closer to the Be star -- close enough that mass can be transferred from one to the other and form the characteristic Be star disc.

This could also explain why we do not see these companions anymore; they have become too small and faint to be detected after the "vampire" Be star has sucked in so much of their mass.

The discovery could have huge impacts on other areas of astronomy -- including our understanding of black holes, neutron stars and gravitational wave sources.

Prof Oudmaijer said: "There's a revolution going on in physics at the moment around gravitational waves. We have only been observing these gravitational waves for a few years now, and these have been found to be due to merging black holes.

"We know that these enigmatic objects -- black holes and neutron stars -- exist, but we don't know much about the stars that would become them. Our findings provide a clue to understanding these gravitational wave sources."

He added: "Over the last decade or so, astronomers have found that binarity is an incredibly important element in stellar evolution. We are now moving more towards the idea it is even more complex than that and that triple stars need to be considered."

"Indeed," Oudmaijer said, "triples have become the new binaries."

Read more at Science Daily

Massive 2022 eruption reduced ozone layer levels

When the Hunga Tonga-Hunga Ha'apai volcano erupted on January 15, 2022 in the South Pacific, it produced a shock wave felt around the world and triggered tsunamis in Tonga, Fiji, New Zealand, Japan, Chile, Peru and the United States. It also changed the chemistry and dynamics of the stratosphere in the year following the eruption, leading to unprecedented losses in the ozone layer of up to 7% over large areas of the Southern Hemisphere, according to a recent study published in the Proceedings of the National Academy of Sciences (PNAS) from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and the University of Maryland.

Driving those atmospheric changes, according to the research, was the sheer amount of water vapor injected into the stratosphere by the undersea volcano.

The stratosphere lies approximately 8 to 30 miles above Earth's surface and is where the protective ozone layer resides.

"The Hunga Tonga-Hunga Ha'apai eruption was truly extraordinary in that it injected about 300 billion pounds of water into the normally dry stratosphere, which is just an absolutely incredible amount of water from a single event," said David Wilmouth, a project scientist at SEAS and first author of the paper.

"This eruption put us in uncharted territory," said Ross Salawitch, professor at the University of Maryland's Earth System Science Interdisciplinary Center and co-author of the study.

"We've never seen, in the history of satellite records, this much water vapor injected into the atmosphere and our paper is the first that looks at the downstream consequences over broad regions of both hemispheres in the months following the eruption using satellite data and a global model."

The Hunga Tonga-Hunga Ha'apai eruption was the largest explosion ever recorded in the atmosphere.

The eruption hurled aerosols and gases deep into the stratosphere.

Some material reached the lower mesosphere, more than 30 miles above the Earth's surface, altitudes never before recorded for a volcanic eruption.

Previous studies found that the eruption increased water vapor in the stratosphere by 10% worldwide, with even higher concentrations in some areas of the Southern Hemisphere.

Wilmouth, Salawitch and the rest of the research team used data from the Microwave Limb Sounder (MLS) aboard the NASA Aura satellite to track how that water vapor moved across the globe and to monitor temperature and levels of chlorine monoxide (ClO), ozone (O3), nitric acid (HNO3), and hydrogen chloride (HCl) in the stratosphere for the year following the eruption.

They then compared those measurements to data collected by MLS from 2005 to 2021 prior to the eruption.

The team found that the injection of water vapor and sulfur dioxide (SO2) changed both the chemistry and the dynamics of the stratosphere.

In terms of chemistry, the SO2 led to an increase of sulfate aerosols, which provided new surfaces for chemical reactions to occur.

"Certain reactions that might not happen at all or only happen slowly can happen faster if there are aerosols available on which those reactions can take place," said Wilmouth.

"The injection of SO2 from the volcano allowed sulfate aerosols to form and the presence of water vapor led to the additional production of sulfate aerosols."

The increased sulfate aerosols and water vapor kicked off a chain of events in the complex atmospheric chemistry that led to widespread changes in the concentrations of a number of compounds, including ozone.

The extra water vapor also had a cooling effect in the stratosphere, leading to a change in circulation, which drove decreases in ozone in the Southern Hemisphere and an increase of ozone over the tropics.

The researchers found that the peak decrease in ozone occurred in October, nine months after the eruption.

"We had this enormous increase in water vapor in the stratosphere with modest increases in sulfate that set off a series of events that led to significant changes in temperature and circulation, ClO, HNO3, HCl, O3, and other gases," Wilmouth said.

Next, the researchers hope to continue the study by following the impact of the volcano into 2023 and beyond as the water vapor moves from the tropics and midlatitudes to the Southern Hemisphere pole, where it has the potential to amplify ozone losses in the Antarctic.

The water vapor is expected to stay elevated in the stratosphere for a period of several years.

Read more at Science Daily

Neanderthals were the world's first artists, research reveals

Recent research has shown that engravings in a cave in La Roche-Cotard (France), which has been sealed for thousands of years, were actually made by Neanderthals. This research was performed by Basel archaeologist Dorota Wojtczak together with a team of researchers from France and Denmark, whose findings reveal that the Neanderthals were in fact the first humans with an appreciation of art.

When the French archaeologist Jean-Claude Marquet entered the La Roche-Cotard cave in the Loire Valley for the first time back in 1974, he suspected that the fine lines on the wall could be of human origin.

He also found scrapers and other retouched pieces known as Mousterian stone artifacts that suggested the cave had been used by Neanderthals.

Were the marks on the wall evidence of early Neanderthal artistic activity?

Posing this question raised the possibility of breaking with the consensus of the time, which largely assumed that Homo neanderthalensis lacked any higher cognitive abilities.

Fearing he would be unable to provide sufficient scientific evidence to prove his hypothesis, Marquet left the cave untouched for almost 40 years.

Marks on the wall produced by human hands

Together with an international team, he made another attempt in 2016.

This time he was accompanied by Dr. Dorota Wojtczak from Integrative Prehistoric and Archaeological Science (IPAS) at the Department of Environmental Sciences of the University of Basel, who specializes in archaeological use-wear analysis.

"Our task was to use modern methods to prove the human origin of these wall engravings," explains Wojtczak in her office at IPAS.

The researchers recently published their findings in the journal PLoS ONE.

First with photos and drawings and later with a 3D scanner, the marks in the tuff rock of the cave wall were meticulously recorded.

In her laboratory in Basel, Wojtczak compared these samples from the cave with tuff she had worked on experimentally with wood, bone and stone tools, as well as with her hands.

"This research clearly showed that the cave marks were not made with tools, but by scratching with human fingers," says Wojtczak.

Cave sealed for over 50,000 years

At the same time, examination of cave sediment by researchers from Denmark showed that the cave must have been sealed off by mud residues from the Loire and soil sediments for over 50,000 years before being rediscovered.

This makes the La Roche-Cotard cave system a very special location -- a veritable "time capsule." "At this time, 50,000 years ago, there were no modern humans in Europe, only Neanderthals," says Wojtczak.

The wall marks and artifacts can therefore only come from these early humans.

While the clear geometric shapes with parallel and triangular lines suggest that these marks were not scribbled on the wall by chance, the researcher does not know what they represent.

"But they could only have been made by someone who proceeded with planning and understanding," she says.

And whether it was "art" as such, or a form of record-keeping, is a matter of interpretation.

La Roche-Cotard promises further findings

The cave holds many other archaeological secrets. Jean-Claude Marquet also found an object that resembles the face of a human or animal back in 1976, and Wojtczak's use-wear analysis suggests that this object is also man-made.

Another object from the cave appears to be a small oil lamp.

"Specialists are currently investigating whether the object bears any pigments or soot substances that could help to identify the type of fuel used at the time," explains Wojtczak.

The chamber of La Roche-Cotard that has been explored so far is just one part of an entire cave system.

The researcher hopes to gain further insight into the Neanderthals' activities, particularly from Chamber 4, which is still largely covered by sediment.

Wojtczak is convinced that every investigation will help to further dismantle the traditional consensus of Neanderthals as mentally inferior humans, and reinforce the perception of them as close cousins of modern humans.

Read more at Science Daily

Why emotions stirred by music create such powerful memories

Time flows in a continuous stream -- yet our memories are divided into separate episodes, all of which become part of our personal narrative. How emotions shape this memory formation process is a mystery that science has only recently begun to unravel. The latest clue comes from UCLA psychologists, who have discovered that fluctuating emotions elicited by music help form separate and durable memories.

The study, published in Nature Communications, used music to manipulate the emotions of volunteers performing simple tasks on a computer. The researchers found that the dynamics of people's emotions molded otherwise neutral experiences into memorable events.

"Changes in emotion evoked by music created boundaries between episodes that made it easier for people to remember what they had seen and when they had seen it," said lead author Mason McClay, a doctoral student in psychology at UCLA. "We think this finding has great therapeutic promise for helping people with PTSD and depression."

As time unfolds, people need to group information, since there is too much to remember (and not all of it useful). Two processes appear to be involved in turning experiences into memories over time: The first integrates our memories, compressing and linking them into individualized episodes; the other expands and separates each memory as the experience recedes into the past. There's a constant tug of war between integrating memories and separating them, and it's this push and pull that helps to form distinct memories. This flexible process helps a person understand and find meaning in their experiences, as well as retain information.

"It's like putting items into boxes for long-term storage," said corresponding author David Clewett, an assistant professor of psychology at UCLA. "When we need to retrieve a piece of information, we open the box that holds it. What this research shows is that emotions seem to be an effective box for doing this sort of organization and for making memories more accessible."

A similar effect may help explain why Taylor Swift's "Eras Tour" has been so effective at creating vivid and lasting memories: Her concert contains meaningful chapters that can be opened and closed to relive highly emotional experiences.

McClay and Clewett, along with Matthew Sachs at Columbia University, hired composers to create music specifically designed to elicit joyous, anxious, sad or calm feelings of varied intensity. Study participants listened to the music while imagining a narrative to accompany a series of neutral images on a computer screen, such as a watermelon slice, a wallet or a soccer ball. They also moved a computer mouse to record moment-to-moment changes in their feelings, using a novel tool developed for tracking emotional reactions to music.

Then, after performing a task meant to distract them, participants were shown pairs of images again in a random order. For each pair, they were asked which image they had seen first, then how far apart in time they felt they had seen the two objects. Pairs of objects that participants had seen immediately before and after a change of emotional state -- whether of high, low, or medium intensity -- were remembered as having occurred farther apart in time compared to images that did not span an emotional change. Participants also had worse memory for the order of items that spanned emotional changes compared to items they had viewed while in a more stable emotional state. These effects suggest that a change in emotion resulting from listening to music was pushing new memories apart.

"This tells us that intense moments of emotional change and suspense, like the musical phrases in Queen's 'Bohemian Rhapsody,' could be remembered as having lasted longer than less emotive experiences of similar length," McClay said. "Musicians and composers who weave emotional events together to tell a story may be imbuing our memories with a rich temporal structure and longer sense of time."

The direction of the change in emotion also mattered. Memory integration was best -- that is, memories of sequential items felt closer together in time, and participants were better at recalling their order -- when the shift was toward more positive emotions. On the other hand, a shift toward more negative emotions (from calmer to sadder, for example) tended to separate and expand the mental distance between new memories.

Participants were also surveyed the following day to assess their longer-term memory, and showed better memory for items and moments when their emotions changed, especially if they were experiencing intense positive emotions. This suggests that feeling more positive and energized can fuse different elements of an experience together in memory.

Sachs emphasized the utility of music as an intervention technique.

"Most music-based therapies for disorders rely on the fact that listening to music can help patients relax or feel enjoyment, which reduces negative emotional symptoms," he said. The benefits of music-listening in these cases are therefore secondary and indirect. Here, we are suggesting a possible mechanism by which emotionally dynamic music might be able to directly treat the memory issues that characterize such disorders."

Clewett said these findings could help people reintegrate the memories that have caused post-traumatic stress disorder.

Read more at Science Daily

Nov 22, 2023

Dwarf galaxies use 10-million-year quiet period to churn out stars

If you look at massive galaxies teeming with stars, you might be forgiven for thinking they are star factories, churning out brilliant balls of gas. But actually, less evolved dwarf galaxies have bigger regions of star factories, with higher rates of star formation.

Now, University of Michigan researchers have discovered the reason underlying this: These galaxies enjoy a 10-million-year delay in blowing out the gas cluttering up their environments.

Star-forming regions are able to hang on to their gas and dust, allowing more stars to coalesce and evolve.

In these relatively pristine dwarf galaxies, massive stars -- stars about 20 to 200 times the mass of our sun -- collapse into black holes instead of exploding as supernovae.

But in more evolved, polluted galaxies, like our Milky Way, they are more likely to explode, thereby generating a collective superwind.

Gas and dust get blasted out of the galaxy, and star formation quickly stops.

Their findings are published in the Astrophysical Journal.

"As stars go supernova, they pollute their environment by producing and releasing metals," said Michelle Jecmen, study first author and an undergraduate researcher.

"We argue that at low metallicity -- galaxy environments that are relatively unpolluted -- there is a 10-million-year delay in the start of strong superwinds, which, in turn, results in higher star formation."

The U-M researchers point to what's called the Hubble tuning fork, a diagram that depicts the way astronomer Edwin Hubble classified galaxies.

In the handle of the tuning fork are the largest galaxies. Huge, round and brimming with stars, these galaxies have already turned all of their gas into stars.

Along the tines of the tuning fork are spiral galaxies that do have gas and star-forming regions along their compact arms.

At the end of the tuning fork's tines are the least evolved, smallest galaxies.

"But these dwarf galaxies have just these really mondo star-forming regions," said U-M astronomer Sally Oey, senior author of the study.

"There have been some ideas around why that is, but Michelle's finding offers a very nice explanation: These galaxies have trouble stopping their star formation because they don't blow away their gas."

Additionally, this 10-million-year period of quiet offers astronomers the opportunity to peer at scenarios similar to the cosmic dawn, a period of time just after the Big Bang, Jecmen said.

In pristine dwarf galaxies, gas clumps together and forms gaps through which radiation can escape.

This previously known phenomenon is called the "picket fence" model, with UV radiation escaping between slats in the fence.

The delay explains why gas would have had time to clump together.

Ultraviolet radiation is important because it ionizes hydrogen -- a process that also occurred right after the Big Bang, causing the universe to go from opaque to transparent.

"And so looking at low-metallicity dwarf galaxies with lots of UV radiation is somewhat similar to looking all the way back to the cosmic dawn," Jecmen said.

"Understanding the time near the Big Bang is so interesting. It's foundational to our knowledge. It's something that happened so long ago -- it's so fascinating that we can see sort of similar situations in galaxies that exist today."

A second study, published in the Astrophysical Journal Letters and led by Oey, used the Hubble Space Telescope to look at Mrk 71, a region in a nearby dwarf galaxy about 10 million light years away.

In Mrk 71, the team found observational evidence of Jecmen's scenario.

Using a new technique with the Hubble Space Telescope, the team employed a filter set that looks at the light of triply ionized carbon.

In more evolved galaxies with lots of supernova explosions, those explosions heat gas in a star cluster to very high temperatures -- to millions of degrees Kelvin, Oey said.

As this hot superwind expands, it blasts the rest of the gas out of the star clusters.

But in low metallicity environments such as Mrk 71, where stars aren't blowing up, energy within the region is radiated away.

It doesn't have the chance to form a superwind.

The team's filters picked up a diffuse glow of the ionized carbon throughout Mrk 71, demonstrating that the energy is radiating away.

Therefore, there is no hot superwind, instead allowing dense gas to remain throughout the environment.

Read more at Science Daily

Babies as young as four months show signs of self-awareness

Babies as young as four months old can make sense of how their bodies interact with the space around them, according to new research from the University of Birmingham.

The findings, published today (21 November 2023) in Scientific Reports, shed new light on how self-awareness develops.

Experts from the Birmingham BabyLab showed babies a ball on a screen moving towards or away from them.

When the ball was closest to them on the screen, the babies were presented with a 'touch' (a small vibration) on their hands, whilst their brain activity was being measured.

The data collection for the study was conducted at Goldsmiths (University of London).

The researchers found that from just four months old, babies show enhanced somatosensory (tactile) brain activity when a touch is preceded by an object moving towards them.

Dr Giulia Orioli, Research Fellow in Psychology at the University of Birmingham, who led the study said: "Our findings indicate that even in the first few months of life, before babies have even learned to reach for objects, the multisensory brain is wired up to make links between what babies see and what they feel. This means they can sense the space around them and understand how their bodies interact with that space. This is sometimes referred to as peripersonal space.

"Of course, humans do this all the time as adults, using our combined senses to perceive where we are in space and making predictions about when we will touch an object or not. But now that we know that babies in the early stages of their development begin to show signs of this, it opens up questions about how much of these abilities are learnt, or innate."

The researchers also explored how an unexpected 'touch' would affect some of the older babies in the study.

They found that in babies aged eight months, when the touch on their hand was preceded by the ball on the screen moving away from them, the babies' brain activity showed signs that they were surprised.

Andrew Bremner, Professor of Developmental Psychology, commented: "Seeing the older babies show surprise responses suggests that they had not expected the touch due to the visual direction the object was moving in. This indicates that as babies proceed through their first year of life, their brains construct a more sophisticated awareness of how their body exists in the space around them."

Next, the researchers are hoping to follow up this study with younger and older participants.

Research with adults can illuminate the kinds of brain activity which infants are developing towards.

They are also hoping to be able to see if there are early signs of these "multisensory" abilities in newborn babies.

Read more at Science Daily

High temperatures may have caused over 70,000 excess deaths in Europe in 2022

The burden of heat-related mortality during the summer of 2022 in Europe may have exceeded 70,000 deaths according to a study led by the Barcelona Institute for Global Health (ISGlobal), a research centre supported by the "la Caixa" Foundation. The authors of the study, published in The Lancet Regional Health -- Europe, revised upwards initial estimates of the mortality associated with record temperatures in 2022 on the European continent.

In an earlier study, published in Nature Medicine, the same team used epidemiological models applied to weekly temperature and mortality data in 823 regions in 35 European countries and estimated the number of heat-related premature deaths in 2022 to be 62,862.

In that study, the authors acknowledged that the use of weekly data would be expected to underestimate heat-related mortality, and pointed out that daily time-series data are required to accurately estimate the impact of high temperatures on mortality.

The objective of the new study was to develop a theoretical framework capable of quantifying the errors arising from the use of aggregated data, such as weekly and monthly temperature and mortality time-series.

Models based on temporally aggregated data are useful because aggregated data are available in real-time from institutions such as Eurostat, facilitating quantification of the health hazard within a few days of its emergence.

To develop a theoretical framework, the research team aggregated daily temperatures and mortality records from 147 regions in 16 European countries.

They then analysed and compared the estimates of heat- and cold-related mortality by different levels of aggregation: daily, weekly, 2-weekly and monthly.

Analysis revealed differences in epidemiological estimates according to the time scale of aggregation.

In particular, it was found that weekly, 2-weekly and monthly models underestimated the effects of heat and cold as compared to the daily model, and that the degree of underestimation increased with the length of the aggregation period.
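One intuition for why aggregation understates temperature effects can be shown with a toy calculation. The threshold, slope, and simulated temperatures below are invented and this is not the study's epidemiological model; the point is only that averaging a week of temperatures smooths away the individual hot days that drive excess deaths.

```python
import random

random.seed(0)

THRESHOLD = 25.0  # deaths rise above this daily mean temperature (hypothetical)
SLOPE = 3.0       # excess deaths per degree above the threshold (hypothetical)

def heat_deaths(temp):
    """A toy heat exposure-response: linear above a threshold, zero below."""
    return SLOPE * max(0.0, temp - THRESHOLD)

# One simulated summer: 13 weeks of daily mean temperatures around 23 C.
days = [23.0 + random.gauss(0, 3) for _ in range(13 * 7)]

# Daily model: apply the exposure-response day by day.
daily_total = sum(heat_deaths(t) for t in days)

# Weekly model: apply it to each weekly mean, scaled back up to 7 days.
weeks = [days[i:i + 7] for i in range(0, len(days), 7)]
weekly_total = sum(7 * heat_deaths(sum(w) / 7) for w in weeks)

print(daily_total, weekly_total)  # the weekly model recovers less of the burden
```

Because the exposure-response curve is convex, the weekly figure can never exceed the daily one, and the gap grows as the aggregation window lengthens, matching the pattern the researchers report.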

Specifically, for the period 1998-2004, the daily model estimated an annual cold- and heat-related mortality of 290,104 and 39,434 premature deaths, respectively, while the weekly model underestimated these numbers by 8.56% and 21.56%, respectively.

"It is important to note that the differences were very small during periods of extreme cold and heat, such as the summer of 2003, when the underestimation by the weekly data model was only 4.62%," explains Joan Ballester Claramunt, the ISGlobalresearcherwho leads the European Research Council's EARLY-ADAPT project.

The team used this theoretical framework to revise the mortality burden attributed to the record temperatures experienced in 2022 in their earlier study.

According to the calculations made using the new methodological approach, that study underestimated the heat-related mortality by 10.28%, which would mean that the actual heat-related mortality burden in 2022, estimated using the daily data model, was 70,066 deaths, and not 62,862 deaths as originally estimated.
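The revision is internally consistent: taking the daily-data figure as the reference, the weekly-data estimate falls short by just over 10%. A quick check of the arithmetic:

```python
weekly_estimate = 62_862  # heat-related deaths in 2022, weekly-data model
daily_estimate = 70_066   # revised figure from the daily-data model

# Underestimation expressed relative to the daily (reference) estimate.
underestimation = (daily_estimate - weekly_estimate) / daily_estimate
print(f"{underestimation:.2%}")  # 10.28%
```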

Using weekly data to analyse the effects of temperatures in the short term

"In general, we do not find models based on monthly aggregated data useful for estimating the short-term effects of ambient temperatures," explains Ballester.

"However, models based on weekly data do offer sufficient precision in mortality estimates to be useful in real-time practice in epidemiological surveillance and to inform public policies such as, for example, the activation of emergency plans for reducing the impact of heat waves and cold spells."

Read more at Science Daily

Our brains are not able to 'rewire' themselves, despite what most scientists believe, new study argues

Contrary to the commonly-held view, the brain does not have the ability to rewire itself to compensate for the loss of sight, an amputation or stroke, for example, say scientists from the University of Cambridge and Johns Hopkins University.

Writing in eLife, Professors Tamar Makin (Cambridge) and John Krakauer (Johns Hopkins) argue that the notion that the brain, in response to injury or deficit, can reorganise itself and repurpose particular regions for new functions, is fundamentally flawed -- despite being commonly cited in scientific textbooks. Instead, they argue that what is occurring is merely the brain being trained to utilise already existing, but latent, abilities.

One of the most common examples given is where a person loses their sight -- or is born blind -- and the visual cortex, previously specialised in processing vision, is rewired to process sounds, allowing the individual to use a form of 'echolocation' to navigate a cluttered room. Another common example is of people who have had a stroke and are initially unable to move their limbs repurposing other areas of the brain to allow them to regain control.

Krakauer, Director of the Center for the Study of Motor Learning and Brain Repair at Johns Hopkins University, said: "The idea that our brain has an amazing ability to rewire and reorganise itself is an appealing one. It gives us hope and fascination, especially when we hear extraordinary stories of blind individuals developing almost superhuman echolocation abilities, for example, or stroke survivors miraculously regaining motor abilities they thought they'd lost.

"This idea goes beyond simple adaptation, or plasticity -- it implies a wholesale repurposing of brain regions. But while these stories may well be true, the explanation of what is happening is, in fact, wrong."

In their article, Makin and Krakauer look at ten seminal studies that purport to show the brain's ability to reorganise. They argue, however, that while the studies do indeed show the brain's ability to adapt to change, it is not creating new functions in previously unrelated areas -- instead it is utilising latent capacities that have been present since birth.

For example, one of the studies -- research carried out in the 1980s by Professor Michael Merzenich at University of California, San Francisco -- looked at what happens when a hand loses a finger. The hand has a particular representation in the brain, with each finger appearing to map onto a specific brain region. Remove the forefinger, and the area of the brain previously allocated to this finger is reallocated to processing signals from neighbouring fingers, argued Merzenich -- in other words, the brain has rewired itself in response to changes in sensory input.

Not so, says Makin, whose own research provides an alternative explanation.

In a study published in 2022, Makin used a nerve blocker to temporarily mimic the effect of amputation of the forefinger in her subjects. She showed that even before amputation, signals from neighbouring fingers mapped onto the brain region 'responsible' for the forefinger -- in other words, while this brain region may have been primarily responsible for processing signals from the forefinger, it was not exclusively so. All that happens following amputation is that existing signals from the other fingers are 'dialled up' in this brain region.

Makin, from the Medical Research Council (MRC) Cognition and Brain Sciences Unit at the University of Cambridge, said: "The brain's ability to adapt to injury isn't about commandeering new brain regions for entirely different purposes. These regions don't start processing entirely new types of information. Information about the other fingers was available in the examined brain area even before the amputation, it's just that in the original studies, the researchers didn't pay much notice to it because it was weaker than for the finger about to be amputated."

Another compelling counterexample to the reorganisation argument is seen in a study of congenitally deaf cats, whose auditory cortex -- the area of the brain that processes sound -- appears to be repurposed to process vision. But when they are fitted with a cochlear implant, this brain region immediately begins processing sound once again, suggesting that the brain had not, in fact, rewired.

Examining other studies, Makin and Krakauer found no compelling evidence that the visual cortex of individuals who were born blind, or the uninjured cortex of stroke survivors, ever developed a novel functional ability that did not otherwise exist.

Makin and Krakauer do not dismiss the stories of blind people being able to navigate purely based on hearing, or of stroke survivors regaining their motor functions. They argue instead that rather than completely repurposing regions for new tasks, the brain is enhancing or modifying its pre-existing architecture -- and it is doing this through repetition and learning.

Understanding the true nature and limits of brain plasticity is crucial, both for setting realistic expectations for patients and for guiding clinical practitioners in their rehabilitative approaches, they argue.

Makin added: "This learning process is a testament to the brain's remarkable -- but constrained -- capacity for plasticity. There are no shortcuts or fast tracks in this journey. The idea of quickly unlocking hidden brain potentials or tapping into vast unused reserves is more wishful thinking than reality. It's a slow, incremental journey, demanding persistent effort and practice. Recognising this helps us appreciate the hard work behind every story of recovery and adapt our strategies accordingly."

Read more at Science Daily

Nov 21, 2023

'Teenage galaxies' are unusually hot, glowing with unexpected elements

Similar to human teenagers, teenage galaxies are awkward, experience growth spurts and enjoy heavy metal -- nickel, that is.

A Northwestern University-led team of astrophysicists has just analyzed the first results from the CECILIA (Chemical Evolution Constrained using Ionized Lines in Interstellar Aurorae) Survey, a program that uses NASA's James Webb Space Telescope (JWST) to study the chemistry of distant galaxies.

According to the early results, so-called "teenage galaxies" -- which formed two-to-three billion years after the Big Bang -- are unusually hot and contain unexpected elements, like nickel, which are notoriously difficult to observe.

The research will be published on Monday (Nov. 20) in The Astrophysical Journal Letters. It marks the first in a series of forthcoming studies from the CECILIA Survey.

"We're trying to understand how galaxies grew and changed over the 14 billion years of cosmic history," said Northwestern's Allison Strom, who led the study. "Using the JWST, our program targets teenage galaxies when they were going through a messy time of growth spurts and change. Teenagers often have experiences that determine their trajectories into adulthood. For galaxies, it's the same."

One of the principal investigators of the CECILIA Survey, Strom is an assistant professor of physics and astronomy at Northwestern's Weinberg College of Arts and Sciences and a member of Northwestern's Center for Interdisciplinary Exploration and Research in Astrophysics (CIERA). Strom co-leads the CECILIA Survey with Gwen Rudie, a staff scientist at Carnegie Observatories.

'Chemical DNA' gives insight into galaxy formation

Named after Cecilia Payne-Gaposchkin, one of the first women to earn a Ph.D. in astronomy, the CECILIA Survey observes spectra (or the amount of light across different wavelengths) from distant galaxies. Strom likens a galaxy's spectra to its "chemical DNA." By examining this DNA during a galaxy's "teenage" years, researchers can better understand how it grew and how it will evolve into a more mature galaxy.

For example, astrophysicists still don't understand why some galaxies appear "red and dead" while others, like our Milky Way, are still forming stars. A galaxy's spectrum can reveal its key elements, such as oxygen and sulfur, which provide a window into what a galaxy was previously doing and what it might do in the future.

"These teenage years are really important because that's when the most growth happens," Strom said. "By studying this, we can begin exploring the physics that caused the Milky Way to look like the Milky Way -- and why it might look different from its neighboring galaxies."

In the new study, Strom and her collaborators used the JWST to observe 33 distant teenaged galaxies for a continuous 30 hours this past summer. Then, they combined spectra from 23 of those galaxies to construct a composite picture.

"This washes out the details of individual galaxies but gives us a better sense of an average galaxy. It also allows us to see fainter features," Strom said. "It's significantly deeper and more detailed than any spectrum we could collect with ground-based telescopes of galaxies from this time period in the universe's history."

Spectra surprises

The ultra-deep spectrum revealed eight distinct elements: Hydrogen, helium, nitrogen, oxygen, silicon, sulfur, argon and nickel. All elements that are heavier than hydrogen and helium form inside stars. So, the presence of certain elements provides information about star formation throughout a galaxy's evolution.

While Strom expected to see lighter elements, she was particularly surprised by the presence of nickel. Heavier than iron, nickel is rare and incredibly difficult to observe.

"Never in my wildest dreams did I imagine we would see nickel," Strom said. "Even in nearby galaxies, people don't observe this. There has to be enough of an element present in a galaxy and the right conditions to observe it. No one ever talks about observing nickel. Elements have to be glowing in gas in order for us to see them. So, in order for us to see nickel, there may be something unique about the stars within the galaxies."

Another surprise: The teenage galaxies were extremely hot. By examining the spectra, physicists can calculate a galaxy's temperature. While the hottest pockets within galaxies can reach over 9,700 degrees Celsius (17,492 degrees Fahrenheit), the teenage galaxies clock in at higher than 13,350 degrees Celsius (24,062 degrees Fahrenheit).
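The paired figures follow the standard Celsius-to-Fahrenheit conversion, F = C × 9/5 + 32; a quick sanity check:

```python
# Standard Celsius-to-Fahrenheit conversion, applied to the two
# galaxy temperatures quoted above.

def c_to_f(celsius: float) -> float:
    return celsius * 9 / 5 + 32

print(c_to_f(9_700))   # 17492.0 -- hottest pockets in nearby galaxies
print(c_to_f(13_350))  # 24062.0 -- the teenage galaxies
```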

"This is just additional evidence of how different galaxies likely were when they were younger," Strom said. "Ultimately, the fact that we see a higher characteristic temperature is just another manifestation of their different chemical DNA because the temperature and chemistry of gas in galaxies are intrinsically linked."

Read more at Science Daily

Coastal river deltas threatened by more than climate change

Worldwide, coastal river deltas are home to more than half a billion people, supporting fisheries, agriculture, cities, and fertile ecosystems. In a unique study covering 49 deltas globally, researchers from Lund University and Utrecht University have identified the most critical risks to deltas in the future. The research shows that deltas face multiple risks, and that population growth and poor environmental governance might pose bigger threats than climate change to the sustainability of Asian and African deltas, in particular.

"We can clearly show that many risks are not linked to climate. While climate change is a global problem, other important risk factors like land subsidence, population density and ineffective governance are local problems. Risks to deltas will only increase over time, so now is the time for governments to take action," says Murray Scown, associate senior lecturer, Lund University Centre for Sustainability Studies, and lead author.

Collapse of delta environments could have huge consequences for global sustainable development.

In the worst-case scenario, deltas could be lost to the sea; other consequences include flooding, salinization of water (which affects agriculture), coastal squeeze, and loss of ecosystems.

The study, published in Global Environmental Change, looked at five different IPCC scenarios for global development in 49 deltas all over the world, including famous deltas such as the Nile, Mekong, and Mississippi, but also more understudied deltas such as the Volta, Zambezi and Irrawaddy deltas.

The research identifies possible risks to deltas stretching 80 years into the future.

The researchers based their analysis on 13 well-known factors affecting risk in deltas and drew upon unique models to identify which of these risks are most likely to endanger different deltas in the future.

Risk factors include increasing population density, urban development, irrigated agriculture, changes to river discharge, land subsidence and relative sea-level rise, limited economic capacity, poor government effectiveness, and low adaptation readiness.

Population density, land subsidence and ineffective governance are high risk factors

The analysis shows that there are some risks that are more critical to deltas than others -- in all of the five future scenarios.

These include land subsidence and relative sea-level rise, population density, ineffective governance, economic capacity, and crop land use.

For some deltas, physical risks are especially pronounced. Land subsidence is, for example, the highest risk factor for the Mekong delta in Vietnam.

Extreme sea levels are among the most concerning risk factors for deltas in China, on the Korean peninsula, and in the Colorado (Mexico) and Rhine (Netherlands) deltas.

In the Nile (Egypt), Niger (Nigeria), and the Ganges (Bangladesh) deltas, it is increasing population density that is of most concern under certain scenarios.

For other deltas, it is the lack of economic capacity and government effectiveness to manage risks, for example in the Irrawaddy (Myanmar) and Congo (Angola and Democratic Republic of the Congo) deltas.

"Analysed all together, we can see that the Asian mega-deltas are at greatest risk, with potentially devastating consequences for millions of people, and for the environment. They are under pressure from population growth, intense agricultural land use, relative sea-level rise, and lagging adaptation readiness," says Murray Scown.

Local and global approaches and a mixture of hard and soft adaptation can mitigate risks

"Instead of sitting back, governments need to think long-term, and put plans in place to reduce or mitigate risks. In the Mekong delta, for example, the Vietnamese government are making strong efforts to restrict future groundwater extraction in the delta to reduce land subsidence and salinization," says Philip Minderhoud, assistant professor at Wageningen University and Research.

The researchers highlight that a mixture of hard ("grey") and soft ("green") adaptation approaches will be required to manage and mitigate delta risks.

They include both hard infrastructures, like sea walls to stop the sea inundating the delta, and soft approaches using nature-based solutions.

One example is the Dutch experience of creating room for the river in the Rhine delta, by lowering floodplains, relocating levees, and using spaces that are allowed to flood for grazing.

Initiatives to build up delta surfaces by allowing rivers to flood and deposit sediment on the delta to maintain elevation above sea level are also promising, notes Frances Dunn, assistant professor at Utrecht University.

Read more at Science Daily

Researchers develop comprehensive genetic map for bison, discover gene responsible for albinism

A research team led by scientists from the Texas A&M School of Veterinary Medicine & Biomedical Sciences (VMBS) has developed the most comprehensive genome yet for the North American bison, bringing the animal's genetic roadmap up to date with the latest genome sequencing technology. In doing so, the research team also discovered the gene responsible for albinism in bison.

The study -- recently published in G3: Genes, Genomes, and Genetics -- details the development of this high-resolution reference genome, which the researchers then used to produce the first test for genetic mutations, starting with the mutation responsible for albinism.

Albinism -- a rare condition characterized by a lack of pigment that makes an animal appear white with red eyes -- has historical significance: albino bison have been recognized as a religious symbol by some Native American tribes.

The study also lays the framework for determining other genetic variations that impact important bison traits, such as those that contribute to the health and production value of this species.

New Genome, New Possibilities

Dr. James Derr, a VMBS professor of veterinary pathobiology and genetics who led the research team that created the first bison genome back in 2015, assembled the team that developed this new reference genome. This team includes assistant professor of genetics Dr. Brian Davis, graduate student Sam Stroupe, and representatives from Texas Parks and Wildlife and the National Park Service.

"Because reference genomes can help researchers identify and characterize genes that are responsible for a large number of traits, this technology is used to do all kinds of things, including diagnosing health conditions and developing targeted treatments," Davis said.

The newest bison reference genome was developed using technology that allows researchers to create genomes based on DNA from hybrids, which are animals with DNA from two different species. In this case, the researchers used DNA from a type of bison-cow hybrid called an F1 -- an individual with a perfect 50-50 split between its parents' DNA.

F1 hybrids between bison and cattle are rare but have occurred historically; indeed, we now know that most bison herds in North America contain descendants of hybrids between bison and cattle -- a discovery that Derr and his research partners made last year.

"One day we got a call from Texas Parks and Wildlife saying they knew someone who had an F1 hybrid," Derr said. "It was the first fully documented, first-generation F1 hybrid I have seen in 25 years of working with bison. That's why we were able to do this."

To create the new bison genome, the researchers first sequenced the genome of the F1 hybrid as well as the bison mom and the domestic cattle father. With this information, they were able to separate bison DNA from the cattle DNA regions in the hybrid.
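The separation step described here resembles the "trio binning" idea used in hybrid genome assembly: fragments from the F1 are assigned to a parental bin using short subsequences (k-mers) that occur in only one parent's genome. The toy sketch below uses hypothetical sequences and is not the study's actual pipeline:

```python
# Toy trio-binning sketch: classify a sequence fragment from the F1
# hybrid by counting k-mers unique to each parental genome.
# All sequences here are hypothetical, chosen only for illustration.

def kmers(seq, k=4):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

bison_genome = "ACGTACGTTTGACCA"   # hypothetical maternal (bison) sequence
cattle_genome = "ACGTACGTCCGATTA"  # hypothetical paternal (cattle) sequence

bison_only = kmers(bison_genome) - kmers(cattle_genome)
cattle_only = kmers(cattle_genome) - kmers(bison_genome)

def assign(read):
    b = len(kmers(read) & bison_only)
    c = len(kmers(read) & cattle_only)
    return "bison" if b > c else "cattle" if c > b else "ambiguous"

print(assign("CGTTTGAC"))  # bison
print(assign("GTCCGATT"))  # cattle
```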

Since the cattle genome is already very advanced, it provided a reference for creating the new bison genome, helping to guide researchers in developing the complete high-resolution reference bison genome.

To prove the utility of the new genome, the team set out to discover which gene mutation was responsible for albinism in bison and to create a genetic test that could be used to identify carriers of that mutation.

The discovery is the first time anyone has successfully determined the gene mutation responsible for an observable trait in bison.

"We knew albinism was an inherited recessive trait, but we didn't know which gene was responsible," Stroupe said. "So, we sequenced the DNA from a few albino bison and compared them to those of normal coloration to find the mutation that causes albinism. As it turns out, the mutation causes an important enzyme to cease functioning correctly, which leads to the lack of skin pigmentation."
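Because albinism is recessive, it only appears when an animal inherits the mutant allele from both parents -- which is why a test that identifies carriers is useful even among normally coloured bison. A toy Mendelian illustration (the allele labels are hypothetical, not the actual gene):

```python
# Punnett-square enumeration for a cross between two carriers of a
# recessive allele 'a' (both parents 'Aa': normal coloration).

from itertools import product

father, mother = "Aa", "Aa"
offspring = ["".join(sorted(pair)) for pair in product(father, mother)]

print(offspring)                  # ['AA', 'Aa', 'Aa', 'aa']
print(offspring.count("aa") / 4)  # 0.25 -- one in four offspring albino
```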

The Uniqueness Of Albino Bison

Many North American Indigenous peoples regard white bison as sacred entities with prophetic spiritual associations. While not all white bison have albinism, the birth of one is cause for celebration in some communities.

Despite this cultural significance, Derr isn't suggesting that people try to produce albino bison using genetic testing.

"Sadly, albino bison are often not very healthy," Derr said. "They tend to develop skin cancers, and they can develop other health problems as they age."

Albino bison are also different from white or tan bison that result from crossing bison with white cattle, particularly Charolais. These bison lack the red eyes and pink nose of true albinos.

Now that a more accurate bison genome exists, scientists can learn more about the genetic makeup of North America's bison population.

"The development of this new reference genome and the identification of a causative genetic mutation is exciting news for bison," Derr said. "It opens the doors for new discoveries and insights into bison genetics."

Read more at Science Daily

Microbiome development: Bacteria lay the foundations for their descendants

The microbiome (the symbiotic community of microbial organisms of a host) is of existential importance for the functioning of every plant and animal, including human beings. A research team from Düsseldorf and Kiel headed by Heinrich Heine University Düsseldorf (HHU) has now used the example of the sea anemone Nematostella vectensis to investigate how the microbiome develops together with the host. In the scientific journal Microbiome, the researchers describe that the bacterial community is primarily controlled by the host organism during the early stages of life, while bacteria-bacteria interactions play the lead role in subsequent development.

Every multicellular living creature -- from the simplest organisms to human beings -- lives in a community with a multitude of microorganisms, the so-called microbiome.

This microbiome comprises bacteria, fungi and viruses among other things and assumes various roles ranging from metabolism to immune defence.

For example, without the microbiome in the human intestine, many nutrients could not be absorbed from food and made available to the human body.

But how does the microbiome develop as the host develops? It is known that the composition and ratio of the microorganisms in the sea anemone Nematostella vectensis differ fundamentally between the different stages in its life cycle and only assume a stable form in the adult anemone.

But who and which factors decide how the microbiome changes as the host matures -- does the host control colonisation with the right microbes or do the microbes regulate themselves?

A team from HHU, Kiel University (CAU) and the GEOMAR Helmholtz Centre for Ocean Research Kiel addressed this question.

The study was headed by Professor Dr Sebastian Fraune from the Institute of Zoology and Organismic Interactions at HHU.

The research was conducted within the framework of the Collaborative Research Centre (CRC) 1182 "Origin and Function of Metaorganisms," which is headed by CAU.

Dr Hanna Domin, lead author of a study that has now been published in Microbiome: "We took adult Nematostella polyps, which had no microbiome following intensive antibiotic treatment, and then recolonised them in a targeted way. To do this, we used bacterial communities that corresponded to those of firstly a Nematostella larva, secondly a juvenile animal and thirdly an adult polyp."

In all three cases, the researchers examined how the microbiome developed over the course of time.

They discovered that only the initial colonisers -- i.e. the bacteria forming the microbiome of the youngest animals -- became really well-established in the adult polyps.

By contrast, it was difficult for the bacteria from older animals to become established.

Professor Fraune, corresponding author of the study: "Following recolonisation, the microbiome then undergoes a development process that is very similar to the normal development of host and microbiome. It takes around four weeks to reach the same status as adult animals that have undergone a normal growth process."

The researchers conclude from this that the host -- presumably through its innate immune system -- controls the composition of the initial colonisation.

Domin: "However, the host no longer has a significant influence over the further development of the microbiome after this point. The bacteria control this themselves and lay suitable foundations for their descendants."

One important aspect of the project, which was driven forward by the research group headed by Professor Dr Christoph Kaleta in Kiel, was the examination of so-called metabolic networks.

This involved investigating how the different bacteria are linked via their metabolism and influence each other.

"We were able to identify metabolic pathways that are specific to the initial colonisers, as well as pathways that only play a role at a later stage," says Dr Johannes Zimmermann from CAU.

The research team established that the degradation of the polysaccharide chitin plays a central role for the initial colonisers in particular.

It was only recently discovered that Nematostella can produce chitin.

Why the animals do this was unknown, however, as -- in contrast with insects, for example -- they do not need chitin for their structural development.

Fraune: "Our results provide clear indications that chitin plays a role for the microbiome."

The sea anemone only has an innate immune system. Nevertheless, the results are also relevant for medical research.

Newborn babies come into contact with numerous bacteria immediately after birth, at a phase in their lives when they too have only an innate immune system.

Consequently, initial colonisation with the right microbes is also key to establishing a functioning microbiome and training the adaptive immune system in humans.

Read more at Science Daily

Nov 20, 2023

Multiple evolutionary trajectories in aquatic crocodiles

In the geological past, several groups of crocodiles evolved towards a morphology adapted to marine life. However, the extent of these adaptations and their evolutionary trajectories remained unknown. An exhaustive study of their morphology by a scientific team from the Evolution & Diversity Dynamics Lab (EDDyLab) at the University of Liège has now shed light on the evolutionary mechanisms at work, thanks to three-dimensional reconstructions.

Contrary to what its few current species might suggest, the crocodile group was highly diversified in the past, with herbivorous, arboreal, and even totally marine species. Thalattosuchians and dyrosaurs, two groups of crocodiles, colonised the marine environment independently in the geological past. "These two groups of crocodiles are also very interesting to study because they managed to survive major biological crises," explains Isaure Scavezzoni, a doctoral student at the Evolution & Diversity Dynamics Lab and principal author of the study. "Thalattosuchians survived the Jurassic-Cretaceous transition (145 million years ago) and dyrosaurs the mass extinction at the end of the Cretaceous (66 million years ago)." However, the extent and diversity of these animals' adaptations to marine life are still very poorly understood because their body anatomy has been relatively little studied. We do not know the evolutionary trajectories underlying these evolutionary successes. Are they similar, or did these groups take different routes to marine life? Researchers at the University of Liège's EDDyLab have attempted to answer this question using 3D modelling.

"The scale of the task involved in answering these questions is immense," explains Valentin Fischer, palaeontologist and director of the EDDyLab. "We have carried out hundreds of scans and high-definition 3D reconstructions of the limb, shoulder and pelvis bones of a wide range of species of thalattosuchians, dyrosaurs and even modern crocodiles." These data enabled the team to analyse the evolutionary trajectories of these two groups in order to detect possible convergences, i.e. cases of independent evolution of similar morphologies. To do this, several dozen reference points were placed on each bone; the resulting 3D coordinates were then compared between species and tested in a phylogenetic framework, i.e. taking into account the kinship links between the species analysed.
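The comparison of reference-point coordinates described above is the core of geometric morphometrics: landmark configurations are compared after removing differences in position and scale (full Procrustes analysis also removes rotation, omitted here for brevity). A toy sketch with hypothetical coordinates, not the study's data:

```python
# Compare two 3D landmark configurations after removing position and
# scale, so that only shape differences remain (rotation alignment
# omitted for brevity). Coordinates below are hypothetical.

def normalize(points):
    n = len(points)
    means = [sum(p[i] for p in points) / n for i in range(3)]
    centered = [[p[i] - means[i] for i in range(3)] for p in points]
    scale = sum(c * c for row in centered for c in row) ** 0.5
    return [[c / scale for c in row] for row in centered]

def shape_distance(a, b):
    return sum((x - y) ** 2
               for ra, rb in zip(normalize(a), normalize(b))
               for x, y in zip(ra, rb))

bone_a = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 1)]
bone_b = [(0, 0, 0), (2, 0, 0), (2, 2, 0), (0, 2, 2)]  # same shape, doubled

print(shape_distance(bone_a, bone_b))  # 0.0 -- identical shape
```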

Read more at Science Daily

Naturally regrowing forests are helping to protect the remaining old forests in the Amazon

The climate crisis and UN Decade on Ecosystem Restoration have generated great interest in the value of secondary forests. These are forests that have regrown naturally on land abandoned from agriculture.

Collaborative research between Lancaster University, Bangor University and the University of British Columbia has produced new evidence of just how important they are in counteracting the effects of forest fragmentation across the Amazon basin. This has just been published in a paper in the journal Environmental Research Letters. It is an output of the PhD research of Charlotte Smith, in the Envision Doctoral Training Partnership, in which Bangor University is a partner.

Co-author of the paper, John Healey, Professor of Forest Sciences at Bangor University, described how it shows "that secondary forests cover just 190,000 km2 of the Amazon but connect more than 2 million isolated fragments of old-growth forest, prominent amongst the world's most important habitats for biodiversity conservation. The secondary forests are helping maintain connectivity for patches of old-growth forest that are too small to support long-term viable populations of rare species."

Charlotte Smith reported: "Secondary forests are buffering as much as 41% of old-growth forest edges, potentially shielding them from negative edge effects such as hotter temperatures and wind. Proximity to old-growth forests can also help the rate of biodiversity and biomass recovery in secondary forests. It is positive that 94% of secondary forests were connected to old-growth forest. However, many old-growth forest remnants are small and degraded patches, so only 57% of secondary forest was connected to an area of extensive, structurally-intact old-growth."

Professor Healey pointed out the importance of this research, "It provides powerful new evidence of the importance of managing forests at the landscape scale. Promoting forest restoration through secondary forests located next to old-growth forest remnants can play a vital role in both conserving biodiversity in these remnants and the rate of biodiversity recovery in the secondary forests themselves."

Read more at Science Daily

Cheap medicines prevented migraine as well as expensive ones

Migraine is more than just a headache. Often the pain is accompanied by nausea, vomiting, light sensitivity, and sound sensitivity. Chronic migraine can be disabling and may prevent many, especially women, from contributing to working life.

Still, it often takes a long time for migraine patients to find a treatment that works well for them. Researchers at the Norwegian Center for Headache Research (NorHead) have used data from the Norwegian Prescription Register to look at which medicines best prevent migraine in people in Norway:

"Not much research has been done on this subject before. This may weaken the quality of the treatment and increase the cost of treatment for this patient group," says the leader of the study, Professor Marte-Helen Bjørk at the Department of Clinical Medicine, University of Bergen.

Three medicines had better effect than the first choice of medicines

The researchers used national register data from 2010 to 2020 to estimate treatment effect. They measured this by looking at the consumption of acute migraine medicines before and after starting preventive treatment, and investigated how long the people with migraine used the different preventive treatments. A total of over one hundred thousand migraine patients were in the study.

"When the consumption of acute migraine medicines changed little after starting preventive medicines, or people quickly stopped taking the preventive medicines, the preventive medicine was interpreted as having little effect. If the preventive medicine was used over long, uninterrupted periods, and we saw a decrease in the consumption of acute medicines, we interpreted the preventive medicine as having good effect," Bjørk explains.
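The interpretation rule Bjørk describes can be caricatured as a simple classifier. This is a deliberately simplified, hypothetical sketch: the study itself used full national register data and statistical modelling, and the 12-month threshold below is illustrative, not taken from the paper:

```python
# Hypothetical sketch of the decision rule described above: a preventive
# medicine counts as effective if acute-medicine consumption fell and
# the treatment was continued over a long, uninterrupted period.

def preventive_effect(acute_use_before: float, acute_use_after: float,
                      months_on_treatment: float) -> str:
    consumption_fell = acute_use_after < acute_use_before
    long_uninterrupted = months_on_treatment >= 12  # illustrative cutoff
    if consumption_fell and long_uninterrupted:
        return "good effect"
    return "little effect"

print(preventive_effect(10, 4, 18))  # good effect
print(preventive_effect(10, 9, 2))   # little effect (stopped quickly)
```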

As a rule, so-called beta blockers are used as the first choice to prevent migraine attacks, but the researchers found that three medicines in particular had a better preventive effect than these: CGRP inhibitors, amitriptyline and simvastatin.

"The latter two medicines are also established medicines used for depression, chronic pain and high cholesterol, respectively, while CGRP inhibitors are developed and used specifically for chronic migraine," says the professor.

Can have great significance for the cost of health care

CGRP inhibitors are more expensive than the other medicines. In 2021 their reimbursement amounted to 500 million NOK (not including discounts given by pharma companies).

"Our analysis shows that some established and cheaper medicines can have a treatment effect similar to the more expensive ones. This may be of great significance both for the patient group and for Norwegian health care," says Bjørk.

The researchers at NorHead have already started work on a large clinical study to measure the effect of established cholesterol-lowering medicines as a preventive measure against chronic and episodic migraine.

Read more at Science Daily