Jun 9, 2023

Elusive planets play 'hide and seek' with CHEOPS

With the help of the CHEOPS space telescope, an international team of European astronomers has clearly identified four new exoplanets. The four mini-Neptunes are smaller, cooler, and more difficult to find than the so-called hot Jupiter exoplanets that have been found in abundance. Two of the four resulting papers are led by researchers from the University of Bern and the University of Geneva who are also members of the National Centre of Competence in Research (NCCR) PlanetS.

CHEOPS is a joint mission by the European Space Agency (ESA) and Switzerland, under the leadership of the University of Bern in collaboration with the University of Geneva. Since its launch in December 2019, the extremely precise measurements of CHEOPS have contributed to several key discoveries in the field of exoplanets.

NCCR PlanetS members Dr. Solène Ulmer-Moll of the Universities of Bern and Geneva, and Dr. Hugh Osborn of the University of Bern, exploited the unique synergy of CHEOPS and the NASA satellite TESS to detect a series of elusive exoplanets. The planets, TOI 5678 b and HIP 9618 c, are the size of Neptune or slightly smaller, with radii of 4.9 and 3.4 Earth radii respectively. The respective papers have just been published in the journals Astronomy & Astrophysics and Monthly Notices of the Royal Astronomical Society. Publishing in the same journals, two other members of the international team, Amy Tuson from the University of Cambridge (UK) and Dr. Zoltán Garai from the ELTE Gothard Astrophysical Observatory (Hungary), used the same technique to identify two similar planets in other systems.

The synergy of two satellites

The CHEOPS satellite observes the luminosity of stars in order to capture the slight dimming that occurs when, and if, an orbiting planet happens to pass in front of its star from our point of view. By searching for these dimming events, called "transits," scientists have been able to discover the majority of the thousands of exoplanets known to orbit stars other than our Sun.

"NASA's TESS satellite excels at detecting the transits of exoplanets, even for the most challenging small planets. However, it changes its field of view every 27 days in order to rapidly scan most of the sky, which prevents it from finding planets with longer orbital periods," explains Hugh Osborn. Still, TESS was able to observe single transits around the stars TOI 5678 and HIP 9618. When it returned to the same field of view two years later, it observed similar transits around the same stars. Despite these observations, it was not possible to conclude unequivocally that planets were present around those stars, because the information was incomplete.

"This is where CHEOPS comes into play: focusing on a single star at a time, CHEOPS is a follow-up mission which is perfect for continuing to observe these stars and find the missing bits of information," adds Solène Ulmer-Moll.

A lengthy game of "hide and seek"

Suspecting the presence of exoplanets, the CHEOPS team designed a method to avoid blindly spending precious observing time in the hope of detecting additional transits. They adopted a targeted approach based on the few clues provided by the transits TESS had observed. From these, Osborn developed software that proposes and prioritizes candidate orbital periods for each planet. "We then play a sort of 'hide and seek' game with the planets, using the CHEOPS satellite," as Osborn puts it.

"We point CHEOPS towards a target at a given time, and depending if we observe a transit or not, we can eliminate some of the possibilities and try again at another time until there is a unique solution for the orbital period." It took five and four attempts respectively for the scientists to clearly confirm the existence of the two exoplanets and determine that TOI 5678 b has a period of 48 days, while HIP 9618 c has a period of 52.5 days.
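The elimination game described above can be sketched in a few lines. This is an illustrative toy, not the team's actual software: two single transits separated by a known gap admit only orbital periods that divide the gap a whole number of times, and each CHEOPS pointing then prunes the candidate list. All numbers and function names below are hypothetical.

```python
# Toy sketch of the period "hide and seek" (illustrative values, not mission data).
# Two single transits separated by gap_days admit only periods P with
# gap_days = n * P for some integer n.

def candidate_periods(gap_days, p_min=20.0, p_max=100.0):
    """All aliases P = gap_days / n that fall within [p_min, p_max]."""
    candidates = []
    n = 1
    while gap_days / n >= p_min:
        p = gap_days / n
        if p <= p_max:
            candidates.append(round(p, 2))
        n += 1
    return candidates

def prune(candidates, tested_period, transit_seen):
    """Keep a candidate only if it agrees with one CHEOPS pointing:
    a detected transit confirms the tested period; a non-detection rules it out."""
    return [p for p in candidates
            if (abs(p - tested_period) < 1e-6) == transit_seen]

cands = candidate_periods(720.0)   # e.g. two TESS transits ~720 days apart
cands = prune(cands, 72.0, False)  # CHEOPS saw no transit at P = 72 d: rule it out
print(len(cands), "candidate periods remain")
```

Repeating the prune step with further pointings narrows the list until a single period survives, mirroring the five and four attempts the scientists needed.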

Ideal targets for the JWST

The story does not end there for the scientists. With the newly constrained periods, they could turn to ground-based observations using another technique, called radial velocity, which enabled the team to determine masses of 20 and 7.5 Earth masses for TOI 5678 b and HIP 9618 c respectively. With both the size and mass of a planet, its density is known, and scientists can get an idea of what it is made of. "For mini-Neptunes, however, density is not enough, and there are still a few hypotheses as to the composition of the planets: they could either be rocky planets with a lot of gas, or planets rich in water and with a very steamy atmosphere," explains Ulmer-Moll. "Since the four newly discovered exoplanets are orbiting bright stars, it also makes them targets of prime interest for the James Webb Space Telescope (JWST), which might help to solve the riddle of their composition," Ulmer-Moll continues.
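With radius and mass in hand, the bulk density follows directly: relative to Earth it scales as (M/M⊕)/(R/R⊕)³. A hedged back-of-envelope calculation using the values quoted above (Earth's mean density taken as 5.51 g/cm³):

```python
# Back-of-envelope bulk density from the masses and radii quoted above.
# Relative to Earth, density scales as (M / M_Earth) / (R / R_Earth)**3.
EARTH_DENSITY = 5.51  # g/cm^3, mean density of Earth

def bulk_density(mass_earths, radius_earths):
    return EARTH_DENSITY * mass_earths / radius_earths**3

print(f"TOI 5678 b: {bulk_density(20.0, 4.9):.2f} g/cm^3")  # ~0.94
print(f"HIP 9618 c: {bulk_density(7.5, 3.4):.2f} g/cm^3")   # ~1.05
```

Both come out near 1 g/cm³, well below Earth's ~5.5 g/cm³, which is consistent with the gas-rich or water-rich compositions discussed above and illustrates why density alone cannot distinguish between them.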

Read more at Science Daily

Greenhouse gas emissions at 'an all-time high' -- and it is causing an unprecedented rate of global warming, say scientists

Human-caused global warming has continued to increase at an "unprecedented rate" since the last major assessment of the climate system published two years ago, say 50 leading scientists.

One of the researchers said the analysis was a "timely wake-up call" showing that the pace and scale of climate action have been insufficient. It comes as climate experts meet in Bonn to prepare the ground for the major COP28 climate conference in the UAE in December, which will include a stocktake of progress towards keeping global warming to 1.5°C by 2050.

Given the speed at which the global climate system is changing, the scientists argue that policymakers, climate negotiators and civil society groups need to have access to up-to-date and robust scientific evidence on which to base decisions.

The authoritative source of scientific information on the state of the climate is the UN's Intergovernmental Panel on Climate Change (IPCC) but the turnaround time for its major assessments is five or ten years, and that creates an "information gap," particularly when climate indicators are changing rapidly.

In an initiative led by the University of Leeds, the scientists have developed an open-data, open-science platform -- the Indicators of Global Climate Change project and its website (https://igcc.earth/). It will update information on key climate indicators every year.

Critical decade for climate change

The Indicators of Global Climate Change Project is being co-ordinated by Professor Piers Forster, Director of the Priestley Centre for Climate Futures at Leeds. He said: "This is the critical decade for climate change.

"Decisions made now will have an impact on how much temperatures will rise and the degree and severity of impacts we will see as a result.

"Long-term warming rates are currently at a long-term high, caused by the highest-ever levels of greenhouse gas emissions. But there is evidence that the rate of increase in greenhouse gas emissions has slowed.

"We need to be nimble footed in the face of climate change. We need to change policy and approaches in the light of the latest evidence about the state of the climate system. Time is no longer on our side. Access to up-to-date information is vitally important."

Writing in the journal Earth System Science Data, the scientists have revealed how key indicators have changed since the publication of the IPCC's Sixth Assessment Working Group 1 report in 2021, which produced the key data that fed into the subsequent IPCC Sixth Synthesis Report.

What the updated indicators show

Human-induced warming, largely caused by the burning of fossil fuels, reached an average of 1.14°C for the most recent decade (2013 to 2022) above pre-industrial levels. This is up from 1.07°C between 2010 and 2019.

Human-induced warming is now increasing at a pace of over 0.2°C per decade.
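A naive linear extrapolation from these two figures shows why the pace matters. This is an illustration of the arithmetic only, not a projection from the study:

```python
# Naive linear extrapolation from the figures above (illustration, not a forecast).
current_warming = 1.14   # degrees C, 2013-2022 decadal average
rate_per_decade = 0.2    # degrees C per decade, current pace
target = 1.5             # degrees C, Paris Agreement aspirational limit

decades_to_target = (target - current_warming) / rate_per_decade
print(f"~{decades_to_target:.1f} decades of further warming at this rate")  # ~1.8
```

At roughly 0.2°C per decade, less than two decades of further warming at the current pace would close the gap to 1.5°C, which underlines the "critical decade" framing in the quotes that follow.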

The analysis also found that greenhouse gas emissions were "at an all-time high," with human activity resulting in the equivalent of 54 (±5.3) gigatonnes (or billion metric tonnes) of carbon dioxide being released into the atmosphere on average every year over the last decade (2012-2021).

There has been a positive move away from burning coal, but this has come at a short-term cost: it has added to global warming by reducing particulate pollution in the air, which has a cooling effect.

'Indicators critical to address climate crisis'

Professor Maisa Rojas Corradi, Minister of the Environment in Chile, IPCC author and a scientist involved in this study, said: "An annual update of key indicators of global change is critical in helping the international community and countries to keep the urgency of addressing the climate crisis at the top of the agenda and for evidence-based decision-making.

"In line with the "ratchet-mechanism" of increasing ambition envisioned by the Paris Agreement we need scientific information about emissions, concentration, and temperature as often as possible to keep international climate negotiations up to date and to be able to adjust and if necessary correct national policies.

"In the case of Chile, we have a climate change law that aims at aligning government-wide policies with climate action."

Remaining carbon budget

One of the major findings of the analysis is the rate of decline in what is known as the remaining carbon budget, an estimate of how much carbon dioxide can be released into the atmosphere while retaining a 50% chance of keeping global temperature rise within 1.5°C.

In 2020, the IPCC calculated the remaining carbon budget was around 500 gigatonnes of carbon dioxide. By the start of 2023, the figure was roughly half that at around 250 gigatonnes of carbon dioxide.

The reduction in the estimated remaining carbon budget is due to a combination of continued emissions since 2020 and updated estimates of human-induced warming.
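The arithmetic behind the shrinking budget can be made explicit with the figures quoted above. Note that the ~54 Gt figure is CO2-equivalent (including other greenhouse gases) while the budget counts CO2 only, so this is a rough, indicative calculation rather than the study's own method:

```python
# Rough timescale from the figures quoted above (indicative only: the budget is
# CO2-only while the emissions figure is CO2-equivalent).
remaining_budget_gt = 250.0  # GtCO2 at the start of 2023, for a 50% chance of 1.5 C
annual_emissions_gt = 54.0   # GtCO2e per year, 2012-2021 average

years_left = remaining_budget_gt / annual_emissions_gt
print(f"Budget exhausted in roughly {years_left:.1f} years at current rates")  # ~4.6
```

This back-of-envelope figure of under five years is consistent with the "exhausted in only a few years" warning quoted below from Professor Forster.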

Professor Forster said: "Even though we are not yet at 1.5°C warming, the carbon budget will likely be exhausted in only a few years as we have a triple whammy of heating from very high CO2 emissions, heating from increases in other GHG emissions and heating from reductions in pollution.

"If we don't want to see the 1.5°C goal disappearing in our rearview mirror, the world must work much harder and urgently at bringing emissions down.

"Our aim is for this project to help the key players urgently make that important work happen with up-to-date and timely data at their fingertips."

Dr Valérie Masson-Delmotte, from the Université Paris Saclay who co-chaired Working Group 1 of the IPCC's Sixth Assessment report and was involved in the climate indicators project, said: "This robust update shows intensifying heating of our climate driven by human activities. It is a timely wake up call for the 2023 global stocktake of the Paris Agreement -- the pace and scale of climate action is not sufficient to limit the escalation of climate-related risks."

As recent IPCC reports have conclusively shown, with every further increment of global warming, the frequency and intensity of climate extremes, including hot extremes, heavy rainfall and agricultural droughts, increases.

The Indicators of Global Climate Change (https://igcc.earth/) will have annually updated information on greenhouse gas emissions, human-induced global warming and the remaining carbon budget.

Read more at Science Daily

Without fully implementing net-zero pledges, the world will miss climate goals

Without more legally binding and well-planned net-zero policies, the world is highly likely to miss key climate targets.

In the new study, led by Imperial College London and published today in Science, researchers ranked 90% of global net-zero greenhouse gas emissions pledges as providing low confidence in their full implementation.

The researchers recommend nations make their targets legally binding and back them up with long-term plans and short-term implementation policies to increase the likelihood of avoiding the worst impacts of climate change.

Lead researcher Professor Joeri Rogelj, director of research for the Grantham Institute at Imperial, said: "Climate policy is moving from setting ambitious targets to implementing them. However, our analysis shows most countries do not provide high confidence that they will deliver on their commitments. The world is still on a high-risk climate track, and we are far from delivering a safe climate future."

Assigning confidence

Climate goals set out in the Paris Agreement include keeping temperature rises well below 2°C above the average temperature before the industrial revolution, and ideally below 1.5°C. The main way to achieve this is to reach 'net-zero' greenhouse gas emissions as soon as possible, where any remaining emissions are effectively offset.

Most countries have set net-zero goals and Nationally Determined Contributions (NDCs) -- non-binding national plans proposing climate actions. Taking these plans at face value, and assuming they will all be fully implemented, gives the world a chance of keeping warming to 1.5-2°C. But taking current policies only, with no implementation of net-zero pledges, models predict temperature rises of as much as 2.5-3°C by 2100, with warming still increasing.

To reduce the uncertainty in which of these scenarios is likely to happen, the team, including researchers from the UK, Austria, USA, Netherlands, Germany, and Brazil, assigned a 'confidence' to each net-zero policy. They assessed 35 net zero targets, covering every country with more than 0.1% of current global greenhouse gas emissions.

The confidence assessment was based on three policy characteristics: whether the policy was legally binding, whether there was a credible policy plan guiding implementation, and whether short-term plans would already put emissions on a downward path over the next decade.

Based on this, policies were given 'higher', 'lower' or 'much lower' confidence of being fully implemented. Some regions scored highly, including the European Union, the United Kingdom and New Zealand, but around 90% scored 'lower' or 'much lower' confidence, including China and the US, which together account for more than 35% of current emissions.
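One way to picture the assessment is as a simple three-criterion score. The mapping below is a hypothetical sketch for illustration; the study's actual scoring rules are more nuanced:

```python
# Hypothetical sketch of the three-criterion confidence assessment described
# above; the mapping from criteria to labels is an assumption for illustration.
def confidence(legally_binding, credible_plan, emissions_falling):
    """Label a net-zero target from three yes/no policy characteristics."""
    score = sum([legally_binding, credible_plan, emissions_falling])
    if score == 3:
        return "higher"
    elif score == 2:
        return "lower"
    return "much lower"

print(confidence(True, True, True))     # "higher"  (e.g. a binding, planned target)
print(confidence(True, False, True))    # "lower"
print(confidence(False, False, False))  # "much lower"
```

Under any such scheme, a target that is legally binding, backed by a credible plan, and already bending emissions downward earns the top label, which is why so few of the 35 assessed targets scored 'higher'.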

Modelling emissions

From this assessment, the team modelled five scenarios of future greenhouse gas emissions and resulting temperatures. These were: considering only current policies (the most conservative scenario); only adding in policies that have a high confidence of being implemented; adding policies with high and low confidence; adding all policies regardless of confidence as if they are implemented; and a scenario where all policies are fully implemented and all NDCs are met (the most forgiving scenario).

The most conservative scenario had the largest uncertainty, with a range of 1.7-3°C and a median estimate of 2.6°C. The most optimistic scenario had a range of 1.6-2.1°C, with a median estimate of 1.7°C. This might suggest that, if all net-zero policies are fully implemented, the Paris Agreement goals are within reach. However, with so many policies ranked at the low-confidence end of the scale, this would be wishful thinking in the absence of further efforts.

Co-author Taryn Fransen, from the World Resources Institute in Washington DC, and the Energy and Resources Group at the University of California-Berkeley, said: "Climate change targets are by their nature ambitious -- there's no point in setting a target for a foregone conclusion. But implementation must follow."

Catalysing action

Only twelve out of 35 net zero policies are currently legally binding, and the researchers say increasing this number would help ensure the policies survive long-term and catalyse action. Countries also need clear implementation pathways for different sectors, outlining exactly what changes are needed and where the responsibility lies.

Co-author Dr Robin Lamboll, from the Centre for Environmental Policy at Imperial, said: "Making targets legally binding is crucial to ensure long-term plans are adopted. We need to see concrete legislation in order to trust that action will follow from promises."

Read more at Science Daily

Lingering effects of Neanderthal DNA found in modern humans

Recent scientific discoveries have shown that Neanderthal genes comprise some 1 to 4% of the genome of present-day humans whose ancestors migrated out of Africa, but how much those genes still actively influence human traits remained an open question -- until now.

A multi-institution research team including Cornell University has developed a new suite of computational genetic tools to address the genetic effects of interbreeding between humans of non-African ancestry and Neanderthals that took place some 50,000 years ago. (The study applies only to descendants of those who migrated from Africa before Neanderthals died out, and in particular, those of European ancestry.)

In a study published in eLife, the researchers reported that some Neanderthal genes are responsible for certain traits in modern humans, including several with a significant influence on the immune system. Overall, however, the study shows that modern human genes are winning out over successive generations.

"Interestingly, we found that several of the identified genes involved in modern human immune, metabolic and developmental systems might have influenced human evolution after the ancestors' migration out of Africa," said study co-lead author April (Xinzhu) Wei, an assistant professor of computational biology in the College of Arts and Sciences. "We have made our custom software available for free download and use by anyone interested in further research."

Using a vast dataset from the UK Biobank consisting of genetic and trait information of nearly 300,000 Brits of non-African ancestry, the researchers analyzed more than 235,000 genetic variants likely to have originated from Neanderthals. They found that 4,303 of those differences in DNA are playing a substantial role in modern humans and influencing 47 distinct genetic traits, such as how fast someone can burn calories or a person's natural immune resistance to certain diseases.

Unlike previous studies that could not fully exclude genes from modern human variants, the new study leveraged more precise statistical methods to focus on the variants attributable to Neanderthal genes.

While the study used a dataset of almost exclusively white individuals living in the United Kingdom, the new computational methods developed by the team could offer a path forward in gleaning evolutionary insights from other large databases to delve deeper into archaic humans' genetic influences on modern humans.

"For scientists studying human evolution interested in understanding how interbreeding with archaic humans tens of thousands of years ago still shapes the biology of many present-day humans, this study can fill in some of those blanks," said senior investigator Sriram Sankararaman, an associate professor at the University of California, Los Angeles. "More broadly, our findings can also provide new insights for evolutionary biologists looking at how the echoes of these types of events may have both beneficial and detrimental consequences."

Read more at Science Daily

Jun 8, 2023

New study identifies mechanism driving the sun's fast wind

The fastest winds ever recorded on Earth reached more than 200 miles per hour, but even those gusts pale in comparison to the sun's wind.

In a paper published June 7, 2023 in the journal Nature, a team of researchers used data from NASA's Parker Solar Probe to explain how the solar wind is capable of surpassing speeds of 1 million miles per hour. They discovered that the energy released from the magnetic field near the sun's surface is powerful enough to drive the fast solar wind, which is made up of ionized particles -- called plasma -- that flow outward from the sun.

James Drake, a Distinguished University Professor in the University of Maryland's Department of Physics and Institute for Physical Science and Technology (IPST), co-led this research alongside first author Stuart Bale of UC Berkeley. Drake said scientists have been trying to understand solar wind drivers since the 1950s -- and with the world more interconnected than ever, the implications for Earth are significant.

The solar wind forms a giant magnetic bubble, known as the heliosphere, that protects planets in our solar system from a barrage of high-energy cosmic rays that whip around the galaxy. However, the solar wind also carries plasma and part of the sun's magnetic field, which can crash into Earth's magnetosphere and cause disturbances, including geomagnetic storms.

These storms occur when the sun experiences more turbulent activity, including solar flares and enormous expulsions of plasma into space, known as coronal mass ejections. Geomagnetic storms are responsible for spectacular aurora light shows that can be seen near the Earth's poles, but at their most powerful, they can knock out a city's power grid and potentially even disrupt global communications. Such events, while rare, can also be deadly to astronauts in space.

"Winds carry lots of information from the sun to Earth, so understanding the mechanism behind the sun's wind is important for practical reasons on Earth," Drake said. "That's going to affect our ability to understand how the sun releases energy and drives geomagnetic storms, which are a threat to our communication networks."

Previous studies revealed that the sun's magnetic field was somehow driving the solar wind, but researchers didn't know the underlying mechanism. Earlier this year, Drake co-authored a paper which argued that the heating and acceleration of the solar wind is driven by magnetic reconnection -- a process that Drake has dedicated his scientific career to studying.

The authors explained that the entire surface of the sun is covered in small "jetlets" of hot plasma that are propelled upward by magnetic reconnection, which occurs when magnetic fields pointing in opposite directions cross-connect. In turn, this triggers the release of massive amounts of energy.

"Two things pointing in opposite directions often wind up annihilating each other, and in this case doing so releases magnetic energy," Drake said. "These explosions that happen on the sun are all driven by that mechanism. It's the annihilation of a magnetic field."

To better understand these processes, the authors of the new Nature paper used data from the Parker Solar Probe to analyze the plasma flowing out of the corona -- the outermost and hottest layer of the sun. In April 2021, Parker became the first spacecraft to enter the sun's corona and has been nudging closer to the sun ever since. The data cited in this paper was taken at a distance of 13 solar radii, or roughly 5.6 million miles from the sun.

"When you get very close to the sun, you start seeing stuff that you just can't see from Earth," Drake said. "All the satellites that surround Earth are 210 solar radii from the sun, and now we're down to 13. We're about as close as we're going to get."

Using this new data, the Nature paper authors provided the first characterization of the bursts of magnetic energy that occur in coronal holes, which are openings in the sun's magnetic field as well as the source of the solar wind.

The researchers demonstrated that magnetic reconnection between open and closed magnetic fields -- known as interchange reconnection -- is a continuous process, rather than a series of isolated events as previously thought. This led them to conclude that the rate of magnetic energy release, which drives the outward jet of heated plasma, was powerful enough to overcome gravity and produce the sun's fast wind.

By understanding these smaller releases of energy that are constantly occurring on the sun, researchers hope to understand -- and possibly even predict -- the larger and more dangerous eruptions that launch plasma out into space. In addition to the implications for Earth, findings from this study can be applied to other areas of astronomy as well.

"Winds are produced by objects throughout the universe, so understanding what drives the wind from the sun has broad implications," Drake said. "Winds from stars, for example, play a crucial role in shielding planetary systems from galactic cosmic rays, which can impact habitability."

Read more at Science Daily

The other side of the story: How evolution impacts the environment

The story of the peppered moths is a textbook evolutionary tale. As coal smoke darkened tree bark near England's cities during the Industrial Revolution, white-bodied peppered moths became conspicuous targets for predators and their numbers quickly dwindled. Meanwhile, black-bodied moths, which had been rare, thrived and became dominant in their newly darkened environment.

The peppered moths became a classic example of how environmental change drives species evolution. But in recent years, scientists have begun thinking about the inverse process. Might there be a feedback loop in which species evolution drives ecological change? Now, a new study by researchers at the University of Rhode Island shows some of the best evidence yet for that very phenomenon.

In research published in the Proceedings of the National Academy of Sciences, the researchers show that an evolutionary change in the length of lizards' legs can have a significant impact on vegetation growth and spider populations on small islands in the Bahamas. This is one of the first times, the researchers say, that such dramatic evolution-to-environment effects have been documented in a natural setting.

"The idea here is that, in addition to the environment shaping the traits of organisms through evolution, those trait changes should feed back and drive changes in predator-prey relationships and other ecological interactions between species," said Jason Kolbe, a professor of biological sciences at the University of Rhode Island and one of the study's senior authors. "And we really need to understand how those dynamics work so we can make predictions about how populations are going to persist, and what sort of ecological changes might result."

For the last 20 years, Kolbe and his colleagues have been observing the evolutionary dynamics of anole lizard populations on a chain of tiny islands in the Bahamas. The chain is made up of around 40 islands ranging from a few dozen to a few hundred square meters in area -- small enough that the researchers can keep close tabs on the lizards living there. And the islands are far enough apart that lizards can't easily hop from one island to another, so distinct populations can be isolated from each other.

Previous research had shown that brown anoles adapt quickly to the characteristics of surrounding vegetation. In habitats where the diameter of brush and tree limbs is smaller, natural selection favors lizards with shorter legs, which enable individuals to move more quickly when escaping predators or chasing a snack. In contrast, lankier lizards tend to fare better where the tree and plant limbs are thicker. Researchers have shown that this limb length trait can evolve quickly in brown anoles -- in just a few generations.

For this new study, Kolbe and his team wanted to see how this evolved limb-length trait might affect the ecosystems on the tiny Bahamian islands. The idea was to separate short- and long-legged lizards on islands of their own, then look for differences in how the lizard populations affect the ecology of their island homes.

Armed with specialized lizard wrangling gear -- poles with tiny lassos made of dental floss at the end -- the team captured hundreds of brown anoles. They then measured the leg length of each lizard, keeping the ones whose limbs were either especially long or especially short and returning the rest to the wild. Once they had distinct populations of short- and long-limbed lizards, they set each population free on islands that previously had no lizards living on them.

Since the experimental islands were mostly covered by smaller diameter vegetation, the researchers expected that the short-legged lizards would be better adapted to that environment, that is, more maneuverable and better able to catch prey in the trees and brush. The question the researchers wanted to answer was whether the ecological effects of those highly effective hunters could be detected.

After eight months, the researchers checked back on the islands to look for ecological differences between islands stocked with the short- and long-legged groups. The differences, it turned out, were substantial. On islands with shorter-legged lizards, populations of web spiders -- a key prey item for brown anoles -- were reduced by 41% compared to islands with lanky lizards. There were significant differences in plant growth as well. Because the short-legged lizards were better at preying on insect herbivores, plants flourished. On islands with short-legged lizards, buttonwood trees had twice as much shoot growth compared to trees on islands with long-legged lizards, the researchers found.

The results, Kolbe says, help to bring the interaction between ecology and evolution full circle.

"These findings help us to close that feedback loop," Kolbe said. "We knew from previous research that ecological factors shape limb length, and now we show the reciprocal relationship of that evolutionary change on the environment."

Read more at Science Daily

How does dopamine regulate both learning and motivation?

A new study from the Netherlands Institute for Neuroscience brings together two schools of thought on the function of the neurotransmitter dopamine: one saying that dopamine provides a learning signal, the other saying that dopamine drives motivation. 'But it is probably both', says Ingo Willuhn.

It is well known that the dopamine system is implicated in signaling reward-related information as well as in actions that generate rewarding outcomes. This can be investigated using either Pavlovian or operant conditioning experiments. Pavlovian conditioning describes how the brain makes an association between two situations or stimuli that previously seemed unrelated. A famous example is Pavlov's experiment, in which a dog heard a sound before receiving food. After several such pairings of the sound with food delivery, the sound alone began to cause the dog to salivate. Operant conditioning, or instrumental learning, differs in that the individual's behavior matters for earning the food reward: after hearing a sound, the individual has to perform a so-called operant action to receive it. In animal experiments, such an operant response is often the pressing of a lever.

Dopamine measurements in nucleus accumbens

In the final paper of Jessica Goedhoop's PhD, written in collaboration with Tara Arbab and Ingo Willuhn at the Netherlands Institute for Neuroscience, the team takes a closer look at the role of dopamine signaling in learning and motivation. They directly compared the two conditioning paradigms: male rats underwent either Pavlovian or operant conditioning while dopamine release was measured in the nucleus accumbens, a brain region central to processing this information. During the experiments, a cue light was illuminated for 5 seconds. For the Pavlovian group, a food pellet was delivered into the reward magazine directly after the cue light turned off. For the operant conditioning group, the cue light turning off was followed by extension of a lever, below the cue light, into the operant box. The lever was retracted after one press, which immediately resulted in the delivery of one food pellet into the food magazine. If there was no lever press within 5 seconds of lever extension, the lever was retracted and no reward was delivered.
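The trial structure just described can be summarized schematically. The timings come from the text; the function and variable names are illustrative, not from the study's code:

```python
# Schematic of the two trial types described above (timings from the text;
# names are illustrative placeholders, not the study's actual code).
def run_trial(paradigm, lever_press_latency=None):
    """Return True if a food pellet is delivered on this trial.

    paradigm: 'pavlovian' or 'operant'
    lever_press_latency: seconds after lever extension, or None if no press
    """
    RESPONSE_WINDOW = 5.0  # lever available for 5 s (operant trials only)

    if paradigm == "pavlovian":
        # pellet delivered directly after the 5 s cue light turns off
        return True
    if paradigm == "operant":
        # reward requires one lever press within the response window
        return (lever_press_latency is not None
                and lever_press_latency <= RESPONSE_WINDOW)
    raise ValueError("unknown paradigm")

print(run_trial("pavlovian"))                         # True
print(run_trial("operant", lever_press_latency=2.0))  # True
print(run_trial("operant"))                           # False (no press, no reward)
```

The key difference the study exploits is visible here: in the Pavlovian case the reward is unconditional once the cue has been presented, while in the operant case it is contingent on the animal's own action.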

Sustained dopamine release in operant conditioning

Rats in both groups released the same quantity of dopamine at the onset of the reward-predictive cue. However, only the operant-conditioning group showed a subsequent, sustained plateau in dopamine concentration throughout the entire 5-second cue presentation, before the lever press. This sustained dopamine release was observed reliably and consistently across systematic manipulations of experimental parameters and throughout behavioral training. The researchers therefore believe that sustained dopamine levels may be an intermediate between learning and action, conceptually related to the motivation to generate a reward-achieving action.

Ingo Willuhn: 'There have been a lot of studies on dopamine. We have a decent idea of when dopamine is released in the brain, but there is still lots of discussion on what the precise variables are that determine such dopamine signaling. Essentially discussion on what dopamine "means." To investigate this, scientists usually perform either Pavlovian or operant conditioning experiments. But they test slightly different things. Both have to do with learning an association between a neutral stimulus and a reward. But operant conditioning requires the motivation to perform an action in addition to that (to earn the reward). Therefore, we compared the two types of conditioning in the same experiment.'

Adding a piece to the puzzle

'Our results bring together the two camps of scientists that often battle with each other: one says that dopamine is a so-called reward-prediction error signal, meaning that dopamine is released when something better than expected happens, and is suppressed when something worse than expected happens. It is a learning (or teaching) signal. The other camp says that this is not true. They say that dopamine has something to do with motivation. Increased dopamine release will invigorate the subjects and they work harder to get the reward. There have been a few attempts in the past to bring these two camps together, but there is still need for more knowledge on the subject.'

'What we saw in our study is that dopamine levels stayed high only in the operant-learning task. It seems that motivation is encoded in this plateau. Reward prediction is the initial dopamine peak, but how long the signal stays elevated reflects motivation. Thus, our paper suggests that dopamine may be involved in both learning and motivation. The next steps will be to get more details out of this. We need to replicate the experiments and make them more sophisticated. The more sophisticated we make them, the more precise our predictions have to be. We are going to build on it and see whether it still holds up.'
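The reward-prediction-error account described in the quotes above can be illustrated with a minimal sketch (an illustration of the general concept, not the study's actual model): a value estimate is updated by a fraction of the difference between the received reward and the current expectation, and that difference (the "error") shrinks as the reward becomes fully predicted.

```python
# Minimal delta-rule sketch of a reward-prediction error (RPE) signal.
# This is a textbook illustration, not the model used in the study.

def train_rpe(rewards, alpha=0.1):
    """Update a value estimate V toward received rewards via the delta rule."""
    V = 0.0          # expectation of reward
    errors = []      # the "dopamine-like" prediction-error signal per trial
    for r in rewards:
        delta = r - V        # positive if better than expected, negative if worse
        V += alpha * delta   # learning: move the expectation toward the outcome
        errors.append(delta)
    return V, errors

# After many identical pairings, the reward is fully predicted and the error vanishes:
V, errors = train_rpe([1.0] * 50)
print(errors[0])   # first trial: maximal surprise
print(errors[-1])  # late trials: near zero
```

This captures the "learning signal" camp's view; the study's contribution is that the sustained plateau after this initial peak may additionally encode motivation.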

Implications

'Dopamine is not only involved in everyday life but also in disorders such as addiction, Parkinson's disease, and schizophrenia. Because these two camps exist, there is disagreement about what exactly happens. For example, some researchers say that when addicts take drugs, dopamine release increases and, as a consequence, all the environmental cues become more meaningful. Addicts learn that these cues are associated with the drug, and they take more and more of the drug because they are constantly reminded of it everywhere. In this view, addiction is misguided learning. Other researchers would say that the motivation to take the drug intensifies with more frequent intake, because the drug elevates dopamine release. This study indicates that it may be both. Depending on the precise timing, either system could be the driver, and both could be involved.'

Read more at Science Daily

Remains of an extinct world of organisms discovered

Newly discovered biomarker signatures point to a whole range of previously unknown organisms that dominated complex life on Earth about a billion years ago. They differed from complex eukaryotic life as we know it, such as animals, plants and algae, in their cell structure and likely in their metabolism, which was adapted to a world with far less oxygen in the atmosphere than today. An international team of researchers, including GFZ geochemist Christian Hallmann, now reports on this breakthrough for the field of evolutionary geobiology in the journal Nature.

The previously unknown "protosteroids" were shown to be surprisingly abundant throughout Earth's Middle Ages. The primordial molecules were produced at an earlier stage of eukaryotic complexity -- extending the current record of fossil steroids beyond 800 million and up to 1,600 million years ago. Eukaryotes is the term for the domain of life that includes all animals, plants and algae, set apart from bacteria by a complex cell structure that includes a nucleus, as well as more complex molecular machinery. "The highlight of this finding is not just the extension of the current molecular record of eukaryotes," Hallmann says. "Given that the last common ancestor of all modern eukaryotes, including us humans, was likely capable of producing 'regular' modern sterols, chances are high that the eukaryotes responsible for these rare signatures belonged to the stem of the phylogenetic tree."

Unprecedented glimpse of a lost world

This "stem" represents the common ancestral lineage that was a precursor to all still living branches of eukaryotes. Its representatives are long extinct, yet details of their nature may shed more light on the conditions surrounding the evolution of complex life. Although more research is needed to evaluate what percentage of protosteroids may have had a rare bacterial source, the discovery of these new molecules not only reconciles the geological record of traditional fossils with that of fossil lipid molecules, but yields a rare and unprecedented glimpse of a lost world of ancient life. The competitive demise of stem group eukaryotes, marked by the first appearance of modern fossil steroids some 800 Million years ago, may reflect one of the most incisive events in the evolution of increasingly complex life.

"Almost all eukaryotes biosynthesise steroids, such as cholesterol that is produced by humans and most other animals" adds Benjamin Nettersheim from the University of Bremen, first author of the study -- "due to potentially adverse health effects of elevated cholesterol levels in humans, cholesterol doesn't have the best reputation from a medical perspective. However, these lipid molecules are integral parts of eukaryotic cell membranes where they aid in a variety of physiological functions. By searching for fossilised steroids in ancient rocks, we can trace the evolution of increasingly complex life."

What the Nobel laureate thought impossible...

Nobel laureate Konrad Bloch had already speculated about such a biomarker in an essay almost 30 years ago. Bloch suggested that short-lived intermediates in the modern biosynthesis of steroids may not always have been intermediates. He believed that lipid biosynthesis evolved in parallel with changing environmental conditions throughout Earth history. In contrast to Bloch, who did not believe that these ancient intermediates could ever be found, Nettersheim started searching for protosteroids in ancient rocks that were deposited at a time when those intermediates could actually have been the final product.

But how to find such molecules in ancient rocks? "We employed a combination of techniques to first convert various modern steroids to their fossilised equivalent; otherwise we wouldn't have even known what to look for," says Jochen Brocks, professor at the Australian National University who shares the first-authorship of the new study with Nettersheim. Scientists had overlooked these molecules for decades because they do not conform to typical molecular search images. "Once we knew our target, we discovered that dozens of other rocks, taken from billion-year-old waterways across the world, were oozing with similar fossil molecules."

The oldest samples with the biomarker are from the Barney Creek Formation in Australia and are 1.64 billion years old. The rock record of the next 800 million years only yields fossil molecules of primordial eukaryotes before molecular signatures of modern eukaryotes first appear in the Tonian period. According to Nettersheim, "the Tonian Transformation emerges as one of the most profound ecological turning points in our planet's history." Hallmann adds that "both primordial stem groups and modern eukaryotic representatives such as red algae may have lived side by side for many hundreds of millions of years." During this time, however, the Earth's atmosphere became increasingly enriched with oxygen -- a metabolic product of cyanobacteria and of the first eukaryotic algae that would have been toxic to many other organisms. Later, global "Snowball Earth" glaciations occurred and the protosterol communities largely died out. The last common ancestor of all living eukaryotes may have lived 1.2 to 1.8 billion years ago. Its descendants were likely better able to survive heat and cold as well as UV radiation and displaced their primordial relatives.

Read more at Science Daily

Jun 7, 2023

Not your average space explosion: Very long baseline array finds classical novae are anything but simple

While studying classical novae using the National Radio Astronomy Observatory's Very Long Baseline Array (VLBA), a graduate researcher uncovered evidence that these objects may have been erroneously typecast as simple. The new observations, which detected non-thermal emission from a classical nova with a dwarf companion, were presented today at a press conference during the 242nd meeting of the American Astronomical Society in Albuquerque, New Mexico.

V1674 Herculis is a classical nova hosted by a white dwarf with a dwarf companion and is currently the fastest classical nova on record. While studying V1674 Her with the VLBA, Montana Williams, a graduate student at New Mexico Tech who is leading the investigation into the VLBA properties of this nova, confirmed the unexpected: non-thermal emission coming from it. These data are important because they tell Williams and her collaborators a lot about what's happening in the system. What the team has found is anything but the simple heat-induced explosion scientists previously expected from classical novae.

"Classical novae have historically been considered simple explosions, emitting mostly thermal energy," said Williams. "However, based on recent observations with the Fermi Large Area Telescope, this simple model is not entirely correct. Instead, it seems they're a bit more complicated. Using the VLBA, we were able to get a very detailed picture of one of the main complications, the non-thermal emission."

Very long baseline interferometry (VLBI) detections of classical novae with dwarf companions like V1674 Her are rare. They're so rare, in fact, that this same type of detection, with resolved radio synchrotron components, has been reported just one other time to date. That's partly because of the assumed nature of classical novae.

"VLBI detections of novae are only recently becoming possible because of improvements to VLBI techniques, most notably the sensitivity of the instruments and the increasing bandwidth or the amount of frequencies we can record at a given time," said Williams. "Additionally, because of the previous theory of classical novae they weren't thought to be ideal targets for VLBI studies. We now know this isn't true because of multi-wavelength observations which indicate a more complex scenario."

That rarity makes the team's new observations an important step in understanding the hidden lives of classical novae and what ultimately leads to their explosive behavior.

"By studying images from the VLBA and comparing them to other observations from the Very Large Array (VLA), Fermi-LAT, NuSTAR, and NASA-Swift, we can determine what might be the cause of the emission and also make adjustments to the previous simple model," said Williams. "Right now, we're trying to determine if the non-thermal energy is coming from clumps of gas running into other clumped gas which produces shocks, or something else."

Because Fermi-LAT and NuSTAR observations had already indicated that there might be non-thermal emission coming from V1674 Her, the classical nova was an ideal candidate for study: Williams and her collaborators are on a mission to either confirm or refute those types of findings. It was also more interesting, or cute, as Williams puts it, because of its hyper-fast evolution, and because, unlike supernovae, the host system isn't destroyed during that evolution, but rather remains almost completely intact and unchanged after the explosion. "Many astronomical sources don't change much over the course of a year or even 100 years. But this nova got 10,000 times brighter in a single day, then faded back to its normal state in just about 100 days," she said. "Because the host systems of classical novae remain intact they can be recurrent, which means we might see this one erupt, or cutely explode, again and again, giving us more opportunities to understand why and how it does."

Read more at Science Daily

Weather anomalies are keeping insects active longer

As Earth's climate continues to warm due to the emission of greenhouse gases, extreme and anomalous weather events are becoming more common. But predicting and analyzing the effects of what is, by definition, an anomaly can be tricky.

Scientists say museum specimens can help. In the first study of its kind, researchers at the University of Florida used natural history specimens to show that unseasonably warm and cold days can prolong the active period of moths and butterflies by nearly a month.

"The results are not at all what we expected," said lead author Robert Guralnick, curator of biodiversity informatics at the Florida Museum of Natural History.

Most studies view climate change and its consequences through the lens of average temperature increases. As temperatures rise over time, the plants and animals in a particular region become active earlier in the spring, delay dormancy until later in the fall and slowly shift their ranges to align with the climate in which they're best suited to survive.

Erratic weather adds a layer of complexity to these patterns, with unknown consequences that erect an opaque screen ahead of scientists attempting to predict the future of global ecosystems.

"There had been hints in the scientific literature that weather anomalies can have cumulative effects on ecosystems, but there wasn't anything that directly addressed this question at a broad scale," Guralnick said.

This omission, he explained, was due primarily to a lack of sufficient data. While climate data has been reliably collected in many areas of the world for more than a century, records documenting the location and activity of organisms are harder to come by.

Natural history museums have been increasingly regarded as a potential solution. The oldest museums have accumulated specimens for hundreds of years, and recent efforts to digitize collections have made their contents widely available. But digital museum records come with their own unique pitfalls and drawbacks.

In 2022, study co-author Michael Belitz constructed a dataset of moths and butterflies from museum collections to chart a course for other researchers hoping to use similar data. The result was a comprehensive instruction manual for how to gather, organize and analyze information from natural history specimens.

With this robust resource at their disposal, Belitz and his colleagues wanted to see if they could detect a signal from aberrant weather patterns. Restricting their analyses to the eastern United States, the authors used records for 139 moth and butterfly species collected from the 1940s through the 2010s.

Their results were unequivocal: unusually warm and cold weather has altered insect activity to a greater extent than the gradual rise in average global temperature over the last several decades.

The location and timing of extreme weather events influenced how insects responded. In higher latitudes, warm days in winter meant moths and butterflies became active earlier in the spring. Unusually cold days kept insects at all latitudes active longer, and the combination of exceptionally high and low temperatures had the strongest effect.

"If you have a succession of abnormally cold and warm days, it limits the ability of insects to function at peak performance," Guralnick said. "If cold doesn't kill you, it slows you down, and it might force insects into a torpor. Insects can recover from the cold snaps pretty quickly and go on to have longer lifespans as a direct result of sudden temperature declines."

Insects being active for longer periods of time might initially seem like a good thing. But rather than acting as a counterweight to the negative repercussions of climate change, longer or altered insect lifespans may also mean more opportunities for pathogen transmission, points out co-author Lindsay Campbell, who studies mosquitoes.

"There's a correlation between El Niño and rift valley fever outbreaks in East Africa, and there are anecdotal observations that show unusually warm or hot and dry springs, followed by a heavy precipitation event, are also linked with increased outbreaks," said Campbell, an assistant professor at the University of Florida.

Long-term ecosystem stability is also entirely dependent on the synchronized activity of its constituent parts, and plants may not respond to extreme weather in the same way as insects. If moths and butterflies take flight too early, they risk encountering plants that haven't yet produced leaves or flowers, expending their energy in a vain search for food.

And with a constantly shifting baseline for what constitutes 'extreme,' it's unclear if insects will be able to keep pace with the changes.

Read more at Science Daily

Robot 'chef' learns to recreate recipes from watching food videos

Researchers have trained a robotic 'chef' to watch and learn from cooking videos, and recreate the dish itself.

The researchers, from the University of Cambridge, programmed their robotic chef with a 'cookbook' of eight simple salad recipes. After watching a video of a human demonstrating one of the recipes, the robot was able to identify which recipe was being prepared and make it.

In addition, the videos helped the robot incrementally add to its cookbook. At the end of the experiment, the robot came up with a ninth recipe on its own. Their results, reported in the journal IEEE Access, demonstrate how video content can be a valuable and rich source of data for automated food production, and could enable easier and cheaper deployment of robot chefs.

Robotic chefs have been featured in science fiction for decades, but in reality, cooking is a challenging problem for a robot. Several commercial companies have built prototype robot chefs, although none of these are currently commercially available, and they lag well behind their human counterparts in terms of skill.

Human cooks can learn new recipes through observation, whether that's watching another person cook or watching a video on YouTube, but programming a robot to make a range of dishes is costly and time-consuming.

"We wanted to see whether we could train a robot chef to learn in the same incremental way that humans can -- by identifying the ingredients and how they go together in the dish," said Grzegorz Sochacki from Cambridge's Department of Engineering, the paper's first author.

Sochacki, a PhD candidate in Professor Fumiya Iida's Bio-Inspired Robotics Laboratory, and his colleagues devised eight simple salad recipes and filmed themselves making them. They then used a publicly available neural network to train their robot chef. The neural network had already been programmed to identify a range of different objects, including the fruits and vegetables used in the eight salad recipes (broccoli, carrot, apple, banana and orange).

Using computer vision techniques, the robot analysed each frame of video and was able to identify the different objects and features, such as a knife and the ingredients, as well as the human demonstrator's arms, hands and face. Both the recipes and the videos were converted to vectors, and the robot performed mathematical operations on the vectors to determine the similarity between a demonstration and each recipe.

By correctly identifying the ingredients and the actions of the human chef, the robot could determine which of the recipes was being prepared. The robot could infer that if the human demonstrator was holding a knife in one hand and a carrot in the other, the carrot would then get chopped up.
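The vector-matching step described above can be pictured with a toy sketch (the ingredient-count encoding and cosine similarity used here are illustrative assumptions, not necessarily the representation used in the paper): each recipe and each demonstration becomes a vector, and the closest recipe wins. Because cosine similarity is scale-invariant, a double portion still matches the same recipe, mirroring the behaviour reported below.

```python
# Toy sketch of matching a demonstration to a recipe via vector similarity.
# Encoding and similarity measure are assumptions for illustration only.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical vocabulary: counts of [broccoli, carrot, apple, banana, orange]
recipes = {
    "carrot-apple salad": [0, 1, 1, 0, 0],
    "fruit salad":        [0, 0, 1, 1, 1],
}
demo = [0, 2, 2, 0, 0]  # a double portion: same proportions, scaled counts

best = max(recipes, key=lambda name: cosine(recipes[name], demo))
print(best)  # cosine ignores overall scale, so the double portion still matches
```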

Of the 16 videos it watched, the robot recognised the correct recipe 93% of the time, even though it only detected 83% of the human chef's actions. The robot was also able to detect that slight variations in a recipe, such as making a double portion or normal human error, were variations and not a new recipe. The robot also correctly recognised the demonstration of a new, ninth salad, added it to its cookbook and made it.

"It's amazing how much nuance the robot was able to detect," said Sochacki. "These recipes aren't complex -- they're essentially chopped fruits and vegetables, but it was really effective at recognising, for example, that two chopped apples and two chopped carrots is the same recipe as three chopped apples and three chopped carrots."

The videos used to train the robot chef are not like the food videos made by some social media influencers, which are full of fast cuts and visual effects, and quickly move back and forth between the person preparing the food and the dish they're preparing. For example, the robot would struggle to identify a carrot if the human demonstrator had their hand wrapped around it -- for the robot to identify the carrot, the human demonstrator had to hold up the carrot so that the robot could see the whole vegetable.

"Our robot isn't interested in the sorts of food videos that go viral on social media -- they're simply too hard to follow," said Sochacki. "But as these robot chefs get better and faster at identifying ingredients in food videos, they might be able to use sites like YouTube to learn a whole range of recipes."

Read more at Science Daily

Study shows promising treatment for tinnitus

Tinnitus, the ringing, buzzing or hissing sound of silence, varies from slightly annoying in some to utterly debilitating in others. Up to 15% of adults in the United States have tinnitus; nearly 40% of sufferers have the condition chronically and actively seek relief.

A recent study from researchers at the University of Michigan's Kresge Hearing Research Institute suggests relief may be possible.

Susan Shore, Ph.D., Professor Emerita in Michigan Medicine's Department of Otolaryngology and U-M's Departments of Physiology and Biomedical Engineering, led research on how the brain processes bi-sensory information, and how these processes can be harnessed for personalized stimulation to treat tinnitus.

Her team's findings were published in JAMA Network Open.

The study, a double-blind, randomized clinical trial, recruited 99 individuals with somatic tinnitus, a form of the condition in which movements such as clenching the jaw, or applying pressure to the forehead, result in a noticeable change in pitch or loudness of experienced sounds. Nearly 70% of tinnitus sufferers have the somatic form.

According to Shore, candidates with bothersome, somatic tinnitus, as well as normal-to-moderate hearing loss, were eligible to participate.

"After enrollment, participants received a portable device developed and manufactured by in2being, LLC, for in-home use," she said. "The devices were programmed to present each participant's personal tinnitus spectrum, which was combined with electrical stimulation to form a bi-sensory stimulus, while maintaining participant and study team blinding."

Study participants were randomly assigned to one of two groups. The first group received bi-sensory, or active, treatment first, while the second received sound-alone, or control, treatment.

For the first six weeks, participants were instructed to use their devices for 30 minutes each day. The next six weeks gave participants a break from daily use, followed by six more weeks of the treatment not received in the beginning of the study.

Shore notes that every week, participants completed the Tinnitus Functional Index, or TFI, and Tinnitus Handicap Inventory, or THI, which are questionnaires that measure the impact tinnitus has on individuals' lives. Participants also had their tinnitus loudness assessed during this time.

The team found that when participants received the bi-sensory treatment, they consistently reported improved quality of life, lower handicap scores and significant reductions in tinnitus loudness. However, these effects were not seen when participants received the sound-only stimulation.

Further, more than 60% of participants reported significantly reduced tinnitus symptoms after the six weeks of active treatment, but not control treatment. This is consistent with an earlier study by Shore's team, which showed that the longer participants received active treatment, the greater the reduction in their tinnitus symptoms.

"This study paves the way for the use of personalized, bi-sensory stimulation as an effective treatment for tinnitus, providing hope for millions of tinnitus sufferers," said Shore.

Read more at Science Daily

Jun 4, 2023

Towering plume of water escaping from Saturn moon

Two Southwest Research Institute scientists were part of a James Webb Space Telescope (JWST) team that observed a towering plume of water vapor more than 6,000 miles long -- roughly the distance from the U.S. to Japan -- spewing from the surface of Saturn's moon, Enceladus. In light of this NASA JWST Cycle 1 discovery, SwRI's Dr. Christopher Glein also received a Cycle 2 allocation to study the plume as well as key chemical compounds on the surface, to better understand the potential habitability of this ocean world.

During its 13-year reconnaissance of the Saturn system, the Cassini spacecraft discovered that Enceladus has a subsurface ocean of liquid water, and Cassini analyzed samples as plumes of ice grains and water vapor erupted into space from cracks in the moon's icy surface.

"Enceladus is one of the most dynamic objects in the solar system and is a prime target in humanity's search for life beyond Earth," said Glein, a leading expert in extraterrestrial oceanography. He is a co-author of a paper recently accepted by Nature Astronomy. "In the years since NASA's Cassini spacecraft first looked at Enceladus, we never cease to be amazed by what we find is happening on this extraordinary moon."

Once again, the latest observations made with Webb's Near InfraRed Spectrograph have yielded remarkable results.

"When I was looking at the data, at first, I was thinking I had to be wrong, it was just so shocking to map a plume more than 20 times the diameter of the moon," said Geronimo Villanueva of NASA's Goddard Space Flight Center and lead author of the recent paper. "The plume extends far beyond what we could have imagined."

Webb's sensitivity reveals a new story about Enceladus and how it feeds the water supply for the entire system of Saturn and its rings. As Enceladus whips around the gas giant in just 33 hours, the moon spews water, leaving a halo, almost like a donut, in its wake. The plume is not only huge, but the water spreads across Saturn's dense E-ring. JWST data indicate that roughly 30 percent of the water stays in the moon's wake, while the other 70 percent escapes to supply the rest of the Saturnian system.

"The Webb observations, for the first time, are visually illustrating how the moon's water vapor plumes are playing a role in the formation of the torus," said SwRI's Dr. Silvia Protopapa, an expert in the compositional analysis of icy bodies in the solar system who was also on the Cycle 1 team. "This serves as a stunning testament to Webb's extraordinary abilities. I'm thrilled to be part of the Cycle 2 team as we initiate our search for new indications of habitability and plume activity on Enceladus."

Spurred by the incredible findings from Webb's first fleeting glimpse of Enceladus, Glein is leading the same team that will observe Enceladus again with JWST in the next year.

"We will search for specific indicators of habitability, such as organic signatures and hydrogen peroxide," Glein said. "Hydrogen peroxide is particularly interesting because it can provide much more potent sources of metabolic energy than what we previously identified. Cassini didn't give us a clear answer on the availability of such strong oxidants on Enceladus."

The new observations will provide the best remote opportunity to search for habitability indicators on the surface, by boosting the signal-to-noise ratio by up to a factor of 10 compared with Cycle 1. Understanding the time variability of plume outgassing is also important to plan for future planetary science missions that target the plume.

Read more at Science Daily

Thermal energy stored by land masses has increased significantly

There are many effects of climate change. Perhaps the most broadly known is global warming, which is caused by heat building up in various parts of the Earth system, such as the atmosphere, the ocean, the cryosphere and the land. About 89 percent of this excess heat is stored in the oceans, with the rest in ice and glaciers, the atmosphere and land masses (including inland water bodies). An international research team led by the Helmholtz Centre for Environmental Research (UFZ) has now studied the quantity of heat stored on land, showing the distribution of land heat among the continental ground, permafrost soils, and inland water bodies. The calculations, published in Earth System Dynamics, show that the quantity of heat stored there per decade has increased more than 20-fold since the 1960s, with the largest increase being in the ground.

The increase in anthropogenic greenhouse gases in the atmosphere impedes the emission of heat into space. As a result, the Earth constantly absorbs more energy through solar radiation than it can radiate back as heat. Previous studies show where this additional energy is stored: primarily in the oceans (89 percent), but also in the land masses of the continents (5-6 percent), in ice and glaciers (4 percent) and in the atmosphere (1-2 percent). However, this knowledge is incomplete: for example, it was previously uncertain just how this additional heat was distributed within the continental land masses.

The research team, headed by the UFZ and with the participation of scientists from the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research (AWI), Vrije Universiteit Brussel and other research centres, was able to quantify more precisely how much heat was stored in the continental land masses between 1960 and 2020. The result: the continental land masses absorbed a total of 23.8 × 10^21 joules of heat between 1960 and 2020. For comparison, this corresponds to roughly 1,800 times the electricity consumption of Germany over the same period. Most of this heat, roughly 90 percent, is stored in the ground at depths of up to 300 metres; 9 percent of the energy has gone into thawing Arctic permafrost, and 0.7 percent is stored in inland water bodies such as lakes and reservoirs. "Although the inland water bodies and permafrost store less heat than the ground, they have to be monitored continuously because the additional energy in these subsystems causes significant changes in ecosystems," says UFZ researcher and lead author of the study Francisco José Cuesta-Valero.

The scientists also demonstrated that the quantity of heat stored in the ground, in permafrost and in lakes has been increasing continuously since the 1960s. Comparing the decades 1960-1970 and 2010-2020, for example, this quantity increased nearly 20-fold in the ground, from 1.007 to 18.83 × 10^21 joules; in permafrost regions it rose from 0.058 to 2.0 × 10^21 joules, and in inland water bodies from -0.02 to 0.17 × 10^21 joules. The researchers used more than 1,000 temperature profiles worldwide to calculate the quantities of heat stored at depths of up to 300 metres. They used models to estimate the thermal storage in permafrost and inland water bodies: for example, they combined global lake models, hydrological models and Earth system models to model the waters, and estimated thermal storage in permafrost with a permafrost model that accounts for various plausible distributions of ground ice in the Arctic. "Using models enabled us to compensate for the lack of observations in many lakes and in the Arctic and to better estimate the uncertainties due to the limited number of observations," explains Francisco José Cuesta-Valero.
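A quick arithmetic check of the figures quoted above (in units of 10^21 joules) confirms the roughly 20-fold increase in ground heat storage between the two decades:

```python
# Sanity-check of the decade-to-decade heat storage figures quoted in the text.
# All values are in units of 10^21 J, comparing 1960-1970 with 2010-2020.
ground_1960s, ground_2010s = 1.007, 18.83
perma_1960s, perma_2010s = 0.058, 2.0

print(round(ground_2010s / ground_1960s, 1))  # ground: ~18.7x, i.e. "nearly 20-fold"
print(round(perma_2010s / perma_1960s, 1))    # permafrost: an even larger relative jump
```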

Quantifying this thermal energy is important because its increase is associated with processes that can alter ecosystems and thus have consequences for society. This applies, for example, to the permanently frozen ground in the Arctic. "Although the quantity of heat stored in the permafrost may only comprise nine percent of continental heat storage, the increase over recent years further promotes the release of greenhouse gases such as carbon dioxide and methane due to thawing of permafrost," says Francisco José Cuesta-Valero. If the thermal energy stored in the ground increases, the surface of the earth heats up, placing, for example, the stability of the carbon pool in the ground at risk. In agricultural areas, the associated warming of the surface could endanger harvests and hence the food security of the population. In inland water bodies, the changed thermal state could affect the dynamics of the ecosystems: water quality deteriorates, the carbon cycle is disrupted, and algal blooms increase, in turn affecting oxygen concentration and primary productivity, and thereby fishery production.

Read more at Science Daily

Desert ants increase the visibility of their nest entrances in the absence of landmarks

Desert ants have outstanding navigational skills. They live in the saltpans of North Africa, an extremely inhospitable environment. To find food for their nest mates, foraging ants have to walk far into the desert. Once they have found food, for example a dead insect, their actual problem begins: How do they find their way back to their nest as quickly as possible in the extremely hot and barren environment? "The desert ant Cataglyphis fortis stands out due to its remarkable ability to successfully navigate and forage in even the harshest environments, making it an excellent subject for studying the intricacies of navigation. With an innate navigation mechanism called path integration, these ants use both a sun compass and a step counter to measure the distances they cover. In addition, they possess the ability to learn and utilize visible and olfactory cues. We believe that this extremely harsh habitat has led, during evolution, to a navigation system of unsurpassed precision," said Marilia Freire, the study's lead author, summarizing what is known so far about the amazing orientation skills of these small animals.
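Path integration, as described above, amounts to simple vector bookkeeping: the ant continuously sums its displacement vectors (heading from the sun compass, distance from the step counter), and the vector pointing home is the negative of that sum. A minimal sketch of the idea; the foraging run below is invented for illustration:

```python
import math

def integrate_path(steps):
    """Sum displacement vectors (heading in degrees, distance in metres)
    and return the home vector as (distance, heading)."""
    x = y = 0.0
    for heading_deg, distance in steps:
        x += distance * math.cos(math.radians(heading_deg))
        y += distance * math.sin(math.radians(heading_deg))
    home_distance = math.hypot(x, y)
    # The home vector points back to the nest: opposite the net displacement.
    home_heading = math.degrees(math.atan2(-y, -x)) % 360
    return home_distance, home_heading

# Hypothetical run: 30 m east (0 deg), then 40 m north (90 deg), food found.
dist, heading = integrate_path([(0, 30.0), (90, 40.0)])
print(f"home vector: {dist:.1f} m at {heading:.0f} deg")
# -> home vector: 50.0 m at 233 deg
```

The real ants integrate continuously while walking, of course, rather than over discrete segments; this sketch only illustrates why a step counter plus a compass suffices to compute a direct route home.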

The scientists had noticed during previous studies in Tunisia that the nests in the center of the saltpans, where there are hardly any visible landmarks, had high mounds at the nest entrances. In contrast, nest mounds near the shrub-covered edges of the saltpans were lower or barely noticeable. So the research team had wondered for some time whether these visible differences serve a purpose in helping the ants find their way home. "It's always hard to tell whether an animal does something on purpose or not. The high nest mounds in the middle of the saltpans could have been a side effect of differences in soil structure or wind conditions. However, crucial for our study was the idea to remove the mounds, to provide some nests with artificial landmarks and others not, and to observe what would happen," explains Markus Knaden, head of the Project Group Odor-guided Behavior in the Department of Evolutionary Neuroethology, describing the goal of the study.

For their experiments, the researchers followed the ants with a GPS device. This allowed them to track the ants on their way out onto the saltpan and back home. "We observed that desert ants are capable of traveling much greater distances than previously reported. The farthest distance a single animal traveled was more than two kilometers. However, we also observed an unexpectedly high mortality rate. About 20% of foraging ants did not find their way back home after extremely long runs and died before our eyes, which explains the enormous selection pressure for even better orientation," says Marilia Freire.

Experiments in which ants could be tracked with particular accuracy during the last meters to the nest, thanks to a grid painted on the floor, showed that the nest hills are important visual cues. If they were removed, fewer ants found their way back to the nest, while their nest mates simultaneously began to rebuild nest mounds as quickly as possible. If, on the other hand, the scientists placed artificial landmarks in the form of small black cylinders near the nest entrances whose mounds they had previously removed, the ants did not invest in building new ones. Apparently, the cylinders were sufficient for orientation.

In ant nests, labor is divided. Ants that go foraging are usually older and more experienced nest members, while younger ants are busy building. Therefore, there must be some kind of information flow between the two groups. The researchers do not yet know exactly how this is achieved. "One possibility would be that ants in the nest somehow notice that fewer foragers return home, and as a result, hill-building activities at the nest entrance are increased," says Marilia Freire.

Read more at Science Daily