Aug 11, 2023

After seventeen years, a spacecraft makes its first visit home

On Aug. 12, 2023, NASA's STEREO-A spacecraft will pass between the Sun and Earth, marking the first Earth flyby of the nearly 17-year-old mission. The visit home brings a special chance for the spacecraft to collaborate with NASA missions near Earth and reveal new insights into our closest star.

The twin STEREO (Solar TErrestrial RElations Observatory) spacecraft launched on Oct. 25, 2006, from Cape Canaveral Air Force Station in Florida. STEREO-A (for "Ahead") pulled ahead of Earth while STEREO-B (for "Behind") lagged behind, both charting Earth-like orbits around the Sun.

During the first years after launch, the dual-spacecraft mission achieved its landmark goal: providing the first stereoscopic, or multiple-perspective, view of our closest star. On Feb. 6, 2011, the mission achieved another landmark: STEREO-A and -B reached a 180-degree separation in their orbits. For the first time, humanity saw our Sun as a complete sphere.

"Prior to that we were 'tethered' to the Sun-Earth line -- we only saw one side of the Sun at a time," said Lika Guhathakurta, STEREO program scientist at NASA Headquarters in Washington, D.C. "STEREO broke that tether and gave us a view of the Sun as a three-dimensional object."

The mission accomplished many other scientific feats over the years, and researchers studied both spacecraft views until 2014, when mission control lost contact with STEREO-B after a planned reset. However, STEREO-A continues its journey, capturing solar views unavailable from Earth.

On Aug. 12, 2023, STEREO-A's lead on Earth will have grown to one full revolution as the spacecraft "laps" us in our orbit around the Sun. In the few weeks before and after STEREO-A's flyby, scientists are seizing the opportunity to ask questions normally beyond the mission's reach.

A 3D View of the Sun

During the Earth flyby, STEREO-A will once again do something it used to do with its twin in the early years: combine views to achieve stereoscopic vision.

Stereoscopic vision allows us to extract 3D information from two-dimensional, or flat, images. It's how two eyeballs, looking out at the world from offset locations, create depth perception. Your brain compares the images from each eye, and the slight differences between those images reveal which objects are closer or farther away.

STEREO-A will enable such 3D viewing by synthesizing its views with those of NASA's and the European Space Agency's Solar and Heliospheric Observatory (SOHO) and NASA's Solar Dynamics Observatory (SDO). Better yet, STEREO-A's distance from Earth changes throughout the flyby, optimizing its stereo vision for different-sized solar features at different times. It's as if scientists were adjusting the focus of a telescope several million miles wide.
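The triangulation behind this stereo vision can be sketched in a few lines. The numbers below are purely illustrative, not actual mission geometry: given the baseline between two viewpoints and the apparent angular shift (parallax) of a feature, its distance follows from small-angle trigonometry.

```python
import math

def depth_from_parallax(baseline_km: float, parallax_arcsec: float) -> float:
    """Distance to a feature seen from two vantage points separated by
    baseline_km, given its apparent angular shift between the two views.
    Simple triangulation: d = baseline / tan(parallax)."""
    parallax_rad = math.radians(parallax_arcsec / 3600.0)
    return baseline_km / math.tan(parallax_rad)

# Hypothetical example: a 1-million-km baseline and a 1-arcsecond shift
distance = depth_from_parallax(1.0e6, 1.0)
```

A longer baseline resolves finer depth differences, which is why a spacecraft whose separation from Earth keeps changing acts like an instrument with an adjustable focus.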

STEREO scientists are using the opportunity to make much-needed measurements. They are identifying active regions, the magnetically complex regions underlying sunspots, hoping to uncover 3D information about their structure usually lost in 2D images. They'll also test a new theory that coronal loops -- giant arches often seen in close-up images of the Sun -- aren't what they appear to be.

"There is a recent idea that coronal loops might just be optical illusions," said Terry Kucera, STEREO project scientist at NASA's Goddard Space Flight Center in Greenbelt, Maryland. Some scientists have suggested that our limited viewing angles make them appear to have shapes they may not truly have. "If you look at them from multiple points of view, that should become more apparent," Kucera added.

Inside a Solar Eruption

It's not just what STEREO-A will see as it flies by Earth, but also what it will "feel," that could lead to major discoveries.

When a plume of solar material known as a coronal mass ejection, or CME, arrives at Earth, it can disrupt satellite and radio signals, or even cause surges in our power grids. Or, it may have hardly any effect at all. It all depends on the magnetic field embedded within it, which can change dramatically in the 93 million miles between the Sun and Earth.

To understand how a CME's magnetic field evolves on the way to Earth, scientists build computer models of these solar eruptions, updating them with each new spacecraft observation. But a single spacecraft's data can only tell us so much.

"It's like the parable about the blind men and the elephant -- the one who feels the legs says 'it's like a tree trunk,' and the one who feels the tail says 'it's like a snake,'" said Toni Galvin, a professor at the University of New Hampshire and principal investigator for one of STEREO-A's instruments. "That's what we're stuck with right now with CMEs, because we typically only have one or two spacecraft right next to each other measuring it."

During the months before and after STEREO-A's Earth flyby, any Earth-directed CMEs will pass over STEREO-A and other near-Earth spacecraft, giving scientists much-needed multipoint measurements from inside a CME.

A Fundamentally Different Sun

STEREO-A was also close to Earth in 2006, shortly after launch. That was during "solar minimum," the low-point in the Sun's roughly 11-year cycle of high and low activity.

"The Sun was so quiet at that point! I was looking back at the data and I said 'Oh yeah, I recognize that active region' -- there was one, and we studied it," Kucera said, laughing. "OK, it wasn't quite that bad -- but it was close."

Now, as we approach solar maximum predicted for 2025, the Sun isn't quite so sleepy.

Read more at Science Daily

Climate protection: Land use changes cause the carbon sink to decline

Terrestrial carbon sinks can mitigate the greenhouse effect. Researchers at the Karlsruhe Institute of Technology (KIT) and other research institutions pooled various data sources and found that European carbon storage takes place mainly in the surface biomass of Eastern Europe. However, changes in land use in particular have caused this carbon sink to decline. The researchers report in Communications Earth & Environment.

Forests can bind large amounts of carbon on the land surface. In this way, they contribute decisively to reducing net greenhouse gas emissions. For some areas, however, data are still lacking. In Eastern Europe in particular, the network of measurement stations is very sparse, so little has been known about carbon fluxes and their drivers there. "But Eastern European forests have great potential as a long-term carbon sink," says Karina Winkler from the Atmospheric Environmental Research Department of the Institute of Meteorology and Climate Research (IMK-IFU), KIT's Campus Alpine in Garmisch-Partenkirchen. "Political upheavals in Eastern Europe, however, have caused big changes in land use. Moreover, climate change there increasingly affects the forests. This unique interaction of socioeconomic and climatic factors influences the carbon sinks."

Study Area Covers 13 Countries

Researchers from IMK-IFU's Land Use Change & Climate Group, together with researchers from other European research institutions, have now recalculated the carbon sinks in Eastern Europe. The study area covers 13 countries, from Poland in the west to the Russian Ural Mountains in the east, and from the Kola Peninsula in the north to Romania in the south. The calculations are based on different data sources, such as models, satellite-based biomass estimates, forest inventories, and national statistics.

"From the datasets, we concluded that Eastern Europe stored most of Europe's carbon from 2010 to 2019," Winkler says. A comparison of carbon balances revealed that the land surface in Eastern Europe bound about 410 million tons of carbon in biomass every year. This corresponds to about 78 percent of the carbon sink of Europe as a whole. The biggest carbon sinks are found in the border region of Ukraine, Belarus and Russia, in the southern Ural Mountains, and on the Kola Peninsula.

Timber Extraction Has the Biggest Influence on the Carbon Sink in Eastern Europe

However, the data also show that carbon absorption in Eastern Europe was anything but constant over time and has in fact declined: the Eastern European carbon sink is shrinking. To determine the causes, the researchers compared the trends in carbon changes with land-use factors, such as land conversion for agriculture, timber extraction, and the share of abandoned agricultural areas, as well as with environmental factors, such as temperature, precipitation, soil moisture, and carbon dioxide (CO2) and nitrogen concentrations in the atmosphere.

They found that environmental factors, such as changes in soil moisture, have a big influence on the carbon balance. Still, the spatial patterns of the carbon sink in Eastern Europe can be explained mainly by land-use changes. From 2010 to 2019, timber extraction had the biggest influence on the land-based carbon sink in Eastern Europe. The data analysis suggests that an increase in timber extraction in western Russia and reduced forest growth on former agricultural areas caused the Eastern European carbon sink to decline between 2010 and 2019.

Read more at Science Daily

New orally available drug for spinal cord injury found to be safe and tolerable in healthy participants

New research from the Institute of Psychiatry, Psychology & Neuroscience (IoPPN) at King's College London has demonstrated the safety and tolerability of a new drug treatment designed as a therapeutic intervention for spinal cord injury (SCI).

The research, published in British Journal of Clinical Pharmacology, found that the KCL-286 drug -- which works by activating retinoic acid receptor beta (RARb) in the spine to promote recovery -- was well tolerated by participants in a Phase 1 clinical trial, with no severe side effects. Researchers are now seeking funding for a Phase 2a trial studying the safety and tolerability of the drug in those with SCI.

The global incidence of SCI is estimated at between 0.7 and 1.2 million cases per year, with falls and road accidents being the major causes. Although SCI incurs a cost of $4 billion per year in direct healthcare and indirect costs (i.e. inability to work and social care) in the US alone, there are no licensed drugs that can tackle the intrinsic failure of the adult central nervous system to regenerate, and SCI remains a largely unmet clinical need.

Previous research by various groups has shown that nerve growth can be stimulated by activating the RARb2 receptor, but no drug suitable for humans has been developed. KCL-286, an RARb2 agonist, was developed by Professor Corcoran and his team and used in a first-in-human study to test its safety.

A total of 109 healthy males were assigned to one of two trial arms: a single ascending dose (SAD) adaptive design with a food interaction (FI) arm, and a multiple ascending dose (MAD) arm. Participants in each arm were further divided into different dose groups.

SAD studies are designed to establish the safe dosage range of a medicine by providing participants with small doses before gradually increasing the dose provided. Researchers look for any side effects, and measure how the medicine is processed within the body. MAD studies explore how the body interacts with repeated administration of the drug, and investigate the potential for a drug to accumulate within the body.

Researchers found that participants were able to safely take 100mg doses of KCL-286, with no severe adverse events.

Professor Jonathan Corcoran, Professor of Neuroscience and Director of the Neuroscience Drug Discovery Unit, at King's IoPPN and the study's senior author said, "This represents an important first step in demonstrating the viability of KCL-286 in treating spinal cord injuries. This first-in-human study has shown that a 100mg dose delivered via a pill can be safely taken by humans. Furthermore, we have also shown evidence that it engages with the correct receptor.

"Our focus can hopefully now turn to researching the effects of this intervention in people with spinal cord injuries."

Dr. Bia Goncalves, a senior scientist and project manager of the study, and the study's first author from King's IoPPN said, "Spinal Cord Injuries are a life changing condition that can have a huge impact on a person's ability to carry out the most basic of tasks, and the knock-on effects on their physical and mental health are significant.

"The outcomes of this study demonstrate the potential for therapeutic interventions for SCI, and I am hopeful for what our future research will find."

Read more at Science Daily

How a massive North Atlantic cooling event disrupted early human occupation in Europe

A new study published in the journal Science finds that around 1.12 million years ago a massive cooling event in the North Atlantic and corresponding shifts in climate, vegetation and food resources disrupted early human occupation of Europe.

The study published by an international group of scientists from the UK, South Korea and Spain presents observational and modelling evidence documenting that unprecedented climate stress changed the course of early human history.

Archaic humans, known as Homo erectus, moved from Africa into central Eurasia around 1.8 million years ago. From there they spread towards western Europe, reaching the Iberian Peninsula around 1.5 million years ago (Ma). Experiencing initially rather mild climatic conditions, these groups eventually established a foothold in southern Europe, as documented by several dated fossils and stone tools from this period. But given the increasing intensity of glacial cycles in Europe from 1.2 Ma onwards, it remains unknown for how long early humans lived in this area and whether the occupation was interrupted by worsening climate conditions.

To better understand the environmental conditions that early human species in Europe experienced, the team of pollen experts, oceanographers, climate modelers, archeologists, and anthropologists combined data from a deep-ocean sediment core from the eastern subtropical Atlantic with new supercomputer climate-model and human-habitat-model simulations covering the period of the depopulation event.

Sieving through thousands of tiny pollen grains stored in the ocean sediment core and analyzing preserved temperature-sensitive organic compounds left by tiny algae that lived over a million years ago, the scientists discovered that around 1.127 million years ago the climate over the eastern North Atlantic and the adjacent land suddenly cooled by 7°C.

"This massive cooling marks one of the first terminal stadial events in the paleoclimatic record. It occurred during the last phase of a glacial cycle, when ice-sheets disintegrated, releasing large amounts of freshwater into the ocean, and causing ocean circulation changes and a southward expansion of sea ice," says Prof. Chronis Tzedakis from University College London (UCL), senior author of the study.

The pollen data extracted from the ocean sediment core further add to this scenario: "Rivers and winds bring tiny pollen grains from the adjacent land to the ocean, where they sink and get deposited in the deep ocean. According to our ocean sediment core pollen analysis, the North Atlantic cooling event switched western European vegetation to an inhospitable semi-desert landscape," adds Dr. Vasiliki Margari from UCL, lead author of the study.

To quantify how early humans may have reacted to such an unprecedented climate anomaly, scientists from the IBS Center for Climate Physics (ICCP) in South Korea conducted new computer model simulations for this period. By adding glacial freshwater to the North Atlantic, Dr. Kyung-Sook Yun and Ms. Hyuna Kim from the ICCP were able to reproduce key features of the terminal stadial event, such as the cooling and drying over southern Europe. "We then used this global climate model simulation as an input for a human habitat model, which determines whether certain environmental conditions were suitable for early Homo erectus or not. We found that over many areas of southern Europe, early human species such as Homo erectus would not have been able to survive," says Prof. Axel Timmermann, Director of the ICCP at Pusan National University and co-corresponding author of the study.

Even though the cooling event lasted only about 4,000 years, a lack of stone tools and human remains over the next 200,000 years further raises the possibility of a long-lasting hiatus in European occupation. Europe was repopulated around 900,000 years ago by a group often referred to as Homo antecessor. This group and its descendants were much more resilient, as they were able to adapt to the increasing intensity of glacial conditions over Europe.

Read more at Science Daily

Aug 10, 2023

Physicists demonstrate how sound can be transmitted through vacuum

A classic movie was once promoted with the punchline: "In space, no one can hear you scream." Physicists Zhuoran Geng and Ilari Maasilta from the Nanoscience Center at the University of Jyväskylä, Finland, have demonstrated, on the contrary, that in certain situations sound can be transmitted strongly across a vacuum region!

In a recent publication they show that in some cases a sound wave can jump, or "tunnel," fully across a vacuum gap between two solids if the materials in question are piezoelectric. In such materials, vibrations (sound waves) also produce an electrical response, and since an electric field can exist in a vacuum, it can carry the sound waves across. The requirement is that the size of the gap is smaller than the wavelength of the sound wave. The effect works not only in the audio range of frequencies (Hz-kHz), but also at ultrasound (MHz) and hypersound (GHz) frequencies, as long as the vacuum gap is made smaller as the frequency increases.
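The gap-versus-wavelength condition is easy to make concrete. In this rough sketch, the 5,000 m/s sound speed is an assumed, typical value for a stiff solid, not a figure from the study; the maximum usable gap is simply the acoustic wavelength, lambda = v / f.

```python
def max_gap_m(sound_speed_m_s: float, freq_hz: float) -> float:
    """Upper bound on the vacuum gap for acoustic tunneling:
    the gap must stay below the wavelength, lambda = v / f."""
    return sound_speed_m_s / freq_hz

# Assumed sound speed of ~5000 m/s in a stiff piezoelectric solid
audio_gap = max_gap_m(5000.0, 1e3)   # kHz range: gaps up to metres
ultra_gap = max_gap_m(5000.0, 1e6)   # MHz range: millimetres
hyper_gap = max_gap_m(5000.0, 1e9)   # GHz range: micrometres
```

This is why the gap must shrink as the frequency rises: at GHz (hypersound) frequencies the permitted vacuum gap is only micrometres wide.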

"In most cases the effect is small, but we also found situations where the full energy of the wave jumps across the vacuum with 100% efficiency, without any reflections. As such, the phenomenon could find applications in microelectromechanical components (MEMS, smartphone technology) and in the control of heat," says Professor Ilari Maasilta from the Nanoscience Center at the University of Jyväskylä.

Read more at Science Daily

Measuring the extent of global droughts in unprecedented detail

While some parts of the world suffer extreme heat and persistent drought, others are being flooded. Overall, continental water volumes vary so much over time that global sea levels fluctuate significantly too. By combining the hydrological model WaterGAP with GRACE satellite data, a team of geodesists at the University of Bonn has produced a new dataset that shows, more accurately than ever before, how the total distribution of water over the Earth's land surfaces has changed over the past 20 years. Their findings are published in the Journal of Geodesy.

"The new method allows us to test out model calculations on the future effects of climate change, particularly how rising temperatures and changes in precipitation patterns will impact the water balance in different parts of the world," says Prof. Dr.-Ing. Jürgen Kusche from the Institute of Geodesy and Geoinformation at the University of Bonn. The process involves comparing climate models, which invariably cover a certain period of time in the past, with the results of actual measurements, and Kusche and his team are planning several such studies over the coming months.

The improved resolution that the team has achieved shows that droughts are significantly more common across the world than the GRACE satellite data would suggest in isolation. "What we're seeing is that even extensive droughts like the massive one that struck the whole of the Amazon in 2010 are spread across much wider areas than the satellite data indicates on its own," Kusche says. "This means that the satellites aren't picking up many of the more localized droughts."

Working together with counterparts from Goethe University Frankfurt and from Warsaw, Poland, a team of researchers from the University of Bonn has now combined satellite measurements with high-resolution meteorological data for the first time. "What's special about this method is that it's enabled us to improve the resolution of the water distribution maps that are generated from around 300 kilometers to 50 kilometers," explains Kusche, who is a member of the Modelling and Sustainable Futures Transdisciplinary Research Areas and the Regional Climate Change Collaborative Research Center at the University of Bonn. To do so, the researchers used the WaterGAP hydrological model developed at Goethe University Frankfurt plus a mathematical technique borrowed from weather forecasting.

Masses of water causing changes in the gravitational field

Between 2002 and 2017, the GRACE (Gravity Recovery and Climate Experiment) twin satellites measured changes in the Earth's gravitational field. Its successor mission, GRACE-FO, launched in 2018, and it was this data that the researchers from the University of Bonn used. Since the Earth's gravitational field depends on changes in mass, conclusions can be drawn about the water cycle close to the surface: gravity is affected by changes in groundwater and surface reservoirs and by melting glaciers.

"One unique advantage of the GRACE measurements is that they cover all kinds of reservoirs, i.e. including changes in groundwater reserves that are hidden deep below the Earth's surface and in tens of thousands of artificial lakes and wetlands," says Kusche's colleague Helena Gerdener. The disadvantage, she says, is that the spatial resolution of the gravitational-field data is relatively coarse, at about 300 to 350 kilometers, as a result of the measurement principle applied. This means that reliable statements can only be made for areas around 100,000 square kilometers in size. To give some idea of scale, this minimum area is larger than Bavaria, Germany's largest federal state, at "only" around 70,000 square kilometers.
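The jump from a ~300 km resolution to a ~100,000 km² minimum area follows from squaring the grid spacing, a rough rule of thumb rather than an exact property of the data:

```python
def min_reliable_area_km2(resolution_km: float) -> float:
    """Rule of thumb: a field resolved at r kilometres supports
    reliable statements only over areas of order r * r km^2."""
    return resolution_km ** 2

# GRACE-like resolution of ~300-350 km -> areas near 100,000 km^2
area_grace = min_reliable_area_km2(316.0)
# The improved 50 km resolution -> areas of roughly 2,500 km^2
area_new = min_reliable_area_km2(50.0)
```

On this reckoning, the six-fold improvement in linear resolution shrinks the smallest analyzable area by a factor of about forty.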

By contrast, global hydrological models permit a resolution of 50 kilometers or even less. These use meteorological measurements of precipitation, temperature and radiation as well as maps of land use and soil composition and data on how water is being used by industry, agriculture and other consumers. Hydrological models simulate evaporation as well as changes to water levels in the soil and groundwater-bearing strata, lakes, rivers and reservoirs. "However, the drawbacks of these models are that they can only reflect reality to a limited extent and meteorological measurements often contain systematic errors," Kusche says, for example if no data on the extraction of groundwater is made available.

For the first time, the researchers have now combined measurements from the GRACE and GRACE-FO satellites with the WaterGAP hydrological model, which itself integrates high-resolution meteorological data. This has enabled the resolution of the resulting water distribution maps to be improved to 50 kilometers. To do so, the researchers used a mathematical technique known as data assimilation, which is more commonly found in weather forecasting. However, the scientists did not simply average the results of the hydrological model and the satellite data. As Kusche explains: "The calculations from the hydrological model are adjusted so that you get close to the satellite data while modifying the physics that the hydrological model draws on as little as possible."
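The assimilation principle Kusche describes, pulling the model state toward the satellite observation in proportion to their relative uncertainties while perturbing the model as little as possible, can be illustrated with a toy one-dimensional, Kalman-style update. This is only a sketch of the general idea; the actual WaterGAP/GRACE scheme is far more elaborate, and the numbers are invented.

```python
def assimilate(model_state: float, observation: float,
               model_var: float, obs_var: float) -> float:
    """Toy scalar data-assimilation step: the analysis lies between
    model and observation, weighted by their error variances."""
    # The gain approaches 1 (trust the observation) when the model
    # is uncertain, and 0 (keep the model) when the observation is.
    gain = model_var / (model_var + obs_var)
    return model_state + gain * (observation - model_state)

# Hypothetical water-storage anomaly: an uncertain model estimate
# (variance 4) meets a more precise observation (variance 1), so the
# analysis lands most of the way toward the observation.
analysis = assimilate(10.0, 14.0, model_var=4.0, obs_var=1.0)  # -> 13.2
```

The key property, matching the quote above, is that the update is a minimal correction: the model is nudged only as far toward the data as the error statistics justify.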

Read more at Science Daily

Drops of seawater contain traces of an ancient world

Sea salt hides a secret: tiny droplets of the seawater from which it came, preserving geologic history.

Using specialized equipment obtained with National Science Foundation grant funds, Mebrahtu Weldeghebriel, PhD '22, a postdoctoral fellow at Princeton University, and Binghamton University Distinguished Professor of Earth Sciences Tim Lowenstein were able to reconstruct changes in seawater chemistry over the past 150 million years, also gaining insight into related geological processes and climate changes. Their article, "Seafloor Hydrothermal Systems Control Long-Term Changes in Seawater [Li+]: Evidence from Fluid Inclusions," was recently published in the journal Science Advances.

The ocean "is like a giant soup of different elements," Lowenstein explained. "Sodium and chloride are the most common ones, but there are dozens of others dissolved in seawater in trace amounts such as lithium."

They looked at sea salt (halite) formed at various times over the past 150 million years in geographically diverse sedimentary basins in the United States, Europe, Asia and Africa. Within the salt samples were tiny pockets containing a bit of ancient seawater.

To access the tiny droplets, the researchers used a laser to drill holes into the salt crystals and then a mass spectrometer to analyze the trace elements present. They focused specifically on the concentration of lithium, a trace element that underwent a seven-fold decrease over the past 150 million years, paralleled by a rise in magnesium-to-calcium ratios.

But why?

The cause for the long-term variations in seawater composition has been debated for the past two decades. The researchers proposed that the decline in lithium concentration in seawater is mainly associated with reduced production of oceanic crust and decreased seafloor hydrothermal activity, both of which are influenced by the movements of tectonic plates. The slowdown in plate activity over the past 150 million years led to less lithium being added to the ocean and reduced amounts of carbon dioxide released into the atmosphere, which ultimately led to global cooling and the present ice age. Turning back the clock 150 million years, the earth was a warmer place with more carbon dioxide in the atmosphere and more lithium in the sea.

"There is a close link between ocean chemistry and atmospheric chemistry," Weldeghebriel said. "Whatever changes happen in the ocean also reflect what's happening in the atmosphere."

Overall, Weldeghebriel and Lowenstein's research has made a significant advance in understanding the chemistry of Earth's ancient oceans and how the movement of tectonic plates has influenced the composition of our Earth's hydrosphere and atmosphere. Such chemical changes impact biology, as well, such as the marine creatures that build their shells out of calcium carbonate.

Read more at Science Daily

How psychedelic drugs affect a rat's brain

Researchers at Lund University have developed a technique for simultaneously measuring electrical signals from 128 areas of the brain in awake rats. They have then used the information to measure what happens to the neurons when the rats are given psychedelic drugs. The results show an unexpected and simultaneous synchronisation among neurons in several regions of the brain.

The idea that electrical oscillations in the brain could be used to teach us more about our experiences was conceived several years ago. Pär Halje and the research team were studying rats with Parkinson's disease that had problems with involuntary movements. The researchers discovered a tone -- an oscillation or wave in the electrical fields -- of 80 hertz in the brains of the rats with Parkinson's disease. It turned out that the wave was closely connected to the involuntary movements.

"A Polish researcher had observed similar waves after giving rats the anaesthetic ketamine. The ketamine was given at a low dose so that the rats were conscious, and the equivalent dose in a human causes psychedelic experiences. The waves they saw were in more cognitive regions of the brain than in the rats with Parkinson's, and the frequency was higher, but that still made us consider whether there were links between the two phenomena. Perhaps excessive brain waves in the motor regions of the brain cause motor symptoms, while excessive waves in cognitive regions give cognitive symptoms," says Pär Halje, researcher in neurophysiology at Lund University.

The research team that Pär Halje belongs to has developed a method that uses electrodes to simultaneously measure oscillations from 128 separate areas of the brain in awake rats. The electrical waves are caused by the cumulative activity in thousands of neurons, but the researchers also succeeded in isolating signals from individual neurons.

"For several of these areas, it is the first time anyone has successfully shown how individual neurons are affected by LSD in awake animals. When we gave the rats the psychedelic substances LSD and ketamine, the waves were clearly registered."

Collective wave patterns

Despite ketamine and LSD acting on different receptors in the brain -- they have completely different routes into the nervous system -- they produced the same wave patterns, even though the signals from individual cells differed. When the rats were given LSD, the researchers saw that their neurons were inhibited -- they signalled less -- in all parts of the brain. Ketamine seemed to have a similar effect on the large neurons -- pyramidal cells -- whose signalling was inhibited, while interneurons, smaller neurons that connect only locally within the tissue, increased their signalling.

Pär Halje interprets the results of the study, which is published in Communications Biology, to mean that the wave phenomenon is connected to the psychedelic experience.

"Activity in the individual neurons caused by ketamine and LSD looks quite different, and as such cannot be directly linked to the psychedelic experience. Instead, it seems to be this distinctive wave phenomenon -- how the neurons behave collectively -- that is most strongly linked to the psychedelic experience."

Research model for psychoses

Even if what is happening in individual cells is interesting, Pär Halje argues that the whole is bigger and more exciting than the individual parts.

"The oscillations behave in a strange way. One might think that a strong wave starts somewhere, which then spreads to other parts of the brain. But instead, we see that the neurons' activity synchronises itself in a special way -- the waves in the brain go up and down essentially simultaneously in all parts of the brain where we are able to take measurements. This suggests that there are other ways in which the waves are communicated than through chemical synapses, which are relatively slow."
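One simple way to quantify the near-simultaneous rise and fall Halje describes is to find the lag at which the cross-correlation between two recorded signals peaks; a peak at or near zero lag indicates zero-phase-lag synchrony. The sketch below uses synthetic 80 Hz sinusoids as stand-ins, not the study's actual recordings or analysis pipeline.

```python
import math

def peak_corr_lag(x, y, max_lag):
    """Lag (in samples) at which the cross-correlation of x and y peaks.
    A peak at lag 0 means the two signals rise and fall together."""
    n = len(x)
    best_lag, best_val = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        # Sum products over the overlapping portion of the two signals
        val = sum(x[i] * y[i + lag]
                  for i in range(max(0, -lag), min(n, n - lag)))
        if val > best_val:
            best_lag, best_val = lag, val
    return best_lag

# Two synthetic, in-phase 80 Hz oscillations sampled at 1 kHz for 1 s
fs, f = 1000, 80
x = [math.sin(2 * math.pi * f * t / fs) for t in range(fs)]
y = [math.sin(2 * math.pi * f * t / fs) for t in range(fs)]
lag = peak_corr_lag(x, y, max_lag=20)  # -> 0 (synchronised)
```

If one region lagged another by even a few milliseconds, the peak would shift away from zero; a consistent zero-lag peak across many recording sites is what makes slow chemical-synapse relays an unlikely sole explanation.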

Pär Halje emphasises that it is difficult to know whether the waves cause hallucinations or are merely an indication of them. But, he argues, it opens up the possibility that this could be used as a research model for psychoses, where no good models exist today.

"Given how drastically a psychosis manifests itself, there ought to be a common pattern that we can measure. So far, we have not had that, but we now see a very specific oscillation pattern in rats that we are able to measure."

Read more at Science Daily

Aug 9, 2023

Webb reveals colors of Earendel, most distant star ever detected

NASA's James Webb Space Telescope has followed up on observations by the Hubble Space Telescope of the farthest star ever detected in the very distant universe, within the first billion years after the big bang. Webb's NIRCam (Near-Infrared Camera) instrument reveals the star to be a massive B-type star more than twice as hot as our Sun, and about a million times more luminous.

The star, which the research team has dubbed Earendel, is located in the Sunrise Arc galaxy and is detectable only due to the combined power of human technology and nature via an effect called gravitational lensing. Both Hubble and Webb were able to detect Earendel due to its lucky alignment behind a wrinkle in space-time created by the massive galaxy cluster WHL0137-08. The galaxy cluster, located between us and Earendel, is so massive that it warps the fabric of space itself, which produces a magnifying effect, allowing astronomers to look through the cluster like a magnifying glass.

While other features in the galaxy appear multiple times due to the gravitational lensing, Earendel appears only as a single point of light even in Webb's high-resolution infrared imaging. Based on this, astronomers determined that the object is magnified by a factor of at least 4,000, and thus is extremely small -- the most distant star ever detected, observed 1 billion years after the big bang. The previous record-holder for the most distant star was detected by Hubble and observed around 4 billion years after the big bang. Another research team using Webb recently identified a gravitationally lensed star they nicknamed Quyllur, a red giant star observed 3 billion years after the big bang.

Stars as massive as Earendel often have companions. Astronomers did not expect Webb to reveal any companions of Earendel since they would be so close together and indistinguishable on the sky. However, based solely on the colors of Earendel, astronomers think they see hints of a cooler, redder companion star. This light has been stretched by the expansion of the universe to wavelengths longer than Hubble's instruments can detect, and so was only detectable with Webb.

Webb's NIRCam also shows other notable details in the Sunrise Arc, which is the most highly magnified galaxy yet detected in the universe's first billion years. Features include both young star-forming regions and older established star clusters as small as 10 light-years across. On either side of the wrinkle of maximum magnification, which runs right through Earendel, these features are mirrored by the distortion of the gravitational lens. The region forming stars appears elongated, and is estimated to be less than 5 million years old. Smaller dots on either side of Earendel are two images of one older, more established star cluster, estimated to be at least 10 million years old. Astronomers determined this star cluster is gravitationally bound and likely to persist until the present day. This shows us how the globular clusters in our own Milky Way might have looked when they formed 13 billion years ago.

Astronomers are currently analyzing data from Webb's NIRSpec (Near-Infrared Spectrograph) instrument observations of the Sunrise Arc galaxy and Earendel, which will provide precise composition and distance measurements for the galaxy.

Read more at Science Daily

An early warning system for joint heat and ozone extremes in China

High temperatures exacerbate ground-level ozone production, resulting in a deadly combination of extreme heat and poor air quality that is especially dangerous for children, seniors, and people suffering from preexisting respiratory illnesses.

Like most of the globe, China is dealing with increasing temperatures and longer and more frequent heat waves. But, because of its rapid, energy-intensive development, it's also seeing increased production of the main precursors of ozone, volatile organic compounds (VOCs) and oxides of nitrogen (NOx). In a country as populous as China, this combination poses a serious threat to human health, especially in large urban areas such as Beijing.

Now, a team of collaborating researchers from the Harvard-China Project at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and Hong Kong Baptist University has identified large-scale climate patterns that could be used to predict the co-occurrence of extreme heat and ozone days in China months before they occur. Like predictions for hurricane and wildfire seasons, the forecasts could help the government prepare resources and implement policies to mitigate the severity of the season.

The research was published recently in the Proceedings of the National Academy of Sciences.

"We've already seen record-breaking heat waves around the globe this summer, including in China, where local emissions have led to substantial ozone pollution," said Fan Wang, a visiting fellow at SEAS and the Harvard-China Project, Ph.D. candidate at Hong Kong Baptist University, and co-lead author of the study. "Our research could have important implications in the future that would allow agencies such as the Ministry of Ecology and Environment in China to prepare for high summer heat and ozone in springtime."

The research team, led by Michael McElroy, the Gilbert Butler Professor of Environmental Studies at SEAS and the faculty chair of the Harvard-China Project, and Meng Gao, professor at Hong Kong Baptist University and former postdoctoral researcher at SEAS, looked to past meteorological data and daily ozone levels to spot patterns that could be used to predict the season.

Because of the lack of long-term daily observations of ground-level ozone concentrations, the researchers used a sophisticated machine learning model to reconstruct levels back to 2005. Using this dataset, the team identified patterns in sea surface warming in the western Pacific Ocean, the western Indian Ocean and the Ross Sea, off the coast of Antarctica, that preceded summers with high heat and ozone in northeast China, including Beijing.

Warm sea surface temperatures in these regions lead to a decrease in precipitation, cloud cover and circulation across this region of China, known as the North China Plain, which is home to roughly 300 million people.

"These sea surface temperature anomalies influence precipitation, radiation and more, which modulate the co-occurrence of heat waves and ozone pollution," said Gao, co-first author of the paper and Associate of the Harvard-China Project.

The team's model correlated these anomalies with increases in heat waves and ozone about 80 percent of the time.

Governmental agencies could use these predictions not only to issue warnings for human health and agriculture but also to reduce ozone and its precursors in the atmosphere before the extreme heat waves hit.

"The ability to forecast prospects for unusually hot summers and unusually high levels of summertime ozone in China simply on the basis of patterns of temperature observed months earlier in remote regions of the ocean is truly exciting," said McElroy.

Read more at Science Daily

New Antarctic extremes 'virtually certain' as world warms

Extreme events in Antarctica such as ocean heatwaves and ice loss will almost certainly become more common and more severe, researchers say.

With drastic action now needed to limit global warming to the Paris Agreement target of 1.5°C, the scientists warn that recent extremes in Antarctica may be the tip of the iceberg.

The study reviews evidence of extreme events in Antarctica and the Southern Ocean, including weather, sea ice, ocean temperatures, glacier and ice shelf systems, and biodiversity on land and sea.

It concludes that Antarctica's fragile environments "may well be subject to considerable stress and damage in future years and decades" -- and calls for urgent policy action to protect it.

"Antarctic change has global implications," said lead author Professor Martin Siegert, from the University of Exeter. "Reducing greenhouse gas emissions to net zero is our best hope of preserving Antarctica, and this must matter to every country -- and individual -- on the planet."

Professor Siegert said the rapid changes now happening in Antarctica could place many countries in breach of an international treaty.

"Signatories to the Antarctic Treaty (including the UK, USA, India and China) pledge to preserve the environment of this remote and fragile place," he said.

"Nations must understand that by continuing to explore, extract and burn fossil fuels anywhere in the world, the environment of Antarctica will become ever more affected in ways inconsistent with their pledge."

The researchers considered the vulnerability of Antarctica to a range of extreme events, to understand the causes and likely future changes -- following a series of recent extremes.

For example, the world's largest recorded heatwave (38.5°C above the mean) occurred in East Antarctica in 2022 and, at present, winter sea ice formation is the lowest on record.

Extreme events can also affect biodiversity. For example, high temperatures have been linked to years with lower krill numbers, leading to breeding failures of krill-reliant predators -- evidenced by many dead fur seal pups on beaches.

Co-author Professor Anna Hogg, from the University of Leeds, said: "Our results show that while extreme events are known to impact the globe through heavy rainfall and flooding, heatwaves and wildfires, such as those seen in Europe this summer, they also impact the remote polar regions.

"Antarctic glaciers, sea ice and natural ecosystems are all impacted by extreme events. Therefore, it is essential that international treaties and policy are implemented in order to protect these beautiful but delicate regions."

Dr Caroline Holmes, a sea ice expert at British Antarctic Survey, said: "Antarctic sea ice has been grabbing headlines in recent weeks, and this paper shows how sea ice records -- first record highs but, since 2017, record lows -- have been tumbling in Antarctica for several years.

"On top of that, there are deep interconnections between extreme events in different aspects of the Antarctic physical and biological system, almost all of them vulnerable to human influence in some way."

The retreat of Antarctic sea ice will make new areas accessible by ships, and the researchers say careful management will be required to protect vulnerable sites.

The European Space Agency and European Commission Copernicus Sentinel satellites are an essential tool for regular monitoring of the whole Antarctic region and Southern Ocean.

Read more at Science Daily

The more you walk, the lower your risk of early death, even if you walk fewer than 5,000 steps

The number of steps you should walk every day to start seeing benefits to your health is lower than previously thought, according to the largest analysis to investigate this.

The study, published in the European Journal of Preventive Cardiology [1] today (Wednesday), found that walking at least 3,967 steps a day started to reduce the risk of dying from any cause, and 2,337 steps a day reduced the risk of dying from diseases of the heart and blood vessels (cardiovascular disease).

However, the new analysis of 226,889 people from 17 different studies around the world has shown that the more you walk, the greater the health benefits. The risk of dying from any cause or from cardiovascular disease decreases significantly with every 500 to 1,000 extra steps you walk. An increase of 1,000 steps a day was associated with a 15% reduction in the risk of dying from any cause, and an increase of 500 steps a day was associated with a 7% reduction in dying from cardiovascular disease.
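To see how these per-increment figures compound, the per-step effect can be sketched numerically. This is a simplifying illustration, not the study's actual statistical model: it assumes the reported hazard ratios (0.85 per extra 1,000 steps for all-cause mortality, 0.93 per extra 500 steps for cardiovascular mortality) apply log-linearly, i.e. each increment multiplies risk by a constant factor.

```python
def risk_multiplier(extra_steps: float, hazard_ratio: float, increment: float) -> float:
    """Relative risk after walking `extra_steps` more per day,
    given a hazard ratio per `increment` steps (log-linear assumption)."""
    return hazard_ratio ** (extra_steps / increment)

# All-cause mortality: HR 0.85 per 1,000 extra steps/day
print(risk_multiplier(1000, 0.85, 1000))  # 0.85  -> 15% lower risk
print(risk_multiplier(3000, 0.85, 1000))  # ~0.614 -> ~39% lower risk

# Cardiovascular mortality: HR 0.93 per 500 extra steps/day
print(risk_multiplier(1000, 0.93, 500))   # ~0.865 -> ~13.5% lower risk
```

Under this assumption the benefit compounds multiplicatively rather than adding up linearly, which is consistent with the study's finding that benefits continue to accrue at high step counts without a clear upper limit.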

The researchers, led by Maciej Banach, Professor of Cardiology at the Medical University of Lodz, Poland, and Adjunct Professor at the Ciccarone Center for the Prevention of Cardiovascular Disease, Johns Hopkins University School of Medicine, found that even if people walked as many as 20,000 steps a day, the health benefits continued to increase. They have not found an upper limit yet.

"Our study confirms that the more you walk, the better," says Prof. Banach. "We found that this applied to both men and women, irrespective of age, and irrespective of whether you live in a temperate, sub-tropical or sub-polar region of the world, or a region with a mixture of climates. In addition, our analysis indicates that as little as 4,000 steps a day are needed to significantly reduce deaths from any cause, and even fewer to reduce deaths from cardiovascular disease."

There is strong evidence that a sedentary lifestyle may contribute to an increase in cardiovascular disease and a shorter life. Studies have shown that insufficient physical activity affects more than a quarter of the world's population. More women than men (32% versus 23%), and people in higher income countries compared to low-income countries (37% versus 16%) do not undertake a sufficient amount of physical activity. According to World Health Organization data, insufficient physical activity is the fourth most frequent cause of death in the world, with 3.2 million deaths a year related to physical inactivity. The COVID-19 pandemic also resulted in a reduction in physical activity, and activity levels have not recovered two years on from it.

Dr Ibadete Bytyçi from the University Clinical Centre of Kosovo, Pristina, Kosovo, senior author of the paper, says: "Until now, it's not been clear what is the optimal number of steps, both in terms of the cut-off points over which we can start to see health benefits, and the upper limit, if any, and the role this plays in people's health. However, I should emphasise that there were limited data available on step counts up to 20,000 a day, and so these results need to be confirmed in larger groups of people."

This meta-analysis is the first not only to assess the effect of walking up to 20,000 steps a day, but also to look at whether there are any differences depending on age, sex or where in the world people live.

The studies analysed by the researchers followed up participants for a median of seven years. The mean (average) age was 64, and 49% of participants were female.

In people aged 60 years or older, the size of the reduction in risk of death was smaller than that seen in people aged younger than 60 years. In the older adults, there was a 42% reduction in risk seen in those who walked between 6,000 and 10,000 steps a day, while there was a 49% reduction in risk in younger adults who walked between 7,000 and 13,000 steps a day.

Prof. Banach says: "In a world where we have more and more advanced drugs to target specific conditions such as cardiovascular disease, I believe we should always emphasise that lifestyle changes, including diet and exercise, which was a main hero of our analysis, might be at least as, or even more effective in reducing cardiovascular risk and prolonging lives. We still need good studies to investigate whether these benefits may exist for intensive types of exertion, such as marathon running and iron man challenges, and in different populations of different ages, and with different associated health problems. However, it seems that, as with pharmacological treatments, we should always think about personalising lifestyle changes."

Strengths of the meta-analysis include its size and that it was not restricted to looking at studies limited to a maximum of 16,000 steps a day. Limitations include that it was an observational study and so cannot prove that increased step counts cause the reduction in the risk of death, only that it is associated with it. The impact of step counts was not tested on people with different diseases; all the participants were generally healthy when they entered the studies analysed. The researchers were not able to account for differences in race and socioeconomic status, and the methods for counting steps were not identical in all the studies included in this meta-analysis.

Read more at Science Daily

Aug 8, 2023

The trilobites' guide to surviving environmental change

Scientists have worked out how one unusual species of trilobite -- an ancient, sea-dwelling relative of spiders and lobsters -- was able to defend itself against predators and survive a bumpy ride as Earth's oxygen levels fluctuated.

The seas were filled with trilobites for nearly 300 million years starting in the Cambrian Period, some 520 million years ago. During their time on Earth, which lasted much longer than the dinosaurs, they survived two major episodes of mass extinctions and dominated ocean floor ecosystems.

Their armored bodies are divided into three sections: a head, a thorax or middle section, and a rigid tail. There are more than 20,000 known trilobite species and, when mature, most of them have a very specific number of segments in their mid-sections. However, in Aulacopleura koninckii, scientists discovered something unusual.

Though each early growth stage showed little variation in size and shape, mature Aulacopleura developed anywhere between 18 and 22 mid-section segments.

"My collaborators and I thought this species was weird. We couldn't understand why Aulacopleura bodies varied and others living at the same time had a constant number," said Nigel Hughes, UC Riverside paleobiologist and corresponding author of a new study about this trilobite.

"Seeing trilobites with variable numbers of segments in the thorax is like seeing humans born with different numbers of vertebrae in their backs," Hughes said.

The researchers had questions about this anomaly, how it affected the animals' ability to protect itself, and why it might have developed in this way. These questions are answered in a new study published in the Proceedings of the Royal Society B: Biological Sciences.

Like modern pillbugs or "rollie pollies," trilobites curled up into a ball shape to protect themselves from large squid-like creatures, fish, and other predators. When rolled up, they could tuck their tails neatly under their heads, so the soft tissues were protected by their hard exterior skeletons. In the case of Aulacopleura, 3D modelling showed that protection during rolling up was restricted to smaller, immature forms with fewer than 18 segments in the middle.

"As the number of segments increased, the body proportions did not allow them to tuck their posteriors neatly under their heads and still be completely shielded," Hughes said. "So, why did this species keep adding segments anyway, and how could it survive the nasty predators?"

Based on their virtual reconstructions, it seems highly likely that when Aulacopleura with a large number of mid-segments felt threatened, they would roll up like their relatives and simply let their tails extend past their heads, minimizing the exposed gap.

"Other possible defense maneuvers would have left gaps on the sides that exposed critical organs -- highly unlikely," Hughes said.

As to the question of why this trilobite varied in the number of mid-section segments, the researchers turned to their earlier work. "What is underneath these segments? Legs that serve as gills!" Hughes said. "The more segments, the more surface area for respiration."

Growing additional breathing apparatus likely gave these animals the ability to tolerate dips in local seafloor oxygen levels that excluded other species, such as those that preyed on larger Aulacopleura. When parts of the sea floor became anoxic, predators were forced to retreat to sites where oxygen remained sufficient. But larger Aulacopleura, with their extra gills, could stay put, predator-free.

Learning how this species adapted to both biological and physical pressures gives researchers a better understanding of how survival strategies evolve. The way trilobites developed holds clues to how the common ancestor to major groups of modern arthropods, including insects and arachnids, first evolved.

Read more at Science Daily

Carbon dioxide -- not water -- triggers explosive basaltic volcanoes

Geoscientists have long thought that water -- along with shallow magma stored in Earth's crust -- drives volcanoes to erupt. Now, thanks to newly developed research tools at Cornell, scientists have learned that gaseous carbon dioxide can trigger explosive eruptions.

A new model suggests that basaltic volcanoes, typically located on the interior of tectonic plates, are fed by a deep magma within the mantle, stored about 20 to 30 kilometers below Earth's surface.

The research, which offers a clearer picture of our planet's deep internal dynamics and composition, with implications for improving volcanic-hazards planning, will be published August 7, 2023, at 3:00 p.m. ET in the Proceedings of the National Academy of Sciences.

"We used to think all the action happened in the crust," said senior author Esteban Gazel, the Charles N. Mellowes Professor in Engineering in the Department of Earth and Atmospheric Sciences, in Cornell Engineering. "Our data implies the magma comes directly from the mantle -- passing fast through the crust -- driven by the exsolution (the process by which gas separates from liquid) of carbon dioxide.

"This completely changes the paradigm of how these eruptions happen," Gazel said. "All volcanic models had been dominated by water as the main eruption driver, but water has little to do with these volcanoes. It's carbon dioxide that brings this magma from the deep Earth."

About four years ago, Gazel and Charlotte DeVitre, Ph.D. '22, now a postdoctoral researcher at the University of California, Berkeley, developed a high-precision carbon dioxide densimeter (which measures density in a tiny vessel) for Raman spectroscopy (a technique that examines scattered photons through a microscope).

The natural samples -- microscopic carbon dioxide-rich bubbles trapped in crystals from the volcanic eruption -- are then measured via Raman spectroscopy and quantified by applying the newly developed densimeter. Essentially, the scientists are examining a microscopic time capsule that provides a history of the magma. This new technique is critical for near real-time, precise estimations of magma storage, and was tested by Gazel's group during the 2021 eruption on La Palma, in the Canary Islands.

Further, the scientists developed methods to assess the effect of laser heating on carbon-dioxide rich inclusions (found swathed in the crystals), and to accurately assess melt inclusion and bubble volumes. They also developed an experimental reheating method to increase accuracy and properly account for carbon dioxide trapped as carbonate crystals inside the bubbles.

"The method of development and instrument design were challenging, especially during the height of the pandemic," Gazel said.

Using these new tools, the scientists scrutinized volcanic deposits from the Fogo volcano in Cabo Verde, west of Senegal in the Atlantic Ocean. They found a high concentration of volatiles in the micro-sized melt inclusions encased within the magnesium-iron silicate crystals. The higher amount of carbon dioxide enclosed in the crystals suggested that the magma was stored tens of kilometers below the surface -- within the Earth's mantle.

The group also discovered that this process is connected to the deep mantle sources that supply these volcanoes.

This implies that eruptions such as Fogo's volcanic flareups start in and are fed from the mantle, effectively bypassing storage in the Earth's crust and driven instead by deep carbon dioxide, according to the paper.

"These magmas have extremely low viscosities and come directly from the mantle," DeVitre said. "So here, viscosity and water cannot play the common roles that they do in shallower and/or more silicic (rich in silica) volcanic systems. Rather at Fogo volcano the magma must be driven up fast by the carbon dioxide and this likely plays a significant role in its explosive behavior. This is a major step in our understanding of the controls on basaltic explosivity."

Comprehending magma storage helps best prepare society for future eruptions, said Gazel, who is also a faculty fellow at the Cornell Atkinson Center for Sustainability.

"As deep magma storage will not be detected by ground deformation until the melt is close to surface," he said, "this has important repercussions to our understanding of volcanic hazards. We need to understand the drivers of these eruptions. The only way to see these processes now is by observing earthquakes, but earthquakes don't tell you exactly what's happening."

Said Gazel: "With precise measurements that tell us where eruptions start, where magmas melt and where they are stored -- and what triggers the eruption -- we can develop a much better plan for future eruptions."

Read more at Science Daily

Brain's 'appetite control center' different in people who are overweight or living with obesity

Cambridge scientists have shown that the hypothalamus, a key region of the brain involved in controlling appetite, is different in the brains of people who are overweight and people with obesity when compared to people who are a healthy weight.

The researchers say their findings add further evidence to the relevance of brain structure to weight and food consumption.

Current estimations suggest that over 1.9 billion people worldwide are either overweight or obese. In the UK, according to the Office for Health Improvement & Disparities, almost two-thirds of adults are overweight or living with obesity. This increases an individual's risk of developing a number of health problems, including type 2 diabetes, heart disease and stroke, cancer and poorer mental health.

A large number of factors influence how much we eat and the types of food we eat, including our genetics, hormone regulation, and the environment in which we live. What happens in our brains to tell us that we are hungry or full is not entirely clear, though studies have shown that the hypothalamus, a small region of the brain about the size of an almond, plays an important role.

Dr Stephanie Brown from the Department of Psychiatry and Lucy Cavendish College, University of Cambridge, said: "Although we know the hypothalamus is important for determining how much we eat, we actually have very little direct information about this brain region in living humans. That's because it is very small and hard to make out on traditional MRI brain scans."

The majority of evidence for the role of the hypothalamus in appetite regulation comes from animal studies. These show that there are complex interacting pathways within the hypothalamus, with different cell populations acting together to tell us when we are hungry or full.

To get around this, Dr Brown and colleagues used an algorithm developed using machine learning to analyse MRI brain scans taken from 1,351 young adults across a range of BMI scores, looking for differences in the hypothalamus when comparing individuals who are underweight, healthy weight, overweight and living with obesity.

In a study published today in Neuroimage: Clinical, the team found that the overall volume of the hypothalamus was significantly larger in the overweight and obese groups of young adults. In fact, the team found a significant relationship between volume of the hypothalamus and body-mass index (BMI).

These volume differences were most apparent in those sub-regions of the hypothalamus that control appetite through the release of hormones to balance hunger and fullness.

While the precise significance of the finding is unclear -- including whether the structural changes are a cause or a consequence of the changes in body weight -- one possibility is that the change relates to inflammation. Previous animal studies have shown that a high fat diet can cause inflammation of the hypothalamus, which in turn prompts insulin resistance and obesity. In mice, just three days of a fat-rich diet is enough to cause this inflammation. Other studies have shown that this inflammation can raise the threshold at which animals are full -- in other words, they have to eat more food than usual to feel full.

Dr Brown, the study's first author, added: "If what we see in mice is the case in people, then eating a high-fat diet could trigger inflammation of our appetite control centre. Over time, this would change our ability to tell when we've eaten enough and alter how our body processes blood sugar, leading us to put on weight."

Inflammation may explain why the hypothalamus is larger in these individuals, the team say. One suggestion is that the body reacts to inflammation by increasing the size of the brain's specialist immune cells, known as glia.

Professor Paul Fletcher, the study's senior author, from the Department of Psychiatry and Clare College, Cambridge, said: "The last two decades have given us important insights about appetite control and how it may be altered in obesity. Metabolic researchers at Cambridge have played a leading role in this.

"Our hope is that by taking this new approach to analysing brain scans in large datasets, we can further extend this work into humans, ultimately relating these subtle structural brain findings to changes in appetite and eating and generating a more comprehensive understanding of obesity."

The team say more research is needed to confirm whether increased volume in the hypothalamus is a result of being overweight or whether people with larger hypothalami are predisposed to eat more in the first place. It is also possible that these two factors interact with each other causing a feedback loop.

Read more at Science Daily

Whale-like filter-feeding discovered in prehistoric marine reptile

A remarkable new fossil from China reveals for the first time that a group of reptiles were already using whale-like filter feeding 250 million years ago.

New research by a team from China and the UK has shown details of the skull of an early marine reptile called Hupehsuchus that indicate it had soft structures such as an expanding throat region to allow it to engulf great masses of water containing shrimp-like prey, and baleen whale-like structures to filter food items as it swam forward.

The team also found that the Hupehsuchus skulls show grooves and notches along the edges of the jaws similar to those of baleen whales, which have keratin strips instead of teeth.

"We were amazed to discover these adaptations in such an early marine reptile," said Zichen Fang of the Wuhan Center of China Geological Survey, who led the research. "The hupehsuchians were a unique group in China, close relatives of the ichthyosaurs, and known for 50 years, but their mode of life was not fully understood."

"The hupehsuchians lived in the Early Triassic, about 248 million years ago, in China and they were part of a huge and rapid re-population of the oceans," said Professor Michael Benton, a collaborator at the University of Bristol's School of Earth Sciences. "This was a time of turmoil, only three million years after the huge end-Permian mass extinction which had wiped out most of life. It's been amazing to discover how fast these large marine reptiles came on the scene and entirely changed marine ecosystems of the time."

"We discovered two new hupehsuchian skulls," said Professor Long Cheng, also of the Wuhan Center of China Geological Survey, who directed the project. "These were more complete than earlier finds and showed that the long snout was composed of unfused, straplike bones, with a long space between them running the length of the snout. This construction is only seen otherwise in modern baleen whales where the loose structure of the snout and lower jaws allows them to support a huge throat region that balloons out enormously as they swim forward, engulfing small prey."

Read more at Science Daily

Aug 7, 2023

Scientists help discover the highest-energy light coming from the sun

Sometimes, the best place to hide a secret is in broad daylight. Just ask the sun.

"The sun is more surprising than we knew," said Mehr Un Nisa, a postdoctoral research associate at Michigan State University. "We thought we had this star figured out, but that's not the case."

Nisa, who will soon be joining MSU's faculty, is the corresponding author of a new paper in the journal Physical Review Letters that details the discovery of the highest-energy light ever observed from the sun.

The international team behind the discovery also found that this type of light, known as gamma rays, is surprisingly bright. That is, there's more of it than scientists had previously anticipated.

Watching like a HAWC

Although the high-energy light doesn't reach the Earth's surface, these gamma rays create telltale signatures that were detected by Nisa and her colleagues working with the High-Altitude Water Cherenkov Observatory, or HAWC.

Funded by the National Science Foundation and Mexico's National Council of Humanities, Sciences and Technologies, HAWC is an important part of the story. Unlike other observatories, it works around the clock.

"We now have observational techniques that weren't possible a few years ago," said Nisa, who works in the Department of Physics and Astronomy in the College of Natural Science.

"In this particular energy regime, other ground-based telescopes couldn't look at the sun because they only work at night," she said. "Ours operates 24/7."

In addition to working differently from conventional telescopes, HAWC looks a lot different from the typical telescope.

Rather than a tube outfitted with glass lenses, HAWC uses a network of 300 large water tanks, each filled with about 200 metric tons of water. The network is nestled between two dormant volcano peaks in Mexico, more than 13,000 feet above sea level.

From this vantage point, it can observe the aftermath of gamma rays striking air in the atmosphere. Such collisions create what are called air showers, which are a bit like particle explosions that are imperceptible to the naked eye.

The energy of the original gamma ray is liberated and redistributed amongst new fragments consisting of lower energy particles and light. It's these particles -- and the new particles they create on their way down -- that HAWC can "see."

When the shower particles interact with water in HAWC's tanks, they create what's known as Cherenkov radiation that can be detected with the observatory's instruments.
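The Cherenkov threshold described here follows from textbook physics rather than anything stated in the article: a charged particle radiates only when it outruns light's phase velocity in the medium. A minimal sketch, assuming a typical refractive index for water:

```python
import math

# Cherenkov light is emitted when a charged particle travels faster than
# light's phase velocity in the medium: v > c/n, i.e. beta > 1/n.
N_WATER = 1.33   # refractive index of water (assumed typical value)
M_E = 0.511      # electron rest energy, MeV

def cherenkov_threshold_ke_mev(n, rest_energy_mev):
    """Minimum kinetic energy for Cherenkov emission: gamma = 1/sqrt(1 - 1/n^2)."""
    gamma = 1.0 / math.sqrt(1.0 - 1.0 / n**2)
    return (gamma - 1.0) * rest_energy_mev

print(f"Electron threshold in water: ~{cherenkov_threshold_ke_mev(N_WATER, M_E):.2f} MeV")
```

An electron needs only a fraction of an MeV to trigger Cherenkov light in water, which is why the shower particles reaching HAWC's tanks are readily detectable.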

Nisa and her colleagues began collecting data in 2015. In 2021, the team had accrued enough data to start examining the sun's gamma rays with sufficient scrutiny.

"After looking at six years' worth of data, out popped this excess of gamma rays," Nisa said. "When we first saw it, we were like, 'We definitely messed this up. The sun cannot be this bright at these energies.'"

Making history

The sun gives off a lot of light spanning a range of energies, but some energies are more abundant than others.

For example, through its nuclear reactions, the sun provides a ton of visible light -- that is, the light we see. This form of light carries an energy of about 1 electron volt, which is a handy unit of measure in physics.

The gamma rays that Nisa and her colleagues observed had about 1 trillion electron volts, or 1 tera electron volt, abbreviated 1 TeV. Not only was this energy level surprising, but so was the fact that they were seeing so much of it.
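The scale of that gap can be made concrete with the standard photon-energy relation E = hc/λ; the wavelength below is an illustrative choice for visible light, not a figure from the study:

```python
# Back-of-the-envelope comparison of photon energies.
H = 6.62607015e-34    # Planck constant, J*s
C = 2.99792458e8      # speed of light, m/s
EV = 1.602176634e-19  # joules per electron volt

def photon_energy_ev(wavelength_m):
    """Photon energy E = h*c / wavelength, converted to electron volts."""
    return H * C / wavelength_m / EV

visible = photon_energy_ev(550e-9)  # green light, ~550 nm
print(f"Visible photon: ~{visible:.1f} eV")
print(f"A 1 TeV gamma ray is {1e12 / visible:.1e} times more energetic")
```

A visible photon carries a couple of electron volts, so the observed gamma rays pack roughly half a trillion times more energy per photon.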

In the 1990s, scientists predicted that the sun could produce gamma rays when high-energy cosmic rays -- particles accelerated by a cosmic powerhouse like a black hole or supernova -- smash into protons in the sun. But, based on what was known about cosmic rays and the sun, the researchers also hypothesized it would be rare to see these gamma rays reach Earth.

At the time, though, there wasn't an instrument capable of detecting such high-energy gamma rays and there wouldn't be for a while. The first observation of gamma rays with energies of more than a billion electron volts came from NASA's Fermi Gamma-ray Space Telescope in 2011.

Over the next several years, the Fermi mission showed that not only could these rays be very energetic, but also that there were about seven times more of them than scientists had originally expected. And it looked like there were gamma rays left to discover at even higher energies.

When a telescope launches into space, there's a limit to how big and powerful its detectors can be. The Fermi telescope's measurements of the sun's gamma rays maxed out around 200 billion electron volts.

Theorists led by John Beacom and Annika Peter, both professors at Ohio State University, encouraged the HAWC Collaboration to take a look.

"They nudged us and said, 'We're not seeing a cutoff. You might be able to see something," Nisa said.

The HAWC Collaboration includes more than 30 institutions across North America, Europe and Asia, and a sizable portion of that is represented in the nearly 100 authors on the new paper. That includes three additional Spartans: graduate student Daniel Salazar-Gallegos, Professor Emeritus James Linnemann and Kirsten Tollefson, a professor of physics and astronomy and associate dean in the Graduate School at MSU.

Now, for the first time, the team has shown that the energies of the sun's rays extend into the TeV range, up to nearly 10 TeV, which does appear to be the maximum, Nisa said.

Currently, the discovery creates more questions than answers. Solar scientists will now scratch their heads over how exactly these gamma rays achieve such high energies and what role the sun's magnetic fields play in this phenomenon, Nisa said.

When it comes to the cosmos, though, that's part of the excitement. It tells us that there was something wrong, missing or perhaps both when it comes to how we understand our nearest and dearest star.

Read more at Science Daily

Workers are less productive and make more typos in the afternoon -- especially on Fridays

If there's one thing most office workers can agree on, it's that they tend to feel less productive toward the end of the day and the end of each work week. Now, a team of researchers at Texas A&M University has found objective evidence of this phenomenon in action.

A recent interdisciplinary study at the Texas A&M School of Public Health used a novel method of data collection to show that employees really are less active and more prone to mistakes on afternoons and Fridays, with Friday afternoon representing the lowest point of worker productivity.

The study, published in a recent issue of PLOS ONE, was authored by Drs. Taehyun Roh and Nishat Tasnim Hasan from the Department of Epidemiology and Biostatistics, along with Drs. Chukwuemeka Esomonu, Joseph Hendricks and Mark Benden from the Department of Environmental and Occupational Health, and graduate student Anisha Aggarwal from the Department of Health Behavior.

The researchers looked at the computer usage metrics of 789 in-office employees at a large energy company in Texas over a two-year period -- January 1, 2017, to December 31, 2018.

"Most studies of worker productivity use employee self-reports, supervisory evaluations or wearable technology, but these can be subjective and invasive," said Benden, professor and head of the Department of Environmental and Occupational Health. "Instead, we used computer usage metrics -- things like typing speed, typing errors and mouse activity -- to get objective, noninvasive data on computer work patterns."

The team then compared computer usage patterns across different days of the week and times of the day to see what kinds of patterns emerged.

"We found that computer use increased during the week, then dropped significantly on Fridays," said Roh, assistant professor in the Department of Epidemiology and Biostatistics. "People typed more words and had more mouse movement, mouse clicks and scrolls every day from Monday through Thursday, then less of this activity on Friday."

In addition, Roh said, computer use decreased every afternoon, and especially on Friday afternoons.

"Employees were less active in the afternoons and made more typos in the afternoons -- especially on Fridays," he said. "This aligns with similar findings that the number of tasks workers complete increases steadily from Monday through Wednesday, then decreases on Thursday and Friday."

What is the takeaway for employers? To start, flexible work arrangements, such as hybrid work or a four-day work week, may lead to happier and more productive employees.

As of May 2023, about 60 percent of full-time, paid workers in the United States worked entirely on-site. The remainder either worked remotely or had a hybrid arrangement that involved a combination of remote and on-site work. In addition, many employees have a compressed workweek in which they work longer hours, but on fewer days.

"Other studies have found that those who work from home or work fewer days have less stress from commuting, workplace politics and other factors, and thus have more job satisfaction," Benden said. "These arrangements give workers more time with their families and thus reduce work-family conflicts, and also give them more time for exercise and leisure activities, which have been shown to improve both physical and mental health."

Read more at Science Daily

AI model can help determine where a patient's cancer arose

For a small percentage of cancer patients, doctors are unable to determine where their cancer originated. This makes it much more difficult to choose a treatment for those patients, because many cancer drugs are typically developed for specific cancer types.

A new approach developed by researchers at MIT and Dana-Farber Cancer Institute may make it easier to identify the sites of origin for those enigmatic cancers. Using machine learning, the researchers created a computational model that can analyze the sequence of about 400 genes and use that information to predict where a given tumor originated in the body.

Using this model, the researchers showed that they could accurately classify at least 40 percent of tumors of unknown origin with high confidence, in a dataset of about 900 patients. This approach enabled a 2.2-fold increase in the number of patients who could have been eligible for a genomically guided, targeted treatment, based on where their cancer originated.

"That was the most important finding in our paper, that this model could be potentially used to aid treatment decisions, guiding doctors toward personalized treatments for patients with cancers of unknown primary origin," says Intae Moon, an MIT graduate student in electrical engineering and computer science who is the lead author of the new study.

Alexander Gusev, an associate professor of medicine at Harvard Medical School and Dana-Farber Cancer Institute, is the senior author of the paper, which appears today in Nature Medicine.

Mysterious origins

In 3 to 5 percent of cancer patients, particularly in cases where tumors have metastasized throughout the body, oncologists don't have an easy way to determine where the cancer originated. These tumors are classified as cancers of unknown primary (CUP).

This lack of knowledge often prevents doctors from being able to give patients "precision" drugs, which are typically approved for specific cancer types where they are known to work. These targeted treatments tend to be more effective and have fewer side effects than treatments that are used for a broad spectrum of cancers, which are commonly prescribed to CUP patients.

"A sizeable number of individuals develop these cancers of unknown primary every year, and because most therapies are approved in a site-specific way, where you have to know the primary site to deploy them, they have very limited treatment options," Gusev says.

Moon, an affiliate of the Computer Science and Artificial Intelligence Laboratory who is co-advised by Gusev, decided to analyze genetic data that is routinely collected at Dana-Farber to see if it could be used to predict cancer type. The data consist of genetic sequences for about 400 genes that are often mutated in cancer. The researchers trained a machine-learning model on data from nearly 30,000 patients who had been diagnosed with one of 22 known cancer types. That set of data included patients from Memorial Sloan Kettering Cancer Center and Vanderbilt-Ingram Cancer Center, as well as Dana-Farber.

The researchers then tested the resulting model on about 7,000 tumors that it hadn't seen before, but whose site of origin was known. The model, which the researchers named OncoNPC, was able to predict their origins with about 80 percent accuracy. For tumors with high-confidence predictions, which constituted about 65 percent of the total, its accuracy rose to roughly 95 percent.
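The jump from 80 percent overall accuracy to 95 percent on the high-confidence subset reflects a common pattern: filtering predictions by the model's top-class probability trades coverage for accuracy. The sketch below illustrates the idea with made-up toy numbers; it is not OncoNPC's actual code or data:

```python
# Hypothetical sketch of confidence-based filtering: keep only predictions
# whose top-class probability clears a threshold, as papers often do when
# reporting "high-confidence" accuracy. Toy values, not OncoNPC's data.
predictions = [
    # (top_class_probability, prediction_was_correct)
    (0.97, True), (0.92, True), (0.88, True), (0.55, False),
    (0.95, True), (0.45, False), (0.90, True), (0.60, True),
]

def accuracy(preds):
    return sum(ok for _, ok in preds) / len(preds)

high_conf = [p for p in predictions if p[0] >= 0.85]
print(f"Overall accuracy:         {accuracy(predictions):.0%}")
print(f"High-confidence accuracy: {accuracy(high_conf):.0%}")
```

Predictions the model is unsure about are exactly the ones it tends to get wrong, so dropping them raises accuracy on what remains, at the cost of answering for fewer tumors.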

After those encouraging results, the researchers used the model to analyze a set of about 900 tumors from patients with CUP, which were all from Dana-Farber. They found that for 40 percent of these tumors, the model was able to make high-confidence predictions.

The researchers then compared the model's predictions with an analysis of the germline, or inherited, mutations in a subset of tumors with available data, which can reveal whether the patients have a genetic predisposition to develop a particular type of cancer. The researchers found that the model's predictions were much more likely to match the type of cancer most strongly predicted by the germline mutations than any other type of cancer.

Guiding drug decisions

To further validate the model's predictions, the researchers compared data on the CUP patients' survival time with the typical prognosis for the type of cancer that the model predicted. They found that CUP patients who were predicted to have cancer with a poor prognosis, such as pancreatic cancer, showed correspondingly shorter survival times. Meanwhile, CUP patients who were predicted to have cancers that typically have better prognoses, such as neuroendocrine tumors, had longer survival times.

Another indication that the model's predictions could be useful came from looking at the types of treatments that CUP patients analyzed in the study had received. About 10 percent of these patients had received a targeted treatment, based on their oncologists' best guess about where their cancer had originated. Among those patients, those who received a treatment consistent with the type of cancer that the model predicted for them fared better than patients who received a treatment typically given for a different type of cancer than what the model predicted for them.

Using this model, the researchers also identified an additional 15 percent of patients (2.2-fold increase) who could have received an existing targeted treatment, if their cancer type had been known. Instead, those patients ended up receiving more general chemotherapy drugs.

"That potentially makes these findings more clinically actionable because we're not requiring a new drug to be approved. What we're saying is that this population can now be eligible for precision treatments that already exist," Gusev says.

The researchers now hope to expand their model to include other types of data, such as pathology images and radiology images, to provide a more comprehensive prediction using multiple data modalities. This would also provide the model with a comprehensive perspective of tumors, enabling it to predict not just the type of tumor and patient outcome, but potentially even the optimal treatment.

Read more at Science Daily

Geomagnetic field protects Earth from electron showers

Tohoku University geophysicist Yuto Katoh led a study into the activity of high energy electrons and clarified the unexpected role of the geomagnetic field surrounding the Earth in protecting the atmosphere.

Understanding the ionosphere high in the Earth's atmosphere is important due to its effects on communications systems, satellites and crucial chemical features including the ozone layer. New insights into the activity of high energy electrons have come from a simulation study led by geophysicist Yuto Katoh at Tohoku University, reported in the journal Earth, Planets and Space.

"Our results clarify the unexpected role of the geomagnetic field surrounding the Earth in protecting the atmosphere from high energy electrons," says Katoh.

The ionosphere is a wide region between roughly 60 and more than 600 kilometers above the Earth's surface. It contains electrically charged particles that are a mixture of ions and free electrons generated by the interaction of the atmosphere with radiation from the sun.

Polar regions of the ionosphere are subjected to a particularly steady and energetic stream of incoming electrons in a process called electron precipitation. These 'relativistic' electrons move at close to the speed of light, speeds at which the effects of Einstein's theory of relativity become significant. They collide with gas molecules and contribute to many phenomena in the ionosphere, including colourful auroral displays. The processes are heavily influenced by the effects of the geomagnetic field on the charged particles involved.

The Tohoku team, with colleagues in Germany and other institutions in Japan, developed a sophisticated software code that focused particular attention on simulating the effects of a relatively unstudied 'mirror force' on the electron precipitation. This is caused by the magnetic force acting on charged particles under the influence of the geomagnetic field.

The simulations demonstrated how the mirror force causes relativistic electrons to bounce back upwards, to an extent dependent on the angles at which the electrons arrive. The predicted effects mean that electrons collide with other charged particles higher in the ionosphere than previously suspected.

Illustrating one example of the significance of this work, Katoh comments: "Precipitating electrons that manage to pass through the mirror force can reach the middle and lower atmosphere, contributing to chemical reactions related to variations in ozone levels." Decreased ozone levels at the poles caused by atmospheric pollution reduce the protection ozone offers living organisms from ultraviolet radiation.

Katoh emphasizes that the key theoretical advance of the research is in revealing the surprising significance of the geomagnetic field and the mirror force in protecting the lower atmosphere from electron precipitation by reflecting incoming electrons at higher altitudes.

Read more at Science Daily

Aug 6, 2023

Study examines Earth and Mars to determine how climate change affects the paths of rivers

In a new study published in Nature Geosciences, researchers, led by a Tulane University sedimentologist, investigated why the paths of meandering rivers change over time and how they could be affected by climate change.

Chenliang Wu, PhD, a postdoctoral researcher at Tulane University School of Science and Engineering, began this research by looking at the Mississippi River before adding other rivers on Earth and ancient riverbeds on Mars to the study.

The study specifically looks at river sinuosity, or how much rivers curve. The sinuosity of rivers changes over time, depending on the age of the river and environmental changes. Some of these changes include sediment and water supply and riverbank vegetation, all of which are affected by climate change. The study found that river sinuosity is related to the changes in how much water flows through the river. Rivers have different water levels depending on environmental factors, like precipitation levels.
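Sinuosity is conventionally defined as the along-channel path length divided by the straight-line distance between the endpoints, so a perfectly straight river scores 1.0. A minimal sketch of that definition, using made-up coordinates rather than data from the study:

```python
import math

# Standard definition of sinuosity: along-channel path length divided by the
# straight-line distance between the endpoints. Illustrative coordinates.
def sinuosity(points):
    path = sum(math.dist(a, b) for a, b in zip(points, points[1:]))
    straight = math.dist(points[0], points[-1])
    return path / straight

straight_river = [(0, 0), (5, 0), (10, 0)]
meandering_river = [(0, 0), (2, 3), (4, -3), (6, 3), (8, -3), (10, 0)]
print(f"straight:   {sinuosity(straight_river):.2f}")
print(f"meandering: {sinuosity(meandering_river):.2f}")
```

Tracking how this ratio evolves over decades (or, on Mars, over a dried-up channel's final state) is what lets the researchers classify rivers by how their curviness responds to changing water flow.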

The researchers looked at maps of the rivers on Earth over time by using historical data from as early as the fifth century and images from as early as 1939. They used data of 21 lowland meandering rivers. For the ancient riverbeds on Mars, they used previously identified ancient river channels from remote sensing data.

The ancient riverbeds on Mars, untouched by human influence, gave Wu and his team a system to test their hypotheses on how the river systems migrated and what their sinuosity looked like by the time they dried up. Their analysis is also a step toward understanding what the hydroclimate on Mars was like when there was still surface water.

"It really lays the foundation for more advanced topics," Wu said, "like, were the environmental conditions suitable for life on Mars?"

After performing analysis on the rivers, the researchers separated them into two categories: variable-sinuosity and constant-sinuosity. The variable-sinuosity rivers never reached a steady state, meaning their sinuosities continue changing, while the constant-sinuosity rivers did reach a steady state, meaning their average sinuosity remained relatively constant. Of the 21 Earth rivers studied, 13, including the Mississippi River, had variable sinuosity, while eight had constant sinuosity.

Understanding what factors affect the sinuosity of rivers will give researchers and engineers insight into how to manage rivers in the future. It can help with river restoration, future infrastructure projects and flood management. This insight can be invaluable in attempts to mitigate the impacts of climate change.

Read more at Science Daily

Thermal imaging innovation allows AI to see through pitch darkness like broad daylight

Researchers at Purdue University are advancing the world of robotics and autonomy with their patent-pending method that improves on traditional machine vision and perception.

Zubin Jacob, the Elmore Associate Professor of Electrical and Computer Engineering in the Elmore Family School of Electrical and Computer Engineering, and research scientist Fanglin Bao have developed HADAR, or heat-assisted detection and ranging. Their research was featured on the cover of the July 26 issue of the peer-reviewed journal Nature. A video about HADAR is available on YouTube. Nature also has released a podcast episode that includes an interview with Jacob.

Jacob said it is expected that one in 10 vehicles will be automated and that there will be 20 million robot helpers that serve people by 2030.

"Each of these agents will collect information about its surrounding scene through advanced sensors to make decisions without human intervention," Jacob said. "However, simultaneous perception of the scene by numerous agents is fundamentally prohibitive."

Traditional active sensors like LiDAR, or light detection and ranging, radar and sonar emit signals and subsequently receive them to collect 3D information about a scene. These methods have drawbacks that increase as they are scaled up, including signal interference and risks to people's eye safety. In comparison, video cameras that work based on sunlight or other sources of illumination are advantageous, but low-light conditions such as nighttime, fog or rain present a serious impediment.

Traditional thermal imaging is a fully passive sensing method that collects invisible heat radiation originating from all objects in a scene. It can sense through darkness, inclement weather and solar glare. But Jacob said fundamental challenges hinder its use today.

"Objects and their environment constantly emit and scatter thermal radiation, leading to textureless images famously known as the 'ghosting effect,'" Bao said. "Thermal pictures of a person's face show only contours and some temperature contrast; there are no features, making it seem like you have seen a ghost. This loss of information, texture and features is a roadblock for machine perception using heat radiation."

HADAR combines thermal physics, infrared imaging and machine learning to pave the way to fully passive and physics-aware machine perception.

"Our work builds the information theoretic foundations of thermal perception to show that pitch darkness carries the same amount of information as broad daylight. Evolution has made human beings biased toward the daytime. Machine perception of the future will overcome this long-standing dichotomy between day and night," Jacob said.

Bao said, "HADAR vividly recovers the texture from the cluttered heat signal and accurately disentangles temperature, emissivity and texture, or TeX, of all objects in a scene. It sees texture and depth through the darkness as if it were day and also perceives physical attributes beyond RGB, or red, green and blue, visible imaging or conventional thermal sensing. It is surprising that it is possible to see through pitch darkness like broad daylight."

The team tested HADAR TeX vision using an off-road nighttime scene.

"HADAR TeX vision recovered textures and overcame the ghosting effect," Bao said. "It recovered fine textures such as water ripples, bark wrinkles and culverts in addition to details about the grassy land."

Further improvements to HADAR target the size of the hardware and the speed of data collection.

"The current sensor is large and heavy since HADAR algorithms require many colors of invisible infrared radiation," Bao said. "To apply it to self-driving cars or robots, we need to bring down the size and price while also making the cameras faster. The current sensor takes around one second to create one image, but for autonomous cars we need around 30 to 60 hertz frame rate, or frames per second."

HADAR TeX vision's initial applications are automated vehicles and robots that interact with humans in complex environments. The technology could be further developed for agriculture, defense, geosciences, health care and wildlife monitoring applications.

Read more at Science Daily

New study links brain waves directly to memory

Neurons produce rhythmic patterns of electrical activity in the brain. One of the unsettled questions in the field of neuroscience is what primarily drives these rhythmic signals, called oscillations. University of Arizona researchers have found that simply remembering events can trigger them, even more so than when people are experiencing the actual event.

The researchers, whose findings are published in the journal Neuron, specifically focused on what are known as theta oscillations, which emerge in the brain's hippocampus region during activities like exploration, navigation and sleep. The hippocampus plays a crucial role in the brain's ability to remember the past.

Prior to this study, it was believed that the external environment played a more important role in driving theta oscillations, said Arne Ekstrom, professor of cognition and neural systems in the UArizona Department of Psychology and senior author of the study. But Ekstrom and his collaborators found that memory generated in the brain is the main driver of theta activity.

"Surprisingly, we found that theta oscillations in humans are more prevalent when someone is just remembering things, compared to experiencing events directly," said lead study author Sarah Seger, a graduate student in the Department of Neuroscience.

The results of the study could have implications for treating patients with brain damage and cognitive impairments, including patients who have experienced seizures, stroke and Parkinson's disease, Ekstrom said. Memory could be used to create stimulations from within the brain and drive theta oscillations, which could potentially lead to improvements in memory over time, he said.

UArizona researchers collaborated on the study with researchers from the University of Texas Southwestern Medical Center in Dallas, including neurosurgeon Dr. Brad Lega and research technician Jennifer Kriegel. The researchers recruited 13 patients who were being monitored at the center in preparation for epilepsy surgery. As part of the monitoring, electrodes were implanted in the patients' brains for detecting occasional seizures. The researchers recorded the theta oscillations in the hippocampus of the brain.

The patients participated in a virtual reality experiment, in which they were given a joystick to navigate to shops in a virtual city on a computer. When they arrived at the correct destination, the virtual reality experiment was paused. The researchers asked the participants to imagine the location at which they started their navigation and instructed them to mentally navigate the route they just passed through. The researchers then compared theta oscillations during initial navigation to participants' subsequent recollection of the route.

During the actual navigation process using the joystick, the oscillations were less frequent and shorter in duration compared to oscillations that occurred when participants were just imagining the route. So, the researchers conclude that memory is a strong driver of theta oscillations in humans.

One way to compensate for impaired cognitive function is by using cognitive training and rehabilitation, Ekstrom said.

"Basically, you take a patient who has memory impairments, and you try to teach them to be better at memory," he said.

In the future, Ekstrom is planning to conduct this research in freely walking patients as opposed to patients in beds and find how freely navigating compares to memory with regard to brain oscillations.

Read more at Science Daily