Feb 25, 2023

James Webb spots super old, massive galaxies that shouldn't exist

In a new study, an international team of astrophysicists has discovered several mysterious objects hiding in images from the James Webb Space Telescope: six potential galaxies that emerged so early in the universe’s history and are so massive they should not be possible under current cosmological theory.

Each of the candidate galaxies may have existed at the dawn of the universe roughly 500 to 700 million years after the Big Bang, or more than 13 billion years ago. They’re also gigantic, containing almost as many stars as the modern-day Milky Way Galaxy.

“It’s bananas,” said Erica Nelson, co-author of the new research and assistant professor of astrophysics at the University of Colorado Boulder. “You just don’t expect the early universe to be able to organize itself that quickly. These galaxies should not have had time to form.”
        
Nelson and her colleagues, including first author Ivo Labbé of the Swinburne University of Technology in Australia, published their results Feb. 22 in the journal Nature.

The latest finds aren’t the earliest galaxies observed by James Webb, which launched in December 2021 and is the most powerful telescope ever sent into space. Last year, another team of scientists spotted four galaxies that likely coalesced from gas around 350 million years after the Big Bang. Those objects, however, were downright shrimpy compared to the new galaxies, containing many times less mass from stars.

The researchers still need more data to confirm that these galaxies are as big as they look, and date as far back in time. Their preliminary observations, however, offer a tantalizing taste of how James Webb could rewrite astronomy textbooks.

“Another possibility is that these things are a different kind of weird object, such as faint quasars, which would be just as interesting,” Nelson said.

Fuzzy dots

There’s a lot of excitement going around: Last year, Nelson and her colleagues, who hail from the United States, Australia, Denmark and Spain, formed an ad hoc team to investigate the data James Webb was sending back to Earth.

Their recent findings stem from the telescope’s Cosmic Evolution Early Release Science (CEERS) Survey. These images look deep into a patch of sky close to the Big Dipper—a region of space that seems relatively unremarkable at first glance, and one that the Hubble Space Telescope first observed in the 1990s.

Nelson was peering at a postage stamp-sized section of one image when she spotted something strange: a few “fuzzy dots” of light that looked way too bright to be real.

“They were so red and so bright,” Nelson said. “We weren’t expecting to see them.”

She explained that in astronomy, red light usually equals old light. The universe, Nelson said, has been expanding since the dawn of time. As it expands, galaxies and other celestial objects move farther apart, and the light they emit stretches out—think of it like the cosmic equivalent of saltwater taffy. The more the light stretches, the redder it looks to human instruments. (Light from objects coming closer to Earth, in contrast, looks bluer).
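To make the stretching concrete, here is a minimal sketch (with illustrative values that are assumptions, not numbers from the study) of how redshift lengthens a light wave and pushes it into the infrared range James Webb observes:

```python
# Minimal illustration of cosmological redshift; the wavelength and redshift
# values below are assumed for illustration, not taken from the study.
def observed_wavelength_nm(rest_wavelength_nm: float, redshift: float) -> float:
    """Light emitted at a given wavelength arrives stretched by a factor of (1 + z)."""
    return rest_wavelength_nm * (1.0 + redshift)

# Ultraviolet light (~150 nm) emitted by a galaxy at redshift z ~ 8, roughly
# 600 million years after the Big Bang, reaches us in the near-infrared.
print(observed_wavelength_nm(150.0, 8.0))  # ~1350 nm
```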

The team ran calculations and discovered that their old galaxies were also huge, harboring tens to hundreds of billions of sun-sized stars worth of mass, on par with the Milky Way.

These primordial galaxies, however, probably didn’t have much in common with our own.  

“The Milky Way forms about one to two new stars every year,” Nelson said. “Some of these galaxies would have to be forming hundreds of new stars a year for the entire history of the universe.”
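A rough back-of-the-envelope check, using assumed round numbers rather than the paper's measurements, shows why such a sustained rate is roughly what it would take to build up a Milky Way's worth of stars so soon:

```python
# Back-of-the-envelope estimate with assumed round numbers (not the study's values).
stars_per_year = 300          # "hundreds of new stars a year"
years_available = 600e6       # roughly 600 million years after the Big Bang
total_stars = stars_per_year * years_available
print(f"{total_stars:.1e}")   # ~1.8e11 stars, comparable to the Milky Way's stellar count
```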

Nelson and her colleagues want to use James Webb to collect a lot more information about these mysterious objects, but they’ve seen enough already to pique their curiosity. For a start, calculations suggest there shouldn’t have been enough normal matter—the kind that makes up planets and human bodies—at that time to form so many stars so quickly.

“If even one of these galaxies is real, it will push against the limits of our understanding of cosmology,” Nelson said.

Seeing back in time

For Nelson, the new findings are a culmination of a journey that began when she was in elementary school. When she was 10, she wrote a report about Hubble, a telescope that launched in 1990 and is still active today. Nelson was hooked.

“It takes time for light to go from a galaxy to us, which means that you're looking back in time when you're looking at these objects,” she said. “I found that concept so mind blowing that I decided at that instant that this was what I wanted to do with my life.”

The fast pace of discovery with James Webb is a lot like those early days of Hubble, Nelson said. At the time, many scientists believed that galaxies didn’t begin forming until billions of years after the Big Bang. But researchers soon discovered that the early universe was much more complex and exciting than they could have imagined.

“Even though we learned our lesson already from Hubble, we still didn’t expect James Webb to see such mature galaxies existing so far back in time,” Nelson said. “I’m so excited.”

Read more at Science Daily

Successful cure of HIV infection after stem cell transplantation, study suggests

Haematopoietic stem cell transplantation for the treatment of severe blood cancers is the only medical intervention that has cured two people living with HIV in the past. An international group of physicians and researchers from Germany, the Netherlands, France, Spain, and the United States has now identified another case in which HIV infection has been shown to be cured in the same way. In a study published this week in Nature Medicine, in which DZIF scientists from Hamburg and Cologne played a leading role, the successful healing process of this third patient was for the first time characterised in great detail virologically and immunologically over a time span of ten years.

An infection with the human immunodeficiency virus (HIV) was previously considered incurable. The reason for this is that the virus "sleeps" in the genome of infected cells for long periods of time, making it invisible and inaccessible to both the immune system and antiviral drugs. The "Düsseldorf patient," a 53-year-old man, is now the third person in the world to be completely cured of HIV by a stem cell transplant.

The patient, treated at the University Hospital Düsseldorf for his HIV infection, had received a stem cell transplant due to a blood cancer. As in the cases of the first two patients named "Berlin" and "London," the Düsseldorf patient received stem cells from a healthy donor whose genome contains a mutation in the gene for the HIV-1 co-receptor CCR5. This mutation makes it impossible for most HIV variants to enter human CD4+ T-lymphocytes, their major target cells.

Following transplantation, the patient was carefully monitored virologically and immunologically for almost ten years. Using a variety of sensitive techniques, the researchers analysed the patient's blood and tissue samples to closely monitor immune responses to HIV and the continued presence or even replication of the virus. From shortly after transplantation, and over the entire course of the study, neither replicating virus nor antibodies nor HIV-reactive immune cells were detected. More than four years ago, the antiviral therapy against HIV was discontinued. Ten years after transplantation and four years after the end of anti-HIV therapy, the Düsseldorf patient could be declared cured by the international research consortium.

"This case of curing a chronic HIV infection by stem cell transplantation shows that HIV can in principle be cured," says Prof. Julian Schulze zur Wiesch, DZIF scientist at the University Medical Center Hamburg-Eppendorf and one of the study leads. "In particular, the results of this study are also enormously important for further research into a cure for HIV for the vast majority of people living with HIV for whom stem cell transplantation is not an option."

Read more at Science Daily

Evolution of dinosaur body size through different developmental mechanisms

The meat-eating dinosaurs known as theropods that roamed the ancient Earth ranged in size from the bus-sized T. rex to the smaller, dog-sized Velociraptor. Scientists puzzling over how such wildly different dinosaur sizes evolved recently found -- to their surprise -- that smaller and larger theropod dinosaurs like these didn't necessarily get that way merely by growing slower or faster.

In a new paper published in Science, "Developmental strategies underlying gigantism and miniaturization in non-avialan theropod dinosaurs," researchers including Ohio University professor Patrick O'Connor and Ph.D. student Riley Sombathy discovered through examining the bones of dinosaurs that there was no relationship between growth rate and body size.

"Most animals are thought to evolve to be larger by growing faster than their ancestors, but this study shows that it's just as likely that bigger and smaller animals grew for longer or shorter periods of time during growth spurts," said Michael D. D'Emic, a paleontologist at Adelphi University and lead author of the study.

The bones of many animals, including dinosaurs, slowed or paused growth every year, leaving marks like tree rings that indicate the animal's age and can be used to estimate the rate of growth. "Rings like these are called cortical growth marks," said D'Emic. "Widely spaced rings indicate faster growth and narrowly spaced rings tell us that an animal was growing more slowly."
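As a toy illustration (with made-up measurements, not data from the paper), the spacing between successive rings translates directly into a per-year growth increment:

```python
# Toy example with made-up numbers: the gap between successive annual growth
# rings in a bone cross-section gives the amount of growth in each year.
ring_radii_mm = [4.0, 9.5, 16.0, 21.0, 24.0, 25.5]            # radius at each yearly mark
yearly_growth = [b - a for a, b in zip(ring_radii_mm, ring_radii_mm[1:])]
print(yearly_growth)  # [5.5, 6.5, 5.0, 3.0, 1.5]: wide gaps early, narrowing as growth slows
```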

D'Emic, O'Connor, Sombathy and a team of international researchers measured about 500 such growth rings in roughly 80 different bones from theropods, the two-legged, mostly meat-eating dinosaurs closely related to birds.

"We found that there was no relationship between growth rate and size," said D'Emic. "Some gigantic dinosaurs grew very slowly, slower than alligators do today. And some smaller dinosaurs grew very fast, as fast as mammals that are alive today." This made sense to co-author Thomas Pascucci, whose graduate thesis contributed to the project: "Extinct animals like dinosaurs inspire awe because of how different they seem from our modern world, but they were animals that grew under similar constraints and environmental factors as those that exist today."

According to O'Connor, this study opens the door to future investigations of how animals regulate their growth. "Alteration of different growth control mechanisms, at molecular or genetic levels, likely accounts for the range of developmental strategies our team observed in theropod dinosaurs. Future studies of living organisms provide an opportunity to elucidate mechanisms related to the evolution of body size in vertebrates more generally."

Sombathy hopes to take up some of those investigations, adding "One of the things that interests me most about the results is the apparent decoupling between growth rate and body size. My Ph.D. dissertation will investigate the impacts of growth rate and body size on bone shape and function."

"This has really important implications because changes in rate versus timing can correlate to many other things, like how many or how large your offspring are, how long you live, or how susceptible to predators you are," D'Emic added. "Hopefully this research spurs investigations into other groups, both alive and extinct, to see what developmental mechanisms are most important in other types of animals."

Read more at Science Daily

Feb 24, 2023

Discovery of massive early galaxies defies prior understanding of the universe

Six massive galaxies discovered in the early universe are upending what scientists previously understood about the origins of galaxies in the universe.

"These objects are way more massive? than anyone expected," said Joel Leja, assistant professor of astronomy and astrophysics at Penn State, who modeled light from these galaxies. "We expected only to find tiny, young, baby galaxies at this point in time, but we've discovered galaxies as mature as our own in what was previously understood to be the dawn of the universe."

Using the first dataset released from NASA's James Webb Space Telescope, the international team of scientists discovered objects as mature as the Milky Way when the universe was only 3% of its current age, about 500-700 million years after the Big Bang. The telescope is equipped with infrared-sensing instruments capable of detecting light that was emitted by the most ancient stars and galaxies. Essentially, the telescope allows scientists to see back in time roughly 13.5 billion years, near the beginning of the universe as we know it, Leja explained.

"This is our first glimpse back this far, so it's important that we keep an open mind about what we are seeing," Leja said. "While the data indicates they are likely galaxies, I think there is a real possibility that a few of these objects turn out to be obscured supermassive black holes. Regardless, the amount of mass we discovered means that the known mass in stars at this period of our universe is up to 100 times greater than we had previously thought. Even if we cut the sample in half, this is still an astounding change."

In a paper published today (Feb. 22) in Nature, the researchers show evidence that the six galaxies are far more massive than anyone expected and call into question what scientists previously understood about galaxy formation at the very beginning of the universe.

"The revelation that massive galaxy formation began extremely early in the history of the universe upends what many of us had thought was settled science," said Leja. "We've been informally calling these objects 'universe breakers' -- and they have been living up to their name so far."

Leja explained that the galaxies the team discovered are so massive that they are in tension with 99 percent of models for cosmology. Accounting for such a high amount of mass would require either altering the models for cosmology or revising the scientific understanding of galaxy formation in the early universe -- that galaxies started as small clouds of stars and dust that gradually grew larger over time. Either scenario requires a fundamental shift in our understanding of how the universe came to be, he added.

"We looked into the very early universe for the first time and had no idea what we were going to find," Leja said. "It turns out we found something so unexpected it actually creates problems for science. It calls the whole picture of early galaxy formation into question."

On July 12, NASA released the first full-color images and spectroscopic data from the James Webb Space Telescope. The largest infrared telescope in space, Webb was designed to see the genesis of the cosmos, its high resolution allowing it to view objects too old, distant or faint for the Hubble Space Telescope.

"When we got the data, everyone just started diving in and these massive things popped out really fast," Leja said. "We started doing the modeling and tried to figure out what they were, because they were so big and bright. My first thought was we had made a mistake and we would just find it and move on with our lives. But we have yet to find that mistake, despite a lot of trying."

Leja explained that one way to confirm the team's finding and alleviate any remaining concerns would be to take a spectrum image of the massive galaxies. That would provide the team with data on the true distances, and also on the gases and other elements that made up the galaxies. The team could then use the data to model a clearer picture of what the galaxies looked like, and how massive they truly were.

"A spectrum will immediately tell us whether or not these things are real," Leja said. "It will show us how big they are, how far away they are. What's funny is we have all these things we hope to learn from James Webb and this was nowhere near the top of the list. We've found something we never thought to ask the universe -- and it happened way faster than I thought, but here we are."

Read more at Science Daily

Novel air filter captures wide variety of pollutants

An air filter made out of corn protein instead of petroleum products can concurrently capture small particulates as well as toxic chemicals like formaldehyde that current air filters can't.

The research could lead to better air purifiers, particularly in regions of the world that suffer from very poor air quality. Washington State University engineers report on the design and tests of materials for this bio-based filter in the journal Separation and Purification Technology.

"Particulate matter is not that challenging to filter but to simultaneously capture various kinds of chemical gas molecules, that's more significant," said Katie Zhong, professor in WSU's School of Mechanical and Materials Engineering and a corresponding author on the paper. "These protein-based air filtering materials should be very promising to capture multiple species of air pollutants."

Poor air quality is a factor in diseases such as asthma, heart disease and lung cancer. Commercial air purifiers remove tiny particles in soot, smoke or car exhaust, which could be inhaled directly into the lungs, but air pollution also often contains other hazardous gaseous molecules, such as carbon monoxide, formaldehyde and other volatile organic compounds.

With micron-sized pores, typical high efficiency particulate air filters, also known as HEPA filters, can capture the small particles but aren't able to capture gaseous molecules. They are most often made of petroleum products and glass, which leads to secondary pollution when old filters are thrown away, Zhong said.

The WSU researchers developed a more environmentally friendly air filter made from corn protein fibers that was able to simultaneously capture 99.5% of small particulate matter, similar to commercial HEPA filters, and 87% of formaldehyde, which is higher than specially designed air filters for those types of toxics.

The researchers chose corn to study because of its abundance as an agricultural product in the U.S. The corn protein is also hydrophobic, which means that the protein repels water and could work well in a moist environment such as in a mask.

The amino acids in the corn protein are known as functional groups. When exposed at the protein's surface, these functional groups act like multiple hands, grabbing the toxic chemical molecules. The researchers demonstrated this by exposing a functional group at the protein surface, where it grabbed formaldehyde. They theorize that further rearrangement of the proteins could develop a tentacle-like set of functional groups that could grab a variety of chemicals from the air.

"From the mechanism, it's very reasonable to expect that this protein-based air filter could capture more species of toxic chemical molecules," Zhong said.

The three-dimensional structure that they developed also offers more promise for a simple manufacturing method than thin films of proteins that the research team developed previously. They used a small amount of a chemical, polyvinyl alcohol, to glue the nanofibers together into a lightweight foam-like material.

"This work provides a new route to fabricating environmentally friendly and multi-functional air filters made from abundant natural biomass," Zhong said. "I believe this technology is very important for people's health and our environment, and it should be commercialized."

Read more at Science Daily

Skipping breakfast may compromise the immune system

Fasting may be detrimental to fighting off infection, and could lead to an increased risk of heart disease, according to a new study by the Icahn School of Medicine at Mount Sinai. The research, which focused on mouse models, is among the first to show that skipping meals triggers a response in the brain that negatively affects immune cells. The results that focus on breakfast were published in the February 23 issue of Immunity, and could lead to a better understanding of how chronic fasting may affect the body long term.

"There is a growing awareness that fasting is healthy, and there is indeed abundant evidence for the benefits of fasting. Our study provides a word of caution as it suggests that there may also be a cost to fasting that carries a health risk," says lead author Filip Swirski, PhD, Director of the Cardiovascular Research Institute at Icahn Mount Sinai. "This is a mechanistic study delving into some of the fundamental biology relevant to fasting. The study shows that there is a conversation between the nervous and immune systems."

Researchers aimed to better understand how fasting -- from a relatively short fast of only a few hours to a more severe fast of 24 hours -- affects the immune system. They analyzed two groups of mice. One group ate breakfast right after waking up (breakfast is their largest meal of the day), and the other group had no breakfast. Researchers collected blood samples in both groups when mice woke up (baseline), then four hours later, and eight hours later.

When examining the blood work, researchers noticed a distinct difference in the fasting group. Specifically, the researchers saw a difference in the number of monocytes, which are white blood cells that are made in the bone marrow and travel through the body, where they play many critical roles, from fighting infections, to heart disease, to cancer.

At baseline, all mice had the same amount of monocytes. But after four hours, monocytes in mice from the fasting group were dramatically affected. Researchers found 90 percent of these cells disappeared from the bloodstream, and the number further declined at eight hours. Meanwhile monocytes in the non-fasting group were unaffected.

In fasting mice, researchers discovered the monocytes traveled back to the bone marrow to hibernate. Concurrently, production of new cells in the bone marrow diminished. The monocytes in the bone marrow -- which typically have a short lifespan -- significantly changed. They survived longer as a consequence of staying in the bone marrow, and aged differently than the monocytes that stayed in the blood.

The researchers continued to fast mice for up to 24 hours, and then reintroduced food. The cells hiding in the bone marrow surged back into the bloodstream within a few hours. This surge led to heightened levels of inflammation. Instead of protecting against infection, these altered monocytes were more inflammatory, making the body less able to fight off infection.

This study is among the first to make the connection between the brain and these immune cells during fasting. Researchers found that specific regions in the brain controlled the monocyte response during fasting. This study demonstrated that fasting elicits a stress response in the brain -- that's what makes people "hangry" (feeling hungry and angry) -- and this instantly triggers a large-scale migration of these white blood cells from the blood to the bone marrow, and then back to the bloodstream shortly after food is reintroduced.

Dr. Swirski emphasized that while there is also evidence of the metabolic benefits of fasting, this new study is a useful advance in the full understanding of the body's mechanisms.

"The study shows that, on the one hand, fasting reduces the number of circulating monocytes, which one might think is a good thing, as these cells are important components of inflammation. On the other hand, reintroduction of food creates a surge of monocytes flooding back to the blood, which can be problematic. Fasting, therefore regulates this pool in ways that are not always beneficial to the body's capacity to respond to a challenge such as an infection," explains Dr. Swirski. "Because these cells are so important to other diseases like heart disease or cancer, understanding how their function is controlled is critical."

Read more at Science Daily

Trained brains rapidly suppress visual distractions

Have you ever found yourself searching for your keys or phone only to end up getting distracted by a brightly colored object that grabs your attention? This type of attentional capture by objects that stand out from their surroundings is known as 'pop-out'. Pop-out is often functional, for instance when we want people to pay attention to bright red road signs. It can however also distract us from our goals, for instance when a brightly colored binder prevents us from finding our keys on a cluttered desk. Would it not be nice if pop-out for distracting items could somehow be blocked or suppressed to avoid distractions and help us find whatever we are looking for faster?

New research from the Vision and Cognition group at the Netherlands Institute for Neuroscience, published in PNAS, demonstrates that this is indeed possible. After training, the visual brain can suppress neuronal responses to pop-out distractors that are usually enhanced compared to responses to other, non-distracting, items. The researchers trained monkeys to play a video game in which they searched for a unique shape among multiple items, while a uniquely colored item tried to distract them. As soon as the monkeys found the unique shape, they made an eye movement to it to indicate their choice. After some training, monkeys became very good at this game and almost never made eye movements to the distractor.

Neurons in area V4 of the visual cortex, a brain area that processes visual information relatively early after it is captured by the eyes, showed consistently enhanced responses to the shape target stimuli. Responses to the distracting color stimuli, on the other hand, were only very briefly enhanced and then rapidly suppressed. It appears that the brain first briefly detects the presence of the distracting stimulus, then quickly suppresses it so that it does not interfere with the search for the shape target. The color pop-out signal that might cause distraction is thus essentially inverted into a kind of negative pop-out, or "pop-in," that avoids distraction.

Read more at Science Daily

Feb 23, 2023

'Forbidden' planet orbiting small star challenges gas giant formation theories

A team of astronomers led by Carnegie's Shubham Kanodia has discovered an unusual planetary system in which a large gas giant planet orbits a small red dwarf star called TOI-5205. Their findings, which are published in The Astronomical Journal, challenge long-held ideas about planet formation.

Smaller and cooler than our Sun, M dwarfs are the most common stars in our Milky Way galaxy. Due to their small size, these stars tend to be about half as hot as the Sun and much redder. They have very low luminosities, but extremely long lifespans. Although red dwarfs host more planets, on average, than other, more massive types of stars, their formation histories make them unlikely candidates to host gas giants.

The newly discovered planet -- TOI 5205b -- was first identified as a potential candidate by NASA's Transiting Exoplanet Survey Satellite (TESS). Kanodia's team, which included Carnegie's Anjali Piette, Alan Boss, Johanna Teske, and John Chambers, then confirmed its planetary nature and characterized it using a variety of ground-based instruments and facilities.

"The host star, TOI-5205, is just about four times the size of Jupiter, yet it has somehow managed to form a Jupiter-sized planet, which is quite surprising!" exclaimed Kanodia, who specializes in studying these stars, which comprise nearly three-quarters of our galaxy yet can't be seen with the naked eye.

A small number of gas giants have been discovered orbiting older M dwarf stars. But until now no gas giant has been found in a planetary system around a low-mass M dwarf like TOI-5205. To grasp the size comparison here, a Jupiter-like planet orbiting a Sun-like star could be compared to a pea going around a grapefruit; for TOI-5205b, because the host star is so much smaller, it is more like a pea going around a lemon. In fact, when the Jupiter-mass TOI 5205b crosses in front of its host, it blocks about seven percent of its light -- one of the largest known exoplanet transits.
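That seven percent dip follows from simple geometry: the fraction of light blocked during a transit scales with the square of the planet-to-star size ratio. A hedged sketch with assumed radii:

```python
# Rough transit-depth estimate with assumed radii (for illustration only):
# the fraction of starlight blocked is roughly (planet radius / star radius) squared.
r_planet_jupiter_radii = 1.0   # TOI-5205b is roughly Jupiter-sized
r_star_jupiter_radii   = 4.0   # the host M dwarf is only about four times Jupiter's size
transit_depth = (r_planet_jupiter_radii / r_star_jupiter_radii) ** 2
print(f"{transit_depth:.1%}")  # ~6.2%, close to the ~7% dip described above
```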

Planets are born in the rotating disk of gas and dust that surrounds young stars. The most commonly used theory of gas planet formation requires about 10 Earth masses of this rocky material to accumulate and form a massive rocky core, after which it rapidly sweeps up large amounts of gas from the neighboring regions of the disk to form the giant planet we see today.

The time frame in which this happens is crucial.

"TOI-5205b's existence stretches what we know about the disks in which these planets are born," Kanodia explained. "In the beginning, if there isn't enough rocky material in the disk to form the initial core, then one cannot form a gas giant planet. And at the end, if the disk evaporates away before the massive core is formed, then one cannot form a gas giant planet. And yet TOI-5205b formed despite these guardrails. Based on our nominal current understanding of planet formation, TOI-5205b should not exist; it is a "forbidden" planet."

The team demonstrated that the planet's very large transit depth makes it extremely conducive for future observations with the recently launched JWST, which could shed some light on its atmosphere and offer some additional clues about the mystery of its formation.

Read more at Science Daily

Tracking how magnetism affects animal behavior

For over 50 years, scientists have observed that the behaviour of a wide variety of animals can be influenced by the Earth's magnetic field. However, despite decades of research, the exact nature of this 'magnetic sense' remains elusive. Will Schneider and Richard Holland from Bangor University in Wales and their co-worker Oliver Lindecke from the Institute for Biology, Oldenburg, Germany have now written a comprehensive overview of this cross-disciplinary field, with an emphasis on the methodology involved. This work is now published in the journal EPJ Special Topics.

This magnetic sense, or 'magnetoreception', was first noticed in birds, and particularly in migratory songbirds. It has now been observed in many other species including mammals, fish and insects. However, the exact relationship between the magnetic field and the behaviour is difficult to pin down because it can be masked by other environmental factors. Experiments must be very carefully designed if their results are to be statistically sound.

"We aim to provide a balanced overview for researchers who wish to enter this exciting area of sensory biology," explains Schneider. He and his co-authors outlined a range of methods that are used to deduce whether an animal's behaviour is affected by a magnetic field. These include using GPS to mark animals' alignment with the Earth's field during normal activities, such as cows grazing; observing behaviour after tissues thought to be responsible for magnetoreception have been removed, or genes knocked out; and attaching small magnets on or near the animals' bodies to disrupt the mechanism. Further work by animal physiologists, neuroscientists, geneticists and others will also be necessary to truly understand this phenomenon.

And this research is not only of academic interest. "Understanding animal magnetoreception will help us to protect animals released into unknown environments in the wild," adds Lindecke.

From Science Daily

Archaeologists uncover early evidence of brain surgery in Ancient Near East

Archaeologists know that people have practiced cranial trephination, a medical procedure that involves cutting a hole in the skull, for thousands of years. They've turned up evidence that ancient civilizations across the globe, from South America to Africa and beyond, performed the surgery.

Now, thanks to a recent excavation at the ancient city of Megiddo, Israel, there's new evidence that one particular type of trephination dates back to at least the late Bronze Age.

Rachel Kalisher, a Ph.D. candidate at Brown University's Joukowsky Institute for Archaeology and the Ancient World, led an analysis of the excavated remains of two upper-class brothers who lived in Megiddo around the 15th century B.C. She found that not long before one of the brothers died, he had undergone a specific type of cranial surgery called angular notched trephination. The procedure involves cutting the scalp, using an instrument with a sharp beveled edge to carve four intersecting lines in the skull, and using leverage to make a square-shaped hole.

Kalisher said the trephination is the earliest example of its kind found in the Ancient Near East.

"We have evidence that trephination has been this universal, widespread type of surgery for thousands of years," Kalisher said. "But in the Near East, we don't see it so often -- there are only about a dozen examples of trephination in this entire region. My hope is that adding more examples to the scholarly record will deepen our field's understanding of medical care and cultural dynamics in ancient cities in this area."

Kalisher's analysis, written in collaboration with scholars in New York, Austria and Israel, was published on Wednesday, Feb. 22, in PLOS ONE.

Two brothers, up close

Israel Finkelstein, who co-authored the study and serves as director of the School of Archaeology and Maritime Cultures at the University of Haifa, said that 4,000 years ago, Megiddo stood at and controlled part of the Via Maris, an important land route that connected Egypt, Syria, Mesopotamia and Anatolia. As a result, the city had become one of the wealthiest and most cosmopolitan cities in the region by about the 19th century B.C., with an impressive skyline of palaces, temples, fortifications and gates.

"It's hard to overstate Megiddo's cultural and economic importance in the late Bronze Age," Finkelstein said.

According to Kalisher, the two brothers whose bones she analyzed came from a domestic area directly adjacent to Megiddo's late Bronze Age palace, suggesting that the pair were elite members of society and possibly even royals themselves. Many other facts bear that out: The brothers were buried with fine Cypriot pottery and other valuable possessions, and as the trephination demonstrates, they received treatment that likely wouldn't have been accessible to most citizens of Megiddo.

"These brothers were obviously living with some pretty intense pathological circumstances that, in this time, would have been tough to endure without wealth and status," Kalisher said. "If you're elite, maybe you don't have to work as much. If you're elite, maybe you can eat a special diet. If you're elite, maybe you're able to survive a severe illness longer because you have access to care."

In her analysis, Kalisher spotted several skeletal abnormalities in both brothers. The older brother had an additional cranial suture and an extra molar in one corner of his mouth, suggesting he may have had a congenital syndrome such as Cleidocranial dysplasia. Both of the brothers' bones show minor evidence of sustained iron deficiency anemia in childhood, which could have impacted their development.

Those developmental irregularities could explain why the brothers died young, one in his teens or early 20s and the other sometime between his 20s and 40s. But Kalisher said it's more likely that the two ultimately succumbed to an infectious disease. A third of one brother's skeleton, and half of the other brother's, show porosity, lesions and signs of previous inflammation in the membrane covering the bones -- which together suggest they had systemic, sustained cases of an infectious disease like tuberculosis or leprosy.

Kalisher said that while some skeletal evidence points to leprosy, it's tough to deduce cases of leprosy using bones alone. She's currently working with researchers at Germany's Max Planck Institute for Evolutionary Anthropology to conduct DNA analyses of specific lesions in the bones. If they find bacterial DNA consistent with leprosy, these brothers will be among the earliest documented examples of leprosy in the world.

"Leprosy can spread within family units, not just because of the close proximity but also because your susceptibility to the disease is influenced by your genetic landscape," Kalisher said. "At the same time, leprosy is hard to identify because it affects the bones in stages, which might not happen in the same order or with the same severity for everyone. It's hard for us to say for sure whether these brothers had leprosy or some other infectious disease."

It's also difficult to know, Kalisher said, whether it was the disease, the congenital conditions or something else that prompted one brother to undergo cranial surgery. But there's one thing she does know: If the angular notched trephination was meant to keep him alive, it didn't succeed. He died shortly after the surgery -- within days, hours or perhaps even minutes.

Digging into medical history

Despite all the evidence of trephination uncovered over the last 200 years, Kalisher said, there's still much archaeologists don't know. It's not clear, for example, why some trephinations are round -- suggesting the use of some sort of analog drill -- and some are square or triangular. Nor is it clear how common the procedure was in each region, or what ancient peoples were even trying to treat. (Doctors today perform a similar procedure, called a craniotomy, to relieve pressure in the brain.) Kalisher is pursuing a follow-up research project that will investigate trephination across multiple regions and time periods, which she hopes will shed more light on ancient medical practices.

"You have to be in a pretty dire place to have a hole cut in your head," Kalisher said. "I'm interested in what we can learn from looking across the scientific literature at every example of trephination in antiquity, comparing and contrasting the circumstances of each person who had the surgery done."

Aside from enriching colleagues' understanding of early trephinations, Kalisher said she hopes her analysis also shows the general public that ancient societies didn't necessarily live by "survival of the fittest" principles, as many might imagine.

"In antiquity, there was a lot more tolerance and a lot more care than people might think," Kalisher said. "We have evidence literally from the time of Neanderthals that people have provided care for one another, even in challenging circumstances. I'm not trying to say it was all kumbaya -- there were sex- and class-based divisions. But in the past, people were still people."

Read more at Science Daily

Custom, 3D-printed heart replicas look and pump just like the real thing

No two hearts beat alike. The size and shape of the heart can vary from one person to the next. These differences can be particularly pronounced for people living with heart disease, as their hearts and major vessels work harder to overcome any compromised function.

MIT engineers are hoping to help doctors tailor treatments to patients' specific heart form and function, with a custom robotic heart. The team has developed a procedure to 3D print a soft and flexible replica of a patient's heart. They can then control the replica's action to mimic that patient's blood-pumping ability.

The procedure involves first converting medical images of a patient's heart into a three-dimensional computer model, which the researchers can then 3D print using a polymer-based ink. The result is a soft, flexible shell in the exact shape of the patient's own heart. The team can also use this approach to print a patient's aorta -- the major artery that carries blood out of the heart to the rest of the body.

To mimic the heart's pumping action, the team has fabricated sleeves similar to blood pressure cuffs that wrap around a printed heart and aorta. The underside of each sleeve resembles precisely patterned bubble wrap. When the sleeve is connected to a pneumatic system, researchers can tune the outflowing air to rhythmically inflate the sleeve's bubbles and contract the heart, mimicking its pumping action.

The researchers can also inflate a separate sleeve surrounding a printed aorta to constrict the vessel. This constriction, they say, can be tuned to mimic aortic stenosis -- a condition in which the aortic valve narrows, causing the heart to work harder to force blood through the body.

Doctors commonly treat aortic stenosis by surgically implanting a synthetic valve designed to widen the aorta's natural valve. In the future, the team says that doctors could potentially use their new procedure to first print a patient's heart and aorta, then implant a variety of valves into the printed model to see which design results in the best function and fit for that particular patient. The heart replicas could also be used by research labs and the medical device industry as realistic platforms for testing therapies for various types of heart disease.

"All hearts are different," says Luca Rosalia, a graduate student in the MIT-Harvard Program in Health Sciences and Technology. "There are massive variations, especially when patients are sick. The advantage of our system is that we can recreate not just the form of a patient's heart, but also its function in both physiology and disease."

Rosalia and his colleagues report their results in a study appearing today in Science Robotics. MIT co-authors include Caglar Ozturk, Debkalpa Goswami, Jean Bonnemain, Sophie Wang, and Ellen Roche, along with Benjamin Bonner of Massachusetts General Hospital, James Weaver of Harvard University, and Christopher Nguyen, Rishi Puri, and Samir Kapadia at the Cleveland Clinic in Ohio.

Print and pump

In January 2020, team members, led by mechanical engineering professor Ellen Roche, developed a "biorobotic hybrid heart" -- a general replica of a heart, made from synthetic muscle containing small, inflatable cylinders, which they could control to mimic the contractions of a real beating heart.

Shortly after those efforts, the Covid-19 pandemic forced Roche's lab, along with most others on campus, to temporarily close. Undeterred, Rosalia continued tweaking the heart-pumping design at home.

"I recreated the whole system in my dorm room that March," Rosalia recalls.

Months later, the lab reopened, and the team continued where it left off, working to improve the control of the heart-pumping sleeve, which they tested in animal and computational models. They then expanded their approach to develop sleeves and heart replicas that are specific to individual patients. For this, they turned to 3D printing.

"There is a lot of interest in the medical field in using 3D printing technology to accurately recreate patient anatomy for use in preprocedural planning and training," notes Wang, who is a vascular surgery resident at Beth Israel Deaconess Medical Center in Boston.

An inclusive design

In the new study, the team took advantage of 3D printing to produce custom replicas of actual patients' hearts. They used a polymer-based ink that, once printed and cured, can squeeze and stretch, similarly to a real beating heart.

As their source material, the researchers used medical scans of 15 patients diagnosed with aortic stenosis. The team converted each patient's images into a three-dimensional computer model of the patient's left ventricle (the main pumping chamber of the heart) and aorta. They fed this model into a 3D printer to generate a soft, anatomically accurate shell of both the ventricle and vessel.

The team also fabricated sleeves to wrap around the printed forms. They tailored each sleeve's pockets such that, when wrapped around their respective forms and connected to a small air pumping system, the sleeves could be tuned separately to realistically contract and constrict the printed models.

The researchers showed that for each model heart, they could accurately recreate the same heart-pumping pressures and flows that were previously measured in each respective patient.

"Being able to match the patients' flows and pressures was very encouraging," Roche says. "We're not only printing the heart's anatomy, but also replicating its mechanics and physiology. That's the part that we get excited about."

Going a step further, the team aimed to replicate some of the interventions that a handful of the patients underwent, to see whether the printed heart and vessel responded in the same way. Some patients had received valve implants designed to widen the aorta. Roche and her colleagues implanted similar valves in the printed aortas modeled after each patient. When they activated the printed heart to pump, they observed that the implanted valve produced similarly improved flows as in actual patients following their surgical implants.

Finally, the team used an actuated printed heart to compare implants of different sizes, to see which would result in the best fit and flow -- something they envision clinicians could potentially do for their patients in the future.

"Patients would get their imaging done, which they do anyway, and we would use that to make this system, ideally within the day," says co-author Nyugen. "Once it's up and running, clinicians could test different valve types and sizes and see which works best, then use that to implant."

Ultimately, Roche says the patient-specific replicas could help develop and identify ideal treatments for individuals with unique and challenging cardiac geometries.

Read more at Science Daily

Feb 22, 2023

Physicists create new model of ringing black holes

When two black holes collide into each other to form a new bigger black hole, they violently roil spacetime around them, sending ripples called gravitational waves outward in all directions. Previous studies of black hole collisions modeled the behavior of the gravitational waves using what is known as linear math, which means that the gravitational waves rippling outward did not influence, or interact, with each other. Now, a new analysis has modeled the same collisions in more detail and revealed so-called nonlinear effects.

"Nonlinear effects are what happens when waves on the beach crest and crash" says Keefe Mitman, a Caltech graduate student who works with Saul Teukolsky (PhD '74), the Robinson Professor of Theoretical Astrophysics at Caltech with a joint appointment at Cornell University. "The waves interact and influence each other rather than ride along by themselves. With something as violent as a black hole merger, we expected these effects but had not seen them in our models until now. New methods for extracting the waveforms from our simulations have made it possible to see the nonlinearities."

The research, published in the journal Physical Review Letters, comes from a team of researchers at Caltech, Columbia University, the University of Mississippi, Cornell University, and the Max Planck Institute for Gravitational Physics.

In the future, the new model can be used to learn more about the actual black hole collisions that have been routinely observed by LIGO (Laser Interferometer Gravitational-wave Observatory) ever since it made history in 2015 with the first direct detection of gravitational waves from space. LIGO will turn back on later this year after getting a set of upgrades that will make the detectors even more sensitive to gravitational waves than before.

Mitman and his colleagues are part of a team called the Simulating eXtreme Spacetimes collaboration, or SXS. Founded by Teukolsky in collaboration with Nobel Laureate Kip Thorne (BS '62), Richard P. Feynman Professor of Theoretical Physics, Emeritus, at Caltech, the SXS project uses supercomputers to simulate black hole mergers. The supercomputers model how the black holes evolve as they spiral together and merge using the equations of Albert Einstein's general theory of relativity. In fact, Teukolsky was the first to understand how to use these relativity equations to model the "ringdown" phase of the black hole collision, which occurs right after the two massive bodies have merged.

"Supercomputers are needed to carry out an accurate calculation of the entire signal: the inspiral of the two orbiting black holes, their merger, and the settling down to a single quiescent remnant black hole," Teukolsky says. "The linear treatment of the settling down phase was the subject of my PhD thesis under Kip quite a while ago. The new nonlinear treatment of this phase will allow more accurate modeling of the waves and eventually new tests of whether general relativity is, in fact, the correct theory of gravity for black holes."

The SXS simulations have proved instrumental in identifying and characterizing the nearly 100 black hole smashups detected by LIGO so far. This new study represents the first time that the team has identified nonlinear effects in simulations of the ringdown phase.

"Imagine there are two people on a trampoline," Mitman says. "If they jump gently, they shouldn't influence the other person that much. That's what happens when we say a theory is linear. But if one person starts bouncing with more energy, then the trampoline will distort, and the other person will start to feel their influence. This is what we mean by nonlinear: the two people on the trampoline experience new oscillations because of the presence and influence of the other person."

In gravitational terms, this means that the simulations produce new types of waves. "If you dig deeper under the large waves, you will find an additional new wave with a unique frequency," Mitman says.

In the big picture, these new simulations will help researchers to better characterize future black hole collisions observed by LIGO and to better test Einstein's general theory of relativity.

Read more at Science Daily

Better tools needed to determine ancient life on Mars

Current state-of-the-art instrumentation being sent to Mars to collect and analyze evidence of life might not be sensitive enough to make accurate assessments, according to a research team co-led by a Cornell University astronomer.

In a paper published in Nature Communications, visiting planetary scientist Alberto Fairén, and an international team of researchers, claim that ancient organic material in Martian rocks could be difficult, if not impossible, to detect with current instruments and techniques.

Fairén -- also a research professor at the Center of Astrobiology (CAB) in Madrid -- and colleagues conducted tests on sedimentary rocks found in the Red Stone Jurassic fossil delta of the Atacama Desert in northwestern Chile, the oldest and driest desert on Earth and a popular geological analog to Mars.

The researchers conducted geological tests at Red Stone using four instruments that are currently or will soon be on Mars. They found the samples display numerous microorganisms of undetermined classification -- what the researchers term "dark microbiome" -- and a mix of biosignatures from current and ancient microorganisms that can barely be detected with state-of-the-art laboratory equipment.

This revealed to the researchers that the instrumentation sent to Mars might not be sensitive enough, depending on the instrument used and the organic compound being sought. "Specifically, the chance of obtaining false negatives in the search for life on Mars highlights the need for more powerful tools," said lead author Armando Azua-Bustos, a research scientist on Fairén's team at CAB.

Either putting complex instrumentation on Mars, approximately 53 million miles away, or bringing Martian samples to Earth is necessary in order "to conclusively address whether life ever existed on Mars," the researchers wrote. In this case, both options are extremely difficult, Fairén said.

"You need to decide whether is more advantageous having limited capability for analysis on the surface of Mars to interrogate a wide variety of samples," he said, "or having limited samples to be analyzed with the wide variety of state-of-the-art instrumentation on Earth."

NASA is currently partnering with the European Space Agency and others in an effort to safely transport Martian geological samples gathered by the Perseverance rover to Earth. And Fairén said the first European Mars rover, named Rosalind Franklin, is also expected to launch as early as 2028.

Read more at Science Daily

Climate 'spiral' threatens land carbon stores

The world's forests are losing their ability to absorb carbon due to increasingly 'unstable' conditions caused by humans, a landmark study has found.

Dramatic changes to forests, and other habitats that store carbon in plants and soils, are becoming more likely in some regions across Earth, with less carbon consistently absorbed by the 'land carbon sink' provided by trees, soil and plants, according to scientists writing in Nature.

The short-term impacts of rising temperatures, deforestation and farming on many vulnerable landscapes mean carbon stores on land are less likely to recover in the longer term, the scientists say. This reduces the overall storage capacity of the land to absorb carbon and undermines global efforts to curb or reduce levels of greenhouse gases in the atmosphere.

Dr Patrick McGuire, a climate scientist working jointly in the Department of Meteorology and the National Centre for Atmospheric Science branch, both at the University of Reading, UK, was a co-author of the new study, which was led by colleagues at CREAF, Barcelona, and Antwerp University.

Dr McGuire said: "We found that large regions of the world are vulnerable to sudden and dramatic changes to their landscape, because the ability of their ecosystems to absorb carbon starts to destabilise.

"For example, forest fires in California are more likely because of extremely dry and hot conditions caused by a hotter atmosphere. More fires means forest turns to scrubland, sometimes permanently. This reduces the land's overall ability to suck carbon out of the atmosphere as it did before.

"This creates a vicious cycle as areas such as these become more vulnerable to the effects of climate change in the future."

Unstable carbon storage

Researchers found that from 1981-2018, ecosystems across the world moved through different phases, ranging from high productivity, when plants were able to take in more carbon, to low productivity, when plants were less able to absorb carbon.

The scale of these fluctuations creates a greater risk of destabilisation, increasing the risk of abrupt landscape changes as ecosystems cannot acclimate to climate change, deforestation, and changes to biodiversity, among other factors.

The study, published today (Wednesday, 22 February 2023) in Nature, found the regions most at risk typically have less forest cover and more cropland, are warmer, and have experienced greater rises in temperature, which could be related to an increase in extreme weather events, such as heatwaves and cold snaps. The areas identified as most at risk include the Mediterranean Basin, Southeast Asia and the west coasts of North and Central America.

The researchers said these vulnerable areas have developed a 'memory' -- described as a 'temporal autocorrelation' -- meaning that years where carbon uptake is lower are more likely to be followed by years where carbon uptake diminishes further. Researchers say that as less carbon is absorbed in areas where forestland dominates, the likelihood of scrubland becoming the permanent landscape increases and forests could be lost forever.
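As a minimal sketch of what such a 'memory' looks like in practice (using synthetic data, not the study's), the lag-1 autocorrelation of annual carbon-uptake anomalies rises as one year's shortfall increasingly carries over into the next:

```python
# Synthetic illustration (not the study's data): lag-1 autocorrelation of annual
# carbon-uptake anomalies as a simple indicator of the "memory" described above.
import numpy as np

def lag1_autocorrelation(series: np.ndarray) -> float:
    """Correlation between each year's anomaly and the previous year's anomaly."""
    x = series - series.mean()
    return float(np.sum(x[1:] * x[:-1]) / np.sum(x * x))

rng = np.random.default_rng(0)
no_memory = rng.normal(size=38)               # 38 years (e.g. 1981-2018), independent years
persistent = np.cumsum(rng.normal(size=38))   # each year inherits the previous year's anomaly
print(round(lag1_autocorrelation(no_memory), 2), round(lag1_autocorrelation(persistent), 2))
# Values near 0 indicate little memory; values approaching 1 indicate strong persistence.
```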

Global variation

While several regions are at risk of abrupt changes in their landscapes, there are parts of the world where carbon absorption levels are consistent and ecosystem collapse is less likely as a result of carbon fluctuations. This includes the tropical forests of the Amazon, and parts of central and northern Europe, where carbon absorption capacity has increased. However, the researchers warn that regions such as the Amazon face other climate threats, such as future shifts in regular patterns of rainfall.

The scientists say these global variations could make it harder to predict the global impact of schemes to absorb carbon, such as planting trees, in helping the world reach carbon net zero.

Read more at Science Daily

A new model offers an explanation for the huge variety of sizes of DNA in nature

A new model developed at Tel Aviv University offers a possible solution to the scientific question of why neutral sequences, sometimes referred to as "junk DNA," are not eliminated from the genome of living creatures in nature and continue to exist within it even millions of years later.

According to the researchers, the explanation is that junk DNA is often located in the vicinity of functional DNA. Deletion events around the borders between junk and functional DNA are likely to damage the functional regions and so evolution rejects them. The model contributes to the understanding of the huge variety of genome sizes observed in nature.

The team of researchers call the phenomenon described by the new model "border-induced selection." The model was developed under the leadership of PhD student Gil Loewenthal in the laboratory of Prof. Tal Pupko from the Shmunis School of Biomedicine and Cancer Research, Faculty of Life Sciences, in collaboration with Prof. Itay Mayrose (Faculty of Life Sciences, Tel Aviv University). The study was published in the journal Open Biology.

The researchers explain that the size of the genome in living creatures changes throughout evolution. For example, some salamander species have a genome ten times larger than the human genome.

Prof. Pupko explains: "The rate of deletions and short insertions, termed 'indels' for short, is usually measured by examining pseudogenes. Pseudogenes are genes that have lost their function, and in which there are frequent mutations, including deletions and insertions of DNA segments. In previous studies that characterized the indels, it was found that the rate of deletions is greater than the rate of additions in a variety of creatures including bacteria, insects, and even mammals such as humans. The question we tried to answer is how genomes are not deleted when the probability of a DNA deletion event is significantly greater than that of a DNA addition event."
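The logic of border-induced selection can be made concrete with a toy simulation. The Python sketch below was written for this article and is not the researchers' published model: deletions are attempted more often than insertions, but any indel that would damage functional DNA is rejected, and the neutral DNA sitting next to functional regions is retained as a side effect.

```python
import random

def evolve(n_steps=30000, p_del=0.7, max_len=8, border_selection=True, seed=1):
    """Toy genome: 1 = functional site, 0 = neutral ('junk') site.
    Deletions are attempted more often than insertions (p_del > 0.5).
    With border_selection, a deletion that would remove any functional site
    is rejected, as is an insertion that would split a functional run."""
    rng = random.Random(seed)
    genome = [1] * 50 + [0] * 200 + [1] * 50
    for _ in range(n_steps):
        if not genome:
            break
        length = rng.randint(1, max_len)
        pos = rng.randrange(len(genome))
        if rng.random() < p_del:
            if border_selection and 1 in genome[pos:pos + length]:
                continue                            # deletion would hit functional DNA
            del genome[pos:pos + length]
        else:
            splits_gene = pos > 0 and genome[pos - 1] == 1 and genome[pos] == 1
            if border_selection and splits_gene:
                continue                            # insertion would disrupt a gene
            genome[pos:pos] = [0] * length          # neutral insertion
    return genome

with_selection    = evolve(border_selection=True)
without_selection = evolve(border_selection=False)
print("junk sites surviving with border-induced selection:", with_selection.count(0))
print("junk sites surviving with the rejection rule off:  ", without_selection.count(0))
```

In this toy setting the deletion bias erodes the unprotected genome entirely, while the genome with border rejection keeps a block of neutral DNA indefinitely, mirroring the question the researchers set out to answer.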

Read more at Science Daily

Feb 21, 2023

A star is born: Nearby galaxies provide clues about star formation

It is a popular notion that aside from large celestial objects like planets, stars and asteroids, outer space is empty. In fact, galaxies are filled with something called the interstellar medium (ISM) -- that is, the gas and dust that permeate the space in between those large objects. Importantly, under the right conditions, it is from the ISM that new stars are formed.

Now researchers from the University of California San Diego, in collaboration with a worldwide project team, have released their findings in a special issue of The Astrophysical Journal Letters dedicated to their work using advanced telescope images through the JWST Cycle 1 Treasury Program.

"With JWST, you can make incredible maps of nearby galaxies at very high resolution that provide amazingly detailed images of the interstellar medium," stated Associate Professor of Physics Karin Sandstrom who is a co-principal investigator on the project.

Although JWST can look at very distant galaxies, the ones Sandstrom's group studied are relatively close, about 30 million light years away. They include the Phantom Galaxy, also known as M74 or NGC 628, which astronomers have known about since at least the 18th century.

Sandstrom, along with postdoctoral scholar Jessica Sutter and former postdoctoral scholar Jeremy Chastenet (now at University of Ghent), focused on a specific component of the ISM called polycyclic aromatic hydrocarbons (PAHs). PAHs are small particles of dust -- the size of a molecule -- and it's their small size that makes them so valuable to researchers.

When PAHs absorb a photon from a star, they vibrate and produce emission features that can be detected in the mid-infrared electromagnetic spectrum -- something that typically doesn't happen with larger dust grains from the ISM. The vibrational features of PAHs allow researchers to observe many important characteristics including size, ionization and structure.

This is something Sandstrom has been interested in since graduate school. "The Spitzer Space Telescope looked at the mid-infrared and that's what I used in my Ph.D. thesis. Since Spitzer was retired, we haven't had much access to the mid-infrared spectrum, but JWST is incredible," she stated. "Spitzer had a mirror that was 0.8 meters; JWST's mirror is 6.5 meters. It's a huge telescope and it has amazing instruments. I've been waiting a very long time for this."

Even though PAHs make up only a small fraction of the overall ISM by mass, they're important because they're easily ionized -- a process that can produce photoelectrons which heat the rest of the gas in the ISM. A better understanding of PAHs will lead to a better understanding of the physics of the ISM and how it operates. Astrophysicists are hopeful JWST can provide a view into how PAHs are formed, how they change and how they're destroyed.

Because PAHs are evenly distributed throughout the ISM, they allow researchers to see not just the PAHs themselves, but everything around them as well. Previous maps, such as ones taken by Spitzer, contained much less detail -- they essentially looked like galactic blobs. With the clarity JWST provides, astrophysicists can now see gas filaments and even "bubbles" blown by newly formed stars, whose intense radiation fields and eventual supernovae evaporate the gas clouds around them.

To get observation time on JWST, the Cycle 1 Treasury Program team had to design observations that included details such as exposure length and filters. Once their submission was accepted, the Space Telescope Science Institute, which is responsible for the science and mission operations for JWST, captured and processed the data. The program includes data from 19 galaxies in total.

The Cycle 1 Treasury Program is part of a bigger project called PHANGS (Physics at High Angular Resolution in Nearby GalaxieS). PHANGS studies star formation and the ISM using multi-wavelength images from the Atacama Large Millimeter Array (ALMA) and the Very Large Telescope, both in Chile. However, because the dense clouds in which star formation happens contain a lot of dust, it is difficult for optical light to penetrate to see what's happening inside. Using the mid-infrared spectrum allows researchers to use that same dust and its bright emission to get high-resolution, detailed images.

"One of the things I'm most excited about is now that we have this high-resolution tracer of the ISM, we can map all kinds of things, including the structure of the diffuse gas, which has to become denser and molecular for star formation to occur," said Sandstrom. "We can also map the gas surrounding newly formed stars where there is a lot of 'feedback' such as from supernova explosions. We really get to see the whole cycle of the ISM in a lot of detail. That is the core of how a galaxy is going to form stars."

Read more at Science Daily

Earthquake scientists have a new tool in the race to find the next big one

An everyday quirk of physics could be an important missing piece in scientists' efforts to predict the world's most powerful earthquakes.

In a study published in the journal Science, researchers at The University of Texas at Austin discovered that a frictional phenomenon could be key to understanding when and how violently faults move. That's because the phenomenon, which explains why it takes more effort to shove a heavy box from a standstill than it does to keep it moving, governs how quickly the fault surfaces bond together, or heal, after an earthquake. A fault that is slow to heal is more likely to move harmlessly, while one that heals quickly is more likely to stick until it breaks in a large, damaging earthquake.

That knowledge alone won't allow scientists to predict when the next big one will strike -- the forces behind large earthquakes are too complex -- but it does give researchers a valuable new way to investigate the causes of, and potential for, a large, damaging earthquake, the authors said.

"The same physics and logic should apply to all different kinds of faults around the world," said the study's co-lead author Demian Saffer, director of the University of Texas Institute for Geophysics at the Jackson School of Geosciences. "With the right samples and field observations we can now start to make testable predictions about how big and how often large seismic slip events might occur on other major faults, like Cascadia in the Pacific Northwest."

To make the discovery, researchers devised a test that combined rocks from a well-studied fault off the coast of New Zealand with a computer model, and successfully calculated that a harmless kind of "slow motion" earthquake would happen there every few years because the clay-rich rocks within the fault are very slow to heal.

The rock samples the researchers tested were drilled from about half a mile under the seafloor in a fault in New Zealand. They squeezed the fault zone rocks in a hydraulic press and found that they were very slow to heal and slipped easily. When they plugged the rock data into a computer model of the fault, the result was a small, slow-motion tremor every two years, a near exact match with observations from the New Zealand fault.
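Fault healing of this kind is commonly described with rate-and-state friction. The sketch below is a minimal Python illustration using the standard Dieterich "aging law" with made-up parameter values; it is not the authors' actual fault model, but it shows how the strength a fault recovers during a stationary "hold" grows with hold time, and how a low healing parameter b, characteristic of clay-rich gouge, keeps that recovery small.

```python
import numpy as np

# Standard rate-and-state quantities (illustrative values, not from the study):
mu0 = 0.6      # reference friction coefficient
v0  = 1e-6     # reference slip rate, m/s
d_c = 1e-5     # characteristic slip distance, m

def peak_friction_after_hold(t_hold_s, b):
    """During a stationary hold the Dieterich aging law gives d(theta)/dt = 1,
    so the contact 'age' theta grows with hold time; on re-sliding, peak
    friction rises roughly as mu0 + b * ln(v0 * theta / d_c)."""
    theta = d_c / v0 + t_hold_s          # steady-state age plus the hold time
    return mu0 + b * np.log(v0 * theta / d_c)

year = 3.15e7  # seconds
for label, b in [("fast-healing gouge      (b = 0.010)", 0.010),
                 ("slow-healing, clay-rich (b = 0.001)", 0.001)]:
    strengths = [round(peak_friction_after_hold(t * year, b), 3) for t in (1, 10, 100)]
    print(label, "peak friction after 1, 10, 100 yr holds:", strengths)
```

A gouge with a tiny b barely restrengthens between slip events, so it tends to creep or slip slowly rather than stick until it fails in a large earthquake, which is the qualitative behaviour the New Zealand samples showed.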

The researchers think the clay-rich rocks, which are common at many large faults, could be regulating earthquakes by allowing plates to slip quietly past each other, which limits the buildup of stress. The discovery could be used to determine whether a fault is prone to slipping in large, damaging earthquakes, said study co-lead Srisharan Shreedharan, affiliate researcher at the University of Texas Institute for Geophysics and assistant professor at Utah State University.

"This doesn't get us any closer to actually predicting earthquakes, but it does tell us whether a fault is likely to slip silently with no earthquakes, or have large ground-shaking earthquakes," he said.

At Cascadia, there is little evidence of shallow, slow-motion tremors. That's one of the reasons the Pacific Northwest Seismic Network wants to place sensors across key areas of the fault. The new study gives them the framework to do so, said network Director Harold Tobin.

"We want to zero in on the processes in the shallow part of the fault because that's what governs the size of the tsunami," said Tobin, who was not part of the study. "Fault healing doesn't explain everything, but it does give us a window into the working of subduction zone faults that we didn't have before."

Read more at Science Daily

Climate: Lessons from the latest global warming

56 million years ago, the Earth experienced one of the largest and most rapid climate warming events in its history: the Paleocene-Eocene Thermal Maximum (PETM), which has similarities to current and future warming. This episode saw global temperatures rise by 5-8°C. It was marked by an increase in the seasonality of rainfall, which led to the movement of large quantities of clay into the ocean, making it uninhabitable for certain living species. This scenario could be repeated today. This is what a team from the University of Geneva (UNIGE) has revealed, thanks to the analysis of sediments taken from the deep waters of the Gulf of Mexico. These results can be found in the journal Geology.

The Paleocene-Eocene Thermal Maximum (PETM), which occurred 56 million years ago, is the largest and most rapid climatic disturbance of the Cenozoic era (65.5 million years ago to the present day). Exceptional both in terms of its amplitude (5-8°C increase) and its suddenness (5,000 years, a very short time on a geological scale), this episode was marked by a warming of temperatures on a global scale. It lasted for about 200,000 years and led to numerous marine and terrestrial extinctions.

It is thought to have been caused by high concentrations of carbon dioxide -- the famous CO2 -- and methane in the atmosphere, two powerful greenhouse gases. As is the case currently, these gases may have been released by several phenomena, very likely acting in combination: the release of methane hydrates trapped on the seabed, the sudden and significant melting of the permafrost, and the injection of magma into the organic sediments of the western edge of Norway. The origin of these processes is still under debate. The impact of a meteorite and/or the effects of intense volcanic activity in the depths of the North Atlantic could be responsible.

A geological "archive" of unprecedented quality

Because of the many similarities between the PETM and the current warming, the geological remains of this period are being closely studied by scientists. A team from the UNIGE is now reporting new elements. "The objective of our study was to investigate the influence of these climatic changes on sedimentary systems, i.e. on the processes of sediment formation and deposition, and to understand how these changes could have been transmitted from the atmosphere to the depths of the ocean," explains Lucas Vimpere, a post-doctoral scholar at the Section of Earth and Environmental Sciences of the UNIGE's Faculty of Science and first author of the study.

The researchers analysed sediments taken from more than 8 km deep in the Gulf of Mexico. This basin acts as a giant "sink" into which material eroded and transported from the North American continent over millions of years is discharged. "For reasons of cost and infrastructure, the sediments used to study the PETM are generally taken from shallow marine or continental environments. Thanks to the collaboration of an oil company, we were able to obtain a sample of unprecedented quality, without any alteration," says the researcher. The 543-metre-long core contains a 180-metre-thick PETM sedimentary record, making it the most complete geological "archive" of this period in the world.

More clay on the ocean floor

The UNIGE scientists found that it was composed first of a large layer of clay and then of a layer of sand, a counter-intuitive result. "At the time of the PETM, we thought that there had been more precipitation, and therefore more erosion, and that large quantities of sand had then been transported first by the fluvial systems into the oceans. However, thanks to our sample, we were able to determine that it was the clays and not the sands that were transported in the first instance," explains Sébastien Castelltort, full professor at the Earth and Environmental Sciences Section of the UNIGE Faculty of Science, and last author of the study.

This established that the period was not marked by an increase in the annual rate of precipitation but by an increase in its seasonality and intensity. "This resulted in increased mobility of the river channels -- the deepest areas of a river -- which in turn transported large quantities of fluvial clays deposited on the adjacent alluvial plains to the ocean depths. We can now consider the presence of clay in deep basins as a marker of increased rainfall seasonality," says Lucas Vimpere. The phenomenon led to an increase in ocean turbidity that was harmful to marine life, especially corals.

Read more at Science Daily

Cohesion and connection drop in aging population

Social cohesion and connection decline in an ageing population, according to a new study of one of humanity's closest relatives.

For decades, researchers have been observing the rhesus macaques on Cayo Santiago (known as "Monkey Island") in Puerto Rico.

Recent research showed that female macaques "actively reduce" the size of their social networks and prioritise existing connections as they age -- something also seen in humans.

The new study, by an international team led by the University of Exeter, examines how this affects the overall cohesion and connection of the groups older monkeys live in.

While the observed macaque populations (which had no more than 20% "old" individuals) were not affected at group level, computer simulations showed higher proportions of old macaques would reduce cohesion and connection.

"For both humans and macaques, focusing on close friends and family in later life may bring a variety of benefits," said Dr Erin Siracusa, from Exeter's Centre for Research in Animal Behaviour.

"Our study aimed to find out what knock-on effect these individual age-related changes have for how well connected a society is overall.

"We had information on six monkey groups collected over eight years, representing in total 19 social networks.

"The first thing we found is that that older female macaques are poor influencers -- by having fewer friends, older females are less able to transmit knowledge and experience outside their immediate social circles."

The researchers tested whether monkey networks with a greater number of old females (over 18 years old) were less cohesive and connected.

In the macaque populations observed, they found no difference between networks with more old individuals and those with a greater number of young adults.

However, no more than 20% of the monkeys were old in any of the groups studied, so it remained possible that networks with higher proportions of old individuals would be affected.

So the scientists created a computer model that simulated the effect of higher proportions of old macaques, and found a decline in network cohesiveness and connectedness.
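As an illustration of what such a simulation can look like (this is a toy model written for this article, not the study's actual analysis), the Python sketch below builds groups in which old females initiate fewer social ties and then measures simple whole-network statistics as the proportion of old individuals rises.

```python
import random
import networkx as nx

def simulate_group(n=50, prop_old=0.2, young_ties=10, old_ties=4, seed=0):
    """Illustrative toy model: each old individual initiates fewer ties
    than a young one; we then ask how raising the proportion of old
    individuals changes whole-network connectedness."""
    rng = random.Random(seed)
    n_old = int(n * prop_old)
    g = nx.Graph()
    g.add_nodes_from(range(n))
    for i in range(n):
        ties = old_ties if i < n_old else young_ties
        for j in rng.sample([k for k in range(n) if k != i], ties):
            g.add_edge(i, j)
    return g

for prop in (0.2, 0.5, 0.8):
    g = simulate_group(prop_old=prop)
    print(f"prop old={prop:.1f}  density={nx.density(g):.3f}  "
          f"clustering={nx.average_clustering(g):.3f}")
```

In this toy version, network density and clustering fall as the share of old individuals grows, the same qualitative pattern the researchers report from their simulations.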

"We found really substantial consequences for network structure, which could affect useful things like information transmission and cooperation, and could also limit the spread of disease," said Professor Lauren Brent, also from the University of Exeter.

"In humans, population ageing is poised to be one of the most significant social transformations of the 21st Century.

"Our findings suggest this could have far-reaching effects on the structure of our societies and the way they function."

With the global human population of over-60s expected to double by 2050, the findings suggest social structures, cohesion and connectedness could all change significantly.

While the human population is ageing, some animal populations are becoming younger on average -- also with potentially serious consequences.

For example, older male elephants are often targeted by trophy hunters for their large tusks -- and a 2021 University of Exeter study found that male elephants are more aggressive to things like tourist vehicles when fewer older males are present.

The new study was carried out by a team including the University of Coimbra (Portugal), the Technical University of Denmark, Arizona State University, New York University, and the University of Pennsylvania (USA).

Read more at Science Daily

Feb 20, 2023

Astrophysics: Scientists observe high-speed star formation

The observations were carried out in an international project led by Dr Nicola Schneider at the University of Cologne and Prof Alexander Tielens at the University of Maryland as part of the FEEDBACK programme on board the flying observatory SOFIA (Stratospheric Observatory for Infrared Astronomy). The new findings modify previous perceptions that this specific process of star formation is quasi-static and quite slow. The dynamic formation process now observed would also explain the formation of particularly massive stars. The results of the study ‘Ionized carbon as a tracer for the assembly of interstellar clouds’ will appear in the next issue of Nature Astronomy.

By comparing the distribution of ionized carbon, molecular carbon monoxide and atomic hydrogen, the team found that the shells of interstellar gas clouds are made of hydrogen and collide with each other at speeds of up to twenty kilometres per second. “This high speed compresses the gas into denser molecular regions where new, mainly massive stars form. We needed the CII observations to detect this otherwise ‘dark’ gas,” said Dr Schneider. The observations show for the first time the faint CII radiation from the periphery of the clouds, which could not be observed before. Only SOFIA and its sensitive instruments were capable of detecting this radiation.

SOFIA was operated by NASA and the German Aerospace Center (DLR) until September 2022. The observatory consisted of a converted Boeing 747 with a built-in 2.7-metre telescope. It was coordinated by the German SOFIA Institute (DSI) and the Universities Space Research Association (USRA). SOFIA observed the sky from the stratosphere (above 13 kilometres) and covered the infrared region of the electromagnetic spectrum, just beyond what humans can see. The Boeing thus flew above most of the water vapour in the Earth’s atmosphere, which otherwise blocks out infrared light. This allowed the scientists to observe a wavelength range that is not accessible from Earth. For the current results, the team used the upGREAT receiver installed on SOFIA in 2015 by the Max Planck Institute for Radio Astronomy in Bonn and the University of Cologne.

Even though SOFIA is no longer in operation, the data collected so far are essential for basic astronomical research because there is no longer an instrument that extensively maps the sky in this wavelength range (typically 60 to 200 micrometres). The now active James Webb Space Telescope observes in the infrared at shorter wavelengths and focuses on spatially small areas. Therefore, the analysis of the data collected by SOFIA is ongoing and continues to provide important insights – also regarding other star-forming regions: “In the list of FEEDBACK sources, there are other gas clouds in different stages of evolution, where we are now looking for the weak CII radiation at the peripheries of the clouds to detect similar interactions as in the Cygnus X region,” Schneider concluded.

Read more at Science Daily

Rationing: A fairer way to fight climate change?

World War II-style rationing could be an effective way to reduce carbon emissions, according to new research from the University of Leeds.

In a paper published today in the journal Ethics, Policy and Environment, academics argue that rationing could help states to reduce greenhouse gas emissions rapidly and fairly.

Policymakers have considered other schemes to reduce emissions, including carbon taxes and personal carbon trading schemes, but the researchers say these favour the wealthy, who could buy the right to pollute if trading were allowed.

The authors argue that carbon rationing would instead allow people to receive an equitable portion of resources based on their needs, therefore sharing out the effort to protect the planet.

The authors were based across the University of Leeds' Inter-Disciplinary Ethics Applied Centre, Sustainability Research Institute and School of History when they conducted the research.

Joint lead author Dr Nathan Wood, who is now a Postdoctoral Fellow at Utrecht University's Fair Energy Consortium, said: "The concept of rationing could help, not only in the mitigation of climate change, but also in reference to a variety of other social and political issues -- such as the current energy crisis."

Lessons from the past

Records from World War II show that compulsory food rationing was more acceptable to the UK public than voluntary changes to diet when resources became scarce. The policy aimed to share goods and burdens more equally, regardless of wealth, which was an important part of its popularity and success.

Historic rationing policies also introduced price controls on goods to keep key resources affordable for most people. As a result, rates of malnutrition went down during World War II, despite the shortages.

A key difference between World War II rationing and the climate crisis is public perception, the researchers say. The availability of thousands of garments, gadgets and goods at the click of a button can give the illusion that resources are available in abundance, but the reality is starkly different.

Dr Rob Lawlor, joint lead author and Lecturer at Leeds' Inter-Disciplinary Ethics Applied Centre, said: "There is a limit to how much we can emit if we are to reduce the catastrophic impacts of climate change. In this sense, the scarcity is very real."

Dr Wood said: "The cost of living crisis has shown what happens when scarcity drives up prices, with energy prices rising steeply and leaving vulnerable groups unable to pay their bills. Currently, those living in energy poverty cannot use anywhere near their fair share of energy supply, whereas the richest in society are free to use as much energy as they can afford."

Dr Lawlor added: "It seems feasible to reduce emissions overall even while the lowest emitters, often the worst off, may be able to increase their emissions -- not despite rationing, but because of rationing and price controls."

What equitable rationing could look like

The researchers suggest that rationing probably wouldn't be the first step. Instead, policy changes could start with stricter regulations and an accompanying information campaign to communicate the benefits of rationing.

Initially, governments could regulate the biggest polluters, such as oil, gas and petrol, long-haul flights and intensive farming, creating scarcity in products that harm the planet. Rationing could then be introduced gradually, to manage the resulting scarcity with the aim of meeting everyone's basic needs.

The academics identified two options for rationing policy. Policymakers could introduce an all-encompassing carbon allowance, giving out 'carbon cards' like bank cards to track and limit usage. Alternatively, governments could ration specifically selected goods, such as flights, petrol, household energy, or even meat or clothing.

Dr Lawlor said: "Many have proposed carbon allowances and carbon cards before. What is new (or old, taking inspiration from World War II) is the idea that the allowances should not be tradable. Another feature of World War II-style rationing is that price controls on rationed goods would prevent prices from rising with increased demand, benefitting those with the least money."
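To make the mechanics concrete, here is a purely illustrative Python sketch of a non-tradable personal carbon allowance of the kind described above; the allowance size and per-purchase footprints are invented numbers, not figures from the paper.

```python
from dataclasses import dataclass

@dataclass
class CarbonCard:
    """Toy 'carbon card': a fixed, non-tradable personal allowance."""
    holder: str
    remaining_kg: float = 1000.0   # hypothetical annual allowance in kg CO2e

    def spend(self, item: str, kg_co2e: float) -> bool:
        """Debit a purchase's footprint; refuse it once the allowance is used up."""
        if kg_co2e > self.remaining_kg:
            print(f"{self.holder}: '{item}' refused ({kg_co2e} kg exceeds remaining allowance)")
            return False
        self.remaining_kg -= kg_co2e
        print(f"{self.holder}: '{item}' allowed, {self.remaining_kg:.0f} kg left")
        return True

    def transfer(self, other: "CarbonCard", kg_co2e: float) -> bool:
        """Allowances are non-tradable, so transfers are always refused."""
        print(f"Transfer from {self.holder} to {other.holder} refused: allowances are not tradable")
        return False

card = CarbonCard("resident A")
card.spend("short-haul flight", 250.0)   # hypothetical footprint
card.spend("tank of petrol", 180.0)      # hypothetical footprint
card.transfer(CarbonCard("resident B"), 100.0)
```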

According to the researchers, it's likely that rationing would accelerate the transition from fossil fuels to cleaner energy and more sustainable lifestyles. Dr Wood said: "For example, rationing petrol could encourage greater use of, and investment in, low carbon public transport, such as railways and local trams."

Read more at Science Daily

Geckos know their own odor

Geckos can use their tongue to differentiate their own odor from that of other members of their species, as researchers from the University of Bern have shown in a new experimental study. The findings show that geckos are able to communicate socially, meaning that they are more intelligent than was previously assumed.

Self-recognition is the ability to detect stimuli which come from oneself. We as people, and also some animals, can identify ourselves visually when we look in the mirror. However, not all animals rely primarily on their sense of sight. Geckos, and also other lizards and snakes, use their tongues to perceive chemicals, so-called pheromones, from other individuals. For instance, when climbing a wall, geckos pause every so often to dart their tongues around. This enables them to detect potential partners or rivals. But can geckos also detect their own odor and recognize themselves by smell?

In a study recently published in the journal Animal Cognition, researchers at the Institute of Ecology and Evolution of the University of Bern focused on whether Tokay geckos can detect skin chemicals that they themselves produce, and whether they can discriminate between these chemicals and those of other geckos of the same sex. The experiments confirmed that geckos are capable of this. During the tests, the animals were more interested in the skin chemicals of other geckos than in their own. This shows that geckos use pheromones for social communication.

Gecko and peppermint odor on cotton swabs

During the experiment, the researchers presented the geckos with various odors on cotton swabs. As well as their own odor, these were odors from other geckos, or control odors such as water and peppermint. The geckos showed two types of behavior in response: they flicked their tongues either towards the odor on the swab or towards the surrounding area, their own home enclosure. The researchers interpreted this behavior as a sign that the geckos first perceive the odor on the swab, and then compare it with their own odor on the walls of the enclosure. "The geckos have to compare more frequently when confronted with the odor of another gecko, compared to their own odor. This indicates that they know their own odor," explains Birgit Szabo, lead author of the study from the Division of Behavioural Ecology at the University of Bern's Institute of Ecology and Evolution.

In a further experiment, the team was also able to show that geckos detect and use the odors of their feces to distinguish themselves from others. Geckos also deposit pheromones on their excrement, for instance to mark their territory: just like many mammals, they have preferred areas for defecation, which allows them to communicate their presence.

More social and intelligent than we thought

The findings of the study show that geckos can communicate socially by using chemicals from their skin and excrement, and that they use these chemicals to distinguish themselves from other geckos. "Lizards and reptiles are generally seen as unsocial primitive animals. We must recognize that reptiles are more social and intelligent than we thought," says Birgit Szabo.

Read more at Science Daily

Scientists make breakthrough for 'next generation' cancer treatment

Scientists at the University of East Anglia are a step closer to creating a new generation of light-activated cancer treatments.

The futuristic-sounding treatment would work by switching on LED lights embedded close to a tumour, which would then activate biotherapeutic drugs.

These new treatments would be highly targeted and more effective than current state-of-the-art cancer immunotherapies.

New research published today reveals the science behind this innovative idea.

It shows how the UEA team have engineered antibody fragments that not only 'fuse' with their target but are also light-activated.

It means that in future, immunotherapy treatments could be engineered to attack tumours more precisely than ever before.

The principal scientist for this study, Dr Amit Sachdeva, from UEA's School of Chemistry, said: "Current cancer treatments like chemotherapy kill cancer cells, but they can also damage healthy cells in your body such as blood and skin cells.

"This means that they can cause side effects including hair loss, feeling tired and sick, and they also put patients at increased risk of picking up infections.

"There has therefore been a very big drive to create new treatments that are more targeted and don't have these unwanted side-effects.

"Several antibodies and antibody fragments have already been developed to treat cancer. These antibodies are much more selective than the cytotoxic drugs used in chemotherapy, but they can still cause severe side effects, as antibody targets are also present on healthy cells."

Now, the UEA team has engineered one of the first antibody fragments that binds to, and forms a covalent bond with, its target -- upon irradiation with UV light of a specific wavelength.

Dr Sachdeva said: "A covalent bond is a bit like melting two pieces of plastic and fusing them together. It means that drug molecules could for example be permanently fixed to a tumour.

"We hope that our work will lead to the development of a new class of highly targeted light-responsive biotherapeutics. This would mean that antibodies could be activated at the site of a tumour and covalently stick to their target upon light activation.

"In other words, you could activate antibodies to attack tumour cells by shining light - either directly on to the skin, in the case of skin cancer, or using small LED lights that could be implanted at the site of a tumour inside the body.

"This would allow cancer treatment to be more efficient and targeted because it means that only molecules in the vicinity of the tumour would be activated, and it wouldn't affect other cells.

"This would potentially reduce side effects for patients, and also improve antibody residence time in the body."

"It would work for cancers like skin cancer, or where there is a solid tumour - but not for blood cancers like leukaemia.

"Development of these antibody fragments would not have been possible without pioneering work from several other research groups across the globe who developed and optimised methods for site-specific incorporation of non-natural amino acids into proteins expressed in live cells.

"We employed some of these methods to site-specifically install unique light-sensitive amino acids into antibody fragments."

If the researchers are successful in the next stages of their work, they hope to see the 'next generation' light-activated immunotherapies being used to treat cancer patients within five to 10 years.

Read more at Science Daily

Feb 19, 2023

Space travel influences the way the brain works

Scientists from the University of Antwerp and the University of Liège have found how the human brain changes and adapts to weightlessness after astronauts spend six months in space. Some of the changes turned out to be lasting -- even after eight months back on Earth. Raphaël Liégeois, soon to be the third Belgian in space, acknowledges the importance of the research "to prepare the new generation of astronauts for longer missions."

A child who learns not to drop a glass on the floor, or a tennis player predicting the course of an incoming ball to hit it accurately are examples of how the brain incorporates the physical laws of gravity to optimally function on Earth. Astronauts who go to space reside in a weightless environment, where the brain's rules about gravity are no longer applicable. A new study on brain function in cosmonauts has revealed how the brain's organization is changed after a six-month mission to the International Space Station (ISS), demonstrating the adaptation that is required to live in weightlessness.

The University of Antwerp has been leading this BRAIN-DTI scientific project through the European Space Agency. Magnetic resonance imaging (MRI) data were taken from 14 astronaut brains before and several times after their mission to space. Using a special MRI technique, the researchers collected the astronauts' brain data in a resting condition, hence without having them engage in a specific task. This resting-state functional MRI technique enabled the researchers to investigate the brain's default state and to find out whether this changes or not after long-duration spaceflight.
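For readers unfamiliar with the method, the sketch below shows the generic way resting-state functional connectivity is usually computed -- correlating the fMRI time series of brain regions with one another -- using random stand-in data; it is not the study's actual pipeline or measurements.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random stand-ins for regional fMRI time series (regions x timepoints);
# real analyses would use the astronauts' pre- and post-flight scans.
n_regions, n_timepoints = 6, 300
pre_flight  = rng.standard_normal((n_regions, n_timepoints))
post_flight = rng.standard_normal((n_regions, n_timepoints))

def functional_connectivity(timeseries):
    """Region-by-region Pearson correlation matrix of resting-state signals."""
    return np.corrcoef(timeseries)

fc_pre, fc_post = map(functional_connectivity, (pre_flight, post_flight))

# One simple way to ask where connectivity changed after spaceflight:
delta = fc_post - fc_pre
i, j = np.unravel_index(np.abs(np.triu(delta, k=1)).argmax(), delta.shape)
print(f"largest connectivity change: regions {i} and {j}, delta = {delta[i, j]:+.2f}")
```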

Learning effect

In collaboration with the University of Liège, recent analyses of the brain's activity at rest revealed how functional connectivity, a marker of how activity in some brain areas is correlated with the activity in others, changes in specific regions.

"We found that connectivity was altered after spaceflight in regions which support the integration of different types of information, rather than dealing with only one type each time, such as visual, auditory, or movement information', say Steven Jillings and Floris Wuyts (University of Antwerp). "Moreover, we found that some of these altered communication patterns were retained throughout 8 months of being back on Earth. At the same time, some brain changes returned to the level of how the areas were functioning before the space mission."

Both scenarios of changes are plausible: retained changes in brain communication may indicate a learning effect, while transient changes may indicate more acute adaptation to changed gravity levels.

"This dataset is so special as their participants themselves. Back in 2016, we were historically the first to show how spaceflight may affect brain function on a single cosmonaut. Some years later we are now in a unique position to investigate the brains of more astronauts, several times. Therefore, we are deciphering the potential of the human brain all the more in confidence," says Dr. Athena Demertzi (GIGA Institute, University of Liège), co-supervisor of this this work.

New generation of astronauts


"Understanding physiological and behavioral changes triggered by weightlessness is key to plan human space exploration. Therefore, mapping changes of brain function using neuroimaging techniques as done in this work is an important step to prepare the new generation of astronauts for longer missions," comments Raphaël Liégeois, Doctor of Engineering Science (ULiège) with a Thesis in the field of Neuroscience, future ESA Astronaut.

Read more at Science Daily

How a record-breaking copper catalyst converts CO2 into liquid fuels

Since the 1970s, scientists have known that copper has a special ability to transform carbon dioxide into valuable chemicals and fuels. But for many years, scientists have struggled to understand how this common metal works as an electrocatalyst -- a material that uses the energy of electrons to chemically transform molecules into different products.

Now, a research team led by Lawrence Berkeley National Laboratory (Berkeley Lab) has gained new insight by capturing real-time movies of copper nanoparticles (copper particles engineered at the scale of a billionth of a meter) as they convert CO2 and water into renewable fuels and chemicals: ethylene, ethanol, and propanol, among others. The work was reported in the journal Nature last week.

"This is very exciting. After decades of work, we're finally able to show -- with undeniable proof -- how copper electrocatalysts excel in CO2 reduction," said Peidong Yang, a senior faculty scientist in Berkeley Lab's Materials Sciences and Chemical Sciences Divisions who led the study. Yang is also a professor of chemistry and materials science and engineering at UC Berkeley. "Knowing how copper is such an excellent electrocatalyst brings us steps closer to turning CO2 into new, renewable solar fuels through artificial photosynthesis."

The work was made possible by combining a new imaging technique called operando 4D electrochemical liquid-cell STEM (scanning transmission electron microscopy) with a soft X-ray probe to investigate the same sample environment: copper nanoparticles in liquid. First author Yao Yang, a UC Berkeley Miller postdoctoral fellow, conceived the groundbreaking approach under the guidance of Peidong Yang while working toward his Ph.D. in chemistry at Cornell University.

Scientists who study artificial photosynthesis materials and reactions have wanted to combine the power of an electron probe with X-rays, but the two techniques typically can't be performed by the same instrument.

Electron microscopes (such as STEM or TEM) use beams of electrons and excel at characterizing the atomic structure in parts of a material. In recent years, 4D STEM (or "2D raster of 2D diffraction patterns using scanning transmission electron microscopy") instruments, such as those at Berkeley Lab's Molecular Foundry, have pushed the boundaries of electron microscopy even further, enabling scientists to map out atomic or molecular regions in a variety of materials, from hard metallic glass to soft, flexible films.

On the other hand, soft (or lower-energy) X-rays are useful for identifying and tracking chemical reactions in real time in an operando, or real-world, environment.

But now, scientists can have the best of both worlds. At the heart of the new technique is an electrochemical "liquid cell" sample holder with remarkable versatility. A thousand times thinner than a human hair, the device is compatible with both STEM and X-ray instruments.

The electrochemical liquid cell's ultrathin design allows reliable imaging of delicate samples while protecting them from electron beam damage. A special electrode custom-designed by co-author Cheng Wang, a staff scientist at Berkeley Lab's Advanced Light Source, enabled the team to conduct X-ray experiments with the electrochemical liquid cell. Combining the two allows researchers to comprehensively characterize electrochemical reactions in real time and at the nanoscale.

Getting granular


During 4D-STEM experiments, Yao Yang and team used the new electrochemical liquid cell to observe copper nanoparticles (ranging in size from 7 nanometers to 18 nanometers) evolve into active nanograins during CO2 electrolysis -- a process that uses electricity to drive a reaction on the surface of an electrocatalyst.

The experiments revealed a surprise: copper nanoparticles combined into larger metallic copper "nanograins" within seconds of the electrochemical reaction.

To learn more, the team turned to Wang, who pioneered a technique known as "resonant soft X-ray scattering (RSoXS) for soft materials," at the Advanced Light Source more than 10 years ago.

With help from Wang, the research team used the same electrochemical liquid cell, but this time during RSoXS experiments, to determine whether copper nanograins facilitate CO2 reduction. Soft X-rays are ideal for studying how copper electrocatalysts evolve during CO2 reduction, Wang explained. By using RSoXS, researchers can monitor multiple reactions between thousands of nanoparticles in real time, and accurately identify chemical reactants and products.

The RSoXS experiments at the Advanced Light Source -- along with additional evidence gathered at Cornell High Energy Synchrotron Source (CHESS) -- proved that metallic copper nanograins serve as active sites for CO2 reduction. (Metallic copper, also known as copper(0), is a form of the element copper.)

During CO2 electrolysis, the copper nanoparticles change their structure during a process called "electrochemical scrambling." The copper nanoparticles' surface layer of oxide degrades, creating open sites on the copper surface for CO2 molecules to attach, explained Peidong Yang. And as CO2 "docks" or binds to the copper nanograin surface, electrons are then transferred to CO2, causing a reaction that simultaneously produces ethylene, ethanol, and propanol along with other multicarbon products.

"The copper nanograins essentially turn into little chemical manufacturing factories," Yao Yang said.

Further experiments at the Molecular Foundry, the Advanced Light Source, and CHESS revealed that size matters. All of the 7-nanometer copper nanoparticles participated in CO2 reduction, whereas the larger nanoparticles did not. In addition, the team learned that only metallic copper can efficiently reduce CO2 into multicarbon products. The findings have implications for "rationally designing efficient CO2 electrocatalysts," Peidong Yang said.

The new study also validated Peidong Yang's findings from 2017: that the 7-nanometer copper nanoparticles require low inputs of energy to start CO2 reduction. As an electrocatalyst, the 7-nanometer copper nanoparticles required a record-low driving force, about 300 millivolts less than that of typical bulk copper electrocatalysts. The best-performing catalysts that produce multicarbon products from CO2 typically operate at a high driving force of 1 volt.

The copper nanograins could potentially boost the energy efficiency and productivity of some catalysts designed for artificial photosynthesis, a field of research that aims to produce solar fuels from sunlight, water, and CO2. Currently, researchers within the Department of Energy-funded Liquid Sunlight Alliance (LiSA) plan to use the copper nanograin catalysts in the design of future solar fuel devices.

"The technique's ability to record real-time movies of a chemical process opens up exciting opportunities to study many other electrochemical energy conversion processes. It's a huge breakthrough, and it would not have been possible without Yao and his pioneering work," Peidong Yang said.

Read more at Science Daily