Jan 28, 2023

Mercury helps to detail Earth's most massive extinction event

The Latest Permian Mass Extinction (LPME) was the largest extinction in Earth's history to date, killing 80-90% of life on the planet. Definitive evidence for what caused the dramatic changes in climate, however, has eluded experts.

An international team of scientists, including UConn Department of Earth Sciences researchers Professor and Department Head Tracy Frank and Professor Christopher Fielding, is working to understand the cause and how the events of the LPME unfolded by focusing on mercury from Siberian volcanoes that ended up in sediments in Australia and South Africa. The research has been published in Nature Communications.

Though the LPME happened over 250 million years ago, there are similarities to the major climate changes happening today, explains Frank:

"It's relevant to understanding what might happen on earth in the future. The main cause of climate change is related to a massive injection of carbon dioxide into the atmosphere around the time of the extinction, which led to rapid warming."

In the case of the LPME, it is widely accepted that the rapid warming associated with the event is linked to massive volcanism occurring at a huge deposit of lava called the Siberian Traps Large Igneous Province (STLIP), says Frank, but direct evidence was still lacking.

Volcanoes leave helpful clues in the geological record. Along with the outpouring of lava, a huge quantity of gases such as CO2 and methane was released, together with particulates and heavy metals that were launched into the atmosphere and deposited around the globe.

"However, it's hard to directly link something like that to the extinction event," says Frank. "As geologists, we're looking for a signature of some kind -- a smoking gun -- so that we can absolutely point to the cause."

In this case, the smoking gun the researchers focused on was mercury, one of the heavy metals associated with volcanic eruptions. The trick is finding areas where that record still exists.

Frank explains that there is a continuous record of Earth's history contained in sediments in marine environments, which acts almost like a tape recorder because deposits are quickly buried and protected. These sediments yield an abundance of data about the extinction and how it unfolded in the oceans. On land, it is more difficult to find such well-preserved records from this time period.

To illustrate this, Frank uses Connecticut as an example: the state is rich with 400-500-million-year-old metamorphic rocks at or near the surface, with a covering of glacial deposits dating to around 23,000 years ago.

"There's a big gap in the record here. You have to be lucky to preserve terrestrial records and that's why they aren't as well studied, because there are fewer of them out there," says Frank.

Not all terrains around the world have such massive gaps in the geologic record, and previous studies of the LPME have focused primarily on sites in the northern hemisphere. However, the Sydney Basin in Eastern Australia and the Karoo Basin in South Africa are two areas in the southern hemisphere that happen to have an excellent record of the event, and both are areas Frank and Fielding have studied previously. A colleague and co-author, Jun Shen from the State Key Laboratory of Geological Processes and Mineral Resources at the China University of Geosciences, reached out and connected with Frank, Fielding, and other co-authors for samples, hoping to analyze them for mercury isotopes.

Shen was able to analyze the mercury isotopes in the samples and tie all the data together, says Frank.

"It turns out that volcanic emissions of mercury have a very specific isotopic composition of the mercury that accumulated at the extinction horizon. Knowing the age of these deposits, we can more definitively tie the timing of the extinction to this massive eruption in Siberia. What is different about this paper is we looked not only at mercury, but the isotopic composition of the mercury from samples in the high southern latitudes, both for the first time."

This definitive timing is something that scientists have been working on refining, but as Fielding points out, the more that we learn, the more complicated it gets.

"As a starting point, geologists have pinpointed the timing of the major extinction event at 251.9 million years with a high degree of precision from radiogenic isotope dating methods. Researchers know that is when the major extinction event happened in the marine environment and it was just assumed that the terrestrial extinction event happened at the same time."

In their previous research, however, Frank and Fielding found that the extinction event on land happened 200,000 to 600,000 years earlier.

"That suggests that the event itself wasn't just one big whammy that happened instantaneously. It wasn't just one very bad day on Earth, so to speak, it took some time to build and this feeds in well into the new results because it suggests the volcanism was the root cause," says Fielding. "That's just the first impact of the biotic crisis that happened on land, and it happened early. It took time to be transmitted into the oceans. The event 251.9 million years ago was the major tipping point in environmental conditions in the ocean that had deteriorated over some time."

Retracing the events relies on knowledge from many different geologists, all specializing in different methods: sedimentology, geochemistry, paleontology, and geochronology, says Frank.

Read more at Science Daily

New AI tool makes speedy gene-editing possible

An artificial intelligence program may enable the first simple production of customizable proteins called zinc fingers to treat diseases by turning genes on and off.

The researchers at NYU Grossman School of Medicine and the University of Toronto who designed the tool say it promises to accelerate the development of gene therapies on a large scale.

Illnesses including cystic fibrosis, Tay-Sachs disease, and sickle cell anemia are caused by errors in the order of DNA letters that encode the operating instructions for every human cell. Scientists can in some cases correct these mistakes with gene editing methods that rearrange these letters.

Other conditions are caused, not by a mistake in the code itself, but by problems in how the cellular machinery reads DNA (epigenetics). A gene, which provides the recipe for a particular protein, often partners with molecules called transcription factors that tell the cell how much of that protein to make. When this process goes awry, over- or underactive genes contribute to diabetes, cancer, and neurological disorders. As a result, researchers have been exploring ways to restore normal epigenetic activity.

One such technique is zinc-finger editing, which can both change and control genes. Among the most abundant protein structures in the human body, zinc fingers can guide DNA repair by grabbing onto scissor-like enzymes and directing them to cut faulty segments out of the code.

Similarly, zinc fingers can also hook onto transcription factors and pull them toward a gene segment in need of regulation. By customizing these instructions, genetic engineers can tailor any gene's activity. A drawback, however, is that artificial zinc fingers are challenging to design for a specific task. Since these proteins attach to DNA in complex groups, researchers would need to be able to tell -- out of countless possible combinations -- how every zinc finger interacts with its neighbor for each desired genetic change.

The study authors' new technology, called ZFDesign, overcomes this obstacle by using artificial intelligence (AI) to model and design these interactions. The model is based on data generated by a screen of nearly 50 billion possible zinc finger-DNA interactions in the researchers' labs. A report on the tool was published online Jan. 26 in the journal Nature Biotechnology.

"Our program can identify the right grouping of zinc fingers for any modification, making this type of gene editing faster than ever before," says study lead author David Ichikawa, PhD, a former graduate student at NYU Langone Health.

Ichikawa notes that zinc-finger editing offers a potentially safer alternative to CRISPR, a key gene-editing technology with applications that range from finding new ways to kill cancer cells to designing more nourishing crops. Unlike the entirely human-derived zinc fingers, CRISPR, which stands for clustered regularly interspaced short palindromic repeats, relies on bacterial proteins to interact with genetic code. These "foreign" proteins could trigger patients' immune defense systems, which may attack them like any other infection and lead to dangerous inflammation.

The study authors add that besides posing a lower immune risk, the small size of zinc-finger tools may also provide more flexible gene therapy techniques compared with CRISPR by enabling more ways to deliver the tools to the right cells in patients.

"By speeding up zinc-finger design coupled with their smaller size, our system paves the way for using these proteins to control multiple genes at the same time," says study senior author Marcus Noyes, PhD. "In the future, this approach may help correct diseases that have multiple genetic causes, such as heart disease, obesity, and many cases of autism."

To test the computer's AI design code, Noyes and his team used a customized zinc finger to disrupt the coding sequence of a gene in human cells. In addition, they built several zinc fingers that successfully reprogrammed transcription factors to bind near a target gene sequence and turn up or down its expression, demonstrating that their technology can be used for epigenetic changes.

Noyes, an assistant professor in the Department of Biochemistry and Molecular Pharmacology at NYU Langone, cautions that, while promising, zinc fingers can be difficult to control. Since they are not always specific to a single gene, some combinations can affect DNA sequences beyond a particular target, leading to unintended changes in genetic code.

As a result, Noyes says the team next plans to refine their AI program so it can build more precise zinc-finger groupings that only prompt the desired edit. Noyes is also a member of NYU Langone's Institute for System Genetics.

Read more at Science Daily

Jan 27, 2023

How a 3 cm glass sphere could help scientists understand space weather

Solar flares and other types of space weather can wreak havoc with spaceflight and with telecommunications and other types of satellites orbiting the Earth. But, to date, scientists' ability to research ways to overcome that challenge has been severely limited. That's because experiments conducted in laboratories here on Earth are affected by gravity in ways that are very different from conditions in space.

But a new study by UCLA physicists could, at last, help conquer that issue -- which could be a big step toward safeguarding humans (and equipment) during space expeditions, and to ensuring the proper functioning of satellites. The paper is published in Physical Review Letters.

The UCLA researchers effectively reproduced the type of gravity that exists on or near stars and other planets inside a glass sphere measuring 3 centimeters in diameter (about 1.2 inches). To do so, they used sound waves to create a spherical gravitational field and generate plasma convection -- a process in which gas cools as it nears the surface of a body and then reheats and rises again as it nears the core -- creating a fluid current that in turn generates a magnetic field.

The achievement could help scientists overcome the limiting role of gravity in experiments that are intended to model convection that occurs in stars and other planets.

"People were so interested in trying to model spherical convection with laboratory experiments that they actually put an experiment in the space shuttle because they couldn't get a strong enough central force field on the ground," said Seth Putterman, a UCLA physics professor and the study's senior author. "What we showed is that our system of microwave-generated sound produced gravity so strong that Earth's gravity wasn't a factor. We don't need to go into space to do these experiments anymore."

UCLA researchers used microwaves to heat sulfur gas to 5,000 degrees Fahrenheit inside the glass sphere. The sound waves inside the ball acted like gravity, constraining movement of the hot, weakly ionized gas, known as plasma, into patterns that resemble the currents of plasma in stars.

"Sound fields act like gravity, at least when it comes to driving convection in gas," said John Koulakis, a UCLA project scientist and the study's first author. "With the use of microwave-generated sound in a spherical flask of hot plasma, we achieved a gravity field that is 1,000 times stronger than Earth's gravity."

On Earth's surface, hot gas rises because gravity holds denser, colder gas closer to the planet's center.

Indeed, the researchers found that hot, bright gas near the outer half of the sphere likewise moved outward toward the walls of the sphere. The strong, sustained gravity generated turbulence resembling that seen near the Sun's surface. In the inner half of the sphere, the acoustic gravity changed direction and pointed outward, causing hot gas to sink toward the center. In the experiment, acoustic gravity naturally held the hottest plasma at the center of the sphere, as also occurs in stars.

The ability to control and manipulate plasma in ways that mirror solar and planetary convection will help researchers understand and predict how solar weather affects spacecraft and satellite communications systems. Last year, for example, a solar storm knocked out 40 SpaceX satellites. The phenomenon has also been problematic for military technology: the formation of turbulent plasma around hypersonic missiles, for example, can interfere with weapons systems communications.

Read more at Science Daily

Traffic pollution impairs brain function

A new study by researchers at the University of British Columbia and the University of Victoria has shown that common levels of traffic pollution can impair human brain function in only a matter of hours.

The peer-reviewed findings, published in the journal Environmental Health, show that just two hours of exposure to diesel exhaust causes a decrease in the brain's functional connectivity -- a measure of how different regions of the brain interact and communicate with each other. The study provides the first evidence in humans, from a controlled experiment, of altered brain network connectivity induced by air pollution.

"For many decades, scientists thought the brain may be protected from the harmful effects of air pollution," said senior study author Dr. Chris Carlsten, professor and head of respiratory medicine and the Canada Research Chair in occupational and environmental lung disease at UBC. "This study, which is the first of its kind in the world, provides fresh evidence supporting a connection between air pollution and cognition."

For the study, the researchers briefly exposed 25 healthy adults to diesel exhaust and filtered air at different times in a laboratory setting. Brain activity was measured before and after each exposure using functional magnetic resonance imaging (fMRI).

The researchers analyzed changes in the brain's default mode network (DMN), a set of inter-connected brain regions that play an important role in memory and internal thought. The fMRI revealed that participants had decreased functional connectivity in widespread regions of the DMN after exposure to diesel exhaust, compared to filtered air.
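Functional connectivity of this kind is typically quantified as the correlation between fMRI signal time series from pairs of brain regions. The sketch below illustrates that standard computation on simulated signals; the region labels and data are hypothetical stand-ins, not values from this study.

# Minimal sketch: functional connectivity as pairwise correlation of
# fMRI time series. Signals and region names are hypothetical; real DMN
# analyses involve extensive preprocessing (motion correction, filtering).
import numpy as np

rng = np.random.default_rng(0)
regions = ["mPFC", "PCC", "left_angular", "right_angular"]  # typical DMN nodes
n_timepoints = 200

# Simulated BOLD signals: a shared slow component plus region-specific noise.
shared = rng.standard_normal(n_timepoints)
signals = np.array([0.6 * shared + rng.standard_normal(n_timepoints)
                    for _ in regions])

# Functional connectivity matrix: Pearson correlation between regions.
fc = np.corrcoef(signals)

for i, a in enumerate(regions):
    for j in range(i + 1, len(regions)):
        print(f"{a} -- {regions[j]}: r = {fc[i, j]:.2f}")

A drop in connectivity of the sort reported here would appear as systematically lower correlation values between DMN regions after exposure than before.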

"We know that altered functional connectivity in the DMN has been associated with reduced cognitive performance and symptoms of depression, so it's concerning to see traffic pollution interrupting these same networks," said Dr. Jodie Gawryluk, a psychology professor at the University of Victoria and the study's first author. "While more research is needed to fully understand the functional impacts of these changes, it's possible that they may impair people's thinking or ability to work."

Taking steps to protect yourself

Notably, the changes in the brain were temporary, and participants' connectivity returned to normal after the exposure. Dr. Carlsten speculated that the effects could be long-lasting where exposure is continuous. He said that people should be mindful of the air they're breathing and take appropriate steps to minimize their exposure to potentially harmful air pollutants like car exhaust.

"People may want to think twice the next time they're stuck in traffic with the windows rolled down," said Dr. Carlsten. "It's important to ensure that your car's air filter is in good working order, and if you're walking or biking down a busy street, consider diverting to a less busy route."

While the current study only looked at the cognitive impacts of traffic-derived pollution, Dr. Carlsten said that other products of combustion are likely a concern.

"Air pollution is now recognized as the largest environmental threat to human health and we are increasingly seeing the impacts across all major organ systems," says Dr. Carlsten. "I expect we would see similar impacts on the brain from exposure to other air pollutants, like forest fire smoke. With the increasing incidence of neurocognitive disorders, it's an important consideration for public health officials and policymakers."

Read more at Science Daily

Recyclable mobile phone batteries a step closer with rust-busting invention

Mobile phone batteries with a lifetime up to three times longer than today's technology could be a reality thanks to an innovation led by engineers at RMIT University.

Rather than disposing of batteries after two or three years, we could have recyclable batteries that last for up to nine years, the team says, by using high-frequency sound waves to remove rust that inhibits battery performance.

Only 10% of used handheld batteries, including for mobile phones, are collected for recycling in Australia, which is low by international standards. The remaining 90% of batteries go to landfill or are disposed of incorrectly, which causes considerable damage to the environment.

The high cost of recycling lithium and other materials from batteries is a major barrier to these items being reused, but the team's innovation could help to address this challenge.

The team is working with a nanomaterial called MXene, a class of materials that they say promises to be an exciting alternative to lithium for batteries in the future.

Leslie Yeo, Distinguished Professor of Chemical Engineering and lead senior researcher, said MXene was similar to graphene, with high electrical conductivity.

"Unlike graphene, MXenes are highly tailorable and open up a whole range of possible technological applications in the future," said Yeo from RMIT's School of Engineering.

The big challenge with using MXene was that it rusted easily, thereby inhibiting electrical conductivity and rendering it unusable, he said.

"To overcome this challenge, we discovered that sound waves at a certain frequency remove rust from MXene, restoring it to close to its original state," Yeo said.

The team's innovation could one day help to revitalise MXene batteries every few years, extending their lifetime up to three times, he said.

"The ability to prolong the shelf life of MXene is critical to ensuring its potential to be used for commercially viable electronic parts," Yeo said.

The research is published in Nature Communications.

How the innovation works

Co-lead author Mr Hossein Alijani, a PhD candidate, said the greatest challenge with using MXene was the rust that forms on its surface in a humid environment or when suspended in watery solutions.

"Surface oxide, which is rust, is difficult to remove especially on this material, which is much, much thinner than a human hair," said Alijani from RMIT's School of Engineering.

"Current methods used to reduce oxidation rely on the chemical coating of the material, which limits the use of the MXene in its native form.

"In this work, we show that exposing an oxidised MXene film to high-frequency vibrations for just a minute removes the rust on the film. This simple procedure allows its electrical and electrochemical performance to be recovered."

The potential applications of the team's work

The team says their work to remove rust from MXene opens the door for the nanomaterial to be used in a wide range of applications in energy storage, sensors, wireless transmission and environmental remediation.

Associate Professor Amgad Rezk, one of the lead senior researchers, said the ability to quickly restore oxidised materials to an almost pristine state represented a game-changer in terms of the circular economy.

"Materials used in electronics, including batteries, generally suffer deterioration after two or three years of use due to rust forming," said Rezk from RMIT's School of Engineering.

"With our method, we can potentially extend the lifetime of battery components by up to three times."

Read more at Science Daily

Mimicking an enigmatic property of circadian rhythms through an artificial chemical clock

Circadian rhythms are natural, internal oscillations that synchronize an organism's behaviors and physiological processes with its environment. These rhythms normally have a period of 24 hours and are regulated by internal chemical clocks that respond to cues from outside the body, such as light.

Although well studied in animals, plants, and bacteria, circadian rhythms all share an enigmatic property: the oscillation period is not significantly affected by temperature, even though the rates of most biochemical reactions change exponentially with temperature. This clearly indicates that some sort of temperature-compensation mechanism is at play. Interestingly, some scientists have managed to replicate such temperature-invariant behavior in certain oscillating chemical reactions. However, these reactions are often troublesome to set up and require extremely precise adjustments of the reacting chemicals.

But what if there was a simpler way to achieve temperature compensation in an oscillating chemical reaction? In a recent study published in Scientific Reports, a team of researchers including Assistant Professor Yuhei Yamada of Tokyo Institute of Technology (Tokyo Tech), Japan, came up with a clever idea for a temperature compensation mechanism using a reaction called the Belousov-Zhabotinsky (BZ) oscillating reaction.

The key to their approach lies in soft, temperature-responsive gels made from poly(N-isopropylacrylamide), or 'PNIPAAm' for short, in which the BZ reaction can occur. These gels consist of polymeric strands that can accommodate a certain volume of solvent. Because the gels shrink as temperature increases, however, the amount of solvent they contain decreases as the temperature rises.

The researchers exploited this property of PNIPAAm gels by adding ruthenium (Ru) sites to the constituent polymers. The periodic nature of the particular BZ reaction they studied relies partly on the back-and-forth oxidation and reduction of Ru ions, so the speed of the reaction is affected by the relative concentrations of solvent and Ru. Because the gels can accommodate less solvent when they shrink, the relative concentration of Ru in the gels increases with temperature.
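One way to picture the mechanism is as two opposing exponential effects: an Arrhenius speed-up of the reaction rate with temperature, offset by the rising Ru concentration as the gel expels solvent. The toy model below is purely illustrative -- it is not the authors' model, and every parameter value is invented.

# Toy illustration (not the authors' equations): an Arrhenius shortening
# of the BZ period with temperature, offset by a gel-shrinkage term that
# raises the relative Ru concentration. All parameters are made up.
import math

EA_OVER_R = 6000.0   # activation temperature Ea/R in kelvin (assumed)
T0 = 293.15          # reference temperature, 20 degC
P0 = 60.0            # oscillation period at T0 in seconds (assumed)

def conc(T):
    # Relative Ru concentration: rises as the gel expels solvent (assumed linear).
    return 1.0 + 0.02 * (T - T0)

# Exponent chosen so concentration growth cancels the Arrhenius speed-up at T0.
GAMMA = EA_OVER_R / (0.02 * T0**2)

def period(T, compensated=True):
    arrhenius = math.exp(EA_OVER_R * (1.0 / T - 1.0 / T0))  # < 1 when T > T0
    comp = conc(T) ** GAMMA if compensated else 1.0
    return P0 * arrhenius * comp

for celsius in (15, 20, 25, 30, 35):
    T = celsius + 273.15
    print(f"{celsius} degC: bare BZ {period(T, False):5.1f} s, "
          f"gel-compensated {period(T, True):5.1f} s")

In this toy version the uncompensated period collapses from about 60 to about 22 seconds between 20 and 35 degC, while the compensated period stays within a few seconds of 60 -- the qualitative behavior the gels are designed to produce.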

As the research team demonstrated through experimental measurements and a thorough mathematical analysis, the abovementioned effects combine to form a temperature-compensation mechanism that renders the period of the BZ reaction unaffected by shifts in temperature. "The prepared BZ gels exhibited temperature compensability just like the circadian rhythms observed in living organisms," remarks Yamada.

Overall, this study demonstrates a completely new way to achieve temperature compensation in artificial biological clocks based on periodic reactions. Intriguingly, it's even possible that similar temperature-compensation mechanisms using temperature-responsive soft bodies exist in biological systems in nature, as Yamada explains: "Our study suggests that temperature compensation can be naturally self-sustainable through the output system of circadian machinery. This may explain why temperature compensation is a universal property of circadian rhythms seen in animals, plants, and bacteria, regardless of the molecular species involved."

Read more at Science Daily

Jan 26, 2023

Asteroid findings from specks of space dust could save the planet

Curtin University-led research into the durability and age of an ancient asteroid made of rocky rubble and dust has revealed significant findings that could contribute to potentially saving the planet if one ever hurtled toward Earth.

The international team studied three tiny dust particles collected from the surface of the ancient 500-metre-long rubble-pile asteroid Itokawa, returned to Earth by the Japanese Space Agency's Hayabusa 1 probe.

The study’s results showed asteroid Itokawa, which is 2 million kilometres from Earth and around the size of Sydney Harbour Bridge, was hard to destroy and resistant to collision.

Lead author Professor Fred Jourdan, Director of the Western Australian Argon Isotope Facility, part of the John de Laeter Centre and the School of Earth and Planetary Sciences at Curtin, said the team also found Itokawa is almost as old as the solar system itself.

"Unlike monolithic asteroids, Itokawa is not a single lump of rock, but belongs to the rubble pile family which means it’s entirely made of loose boulders and rocks, with almost half of it being empty space,” Professor Jourdan said.

“The survival time of monolithic asteroids the size of Itokawa is predicted to be only several hundreds of thousands of years in the asteroid belt.

“The huge impact that destroyed Itokawa’s monolithic parent asteroid and formed Itokawa happened at least 4.2 billion years ago. Such an astonishingly long survival time for an asteroid the size of Itokawa is attributed to the shock-absorbent nature of rubble pile material.

“In short, we found that Itokawa is like a giant space cushion, and very hard to destroy.”

The Curtin-led team used two complementary techniques to analyse the three dust particles. The first, electron backscatter diffraction, can measure whether a rock has been shocked by a meteor impact. The second, argon-argon dating, is used to date asteroid impacts.
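For the dating step, the standard argon-argon age equation is t = (1/lambda) ln(1 + J R), where lambda is the total decay constant of potassium-40, R is the measured ratio of radiogenic argon-40 to argon-39, and J is an irradiation parameter calibrated against a standard of known age. The sketch below shows the arithmetic with invented values of J and R chosen to land near the 4.2-billion-year figure; they are not measurements from this study.

# Sketch of the argon-argon age equation:
#     t = (1/lambda) * ln(1 + J * R)
# The J and R values below are invented purely to show the arithmetic.
import math

LAMBDA_K40 = 5.543e-10  # total decay constant of 40K, per year

def ar_ar_age(R, J):
    """Age in years from the 40Ar*/39Ar ratio R and irradiation parameter J."""
    return math.log(1.0 + J * R) / LAMBDA_K40

# Hypothetical example: J = 0.01, R back-calculated to give ~4.2 Gyr.
J = 0.01
R = (math.exp(LAMBDA_K40 * 4.2e9) - 1.0) / J
print(f"R = {R:.1f}  ->  age = {ar_ar_age(R, J) / 1e9:.2f} billion years")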

Co-author Associate Professor Nick Timms, also from Curtin’s School of Earth and Planetary Sciences, said the durability of rubble pile asteroids was previously unknown, jeopardising the ability to design defence strategies in case one was hurtling toward Earth.

“We set out to answer whether rubble pile asteroids are resistant to being shocked or whether they fragment at the slightest knock,” Associate Professor Timms said.

“Now that we have found they can survive in the solar system for almost its entire history, they must be more abundant in the asteroid belt than previously thought, so there is more chance that if a big asteroid is hurtling toward Earth, it will be a rubble pile.

“The good news is that we can also use this information to our advantage - if an asteroid is detected too late for a kinetic push, we can then potentially use a more aggressive approach like using the shockwave of a close-by nuclear blast to push a rubble-pile asteroid off course without destroying it.”

Read more at Science Daily

The single oil spill that can disrupt the global energy supply

Over the last year, the world's energy market has been highly volatile. The warmer-than-average winter in Europe helped avoid a gas crisis this year, but the forecast for the next winter is unclear as instabilities persist. More than 20% of global liquefied natural gas exports originate from a single port in Qatar. A new paper in Nature Sustainability by a team of researchers at the University of Louvain, the University of Southern California (USC) Viterbi School of Engineering, and the Qatar Environment and Energy Research Institute pinpoints the location of what the authors call a "high vulnerability zone," where an oil spill could cause liquefied natural gas export facilities and desalination plants on the coast to be completely shut down for several days. (In the presence of an oil spill, tankers cannot navigate through thick oil slicks. Further, desalination plants, which rely on the intake of seawater, cannot perform normal operations with a heavily polluted water source.) This shutdown, the researchers explain, could cause significant disruption in the global gas supply and an unprecedented water shortage for inhabitants of the Qatari Peninsula, while simultaneously compromising containment efforts.

Awareness of such a vulnerability is imperative, say the researchers. Qatar's export capacity is expected to increase by approximately 64% in the next five years. Thus, this key port will continue to be a crucial hotspot for the global energy supply chain. The researchers also note that the increasing number of tanker accidents in the Gulf adds a level of concern, particularly related to how such accidents could impact critical coastal infrastructures that export a vital source of energy for the planet and ensure the safety of desalinated water for one of the world's most arid climates.

The paper uses advanced numerical modeling to correlate maritime transport, atmospheric circulation, ocean current, wave, and seafloor topography data acquired over a period of five years to locate specific offshore areas of the Qatar Peninsula that are vulnerable to oil spills and to assess potential disruptions to the global supply of liquefied natural gas.

The study suggests that tankers crossing this area, not the numerous oil rigs in the northern part of the Peninsula, are the principal risk for oil spills. Should there be a spill in this area, the researchers contend, Qatar would have only a few days to contain it before the oil reached the country's main liquefied gas export facility and main desalination plant. The authors indicate that these events could potentially cause disruptions, or even a total shutdown lasting days, for the desalination plants, pushing the nation to rely on its small freshwater reserve and sending liquefied natural gas prices higher.

To put the size of the issue in context, experts believe that the largest liquefied natural gas tankers from Qatar provide enough energy to heat the entire city of London for one week.

The study advocates for increased remote sensing using satellite and airborne images in the Gulf's most vulnerable areas to provide early warning for spills and better model their evolution. The above actions are crucial, say the researchers, to guide mitigation efforts to avoid negative consequences both locally and globally.

Co-author Essam Heggy of the USC Arid Climate and Water Research Center argues that the Middle East's vulnerability to environmental and climatic hazards is largely underestimated. "Global containment of major oil spills has always been challenging, but it is even harder in the shallow water of the Gulf where any intervention has to account for the complex circulation currents, a harsh operational environment, and the presence of highly-sensitive ecosystems on which three million humans rely for drinking water." He added, "I hope serious resources are put into resolving this vulnerability."

Read more at Science Daily

Comparing airfares instead of seat size fairer indicator of passenger carbon emissions

Allocating passenger aircraft emissions using airfares rather than travel class would give a more accurate idea of individual contributions, finds a study led by UCL.

Emissions calculators base their estimates on travel class, assuming that someone travelling in a higher class and therefore taking up more space on the plane is responsible for more emissions.

The study, published in Environmental Research Letters, describes how including airfares in calculations shows which passengers contribute the most revenue to the airline operating the aircraft, thereby allowing the plane to fly.

Although premium (business) seats are generally more expensive than economy, the researchers found when examining the data that many late bookings in economy class, often made for business trips or by high-income travellers, cost as much as, or more than, premium seats.

Lead author Dr Stijn van Ewijk (UCL Civil, Environmental & Geomatic Engineering) said: "The paper shows we should follow the money when calculating emissions of individual travellers, as it is revenue that decides whether an airline can operate a plane or not. Someone who has paid twice as much as a fellow traveller contributes twice as much to the revenue of the airline and should be allocated twice the emissions. The seat size of each travel class, which is currently used to allocate emissions, is only a rough approximation of how much passengers pay."
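The arithmetic behind the two allocation rules is simple to sketch. The example below contrasts seat-area-based and fare-based allocation on an invented five-passenger flight; the emissions total, fares, and seat-area weights are all illustrative assumptions, not figures from the paper.

# Minimal sketch of the two allocation rules on an invented flight:
# emissions split by cabin seat-area weights (the current convention)
# versus split in proportion to the fare paid. All numbers are made up.
FLIGHT_EMISSIONS_KG = 50_000  # total CO2 for the flight (assumed)

# (cabin class, fare in USD); one expensive late economy booking included.
passengers = [
    ("economy", 200), ("economy", 250), ("economy", 900),  # late booking
    ("business", 800), ("business", 850),
]

# Conventional rule: weight by relative seat area per class (assumed factors).
AREA_WEIGHT = {"economy": 1.0, "business": 2.5}
total_area = sum(AREA_WEIGHT[c] for c, _ in passengers)

# Fare-based rule: weight by share of total ticket revenue.
total_fare = sum(f for _, f in passengers)

for cls, fare in passengers:
    by_area = FLIGHT_EMISSIONS_KG * AREA_WEIGHT[cls] / total_area
    by_fare = FLIGHT_EMISSIONS_KG * fare / total_fare
    print(f"{cls:8s} ${fare:4d}: seat-based {by_area:7.0f} kg, "
          f"fare-based {by_fare:7.0f} kg CO2")

Under the fare-based rule, the late economy booking is allocated more emissions than either business-class seat, which is exactly the shift the authors argue for.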

The researchers say that using airfares to calculate passenger emissions would benefit efforts to address climate change by encouraging people on all budgets to find alternative modes of transport where possible. It would also increase estimates of corporate emissions because it allocates more to expensive late bookings, which are often made for business purposes.

Implementing a tax that is proportionate to the price of the ticket could make the total costs of flying fairer. People buying the most expensive tickets would pay the highest tax, encouraging them to seek alternatives.

Whilst taxes differ between countries, typically the rates are the same across each travel class. Travellers buying expensive tickets, who are more likely to have higher incomes, pay a relatively low tax and are not currently discouraged from flying.

Dr Van Ewijk added: "An equitable approach to reducing airline emissions should not just deter travellers who can only afford the cheapest early bookings but also the big spenders who bankroll the airline. By assigning emissions based on ticket prices, and taxing those emissions, we can make sure everyone pays their fair share, and is equally encouraged to look for alternatives."

A ticket tax should also take into account the distance flown and the model and age of plane, which can indicate how polluting it is.

The authors used a dataset from the USA to test their fare-based allocation approach. They used the Airline Origin and Destination Survey database, which includes ticket fare data, origin and destination, travel class and fare per mile. From this, they calculated the distribution of ticket prices across all passengers on a typical flight.

Based on the price distribution, the authors allocated emissions to passengers, and compared the results with estimates from widely used emissions calculators. Since ticket prices vary strongly by time of booking, the emissions per passenger varied too, far more than on the basis of seat size and travel class.

Using an economic supply-demand model, the researchers estimated how a carbon tax on emissions would affect travellers, depending on whether the emissions the tax applied to were calculated from seat size and travel class, or the airfare. In all scenarios, a tax on emissions calculated from airfares had a more equitable effect because it reduced flying more evenly across income groups.

Read more at Science Daily

Pop-up electrode device could help with 3D mapping of the brain

Understanding the neural interface within the brain is critical to understanding aging, learning, disease progression and more. Existing methods for studying neurons in animal brains to better understand human brains, however, all carry limitations, from being too invasive to not detecting enough information. A newly developed, pop-up electrode device could gather more in-depth information about individual neurons and their interactions with each other while limiting the potential for brain tissue damage.

The researchers, co-led by Huanyu "Larry" Cheng, James L. Henderson, Jr. Memorial Associate Professor of Engineering Science and Mechanics in the College of Engineering, published their results in npj Flexible Electronics.

"It's a challenge to understand the connectivity in between the large number of neuron cells within the brain," Cheng said. "In the past, people developed a device that is placed directly on the cortex to detect information on the surface layer, which is less invasive. But without inserting the device into the brain, it's challenging to detect the intercortical information."

In response to this limitation, researchers developed probe-based electrodes that are inserted into the brain. The problem with this method is that it is not possible to get a 3D layout of the neurons and brain without doing multiple probes, which are difficult to place on a flexible surface and would be too damaging to the brain tissue.

"To address this issue, we use the pop-up design," Cheng said. "We can fabricate the sensor electrodes with resolution and performance comparable with the existing fabrication. But at the same time, we can pop them up into the 3D geometry before they are inserted into the brain. They are similar to the children's pop-up books: You have the flat shape, and then you apply the compressive force. It transforms the 2D into 3D. It provides a 3D device with the performance comparable with the 2D."

The researchers said that in addition to the unique design that pops up into three dimensions after being inserted into the brain, their device also uses a combination of materials in a way that had not been tried before. Specifically, they used polyethylene glycol, a previously known biocompatible coating material, to create stiffness -- a purpose for which it had not been used before.

"To insert the device in the brain, it needs to be stiff, but after the device is in the brain, it needs to be flexible," said co-corresponding author Ki Jun Yu of Yonsei University in the Republic of Korea. "So we used a biodegradable coating that provides a stiff outer layer on the device. Once the device is in the brain, that stiff coating dissolves, restoring the initial flexibility. Taking together the material structure and the geometry of this device, we'll be able to get input from the brain to study the 3D neuron connectivity."

Next steps for the research include iterating on the design to make it beneficial not only for gaining a better understanding of the brain but also for surgeries and disease treatments.

"In addition to animal studies, some applications of the device use could be operations or treatments for diseases where you may not need to get the device out, but you'll certainly want to make sure the device is biocompatible over a long period of time," Cheng said. "It is beneficial to design the structure as small, soft and porous as possible so that the brain tissue can penetrate into and be able to use the device as a scaffold to grow up on top of that, leading to a much better recovery. We also would like to use biodegradable material that can be dissolved after use."

Read more at Science Daily

Jan 25, 2023

Were galaxies much different in the early universe?

An array of 350 radio telescopes in the Karoo desert of South Africa is getting closer to detecting "cosmic dawn" -- the era after the Big Bang when stars first ignited and galaxies began to bloom.

In a paper accepted for publication in The Astrophysical Journal, the Hydrogen Epoch of Reionization Array (HERA) team reports that it has doubled the sensitivity of the array, which was already the most sensitive radio telescope in the world dedicated to exploring this unique period in the history of the universe.

While they have yet to actually detect radio emissions from the end of the cosmic dark ages, their results do provide clues to the composition of stars and galaxies in the early universe. In particular, their data suggest that early galaxies contained very few elements besides hydrogen and helium, unlike galaxies today.

When the radio dishes are fully online and calibrated, ideally this fall, the team hopes to construct a 3D map of the bubbles of ionized and neutral hydrogen as they evolved from about 200 million years to around 1 billion years after the Big Bang. The map could tell us how early stars and galaxies differed from those we see around us today, and how the universe as a whole looked in its adolescence.

"This is moving toward a potentially revolutionary technique in cosmology. Once you can get down to the sensitivity you need, there's so much information in the data," said Joshua Dillon, a research scientist in the University of California, Berkeley's Department of Astronomy and lead author of the paper. "A 3D map of most of the luminous matter in the universe is the goal for the next 50 years or more."

Other telescopes also are peering into the early universe. The new James Webb Space Telescope (JWST) has now imaged a galaxy that existed about 325 million years after the birth of the universe in the Big Bang. But the JWST can see only the brightest of the galaxies that formed during the Epoch of Reionization, not the smaller but far more numerous dwarf galaxies whose stars heated the intergalactic medium and ionized most of the hydrogen gas.

HERA seeks to detect radiation from the neutral hydrogen that filled the space between those early stars and galaxies and, in particular, determine when that hydrogen stopped emitting or absorbing radio waves because it became ionized.

The fact that the HERA team has not yet detected these bubbles of ionized hydrogen within the cold hydrogen of the cosmic dark age rules out some theories of how stars evolved in the early universe.

Specifically, the data show that the earliest stars, which may have formed around 200 million years after the Big Bang, contained few elements other than hydrogen and helium. This differs from the composition of today's stars, which have a variety of so-called metals -- the astronomical term for elements heavier than helium, ranging from lithium to uranium. The finding is consistent with the current model for how stars and stellar explosions produced most of the other elements.

"Early galaxies have to have been significantly different than the galaxies that we observe today in order for us not to have seen a signal," said Aaron Parsons, principal investigator for HERA and a UC Berkeley associate professor of astronomy. "In particular, their X-ray characteristics have to have changed. Otherwise, we would have detected the signal we're looking for."

The atomic composition of stars in the early universe determined how long it took to heat the intergalactic medium once stars began to form. Key to this is the high-energy radiation, primarily X-rays, produced by binary stars where one of them has collapsed to a black hole or neutron star and is gradually eating its companion. With few heavy elements, a lot of the companion's mass is blown away instead of falling onto the black hole, meaning fewer X-rays and less heating of the surrounding region.

The new data fit the most popular theories of how stars and galaxies first formed after the Big Bang, but not others. Preliminary results from the first analysis of HERA data, reported a year ago, hinted that those alternatives -- specifically, cold reionization -- were unlikely.

"Our results require that even before reionization and by as late as 450 million years after the Big Bang, the gas between galaxies must have been heated by X-rays. These likely came from binary systems where one star is losing mass to a companion black hole," Dillon said. "Our results show that if that's the case, those stars must have been very low 'metallicity,' that is, very few elements other than hydrogen and helium in comparison to our sun, which makes sense because we're talking about a period in time in the universe before most of the other elements were formed."

The Epoch of Reionization

The origin of the universe in the Big Bang 13.8 billion years ago produced a hot cauldron of energy and elementary particles that cooled for hundreds of thousands of years before protons and electrons combined to form atoms -- primarily hydrogen and helium. Looking at the sky with sensitive telescopes, astronomers have mapped in detail the faint variations in temperature from this moment -- what's known as the cosmic microwave background -- a mere 380,000 years after the Big Bang.

Aside from this relict heat radiation, however, the early universe was dark. As the universe expanded, the clumpiness of matter seeded galaxies and stars, which in turn produced radiation -- ultraviolet and X-rays -- that heated the gas between stars. At some point, hydrogen began to ionize -- it lost its electron -- and formed bubbles within the neutral hydrogen, marking the beginning of the Epoch of Reionization.

To map these bubbles, HERA and several other experiments are focused on a wavelength of light that neutral hydrogen absorbs and emits, but ionized hydrogen does not. Called the 21-centimeter line (a frequency of 1,420 megahertz), it is produced by the hyperfine transition, during which the spins of the electron and proton flip from parallel to antiparallel. Ionized hydrogen, which has lost its only electron, doesn't absorb or emit this radio frequency.

Since the Epoch of Reionization, the 21-centimeter line has been red-shifted by the expansion of the universe to a wavelength roughly 10 times as long -- about 2 meters, or 6 feet. HERA's rather simple antennas, constructed of chicken wire, PVC pipe and telephone poles, are 14 meters across in order to collect and focus this radiation onto detectors.
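The stretch follows directly from the redshift: the observed wavelength equals the rest wavelength multiplied by (1 + z). Below is a quick numerical check of the figures quoted in this article, using the two redshifts HERA analyzed.

# Quick check: the 21-cm line observed at redshift z is stretched by (1 + z).
C = 2.998e8                # speed of light, m/s
REST_WAVELENGTH = 0.21106  # 21-cm line in metres (1420.4 MHz)

for z in (7.9, 10.4):      # the two redshifts analyzed by HERA
    observed = REST_WAVELENGTH * (1 + z)
    freq_mhz = C / observed / 1e6
    print(f"z = {z}: wavelength = {observed:.2f} m, frequency = {freq_mhz:.0f} MHz")

This gives observed wavelengths of about 1.9 and 2.4 meters, consistent with the "about 2 meters" quoted above.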

"At two meters wavelength, a chicken wire mesh is a mirror," Dillon said. "And all the sophisticated stuff, so to speak, is in the supercomputer backend and all of the data analysis that comes after that."

The new analysis is based on 94 nights of observing in 2017 and 2018 with about 40 antennas -- phase 1 of the array. Last year's preliminary analysis was based on 18 nights of phase 1 observations.

The new paper's main result is that the HERA team has improved the sensitivity of the array by a factor of 2.1 for light emitted about 650 million years after the Big Bang (a redshift, or an increase in wavelength, of 7.9), and 2.6 for radiation emitted about 450 million years after the Big Bang (a redshift of 10.4).

The HERA team continues to improve the telescope's calibration and data analysis in hopes of seeing those bubbles in the early universe, which are about 1 millionth the intensity of the radio noise in the neighborhood of Earth. Filtering out the local radio noise to see the radiation from the early universe has not been easy.

"If it's Swiss cheese, the galaxies make the holes, and we're looking for the cheese," so far, unsuccessfully, said David Deboer, a research astronomer in UC Berkeley's Radio Astronomy Laboratory.

Extending that analogy, however, Dillon noted, "What we've done is we've said the cheese must be warmer than if nothing had happened. If the cheese were really cold, it turns out it would be easier to observe that patchiness than if the cheese were warm."

That mostly rules out cold reionization theory, which posited a colder starting point. The HERA researchers suspect, instead, that the X-rays from X-ray binary stars heated up the intergalactic medium first.

"The X-rays will effectively heat up the whole block of cheese before the holes will form," Dillon said. "And those holes are the ionized bits."

"HERA is continuing to improve and set better and better limits," Parsons said. "The fact that we're able to keep pushing through, and we have new techniques that are continuing to bear fruit for our telescope, is great."

Read more at Science Daily

Reducing steel corrosion vital to combating climate change

Every year, the United States spends nearly a trillion dollars fighting metallic corrosion, an electrochemical reaction that occurs when metals oxidize and begin to rust. By taking on this surprisingly insidious issue, researchers have now estimated how much corrosion is gradually worsening global carbon emissions.

Global steel production has been rising steadily for decades -- and because steel has poor resistance to corrosion, part of that demand is to replace steel used in construction materials that have become corroded over time, in everything from bridges to automobiles. Reducing the amount of steel that needs to be replaced due to corrosion could have measurable effects on how much greenhouse gases are produced to make steel, said Gerald Frankel, co-author of the study and a professor in materials science and engineering at The Ohio State University.

Though previous studies have estimated the current economic cost of corrosion to be about 3 to 4% of a nation's gross domestic product, this new study, led by Ohio State alum Mariano Iannuzzi, is the first to quantify the environmental impact associated with steel corrosion.

The study was recently published in the journal npj Materials Degradation.

"Given society's reliance on coal fuel, iron and steel production is one of the largest greenhouse gases emitters of any industry," said Frankel. "But most of the costs associated with the industry actually stem from the energy that goes into creating steel, and that energy is lost as the steel reverts to rust, which is similar to its original form of iron ore."

The time it takes steel to corrode largely depends on the severity of the environment and the alloy composition, but this environmentally expensive issue is only getting worse, said Frankel.

Using historical carbon dioxide intensity data to estimate yearly carbon dioxide levels from 1960 onward, the researchers found that in 2021, steel production accounted for 27% of the carbon emissions of the global manufacturing sector and about 10.5% of total global carbon emissions. Corroded steel replacement accounted for about 1.6 to 3.4% of emissions.

But there is some good news, the study noted. Due to regulations placed on the steel industry, technological advances in the steelmaking process have resulted in a 61% reduction in energy consumption over the last 50 years.

Despite this improvement, the results of the study are a call to action for policymakers and industry officials to amend and coordinate international policy regarding steel production and corrosion management, Frankel said.

"Coordinated international strategies, as well as decreasing global steel demand, by using best practices for corrosion mitigation, could better improve global corrosion management strategies and drastically reduce the rise in greenhouse gas emissions we're seeing due to repeatedly replacing corroded steel," he said.

If actions to improve steel's carbon footprint aren't taken soon, the study notes that greenhouse gas emissions produced by the steel industry could reach about 27.5% of the world's total carbon emissions by 2030, with corroded steel representing about 4 to 9% of that number. Such a result would make the goals set by the Paris Agreement to limit Earth's warming to 1.5 degrees Celsius as well as the U.S.'s own domestic climate goals almost completely unfeasible. The study notes that management strategies such as taking advantage of machine learning technologies could be one of the best chances we have to reduce Earth's carbon dioxide levels.

That said, if humans cannot meet these conditions, the consequences for Earth's climate will be dire, so more people need to be made aware that a low-carbon steel industry is needed to prevent such a dystopia, said Frankel.

Read more at Science Daily

A butterfly flaps its wings and scientists make jewelry

The further out in time, the more unreliable a weather forecast. That's because small variations in initial weather conditions can completely change the entire system, making it unpredictable. Put another way, in the "butterfly effect," an insect can flap its wings and create a microscopic change in initial conditions that leads to a hurricane halfway around the world.

This chaos is seen everywhere, from weather to labor markets to brain dynamics. And now, in the journal Chaos, by AIP Publishing, researchers from the University of Calabria explored how to turn the twisting, fractal structures behind the science into jewelry with 3D printing.

The jewelry shapes are based on the Chua circuit, a simple electronic system that provided the first physical, mathematical, and experimental proof of chaos. Unlike an ordinary oscillator circuit, which produces a regularly repeating current, Chua's circuit produces oscillations that never repeat.
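In its standard dimensionless form, Chua's circuit is a set of three coupled differential equations whose trajectories trace out the well-known "double scroll" strange attractor. The sketch below integrates them with a basic Runge-Kutta scheme, using commonly cited textbook parameter values; the parameters behind the team's jewelry designs may well differ.

# Sketch: the dimensionless Chua equations, integrated with a simple
# fourth-order Runge-Kutta step to trace the "double scroll" attractor.
# Textbook parameter values; the jewelry designs may use different ones.
def chua_rhs(state, alpha=15.6, beta=28.0, m0=-8/7, m1=-5/7):
    x, y, z = state
    # Piecewise-linear nonlinearity of Chua's diode.
    h = m1 * x + 0.5 * (m0 - m1) * (abs(x + 1) - abs(x - 1))
    return (alpha * (y - x - h), x - y + z, -beta * y)

def rk4_step(state, dt=0.005):
    k1 = chua_rhs(state)
    k2 = chua_rhs(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = chua_rhs(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = chua_rhs(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt * (a + 2*b + 2*c + d) / 6
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

state = (0.1, 0.0, 0.0)
trajectory = []
for _ in range(20000):
    state = rk4_step(state)
    trajectory.append(state)  # (x, y, z) points tracing the attractor
print("final point:", tuple(round(v, 3) for v in state))

The accumulated (x, y, z) points form the twisting double-scroll curve; exported as a 3D path, a trajectory like this is the kind of shape the team turned into printable jewelry.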

"These chaotic configurations, called strange attractors, are complex structures that had never been observed before," said author Eleonora Bilotta. "The depictions of such structures are strikingly beautiful, continually shifting when the point of view is changing. Jewelry seemed to be the best way to interpret the beauty of chaotic shapes."

At first, the team tried to employ goldsmiths to create prototypes of the twisting, arcing patterns. But the chaotic forms proved too difficult to manufacture with traditional methods. In contrast, additive printing allows for the necessary detail and structure. By 3D-printing the jewelry, the team created a counter-mold for a goldsmith to use as a cast.

"Seeing the chaotic shapes transformed into real, polished, shiny, physical jewelry was a great pleasure for the whole team. Touching and wearing them was also extremely exciting," said Bilotta. "We think it is the same joy that a scientist feels when her theory takes form, or when an artist finishes a painting."

The jewelry can also be used as an educational tool, providing students the ability to develop their scientific knowledge and artistic creativity. By building Chua's circuit, they can manipulate chaos and discover the extreme sensitivity to initial conditions. While designing the jewelry before sending it to be printed, they can tweak the parameters to generate different shapes according to personal taste.

Read more at Science Daily

Neuronal molecule makes prostate cancer more aggressive

Prostate cancer is the second most common cancer and the second leading cause of cancer death among American men. Now, researchers have discovered key molecular players that drive prostate cancer to progress into a highly aggressive form of the disease called neuroendocrine prostate cancer that currently has no effective treatment. The finding uncovers new avenues to explore for therapeutics to treat neuroendocrine prostate cancer.

"We have found novel pathways that promote neuroendocrine prostate cancer," says senior author Lucia R. Languino, PhD, a professor in the department of Pharmacology, Physiology and Cancer Biology and director of the Genetics, Genomics, and Cancer Biology PhD Program at Thomas Jefferson University. She and her team published the new research online on November 7, 2022 in the journal Scientific Reports.

Most prostate cancers are a type of disease called prostate adenocarcinoma. Other types of prostate cancer, including neuroendocrine tumors, are rare. However, unlike prostate adenocarcinoma, neuroendocrine prostate cancer is very aggressive and can quickly spread to other parts of the body. Treatments that are effective for adenocarcinomas in the prostate do not work against neuroendocrine prostate cancers.

Adenocarcinoma prostate cancers can progress into neuroendocrine prostate cancer. Until now, how this transition occurs has been a mystery.

To better understand how neuroendocrine prostate cancer develops, Dr. Languino and colleagues looked for biomarkers of the disease. In previous work, they discovered that a molecule known as aVb3 integrin is abundant in mice and humans with neuroendocrine prostate cancer, but missing in prostate adenocarcinoma.

Looking for molecules unique to neuroendocrine prostate cancer, the researchers found that aVb3 integrin expression in prostate cancer cells bumped up the expression of a known marker of neuroendocrine prostate cancer and significantly increased the expression of a molecule called Nogo receptor 2 (NgR2).

The finding "was a big discovery," Dr. Languino says, who is also a researcher with the Sidney Kimmel Cancer Center -- Jefferson Health. That's because NgR2 is a protein found in nerve cells, where it contributes to neuronal functions. It has never before been studied in cancer, of any kind.

Dr. Languino and her colleagues wanted to find out what this molecule, a neuronal protein, is doing in cancer.

An initial experiment revealed that NgR2 binds the aVb3 integrin. The scientists also saw that in mice with neuroendocrine prostate tumors, aVb3 integrin and NgR2 were both present in the primary tumor and in cancerous lesions that had formed in the lungs of the animals. A follow-up experiment made it clear that both aVb3 integrin and NgR2 are necessary for neuroendocrine prostate cancers.

When Dr. Languino and her team lowered the amount of NgR2 in neuroendocrine prostate cancer cells, neuroendocrine markers also decreased. The results suggest that NgR2 plays a role in the development of neuroendocrine prostate cancer. Lowering the amount of NgR2 also reduced the ability of cancer cells to grow and move, indicating that NgR2 may have a hand in cancer spreading to other parts of the body, in a process known as metastasis. Metastases are often what makes cancers fatal.

"These two molecules, aVb3 integrin and NgR2, seem to create a combination that is lethal," Dr. Languino says.

She and her colleagues are now looking for a molecule or antibody that would block the effect of NgR2, or the aVb3 integrin/NgR2 complex, to inhibit their ability to promote neuroendocrine prostate cancer growth and development, and make the cancer more susceptible to therapy.

Read more at Science Daily

Jan 24, 2023

Darkest view ever of interstellar ices

An international team including Southwest Research Institute, Leiden University and NASA used observations from the James Webb Space Telescope (JWST) to achieve the darkest ever view of a dense interstellar cloud. These observations have revealed the composition of a virtual treasure chest of ices from the early universe, providing new insights into the chemical processes of one of the coldest, darkest places in the universe as well as the origins of the molecules that make up planetary atmospheres.

"The JWST allowed us to study ices that exist on dust grains within the darkest regions of interstellar molecular clouds," said SwRI Research Scientist Dr. Danna Qasim, co-author of the study published in Nature Astronomy. "The clouds are so dense that these ices have been mostly protected from the harsh radiation of nearby stars, so they are quite pristine. These are the first ices to be formed and also contain biogenic elements, which are important to life."

NASA's JWST has a 6.5-meter-wide mirror providing remarkable spatial resolution and sensitivity, optimized for infrared light. As a result, the telescope has been able to image the densest, darkest clouds in the universe for the first time.

"These observations provide new insights into the chemical processes in one of the coldest, darkest places in the universe to better understand the molecular origins of protoplanetary disks, planetary atmospheres, and other Solar System objects," Qasim said.

Most interstellar ices contain very small amounts of elements like oxygen and sulfur. Qasim and her co-authors seek to understand the lack of sulfur in interstellar ices.

"The ices we observed only contain 1% of the sulfur we're expecting. 99% of that sulfur is locked-up somewhere else, and we need to figure out where in order to understand how sulfur will eventually be incorporated into the planets that may host life," Qasim explained.

In the study, Qasim and colleagues propose that the sulfur may be locked in reactive minerals like iron sulfide, which may react with ices to form the sulfur-bearing ices observed.

"Iron sulfide is a highly reactive mineral that has been detected in the accretion disks of young stars and in samples returned from comets. It's also the most common sulfide mineral in lunar rocks," Qasim said. "If sulfur is locked-up in these minerals, that could explain the low amount of sulfur in interstellar ices, which has implications for where sulfur is stored in our Solar System. For example, the atmosphere of Venus has sulfur-containing molecules, in which the sulfur could have partially come from interstellar-inherited minerals."

From Science Daily

Agriculture linked to changes in age-independent mortality in North America

The transition to agriculture from hunting and gathering in pre-colonial North America led to changes in age-independent mortality, or mortality caused by factors that are not associated with age, according to a new study by a Penn State-led research team. The team found that the intensification of crop use occurred in two phases, the first of which led to a decline in human age-independent mortality, while the second is associated with a rise in it. The study is the first to tie patterns of age-independent mortality to food production.

"This study tells the story of our shared human experience," said George Milner, distinguished professor of anthropology at Penn State and lead author. "We have several examples around the world where we see a move toward crop domestication as an independent event -- eastern North America, particularly the midcontinent, being one of them, but so too the Fertile Crescent in the Middle East. Also, there are demographic changes happening. This paper addresses the relationship between the move toward agriculture and demographic change."

The researchers examined previously published data to identify general trends in archaeobotanical samples, or the remains of plants in the archaeological record, and skeletal samples from sites across eight states stretching from Illinois to northern Alabama. They wanted to study the relationship between the domestication of crops and an index that uses skeletal data to capture the frequency of juveniles aged five to 19 years old relative to all individuals aged five or more. Anthropologists normally use the index to measure fertility rates and population growth, but the new work shows it is more responsive to age-independent mortality.
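In symbols, the index is simply a ratio of dated skeletons. A sketch of the standard definition (the paper's own notation may differ) is

$$ P_{5\text{-}19} = \frac{D_{5\text{-}19}}{D_{5+}}, $$

where $D_{5\text{-}19}$ counts individuals who died between ages 5 and 19 and $D_{5+}$ counts all individuals who died at age 5 or older.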

Mortality models, including those for pre-industrial societies, contain three components: juvenile mortality, which declines as children get older; adult mortality, where the probability of dying increases with advancing age; and age-independent mortality, an equal probability of dying for members of all age groups, which might occur in extreme events like food shortages, epidemics or warfare.
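A common way to write such a three-component hazard is the Siler model, given here as a generic illustration rather than as the study's own formulation:

$$ \mu(a) = \underbrace{a_1 e^{-b_1 a}}_{\text{juvenile}} + \underbrace{a_2}_{\text{age-independent}} + \underbrace{a_3 e^{b_3 a}}_{\text{adult}}, $$

where $a$ is age and all parameters are positive. The constant term $a_2$ raises the death rate equally at every age, which is why shocks such as famines, epidemics and warfare register in it.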

The researchers studied the archaeobotanical data to identify where the record showed an increase in the consumption of domesticated crops compared to foraged foods like nuts. They also examined skeletal data to identify decreases or increases in the indicator of age-independent mortality. The index focuses on individuals between five and 19 years old because in human populations that age range is characterized by low mortality relative to other age groups. Increases in mortality for this age group would indicate the occurrence of events like famines or conflict.

The researchers identified a strong correlation between crop domestication and changing age-independent mortality rates. Crop domestication happened in two stages in pre-colonial North America, with a decrease in age-independent mortality noted during the first stage of crop domestication and a rise during the second stage. The researchers reported their findings in the Proceedings of the National Academy of Sciences.

"What we've found is the index that has traditionally been interpreted as a fertility and population growth indicator is more tightly correlated to age-independent mortality, which reflects the number of deaths in the part of the age distribution where very few people die," said Milner. "This means that the pattern of first adoption of agriculture, seen elsewhere in the world and observed in eastern North America as well, coincides with lower age-independent mortality. Basically, it's good times, and that's what we see culturally."

The first stage of agricultural intensification in North America, which included the cultivation of plants such as squash, sunflower and other native plants, occurred approximately 2,000 years ago, during the Middle Woodland period, which lasted until about A.D. 500, said Milner. Indigenous societies flourished during this time. They established long-distance exchange networks, had an incredibly rich ceremonial life, and constructed big mounds and earthwork complexes.

The archaeological record shows that in the centuries just before A.D. 1000, and from that time onward, there was an increase in warfare. During this time Indigenous societies began cultivating maize and beans, and a number of new cultural changes occurred, including the initial development of powerful chiefdom societies. Age-independent mortality increased during this period, presumably due to conflict and the spread of diseases from higher numbers of individuals living near one another.

"The overall pattern seen in the demographic picture of North American pre-European contact is similar to other datasets from around the world," Milner said. "The entire story makes perfect sense in terms of agricultural productivity, demographic change and cultural developments, including change over time in conflict and sociopolitical systems."

The study links, for the first time, a worldwide pattern to age-independent mortality and agricultural developments, according to Milner.

Read more at Science Daily

Young chimpanzees and human teens share risk-taking behaviors

Adolescent chimpanzees share some of the same risk-taking behaviors as human teens, but they may be less impulsive than their human counterparts, according to research published by the American Psychological Association. The study gets at age-old nature/nurture questions about why adolescents take more risks: because of environment or because of biological predispositions?

"Adolescent chimpanzees are in some sense facing the same psychological tempest that human teens are," said lead researcher Alexandra Rosati, PhD, an associate professor of psychology and anthropology at the University of Michigan. "Our findings show that several key features of human adolescent psychology are also seen in our closest primate relatives."

The researchers conducted two tests involving food rewards with 40 wild-born chimpanzees at a sanctuary in the Republic of Congo. The chimpanzees voluntarily participated in the games in order to receive food treats. The research was published online in the Journal of Experimental Psychology: General.

Chimpanzees can live to be 50 years old and go through adolescence between the ages of about 8 and 15. Like humans, chimpanzees show rapid changes in hormone levels during adolescence, start forming new bonds with peers, show increases in aggression and compete for social status.

In the first test, adolescent and adult chimpanzees could choose between two containers in a gambling task. One container always contained peanuts, a food that chimpanzees somewhat like. The other container concealed either an unliked food -- a cucumber slice -- or a favorite food -- a banana slice. The chimpanzees could play it safe and get the peanuts, or take a chance for some coveted banana at the risk of ending up with unappetizing cucumber.

The chimpanzees' emotional reactions and vocalizations were recorded, including moans, whimpers, screams, banging on the table or scratching themselves. Saliva samples also were collected to track hormone levels.

During several rounds of the test, adolescent chimpanzees took the risky option more often than adult chimpanzees, but adolescents and adults had similar negative reactions when they received cucumber.

The second test, modeled after the famous "marshmallow test" with human children, examined delayed gratification where chimpanzees could receive one banana slice immediately or wait for one minute to receive three slices.

Both adolescent and adult chimpanzees chose the greater delayed reward at a similar rate. Human teens tend to be more impulsive than adults, so they would be more likely to take the immediate reward.

"Prior research indicates that chimpanzees are quite patient compared with other animals, and our study shows that their ability to delay gratification is already mature at a fairly young age, unlike in humans," Rosati said.

However, adolescent chimpanzees weren't happy about waiting for the extra banana slices and they threw more tantrums during the one-minute delay than adult chimpanzees.

Read more at Science Daily

Genome editing procedures optimized

In the course of optimising key procedures of genome editing, researchers from the department of Developmental Biology / Physiology at the Centre for Organismal Studies of Heidelberg University have succeeded in substantially improving the efficiency of molecular genetic methods such as CRISPR/Cas9 and related systems, and in broadening their areas of application. Together with colleagues from other disciplines, the life scientists fine-tuned these tools to enable, inter alia, effective genetic screening for modelling specific gene mutations. In addition, initially inaccessible DNA sequences can now be modified. According to Prof. Dr Joachim Wittbrodt, this opens up extensive new areas of work in basic research and, potentially, therapeutic application.

Genome editing means the deliberate altering of DNA with molecular genetic methods. It is used to breed plants and animals, but also in basic medical and biological research. The most common procedures include the "gene scissors" CRISPR/Cas9 and its variants known as base editors. In both cases, enzymes have to be transported into the nucleus of the target cell. Upon arrival, the CRISPR/Cas9 system cuts the DNA at specific sites, which causes a double strand break. New DNA segments can then be inserted at that site. Base editors use a similar molecular mechanism but they do not cut the DNA double strand. Instead, an enzyme coupled with the Cas9 protein performs a targeted exchange of nucleotides -- the basic building blocks of the genome. In three successive studies, Prof. Wittbrodt's team succeeded in considerably enhancing the efficiency and applicability of these methods.

One challenge in using CRISPR/Cas9 is delivering the required Cas9 enzymes efficiently into the nucleus. "The cell has an elaborate 'bouncer' mechanism. It distinguishes between proteins that are allowed to translocate into the nucleus and those that are supposed to stay in the cytoplasm," explains Dr Tinatini Tavhelidse-Suck from Prof. Wittbrodt's team. Access is granted by a tag of a few amino acids that functions like an "admission ticket." The scientists have now devised a kind of universally valid "VIP admission ticket" that lets enzymes equipped with it into the nucleus very quickly. They have named it the "high efficiency-tag," or "hei-tag" for short. "Other proteins that have to penetrate the cell nucleus are also more successful with 'hei-tag'," notes Dr Thomas Thumberger, who is also a researcher at the Centre for Organismal Studies (COS). In cooperation with pharmacologists from Heidelberg University, the team showed that Cas9 equipped with the "hei-tag" ticket enables highly efficient, targeted genome alterations not only in the model organism medaka, the Japanese ricefish (Oryzias latipes), but also in mammalian cell cultures and mouse embryos.

In a further study, the Heidelberg scientists showed that base editors operate highly efficiently in the living organism and are even suited to genetic screening. In an experiment with Japanese ricefish, they showed that these locally limited, targeted modifications of individual building blocks of the DNA achieve an outcome that is otherwise only obtained by the comparatively laborious breeding of organisms with altered genes. The research team at COS, in cooperation with Dr Dr Jakob Gierten, a paediatric cardiologist at Heidelberg University Hospital, focused on certain genetic mutations suspected of triggering congenital heart defects in humans. By modifying individual building blocks of the DNA of the relevant genes in the model organism, the scientists were able to imitate and study the described heart defects in fish embryos. The targeted intervention led to visible changes in the heart during the early stages of fish embryonic development, say Bettina Welz and Dr Alex Cornean, two of the first authors of the study from Prof. Wittbrodt's team. That enabled the researchers to confirm the original suspicion and establish a causal connection between genetic alteration and clinical symptoms.

The precise intervention in the genome of the fish embryos was made possible by specially developed software, ACEofBASEs, which is available online. It identifies genomic sites where editing will most efficiently produce the desired changes in the target genes and the resulting proteins. The scientists say that the Japanese ricefish is an excellent genetic model organism for modelling mutations like those identified in patients. "Our method enables an efficient screening analysis and could therefore offer a starting point for developing individualised medical treatment," according to Jakob Gierten.

A third study, again from the Wittbrodt group, addresses a limitation of base editors: for such an editor to bind the DNA of a target cell, a certain sequence motif must be present, called the Protospacer Adjacent Motif, or PAM for short. "If this motif is lacking near the DNA building block to be changed, it is impossible to exchange nucleotides," explains Dr Thumberger. A team under his direction has now found a way around this limitation by using two base editors in succession in a single cell. In an initial step, the first editor generates a new DNA binding motif for a second base editor, applied at the same time, which can then edit a site that was previously inaccessible. This staggered use turned out to be highly efficient, explains Kaisa Pakari, the first author of the study. With this trick, the Heidelberg scientists increased the number of possible application sites of established base editors by 65 percent, so that DNA sequences that were previously inaccessible can now also be modified.
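To make the PAM constraint concrete, here is a simplified sketch in code. It assumes the common SpCas9 "NGG" PAM and an editing window at protospacer positions 4-8, counted from the PAM-distal end; the editors, PAMs and windows used in Heidelberg may well differ.

```python
# Simplified: can a base editor reach a target position on this strand?
# Assumes a 20-nt protospacer followed by an "NGG" PAM, and an editing
# window at protospacer positions 4-8 -- both are illustrative assumptions.
WINDOW = range(4, 9)  # 1-based positions within the protospacer

def reachable(seq: str, target: int) -> bool:
    """True if some 20-nt protospacer + NGG PAM puts `target` in the window."""
    for start in range(len(seq) - 22):       # protospacer = seq[start:start+20]
        pam = seq[start + 20:start + 23]
        if pam[1:3] != "GG":                 # the "NGG" check
            continue
        pos = target - start + 1             # 1-based position in protospacer
        if pos in WINDOW:
            return True
    return False

seq = "ACGTCCATGCACCTGAGGTTTGGAAGCTTACGGACGTTCCAGG"
print(reachable(seq, target=10))
# The third study's trick: if this returns False, a first base editor can
# create a new "GG" at a suitable spot, after which a second, simultaneously
# applied editor finds a valid PAM and reaches the once-inaccessible site.
```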

"Optimising the existing tools for genome editing and their fine-tuned application results in enormously varied possibilities for basic research and, potentially, novel therapeutic approaches," Joachim Wittbrodt underlines.

Read more at Science Daily

Jan 23, 2023

Massive fuel-hungry black holes feed off intergalactic gas

Research led by the University of Southampton has revealed how supermassive black holes (SMBHs) are feeding off gas clouds which reach them by travelling hundreds of thousands of light years from one galaxy to another.

An international team of scientists has shown there is a crucial link between the interaction of neighbouring galaxies and the enormous amount of gas needed to 'fuel' these giant, super-dense, space phenomena. Their findings are due to be published in the journal Nature Astronomy.

A black hole can be created when a star collapses, squeezing matter into a relatively tiny space. This increases the force of gravity to a point where nothing can escape, not even light -- hence the name.

Some black holes are gigantic, with masses millions of times greater than our sun, emitting enormous amounts of energy. These are known as 'supermassive black holes' and exactly how they are formed or gain enough fuel to power themselves is still a mystery.

Astrophysicist and lead researcher from the University of Southampton, Dr Sandra Raimundo, comments: "Supermassive black holes fuel their activity by, in part, the gradual accumulation of gas from the environment around them. Supermassive black holes can make the centres of galaxies shine very brightly when they capture gas and it's thought this process can be a major influence on the way that galaxies look today. How SMBHs get enough fuel to sustain their activity and growth still puzzles astronomers, but the work we have carried out provides a step towards understanding this."

The Southampton scientist, working with researchers at the universities of Copenhagen and California, used data from the 4-metre Anglo-Australian Telescope in New South Wales, Australia to study the orbits of gas and stars in a large sample of more than 3000 galaxies. They identified those with the presence of what is known as 'misaligned' gas -- in other words, gas which rotates in a different direction from the stars in the galaxy, signalling a past galaxy interaction. They then found that galaxies with misaligned gas had a higher fraction of active supermassive black holes.
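As a rough sketch of the selection step: misalignment is typically quantified as the difference between the stellar and gas kinematic position angles, with galaxies above some threshold flagged as misaligned. The 30-degree cut below is a common convention in the field, not necessarily the study's own.

```python
def misalignment_deg(pa_stars: float, pa_gas: float) -> float:
    """Smallest angle (0-180 deg) between the two kinematic position angles."""
    delta = abs(pa_stars - pa_gas) % 360.0
    return min(delta, 360.0 - delta)

# Hypothetical galaxy: stars rotate at PA 40 deg, gas at PA 205 deg.
dpa = misalignment_deg(40.0, 205.0)
print(dpa, "misaligned" if dpa >= 30.0 else "aligned")  # 165.0 misaligned
```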

The results showed a clear link between misaligned gas and supermassive black hole activity -- suggesting the gas is transferred where two galaxies meet, meanders vast distances through space and then succumbs to the huge gravitational forces of the supermassive black hole -- pulled in and swallowed up as a vital source of fuel. Astronomers have long suspected that a merger with another galaxy could provide this source of gas, but direct evidence for this has been elusive.

Dr Raimundo explains: "The work that we carried out shows the presence of gas that is misaligned from stars is associated with an increase in the fraction of active supermassive black holes. Since misaligned gas is a clear sign of a past interaction between two galaxies, our work shows that galaxy interactions provide fuel to power active supermassive black holes.

"This is the first time that a direct connection has been observed between the formation and presence of misaligned gas and the fuelling of active supermassive black holes."

Dr Marianne Vestergaard, a co-author of the study, highlights: "What is exciting about these observations is that we can now, for the very first time, identify the captured gas and trace it all the way to the centre where the black hole is devouring it."

Read more at Science Daily

Bacteria really eat plastic

The bacterium Rhodococcus ruber eats and actually digests plastic. This has been shown in laboratory experiments by PhD student Maaike Goudriaan at Royal Netherlands Institute for Sea Research (NIOZ). Based on a model study with plastic in artificial seawater in the lab, Goudriaan calculated that bacteria can break down about one percent of the fed plastic per year into CO2 and other harmless substances. "But," Goudriaan emphasizes, "this is certainly not a solution to the problem of the plastic soup in our oceans. It is, however, another part of the answer to the question of where all the 'missing plastic' in the oceans has gone."

Special plastic

Goudriaan had a special plastic manufactured especially for these experiments with a distinct form of carbon (13C) in it. When she fed that plastic to bacteria after pretreatment with "sunlight" -- a UV lamp -- in a bottle of simulated seawater, she saw that special version of carbon appear as CO2 above the water. "The treatment with UV light was necessary because we already know that sunlight partially breaks down plastic into bite-sized chunks for bacteria," the researcher explains.

Proof of principle

"This is the first time we have proven in this way that bacteria actually digest plastic into CO2 and other molecules," Goudriaan states. It was already known that the bacterium Rhodococcus ruber can form a so-called biofilm on plastic in nature. It had also been measured that plastic disappears under that biofilm. "But now we have really demonstrated that the bacteria actually digest the plastic."

Underestimate

When Goudriaan calculates the total breakdown of plastic into CO2, she estimates that the bacteria can break down about one percent of the available plastic per year. "That's probably an underestimate," she adds. "We only measured the amount of carbon-13 in CO2, so not in the other breakdown products of the plastic. There will certainly be 13C in several other molecules, but it's hard to say what part of that was broken down by the UV light and what part was digested by the bacteria."
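The arithmetic behind a figure like that is a simple carbon mass balance: labelled CO2 recovered per year divided by labelled plastic carbon supplied. The numbers below are invented placeholders, not Goudriaan's measurements; only the form of the calculation is the point.

```python
# Hypothetical mass balance for 13C-labelled plastic (all values assumed).
plastic_c_mg = 100.0            # labelled carbon supplied as plastic
co2_c_mg_per_day = 0.0027       # labelled carbon recovered daily as 13CO2

fraction_per_year = co2_c_mg_per_day * 365.0 / plastic_c_mg
print(f"{fraction_per_year:.1%} of plastic carbon mineralized per year")  # 1.0%
```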

No solution

Even though marine microbiologist Goudriaan is very excited about the plastic-eating bacteria, she stresses that microbial digestion is not a solution to the huge problem of all the plastic floating on and in our oceans. "These experiments are mainly a proof of principle. I see it as one piece of the jigsaw in the question of where all the plastic that disappears into the oceans ends up. If you try to trace all our waste, a lot of plastic is lost. Digestion by bacteria could possibly provide part of the explanation."

From lab to mudflats

To discover whether 'wild' bacteria also eat plastic 'in the wild', follow-up research needs to be done. Goudriaan has already done some pilot experiments with real seawater and sediment collected from the Wadden Sea floor. "The first results of these experiments hint at plastic being degraded, even in nature," she says. "A new PhD student will have to continue that work. Ultimately, of course, you hope to calculate how much plastic in the oceans really is degraded by bacteria. But much better than cleaning up is prevention. And only we humans can do that," Goudriaan says.

Read more at Science Daily

'Smart' walking stick could help visually impaired with groceries, finding a seat

Engineers at the University of Colorado Boulder are tapping into advances in artificial intelligence to develop a new kind of walking stick for people who are blind or visually impaired.

Think of it as assistive technology meets Silicon Valley.

The researchers say that their "smart" walking stick could one day help blind people navigate tasks in a world designed for sighted people -- from shopping for a box of cereal at the grocery store to picking a private place to sit in a crowded cafeteria.

"I really enjoy grocery shopping and spend a significant amount of time in the store," said Shivendra Agrawal, a doctoral student in the Department of Computer Science. "A lot of people can't do that, however, and it can be really restrictive. We think this is a solvable problem."

In a study published in October, Agrawal and his colleagues in the Collaborative Artificial Intelligence and Robotics Lab got one step closer to solving it.

The team's walking stick resembles the white-and-red canes that you can buy at Walmart. But it also includes a few add-ons: Using a camera and computer vision technology, the walking stick maps and catalogs the world around it. It then guides users by using vibrations in the handle and with spoken directions, such as "reach a little bit to your right."

The device isn't supposed to be a substitute for designing places like grocery stores to be more accessible, Agrawal said. But he hopes his team's prototype will show that, in some cases, AI can help millions of Americans become more independent.

"AI and computer vision are improving, and people are using them to build self-driving cars and similar inventions," Agrawal said. "But these technologies also have the potential to improve quality of life for many people."

Take a seat

Agrawal and his colleagues first explored that potential by tackling a familiar problem: Where do I sit?

"Imagine you're in a café," he said. "You don't want to sit just anywhere. You usually take a seat close to the walls to preserve your privacy, and you usually don't like to sit face-to-face with a stranger."

Previous research has suggested that making these kinds of decisions is a priority for people who are blind or visually impaired. To see if their smart walking stick could help, the researchers set up a café of sorts in their lab -- complete with several chairs, patrons and a few obstacles.

Study subjects strapped on a backpack with a laptop in it and picked up the smart walking stick. They swiveled to survey the room with a camera attached near the cane handle. Like a self-driving car, algorithms running inside the laptop identified the various features in the room, then calculated the route to an ideal seat.
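The article does not spell out the planner, but a toy version of the seat-scoring idea might look like the following. Every feature, weight and name here is invented for illustration; the real system works from camera-built maps, not hand-entered coordinates.

```python
import math

def facing(seat, person, fov_deg=60.0):
    """True if `person` falls within the seat's forward field of view."""
    dx, dy = person[0] - seat["pos"][0], person[1] - seat["pos"][1]
    angle = math.degrees(math.atan2(dy, dx)) - seat["heading_deg"]
    angle = (angle + 180.0) % 360.0 - 180.0   # wrap to [-180, 180)
    return abs(angle) <= fov_deg / 2.0

def seat_score(seat, walls, people, w_wall=1.0, w_face=2.0):
    """Higher is better: near a wall, facing no one. A toy heuristic."""
    dist_wall = min(math.dist(seat["pos"], w) for w in walls)
    facing_penalty = sum(1.0 for p in people if facing(seat, p))
    return -w_wall * dist_wall - w_face * facing_penalty

seats = [{"pos": (1.0, 0.5), "heading_deg": 90.0},
         {"pos": (3.0, 3.0), "heading_deg": 0.0}]
walls = [(0.0, 0.5), (6.0, 3.0)]
people = [(4.5, 3.0)]
best = max(seats, key=lambda s: seat_score(s, walls, people))
print(best)  # picks the wall-adjacent seat that faces no one
```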

The team reported its findings this fall at the International Conference on Intelligent Robots and Systems in Kyoto, Japan. Researchers on the study included Bradley Hayes, assistant professor of computer science, and doctoral student Mary Etta West.

The study showed promising results: Subjects were able to find the right chair in 10 out of 12 trials with varying levels of difficulty. So far, the subjects have all been sighted people wearing blindfolds. But the researchers plan to evaluate and improve their device by working with people who are blind or visually impaired once the technology is more dependable.

"Shivendra's work is the perfect combination of technical innovation and impactful application, going beyond navigation to bring advancements in underexplored areas, such as assisting people with visual impairment with social convention adherence or finding and grasping objects," Hayes said.

Let's go shopping

Next up for the group: grocery shopping.

In new research, which the team hasn't yet published, Agrawal and his colleagues adapted their device for a task that can be daunting for anyone: finding and grasping products in aisles filled with dozens of similar-looking and similar-feeling choices.

Again, the team set up a makeshift environment in their lab: this time, a grocery shelf stocked with several different kinds of cereal. The researchers loaded a database of product photos, such as boxes of Honey Nut Cheerios or Apple Jacks, into their software. Study subjects then used the walking stick to scan the shelf, searching for the product they wanted.

"It assigns a score to the objects present, selecting what is the most likely product," Agrawal said. "Then the system issues commands like 'move a little bit to your left.'"

He added that it will be a while before the team's walking stick makes it into the hands of real shoppers. The group, for example, wants to make the system more compact, designing it so that it can run off a standard smartphone attached to a cane.

But the human-robot interaction researchers also hope that their preliminary results will inspire other engineers to rethink what robotics and AI are capable of.

Read more at Science Daily

We need to learn to live with less steel

Steel is one of the most important materials in the world, integral to the cars we drive, the buildings we inhabit, and the infrastructure that allows us to travel from place to place. Steel is also responsible for 7% of global greenhouse gas emissions. In 2021, 45 countries made a commitment to pursue near-zero-emission steel in the next decade. But how feasible is it to produce the steel society needs with zero emissions?

A new study focused on the Japanese steel industry shows that if we are truly committed to reaching zero emissions, we must be prepared for a scenario where the amount of steel we can produce is lower. Japan has set a target for a 46% reduction in emissions from steel by 2030, and zero emissions by 2050. So far, the roadmap for achieving this relies heavily on future innovations in technology. Hope is held out for developments in carbon capture and storage (CCS) and hydrogen-based technologies.

In the study, Dr. Takuma Watari, a researcher at the National Institute for Environmental Studies, Japan, currently working with the University of Cambridge, argues that there is no silver bullet. He says that current plans to cut carbon emissions underestimate how difficult it will be to develop CCS and hydrogen technologies and deploy them widely: "These technologies still face serious technical, economic, and social challenges, and have yet to be implemented at scale. And importantly, it is highly uncertain whether there will be sufficient non-emitting electricity to use these technologies." We need to confront the possibility that technological innovations might not be ready in time to allow us to maintain current levels of steel production whilst cutting emissions to zero.

The research involved mapping the current flows of steel in Japan's industry and using a model to explore how the industry might change if a strict carbon budget were applied in future. Dr. Watari explains that with current practice, the quantity and quality of steel produced would dramatically decrease under a zero-emission carbon budget. This is because of a lack of resources and the practice of downcycling, in which scraps of steel containing impurities are used to make new products. It is difficult to remove these impurities, so the new products have different quality and functionality from the original steel.
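The logic of a budget-constrained supply model can be shown with back-of-the-envelope numbers. The emission intensities below are rough placeholder figures of the kind found in the literature, not the study's inputs.

```python
# Illustrative only: steel output permitted by a carbon budget, per route.
# Intensities are placeholder values in tonnes of CO2 per tonne of steel.
INTENSITY = {
    "BF-BOF (ore-based)": 2.0,           # conventional blast furnace route
    "scrap EAF (today's grid)": 0.4,     # electric arc furnace, fossil power
    "scrap EAF (zero-carbon power)": 0.0,
}

def allowed_output(budget_t_co2: float, route: str) -> float:
    intensity = INTENSITY[route]
    return float("inf") if intensity == 0.0 else budget_t_co2 / intensity

for route in INTENSITY:
    print(route, allowed_output(0.0, route))
# Under a zero budget, every emitting route collapses to zero output, and
# supply is then capped by scrap availability and scrap quality -- the
# downcycling constraint behind the limited quantity and quality reported.
```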

According to Dr. Watari, "zero-emission steel production is possible by 2050, but in limited quantity and quality compared to current total production. This is due to the limited availability of zero-emission compatible resources and downcycling practices of scrap steel."

The research indicates that with a carbon budget of zero emissions, the production of steel goods would be dramatically restricted compared to today, reaching about half the current levels at best. In this case, higher-quality steel production (e.g., sheet steel) would be especially hard hit.

The implication is clear. It is not enough to rely on a technological silver bullet materialising to transform the supply of steel. We also need to look seriously at strategies to reduce demand by shifting our culture of steel use and improving our material efficiency. We also need to pursue upcycling to produce high-grade steel from scrap steel.

This will require collaboration from those who use steel as well as those who produce it. Steel products could be made more resource efficient if they are designed to last longer or to be lightweight. Once steel products reach the end of their life, upcycling could be achieved through advanced sorting and shredding to remove impurities from scrap steel. As a society, Japan may also have to become less steel-dependent and shift to a model of 'service use' rather than ownership of products. Unlike today, when steel is abundant and cheap, a net-zero future will require us to use scarcer, more expensive steel resources with greater efficiency. 

Read more at Science Daily

Jan 22, 2023

The mechanism of cosmic magnetic fields explored in the laboratory

Recent research shows that magnetic fields can spontaneously emerge in a plasma if the plasma has a temperature anisotropy. This mechanism is known as the Weibel instability. This new research is the first to unambiguously observe the Weibel instability in the laboratory. It offers a possible solution to the problem of the origin of the microgauss-level magnetic fields that permeate the galaxies.

Plasma is matter so hot that the electrons are separated from atoms: the electrons float freely and the atoms become ions. This ionized gas -- plasma -- makes up nearly all of the visible universe. Magnetic fields can spontaneously emerge in such a plasma if it has a temperature anisotropy, that is, a temperature that differs along different spatial directions. This mechanism, the Weibel instability, was predicted by plasma theorist Eric Weibel more than six decades ago but has only now been unambiguously observed in the laboratory. The new research finds that the process can convert a significant fraction of the energy stored in the temperature anisotropy into magnetic field energy, and that the Weibel instability could be a source of the magnetic fields that permeate the cosmos.
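For orientation, the instability is usually parameterized by the temperature anisotropy. A textbook-level scaling, offered here with the caveat that the exact prefactor depends on the velocity distribution, is

$$ A = \frac{T_\perp}{T_\parallel} - 1, \qquad \gamma_{\max} \sim \omega_{pe} \, \frac{v_{th}}{c} \, \sqrt{A}, \qquad A > 0, $$

where $\omega_{pe}$ is the electron plasma frequency and $v_{th}$ the electron thermal speed; the fastest-growing magnetic fluctuations have wavevectors along the colder direction.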

The Impact

The matter in our observable universe is mostly in the plasma state, and it is magnetized. Magnetic fields at the microgauss level (about a millionth of the strength of Earth's magnetic field) permeate the galaxies. These fields are thought to be amplified from weak seed fields by the spiral motion of the galaxies, known as the galactic dynamo. How the seed magnetic fields are created is a longstanding question in astrophysics. This new work offers a possible solution to this vexing problem of the origin of the microgauss-level seed magnetic fields. The research used a novel platform that has great potential for studying the ultrafast dynamics of magnetic fields in laboratory plasmas relevant to astrophysics and high-energy-density physics.

Summary

First theorized six decades ago, the Weibel instability driven by temperature anisotropy is thought to be an important mechanism for self-magnetization of many laboratory and astrophysical plasmas. However, scientists have faced two challenges in unambiguously demonstrating the Weibel instability. First, until recently, researchers were not able to generate a plasma with a known temperature anisotropy as initially envisioned by Weibel. Second, researchers had no suitable technique to measure the complex and rapidly evolving topology of the magnetic fields subsequently generated in the plasma.

This work, enabled by the unique capability of the Accelerator Test Facility, a Department of Energy (DOE) user facility at Brookhaven National Laboratory, employed a novel experimental platform that allowed the researchers to create a hydrogen plasma with a known, highly anisotropic electron velocity distribution on a timescale of tens of trillionths of a second, using an ultrashort but intense carbon dioxide laser pulse. The subsequent thermalization of the plasma occurs via self-organization of plasma currents, which produces magnetic fields driven by the Weibel instability. These fields are large enough to deflect relativistic electrons, revealing an image of the magnetic fields at a certain distance from the plasma. The researchers obtained a movie of the evolution of these fields with exquisite spatiotemporal resolution by probing them with a one-picosecond relativistic electron beam.

Read more at Science Daily

New small laser device can help detect signs of life on other planets

As space missions delve deeper into the outer solar system, the need for more compact, resource-conserving and accurate analytical tools has become increasingly critical -- especially as the hunt for extraterrestrial life and habitable planets or moons continues.

A University of Maryland-led team developed a new instrument specifically tailored to the needs of NASA space missions. Their mini laser-sourced analyzer is significantly smaller and more resource efficient than its predecessors -- all without compromising the quality of its ability to analyze planetary material samples and potential biological activity onsite. The team's paper on this new device was published in the journal Nature Astronomy on January 16, 2023.

Weighing only about 17 pounds, the instrument is a physically scaled-down combination of two important tools for detecting signs of life and identifying compositions of materials: a pulsed ultraviolet laser that removes small amounts of material from a planetary sample and an Orbitrap™ analyzer that delivers high-resolution data about the chemistry of the examined materials.

"The Orbitrap was originally built for commercial use," explained Ricardo Arevalo, lead author of the paper and an associate professor of geology at UMD. "You can find them in the labs of pharmaceutical, medical and proteomic industries. The one in my own lab is just under 400 pounds, so they're quite large, and it took us eight years to make a prototype that could be used efficiently in space -- significantly smaller and less resource-intensive, but still capable of cutting-edge science."

The team's new gadget shrinks down the original Orbitrap while pairing it with laser desorption mass spectrometry (LDMS) -- techniques that have yet to be applied in an extraterrestrial planetary environment. The new device boasts the same benefits as its larger predecessors but is streamlined for space exploration and onsite planetary material analysis, according to Arevalo.

Thanks to its diminutive mass and minimal power requirements, the mini Orbitrap LDMS instrument can be easily stowed away and maintained on space mission payloads. The instrument's analyses of a planetary surface or substance are also far less intrusive and thus much less likely to contaminate or damage a sample than many current methods that attempt to identify unknown compounds.

"The good thing about a laser source is that anything that can be ionized can be analyzed. If we shoot our laser beam at an ice sample, we should be able to characterize the composition of the ice and see biosignatures in it," Arevalo said. "This tool has such a high mass resolution and accuracy that any molecular or chemical structures in a sample become much more identifiable."

The laser component of the mini LDMS Orbitrap also allows researchers access to larger, more complex compounds that are more likely to be associated with biology. Smaller organic compounds like amino acids, for example, are more ambiguous signatures of life forms.

"Amino acids can be produced abiotically, meaning that they're not necessarily proof of life. Meteorites, many of which are chock full of amino acids, can crash onto a planet's surface and deliver abiotic organics to the surface," Arevalo said. "We know now that larger and more complex molecules, like proteins, are more likely to have been created by or associated with living systems. The laser lets us study larger and more complex organics that can reflect higher fidelity biosignatures than smaller, simpler compounds."

For Arevalo and his team, the mini LDMS Orbitrap will offer much-needed insight and flexibility for future ventures into the outer solar system, such as missions focused on life detection objectives (e.g., Enceladus Orbilander) and exploration of the lunar surface (e.g., the NASA Artemis Program). They hope to send their device into space and deploy it on a planetary target of interest within the next few years.

"I view this prototype as a pathfinder for other future LDMS and Orbitrap-based instruments," Arevalo said. "Our mini Orbitrap LDMS instrument has the potential to significantly enhance the way we currently study the geochemistry or astrobiology of a planetary surface."

Read more at Science Daily