Jan 22, 2022

Highly eccentric black hole merger discovered

For the first time, scientists believe they have detected a merger of two black holes with eccentric orbits. According to a paper published in Nature Astronomy by researchers from Rochester Institute of Technology's Center for Computational Relativity and Gravitation and the University of Florida, this finding could help explain why some of the black holes detected by the LIGO Scientific Collaboration and the Virgo Collaboration are much heavier than previously thought possible.

Eccentric orbits are a sign that black holes could be repeatedly gobbling up others during chance encounters in areas densely populated with black holes, such as galactic nuclei. The scientists studied the most massive gravitational-wave binary observed to date, GW190521, to determine whether its black holes were on eccentric orbits before they merged.

"The estimated masses of the black holes are more than 70 times the size of our sun each, placing them well above the estimated maximum mass predicted currently by stellar evolution theory," said Carlos Lousto, a professor in the School of Mathematical Sciences and a member of the CCRG. "This makes an interesting case to study as a second generation binary black hole system and opens up to new possibilities of formation scenarios of black holes in dense star clusters."

A team of RIT researchers including Lousto, Research Associate James Healy, Jacob Lange '20 Ph.D. (astrophysical sciences and technology), Professor and CCRG Director Manuela Campanelli, Associate Professor Richard O'Shaughnessy, and collaborators from the University of Florida formed to take a fresh look at the data to see if the black holes had highly eccentric orbits before they merged. They found the merger is best explained by a high-eccentricity, precessing model. To achieve this, the team performed hundreds of new full numerical simulations on local and national laboratory supercomputers, taking nearly a year to complete.

"This represents a major advancement in our understanding of how black holes merge," said Campanelli. "Through our sophisticated supercomputer simulations and the wealth of new data provided by LIGO and Virgo's rapidly advancing detectors, we are making new discoveries about the universe at astonishing rates."

Read more at Science Daily

Novel nanoantibiotics kill bacteria without harming healthy cells

The Centers for Disease Control and Prevention estimates that more than 2.8 million Americans experience antibiotic-resistant infections each year; more than 35,000 die from those infections.

To address this critical and worldwide public health issue, a team of researchers led by Hongjun (Henry) Liang, Ph.D., from the Texas Tech University Health Sciences Center (TTUHSC) Department of Cell Physiology and Molecular Biophysics, recently investigated whether a series of novel nanoparticles can kill some of the pathogens that lead to human infection without affecting healthy cells.

The study, "Hydrophilic Nanoparticles that Kill Bacteria while Sparing Mammalian Cells Reveal the Antibiotic Role of Nanostructures," was published Jan. 11 by Nature Communications. Other study members of the Liang team, all from TTUHSC, included Yunjiang Jiang, Ph.D., Wan Zheng, Ph.D., Keith Tran, Elizabeth Kamilar, Jitender Bariwal, Ph.D., and Hairong Ma, Ph.D.

Past research has shown that hydrophobicity (a molecule's ability to repel water) and hydrophilicity (a molecule's ability to attract and dissolve in water) affect cells; the more hydrophobic a substance is, the more adverse the reaction it will cause. However, Liang said, there is no quantitative standard for how much hydrophobicity is acceptable.

"Basically, you can kill bacteria when you increase hydrophobicity," Liang said. "But it will also kill healthy cells, and we don't want that."

For their study, the Liang team used novel hydrophilic nanoparticles known as nanoantibiotics that were developed by Liang's laboratory. Structurally speaking, these novel nanoantibiotics resemble tiny hairy spheres, each composed of many hydrophilic polymer brushes grafted onto silica nanoparticles of different sizes.

These synthetic compounds, which Liang's lab produces, are designed to kill bacteria via membrane disruptions like antimicrobial peptides do, but through a different mode of membrane remodeling that damages bacterial membranes and not mammalian cells. Antimicrobial peptides are a diverse class of amphipathic molecules (partially hydrophilic-partially hydrophobic), which occur naturally and serve as the first line of defense for all multicellular organisms. The direct use of antimicrobial peptides as antibiotics is limited by their stability and toxicity.

There have been other studies in which researchers grafted amphipathic molecules onto nanoparticles, and they too kill bacteria. However, Liang said the primary issue in using amphipathic molecules is that it becomes very difficult to strike the right balance between their hydrophobicity and hydrophilicity so that the toxicity of these molecules to our own cells is significantly reduced.

"In our case, we remove that uncertainty from the equation because we started with a hydrophilic polymer," Liang pointed out. "The cytotoxicity of hydrophobic moieties is not a concern anymore. Those hydrophilic polymers by themselves, or the silica nanoparticles alone don't kill bacteria; they have to be grafted onto the nanostructure to be able to kill bacteria. And so, this is the first important discovery."

The Liang team also discovered that the degree of antibiotic activity is affected by the size of the hairy spheres, which according to Liang is the second important discovery of this research. Those measuring 50 nanometers and below appear to be much more active than those whose size exceeds 50 nanometers. Liang said those measuring approximately 10 nanometers appear to be the most active. (Using synchrotron small-angle X-ray scattering and other methods, the Liang team was able to work out the molecular mechanism of the size-dependent antibiotic activity.)

These discoveries are important because using nanoantibiotics to kill bacteria evades all known mechanisms of bacterial resistance unless bacteria completely revamp their pathways for making cell membranes, which Liang said is unlikely.

Read more at Science Daily

Jan 21, 2022

Consistent asteroid showers rock previous thinking on Mars craters

New Curtin University research has confirmed the frequency of asteroid collisions that formed impact craters on Mars has been consistent over the past 600 million years.

The study, published in Earth and Planetary Science Letters, analysed the formation of more than 500 large Martian craters using a crater detection algorithm previously developed at Curtin, which automatically counts the visible impact craters from a high-resolution image.

Despite previous studies suggesting spikes in the frequency of asteroid collisions, lead researcher Dr Anthony Lagain, from Curtin's School of Earth and Planetary Sciences, said his research had found they did not vary much at all for many millions of years.

Dr Lagain said counting impact craters on a planetary surface was the only way to accurately date geological events, such as canyons, rivers and volcanoes, and to predict when, and how big, future collisions would be.
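
The dating step Dr Lagain refers to works by tallying how many craters above a reference size sit on a given area and comparing that density with a published crater-chronology model. The press release does not describe the Curtin pipeline itself, so the Python snippet below is only a minimal, hypothetical sketch of that bookkeeping; the diameters, counting area, and reference sizes are invented for illustration.

```python
# Minimal sketch (not the Curtin team's code): turn a list of detected crater
# diameters into a cumulative size-frequency distribution, the quantity that
# is compared against a published crater-chronology model to estimate an age.
# All values below are invented for illustration.
import numpy as np

crater_diameters_km = np.array([1.2, 1.5, 2.3, 3.1, 4.8, 5.5, 7.9, 12.0])  # hypothetical detections
mapped_area_km2 = 1.0e5                                                    # hypothetical counting area

# Cumulative density N(>= D): craters per km^2 at or above each reference diameter
reference_diams_km = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
cumulative_density = np.array(
    [np.sum(crater_diameters_km >= d) / mapped_area_km2 for d in reference_diams_km]
)

for d, n in zip(reference_diams_km, cumulative_density):
    # N(>= 1 km) is the figure most often quoted when dating a surface
    print(f"N(>= {d:4.1f} km) = {n:.2e} craters per km^2")
```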

"On Earth, the erosion of plate tectonics erases the history of our planet. Studying planetary bodies of our Solar System that still conserve their early geological history, such as Mars, helps us to understand the evolution of our planet," Dr Lagain said.

"The crater detection algorithm provides us with a thorough understanding of the formation of impact craters including their size and quantity, and the timing and frequency of the asteroid collisions that made them."

Past studies had suggested that there was a spike in the timing and frequency of asteroid collisions due to the production of debris, Dr Lagain said.

"When big bodies smash into each other, they break into pieces or debris, which is thought to have an effect on the creation of impact craters," Dr Lagain said.

"Our study shows it is unlikely that debris resulted in any changes to the formation of impact craters on planetary surfaces."

Co-author and leader of the team that created the algorithm, Professor Gretchen Benedix, said the algorithm could also be adapted to work on other planetary surfaces, including the Moon.

"The formation of thousands of lunar craters can now be dated automatically, and their formation frequency analysed at a higher resolution to investigate their evolution," Professor Benedix said.

Read more at Science Daily

Using ice to boil water: Researcher makes heat transfer discovery that expands on 18th century principle

Associate Professor Jonathan Boreyko and graduate fellow Mojtaba Edalatpour have made a discovery about the properties of water that could provide an exciting addendum to a phenomenon established over two centuries ago. The discovery also holds interesting possibilities for cooling devices and processes in industrial applications using only the basic properties of water. Their work was published on Jan. 21 in the journal Physical Review Fluids.

Water can exist in three phases: a frozen solid, a liquid, and a gas. When heat is applied to a frozen solid, it becomes a liquid. When applied to the liquid, it becomes vapor. This elementary principle is familiar to anyone who has observed a glass of iced tea on a hot day, or boiled a pot of water to make spaghetti.

When the heat source is hot enough, the water's behavior changes dramatically. According to Boreyko, a water droplet deposited onto an aluminum plate heated to 150 degrees Celsius (302 degrees Fahrenheit) or above will no longer boil. Instead, the vapor that forms when the droplet approaches the surface will become trapped beneath the droplet, creating a cushion that prevents the liquid from making direct contact with the surface. The trapped vapor causes the liquid to levitate, sliding around the heated surface like an air hockey puck. This phenomenon is known as the Leidenfrost effect, named for the German doctor and theologian who first described it in a 1751 publication.

This commonly accepted scientific principle applies to water as a liquid, floating on a bed of vapor. Boreyko's team found themselves wondering: Could ice perform in the same way?

"There are so many papers out there about levitating liquid, we wanted to ask the question about levitating ice," said Boreyko. "It started as a curiosity project. What drove our research was the question of whether or not it was possible to have a three-phase Leidenfrost effect with solid, liquid, and vapor."

Going into the ice

Curiosity sparked the first investigation in Boreyko's lab some five years ago in the form of a research project by then-undergraduate student Daniel Cusumano. What he observed was fascinating. Even when the aluminum was heated above 150 C, the ice did not levitate on vapor as liquid does. Cusumano continued raising the temperature, observing the behavior of the ice as the heat increased. What he found was that the threshold for levitation was dramatically higher: 550 C (1022 F) rather than 150 C. Up until that threshold, the meltwater beneath the ice continued to boil in direct contact with the surface, rather than exhibit the Leidenfrost effect.

What was going on underneath the ice that prolonged the boiling? The project was picked back up by graduate student Mojtaba Edalatpour a short time later, to solve the mystery. Edalatpour had been working with Boreyko to develop novel methods of heat transfer and put that knowledge to work in approaching this problem. The answer turned out to be the temperature differential in the meltwater layer beneath the ice. The meltwater layer has two different extremes: Its bottom is boiling, which fixes the temperature at about 100 C, but its top adheres to the remaining ice, which fixes it at about 0 C. Edalatpour's model revealed that maintaining this extreme temperature differential consumes most of the surface's heat, explaining why levitation was more difficult for ice.

Boreyko elaborated. "The temperature differential the ice is uniquely creating across the water layer has changed what happens in the water itself, because now most of the heat from the hot plate has to go across the water to maintain that extreme differential. So only a tiny fraction of the energy can be used to produce vapor anymore."
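
As a rough illustration of why holding that gradient is so expensive, consider simple conduction across the meltwater layer. This back-of-the-envelope estimate is not from the paper; the millimetre-scale layer thickness is an assumed, illustrative value, and the thermal conductivity is the textbook figure for liquid water.

```latex
% Illustrative Fourier's-law estimate; the layer thickness d is assumed, not measured.
q = k\,\frac{\Delta T}{d}
  \approx 0.6\ \mathrm{W\,m^{-1}\,K^{-1}} \times \frac{100\ \mathrm{K}}{10^{-3}\ \mathrm{m}}
  \approx 6\times 10^{4}\ \mathrm{W\,m^{-2}}
```

Tens of kilowatts per square metre would be conducted across such a layer just to hold the 0 C to 100 C differential, leaving only a small fraction of the plate's heat to generate the vapor needed for levitation.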

The elevated temperature of 550 degrees Celsius for the icy Leidenfrost effect is practically important. Boiling water transports heat away from the substrate extremely efficiently, which is why you feel ample heat rising from a pot of water that is boiling, but not from a pot of water that is merely hot. This means that the difficulty in levitating ice is actually a good thing, as the larger temperature window for boiling will result in better heat transfer compared to using a liquid alone.

"It is much harder to levitate the ice than it was to levitate the water droplet," said Boreyko. "Heat transfer plummets as soon as levitation begins, because when liquid levitates, it doesn't boil anymore. It's floating over the surface rather than touching, and touching is what causes it to boil the heat away. So, for heat transfer, levitation is terrible. Boiling is incredible."

Using ice for heat transfer

As the team explored possibilities for practical application, they looked to their existing work. Since Edalatpour had extensive research experience in heat transfer, that topic became a logical fit.

Heat transfer comes most into play for cooling off things like computer servers or car engines. It requires a substance or mechanism that can move energy away from a hot surface, redistributing heat quickly to reduce the wear and tear on metal parts. In nuclear power plants, the application of ice to induce rapid cooling could become an easily-deployed emergency measure if power fails, or a regular practice for servicing power plant parts.

There are also potential applications for metallurgy. To produce alloys, it is necessary to quench the heat from newly shaped metals within a narrow window of time, making the metal stronger and less brittle. If ice were applied, it would allow heat to be offloaded rapidly through the three water phases, quickly cooling the metal.

Boreyko also foresees a potential for applications in firefighting.

"You could imagine having a specially made hose that is spraying ice chips as opposed to a jet of water," he said. "This is not science fiction. I visited an aerospace company that has an icing tunnel and they already have this technology where a nozzle sprays out ice particles as opposed to water droplets."

Read more at Science Daily

Muscular study provides new information about how the largest dinosaurs moved and evolved

New research led by the University of Bristol has revealed how giant 50-tonne sauropod dinosaurs, like Diplodocus, evolved from much smaller ancestors, like the wolf-sized Thecodontosaurus.

In a new study published today in the journal Royal Society Open Science, researchers present a reconstruction of the limb muscles of Thecodontosaurus, detailing the anatomy of the most important muscles involved in movement.

Thecodontosaurus was a small to medium sized two-legged dinosaur that roamed around what today is the United Kingdom during the Triassic period (around 205 million years ago).

This dinosaur was one of the first ever to be discovered and named by scientists, in 1836, yet it continues to provide new information about how the earliest dinosaurs lived and evolved.

Antonio Ballell, PhD student in Bristol's School of Earth Sciences and lead author of the study, said: "The University of Bristol houses a huge collection of beautifully preserved Thecodontosaurus fossils that were discovered around Bristol. The amazing thing about these fossilised bones is that many preserve the scars and rugosities that the limb musculature left on them with its attachment."

These features are extremely valuable in scientific terms to infer the shape and direction of the limb muscles. Reconstructing muscles in extinct species requires this kind of exceptional preservation of fossils, but also a good understanding of the muscle anatomy of living, closely related species.

Antonio Ballell added: "In the case of dinosaurs, we have to look at modern crocodilians and birds, that form a group that we call archosaurs, meaning 'ruling reptiles'. Dinosaurs are extinct members of this lineage, and due to evolutionary resemblance, we can compare the muscle anatomy in crocodiles and birds and study the scars that they leave on bones to identify and reconstruct the position of those muscles in dinosaurs."

Professor Emily Rayfield, co-author of the study, said: "These kinds of muscular reconstructions are fundamental to understand functional aspects of the life of extinct organisms. We can use this information to simulate how these animals walked and ran with computational tools."

From the size and orientation of its limb muscles, the authors argue that Thecodontosaurus was quite agile and probably used its forelimbs to grasp objects rather than for walking.

This contrasts with its later relatives, the giant sauropods, which partly achieved these huge body sizes by shifting to a quadrupedal posture. The muscular anatomy of Thecodontosaurus seems to indicate that key features of later sauropod-line dinosaurs had already evolved in this early species.

Professor Mike Benton, another co-author, said: "From an evolutionary perspective, our study adds more pieces to the puzzle of how the locomotion and posture changed during the evolution of dinosaurs and in the line to the giant sauropods.

"How were limb muscles modified in the evolution of multi-ton quadrupeds from tiny bipeds? Reconstructing the limb muscles of Thecodontosaurus gives us new information of the early stages of that important evolutionary transition."

Read more at Science Daily

Scientists find previously unknown jumping behavior in insects

A team of researchers has discovered a jumping behavior that is entirely new to insect larvae, and there is evidence that it is occurring in a range of species -- we just haven't noticed it before.

The previously unrecorded behavior occurs in the larvae of a species of lined flat bark beetle (Laemophloeus biguttatus). Specifically, the larvae are able to spring into the air, with each larva curling itself into a loop as it leaps forward. What makes these leaps unique is how the larvae are able to pull it off.

"Jumping at all is exceedingly rare in the larvae of beetle species, and the mechanism they use to execute their leaps is -- as far as we can tell -- previously unrecorded in any insect larvae," says Matt Bertone, corresponding author of a paper on the discovery and director of North Carolina State University's Plant Disease and Insect Clinic.

While there are other insect species that are capable of making prodigious leaps, they rely on something called a "latch-mediated spring actuation mechanism." This means that they essentially have two parts of their body latch onto each other while the insect exerts force, building up a significant amount of energy. The insect then unlatches the two parts, releasing all of that energy at once, allowing it to spring off the ground.

"What makes the L. biguttatus so remarkable is that it makes these leaps without latching two parts of its body together," Bertone says. "Instead, it uses claws on its legs to grip the ground while it builds up that potential energy -- and once those claws release their hold on the ground, that potential energy is converted into kinetic energy, launching it skyward."

The discovery of the behavior was somewhat serendipitous. Bertone had collected a variety of insect samples from a rotting tree near his lab in order to photograph them when he noticed that these beetle larvae appeared to be hopping.

Bertone and paper co-author Adrian Smith then decided to film the behavior in order to get a better look at what was going on. That's when they began to understand just how peculiar the behavior was. Smith is a research assistant professor of biological sciences at NC State and head of the Evolutionary Biology & Behavior Research Lab at the North Carolina Museum of Natural Sciences.

"The way these larvae were jumping was impressive at first, but we didn't immediately understand how unique it was," Bertone says. "We then shared it with a number of beetle experts around the country, and none of them had seen the jumping behavior before. That's when we realized we needed to take a closer look at just how the larvae was doing what it was doing."

To determine how L. biguttatus was able to execute its acrobatics, the researchers filmed the jumps at speeds of up to 60,000 frames per second. This allowed them to capture all of the external movements associated with the jumps, and suggested that the legs were essentially creating a latching mechanism with the ground.

The researchers also conducted a muscle mass assessment to determine whether it was possible for the larvae to make their leaps using just their muscles, as opposed to using a latch mechanism to store energy. They found that the larvae lacked sufficient muscle to hurl themselves into the air as far or as fast as they had been filmed jumping. Ergo, latching onto the ground was the only way the larvae could pull off their aerial feats.
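
The muscle-mass argument comes down to a power budget: the kinetic energy of the filmed jump has to be delivered during a very brief push-off, and direct muscle contraction can only supply so many watts per kilogram. The sketch below is not the authors' analysis; every number in it (take-off speed, push-off time, muscle fraction, muscle power ceiling) is a hypothetical placeholder chosen only to show the shape of the comparison.

```python
# Hypothetical power-budget sketch (not the published analysis). All numbers
# are placeholders chosen only to illustrate the comparison.
takeoff_speed_m_s = 0.9              # assumed take-off speed
pushoff_time_s = 1.0e-3              # assumed duration of the push against the ground
muscle_fraction = 0.2                # assumed fraction of body mass that is muscle
muscle_power_ceiling_w_per_kg = 300  # rough ceiling for direct muscle power output

# Kinetic energy per kilogram of body mass at take-off: 0.5 * v^2
specific_energy_j_per_kg_body = 0.5 * takeoff_speed_m_s ** 2

# Power demanded per kilogram of muscle if that energy is delivered during push-off
required_w_per_kg_muscle = specific_energy_j_per_kg_body / (pushoff_time_s * muscle_fraction)

print(f"required ~{required_w_per_kg_muscle:.0f} W/kg of muscle "
      f"vs ~{muscle_power_ceiling_w_per_kg} W/kg available")
# With these placeholders the requirement is roughly 2,000 W/kg, far beyond the
# muscle ceiling -- the kind of mismatch that points to elastic energy stored
# while the claws grip the ground and released all at once.
```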

Meanwhile, in an unrelated video about jumping maggots, Smith had included a short clip of the jumping behavior in L. biguttatus. That video was seen by a researcher in Japan named Takahiro Yoshida, who had witnessed similar jumps in the larvae of another beetle species called Placonotus testaceus, but had not published anything related to the behavior.

"We don't have high-speed footage of P. testaceus, but the video evidence we do have from Yoshida's lab suggests that this previously unknown behavior is found in two different genera which are not even closely related," Bertone says.

Read more at Science Daily

Jan 20, 2022

Hubble finds a black hole igniting star formation in a dwarf galaxy

Often portrayed as destructive monsters that hold light captive, black holes take on a less villainous role in the latest research from NASA's Hubble Space Telescope. A black hole at the heart of the dwarf galaxy Henize 2-10 is creating stars rather than gobbling them up. The black hole is apparently contributing to the firestorm of new star formation taking place in the galaxy. The dwarf galaxy lies 30 million light-years away, in the southern constellation Pyxis.

A decade ago this small galaxy set off debate among astronomers as to whether dwarf galaxies were home to black holes proportional to the supermassive behemoths found in the hearts of larger galaxies. This new discovery has little Henize 2-10, containing only one-tenth the number of stars found in our Milky Way, poised to play a big part in solving the mystery of where supermassive black holes came from in the first place.

The Hubble Space Telescope is a project of international cooperation between NASA and ESA (European Space Agency). NASA's Goddard Space Flight Center in Greenbelt, Maryland, manages the telescope. The Space Telescope Science Institute (STScI) in Baltimore, Maryland, conducts Hubble science operations. STScI is operated for NASA by the Association of Universities for Research in Astronomy in Washington, D.C.

"Ten years ago, as a graduate student thinking I would spend my career on star formation, I looked at the data from Henize 2-10 and everything changed," said Amy Reines, who published the first evidence for a black hole in the galaxy in 2011 and is the principal investigator on the new Hubble observations, published in the January 19 issue of Nature.

"From the beginning I knew something unusual and special was happening in Henize 2-10, and now Hubble has provided a very clear picture of the connection between the black hole and a neighboring star forming region located 230 light-years from the black hole," Reines said.

That connection is an outflow of gas stretching across space like an umbilical cord to a bright stellar nursery. The region was already home to a dense cocoon of gas when the low-velocity outflow arrived. Hubble spectroscopy shows the outflow was moving about 1 million miles per hour, slamming into the dense gas like a garden hose hitting a pile of dirt and spreading out. Newborn star clusters dot the path of the outflow's spread, their ages also calculated by Hubble.

This is the opposite effect of what's seen in larger galaxies, where material falling toward the black hole is whisked away by surrounding magnetic fields, forming blazing jets of plasma moving at close to the speed of light. Gas clouds caught in the jets' path would be heated far beyond their ability to cool back down and form stars. But with the less-massive black hole in Henize 2-10, and its gentler outflow, gas was compressed just enough to precipitate new star formation.

"At only 30 million light-years away, Henize 2-10 is close enough that Hubble was able to capture both images and spectroscopic evidence of a black hole outflow very clearly. The additional surprise was that, rather than suppressing star formation, the outflow was triggering the birth of new stars," said Zachary Schutte, Reines' graduate student and lead author of the new study.

Ever since her first discovery of distinctive radio and X-ray emissions in Henize 2-10, Reines has thought they likely came from a massive black hole, but not as supermassive as those seen in larger galaxies. Other astronomers, however, thought that the radiation was more likely being emitted by a supernova remnant, which would be a familiar occurrence in a galaxy that is rapidly pumping out massive stars that quickly explode.

"Hubble's amazing resolution clearly shows a corkscrew-like pattern in the velocities of the gas, which we can fit to the model of a precessing, or wobbling, outflow from a black hole. A supernova remnant would not have that pattern, and so it is effectively our smoking-gun proof that this is a black hole," Reines said.

Reines expects that even more research will be directed at dwarf galaxy black holes in the future, with the aim of using them as clues to the mystery of how supermassive black holes came to be in the early universe. It's a persistent puzzle for astronomers. The relationship between the mass of the galaxy and its black hole can provide clues. The black hole in Henize 2-10 is around 1 million solar masses. In larger galaxies, black holes can be more than 1 billion times our Sun's mass. The more massive the host galaxy, the more massive the central black hole.

Current theories on the origin of supermassive black holes break down into three categories: 1) they formed just like smaller stellar-mass black holes, from the implosion of stars, and somehow gathered enough material to grow supermassive, 2) special conditions in the early universe allowed for the formation of supermassive stars, which collapsed to form massive black hole "seeds" right off the bat, or 3) the seeds of future supermassive black holes were born in dense star clusters, where the cluster's overall mass would have been enough to somehow create them from gravitational collapse.

So far, none of these black hole seeding theories has taken the lead. Dwarf galaxies like Henize 2-10 offer promising potential clues, because they have remained small over cosmic time, rather than undergoing the growth and mergers of large galaxies like the Milky Way. Astronomers think that dwarf galaxy black holes could serve as an analog for black holes in the early universe, when they were just beginning to form and grow.

Read more at Science Daily

Haunted-house experience scares up interesting insights on the body’s reaction to threats

The so-called fight-or-flight response is evolution's way of preparing the body to defend itself or flee from a real or perceived threat, like a lion in the tall grass or -- in modern times -- an overdue performance review.

Scientists have struggled to study the effects of genuine threats on people's mental and physical state because of ethical and practical constraints of human lab experiments.

In new research published in the journal Psychological Science, researchers used a haunted-house experience to study participants' subjective and physiological responses to perceived threats in a safe yet immersive environment.

In this haunted-house setting, which included 17 rooms with various threats that formed an uninterrupted experience, the researchers examined how the body responds to threats differently depending on the social context (whether friends were around), features of the threats (whether they were expected), and emotions (whether individuals felt afraid).

"There are a lot of factors that influence how human bodies respond to threat," said Sarah M. Tashjian, of the Division of Humanities and Social Sciences at the California Institute of Technology and lead author of the study. "We found that friend-related emotional contagion, threat predictability, and subjective feelings of fear were all relevant for the body mounting a response."

All of these factors help increase a person's ability to survive when under threat, but in the study, each had slightly different influences, which demonstrate the dynamic nature of the sympathetic nervous system.

To study the effects of frightening experiences, previous studies used scary images, mild electric shocks, or loud noises. In the current study, 156 participants went through the haunted house in small groups. During the 30-minute experience, they encountered situations that mimicked the threat of suffocation, an oncoming speeding car, and a volley of shots (with pellets) from a firing squad.

Participants wore real-time physiological-monitoring wristbands to measure their electrodermal activity, or sweat-induced changes in the skin's electrical characteristics, including skin conductance level and skin conductance response.

Before visiting the haunted house, participants rated their expected fear on a scale from 1 to 10. Afterward, they rated their experienced fear level on the same scale. From these data, four factors were examined: group composition, threat imminence, intrapersonal factors of fear, and a "baseline orienting response," or the participant's sensitivity to threats.

Results showed a positive association between the number of friends in a group and tonic arousal, which reflects the body's overall physical response to stress or emotion. On average, the more friends that participants had with them while touring the haunted house, the higher their physical response.

"We interpreted this to reflect fear contagion -- if your friends are around, your body picks up on their signals and has a higher level of arousal even in the absence of specific scares or startles," Tashjian said. "In the lab, it is difficult to study the effects of groups on physiology."

Studies usually involve testing one person at a time or, at most, pairs of friends. In this study, the researchers had the unique opportunity to study how being in groups with different mixes of friends and strangers affected people's perceptions of threat.

The researchers also noted positive associations between unexpected attacks, subjective fear, and phasic frequency. Phasic effects are rapid changes the body experiences as it responds to an event. Individuals who felt the most afraid during the haunted house had more peaks in these responses. "If your body is more cued-in to the threatening event, you also psychologically feel more fear," Tashjian said.

Other findings revealed that participants with an initially strong response to the first room of the haunted house showed increased responses as they visited other rooms. Participants with more frequent responses in the first room showed decreased responses over time.

"From a results perspective, this study is distinct because we measure multiple aspects of skin conductance, including slow responding, rapid responding, frequency of responses, and level of responses," Tashjian explained. "Most studies use just one of these measures, which limits our understanding of how dynamic the sympathetic nervous system is and how different factors exert different influences on biology."

She added that the research is a "major advance for cognitive and social psychology," because it furthers the understanding of how "naturalistic contexts," such as the immersive haunted house experience, influence the body's response to threats. Also significant is the finding that friends amplify the physical response.

Read more at Science Daily

COVID-19 vaccines do not cause infertility, study finds

COVID-19 vaccination in either partner does not appear to affect fertility, according to new research led by Boston University School of Public Health (BUSPH) investigators.

Published in the American Journal of Epidemiology, the prospective study of couples trying to conceive found no association between COVID-19 vaccination and fecundability -- the probability of conception per menstrual cycle -- in female or male partners who received the Pfizer-BioNTech, Moderna, or Johnson & Johnson vaccines.

In contrast, the findings indicate that COVID-19 infection among males may temporarily reduce fertility -- an outcome that could be avoidable through vaccination.

"Many reproductive-aged individuals have cited concerns about fertility as a reason for remaining unvaccinated," says study lead author Dr. Amelia Wesselink, research assistant professor of epidemiology at BUSPH. "Our study shows for the first time that COVID-19 vaccination in either partner is unrelated to fertility among couples trying to conceive through intercourse. Time-to-pregnancy was very similar regardless of vaccination status."

Wesselink and colleagues analyzed survey data on COVID-19 vaccination and infection, and fecundability, among female and male participants in the BUSPH-based Pregnancy Study Online (PRESTO), an ongoing NIH-funded study that enrolls women trying to conceive, and follows them from preconception through six months after delivery. Participants included 2,126 women in the US and Canada who provided information on sociodemographics, lifestyle, medical factors, and characteristics of their partners from December 2020 to September 2021, and the participants were followed in the study through November 2021.

The researchers calculated the per menstrual cycle probability of conception using self-reported dates of participants' last menstrual period, typical menstrual cycle length, and pregnancy status. Fertility rates among female participants who received at least one dose of a vaccine were nearly identical to unvaccinated female participants. Fecundability was also similar for male partners who had received at least one dose of a COVID-19 vaccine compared with unvaccinated male participants. Additional analyses that considered the number of vaccine doses, brand of vaccine, infertility history, occupation, and geographic region also indicated no effect of vaccination on fertility.
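
At its core, a fecundability analysis is per-cycle bookkeeping: each menstrual cycle a couple spends trying is one trial, and conception in that cycle is the event, so per-cycle probabilities can be compared across exposure groups. The published analysis used regression models adjusting for the covariates described above; the toy calculation below is only an unadjusted sketch with invented counts.

```python
# Toy, unadjusted sketch of a fecundability comparison (not the study's
# adjusted regression analysis). Cycle and conception counts are invented.
def per_cycle_probability(conceptions: int, cycles_at_risk: int) -> float:
    """Crude per-cycle probability of conception for one exposure group."""
    return conceptions / cycles_at_risk

vaccinated = per_cycle_probability(conceptions=180, cycles_at_risk=1000)
unvaccinated = per_cycle_probability(conceptions=175, cycles_at_risk=1000)

# A fecundability ratio near 1.0 means no detectable difference between groups
fecundability_ratio = vaccinated / unvaccinated
print(f"fecundability ratio: {fecundability_ratio:.2f}")
```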

While COVID-19 infection was not strongly associated with fertility, men who tested positive for COVID within 60 days of a given cycle had reduced fertility compared to men who never tested positive, or men who tested positive at least 60 days prior. These data support previous research that has linked COVID-19 infection in men with poor sperm quality and other reproductive dysfunction.

"These data provide reassuring evidence that COVID vaccination in either partner does not affect fertility among couples trying to conceive," says study senior author Dr. Lauren Wise, professor of epidemiology at BUSPH. "The prospective study design, large sample size, and geographically heterogeneous study population are study strengths, as was our control for many variables such as age, socioeconomic status, preexisting health conditions, occupation, and stress levels."

Read more at Science Daily

Babies can tell who has close relationships based on one clue: Saliva

Learning to navigate social relationships is a skill that is critical for surviving in human societies. For babies and young children, that means learning who they can count on to take care of them.

MIT neuroscientists have now identified a specific signal that young children and even babies use to determine whether two people have a strong relationship and a mutual obligation to help each other: whether those two people kiss, share food, or have other interactions that involve sharing saliva.

In a new study, the researchers showed that babies expect people who share saliva to come to one another's aid when one person is in distress, much more so than when people share toys or interact in other ways that do not involve saliva exchange. The findings suggest that babies can use these cues to try to figure out who around them is most likely to offer help, the researchers say.

"Babies don't know in advance which relationships are the close and morally obligating ones, so they have to have some way of learning this by looking at what happens around them," says Rebecca Saxe, the John W. Jarve Professor of Brain and Cognitive Sciences, a member of MIT's McGovern Institute for Brain Research, and the senior author of the new study.

MIT postdoc Ashley Thomas is the lead author of the study, which appears today in Science. Brandon Woo, a Harvard University graduate student; Daniel Nettle, a professor of behavioral science at Newcastle University; and Elizabeth Spelke, a professor of psychology at Harvard, are also authors of the paper.

Sharing saliva

In human societies, people typically distinguish between "thick" and "thin" relationships. Thick relationships, usually found between family members, feature strong levels of attachment, obligation, and mutual responsiveness. Anthropologists have also observed that people in thick relationships are more willing to share bodily fluids such as saliva.

"That inspired both the question of whether infants distinguish between those types of relationships, and whether saliva sharing might be a really good cue they could use to recognize them," Thomas says.

To study those questions, the researchers observed toddlers (16.5 to 18.5 months) and babies (8.5 to 10 months) as they watched interactions between human actors and puppets. In the first set of experiments, a puppet shared an orange with one actor, then tossed a ball back and forth with a different actor.

After the children watched these initial interactions, the researchers observed the children's reactions when the puppet showed distress while sitting between the two actors. Based on an earlier study of nonhuman primates, the researchers hypothesized that babies would look first at the person whom they expected to help. That study showed that when baby monkeys cry, other members of the troop look to the baby's parents, as if expecting them to step in.

The MIT team found that the children were more likely to look toward the actor who had shared food with the puppet, not the one who had shared a toy, when the puppet was in distress.

In a second set of experiments, designed to focus more specifically on saliva, the actor either placed her finger in her mouth and then into the mouth of the puppet, or placed her finger on her forehead and then onto the forehead of the puppet. Later, when the actor expressed distress while standing between the two puppets, children watching the video were more likely to look toward the puppet with whom she had shared saliva.

Social cues

The findings suggest that saliva sharing is likely an important cue that helps infants to learn about their own social relationships and those of people around them, the researchers say.

"The general skill of learning about social relationships is very useful," Thomas says. "One reason why this distinction between thick and thin might be important for infants in particular, especially human infants, who depend on adults for longer than many other species, is that it might be a good way to figure out who else can provide the support that they depend on to survive."

The researchers did their first set of studies shortly before Covid-19 lockdowns began, with babies who came to the lab with their families. Later experiments were done over Zoom. The results that the researchers saw were similar before and after the pandemic, confirming that pandemic-related hygiene concerns did not affect the outcome.

"We actually know the results would have been similar if it hadn't been for the pandemic," Saxe says. "You might wonder, did kids start to think very differently about sharing saliva when suddenly everybody was talking about hygiene all the time? So, for that question, it's very useful that we had an initial data set collected before the pandemic."

Doing the second set of studies on Zoom also allowed the researchers to recruit a much more diverse group of children because the subjects were not limited to families who could come to the lab in Cambridge during normal working hours.

In future work, the researchers hope to perform similar studies with infants in cultures that have different types of family structures. In adult subjects, they plan to use functional magnetic resonance imaging (fMRI) to study what parts of the brain are involved in making saliva-based assessments about social relationships.

Read more at Science Daily

Jan 18, 2022

Being in space destroys more red blood cells

A world-first study has revealed how space travel can cause lower red blood cell counts, known as space anemia. Analysis of 14 astronauts showed their bodies destroyed 54 percent more red blood cells in space than they normally would on Earth, according to a study published in Nature Medicine.

"Space anemia has consistently been reported when astronauts returned to Earth since the first space missions, but we didn't know why," said lead author Dr. Guy Trudel, a rehabilitation physician and researcher at The Ottawa Hospital and professor at the University of Ottawa. "Our study shows that upon arriving in space, more red blood cells are destroyed, and this continues for the entire duration of the astronaut's mission."

Before this study, space anemia was thought to be a quick adaptation to fluids shifting into the astronaut's upper body when they first arrived in space. Astronauts lose 10 percent of the liquid in their blood vessels this way. It was thought astronauts rapidly destroyed 10 percent of their red blood cells to restore the balance, and that red blood cell control was back to normal after 10 days in space.

Instead, Dr. Trudel's team found that the red blood cell destruction was a primary effect of being in space, not just caused by fluid shifts. They demonstrated this by directly measuring red blood cell destruction in 14 astronauts during their six-month space missions.

On Earth, our bodies create and destroy 2 million red blood cells every second. The researchers found that astronauts were destroying 54 percent more red blood cells during the six months they were in space, or 3 million every second. These results were the same for both female and male astronauts.

Dr. Trudel's team made this discovery thanks to techniques and methods they developed to accurately measure red blood cell destruction. These methods were then adapted to collect samples aboard the International Space Station. At Dr. Trudel's lab at the University of Ottawa, they were able to precisely measure the tiny amounts of carbon monoxide in the breath samples from astronauts. One molecule of carbon monoxide is produced every time one molecule of heme, the deep-red pigment in red blood cells, is destroyed.
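
Because one carbon monoxide molecule is released for every heme molecule broken down, an exhaled-CO measurement can be converted into a red blood cell destruction rate with simple stoichiometry. The conversion below is a back-of-the-envelope sketch, not the calibration used in the study: the CO elimination rate is an invented example value, and the per-cell constants are textbook approximations.

```python
# Back-of-the-envelope conversion from endogenous CO production to red blood
# cell destruction (one CO molecule per heme destroyed). The CO rate is an
# invented example; the per-cell constants are textbook approximations.
AVOGADRO = 6.022e23
HEMES_PER_HEMOGLOBIN = 4        # each hemoglobin molecule carries four heme groups
HEMOGLOBIN_PER_RBC = 2.7e8      # approximate hemoglobin molecules per red cell

co_elimination_nmol_per_s = 3.6  # hypothetical measured CO elimination rate

heme_destroyed_per_s = co_elimination_nmol_per_s * 1e-9 * AVOGADRO
rbc_destroyed_per_s = heme_destroyed_per_s / (HEMES_PER_HEMOGLOBIN * HEMOGLOBIN_PER_RBC)

print(f"~{rbc_destroyed_per_s:.1e} red blood cells destroyed per second")
# With these numbers the result is about 2e6 cells per second -- the normal
# Earth-bound rate -- and a 54 percent increase corresponds to roughly 3.1e6,
# matching the "3 million every second" figure quoted for astronauts in flight.
```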

While the team didn't measure red blood cell production directly, they assume the astronauts generated extra red blood cells to compensate for the cells they destroyed. Otherwise, the astronauts would end up with severe anemia, and would have had major health problems in space.

"Thankfully, having fewer red blood cells in space isn't a problem when your body is weightless," said Dr. Trudel. "But when landing on Earth and potentially on other planets or moons, anemia affecting your energy, endurance, and strength can threaten mission objectives. The effects of anemia are only felt once you land, and must deal with gravity again."

In this study, five out of 13 astronauts were clinically anemic when they landed -- one of the 14 astronauts did not have blood drawn on landing. The researchers saw that space-related anemia was reversible, with red blood cell levels progressively returning to normal three to four months after returning to Earth.

Interestingly, the team repeated the same measurements one year after astronauts returned to Earth, and found that red blood cell destruction was still 30 percent above preflight levels. These results suggest that structural changes may have happened to the astronaut while they were in space that changed red blood cell control for up to a year after long-duration space missions.

The discovery that space travel increases red blood cell destruction has several implications. First, it supports screening astronauts or space tourists for existing blood or health conditions that are affected by anemia. Second, a recent study by Dr. Trudel's team found that the longer the space mission, the worse the anemia, which could impact long missions to the Moon and Mars. Third, increased red blood cell production will require an adapted diet for astronauts. And finally, it's unclear how long the body can maintain this higher rate of destruction and production of red blood cells.

These findings could also be applied to life on Earth. As a rehabilitation physician, most of Dr. Trudel's patients are anemic after being very ill for a long time with limited mobility, and anemia hinders their ability to exercise and recover. Bedrest has been shown to cause anemia, but how it does this is unknown. Dr. Trudel thinks the mechanism may be like space anemia. His team will investigate this hypothesis during future bedrest studies done on Earth.

"If we can find out exactly what's causing this anemia, then there is a potential to treat it or prevent it, both for astronauts and for patients here on Earth," said Dr. Trudel.

Read more at Science Daily

Rivers speeding up Arctic ice melt at alarming rate

Irina Panyushkina grew up in Siberia, near the Arctic Circle. She was raised on stories of explorers trudging through seas of ice to reach the North Pole.

Now, she is a climate scientist and associate research professor of dendrochronology in the University of Arizona Laboratory of Tree-Ring Research. And she is trying to understand how a warming world is transforming the place she once called home.

Someday, the Arctic Ocean may no longer host ice, since the northern regions of the world are warming faster than the rest -- a trend scientists refer to as Arctic amplification. As Arctic ice melts, new opportunities and challenges for humans will arise, researchers say.

Freshwater flowing into the Arctic Ocean from the continent is thought to exacerbate Arctic amplification, but the extent of its impact isn't fully understood. New research led by Panyushkina measures how the flow of the Yenisei River -- the largest freshwater river that flows into the Arctic Ocean -- has changed over the last few hundred years, and describes the impact freshwater has had on the Arctic.

Previous studies have attributed recent changes in wintertime freshwater flow into the Arctic to warming air temperature, seasonal precipitation changes or snowpack. But more recent research, including Panyushkina's study, suggests that the primary driver is actually degradation of permafrost -- or frozen ground -- as well as forest fires across southern Siberia.

Panyushkina's research, funded by the National Science Foundation Polar Office, is published in the journal Environmental Research Letters.

What trees can tell us

Data collected by instruments at the upper reaches of the Yenisei River in Tuva, in southern Siberia, only goes back so far. To overcome this, Panyushkina and her team used tree-ring data to double the number of years' worth of the stream flow data they had, allowing them to look back 300 years.

Stream flow, or the amount of water that moves through a certain area of a river over time, can be inferred by measuring changing tree-ring thickness over the years. Measurements of stream flow over specific seasons can even be teased out of the data.
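
Reconstructions like this generally follow a calibrate-then-extend logic: fit a statistical relationship between ring-width indices and the gauged stream-flow record over the years where both exist, then apply it to the older rings. The snippet below is a deliberately minimal illustration of that logic with invented numbers; it is not the model used in the Environmental Research Letters paper.

```python
# Minimal calibrate-then-extend sketch of a tree-ring stream-flow
# reconstruction (not the published model). All values are invented.
import numpy as np

# Overlap period: years with both ring-width indices and gauged winter flow
ring_width_index = np.array([0.82, 0.95, 1.10, 1.05, 0.88, 1.20, 0.97, 1.15])
gauged_flow_m3_s = np.array([310.0, 355.0, 410.0, 395.0, 330.0, 455.0, 365.0, 430.0])

# Ordinary least-squares fit: flow ~ a * ring_width + b over the overlap years
a, b = np.polyfit(ring_width_index, gauged_flow_m3_s, deg=1)

# Pre-instrumental rings, for which no gauge data exist
old_ring_width_index = np.array([0.90, 1.02, 0.78, 1.08])
reconstructed_flow = a * old_ring_width_index + b

print("reconstructed winter flow (m^3/s):", np.round(reconstructed_flow, 1))
```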

Annual stream flow information is commonly used by water managers to reveal the average changes in stream flow trends. But Panyushkina and her team did something novel when they decided to also investigate winter stream flow specifically.

"We found an unprecedented increase in the winter flow rate over the last 25 years," Panyushkina said. This winter flow rate is nearly 80% above the average seen over approximately 100 years.

"In contrast, annual flow fluctuated normally during the 300-year period, with only a 7% increase over the last 25 years," Panyushkina said.

The winter stream flow data revealed the role of permafrost melt on Arctic ice.

Since ice covers rivers during winter in Siberia, the team's stream flow measurements only captured information about river water that originated underground rather than from the sky. That includes water from thawing permafrost, as well as water from sub-permafrost aquifers, as permafrost loss leads to an increased exchange of water between the river and aquifers. These two sources of groundwater are warm compared to the frigid air above, and when they eventually flow into the Arctic Ocean, they melt the ice.

An uncertain future


Forest fires are also thought to be a driver of Arctic ice melt.

"We know the frequency and intensity of forest fires in Siberia have been increasing," Panyushkina said. "When fires happen in forests with permafrost, there is deep thawing under the fire event, and the affected area often doesn't recover for up to 60 years. When we have large-scale fires and long-burning fires and more frequent fires, we're maybe hitting the critical point when permafrost degradation cannot return to normal. Forest fires are also another process that increases connectivity between aquifers and stream flow."

The combined effects of permafrost degradation and fires are very strong at the Yenisei River basin, with more fresh water and heat flowing into the Arctic Ocean in recent decades, according to the study. In turn, melting sea ice also exacerbates global warming.

"Research interest in the region is booming because the surface temperature is warming much faster here than anywhere else in the world," Panyushkina said. "It's a hot spot for climate research, and because I grew up there and understand how the system works, it's a natural topic of study for me. I'm also very interested in knowing the impact of an ice-free Arctic on the surrounding landscape. Humans have never seen an ice-free Arctic before, ever. My mind still cannot comprehend how the Arctic Ocean can be free of ice."

By the middle of the century, changing sea ice conditions are expected to lead to greater navigability for open-water vessels crossing the Arctic. A future trans-Arctic shipping route called the Supra Polar Route will link the Atlantic and Pacific Oceans through the Arctic, potentially paving the way for more trans-Arctic commerce.

There is a need to quantify the Arctic amplification impacts to manage and regulate Arctic seas of the future, Panyushkina said.

"This strong prospect of the global trade fleet entering the Arctic opens the Pandora's box of near-future geopolitical and environmental issues and reinforces the urgency for a new regulatory framework by international organizations to ensure adequate environmental protections and vessel safety standards," she said.

Read more at Science Daily

New MRI technique could improve diagnosis and treatment of multiple sclerosis

It is important that multiple sclerosis (MS) is diagnosed and treated as early as possible in order to delay progression of the disease. The technique of magnetic resonance imaging (MRI) plays a key role in this process. In the search for ever better methods, a new MRI technique has been used at MedUni Vienna as part of a research project that could pave the way to quicker assessment of disease activity in MS. The study was conducted by a research team led by Wolfgang Bogner at MedUni Vienna's Department of Biomedical Imaging and Image-guided Therapy and was recently published in the journal Radiology.

Multiple sclerosis is a disease of the central nervous system that manifests itself in changes (lesions) primarily in the brain. As yet, there is no cure for MS, but it can be effectively treated. Early diagnosis is critical to the prognosis, with highly detailed imaging techniques playing a major role. Although conventional MRI can detect brain lesions, scientists are researching methods to detect the changes at an earlier microscopic or biochemical stage. The method known as proton MR spectroscopy has been identified as a promising tool for this purpose.

Using this technique, the research group led by Eva Niess (formerly Heckova) and Wolfgang Bogner from MedUni Vienna's Department of Biomedical Imaging and Image-guided Therapy, working with scientists from MedUni Vienna's Department of Neurology, went one step further in their recently published study. The team used MR spectroscopy with a 7-tesla magnet to compare the neurochemical changes in the brains of 65 MS patients with those of 20 healthy controls. This particularly powerful imaging tool was co-developed by MedUni Vienna researchers and has been used for scientific studies, e.g., of the brain, at MedUni Vienna's Center of Excellence for High-Field MR since it was commissioned in 2008.

Identifying and predicting changes


Using 7-tesla MRI, MedUni Vienna researchers have now been able to identify MS-relevant neurochemicals, i.e. chemicals involved in the function of the nervous system. "This allowed us to visualize brain changes in regions that appear normal on conventional MRI scans," says study leader Wolfgang Bogner, pointing to one of the study's main findings. According to the study's lead author, Eva Niess, these findings could play a significant role in the care of MS patients in the future: "Some neurochemical changes that we've been able to visualize with the new technique occur early in the course of the disease and might not only correlate with disability but also predict further disease progression."

Clinical studies and further developments follow

More research is needed before these findings can be incorporated into clinical applications, explain Niess and Bogner. They say that the results already show 7-tesla spectroscopic MR imaging to be a valuable new tool in the diagnosis of multiple sclerosis and in the treatment of MS patients.

Read more at Science Daily

Respiratory viruses that hijack immune mechanisms may have Achilles' heel

One viral protein could provide the information needed to deter the pneumonia caused by the body's exaggerated inflammatory response to respiratory viruses, including the virus that causes COVID-19.

That viral protein is NS2 of Respiratory Syncytial Virus (RSV), and a study has found that if the virus lacks this protein, the human body's immune response can destroy the virus before exaggerated inflammation begins. The research, conducted at Washington State University's College of Veterinary Medicine, was published Jan. 18 in the journal MBio.

Like other respiratory viruses, including the COVID-19-causing SARS-CoV-2 virus, RSV infects the lung cells responsible for exchanging gases and uses them as factories to make more viruses. Uncontrollable virus multiplication in these cells leads to their destruction, severe inflammation, lung diseases like pneumonia, and sometimes death.

"Exaggerated inflammation clogs the airways and makes breathing difficult," said Kim Chiok, a WSU post-doctoral researcher who led the study. "This is why people who have these long-term and severe inflammatory responses get pneumonia and need help breathing, and it's why they end up in the hospital in the ICU."

Chiok and fellow WSU researchers are laying the framework to break that cycle by understanding how respiratory viruses, like RSV, persist in the cell. RSV causes 160,000 deaths annually, primarily in infants, children, the elderly and immune-compromised individuals, according to the National Institute of Allergy and Infectious Diseases.

The research was conducted in the laboratory of Professor Santanu Bose, who is part of WSU's Veterinary Microbiology and Pathology research unit. Chiok, a Fulbright Scholar from Peru who completed her Ph.D. at WSU, has spent the past two and a half years in the Bose laboratory exploring the mechanisms that regulate the virus-host battle.

The researchers first determined the functions of individual viral proteins by using viruses lacking the genes that code for those proteins and comparing them to a wild-type strain of the virus.

"The virus has a series of tools, some tools with multiple functions, we wanted to learn about these tools by essentially taking them away," Chiok said.

Each tool is a different viral protein.

Chiok identified the viral NS2 protein as a key regulator of autophagy, a cellular process that modulates immune defense during virus infection. Autophagy is mediated by a cellular protein known as Beclin1.

When the virus enters the cell, Beclin1 can recognize and clear the threat from the cell. It does this after being tagged with certain small proteins, the products of interferon-stimulated genes, through a process known as ISGylation. It is almost like Beclin1 is putting on a suit of armor, Chiok said.

The study showed that RSV's NS2 protein strips this "armor" from Beclin1, allowing the virus to persist and replicate within the cell. The virus then spreads to other cells, causing damage that initiates an exaggerated inflammatory response from the body and culminates in airway diseases like pneumonia. Without the NS2 protein, the virus is routinely destroyed by Beclin1.

Read more at Science Daily

Jan 17, 2022

Improving reading skills through action video games

Decoding letters into sound is a key step in learning to read, but it is not enough to master reading. "Reading calls upon several other essential mechanisms that we don't necessarily think about, such as knowing how to move our eyes on the page or how to use our working memory to link words together in a coherent sentence," points out Daphné Bavelier, a professor in the Psychology Section of the Faculty of Psychology and Educational Sciences (FPSE) at the UNIGE. "These other skills, such as vision, the deployment of attention, working memory, and cognitive flexibility, are known to be improved by action video games," explains Angela Pasqualotto, first author of this study, which is based on her PhD thesis at the Department of Psychology and Cognitive Science of the University of Trento under the direction of Professors Venuti and De Angeli.

A child-friendly action video game to support learning

With this in mind, the researchers designed a video game that combines action video game mechanics with mini games that train different executive functions, such as working memory, inhibition and cognitive flexibility, functions that are called upon during reading. "The universe of this game is an alternative world in which the child, accompanied by his Raku, a flying creature, must carry out different missions to save planets and progress in the game," Angela Pasqualotto adds. The idea is to reproduce the components of an action game, without incorporating violence, so that it is suitable for young children. "For example, the Raku flies through a meteor shower, moving around to avoid the meteors or aiming at them to weaken their impact, while collecting useful resources for the rest of the game, a bit like what you find in action video games."

The scientists then worked with 150 Italian schoolchildren aged 8 to 12, divided into two groups: the first played the video game developed by the team, and the second played Scratch, a game that teaches children how to code. Both games require attentional control and executive functions, but in different ways. The action video game asks children to perform tasks within a time limit, such as remembering a sequence of symbols or responding only when the Raku makes a specific sound, with the difficulty of these tasks increasing according to the child's performance. Scratch, the control game, requires planning, reasoning and problem solving: children must manipulate objects and logical structures to establish the desired programming sequence.

"First, we tested the children's ability to read words, non-words and paragraphs, and also we conducted an attention test that measures the child's attentional control, a capacity we know is trained by action video games," explains Daphne Bavelier. The children then followed the training with either the action video game or the control game, for six weeks, two hours a week under supervision at school. Children were tested at school by clinicians of the Laboratory of Observation Diagnosis and Education (UNITN).

Long-term improvement in reading skills

Shortly after the end of the training, the scientists repeated the tests on both groups of children. "We found a 7-fold improvement in attentional control in the children who played the action video game compared to the control group," says Angela Pasqualotto. Even more remarkably, the research team observed a clear enhancement in reading, not only in terms of reading speed, but also in accuracy, whereas no improvement was noted for the control group. This improvement in literacy occurs even though the action video game does not require any reading activity.

"What is particularly interesting about this study is that we carried out three further assessment tests at 6 months, 12 months and 18 months after training. On each occasion, the trained children performed better than the control group, which proves that these improvements were sustained," Angela Pasqualotto says. Moreover, the grades in Italian of the trained children became significantly better over time, showing a virtuous improvement in learning ability. "The effects are thus long-term, in line with the action video game strengthening the ability to learn how to learn," says Daphne Bavelier.

Read more at Science Daily

Do we get our most creative ideas when walking?

Movement helps us to think creatively. This insight is over 2000 years old -- it was already known to the philosophers of ancient Greece.

However, what is the connection between movement and cognition from a scientific point of view? What happens in the brain when we walk? Are people who rarely move less creative?

"Our research shows that it is not movement per se that helps us to think more flexibly," says neuroscientist Dr Barbara Händel from Julius-Maximilians-Universität Würzburg (JMU) in Bavaria, Germany. Instead, the freedom to make self-determined movements is responsible for it.

Accordingly, even small movements while sitting can have the same positive effects on creative thinking. However, the researcher does not derive any concrete movement suggestions from her work: "The important thing is the freedom to move without external constraints."

Don't stare at small screens for too long

It is important, she says, that movement is not suppressed or forced into regular patterns. "Unfortunately, this happens when people focus for example on a small screen," explains the JMU researcher.

The increased use of mobile phones and similar devices -- including in education during the coronavirus pandemic -- could therefore have a negative effect on cognitive processes such as creativity.

The experiments that Barbara Händel and her doctoral student Supriya Murali conducted are described in detail in a recent publication in the journal Psychological Research.

Background

How do people perceive their environment? What effect do sensory stimuli have in the peripheral nervous system and what in the brain? What influence do body movements have on perception of sensory input? Researchers like Barbara Händel are interested in such questions for many reasons. In the long term, their findings could contribute to a better understanding of diseases that affect body movements as well as cognitive processes.

From February 2022, the scientist will continue her research at the Neurological Clinic of Würzburg University Hospital. There she plans to focus on the topics of Parkinson's disease and ADHD.

Read more at Science Daily

Repeated exposure to major disasters has long-term mental health impacts

Repeated exposure to major disasters does not make people mentally stronger, a recent study from the Texas A&M University School of Public Health found: individuals who have lived through multiple major disasters show a reduction in mental health scores.

Additionally, the research team found that the more experience the individuals had with such events, the lower their mental health was.

"We discovered the reverse of the adage 'what does not kill you makes you stronger,'" said the study's lead author Garett Sansom, research assistant professor in the Department of Environmental & Occupational Health at the School of Public Health.

Sansom and a team of Texas A&M researchers studied individuals from the Houston area, which is susceptible to hurricanes and flooding as well as industrial emergencies. The results of the study were published recently in the journal Natural Hazards.

From 2000 to 2020, Texas -- one of the states most prone to natural disasters -- experienced 33 Federal Emergency Management Agency (FEMA) declared major disasters. Many of these -- hurricanes, winter weather, drought and flooding -- impacted the Houston area. The area has also been impacted by emergencies such as explosions and chemical releases at nearby industrial facilities.

According to the research team, the combination of natural disasters and emergencies from industrial facilities presents a unique opportunity to observe the impacts.

"There is an unfortunate truth that many communities that reside along the Gulf Coast are at the nexus of exposures from natural and anthropogenic, or human-caused, hazards," Sansom said.

The team used a 12-item short form health survey to gather information. The survey assessed cumulative impacts from exposure to evaluate changes over time, producing a composite score for both mental (MCS) and physical (PCS) health.

The majority of the respondents reported that they experienced many hazardous events over the past five years. Hurricanes and flooding (96.35 percent) were the events experienced most often, followed by industrial fires (96.08 percent), chemical spills (86.84 percent) and tornadoes (79.82 percent).

The team found that when individuals experienced two or more events over the past five years, their MCS averages fell below the expected national levels.
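As a rough illustration of that kind of comparison (a minimal sketch, not the authors' code or data), the example below groups hypothetical SF-12 mental component summary (MCS) scores by self-reported disaster-exposure counts and checks the average for respondents with two or more events against an assumed national norm of about 50; the column names and every value are invented for the example.

# Minimal sketch (hypothetical data): comparing SF-12 mental component
# summary (MCS) averages across self-reported disaster-exposure counts.
import pandas as pd

# Invented respondent-level data: events experienced in the past five
# years and the resulting MCS composite score.
df = pd.DataFrame({
    "events_5yr": [0, 1, 2, 3, 4, 2, 5, 1, 3, 0],
    "mcs":        [52, 50, 46, 44, 41, 47, 38, 51, 43, 53],
})

NATIONAL_MCS_MEAN = 50  # assumed approximate SF-12 population norm

summary = (
    df.assign(two_or_more=df["events_5yr"] >= 2)
      .groupby("two_or_more")["mcs"]
      .agg(["mean", "count"])
)
print(summary)
print("Two-or-more group below national MCS norm:",
      summary.loc[True, "mean"] < NATIONAL_MCS_MEAN)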

"Mental health is often overlooked in responding to and preparing for hazard exposures," Sansom said. "However, in order to reach community resilience efforts, mental conditions need to be accounted for."

The results of the study help to reveal the long-term mental impact hazards can have. More importantly, they underscore the need for public health interventions targeted toward these individuals as well as the communities where they reside.

Read more at Science Daily

Nanotherapy offers new hope for the treatment of Type 1 diabetes

Individuals living with Type 1 diabetes must carefully follow prescribed insulin regimens every day, receiving injections of the hormone via syringe, insulin pump or some other device. And without viable long-term alternatives, this course of treatment is a lifelong sentence.

Pancreatic islets control insulin production when blood sugar levels change, and in Type 1 diabetes, the body's immune system attacks and destroys these insulin-producing cells. Islet transplantation has emerged over the past few decades as a potential cure for Type 1 diabetes. With healthy transplanted islets, Type 1 diabetes patients may no longer need insulin injections, but transplantation efforts have faced setbacks because the immune system eventually rejects the new islets. Current immunosuppressive drugs offer inadequate protection for transplanted cells and tissues and are plagued by undesirable side effects.

Now a team of researchers at Northwestern University has discovered a technique to help make immunomodulation more effective. The method uses nanocarriers to re-engineer the commonly used immunosuppressant rapamycin. Using these rapamycin-loaded nanocarriers, the researchers generated a new form of immunosuppression capable of targeting specific cells related to the transplant without suppressing wider immune responses.

The paper was published today (Jan. 17) in the journal Nature Nanotechnology. The Northwestern team is led by Evan Scott, the Kay Davis Professor and an associate professor of biomedical engineering at Northwestern's McCormick School of Engineering and of microbiology-immunology at Northwestern University Feinberg School of Medicine, and Guillermo Ameer, the Daniel Hale Williams Professor of Biomedical Engineering at McCormick and of Surgery at Feinberg. Ameer also serves as the director of the Center for Advanced Regenerative Engineering (CARE).

Specifying the body's attack

Ameer has been working on improving the outcomes of islet transplantation by providing islets with an engineered environment, using biomaterials to optimize their survival and function. However, problems associated with traditional systemic immunosuppression remain a barrier to the clinical management of patients and must also be addressed to truly have an impact on their care, said Ameer.

"This was an opportunity to partner with Evan Scott, a leader in immunoengineering, and engage in a convergence research collaboration that was well executed with tremendous attention to detail by Jacqueline Burke, a National Science Foundation Graduate Research Fellow," Ameer said.

Rapamycin is well studied and commonly used to suppress immune responses during other types of treatment and transplants, and it is notable for its wide range of effects on many cell types throughout the body. Typically delivered orally, rapamycin's dosage must be carefully monitored to prevent toxic effects. Yet at lower doses it has poor effectiveness in cases such as islet transplantation.

Scott, also a member of CARE, said he wanted to see how the drug could be enhanced by putting it in a nanoparticle and "controlling where it goes within the body."

"To avoid the broad effects of rapamycin during treatment, the drug is typically given at low dosages and via specific routes of administration, mainly orally," Scott said. "But in the case of a transplant, you have to give enough rapamycin to systemically suppress T cells, which can have significant side effects like hair loss, mouth sores and an overall weakened immune system."

Following a transplant, immune cells called T cells will reject newly introduced foreign cells and tissues. Immunosuppressants are used to inhibit this effect but can also impair the body's ability to fight other infections by shutting down T cells across the body. The team instead formulated the nanocarrier and drug mixture to have a more specific effect: rather than directly modulating T cells -- the most common therapeutic target of rapamycin -- the nanoparticle was designed to target and modify antigen-presenting cells (APCs), allowing for more targeted, controlled immunosuppression.

Using nanoparticles also enabled the team to deliver rapamycin through a subcutaneous injection, which they discovered uses a different metabolic pathway to avoid extensive drug loss that occurs in the liver following oral administration. This route of administration requires significantly less rapamycin to be effective -- about half the standard dose.

"We wondered, can rapamycin be re-engineered to avoid non-specific suppression of T cells and instead stimulate a tolerogenic pathway by delivering the drug to different types of immune cells?" Scott said. "By changing the cell types that are targeted, we actually changed the way that immunosuppression was achieved."

A 'pipe dream' come true in diabetes research

The team tested the hypothesis on mice, inducing diabetes in the animals before treating them with a combination of islet transplantation and rapamycin, delivered either via the standard Rapamune® oral regimen or via their nanocarrier formulation. Beginning the day before transplantation, mice were given injections of the altered drug and continued injections every three days for two weeks.

The team observed minimal side effects in the mice and found that diabetes was eradicated for the length of their 100-day trial, and the treatment is expected to last the lifespan of the transplant. The team also demonstrated that the population of mice treated with the nano-delivered drug had a "robust immune response" compared to mice given standard treatments of the drug.

The concept of enhancing and controlling side effects of drugs via nanodelivery is not a new one, Scott said. "But here we're not enhancing an effect, we are changing it -- by repurposing the biochemical pathway of a drug, in this case mTOR inhibition by rapamycin, we are generating a totally different cellular response."

The team's discovery could have far-reaching implications. "This approach can be applied to other transplanted tissues and organs, opening up new research areas and options for patients," Ameer said. "We are now working on taking these very exciting results one step closer to clinical use."

Jacqueline Burke, the first author on the study and a National Science Foundation Graduate Research Fellow and researcher working with Scott and Ameer at CARE, said she could hardly believe her readings when she saw the mice's blood sugar plummet from highly diabetic levels to an even number. She kept double-checking to make sure it wasn't a fluke, but saw the number sustained over the course of months.

Research hits close to home

For Burke, a doctoral candidate studying biomedical engineering, the research hits close to home: daily shots have been a well-known part of her life since she was diagnosed with Type 1 diabetes at age nine, and she has long known she wanted to contribute to the field.

"At my past program, I worked on wound healing for diabetic foot ulcers, which are a complication of Type 1 diabetes," Burke said. "As someone who's 26, I never really want to get there, so I felt like a better strategy would be to focus on how we can treat diabetes now in a more succinct way that mimics the natural occurrences of the pancreas in a non-diabetic person."

The all-Northwestern research team has been working on experiments and publishing studies on islet transplantation for three years, and both Burke and Scott say the work they just published could have been broken into two or three papers. What they have published now, though, they consider a breakthrough that could have major implications for the future of diabetes research.

Read more at Science Daily

Jan 16, 2022

Newly-found planets on the edge of destruction

Three newly discovered planets are orbiting dangerously close to stars nearing the end of their lives.

Out of the thousands of extrasolar planets found so far, these three gas giants, first detected by NASA's Transiting Exoplanet Survey Satellite (TESS) mission, have some of the shortest-period orbits known around subgiant or giant stars. One of the planets, TOI-2337b, will be consumed by its host star in less than 1 million years, sooner than any other currently known planet.

"These discoveries are crucial to understanding a new frontier in exoplanet studies: how planetary systems evolve over time," explained lead author Samuel Grunblatt, a postdoctoral fellow at the American Museum of Natural History and the Flatiron Institute in New York City. Grunblatt, who earned his PhD from the University of Hawai?i Institute for Astronomy (UH IfA), added that "these observations offer new windows into planets nearing the end of their lives, before their host stars swallow them up."

Grunblatt announced the discovery and confirmation of these planets -- TOI-2337b, TOI-4329b, and TOI-2669b -- at an American Astronomical Society press conference today; the study has been accepted for publication in the Astronomical Journal.

The researchers estimate that the planets have masses between 0.5 and 1.7 times Jupiter's mass, and sizes that range from slightly smaller to more than 1.6 times the size of Jupiter. They also span a wide range of densities, from styrofoam-like to three times denser than water, implying a wide variety of origins.
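For context on how such density comparisons are made, the short sketch below shows the back-of-envelope calculation of bulk density from a planet's mass and radius in Jupiter units. It is not the study's analysis; the mass-radius pairings are illustrative end-members chosen to span the reported ranges, not the measured values for TOI-2337b, TOI-4329b or TOI-2669b.

# Minimal sketch (illustrative values, not the paper's measurements):
# bulk density of a gas giant from its mass and radius in Jupiter units.
import math

M_JUP = 1.898e27      # Jupiter mass, kg
R_JUP = 6.9911e7      # Jupiter radius, m
RHO_WATER = 1000.0    # kg/m^3

def bulk_density(mass_mj, radius_rj):
    """Mean density in kg/m^3 for a planet given in Jupiter masses/radii."""
    mass = mass_mj * M_JUP
    volume = (4.0 / 3.0) * math.pi * (radius_rj * R_JUP) ** 3
    return mass / volume

# Hypothetical end-members spanning the quoted mass and size ranges.
for m, r in [(0.5, 1.6), (1.7, 0.9)]:
    rho = bulk_density(m, r)
    print(f"{m} M_Jup, {r} R_Jup -> {rho:.0f} kg/m^3 "
          f"({rho / RHO_WATER:.2f} x water)")
# The low-mass, inflated case comes out styrofoam-light, while the
# high-mass, compact case is roughly three times denser than water.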

These three planets are believed to be just the tip of the iceberg. "We expect to find tens to hundreds of these evolved transiting planet systems with TESS, providing new details on how planets interact with each other, inflate, and migrate around stars, including those like our Sun," said Nick Saunders, a graduate student at UH IfA and co-author of the study.

The planets were first found in NASA TESS Mission full-frame image data taken in 2018 and 2019. Grunblatt and his collaborators identified the candidate planets in TESS data, and then used W. M. Keck Observatory's High-Resolution Echelle Spectrometer (HIRES) on Maunakea, Hawaiʻi, to confirm the existence of the three planets.

"The Keck observations of these planetary systems are critical to understanding their origins, helping reveal the fate of solar systems like our own," said UH IfA Astronomer Daniel Huber, who co-authored the study.

Current models of planet dynamics suggest that planets should spiral in toward their host stars as the stars evolve over time, particularly in the last 10 percent of the star's lifetime. This process also heats the planets, potentially causing their atmospheres to inflate. However, this stellar evolution will also cause the orbits of planets around the host star to come closer to one another, increasing the likelihood that some of them will collide, or even destabilize the entire planetary system.

The wide variety of planet densities found in the study suggests that these planetary systems have been shaped through chaotic planet-to-planet interactions. This could also have resulted in unpredictable heating rates and timescales for these planets, giving them the wide range in densities we observe today.

Future observations of one of these systems, TOI-4329, with the recently-launched James Webb Space Telescope could reveal evidence for water or carbon dioxide in the planet's atmosphere. If these molecules are seen, the data would provide constraints on where these planets formed, and what sort of interactions had to occur to produce the planetary orbits we see today.

Continued monitoring of these systems with the NASA TESS telescope will constrain the rate at which these planets are spiraling into their host stars. So far, no clear signal of orbital decay has been observed in any of the systems, but a longer baseline of observations with the TESS Extended Missions will provide much tighter constraints on planet in-spiral than are currently possible, revealing how strongly planetary systems are affected by stellar evolution.
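One standard way to constrain in-spiral from continued monitoring is to fit a quadratic ephemeris to transit mid-times, where a significantly negative quadratic term would signal a shrinking orbital period. The sketch below illustrates the idea with synthetic timings and an invented period; it is not the teams' TESS analysis.

# Minimal sketch (synthetic data, hypothetical planet): constraining
# orbital decay by fitting T(E) = T0 + P*E + 0.5*(dP/dE)*E^2 to
# transit mid-times.
import numpy as np

rng = np.random.default_rng(0)
P_true, T0_true = 2.99, 2458000.0     # days (invented values)
epochs = np.arange(0, 400, 7)         # observed transit epochs
timing_err = 0.001                    # ~1.4-minute timing uncertainty, days

# Synthetic mid-times with no real decay, plus timing noise.
mid_times = T0_true + P_true * epochs + rng.normal(0, timing_err, epochs.size)

# Quadratic fit: coefficients are [0.5*dP/dE, P, T0].
coeffs, cov = np.polyfit(epochs, mid_times, 2, cov=True)
dP_dE = 2 * coeffs[0]
dP_dE_err = 2 * np.sqrt(cov[0, 0])
print(f"dP/dE = {dP_dE:.2e} +/- {dP_dE_err:.2e} days per orbit")
# A value consistent with zero means no detectable decay yet; extending
# the observing baseline shrinks the uncertainty on the quadratic term.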

Read more at Science Daily

New insights into seasons on a planet outside our solar system

Imagine being in a place where the winds are so strong that they move at the speed of sound. That's just one aspect of the atmosphere on XO-3b, one of a class of exoplanets (planets outside our solar system) known as hot Jupiters. The planet's eccentric orbit also leads to seasonal variations hundreds of times stronger than what we experience on Earth. In a recent paper, a McGill-led research team provides new insight into what seasons look like on a planet outside our solar system. The researchers also suggest that the oval orbit, extremely high surface temperatures (2,000 degrees C, hot enough to vaporize rock) and "puffiness" of XO-3b reveal traces of the planet's history. The findings could both advance the scientific understanding of how exoplanets form and evolve and give some context for planets in our own solar system.

Hot Jupiters are massive, gaseous worlds like Jupiter that orbit closer to their parent stars than Mercury is to the Sun. Though not present in our own solar system, they appear to be common throughout the galaxy. Despite being the most studied type of exoplanet, major questions remain about how they form. Could there be subclasses of hot Jupiters with different formation stories? For example, do these planets take shape far from their parent stars -- at a distance where it's cold enough for molecules such as water to become solid -- or closer in? The first scenario fits better with theories about how planets in our own solar system are born, but what would drive these types of planets to migrate so close to their parent stars remains unclear.

To test those ideas, the authors of a recent McGill-led study used data from NASA's retired Spitzer Space Telescope to look at the atmosphere of exoplanet XO-3b. They observed eccentric seasons and measured wind speeds on the planet by obtaining a phase curve of the planet as it completed a full revolution about its host star.

Looking at atmospheric dynamics and interior evolution

"This planet is an extremely interesting case study for atmospheric dynamics and interior evolution, as it lies in an intermediate regime of planetary mass where processes normally neglected for less massive hot Jupiters may come into play," says Lisa Dang, the first author of a paper published recently in The Astronomical Journal, a PhD student at McGill University's Department of Physics. "XO-3b has an oval orbit rather than the circular orbit of almost all other known hot Jupiters. This suggests that it recently migrated toward its parent star; if that's the case, it will eventually settle into a more circular orbit."

The eccentric orbit of the planet also leads to seasonal variations hundreds of times stronger than what we experience on Earth. Nicolas Cowan, a McGill professor, explains: "The entire planet receives three times more energy when it is close to its star during a brief sort of summer than when it is far from the star."
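That factor of three follows from the inverse-square dependence of stellar flux on distance: the energy received at closest approach (periastron) versus farthest separation (apastron) scales as ((1 + e) / (1 - e))^2 for an orbit of eccentricity e. The quick check below, which is not from the paper, simply solves for the eccentricity implied by a flux ratio of 3.

# Back-of-envelope check: eccentricity implied by a periastron-to-apastron
# flux ratio of about 3, given flux ~ 1/r^2.
import math

ratio = 3.0
e = (math.sqrt(ratio) - 1) / (math.sqrt(ratio) + 1)
print(f"flux ratio {ratio:.0f} -> eccentricity e ~ {e:.2f}")
# Prints e ~ 0.27, consistent with the moderately eccentric "oval"
# orbit described above.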

The researchers also re-estimated the planet's mass and radius and found that the planet is puffier than expected. They suggest that a possible source of this extra heating could be leftover nuclear fusion.

Excess warmth and puffiness due to tidal heating?

Observations by Gaia, a European Space Agency (ESA) mission, found that the planet is puffier than expected, which indicates that its interior may be particularly energetic. Spitzer observations also hint that the planet produces much of its own heat, since XO-3b's excess thermal emission isn't seasonal -- it is observed throughout the planet's year. It's possible that the excess warmth is coming from the planet's interior, through a process called tidal heating. The star's gravitational squeeze on the planet oscillates as the oblong orbit takes the planet farther from and then closer to the star. The resulting changes in interior pressure produce heat.

Read more at Science Daily