Jul 17, 2021

Galactic fireworks: New ESO images reveal stunning features of nearby galaxies

A team of astronomers has released new observations of nearby galaxies that resemble colourful cosmic fireworks. The images, obtained with the European Southern Observatory's Very Large Telescope (ESO's VLT), show different components of the galaxies in distinct colours, allowing astronomers to pinpoint the locations of young stars and the gas they warm up around them. By combining these new observations with data from the Atacama Large Millimeter/submillimeter Array (ALMA), in which ESO is a partner, the team is helping shed new light on what triggers gas to form stars.

Astronomers know that stars are born in clouds of gas, but what sets off star formation, and how galaxies as a whole play into it, remains a mystery. To understand this process, a team of researchers has observed various nearby galaxies with powerful telescopes on the ground and in space, scanning the different galactic regions involved in stellar births.

"For the first time we are resolving individual units of star formation over a wide range of locations and environments in a sample that well represents the different types of galaxies," says Eric Emsellem, an astronomer at ESO in Germany and lead of the VLT-based observations conducted as part of the Physics at High Angular resolution in Nearby GalaxieS (PHANGS) project. "We can directly observe the gas that gives birth to stars, we see the young stars themselves, and we witness their evolution through various phases."

Emsellem, who is also affiliated with the University of Lyon, France, and his team have now released their latest set of galactic scans, taken with the Multi-Unit Spectroscopic Explorer (MUSE) instrument on ESO's VLT in the Atacama Desert in Chile. They used MUSE to trace newborn stars and the warm gas around them, which is illuminated and heated up by the stars and acts as a smoking gun of ongoing star formation.

The new MUSE images are now being combined with observations of the same galaxies taken with ALMA and released earlier this year. ALMA, which is also located in Chile, is especially well suited to mapping cold gas clouds -- the parts of galaxies that provide the raw material out of which stars form.

By combining MUSE and ALMA images, astronomers can examine the galactic regions where star formation is happening, compared to where it is expected to happen, so as to better understand what triggers, boosts or holds back the birth of new stars. The resulting images are stunning, offering a spectacularly colourful insight into stellar nurseries in our neighbouring galaxies.

"There are many mysteries we want to unravel," says Kathryn Kreckel from the University of Heidelberg in Germany and PHANGS team member. "Are stars more often born in specific regions of their host galaxies -- and, if so, why? And after stars are born how does their evolution influence the formation of new generations of stars?"

Astronomers will now be able to answer these questions thanks to the wealth of MUSE and ALMA data the PHANGS team have obtained. MUSE collects spectra -- the "bar codes" astronomers scan to unveil the properties and nature of cosmic objects -- at every single location within its field of view, thus providing much richer information than traditional instruments. For the PHANGS project, MUSE observed 30,000 nebulae of warm gas and collected about 15 million spectra of different galactic regions. The ALMA observations, on the other hand, allowed astronomers to map around 100,000 cold-gas regions across 90 nearby galaxies, producing an unprecedentedly sharp atlas of stellar nurseries in the nearby Universe.
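
To picture what "a spectrum at every single location" means in practice, here is a minimal Python sketch of an integral-field datacube. The dimensions are toy numbers and random values stand in for real MUSE data; the slice around the H-alpha line is just one common way to trace the warm, ionised gas, not ESO pipeline code:

    import numpy as np

    # Toy dimensions; a real MUSE cube has ~300x300 spaxels x ~3,700 channels.
    nx, ny, nlam = 100, 100, 1000
    wavelengths = np.linspace(4650.0, 9300.0, nlam)  # MUSE range, in Angstroms
    cube = np.random.rand(nx, ny, nlam)              # stand-in for real data

    # Every sky position (x, y) holds a full spectrum -- the "bar code".
    spectrum = cube[50, 50, :]

    # Summing a narrow slice around H-alpha (6563 Angstroms) maps the warm,
    # ionised gas that flags ongoing star formation.
    sel = np.abs(wavelengths - 6563.0) < 5.0
    halpha_map = cube[:, :, sel].sum(axis=2)
    print(spectrum.shape, halpha_map.shape)  # (1000,) (100, 100)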

In addition to ALMA and MUSE, the PHANGS project also features observations from the NASA/ESA Hubble Space Telescope. The various observatories were selected to allow the team to scan our galactic neighbours at different wavelengths (visible, near-infrared and radio), with each wavelength range unveiling distinct parts of the observed galaxies. "Their combination allows us to probe the various stages of stellar birth -- from the formation of the stellar nurseries to the onset of star formation itself and the final destruction of the nurseries by the newly born stars -- in more detail than is possible with individual observations," says PHANGS team member Francesco Belfiore from INAF-Arcetri in Florence, Italy. "PHANGS is the first time we have been able to assemble such a complete view, taking images sharp enough to see the individual clouds, stars, and nebulae that signify forming stars."

The work carried out by the PHANGS project will be further honed by upcoming telescopes and instruments, such as NASA's James Webb Space Telescope. The data obtained in this way will lay further groundwork for observations with ESO's future Extremely Large Telescope (ELT), which will start operating later this decade and will enable an even more detailed look at the structures of stellar nurseries.

Read more at Science Daily

New UK study reveals extent of brain complications in children hospitalized with COVID-19

Although the risk of a child being admitted to hospital due to COVID-19 is small, a new UK study has found that around 1 in 20 children hospitalised with COVID-19 develop brain or nerve complications linked to the viral infection.

The research, published in The Lancet Child and Adolescent Health and led by the University of Liverpool, identifies a wide spectrum of neurological complications in children and suggests these complications may be more common in children than in adults admitted with COVID-19.

While neurological problems have been reported in children with the newly described post-COVID condition paediatric inflammatory multisystem syndrome temporally associated with SARS-CoV-2 (PIMS-TS), the capacity of COVID-19 to cause a broad range of nervous system complications in children has been under-recognised.

To address this, the CoroNerve Studies Group, a collaboration between the universities of Liverpool, Newcastle, Southampton and UCL, developed a real-time UK-wide notification system in partnership with the British Paediatric Neurology Association.

Between April 2020 and January 2021, they identified 52 cases of children less than 18 years old with neurological complications among 1,334 children hospitalised with COVID-19, giving an estimated prevalence of 3.8%. This compares to an estimated prevalence of 0.9% in adults admitted with COVID-19.

Eight (15%) children presenting with neurological features did not have COVID-19 symptoms although the virus was detected by PCR, underscoring the importance of screening children with acute neurological disorders for the virus.

Ethnicity was found to be a risk factor, with over two thirds of affected children being of Black or Asian background.

For the first time, the study identified key differences between children with PIMS-TS and those with non-PIMS-TS neurological complications. The 25 children (48%) diagnosed with PIMS-TS displayed multiple neurological features including encephalopathy, stroke, behavioural change, and hallucinations; they were more likely to require intensive care. Conversely, the 27 children (52%) without PIMS-TS had a primary neurological disorder such as prolonged seizures, encephalitis (brain inflammation), Guillain-Barré syndrome and psychosis. In almost half of these cases, this was a recognised post-infectious neuro-immune disorder, compared to just one child in the PIMS-TS group, suggesting that different immune mechanisms are at work.

Short-term outcomes were apparently good in two thirds (65%) of the children, although a third (33%) had some degree of disability at the time of follow-up and one child had died. However, the impacts on the developing brain and longer-term consequences are not yet known.

First author Dr Stephen Ray, a Wellcome Trust clinical fellow and paediatrician at the University of Liverpool said: "The risk of a child being admitted to hospital due to COVID-19 is small, but among those hospitalised, brain and nerve complications occur in almost 4%. Our nationwide study confirms that children with the novel post-infection hyper-inflammatory syndrome PIMS-TS can have brain and nerve problems; but we have also identified a wide spectrum of neurological disorders due to COVID-19 in children who didn't have PIMS-TS. These were often due to the child's immune response after COVID-19 infection."

Joint senior-author Dr Rachel Kneen, a Consultant Paediatric Neurologist at Alder Hey Children's NHS Foundation Trust and honorary clinical Senior Lecturer at the University of Liverpool said: "Many of the children identified were very unwell. Whilst they had a low risk of death, half needed intensive care support and a third had neurological disability identified. Many were given complex medication and treatments, often aimed at controlling their own immune system. We need to follow these children up to understand the impact in the long term."

Read more at Science Daily

Jul 16, 2021

Bats are kings of small talk in the air

Bat conversations might be light on substance, according to researchers from the University of Cincinnati.

Echoes from bats are so simple that a sound file of their calls can be compressed 90% without losing much information, according to a study published in the journal PLOS Computational Biology.

The study demonstrates how bats have evolved to rely on redundancy in their navigational "language" to help them stay oriented in their complex three-dimensional world.

"If you can make decisions with little information, everything becomes simpler. That's nice because you don't need a lot of complex neural machinery to process and store that information," study co-author Dieter Vanderelst said.

UC researchers suspected that the calls of bats contain redundant information and that bats might use efficient encoding strategies to extract the most relevant information from their echoes. Many natural stimuli encountered by animals have a lot of redundancy. Efficient neural encoding retains essential information while reducing this redundancy.

To test their hypothesis, they built their own "bat on a stick," a tripod-mounted device that emits a pulse of sound sweeping from 30 to 70 kilohertz, a frequency range used by many bats. By comparison, human speech typically ranges from 125 to 300 hertz (or 0.125 to 0.3 kHz).

More than 1,000 echoes were captured in distinct indoor and outdoor environments such as in a barn, in different-sized rooms, among bushes and tree branches and in a garden.

Researchers converted the recorded echoes to a graph of the sound, called a cochleogram. Then they subjected these graphs to 25 filters -- essentially compressing the data. They trained a neural network, a computer system modeled on the human brain, to determine if the filtered graphs still contained enough information to complete a number of sonar-based tasks known to be performed by bats.

They found that the neural network correctly identified the location of the echoes even when the cochleogram was stripped of as much as 90% of its data.
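
The pipeline -- a spectrogram-like representation, aggressive compression, then a classifier -- can be mocked up end to end. The sketch below is illustrative only: it uses synthetic chirps and invented parameters rather than the study's recordings or its actual cochleogram filters, but it shows how 25 coarse values can still carry enough information to tell echoes apart:

    import numpy as np
    from scipy.signal import spectrogram
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    FS = 200_000  # sampling rate in Hz, enough for a 30-70 kHz sweep

    def synth_echo(delay_s):
        """A 30-70 kHz chirp plus a delayed, attenuated, noisy echo."""
        t = np.arange(0, 0.01, 1 / FS)                    # 10 ms call
        call = np.sin(2 * np.pi * (30e3 + 2e6 * t) * t)   # sweeps 30 to 70 kHz
        sig = np.zeros(int(0.04 * FS))
        sig[:t.size] += call
        d = int(delay_s * FS)
        sig[d:d + t.size] += 0.5 * call                   # the echo
        return sig + 0.05 * rng.standard_normal(sig.size)

    # Two classes of scene: a "near" reflector and a "far" one.
    X, y = [], []
    for label, delay in [(0, 0.012), (1, 0.022)]:
        for _ in range(200):
            sig = synth_echo(delay + rng.normal(0, 5e-4))
            _, _, S = spectrogram(sig, fs=FS, nperseg=256)
            env = S.mean(axis=0)                 # energy envelope over time
            bands = np.array_split(env, 25)      # keep just 25 coarse values,
            X.append([b.mean() for b in bands])  # discarding >90% of the data
            y.append(label)

    Xtr, Xte, ytr, yte = train_test_split(np.array(X), np.array(y), random_state=0)
    net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
    print("accuracy on compressed echoes:", net.fit(Xtr, ytr).score(Xte, yte))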

"What that tells us is you can compress that data and still do what you need to do. It also means if you're a bat, you can do this efficiently," said Vanderelst, an assistant professor in UC's College of Arts and Sciences and in the College of Engineering and Applied Science.

Vanderelst said researchers often can infer what bats are doing just by listening to their calls.

"Even if you don't see the bat, you can tell with a high degree of certainty what a bat is doing," he said. "If it calls more frequently, it's looking for something. If the calls are spread out, it's cruising or studying something far away."

Bats produce their ultrasonic calls with a larynx much like ours. But what a voice box. It can contract 200 times a second, making it the fastest known muscle in all mammals.

The nighttime forest can be deafening to people because of its chorus of frogs and drone of insects. But Vanderelst said the ultrasonic range by comparison is pretty quiet, allowing bats to hear their own chittering calls that bounce off tree branches and other obstacles during echolocation.

While bats use different chirps for navigating than for communicating with each other, Vanderelst said they're all pretty simple. But human language, he noted, has lots of built-in redundancy as well.

Fr xmpl, cn y rd ths sntnc wth mssng vwls?

"Take out a lot of letters in a sentence and it's still readable," Vanderelst said.

UC graduate Adarsh Chitradurga Achutha, Vanderelst's student, was the study's lead author. Co-authors include Vanderelst's mentor Herbert Peremans at the University of Antwerp, Belgium, and bat expert Uwe Firzlaff with the University of Munich, Germany.

The way bats perceive the world is fascinating both from biological and engineering perspectives, Vanderelst said.

Read more at Science Daily

The paradox of a free-electron laser without the laser

A new way of producing coherent light in the ultraviolet spectral region, which points the way to developing brilliant table-top x-ray sources, has been demonstrated in research led at the University of Strathclyde.

The scientists have developed a type of ultra-short wavelength coherent light source that does not require laser action to produce coherence. Common electron-beam based light sources, known as fourth-generation light sources, are based on the free-electron laser (FEL), which uses an undulator to convert electron beam energy into X-rays.

Coherent light sources are powerful tools that enable research in many areas of medicine, biology, material sciences, chemistry and physics.

This new way of producing coherent radiation could revolutionise light sources, as it would make them highly compact, essentially table-top size, and capable of producing ultra-short duration pulses of light, much shorter than can be produced easily by any other means.

Making ultraviolet and X-ray coherent light sources more widely available would transform the way science is done; a university could have one of the devices in a single room, on a table top, for a reasonable price.

The group is now planning a proof-of-principle experiment in the ultraviolet spectral range to demonstrate this new way of producing coherent light. If successful, it should dramatically accelerate the development of even shorter wavelength coherent sources based on the same principle. The Strathclyde group has set up a facility to investigate these types of sources: the Scottish Centre for the Application of Plasma-based Accelerators (SCAPA), which hosts one of the highest power lasers in the UK.

The new research has been published in Scientific Reports, one of the Nature family of journals.

Professor Dino Jaroszynski, of Strathclyde's Department of Physics, led the research. He said: "This work significantly advances the state-of-the-art of synchrotron sources by proposing a new method of producing short-wavelength coherent radiation, using a short undulator and attosecond duration electron bunches.

"This is more compact and less demanding on the electron beam quality than free-electron lasers and could provide a paradigm shift in light sources, which would stimulate a new direction of research. It proposes to use bunch compression -- as in chirped pulse amplification lasers -- within the undulator to significantly enhance the radiation brightness.

"The new method presented would be of wide interest to a diverse community developing and using light sources."

In FELs, as in all lasers, the intensity of light is amplified by a feedback mechanism that locks the phases of individual radiators, which in this case are "free" electrons. In the FEL, this is achieved by passing a high energy electron beam through the undulator, which is an array of alternating polarity magnets.

Light emitted from the electrons as they wiggle through the undulator creates what is known as the ponderomotive force, which bunches the electrons -- some are slowed down, some are sped up -- similar to traffic on a motorway periodically slowing and speeding up.

Electrons passing through the undulator radiate incoherent light if they are uniformly distributed -- for every electron that emits light, there is another electron that partially cancels out the light because they radiate out of phase. An analogy of this partial cancelling out is rain on the sea: it produces many small ripples that partially cancel each other out, effectively quelling the waves -- reducing their amplitude. In contrast, steady or pulsating wind will cause the waves to amplify through the mutual interaction of the wind with the sea.

In the FEL, electron bunching causes amplification of the light and the increase in its coherence, which usually takes a long time -- thus very long undulators are required. In an X-ray FEL, the undulators can be more than a hundred metres long. The accelerators driving these X-ray FELs are kilometres long, which makes these devices very expensive and some of the largest instruments in the world.

However, a free-electron laser is not the only way to produce coherent radiation; a "pre-bunched" beam, or an ultra-short electron bunch, can achieve exactly the same coherence in a very short undulator that is less than a metre in length. As long as the electron bunch is shorter than the wavelength of the light produced by the undulator, it will automatically produce coherent light -- all the light waves add up, or interfere constructively, which leads to very brilliant light with exactly the same properties as light from a laser.
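
The arithmetic behind that constructive interference is easy to check numerically: summing N unit-amplitude waves with nearly identical phases gives an intensity close to N squared, while random phases give roughly N. A minimal sketch (not from the paper, with invented numbers):

    import numpy as np

    rng = np.random.default_rng(1)
    N = 1000  # radiating electrons, each emitting a unit-amplitude wave

    # Bunch much shorter than the wavelength: all phases nearly equal.
    bunched = rng.uniform(0.0, 0.1 * 2 * np.pi, N)
    # Bunch spread over many wavelengths: phases effectively random.
    spread = rng.uniform(0.0, 2 * np.pi, N)

    for name, phases in [("sub-wavelength bunch", bunched),
                         ("spread-out beam", spread)]:
        intensity = abs(np.exp(1j * phases).sum()) ** 2
        print(f"{name:20s}: intensity ~ {intensity:9.0f} (N = {N}, N^2 = {N*N})")

The first case comes out near N squared (coherent emission), the second near N (incoherent), which is why a sub-wavelength bunch is so much brighter.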

Read more at Science Daily

Think about this: Keeping your brain active may delay Alzheimer's dementia 5 years

Keeping your brain active in old age has always been a smart idea, but a new study suggests that reading, writing letters and playing card games or puzzles in later life may delay the onset of Alzheimer's dementia by up to five years. The research is published in the July 14, 2021, online issue of Neurology, the medical journal of the American Academy of Neurology.

"The good news is that it's never too late to start doing the kinds of inexpensive, accessible activities we looked at in our study," said study author Robert S. Wilson, PhD, of Rush University Medical Center in Chicago. "Our findings suggest it may be beneficial to start doing these things, even in your 80s, to delay the onset of Alzheimer's dementia."

The study looked at 1,978 people with an average age of 80 who did not have dementia at the start of the study. The people were followed for an average of seven years. To determine if they had developed dementia, participants were given annual examinations, which included a number of cognitive tests.

When the study began, people rated their participation in seven activities on a five-point scale. The questions included: "During the past year, how often did you read books?" and "During the past year, how often did you play games like checkers, board games, cards or puzzles?"

Participants also answered questions about cognitive activity in childhood, adulthood and middle age.

Researchers then averaged each person's responses, with a score of one meaning once a year or less and a score of five meaning every day or almost every day. People in the group with high cognitive activity scored an average of 4.0, which corresponded to doing activities several times per week, compared to an average score of 2.1 for those with low cognitive activity, which corresponded to several times per year.

During the study follow-up period, 457 people with an average age of 89 were diagnosed with Alzheimer's dementia. People with the highest levels of activity, on average, developed dementia at age 94. The people with the lowest cognitive activity, on average, developed dementia at age 89, a difference of five years. The results were similar when researchers adjusted for other factors that could affect dementia risk, such as education level and sex.

To test the idea that low cognitive activity may be an early sign of dementia, not the other way around, researchers also looked at the brains of 695 people who died during the study. Brain tissue was examined for markers of Alzheimer's like amyloid and tau protein deposits, but researchers found no association between how cognitively active participants had been and the markers of Alzheimer's disease and related disorders in their brains.

"Our study shows that people who engage in more cognitively stimulating activities may be delaying the age at which they develop dementia," Wilson said. "It is important to note, after we accounted for late life level of cognitive activity, neither education nor early life cognitive activity were associated with the age at which a person developed Alzheimer's dementia. Our research suggests that the link between cognitive activity and the age at which a person developed dementia is mainly driven by the activities you do later in life."

Read more at Science Daily

High daily screen time linked to cognitive, behavioral problems in children born extremely preterm

Among 6- and 7-year-olds who were born extremely preterm -- before the 28th week of pregnancy -- those who had more than two hours of screen time a day were more likely to have deficits in overall IQ, executive functioning (problem solving skills), impulse control and attention, according to a study funded by the National Institutes of Health. Similarly, those who had a television or computer in their bedrooms were more likely to have problems with impulse control and paying attention. The findings suggest that high amounts of screen time may exacerbate the cognitive deficits and behavioral problems common to children born extremely preterm.

The study was conducted by Betty R. Vohr, M.D., and colleagues. It appears in JAMA Pediatrics. Funding was provided by NIH's Eunice Kennedy Shriver National Institute of Child Health and Human Development; National Heart, Lung, and Blood Institute; and National Center for Advancing Translational Sciences.

Previous studies have linked high amounts of screen time among children born full-term to language and developmental, behavioral and other problems. In the current study, researchers analyzed data from a study of children born at 28 weeks or earlier. Of 414 children, 238 had more than two hours of screen time per day and 266 had a television or computer in their bedrooms. Compared to children with less screen time per day, those with high amounts of screen time scored an average deficit of nearly 8 points on global executive function percentile scores, roughly 0.8 points lower on impulse control (inhibition) and more than 3 points higher on inattention. Children with a television or computer in their bedrooms also scored lower on measures of inhibition, hyperactivity and impulsivity.

The authors concluded that the findings support the need for physicians to discuss the potential effects of screen time with families of children born extremely preterm.

Read more at Science Daily

How micro-circuits in the brain regulate fear

Fear is an important reaction that warns and protects us from danger. But when fear responses are out of control, this can lead to persistent fears and anxiety disorders. In Europe, about 15 percent of the population is affected by anxiety disorders. Existing therapies remain largely unspecific or are not generally effective, because the detailed neurobiological understanding of these disorders is lacking.

What was known so far is that distinct nerve cells interact to regulate fear responses by promoting or suppressing them. Different circuits of nerve cells are involved in this process. A kind of "tug-of-war" takes place, with one brain circuit "winning" and overriding the other, depending on the context. If this system is disturbed, for example if fear reactions are no longer suppressed, this can lead to anxiety disorders.

Recent studies have shown that certain groups of neurons in the amygdala are crucial for the regulation of fear responses. The amygdala is a small, almond-shaped structure deep in the brain that receives information about fearful stimuli and transmits it to other brain regions to generate fear responses. This causes the body to release stress hormones, change heart rate or trigger fight, flight or freezing responses.

Now, a group led by Professors Stephane Ciocchi of the University of Bern and Andreas Luthi of the Friedrich Miescher Institute in Basel has discovered that the amygdala plays a much more active role in these processes than previously thought: Not only is the central amygdala a "hub" to generate fear responses, but it contains neuronal microcircuits that regulate the suppression of fear responses. In animal models, it has been shown that inhibition of these microcircuits leads to long-lasting fear behaviour. However, when they are activated, behaviour returns to normal despite previous fear responses. This shows that neurons in the central amygdala are highly adaptive and essential for suppressing fear. These results were published in the journal Nature Communications.

"Disturbed" suppression leads to long-lasting fear

The researchers led by Stephane Ciocchi and Andreas Luthi studied the activity of neurons of the central amygdala in mice during the suppression of fear responses. They were able to identify different cell types that influence the animals' behaviour. For their study, the researchers used several methods, including a technique called optogenetics with which they could precisely shut down -- with pulses of light -- the activity of an identified neuronal population within the central amygdala that produces a specific enzyme. This impaired the suppression of fear responses, whereupon animals became excessively fearful. "We were surprised how strongly our targeted intervention in specific cell types of the central amygdala affected fear responses," says Ciocchi, Assistant Professor at the Institute of Physiology, University of Bern. "The optogenetic silencing of these specific neurons completely abolished the suppression of fear and provoked a state of pathological fear."

Important for developing more effective therapies

In humans, dysfunction of this system, including deficient plasticity in the nerve cells of the central amygdala described here, could contribute to the impaired suppression of fear memories reported in patients with anxiety and trauma-related disorders. A better understanding of these processes will help develop more specific therapies for these disorders. "However, further studies are necessary to investigate whether discoveries obtained in simple animal models can be extrapolated to human anxiety disorders," Ciocchi adds.

This study was carried out in partnership with the University of Bern, the Friedrich Miescher Institute and international collaborators. It was funded by the University of Bern, the Swiss National Science Foundation and the European Research Council (ERC).

Read more at Science Daily

Jul 15, 2021

DNA from 1,600-year-old Iranian sheep mummy brings history to life

A team of geneticists and archaeologists from Ireland, France, Iran, Germany, and Austria has sequenced the DNA from a 1,600-year-old sheep mummy from an ancient Iranian salt mine, Chehrabad. This remarkable specimen has revealed sheep husbandry practices of the ancient Near East, as well as underlining how natural mummification can affect DNA degradation.

The incredible findings have just been published in the international, peer-reviewed journal Biology Letters.

The salt mine of Chehrabad is known to preserve biological material. Indeed, it is in this mine that human remains of the famed "Salt Men" were recovered, desiccated by the salt-rich environment. The new research confirms that this natural mummification process -- where water is removed from a corpse, preserving soft tissues that would otherwise be degraded -- also conserved animal remains.

The research team, led by geneticists from Trinity College Dublin, exploited this by extracting DNA from a small cutting of mummified skin from a leg recovered in the mine.

While ancient DNA is usually damaged and fragmented, the team found that the sheep mummy DNA was extremely well preserved, with longer fragment lengths and less damage than would usually be associated with such an ancient age. The group attributes this to the mummification process, with the salt mine providing conditions ideal for preservation of animal tissues and DNA.

The salt mine's influence was also seen in the microorganisms present in the sheep leg skin. Salt-loving archaea and bacteria dominated the microbial profile -- also known as the metagenome -- and may have also contributed to the preservation of the tissue.

The mummified animal was genetically similar to modern sheep breeds from the region, which suggests that there has been a continuity of ancestry of sheep in Iran since at least 1,600 years ago.

The team also exploited the sheep's DNA preservation to investigate genes associated with a woolly fleece and a fat-tail -- two important economic traits in sheep. Some wild sheep -- the Asiatic mouflon -- are characterised by a "hairy" coat, quite different from the woolly coats seen in many domestic sheep today. Fat-tailed sheep are also common in Asia and Africa, where they are valued in cooking, and where they may be well-adapted to arid climates.

The team built a genetic impression of the sheep and discovered that the mummy lacked the gene variant associated with a woolly coat, while fibre analysis using Scanning Electron Microscopy found the microscopic details of the hair fibres consistent with hairy or mixed coat breeds. Intriguingly, the mummy carried genetic variants associated with fat-tailed breeds, suggesting the sheep was similar to the hairy-coated, fat-tailed sheep seen in Iran today.

"Mummified remains are quite rare so little empirical evidence was known about the survival of ancient DNA in these tissues prior to this study," says Conor Rossi, PhD candidate in Trinity's School of Genetics and Microbiology, and the lead author of the paper.

"The astounding integrity of the DNA was not like anything we had encountered from ancient bones and teeth before. This DNA preservation, coupled with the unique metagenomic profile, is an indication of how fundamental the environment is to tissue and DNA decay dynamics.

Dr Kevin G Daly, also from Trinity's School of Genetics and Microbiology, supervised the study. He added:

"Using a combination of genetic and microscopic approaches, our team managed to create a genetic picture of what sheep breeds in Iran 1,600 years ago may have looked like and how they may have been used.

Read more at Science Daily

How climate change and fires are shaping the forests of the future

Forest fires are already a global threat. "But considering how climate change is progressing, we are probably only at the beginning of a future that will see more and bigger forest fires," explains Rupert Seidl, Professor of Ecosystem Dynamics and Forest Management in Mountain Landscapes at TUM.

In many places, fire is part of the natural environment, and many tree species have become naturally adapted to recurrent fires. These adaptations range from particularly thick bark, which protects the sensitive cambium in the trunk from the fire, to the cones of certain types of pine, which open only due to the heat of fire, allowing a quick regeneration and recovery of affected woodland.

AI is accelerating ecosystem models

"The interaction between climate, forest fires, and other processes in the forest ecosystem is very complex, and sophisticated process-based simulation models are required to take account of the different interactions appropriately," explains Prof. Seidl. A method that has been developed at TUM is using artificial intelligence to significantly expand the field of use of these complex models.

This method involves the training of a deep neural network in order to imitate the behavior of a complex simulation model as effectively as possible. The neural network learns on the basis of how the ecosystem responds to differing environmental influences, but does so using only a fraction of the computing power that would otherwise be necessary for large-scale simulation models. "This allows us to carry out spatially high-resolution simulations of areas of forest that stretch across several million hectares," explains scientist Dr. Werner Rammer.
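
The surrogate idea can be sketched in a few lines: run the expensive model on a modest sample, fit a small network to its outputs, then query the network across a far larger landscape. The example below is schematic, with an invented stand-in "process model" rather than the TUM ecosystem simulator:

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(3)

    def process_model(climate, fuel):
        """Stand-in for the costly simulator: burned fraction of a forest cell."""
        return 1.0 / (1.0 + np.exp(-(2.0 * climate + 1.5 * fuel - 2.5)))

    # Run the expensive model on a modest training sample...
    X_train = rng.uniform(0.0, 1.0, size=(2000, 2))
    y_train = process_model(X_train[:, 0], X_train[:, 1])

    # ...train the cheap surrogate network to imitate it...
    surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000,
                             random_state=0).fit(X_train, y_train)

    # ...then query it across a landscape far too large for the full model.
    X_landscape = rng.uniform(0.0, 1.0, size=(100_000, 2))
    burned = surrogate.predict(X_landscape)
    rmse = np.sqrt(np.mean((surrogate.predict(X_train) - y_train) ** 2))
    print(f"training RMSE: {rmse:.4f}, cells evaluated: {burned.size}")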

Forecast for the forests in Yellowstone National Park

The simulations completed by the team of scientists include simulations for the "Greater Yellowstone Ecosystem," which has the world-famous Yellowstone National Park at its heart. This area, which is approximately 8 million hectares in size, is situated in the Rocky Mountains and is largely untouched. Researchers at TUM have worked with American colleagues to determine how different climate scenarios could affect the frequency of forest fires in this region in the 21st century, and which areas of forest cannot regenerate successfully following a forest fire.

Depending on the climate change scenario, the study has found that by the end of the century, the current forest coverage will have disappeared in 28 to 59 percent of the region. Particularly affected were the forests in the sub-alpine zone near the tree line, where the species of tree are naturally less adapted to fire, and the areas on the Yellowstone Plateau, where the relatively flat topography is mostly unable to stop the fire from spreading.

Climate change is causing significant changes to forest ecosystems

The regeneration of the forest in the region under investigation is at threat for several reasons: If the fires get bigger and the distances between the surviving trees also increase, too few seeds will make their way onto the ground. If the climate gets hotter and drier in the future, the vulnerable young trees won't survive, and if there are too many fires, the trees won't reach the age at which they themselves yield seeds.

"By 2100, the Greater Yellowstone Ecosystem is expected to have changed more than it has in the last 10,000 years, and will therefore look significantly different than it does today," explains Rammer. "The loss of today's forest vegetation is leading to a reduction in the carbon which is stored in the ecosystem, and will also have a profound impact on the biodiversity and recreational value of this iconic landscape."

Read more at Science Daily

Pandemic of antibiotic resistance is killing children in Bangladesh, researchers find

Resistance to antibiotics is common and often deadly among children with pneumonia in Bangladesh, according to a new study coauthored by researchers from Massachusetts General Hospital (MGH) with colleagues at the International Centre for Diarrhoeal Disease Research, Bangladesh (abbreviated as icddr,b). This study, which appears in the journal Open Forum Infectious Diseases, offers an early warning that a pandemic of potentially deadly antibiotic resistance is under way and could spread around the globe.

The study was led by Mohammod Jobayer Chisti, MD, PhD, a senior scientist in icddr,b's Nutrition and Clinical Services Division. Chisti was inspired to conduct the research when he observed that the hospital affiliated with icddr,b was admitting more and more young children with pneumonia who were highly resistant to treatment with standard antibiotics. "At our hospital, dozens of kids died of pneumonia between 2014 and 2017, despite receiving the World Health Organization's recommended antibiotics and enhanced respiratory support," says Chisti.

Pneumonia is an infection of the lungs that causes fluid and pus to fill air sacs, producing cough, fever, trouble breathing, and other symptoms. Without effective treatment, the infection can be fatal; pneumonia is the most common cause of death in young children, according to the World Health Organization. In small children, pneumonia can be caused by viruses, but certain types of bacteria are common sources of infection, too. In the United States and other high-income countries, Staphylococcus ("staph"), Streptococcus ("strep"), and Haemophilus influenzae are the most common bacterial causes of pneumonia, which usually respond well to antibiotic therapy. Vaccines for the latter two have saved countless lives worldwide.

However, when Chisti and his colleagues examined health records of more than 4,000 children under age five with pneumonia admitted to their hospital between 2014 and 2017, they found that a very different pattern of bacterial infections was occurring. The usual staph and strep infections that commonly cause pneumonia in the United States and elsewhere were relatively rare. Among the children who had a positive culture, gram-negative bacteria were responsible for 77 percent of the infections, including Pseudomonas, E. coli, Salmonella and Klebsiella.

"That's totally different than what I'm used to in my practice in Boston," says Jason Harris, MD, MPH, co-first author of the study and chief of the division of Pediatric Global Health at the Massachusetts General Hospital for Children. Unfortunately, he adds, "the gram-negative bacteria we saw in these kids are notorious for being antibiotic resistant." To wit: Some 40 percent of the gram-negative bacterial infections in this study resisted treatment with first- and second-line antibiotics that are routinely used to treat pneumonia. More alarming, children who had antibiotic-resistant bacterial infections were 17 times more likely than others without bacterial infections to die.

Harris believes that these results are clear evidence that longstanding concerns that antibiotic resistance will become a deadly menace are no longer theoretical -- the problem has taken root. "These kids are already dying early because of antibiotic-resistant bacteria, from what would be a routine infection in other parts of the world," says Harris. "And this was at one hospital in Bangladesh. Extrapolate these findings across a country of 163 million people, and then to a larger region where antibiotic resistance is emerging, and the overall numbers are probably huge."

There is an urgent need to address factors that are promoting antibiotic resistance in Bangladesh, says Tahmeed Ahmed, PhD, executive director of icddr,b and senior author of the study. For starters, antibiotics can be purchased without a prescription in the country and many people use them to self-treat conditions such as dysentery, cold, cough and fever. Misuse of antibiotics promotes the spread of bacteria that resist the medications. "We may be able to reduce this emerging bacterial resistance by improving antibiotic stewardship, particularly in the outpatient setting," says Ahmed. Lab testing for diagnosis of bacterial infections is also inadequate in the country. "What's more, lack of access to clean water and adequate sanitation helps spread bacteria that are resistant to antibiotics," adds Ahmed. Improvements in health care infrastructure and policy changes to rein in the misuse of antibiotics are essential, he argues, though Ahmed notes that Bangladesh's health care system also needs better access to more advanced antibiotic therapies for resistant infections.

If these and other steps aren't taken now, it's only a matter of time before the problem of widespread deadly antibiotic resistance spreads around the world, notes Harris. "We know that acquisition of antibiotic resistance is very common in travelers, and that when highly resistant bacteria crop up in one part of the world, they ultimately crop up everywhere," he says, comparing the problem to another current global health care crisis. "If COVID-19 was a tsunami, then emerging antibiotic resistance is like a rising flood water. And it's kids in Bangladesh who are already going under."

Read more at Science Daily

'Neuroprosthesis' restores words to man with paralysis

Researchers at UC San Francisco have successfully developed a "speech neuroprosthesis" that has enabled a man with severe paralysis to communicate in sentences, translating signals from his brain to the vocal tract directly into words that appear as text on a screen.

The achievement, which was developed in collaboration with the first participant of a clinical research trial, builds on more than a decade of effort by UCSF neurosurgeon Edward Chang, MD, to develop a technology that allows people with paralysis to communicate even if they are unable to speak on their own. The study appears July 15 in the New England Journal of Medicine.

"To our knowledge, this is the first successful demonstration of direct decoding of full words from the brain activity of someone who is paralyzed and cannot speak," said Chang, the Joan and Sanford Weill Chair of Neurological Surgery at UCSF, Jeanne Robertson Distinguished Professor, and senior author on the study. "It shows strong promise to restore communication by tapping into the brain's natural speech machinery."

Each year, thousands of people lose the ability to speak due to stroke, accident, or disease. With further development, the approach described in this study could one day enable these people to fully communicate.

Translating Brain Signals into Speech

Previous work in the field of communication neuroprosthetics has focused on restoring communication through spelling-based approaches, typing out letters one by one in text. Chang's study differs from these efforts in a critical way: his team is translating signals intended to control muscles of the vocal system for speaking words, rather than signals to move the arm or hand to enable typing. Chang said this approach taps into the natural and fluid aspects of speech and promises more rapid and organic communication.

"With speech, we normally communicate information at a very high rate, up to 150 or 200 words per minute," he said, noting that spelling-based approaches using typing, writing, and controlling a cursor are considerably slower and more laborious. "Going straight to words, as we're doing here, has great advantages because it's closer to how we normally speak."

Over the past decade, Chang's progress toward this goal was facilitated by patients at the UCSF Epilepsy Center who were undergoing neurosurgery to pinpoint the origins of their seizures using electrode arrays placed on the surface of their brains. These patients, all of whom had normal speech, volunteered to have their brain recordings analyzed for speech-related activity. Early success with these patient volunteers paved the way for the current trial in people with paralysis.

Previously, Chang and colleagues in the UCSF Weill Institute for Neurosciences mapped the cortical activity patterns associated with vocal tract movements that produce each consonant and vowel. To translate those findings into speech recognition of full words, David Moses, PhD, a postdoctoral engineer in the Chang lab and one of the lead authors of the new study, developed new methods for real-time decoding of those patterns and statistical language models to improve accuracy.

But their success in decoding speech in participants who were able to speak didn't guarantee that the technology would work in a person whose vocal tract is paralyzed. "Our models needed to learn the mapping between complex brain activity patterns and intended speech," said Moses. "That poses a major challenge when the participant can't speak."

In addition, the team didn't know whether brain signals controlling the vocal tract would still be intact for people who haven't been able to move their vocal muscles for many years. "The best way to find out whether this could work was to try it," said Moses.

The First 50 Words

To investigate the potential of this technology in patients with paralysis, Chang partnered with colleague Karunesh Ganguly, MD, PhD, an associate professor of neurology, to launch a study known as "BRAVO" (Brain-Computer Interface Restoration of Arm and Voice). The first participant in the trial is a man in his late 30s who suffered a devastating brainstem stroke more than 15 years ago that severely damaged the connection between his brain and his vocal tract and limbs. Since his injury, he has had extremely limited head, neck, and limb movements, and communicates by using a pointer attached to a baseball cap to poke letters on a screen.

The participant, who asked to be referred to as BRAVO1, worked with the researchers to create a 50-word vocabulary that Chang's team could recognize from brain activity using advanced computer algorithms. The vocabulary -- which includes words such as "water," "family," and "good" -- was sufficient to create hundreds of sentences expressing concepts applicable to BRAVO1's daily life.

For the study, Chang surgically implanted a high-density electrode array over BRAVO1's speech motor cortex. After the participant's full recovery, his team recorded 22 hours of neural activity in this brain region over 48 sessions and several months. In each session, BRAVO1 attempted to say each of the 50 vocabulary words many times while the electrodes recorded brain signals from his speech cortex.

Translating Attempted Speech into Text

To translate the patterns of recorded neural activity into specific intended words, the other two lead authors of the study, Sean Metzger, MS, and Jessie Liu, BS, both bioengineering doctoral students in the Chang Lab, used custom neural network models, which are forms of artificial intelligence. When the participant attempted to speak, these networks distinguished subtle patterns in brain activity to detect speech attempts and identify which words he was trying to say.

To test their approach, the team first presented BRAVO1 with short sentences constructed from the 50 vocabulary words and asked him to try saying them several times. As he made his attempts, the words were decoded from his brain activity, one by one, on a screen.

Then the team switched to prompting him with questions such as "How are you today?" and "Would you like some water?" As before, BRAVO1's attempted speech appeared on the screen: "I am very good," and "No, I am not thirsty."

The team found that the system was able to decode words from brain activity at a rate of up to 18 words per minute with up to 93 percent accuracy (75 percent median). Contributing to the success was a language model Moses applied that implemented an "auto-correct" function, similar to what is used by consumer texting and speech recognition software.
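
The interplay between a per-word classifier and a language-model "auto-correct" can be illustrated with a toy decoder. Everything below is hypothetical -- an eight-word vocabulary, made-up probabilities, and a simple bigram Viterbi search -- and is not UCSF's actual model:

    import numpy as np

    VOCAB = ["i", "am", "very", "good", "not", "thirsty", "water", "family"]

    def lm_rescore(word_probs, bigram, lam=0.5):
        """Viterbi decoding: classifier evidence plus a bigram language prior."""
        T, V = word_probs.shape
        logp = np.log(word_probs + 1e-12)
        logb = np.log(bigram + 1e-12)
        best = logp[0].copy()               # best score ending in each word
        back = np.zeros((T, V), dtype=int)  # backpointers
        for t in range(1, T):
            scores = best[:, None] + lam * logb + logp[t][None, :]
            back[t] = scores.argmax(axis=0)
            best = scores.max(axis=0)
        path = [int(best.argmax())]
        for t in range(T - 1, 0, -1):
            path.append(back[t][path[-1]])
        return [VOCAB[i] for i in reversed(path)]

    # Toy demo: at step 3 the classifier slightly prefers "water" over "very",
    # but the language prior recovers the intended sentence.
    V = len(VOCAB)
    bigram = np.full((V, V), 0.01)
    for a, b, p in [("i", "am", 0.9), ("am", "very", 0.6), ("very", "good", 0.9)]:
        bigram[VOCAB.index(a), VOCAB.index(b)] = p
    probs = np.full((4, V), 0.01)
    for t, guesses in enumerate([{"i": 0.9}, {"am": 0.9},
                                 {"water": 0.5, "very": 0.4}, {"good": 0.9}]):
        for w, p in guesses.items():
            probs[t, VOCAB.index(w)] = p
    print(lm_rescore(probs, bigram))  # -> ['i', 'am', 'very', 'good']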

Moses characterized the early trial results as a proof of principle. "We were thrilled to see the accurate decoding of a variety of meaningful sentences," he said. "We've shown that it is actually possible to facilitate communication in this way and that it has potential for use in conversational settings."

Looking forward, Chang and Moses said they will expand the trial to include more participants affected by severe paralysis and communication deficits. The team is currently working to increase the number of words in the available vocabulary, as well as improve the rate of speech.

Both said that while the study focused on a single participant and a limited vocabulary, those limitations don't diminish the accomplishment. "This is an important technological milestone for a person who cannot communicate naturally," said Moses, "and it demonstrates the potential for this approach to give a voice to people with severe paralysis and speech loss."

Read more at Science Daily

Jul 14, 2021

Galactic gamma ray bursts predicted last year show up right on schedule

Magnetars are bizarre objects -- massive, spinning neutron stars with magnetic fields among the most powerful known, capable of shooting off brief bursts of radio waves so bright they're visible across the universe.

A team of astrophysicists has now found another peculiarity of magnetars: They can emit bursts of low energy gamma rays in a pattern never before seen in any other astronomical object.

It's unclear why this should be, but magnetars themselves are poorly understood, with dozens of theories about how they produce radio and gamma ray bursts. The recognition of this unusual pattern of gamma ray activity could help theorists figure out the mechanisms involved.

"Magnetars, which are connected with fast radio bursts and soft gamma repeaters, have something periodic going on, on top of randomness," said astrophysicist Bruce Grossan, an astrophysicist at the University of California, Berkeley's Space Sciences Laboratory (SSL). "This is another mystery on top of the mystery of how the bursts are produced."

The researchers -- Grossan and theoretical physicist and cosmologist Eric Linder from UC Berkeley and postdoctoral fellow Mikhail Denissenya from Nazarbayev University in Kazakhstan -- discovered the pattern in bursts from the soft gamma repeater SGR1935+2154, a magnetar that is a prolific source of soft or lower energy gamma ray bursts and the only known source of fast radio bursts within our Milky Way galaxy. They found that the object emits bursts randomly, but only within regular four-month windows of time, each active window separated by three months of inactivity.

On March 19, the team uploaded a preprint claiming "periodic windowed behavior" in soft gamma bursts from SGR1935+2154 and predicted that these bursts would start up again after June 1 -- following a three-month hiatus -- and could occur throughout a four-month window ending Oct. 7.

On June 24, three weeks into the window of activity, the first new burst from SGR1935+2154 was observed after the predicted three-month gap, and nearly a dozen more bursts have been observed since, including one on July 6, the day the paper was published online in the journal Physical Review D.

"These new bursts within this window means that our prediction is dead on," said Grossan, who studies high energy astronomical transients. "Probably more important is that no bursts were detected between the windows since we first published our preprint."

Linder likens the non-detection of bursts in three-month windows to a key clue -- the "curious incident" that a guard dog did not bark in the nighttime -- that allowed Sherlock Holmes to solve a murder in the short story "The Adventure of Silver Blaze."

"Missing or occasional data is a nightmare for any scientist," noted Denissenya, the first author of the paper and a member of the Energetic Cosmos Laboratory at Nazarbayev University that was founded several years ago by Grossan, Linder and UC Berkeley cosmologist and Nobel laureate George Smoot. "In our case, it was crucial to realize that missing bursts or no bursts at all carry information."

The confirmation of their prediction startled and thrilled the researchers, who think this may be a novel example of a phenomenon -- periodic windowed behavior -- that could characterize emissions from other astronomical objects.

Mining data from 27-year-old satellite

Within the last year, researchers suggested that the emission of fast radio bursts -- which typically last a few thousandths of a second -- from distant galaxies might be clustered in a periodic windowed pattern. But the data were intermittent, and the statistical and computational tools to firmly establish such a claim with sparse data were not well developed.

Grossan convinced Linder to explore whether advanced techniques and tools could be used to demonstrate that behavior that is periodically windowed, yet random within each activity window, was present in the soft gamma ray burst data of the SGR1935+2154 magnetar. The Konus instrument aboard the WIND spacecraft, launched in 1994, has recorded soft gamma ray bursts from that object -- which also exhibits fast radio bursts -- since 2014 and likely never missed a bright one.

Linder, a member of the Supernova Cosmology Project based at Lawrence Berkeley National Laboratory, had used advanced statistical techniques to study the clustering in space of galaxies in the universe, and he and Denissenya adapted these techniques to analyze the clustering of bursts in time. Their analysis, the first to use such techniques for repeated events, showed an unusual windowed periodicity distinct from the very precise repetition produced by bodies rotating or in orbit, which most astronomers think of when they think of periodic behavior.

"So far, we have observed bursts over 10 windowed periods since 2014, and the probability is 3 in 10,000 that while we think it is periodic windowed, it is actually random," he said, meaning there's a 99.97% chance they're right. He noted that a Monte Carlo simulation indicated that the chance they're seeing a pattern that isn't really there is likely well under 1 in a billion.

The recent observation of five bursts within their predicted window, seen by WIND and other spacecraft monitoring gamma ray bursts, adds to their confidence. However, a single future burst observed outside the window would disprove the whole theory, or cause them to redo their analysis completely.

"The most intriguing and fun part for me was to make predictions that could be tested in the sky. We then ran simulations against real and random patterns and found it really did tell us about the bursts," Denissenya said.

As for what causes this pattern, Grossan and Linder can only guess. Soft gamma ray bursts from magnetars are thought to involve starquakes, perhaps triggered by interactions between the neutron star's crust and its intense magnetic field. Magnetars rotate once every few seconds, and if the rotation is accompanied by a precession -- a wobble in the rotation -- that might make the source of burst emission point to Earth only within a certain window. Another possibility, Grossan said, is that a dense, rotating cloud of obscuring material surrounds the magnetar but has a hole that only periodically allows bursts to come out and reach Earth.

"At this stage of our knowledge of these sources, we can't really say which it is," Grossan said. "This is a rich phenomenon that will likely be studied for some time."

Linder agrees and points out that the advances were made by the cross-pollination of techniques from high energy astrophysics observations and theoretical cosmology.

Read more at Science Daily

Eating whole grains linked to smaller increases in waist size, blood pressure, blood sugar

Middle- to older-aged adults who ate at least three servings of whole grains daily had smaller increases in waist size, blood pressure, and blood sugar levels over time compared to those who ate less than one-half serving per day, according to new research.

Published July 13, 2021, in the Journal of Nutrition, the study by researchers at the Jean Mayer USDA Human Nutrition Research Center on Aging at Tufts University examined how whole- and refined-grain intake over time impacted five risk factors of heart disease: waist size, blood pressure, blood sugar, triglycerides, and HDL ("good") cholesterol.

Using data from the Framingham Heart Study Offspring Cohort, which began in the 1970s to assess long-term risk factors of heart disease, the new research examined health outcomes associated with whole- and refined-grain consumption over a median of 18 years. The 3,100 participants from the cohort were mostly white and, on average, in their mid-50s at the start of data collection.

The research team compared changes in the five risk factors, over four-year intervals, across four categories of reported whole grain intake, ranging from less than a half serving per day to three or more servings per day. According to the Dietary Guidelines for Americans 2020-2025, the recommended amount of whole grains is three or more servings daily. An example of a serving is one slice of whole-grain bread, a half cup of rolled oats cereal, or a half cup of brown rice.

The results showed that for each four-year interval:

  • Waist size increased by an average of over 1 inch in the low intake participants, versus about ½ inch in the high intake participants.
  • Even after accounting for changes in waist size, average increases in blood sugar levels and systolic blood pressure were greater in low intake participants compared to high intake participants.

The researchers also studied the five risk factors across four categories of refined-grain intake, ranging from less than two servings per day to more than four servings per day. Lower refined-grain intake was associated with a lower average increase in waist size and a greater mean decline in triglyceride levels for each four-year period.

"Our findings suggest that eating whole-grain foods as part of a healthy diet delivers health benefits beyond just helping us lose or maintain weight as we age. In fact, these data suggest that people who eat more whole grains are better able to maintain their blood sugar and blood pressure over time. Managing these risk factors as we age may help to protect against heart disease," said Nicola McKeown, senior and corresponding author and a scientist on the Nutritional Epidemiology Team at the USDA HNRCA.

"There are several reasons that whole grains may work to help people maintain waist size and reduce increases in the other risk factors. The presence of dietary fiber in whole grains can have a satiating effect, and the magnesium, potassium, and antioxidants may contribute to lowering blood pressure. Soluble fiber in particular may have a beneficial effect on post-meal blood sugar spikes," said Caleigh Sawicki. Sawicki did this work as part of her doctoral dissertation while a student at the Gerald J. and Dorothy R. Friedman School of Nutrition Science and Policy at Tufts University and while working with the Nutritional Epidemiology Team at the USDA HNRCA.

The greatest contributors to whole-grain intake among participants were whole-wheat breads and ready-to-eat whole-grain breakfast cereals. The refined grains came mostly from pasta and white bread. The difference in health benefits between whole and refined grains may stem from the fact that whole grains are less processed than refined grains. Whole grains have a fiber-rich outer layer and an inner germ layer packed with B vitamins, antioxidants, and small amounts of healthy fats. Milling removes these nutrient-dense components, leaving only the starch-packed refined grain behind.

"The average American consumes about five servings of refined grains daily, much more than is recommended, so it's important to think about ways to replace refined grains with whole grains throughout your day. For example, you might consider a bowl of whole-grain cereal instead of a white flour bagel for breakfast and replacing refined-grain snacks, entrees, and side dishes with whole-grain options. Small incremental changes in your diet to increase whole-grain intake will make a difference over time," McKeown said.

Read more at Science Daily

New study links moderate alcohol use with higher cancer risk

A new study from the World Health Organization's (WHO) International Agency for Research on Cancer (IARC), published in the journal Lancet Oncology, has found an association between alcohol and a substantially higher risk of several forms of cancer, including breast, colon, and oral cancers. Increased risk was evident even among light to moderate drinkers (up to two drinks a day), who accounted for 1 in 7 of all alcohol-attributable cancers in 2020, more than 100,000 cases worldwide.

In Canada, alcohol use was linked to 7,000 new cases of cancer in 2020, including 24 per cent of breast cancer cases, 20 per cent of colon cancers, 15 per cent of rectal cancers, and 13 per cent of oral and liver cancers.

"All drinking involves risk," said study co-author Dr. Jürgen Rehm, Senior Scientist, Institute for Mental Health Policy Research and Campbell Family Mental Health Research Institute at CAMH. "And with alcohol-related cancers, all levels of consumption are associated with some risk. For example, each standard sized glass of wine per day is associated with a 6 per cent higher risk for developing female breast cancer."

"Alcohol consumption causes a substantial burden of cancer globally," said Dr. Isabelle Soerjomataram, Deputy Branch Head, Cancer Surveillance Branch at IARC. "Yet the impact on cancers is often unknown or overlooked, highlighting the need for implementation of effective policy and interventions to increase public awareness of the link between alcohol use and cancer risk, and decrease overall alcohol consumption to prevent the burden of alcohol-attributable cancers."

Dr. Leslie Buckley, CAMH Chief of Addictions, added: "In our clinic we are seeing many people who report increased alcohol use since the onset of the pandemic. Although this may be related to temporary stressors, there is a potential for new habits to become more permanent. The consequences of alcohol use are often subtle harms at first that take time to show themselves, while long-term consequences such as cancer, liver disease and substance use disorder can be devastating."

The modelling study was based on alcohol exposure data, from both surveys and sales figures, covering almost all countries of the world, combined with the latest relative-risk estimates for cancer at each level of consumption.
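
For readers curious about the mechanics, burden-of-disease studies of this kind typically rest on the population attributable fraction (PAF), which weights each exposure category's relative risk by its share of the population. The sketch below is illustrative only: the category shares and relative risks are invented, not the IARC team's actual inputs.

```python
# Illustrative population-attributable-fraction (PAF) calculation.
# The population shares (p) and relative risks (rr) are hypothetical.
exposure = [
    ("non-drinkers",      0.50, 1.00),  # reference category, RR = 1
    ("light drinkers",    0.25, 1.06),
    ("moderate drinkers", 0.15, 1.20),
    ("heavy drinkers",    0.10, 1.60),
]

# Average relative risk across the whole population
weighted_rr = sum(p * rr for _, p, rr in exposure)

# Fraction of cases that would not occur if everyone had RR = 1
paf = (weighted_rr - 1.0) / weighted_rr
print(f"Alcohol-attributable fraction: {paf:.1%}")  # ~9.5% here
```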

"Alcohol causes cancer in numerous ways," explained Dr. Kevin Shield, Independent Scientist, Institute for Mental Health Policy Research, and study co-author. "The main mechanism of how alcohol causes cancer is through impairing DNA repair. Additional pathways include chronic alcohol consumption resulting in liver cirrhosis, and alcohol leading to a dysregulation of sex hormones, leading to breast cancer. Alcohol also increases the risk of head and neck cancer for smokers as it increases the absorption of carcinogens from tobacco."

Dr. Rehm says research into the link between light to moderate drinking and cancer is relatively new and that public policy does not yet reflect the degree of cancer risk. He added, "As an epidemiologist, I would recommend higher taxes to fully reflect the burden of disease from alcohol. Along with limiting the physical availability and marketing of alcohol, price controls are recognized as high-impact, cost-effective measures to reduce alcohol-related harm." Governments can also consider requiring manufacturers to include information about health and safety risks associated with alcohol consumption, including cancer risk, on alcoholic beverage labels.

Read more at Science Daily

Revealing the mystery behind the threat of non-alcoholic liver disease

Researchers revealed how non-alcoholic fatty liver disease can develop into a life-threatening complication. Their discovery will accelerate the search for therapeutic solutions. The study was led by Helmholtz Zentrum München in collaboration with the Heidelberg University Hospital and the German Center for Diabetes Research.

Non-alcoholic fatty liver disease is the most common liver disorder worldwide and is present in approximately 25 percent of the world's population. Over 90 percent of obese, 60 percent of diabetic, and up to 20 percent of normal-weight people develop it. A key feature of the condition is the accumulation of fat in the liver. A liver can remain fatty without disturbing normal function; however, the fat accumulation may progress into so-called non-alcoholic steatohepatitis -- an aggressive form of non-alcoholic fatty liver disease combined with inflammation and sometimes fibrosis. Non-alcoholic steatohepatitis can lead to further complications such as liver cirrhosis, primary liver cancer and, eventually, death.

Liver fibrosis is a strong predictor of long-term mortality in patients with non-alcoholic fatty liver disease. The mechanisms underlying the progression from the comparatively benign fatty liver state to advanced non-alcoholic steatohepatitis and liver fibrosis are incompletely understood. "Understanding the mechanism by which this condition becomes life threatening is key in our quest for the discovery of therapeutic solutions and preventative measures," said Stephan Herzig.

Loss of identity results in dysfunction

The researchers used comparative genomics to analyze mechanisms that control the development and specialized functions of the most abundant cell type in the liver, the hepatocyte. "Our results demonstrated that during progression to non-alcoholic steatohepatitis, hepatocytes suffer from partial identity loss, they are re-programmed," explained Anne Loft, first co-author of the article.

The hepatocyte reprogramming is tightly controlled by a network of proteins acting as molecular switches, so-called 'transcription factors'. Their activity results in the dysfunction of hepatocytes. The network of transcription factors that controls this process also plays a role in fibrosis progression. "These findings are important because they unravel the cellular mechanisms underlying non-alcoholic steatohepatitis. Knowing about the role of the protein networks and the identity loss of hepatocytes gives us potential intervention targets for the development of effective therapies," says Ana Alfaro, first co-author of the article.

Future work

Based on these findings, it will now be possible to develop novel approaches that effectively target certain nodes in the protein network to prevent disease progression or even reverse existing fibrosis, something that is not possible to date.

Read more at Science Daily

Jul 13, 2021

Trace gas phosphine points to volcanic activity on Venus, scientists say

Scientists last autumn revealed that the gas phosphine was found in trace amounts in Venus' upper atmosphere. That discovery promised the slim possibility that phosphine serves as a biological signature for the hot, toxic planet.

Now Cornell scientists say the phosphine's chemical fingerprints support a different and important scientific find: evidence of explosive volcanoes on the mysterious planet.

"The phosphine is not telling us about the biology of Venus," said Jonathan Lunine, the David C. Duncan Professor in Physical Sciences and chair of the Department of Astronomy in the College of Arts and Sciences. "It's telling us about the geology. Science is pointing to a planet that has active explosive volcanism today or in the very recent past."

Lunine and Ngoc Truong, a doctoral candidate in geology, have authored the study, "Volcanically Extruded Phosphides as an Abiotic Source of Venusian Phosphine," published July 12 in the Proceedings of the National Academy of Sciences.

Truong and Lunine argue that volcanism is the means for phosphine to get into Venus' upper atmosphere, after examining observations from the ground-based, submillimeter-wavelength James Clerk Maxwell Telescope atop Mauna Kea in Hawaii, and the Atacama Large Millimeter/submillimeter Array (ALMA) in northern Chile.

"Volcanism could supply enough phosphide to produce phosphine," Truong said. "The chemistry implies that phosphine derives from explosive volcanoes on Venus, not biological sources."

Our planetary neighbor broils with an almost 900-degree Fahrenheit average surface temperature and features a carbon dioxide-filled atmosphere enveloped in sulfuric acid clouds, according to NASA.

If Venus has phosphides -- a form of phosphorus present in the planet's deep mantle -- and if they are brought to the surface explosively by volcanism and injected into the atmosphere, those phosphides react with the sulfuric acid in the Venusian atmosphere to form phosphine, Truong said.

He found published laboratory data confirming that phosphide reacts with sulfuric acid to produce phosphine efficiently.

Volcanism on Venus is not necessarily surprising, Lunine said. But "while our phosphine model suggests explosive volcanism occurring, radar images from the Magellan spacecraft in the 1990s show some geologic features could support this."

In 1978, on NASA's Pioneer Venus orbiter mission, scientists uncovered variations of sulfur dioxide in Venus' upper atmosphere, hinting at the prospect of explosive volcanism on a scale similar to Earth's Krakatoa volcanic eruption in Indonesia in 1883, Truong said.

Said Truong: "Confirming explosive volcanism on Venus through the gas phosphine was totally unexpected."

Read more at Science Daily

DNA reveals the evolutionary history of museum specimens

Museum specimens held in natural history collections around the world represent a wealth of underutilized genetic information due to the poor state of preservation of the DNA, which often makes it difficult to sequence. An international team, led by researchers from the University of Geneva (UNIGE) and the Museum of Natural History of the City of Geneva (MHN), has optimized a method developed for analyzing ancient DNA to identify the relationships between species on a deep evolutionary scale. This work is published in the journal Genome Biology and Evolution.

By combining and comparing the sequences of a large number of genes or complete genomes, it is possible to establish the links between related species and to trace the main steps in the evolution of organisms from a common ancestor. These phylogenomic studies are based on the amplification and sequencing of DNA fragments, followed by bioinformatics analyses to compare the sequences. They therefore typically require carefully sampled DNA in a good state of preservation.
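
As a toy illustration of that comparison step (not the pipeline used in this study), one can count the per-site differences between aligned sequences to build a simple distance matrix; the species names and sequences below are made up.

```python
# Toy phylogenomic comparison: pairwise p-distances between aligned
# DNA sequences (hypothetical data, for illustration only).
from itertools import combinations

alignment = {
    "species_A": "ACGTACGTACGT",
    "species_B": "ACGTACGAACGT",
    "species_C": "ACGAACGAACTT",
}

def p_distance(a: str, b: str) -> float:
    """Fraction of aligned sites at which two sequences differ."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

for name1, name2 in combinations(alignment, 2):
    d = p_distance(alignment[name1], alignment[name2])
    print(f"{name1} vs {name2}: {d:.2f}")
```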

Deciphering degraded DNA

For this reason, most of the specimens preserved in natural history museums have not yet revealed all their secrets, since in most cases the DNA is highly degraded and difficult to sequence. An international team led by Emmanuel Toussaint, researcher at the MHN, and Nadir Alvarez, researcher at the Department of Genetics and Evolution of the Faculty of Science of the UNIGE and chief curator at the MHN, has perfected a method already used for well-preserved samples in order to apply it to DNA that is highly fragmented due to partial degradation. The HyRAD-X technique consists of fishing out the pieces of the genome to be analyzed with DNA probes from closely related species, and then sequencing them to detect the differences between the genomes. However, these DNA probes are only effective hooks for closely related genomes, and the technique had so far only allowed researchers to follow the evolution of a single species over time.

In this work, the scientists used HyRAD-X RNA probes instead of DNA probes to find the fragments of interest in the genome. RNAs, copies of DNA molecules in charge of transferring the information encoded by the genome, have a very strong affinity for DNA, and RNA-DNA pairings occur more easily than DNA-DNA pairings. RNA probes are therefore more efficient hooks, especially when the genomes to be analyzed show high levels of divergence. "Thanks to this new method, we were able to trace the evolutionary history, not within a single species over a million years, but within several species and over tens of millions of years!" explains Emmanuel Toussaint, first author of the study.

A clearer genealogy for the carabid beetle

The researchers were particularly interested in specimens of an emblematic carabid from the island of Saint Helena, in the middle of the Atlantic Ocean, collected in the 1960s and preserved at the MHN in Geneva. Analysis of the DNA of these beetles revealed that this species, now extinct and until now classified in the genus Aplothorax, actually belongs to the genus Calosoma. It also allowed the team to locate its biogeographic origin, probably in Africa, and to establish the chronology of the evolution of the subfamily Carabinae, whose origin goes back to the Lower Cretaceous. "Our study opens many perspectives to establish the evolutionary history of millions of specimens in museum collections around the world," concludes Nadir Alvarez.

Read more at Science Daily

Restless nights: Shelter-housed dogs need days to adapt to new surroundings

Every year, thousands of dogs end up in a shelter in the Netherlands. Experts expect this number to increase in the upcoming period, as people go back to the office after working from home during the coronavirus crisis. Despite the good care of staff and volunteers, the shelter can be a turbulent experience for dogs. Researchers at Utrecht University investigated whether dogs adapt to their new environment, based on their nocturnal activity.

Janneke van der Laan and fellow researchers from Utrecht University's Faculty of Veterinary Medicine compared the nocturnal activity of 29 shelter dogs and 29 pet dogs in their own homes -- similar in breed, age and sex -- with the help of night cameras and a small activity tracker on their collar.

They found that shelter dogs rest much less at night than pet dogs, especially during the first two nights in the shelter. This restlessness did decrease over time, but even after twelve days in the shelter, the dogs still rested less at night than the pet dogs.

"We also saw this restlessness in hormone measurements in the urine of shelter dogs" says Janneke van der Laan. Shelter dogs had higher values of the stress hormone cortisol in their urine than pet dogs, especially during the first two days but also after twelve days. It was also striking that smaller shelter dogs, for instance Shi Tzu's and Chihuahua's, were more restless during the first two nights than larger shelter dogs, and they also had higher cortisol values.

The researchers found big differences between individual dogs: some were already quite calm during the first night in the shelter, while others barely slept for a few nights. "It seems that dogs need at least two days, but often longer, to get used to their new environment, in this case the shelter," Van der Laan explains. "Humans usually also sleep less well during the first night in a new environment, for example at the beginning of a vacation."

"With our follow-up research we will zoom in even further on the welfare of dogs in shelters. But our current findings already show that it is important to pay close attention to dogs that are unable to rest properly after several nights. The shelter staff may already be able to help these dogs by for example moving them to a less busy spot in the shelter."

From Science Daily

More complex than we thought: The body's reaction to contact allergens

Many people react to contact allergens, but some patients develop rashes and itching much faster than others. Previously, scientists were unable to explain why, but now researchers have outlined an entirely new subgroup of allergic reactions that explains these early skin reactions. The new knowledge is vital to understanding the disease mechanisms in contact allergy.

Hair dye, perfume, jewellery. Beautifying to most, but for some they are equivalent to rashes, irritation and reduced quality of life. Together with hay fever and food allergies, allergic contact dermatitis due to exposure to e.g. nickel and perfume ingredients represents the majority of allergic reactions seen among Danes.

Traditionally, researchers have distinguished between immediate and delayed allergic reactions, depending on which part of the immune system is responsible for the reaction. Hay fever and food allergies, for example, are 'immediate' forms that cause immediate symptoms, whereas it can take days before the skin reacts to things like nickel and perfume. But now a new study conducted by the LEO Foundation Skin Immunology Research Center at the University of Copenhagen changes this understanding.

'Some patients develop allergic contact dermatitis at a much earlier stage than described by text books. The aim of the study was therefore to try to determine why some react to contact allergens much faster than prescribed. It turns out that when a part of the skin is exposed to the allergen for the first time, the cells within that specific skin area will develop local memory towards the contact allergen. And then when the same area is re-exposed to the allergen at a later point in time, the patient will develop a clear reaction within only 12 hours', explains PhD Student and first author of the study Anders Boutrup Funch.

It is the T cells in the body that are responsible for delayed allergic reactions -- also known as type 4 allergic reactions. But in the new study, conducted on mice, the researchers have shown that the T cells are capable of building a sophisticated memory that enables them to respond much faster than previously assumed. This gives us a more complex picture of contact allergy.

'We point to a need for clarification of this disease. Type 4 reactions should be subcategorised, giving us both the classic delayed reaction -- that is, where the patient reacts 24-72 hours after exposure -- and an immediate reaction, where the patient develops symptoms much faster. Based on these results, we may have to change the text books on contact allergy. In any case, we will need to add a chapter', says the main author of the study, Professor Charlotte Menné Bonefeld.

The study also reveals that activation of the memory T cells following exposure to an allergen leads to massive recruitment of the most abundant type of white blood cell in the body -- the so-called neutrophils -- to the affected part of the skin. Normally, neutrophil recruitment is used to fight infections, as these cells are capable of effectively eliminating microorganisms. At the same time, they cause intense inflammation and local tissue damage, which is what the patients experience as a rash. Neutrophil recruitment is not seen in connection with delayed reactions to contact allergens.

The next step in the research is to test the study results on humans. Once a person has developed contact allergy, they are likely to suffer from it for the rest of their lives. Therefore, the researchers behind the study hope the new knowledge may improve contact allergy patients' chances of getting treatment in the future.

Read more at Science Daily

Jul 12, 2021

How the universe is reflected near black holes

In the vicinity of black holes, space is so warped that even light rays may curve around them several times. This phenomenon may enable us to see multiple versions of the same thing. While this has been known for decades, only now do we have an exact mathematical expression, thanks to Albert Sneppen, a student at the Niels Bohr Institute. The result, which is even more useful for realistic black holes, has just been published in the journal Scientific Reports.

You have probably heard of black holes -- the marvelous lumps of gravity from which not even light can escape. You may also have heard that space itself and even time behave oddly near black holes; space is warped.

In the vicinity of a black hole, space curves so much that light rays are deflected, and very nearby light can be deflected so much that it travels several times around the black hole. Hence, when we observe a distant background galaxy (or some other celestial body), we may be lucky to see the same image of the galaxy multiple times, albeit more and more distorted.

Galaxies in multiple versions

The mechanism works as follows: a distant galaxy shines in all directions -- some of its light comes close to the black hole and is lightly deflected; some light comes even closer and circles the hole a single time before escaping down to us, and so on. The closer to the edge of the hole we look, the more versions of the same galaxy we see.

How much closer to the black hole do you have to look from one image to see the next image? The result has been known for over 40 years: the factor is some 500 (for the math aficionados, it is more accurately the exponential function of two pi, written e^(2π)).
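
The factor itself takes one line to verify; a minimal sketch:

```python
import math

# Each successive image of a background source appears e^(2*pi) times
# closer to the edge of a non-rotating black hole's shadow.
factor = math.exp(2 * math.pi)
print(f"e^(2*pi) = {factor:.1f}")  # ~535.5, i.e. "some 500 times"
```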

Calculating this is so complicated that, until recently, we had not yet developed a mathematical and physical intuition as to why it happens to be this exact factor. But using some clever, mathematical tricks, master's student Albert Sneppen from the Cosmic Dawn Center -- a basic research center under both the Niels Bohr Institute and DTU Space -- has now succeeded in proving why.

"There is something fantastically beautiful in now understanding why the images repeat themselves in such an elegant way. On top of that, it provides new opportunities to test our understanding of gravity and black holes," Albert Sneppen clarifies.

Proving something mathematically is not only satisfying in itself; indeed, it brings us closer to an understanding of this marvelous phenomenon. The factor "500" follows directly from how black holes and gravity work, so the repetitions of the images now become a way to examine and test gravity.

Spinning black holes

As a completely new feature, Sneppen's method can also be generalized to apply not only to "trivial" black holes, but also to black holes that rotate, which, in fact, they all do.

"It turns out that when the it rotates really fast, you no longer have to get closer to the black hole by a factor 500, but significantly less. In fact, each image is now only 50, or 5, or even down to just 2 times closer to the edge of the black hole," explains Albert Sneppen.

Read more at Science Daily

Teardrop star reveals hidden supernova doom

Astronomers have made the rare sighting of two stars spiralling to their doom by spotting the tell-tale signs of a teardrop-shaped star.

The tragic shape is caused by a massive nearby white dwarf distorting the star with its intense gravity, which will also be the catalyst for an eventual supernova that will consume both. Found by an international team of astronomers and astrophysicists led by the University of Warwick, it is one of only a very small number of star systems discovered that will one day see a white dwarf star reignite its core.

New research published by the team today (12 July) in Nature Astronomy confirms that the two stars are in the early stages of a spiral that will likely end in a Type Ia supernova, a type that helps astronomers determine how fast the universe is expanding.

This research received funding from the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) and the Science and Technology Facilities Council, part of UK Research and Innovation.

HD265435 is located roughly 1,500 light years away and comprises a hot subdwarf star and a white dwarf star orbiting each other closely with a period of around 100 minutes. White dwarfs are 'dead' stars that have burnt out all their fuel and collapsed in on themselves, making them small but extremely dense.

A Type Ia supernova is generally thought to occur when a white dwarf star's core reignites, leading to a thermonuclear explosion. There are two scenarios where this can happen. In the first, the white dwarf gains enough mass to reach 1.4 times the mass of our Sun, known as the Chandrasekhar limit. HD265435 fits in the second scenario, in which the total mass of a close stellar system of multiple stars is near or above this limit. Only a handful of other star systems have been discovered that will reach this threshold and result in a Type Ia supernova.

Lead author Dr Ingrid Pelisoli from the University of Warwick Department of Physics, and formerly affiliated with the University of Potsdam, explains: "We don't know exactly how these supernovae explode, but we know it has to happen because we see it happening elsewhere in the universe.

"One way is if the white dwarf accretes enough mass from the hot subdwarf, so as the two of them are orbiting each other and getting closer, matter will start to escape the hot subdwarf and fall onto the white dwarf. Another way is that because they are losing energy to gravitational wave emissions, they will get closer until they merge. Once the white dwarf gains enough mass from either method, it will go supernova."

Using data from NASA's Transiting Exoplanet Survey Satellite (TESS), the team were able to observe the hot subdwarf, but not the white dwarf, as the hot subdwarf is much brighter. However, that brightness varies over time, which suggested the star was being distorted into a teardrop shape by a nearby massive object. Using radial velocity and rotational velocity measurements from the Palomar Observatory and the W. M. Keck Observatory, and by modelling the massive object's effect on the hot subdwarf, the astronomers could confirm that the hidden white dwarf is as heavy as our Sun but slightly smaller than the Earth in radius.

Combined with the mass of the hot subdwarf, which is a little over 0.6 times the mass of our Sun, both stars have the mass needed to cause a Type Ia supernova. As the two stars are already close enough to begin spiralling closer together, the white dwarf will inevitably go supernova in around 70 million years. Theoretical models produced specifically for this study predict that the hot subdwarf will contract to become a white dwarf star as well before merging with its companion.
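
As a back-of-the-envelope check on the figures quoted above (a sketch, not the authors' modelling): Kepler's third law gives the orbital separation from the roughly 100-minute period, and the standard gravitational-wave inspiral time for a circular binary then lands in the right ballpark.

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg
YEAR = 3.156e7     # seconds per year

m1 = 1.0 * M_SUN   # white dwarf, "as heavy as our Sun"
m2 = 0.6 * M_SUN   # hot subdwarf, "a little over 0.6" solar masses
P = 100 * 60       # orbital period of ~100 minutes, in seconds

# Combined mass vs. the 1.4 M_sun Chandrasekhar limit
print("Exceeds Chandrasekhar limit:", (m1 + m2) / M_SUN > 1.4)

# Orbital separation from Kepler's third law
a = (G * (m1 + m2) * P**2 / (4 * math.pi**2)) ** (1 / 3)

# Inspiral time for a circular binary losing energy to gravitational
# waves: t = (5/256) * c^5 * a^4 / (G^3 * m1 * m2 * (m1 + m2))
t = (5 / 256) * c**5 * a**4 / (G**3 * m1 * m2 * (m1 + m2))
print(f"Merger in roughly {t / YEAR / 1e6:.0f} million years")  # ~70
```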

Type Ia supernovae are important for cosmology as 'standard candles'. Their peak brightness is essentially constant and has a characteristic light signature, which means astronomers can compare the luminosity they should have with the brightness we observe on Earth, and from that work out their distance with a good degree of accuracy. By observing supernovae in distant galaxies, astronomers combine what they know of how fast the host galaxy is moving with the distance to the supernova, and from that calculate the expansion rate of the universe.
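
The standard-candle logic reduces to the distance modulus; a minimal sketch, with an invented apparent magnitude:

```python
# Distance modulus: m - M = 5 * log10(d / 10 parsecs)
M = -19.3  # typical peak absolute magnitude of a Type Ia supernova
m = 16.7   # hypothetical measured apparent magnitude

d_parsecs = 10 ** ((m - M + 5) / 5)
print(f"Distance: {d_parsecs / 1e6:.0f} Mpc")  # ~158 Mpc here
```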

Dr Pelisoli adds: "The more we understand how supernovae work, the better we can calibrate our standard candles. This is very important at the moment because there's a discrepancy between what we get from this kind of standard candle, and what we get through other methods.

"The more we understand about how supernovae form, the better we can understand whether this discrepancy we are seeing is because of new physics that we're unaware of and not taking into account, or simply because we're underestimating the uncertainties in those distances.

Read more at Science Daily

You can snuggle wolf pups all you want, they still won't 'get' you quite like your dog

You know your dog gets your gist when you point and say "go find the ball" and he scampers right to it.

This knack for understanding human gestures may seem unremarkable, but it's a complex cognitive ability that is rare in the animal kingdom. Our closest relatives, the chimpanzees, can't do it. And the dogs' closest relative, the wolf, can't either, according to a new Duke University-led study published July 12 in the journal Current Biology.

More than 14,000 years of hanging out with us has done a curious thing to the minds of dogs. They have what are known as "theory of mind" abilities, or mental skills allowing them to infer what humans are thinking and feeling in some situations.

The study, a comparison of 44 dog and 37 wolf puppies who were between 5 and 18 weeks old, supports the idea that domestication changed not just how dogs look, but their minds as well.

At the Wildlife Science Center in Minnesota, wolf puppies were first genetically tested to make sure they were not wolf-dog hybrids. The wolf puppies were then raised with plenty of human interaction. They were fed by hand, slept in their caretakers' beds each night, and received nearly round-the-clock human care from just days after birth. In contrast, the dog puppies from Canine Companions for Independence lived with their mother and littermates and had less human contact.

Then the canines were tested. In one test, the researchers hid a treat in one of two bowls, then gave each dog or wolf puppy a clue to help them find the food. In some trials, the researchers pointed and gazed in the direction the food was hidden. In others, they placed a small wooden block beside the right spot -- a gesture the puppies had never seen before -- to show them where the treat was hidden.

The results were striking. Even with no specific training, dog puppies as young as eight weeks old understood where to go, and were twice as likely to get it right as wolf puppies the same age who had spent far more time around people.

Seventeen out of 31 dog puppies consistently went to the right bowl. In contrast, none out of 26 human-reared wolf pups did better than a random guess. Control trials showed the puppies weren't simply sniffing out the food.
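
To see what "better than a random guess" means here, consider a hedged sketch (the trial counts are hypothetical, not the study's data): with two bowls, a guessing puppy succeeds with probability 0.5, and a one-sided binomial test flags performance above chance.

```python
from scipy.stats import binomtest

# Hypothetical example: a puppy picks the correct bowl 10 times out of
# 12 two-choice trials. Under pure guessing, success probability is 0.5.
result = binomtest(k=10, n=12, p=0.5, alternative="greater")
print(f"p-value: {result.pvalue:.4f}")  # ~0.019: better than chance
```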

Even more impressive, many of the dog puppies got it right on their first trial. Absolutely no training necessary. They just get it.

It's not about which species is "smarter," said first author Hannah Salomons, a doctoral student in Brian Hare's lab at Duke. Dog puppies and wolf puppies proved equally adept in tests of other cognitive abilities, such as memory, or motor impulse control, which involved making a detour around transparent obstacles to get food.

It was only when it came to the puppies' people-reading skills that the differences became clear.

"There's lots of different ways to be smart," Salomons said. "Animals evolve cognition in a way that will help them succeed in whatever environment they're living in."

Other tests showed that dog puppies were also 30 times more likely than wolf pups to approach a stranger.

"With the dog puppies we worked with, if you walk into their enclosure they gather around and want to climb on you and lick your face, whereas most of the wolf puppies run to the corner and hide," Salomons said.

And when presented with food inside a container that was sealed so they could no longer retrieve it, the wolf pups generally tried to solve the problem on their own, whereas the dog puppies spent more time turning to people for help, looking them in the eye as if to say: "I'm stuck, can you fix this?"

Senior author Brian Hare says the research offers some of the strongest evidence yet of what's become known as the "domestication hypothesis."

Somewhere between 12,000 and 40,000 years ago, long before dogs learned to fetch, they shared an ancestor with wolves. How such feared and loathed predators transformed into man's best friend is still a bit of a mystery. But one theory is that, when humans and wolves first met, only the friendliest wolves would have been tolerated and gotten close enough to scavenge on humans' leftovers instead of running away. Whereas the shyer, surlier wolves might go hungry, the friendlier ones would survive and pass on the genes that made them less fearful or aggressive toward humans.

The theory is that this continued generation after generation, until the wolf's descendants became masters at gauging the intentions of people they interact with by deciphering their gestures and social cues.

"This study really solidifies the evidence that the social genius of dogs is a product of domestication," said Hare, professor of evolutionary anthropology at Duke.

It's this ability that makes dogs such great service animals, Hare said. "It is something they are really born prepared to do."

Much like human infants, dog puppies intuitively understand that when a person points, they're trying to tell them something, whereas wolf puppies don't.

"We think it indicates a really important element of social cognition, which is that others are trying to help you," Hare said.

Read more at Science Daily

People given 'friendly' bacteria in nose drops protected against meningitis, study suggests

Led by Professor Robert Read and Dr Jay Laver from the NIHR Southampton Biomedical Research Centre and the University of Southampton, the work is the first of its kind.

Together they inserted a gene into a harmless type of bacteria, allowing it to remain in the nose and trigger an immune response. They then introduced these bacteria into the noses of healthy volunteers via nose drops.

The results, published in the journal Science Translational Medicine, showed a strong immune response against the bacteria that cause meningitis, as well as long-lasting protection.

Meningitis occurs in people of all age groups but affects mainly infants, young children and the elderly. Meningococcal meningitis is a bacterial form of the disease, causing 1,500 cases a year in the UK. It can lead to death in as little as four hours after symptoms start.

Around 10% of adults carry N. meningitidis in the back of their nose and throat with no signs or symptoms. However, in some people it can invade the bloodstream. That can lead to life-threatening conditions including meningitis and blood poisoning (septicaemia).

The 'friendly' bacteria Neisseria lactamica (N. lactamica) also lives in some people's noses naturally. By occupying the nose, it protects from a severe type of meningitis. It does so by denying a foothold to its close cousin Neisseria meningitidis (N. meningitidis).

The new data build on the team's previous work aiming to exploit this natural phenomenon. That study showed nose drops of N. lactamica prevented N. meningitidis from settling in 60% of participants.

For those people, N. lactamica had locked out its deadly cousin. That drove work to make N. lactamica even more effective in displacing N. meningitidis.

The team did so by handing it one of N. meningitidis' key weapons: a 'sticky' surface protein that grips the cells lining the nose. By inserting a copy of the gene for this protein into N. lactamica's DNA, they enabled it to grip those cells too -- levelling the playing field.

As well as inducing a stronger immune response, the modified bacteria also stayed in the nose longer: they were present for at least 28 days, most participants (86%) were still carrying them at 90 days, and they caused no adverse symptoms.

The results of the study, which was funded by the Medical Research Council, are promising for this new way of preventing life-threatening infections, without drugs. It's an approach that could be critical in the face of growing antimicrobial resistance.

Dr Jay Laver, Senior Research Fellow in Molecular Microbiology at the University of Southampton, commented: "Although this study has identified the potential of our recombinant N. lactamica technology for protecting people against meningococcal disease, the underlying platform technology has broader applications.

"It is theoretically possible to express any antigen in our bacteria, which means we can potentially adapt them to combat a multitude of infections that enter the body through the upper respiratory tract. In addition to the delivery of vaccine antigens, advances in synthetic biology mean we might also use genetically modified bacteria to manufacture and deliver therapeutics molecules in the near future."

Read more at Science Daily

Technology that restores the sense of touch in nerves damaged as a result of injury

Tel Aviv University's new and groundbreaking technology inspires hope among people who have lost their sense of touch in the nerves of a limb following amputation or injury. The technology involves a tiny sensor that is implanted in the nerve of the injured limb, for example in the finger, and is connected directly to a healthy nerve. Each time the limb touches an object, the sensor is activated and conducts an electric current to the functioning nerve, which recreates the feeling of touch. The researchers emphasize that this is a tested and safe technology that is suited to the human body and could be implanted anywhere inside it once clinical trials are complete.

The technology was developed under the leadership of a team of experts from Tel Aviv University: Dr. Ben M. Maoz, Iftach Shlomy, Shay Divald, and Dr. Yael Leichtmann-Bardoogo from the Department of Biomedical Engineering, Fleischman Faculty of Engineering, in collaboration with Keshet Tadmor from the Sagol School of Neuroscience and Dr. Amir Arami from the Sackler School of Medicine and the Microsurgery Unit in the Department of Hand Surgery at Sheba Medical Center. The study was published in the journal ACS Nano.

The researchers say that this unique project began with a meeting between the two Tel Aviv University colleagues -- biomedical engineer Dr. Maoz and surgeon Dr. Arami. "We were talking about the challenges we face in our work," says Dr. Maoz, "and Dr. Arami shared with me the difficulty he experiences in treating people who have lost tactile sensation in one organ or another as a result of injury. It should be understood that this loss of sensation can result from a very wide range of injuries, from minor wounds -- like someone chopping a salad and accidentally cutting himself with the knife -- to very serious injuries. Even if the wound can be healed and the injured nerve can be sutured, in many cases the sense of touch remains damaged. We decided to tackle this challenge together, and find a solution that will restore tactile sensation to those who have lost it."

In recent years, the field of neural prostheses has made promising developments to improve the lives of those who have lost sensation in their limbs by implanting sensors in place of the damaged nerves. But the existing technology has a number of significant drawbacks, such as complex manufacturing and use, as well as the need for an external power source, such as a battery. Now, the researchers at Tel Aviv University have used state-of-the-art technology called a triboelectric nanogenerator (TENG) to engineer and test on animal models a tiny sensor that restores tactile sensation via an electric current that comes directly from a healthy nerve and doesn't require a complex implantation process or charging.

The researchers developed a sensor that can be implanted on a damaged nerve under the tip of the finger; the sensor connects to another nerve that functions properly and restores some of the tactile sensation to the finger. This unique development does not require an external power source such as electricity or batteries. The researchers explain that the sensor actually works on frictional force: whenever the device senses friction, it charges itself.

The device consists of two tiny plates less than half a centimeter by half a centimeter in size. When these plates come into contact with each other, they release an electric charge that is transmitted to the undamaged nerve. When the injured finger touches something, the touch produces a voltage corresponding to the pressure applied to the device -- a weak voltage for a weak touch and a strong voltage for a strong touch -- just like in a normal sense of touch.
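
That proportionality can be sketched as a simple map from contact pressure to stimulation voltage; the gain and saturation cap below are purely hypothetical, for illustration only.

```python
# Hypothetical sketch of the pressure-to-voltage behaviour described
# above; the gain and cap are invented, not the device's parameters.
def stimulation_voltage(pressure_kpa: float, gain: float = 0.05,
                        v_max: float = 3.0) -> float:
    """Scale contact pressure to an output voltage, capped at v_max."""
    return min(gain * pressure_kpa, v_max)

for pressure in (5.0, 20.0, 50.0):  # light, medium, firm touch
    print(f"{pressure:>4} kPa -> {stimulation_voltage(pressure):.2f} V")
```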

The researchers explain that the device can be implanted anywhere in the body where tactile sensation needs to be restored, and that it actually bypasses the damaged sensory organs. Moreover, the device is made from biocompatible material that is safe for use in the human body, it does not require maintenance, the implantation is simple, and the device itself is not externally visible.

Read more at Science Daily