Dec 7, 2019

First 'lab in a field' experiment reveals a sunnier side of climate change

Pioneering experiments using heated field plots to test the responses of crops to temperature have revealed an unexpected plus side of climate change for farmers.

The field trial experiment -- the first of its kind -- was set up to investigate the link between warmer Octobers in the United Kingdom and higher yields of oilseed rape.

The crop, planted in autumn and harvested early the following summer, is particularly sensitive to temperature at certain times of the year with annual yields varying by up to 30 percent as a result. It is known that warmer temperatures in October are correlated with higher oilseed rape yields, but the reason for this trend was unclear.

The results of this study by the John Innes Centre reveal that the temperature in October is surprisingly important for the timing of flowering, and that warmer Octobers result in a delay to flowering the following spring.

Professor Steve Penfield, an author of the study, says, "We found that oilseed rape plants stop growing when they go through the floral transition at the end of October, and that warmer temperatures at this time of year enable the plant to grow for longer, giving more potential for higher yields."

The good news for growers of oilseed rape is that Met Office data shows cold Octobers are now much less frequent than they were in the past.

"By establishing the link between autumn temperatures and yield, our study highlights an example of climate change being potentially useful to farmers. Cold Octobers have a negative effect on yield if you are growing oilseed rape, and these are now rarer," says Professor Penfield.

Temperature is critical to the oilseed rape lifecycle because it determines when the plant goes through the transition from the vegetative state to flowering, with delays in flowering being associated with higher yields.

This process, called vernalisation, is well understood in the lab as a requirement for prolonged exposure to cold temperatures. But a growing body of research suggests vernalisation may work differently under the more variable conditions a plant experiences in the field.

In this study the team used soil surface warming cables to raise the temperature of field plots by between 4 and 8 degrees Celsius, simulating warmer October temperatures. Two varieties of oilseed rape with differing vernalisation requirements were trialled.

Lab tests on dissected plants showed that the warmer October conditions delayed the floral transition by three to four weeks in both varieties. Genetic tests showed that genes associated with vernalisation in cold conditions were also highly expressed in the warm conditions.

The study shows that vernalisation in oilseed rape takes place predominantly during October, when the mean temperature is between 10 and 12 degrees Celsius.

The technology used in the study has been used before in natural grasslands to simulate winter warming, but the trials conducted by the John Innes Centre research team mark the first time it has been used on a crop in the field.

Read more at Science Daily

Clinical study finds eating within 10-hour window may help stave off diabetes, heart disease

Metabolic syndrome affects nearly 30 percent of the U.S. population, and increases the risk for type 2 diabetes, heart disease and stroke. But lifestyle interventions such as adopting a healthy diet and increasing physical exercise are difficult to maintain and, even when combined with medication, are often insufficient to fully manage the disease.

Now, in a collaborative effort, researchers from the Salk Institute and the UC San Diego School of Medicine found that a 10-hour time-restricted eating intervention, when combined with traditional medications, resulted in weight loss, reduced abdominal fat, lower blood pressure and cholesterol, and more stable blood sugar and insulin levels for participants. The pilot study, published in Cell Metabolism on December 5, 2019, could lead to a new treatment option for metabolic syndrome patients who are at risk for developing life-altering and costly medical conditions such as diabetes.

"We have found that combining time-restricted eating with medications can give metabolic syndrome patients the ability to better manage their disease," says Satchidananda Panda, co-corresponding author and professor in Salk's Regulatory Biology Laboratory. "Unlike counting calories, time-restricted eating is a simple dietary intervention to incorporate, and we found that participants were able to keep the eating schedule."

Time-restricted eating (eating all calories within a consistent 10-hour window) supports an individual's circadian rhythms and can maximize health benefits, as evidenced by previous research published by the Salk team. Circadian rhythms are the 24-hour cycles of biological processes that affect nearly every cell in the body. Increasingly, scientists are finding that erratic eating patterns can disrupt this system and increase the risk for metabolic syndrome and other metabolic disorders with such symptoms as increased abdominal fat, abnormal cholesterol or triglycerides, and high blood pressure and blood sugar levels.

"Eating and drinking everything (except water) within a consistent 10-hour window allows your body to rest and restore for 14 hours at night. Your body can also anticipate when you will eat so it can prepare to optimize metabolism," says Emily Manoogian, the paper's co-first author and a postdoctoral fellow in the Panda lab. "We wanted to know if controlling the timing of food intake to support circadian rhythms would improve the health of individuals that were already being treated for cardiometabolic diseases."

"We suspected a 10-hour eating intervention might be beneficial because of Satchidananda Panda's pioneering work in animals, which showed that time-restricted eating led to dramatic health benefits, including a healthier metabolism," adds Michael Wilkinson, co-first author, assistant clinical professor of medicine at UC San Diego School of Medicine and a cardiologist at UC San Diego Health.

The pilot study included 19 participants (13 men and 6 women) diagnosed with metabolic syndrome who self-reported eating during a time window of more than 14 hours per day. Additionally, 84 percent of participants were taking at least one medication such as a statin or an antihypertensive therapy. Study participants used the Panda lab's myCircadianClock app to log when and what they ate during an initial 2-week baseline period followed by the three-month, 10-hour time-restricted eating intervention. Nearly 86 percent of participants correctly logged their food using the app, indicating high compliance throughout the study.

Participants did not report any adverse effects during the intervention. To fit their food intake into the 10-hour window, most participants delayed their first meal and advanced their last meal each day, so meals were not skipped. Although participants were not asked to reduce their calorie intake, some reported eating less, likely because of the shorter eating window.

Overall, participants experienced improved sleep as well as a 3-4 percent reduction in body weight, body mass index, abdominal fat and waist circumference. Major risk factors for heart disease were diminished as participants showed reduced blood pressure and total cholesterol. Blood sugar levels and insulin levels also showed a trend toward improvement.

"Metabolism is closely linked with circadian rhythms, and knowing this, we were able to develop an intervention to help patients with metabolic syndrome without decreasing calories or increasing physical exercise," says Pam Taub, co-corresponding author and associate professor of medicine at the UC San Diego School of Medicine and a cardiologist at UC San Diego Health. "If we can optimize circadian rhythms then we might be able to optimize the metabolic system."

"Adapting this 10-hour time-restricted eating is an easy and cost-effective method for reducing symptoms of metabolic syndrome and improving health," adds Panda. "By delaying the onset of diabetes by even one year in a million people with prediabetes, the intervention could save roughly 9.6 billion dollars in healthcare costs."

Read more at Science Daily

Dec 6, 2019

Dull teeth, long skulls, specialized bites evolved in unrelated plant-eating dinosaurs

Herbivorous dinosaurs evolved many times during the 180 million-year Mesozoic era, and while they didn't all evolve to chew, swallow, and digest their food in the same way, a few specific strategies appeared time and time again. An investigation of the skulls of 160 non-avian dinosaurs revealed the evolution of common traits in the skulls and teeth of plant-eating members of otherwise very different families of these extinct reptiles. These new examples of convergent evolution in plant-eating dinosaurs appear December 5 in the journal Current Biology.

"People often think of dinosaurs as a swansong for extinction or that they were a failed species. But they were actually extremely successful in terms of how different species' anatomies evolved -- particularly in herbivores," says co-senior author David J. Button (@ItsDavidButton), a paleontologist at the Natural History Museum, London.

By looking at herbivorous and carnivorous dinosaur skulls, Button and co-senior author Lindsay Zanno, a professor at North Carolina State University and the head of paleontology at the North Carolina Museum of Natural Sciences, found that while there are many ways for dinosaurs that eat similar foods to evolve, some traits reappear during evolution, even in unrelated species.

Herbivorous dinosaurs came in all shapes and sizes. Some exhibited dull, flat teeth like horses, while others had beaked faces like tortoises; some developed towering necks like giraffes, while others mimicked the short and stout build of a rhino. "Nonetheless, we see the evolution of common traits in the skull between these otherwise very different herbivorous dinosaur groups," explains Button.

"For example, both the ostrich-like ornithomimosaurs and giant titanosaurs independently evolved elongate skulls and weaker bites, whereas the horned ceratopsians and gazelle-like ornithopods sported more powerful jaws and grinding teeth," he says. These are results of convergent evolution, where adaptation to a diet of plants led to the evolution of common characters in different dinosaur groups.

The researchers hypothesized that some traits would be most common in plant-eaters. Slow-moving dinosaurs with small heads and dull teeth would likely have a difficult time wrapping their jaws around the neck of another dinosaur, in the way a carnivore like the Tyrannosaurus is thought to have done with ease. Instead, eating plants poses other challenges, such as grinding down tough plant stems.

"There's a tradeoff between biting speed and biting efficiency," says Button. "If you're a herbivorous animal, you don't really need speed because plants don't move very fast."

Some of the results of this functional analysis surprised the researchers, however. That was the case when investigating the eating habits of ankylosaurs, armored, armadillo-like plant-eating dinosaurs with small teeth and a large stomach cavity. Researchers previously thought dinosaurs with these traits usually swallowed their food nearly whole and let their gut break it down. "In our results, we found that ankylosaurs actually may have chewed their food more thoroughly than is often thought. So, that was interesting," says Button.

In the future, Button and Zanno hope to look at the entire skeleton of herbivorous dinosaurs for similar, recurring traits. They also plan to expand this work to better understand predominant traits in carnivores, though Button admits plant-eaters will always be his favorite dinosaurs to study.

Read more at Science Daily

Through the eyes of animals

Humans are now closer to seeing through the eyes of animals, thanks to an innovative software framework developed by researchers from the University of Queensland and the University of Exeter.

PhD candidate Cedric van den Berg from UQ's School of Biological Sciences said that, until now, it has been difficult to understand how animals really saw the world.

"Most animals have completely different visual systems to humans, so -- for many species -- it is unclear how they see complex visual information or colour patterns in nature, and how this drives their behaviour," he said.

"The Quantitative Colour Pattern Analysis (QCPA) framework is a collection of innovative digital image processing techniques and analytical tools designed to solve this problem.

"Collectively, these tools greatly improve our ability to analyse complex visual information through the eyes of animals."

Dr Jolyon Troscianko, the study's co-leader from the University of Exeter, said colour patterns have been key to understanding many fundamental evolutionary problems, such as how animals signal to each other or hide from predators.

"We have known for many years that understanding animal vision and signalling depends on combining colour and pattern information, but the available techniques were near impossible to implement without some key advances we developed for this framework."

The framework's use of digital photos means it can be used in almost any habitat -- even underwater -- using anything from off-the-shelf cameras to sophisticated full-spectrum imaging systems.

"You can even access most of its capabilities by using a cheap (~ $110 AUD, £60 GBP, $80 USD) smartphone to capture photos," Dr Troscianko said.

It took four years to develop and test the technology, which included the development of an extensive interactive online platform to provide researchers, teachers and students with user-guides, tutorials and worked examples of how to use the tools.

UQ's Dr Karen Cheney said the framework can be applied to a wide range of environmental conditions and visual systems.

"The flexibility of the framework allows researchers to investigate the colour patterns and natural surroundings of a wide range of organisms, such as insects, birds, fish and flowering plants," she said.

"For example, we can now truly understand the impacts of coral bleaching for camouflaged reef creatures in a new and informative way."

"We're helping people -- wherever they are -- to cross the boundaries between human and animal visual perception."

Read more at Science Daily

Signs of life: New field guide aids astronomers' search

A Cornell University senior has come up with a way to discern life on exoplanets loitering in other cosmic neighborhoods: a spectral field guide.

Zifan Lin has developed high-resolution spectral models and scenarios for two exoplanets that may harbor life: Proxima b, in the habitable zone of our nearest neighbor Proxima Centauri; and Trappist-1e, one of three possible Earth-like exoplanet candidates in the Trappist-1 system.

The paper, co-authored with Lisa Kaltenegger, associate professor of astronomy and director of Cornell's Carl Sagan Institute, was published in Monthly Notices of the Royal Astronomical Society.

"In order to investigate whether there are signs of life on other worlds, it is very important to understand signs of life that show in a planet's light fingerprint," Lin said. "Life on exoplanets can produce a characteristic combination of molecules in its atmosphere -- and those become telltale signs in the spectra of such planets.

"In the near future we will be seeing the atmosphere of these worlds with new, sophisticated ground-based telescopes, which will allow us to explore the exoplanet's climate and might spot its biota," he said.

In the search for habitable worlds, "M dwarf" stars catch astronomers' eyes, since the local universe teems with these suns, which make up 75% of the nearby cosmos, according to Lin.

Throughout the Milky Way, our home galaxy, astronomers have discovered more than 4,000 exoplanets, some in their own suns' habitable zone -- an area that provides conditions suitable for life.

To explore the atmosphere of these places, scientists need large next-generation telescopes, such as the Extremely Large Telescope (ELT), which is currently under construction in northern Chile's Atacama Desert and expected to be operational in 2025. Scientists can aim the mammoth eyepiece -- with a flawless primary mirror about half the size of a football field -- at Proxima b and Trappist-1e. The future telescope will have more than 250 times the light-gathering power of the Hubble Space Telescope.
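That light-gathering comparison follows from collecting area, which scales with the square of the mirror diameter. A quick check, assuming the ELT's planned 39-metre primary mirror and Hubble's 2.4-metre mirror (figures not given in the article itself):

    d_elt = 39.0     # metres, assumed ELT primary mirror diameter
    d_hubble = 2.4   # metres, Hubble primary mirror diameter
    ratio = (d_elt / d_hubble) ** 2
    print(f"{ratio:.0f}x")   # ~264x, consistent with "more than 250 times"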

Lin and Kaltenegger said the high-resolution spectrographs from the ELT can discern water, methane and oxygen for both Proxima b and Trappist-1e, if these planets are like our own pale blue dot.

Read more at Science Daily

NASA's OSIRIS-REx mission explains Asteroid Bennu's mysterious particle events

Shortly after NASA's OSIRIS-REx spacecraft arrived at asteroid Bennu, an unexpected discovery by the mission's science team revealed that the asteroid could be active, or consistently discharging particles into space. The ongoing examination of Bennu -- and its sample that will eventually be returned to Earth -- could potentially shed light on why this intriguing phenomenon is occurring.

The OSIRIS-REx team first observed a particle ejection event in images captured by the spacecraft's navigation cameras taken on Jan. 6, just a week after the spacecraft entered its first orbit around Bennu. At first glance, the particles appeared to be stars behind the asteroid, but on closer examination, the team realized that the asteroid was ejecting material from its surface. After concluding that these particles did not compromise the spacecraft's safety, the mission began dedicated observations in order to fully document the activity.

"Among Bennu's many surprises, the particle ejections sparked our curiosity, and we've spent the last several months investigating this mystery," said Dante Lauretta, OSIRIS-REx principal investigator at the University of Arizona, Tucson. "This is a great opportunity to expand our knowledge of how asteroids behave."

After studying the results of the observations, the mission team released their findings in a Science paper published Dec. 6. The team observed the three largest particle ejection events on Jan. 6 and 19, and Feb. 11, and concluded that the events originated from different locations on Bennu's surface. The first event originated in the southern hemisphere, and the second and third events occurred near the equator. All three events took place in the late afternoon on Bennu.

The team found that, after ejection from the asteroid's surface, the particles either briefly orbited Bennu and fell back to its surface or escaped from Bennu into space. The observed particles traveled up to 10 feet (3 meters) per second, and measured from smaller than an inch up to 4 inches (10 cm) in size. Approximately 200 particles were observed during the largest event, which took place on Jan. 6.
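A back-of-envelope escape-velocity estimate helps explain why some particles escape while others fall back. The mass and radius used below are commonly quoted values for Bennu, not figures from the article:

    import math

    G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
    M = 7.3e10      # kg, approximate mass of Bennu (assumed)
    R = 245.0       # m, approximate mean radius of Bennu (assumed)
    v_esc = math.sqrt(2 * G * M / R)
    print(f"{v_esc * 100:.0f} cm/s")  # ~20 cm/s, well below the ~3 m/s of the fastest observed particles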

The team investigated a wide variety of possible mechanisms that may have caused the ejection events, and narrowed the list to three candidates: meteoroid impacts, thermal stress fracturing, and the release of water vapor.

Meteoroid impacts are common in the deep space neighborhood of Bennu, and it is possible that these small fragments of space rock could be hitting Bennu where OSIRIS-REx is not observing it, shaking loose particles with the momentum of their impact.

The team also determined that thermal fracturing is another reasonable explanation. Bennu's surface temperatures vary drastically over its 4.3-hour rotation period. Although it is extremely cold during the night hours, the asteroid's surface warms significantly in the mid-afternoon, which is when the three major events occurred. As a result of this temperature change, rocks may begin to crack and break down, and eventually particles could be ejected from the surface. This cycle is known as thermal stress fracturing.

Water release may also explain the asteroid's activity. When Bennu's water-locked clays are heated, the water could begin to release and create pressure. It is possible that as pressure builds in cracks and pores in boulders where absorbed water is released, the surface could become agitated, causing particles to erupt.

But nature does not always allow for simple explanations. "It could be that more than one of these possible mechanisms are at play," said Steve Chesley, an author on the paper and Senior Research Scientist at NASA's Jet Propulsion Laboratory in Pasadena, Calif. "For example, thermal fracturing could be chopping the surface material into small pieces, making it far easier for meteoroid impacts to launch pebbles into space."

If thermal fracturing, meteoroid impacts, or both, are in fact the causes of these ejection events, then this phenomenon is likely happening on all small asteroids, as they all experience these mechanisms. However, if water release is the cause of these ejection events, then this phenomenon would be specific to asteroids that contain water-bearing minerals, like Bennu.

Read more at Science Daily

Dec 5, 2019

Hidden giant planet revealed around tiny white dwarf star

The first evidence of a giant planet orbiting a dead white dwarf star has been found in the form of a disc of gas formed from its evaporating atmosphere.

The Neptune-like planet orbits a star a quarter of its size about once every ten days, leaving a comet-like tail of gas comprised of hydrogen, oxygen and sulphur in its wake.
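For a sense of scale, Kepler's third law gives the rough orbital distance implied by a ten-day period, assuming a typical white dwarf mass of about 0.6 solar masses (an assumption; the study's exact value may differ):

    import math

    G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
    M = 0.6 * 1.989e30       # assumed white dwarf mass, kg
    P = 10 * 86400           # orbital period, s
    a = (G * M * P**2 / (4 * math.pi**2)) ** (1 / 3)
    print(f"{a / 6.957e8:.0f} solar radii")  # roughly 16 solar radii, about 0.08 AU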

The discovery by astronomers from the University of Warwick's Department of Physics and the Millennium Nucleus for Planet Formation (NPF) at the University of Valparaíso is published today (4 December) in the journal Nature. It is the first evidence of a giant planet orbiting a white dwarf star and suggests that there could be many more planets around such stars waiting to be discovered.

Until now, there has never been evidence of a planet that has survived a star's transition to a white dwarf.

The star WDJ0914+1914 was identified in a survey of ten thousand white dwarfs observed by the Sloan Digital Sky Survey. Scientists at Warwick analysed subtle variations in the light emitted from the system to identify the elements present around the star.

They detected minute spikes of hydrogen in the data, which was unusual in itself, as well as oxygen and sulphur, which they had never seen before. Using the Very Large Telescope of the European Southern Observatory in Chile to obtain more observations of this star, they found that the shapes of the hydrogen, oxygen and sulphur features are typical indicators of a ring of gas.

Lead author Dr Boris Gaensicke, from the University of Warwick, said: "At first, we thought that this was a binary star with an accretion disc formed from mass flowing between the two stars. However, our observations show that it is a single white dwarf with a disc around it roughly ten times the size of our sun, made solely of hydrogen, oxygen and sulphur. Such a system has never been seen before, and it was immediately clear to me that this was a unique star."

When the astronomers averaged all the spectra they obtained over two nights in Chile it was clear that WDJ0914+1914 was accreting sulphur and oxygen from the disc. Analysing the data, they were able to measure the composition of the disc, and concluded that it matches what scientists expect for the deeper layers of our own solar system's ice giants, Uranus and Neptune.

Dr Matthias Schreiber from the University of Valparaíso showed through a set of calculations that the 28,000 degrees Celsius hot white dwarf is slowly evaporating this hidden icy giant by bombarding it with high energy photons and pulling its lost mass into a gas disc around the star at a rate of over 3,000 tons per second.
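A crude timescale check underlines just how slow that evaporation is. Assuming the hidden planet has roughly Neptune's mass (an assumption; the article does not give a mass):

    m_planet = 1.02e26          # kg, roughly Neptune's mass (assumed)
    rate = 3000 * 1000          # 3,000 metric tons per second, in kg/s
    years = m_planet / rate / 3.15e7
    print(f"{years:.1e} years")  # ~1e12 years to strip the whole planet at this rate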

Dr Gaensicke said: "This star has a planet that we can't see directly, but because the star is so hot it is evaporating the planet, and we detect the atmosphere it is losing. There could be many cooler white dwarfs that have planets but lack the high-energy photons necessary to drive evaporation, so we wouldn't be able to find them with the same method. However, some of those planets might be detectable using the transit method once the Large Synoptic Survey Telescope goes on sky.

"This discovery is major progress because over the past two decades we had growing evidence that planetary systems survive into the white dwarf stage. We've seen a lot of asteroids, comets and other small planetary objects hitting white dwarfs, and explaining these events requires larger, planet-mass bodies further out. Having evidence for an actual planet that itself was scattered in is an important step."

Dr Schreiber adds: "In a sense, WDJ0914+1914 is providing us with a glimpse into the very distant future of our own solar system."

The white dwarf we see today was once a star similar to the sun, but it eventually ran out of fuel and swelled up into a red giant a few hundred times the size of the sun. During that phase of its life the star lost about half of its mass, and what was left shrank dramatically, ending up about the size of the Earth -- the white dwarf is essentially the burnt-out core of the former star.

Extraordinarily, the planet's present-day orbit around the white dwarf would have been deep inside the red giant, so gravitational scattering with other planets in the system -- a kind of cosmic pool game -- must have moved it close to the white dwarf after the red giant's outer layers were lost.

Once our own sun runs out of fuel in about 4.5 billion years, it will shed its outer layers, destroying Mercury, Venus, and probably the Earth, and eventually expose the burnt-out core -- the white dwarf. In a companion paper led by Dr Schreiber and Dr Gaensicke and published in Astrophysical Journal Letters, they detail how the resulting white dwarf will radiate enough high-energy photons to evaporate Jupiter, Saturn, Uranus and Neptune. Just as on WDJ0914+1914, some of that atmospheric gas will end up on the white dwarf left behind by the sun, and will be observable for future generations of alien astronomers.

The astronomers argue that this planetary evaporation and subsequent accretion by young white dwarfs is probably a relatively common process and that it might open a new window towards studying the chemical composition of the atmospheres of extrasolar gas giant planets.

Read more at Science Daily

A week in the dark rewires brain cell networks and changes hearing in adult mice

Scientists have known that depriving adult mice of vision can increase the sensitivity of individual neurons in the part of the brain devoted to hearing. New research from biologists at the University of Maryland revealed that sight deprivation also changes the way brain cells interact with one another, altering neuronal networks and shifting the mice's sensitivity to different frequencies. The research was published in the November 11, 2019 issue of the journal eNeuro.

"This study reinforces what we are learning about how manipulating vision can have a significant effect on the ability of an animal to hear long after the window for auditory learning was thought to have closed," said Patrick Kanold, professor of biology at UMD and senior author of the study.

It was once thought that the sensory regions of the brain were not adaptable after a critical period in childhood. This is why children learn languages much more readily than adults. Kanold's earlier research disproved this idea by showing that depriving adult mice of vision for a short period increased the sensitivity of individual neurons in the auditory cortex, which is devoted to hearing.

The current study expands on that earlier work. Kanold and his team investigated how exposure to darkness affects the way groups of neurons in the auditory cortex work together in response to a given sound -- which neurons are connected and which fire more powerfully or faster. The researchers placed adult mice in a dark space for one week and then played 17 different tones while measuring brain activity in the auditory cortex. Based on their earlier work, Kanold and his team expected to see changes in the neural networks, but they were surprised to find that groups of neurons changed in different ways.

Young brains wire themselves according to the sounds they hear frequently, allocating areas of the auditory cortex for specific frequencies based on what they are used to hearing. The researchers found that, in adult mice, a week in the dark also redistributed the allocation of space to different frequencies. In the areas of the auditory cortex they examined, the researchers saw an increase in the proportion of neurons that were sensitive to high and low frequencies and a decrease in the proportion of neurons that were sensitive to mid-range frequencies.

"We don't know why we are seeing these patterns," Kanold said. "We speculate that it may have to do with what the mice are paying attention to while they are in the dark. Maybe they pay attention to the noises or voices from the other mice, or maybe they're paying more attention to the footsteps they are making."

Kanold said his next steps include manipulating the sounds the mice are exposed to during the darkness phase of the experiment and monitoring brain activity to determine what aspects of their soundscape the mice are listening to. This will help the researchers understand the role of focus and attention in promoting change to the auditory neurons. Such information could be very useful in helping people adapt to cochlear implants or hearing aids.

Read more at Science Daily

Some stress in early life extends lifespan, research in roundworms shows

Some stress at a young age could actually lead to a longer life, new research shows.

University of Michigan researchers have discovered that oxidative stress experienced early in life increases subsequent stress resistance later in life.

Oxidative stress happens when cells produce more oxidants and free radicals than they can deal with. It's part of the aging process, but can also arise from stressful conditions such as exercise and calorie restriction.

Examining a type of roundworm called Caenorhabditis elegans, U-M scientists Ursula Jakob and Daphne Bazopoulou found that worms that produced more oxidants during development lived longer than worms that produced fewer oxidants. Their results are published in the journal Nature.

Researchers have long wondered what determines variability in lifespan, says Jakob, a professor of molecular, cellular and developmental biology. One part of that is genetics: If your parents are long-lived, you have a good chance of living longer as well. Environment is another part.

That other stochastic -- or random -- factors might be involved becomes clear in the case of C. elegans. These short-lived organisms are a popular model system among aging researchers in part because every hermaphroditic mother produces hundreds of genetically identical offspring. However, even if kept in the same environment, the lifespan of these offspring varies to a surprising extent, Jakob says.

"If lifespan was determined solely by genes and environment, we would expect that genetically identical worms grown on the same petri dish would all drop dead at about the same time, but this is not at all what happens. Some worms live only three days while others are still happily moving around after 20 days," Jakob said. "The question then is, what is it, apart from genetics and environment, that is causing this big difference in lifespan?"

Jakob and Bazopoulou, a postdoctoral researcher and lead author of the paper, found one part of the answer when they discovered that during development, C. elegans worms varied substantially in the amount of reactive oxygen species they produce.

Reactive oxygen species, or ROS, are oxidants that every air-breathing organism produces. ROS are closely associated with aging: the oxidative damage they elicit is what many anti-aging creams claim to combat. Bazopoulou and Jakob discovered that instead of having a shorter lifespan, worms that produced more ROS during development actually lived longer.

"Experiencing stress at this early point in life may make you better able to fight stress you might encounter later in life," Bazopoulou said.

When the researchers exposed the whole population of juvenile worms to external ROS during development, the average lifespan of the entire population increased. Though the researchers don't know yet what triggers the oxidative stress event during development, they were able to determine what processes enhanced the lifespan of these worms.

To do this, Bazopoulou sorted thousands of C. elegans larvae according to the oxidative stress levels they experienced during development. By separating worms that produced large amounts of ROS from those that produced small amounts, she showed that the main difference between the two groups was a histone modifier whose activity is sensitive to oxidative stress conditions.

The researchers found that the temporary production of ROS during development caused changes in the histone modifier early in the worm's life. How these changes persist throughout life and how they ultimately affect and extend lifespan is still unknown. What is known, however, is that this specific histone modifier is also sensitive to oxidative stress in mammalian cells. Additionally, early-life interventions have been shown to extend lifespans in mammalian model systems such as mice.

"The general idea that early life events have such profound, positive effects later in life is truly fascinating. Given the strong connection between stress, aging and age-related diseases, it is possible that early events in life might also affect the predisposition for age-associated diseases, such as dementia and Alzheimer's disease," Jakob said.

Read more at Science Daily

NASA's Parker Solar Probe sheds new light on the sun

In August 2018, NASA's Parker Solar Probe launched to space, soon becoming the closest-ever spacecraft to the Sun. With cutting-edge scientific instruments to measure the environment around the spacecraft, Parker Solar Probe has completed three of 24 planned passes through never-before-explored parts of the Sun's atmosphere, the corona. On Dec. 4, 2019, four new papers in the journal Nature describe what scientists have learned from this unprecedented exploration of our star -- and what they look forward to learning next.

These findings reveal new information about the behavior of the material and particles that speed away from the Sun, bringing scientists closer to answering fundamental questions about the physics of our star. In the quest to protect astronauts and technology in space, the information Parker has uncovered about how the Sun constantly ejects material and energy will help scientists re-write the models we use to understand and predict the space weather around our planet and understand the process by which stars are created and evolve.

"This first data from Parker reveals our star, the Sun, in new and surprising ways," said Thomas Zurbuchen, associate administrator for science at NASA Headquarters in Washington. "Observing the Sun up close rather than from a much greater distance is giving us an unprecedented view into important solar phenomena and how they affect us on Earth, and gives us new insights relevant to the understanding of active stars across galaxies. It's just the beginning of an incredibly exciting time for heliophysics with Parker at the vanguard of new discoveries."

Though it may seem placid to us here on Earth, the Sun is anything but quiet. Our star is magnetically active, unleashing powerful bursts of light, deluges of particles moving near the speed of light and billion-ton clouds of magnetized material. All this activity affects our planet, injecting damaging particles into the space where our satellites and astronauts fly, disrupting communications and navigation signals, and even -- when intense -- triggering power outages. It's been happening for the Sun's entire 5-billion-year lifetime, and will continue to shape the destinies of Earth and the other planets in our solar system into the future.

"The Sun has fascinated humanity for our entire existence," said Nour E. Raouafi, project scientist for Parker Solar Probe at the Johns Hopkins Applied Physics Laboratory in Laurel, Maryland, which built and manages the mission for NASA. "We've learned a great deal about our star in the past several decades, but we really needed a mission like Parker Solar Probe to go into the Sun's atmosphere. It's only there that we can really learn the details of these complex solar processes. And what we've learned in just these three solar orbits alone has changed a lot of what we know about the Sun."

What happens on the Sun is critical to understanding how it shapes the space around us. Most of the material that escapes the Sun is part of the solar wind, a continual outflow of solar material that bathes the entire solar system. This ionized gas, called plasma, carries with it the Sun's magnetic field, stretching it out through the solar system in a giant bubble that spans more than 10 billion miles.

The dynamic solar wind

Observed near Earth, the solar wind is a relatively uniform flow of plasma, with occasional turbulent tumbles. But by that point it's traveled over ninety million miles -- and the signatures of the Sun's exact mechanisms for heating and accelerating the solar wind are wiped out. Closer to the solar wind's source, Parker Solar Probe saw a much different picture: a complicated, active system.

"The complexity was mind-blowing when we first started looking at the data," said Stuart Bale, the University of California, Berkeley, lead for Parker Solar Probe's FIELDS instrument suite, which studies the scale and shape of electric and magnetic fields. "Now, I've gotten used to it. But when I show colleagues for the first time, they're just blown away." From Parker's vantage point 15 million miles from the Sun, Bale explained, the solar wind is much more impulsive and unstable than what we see near Earth.

Like the Sun itself, the solar wind is made up of plasma, where negatively charged electrons have separated from positively charged ions, creating a sea of free-floating particles with individual electric charge. These free-floating particles mean plasma carries electric and magnetic fields, and changes in the plasma often make marks on those fields. The FIELDS instruments surveyed the state of the solar wind by measuring and carefully analyzing how the electric and magnetic fields around the spacecraft changed over time, along with measuring waves in the nearby plasma.

These measurements showed quick reversals in the magnetic field and sudden, faster-moving jets of material -- all characteristics that make the solar wind more turbulent. These details are key to understanding how the wind disperses energy as it flows away from the Sun and throughout the solar system.

One type of event in particular drew the eye of the science teams: flips in the direction of the magnetic field, which flows out from the Sun, embedded in the solar wind. These reversals -- dubbed "switchbacks" -- last anywhere from a few seconds to several minutes as they flow over Parker Solar Probe. During a switchback, the magnetic field whips back on itself until it is pointed almost directly back at the Sun. Together, FIELDS and SWEAP, the solar wind instrument suite led by the University of Michigan and managed by the Smithsonian Astrophysical Observatory, measured clusters of switchbacks throughout Parker Solar Probe's first two flybys.

"Waves have been seen in the solar wind from the start of the space age, and we assumed that closer to the Sun the waves would get stronger, but we were not expecting to see them organize into these coherent structured velocity spikes," said Justin Kasper, principal investigator for SWEAP -- short for Solar Wind Electrons Alphas and Protons -- at the University of Michigan in Ann Arbor. "We are detecting remnants of structures from the Sun being hurled into space and violently changing the organization of the flows and magnetic field. This will dramatically change our theories for how the corona and solar wind are being heated."

The exact source of the switchbacks isn't yet understood, but Parker Solar Probe's measurements have allowed scientists to narrow down the possibilities.

Among the many particles that perpetually stream from the Sun are a constant beam of fast-moving electrons, which ride along the Sun's magnetic field lines out into the solar system. These electrons always flow strictly along the shape of the field lines moving out from the Sun, regardless of whether the north pole of the magnetic field in that particular region is pointing towards or away from the Sun. But Parker Solar Probe measured this flow of electrons going in the opposite direction, flipping back towards the Sun -- showing that the magnetic field itself must be bending back towards the Sun, rather than Parker Solar Probe merely encountering a different magnetic field line from the Sun that points in the opposite direction. This suggests that the switchbacks are kinks in the magnetic field -- localized disturbances traveling away from the Sun, rather than a change in the magnetic field as it emerges from the Sun.

Parker Solar Probe's observations of the switchbacks suggest that these events will grow even more common as the spacecraft gets closer to the Sun. The mission's next solar encounter on Jan. 29, 2020, will carry the spacecraft nearer to the Sun than ever before, and may shed new light on this process. Not only does such information help change our understanding of what causes the solar wind and space weather around us, it also helps us understand a fundamental process of how stars work and how they release energy into their environment.

The rotating solar wind

Some of Parker Solar Probe's measurements are bringing scientists closer to answers to decades-old questions. One such question is about how, exactly, the solar wind flows out from the Sun.

Near Earth, we see the solar wind flowing almost radially -- meaning it's streaming directly from the Sun, straight out in all directions. But the Sun rotates as it releases the solar wind; before it breaks free, the solar wind spins along with it. This is a bit like children riding on a playground carousel -- the atmosphere rotates with the Sun much like the outer part of the carousel rotates, but the farther you go from the center, the faster you are moving in space. A child on the edge might jump off and would, at that point, move in a straight line outward, rather than continuing to rotate. In a similar way, there's some point between the Sun and Earth where the solar wind transitions from rotating along with the Sun to flowing directly outwards, or radially, like we see from Earth.

Exactly where the solar wind transitions from a rotational flow to a perfectly radial flow has implications for how the Sun sheds energy. Finding that point may help us better understand the lifecycle of other stars or the formation of protoplanetary disks, the dense disks of gas and dust around young stars that eventually coalesce into planets.

Now, for the first time -- rather than just the nearly radial flow we see near Earth -- Parker Solar Probe was able to observe the solar wind while it was still rotating. It's as if Parker Solar Probe got a view of the whirling carousel directly for the first time, not just the children jumping off it. Parker Solar Probe's solar wind instrument detected rotation starting more than 20 million miles from the Sun, and as Parker approached its perihelion point, the speed of the rotation increased. The rotational flow was stronger than many scientists had predicted, but it also transitioned more quickly than predicted to an outward flow, which is what helps mask these effects from where we usually sit, about 93 million miles from the Sun.
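For a rough upper bound on that rotational flow, one can compute the tangential speed the wind would have if it still rotated rigidly with the Sun at the distance where rotation was first detected. The ~25-day rotation period below is an assumed round figure, and real flows are slower than this strict-corotation limit:

    import math

    omega = 2 * math.pi / (25.4 * 86400)  # solar rotation rate, rad/s (assumed ~25.4-day period)
    r = 20e6 * 1609.34                    # 20 million miles in metres
    v_corotation = omega * r              # tangential speed for strict corotation
    print(f"{v_corotation / 1000:.0f} km/s")  # ~90 km/s upper bound at that distance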

"The large rotational flow of the solar wind seen during the first encounters has been a real surprise," said Kasper. "While we hoped to eventually see rotational motion closer to the Sun, the high speeds we are seeing in these first encounters is nearly ten times larger than predicted by the standard models."

Dust near the Sun

Another question approaching an answer is the elusive dust-free zone. Our solar system is awash in dust -- the cosmic crumbs of collisions that formed planets, asteroids, comets and other celestial bodies billions of years ago. Scientists have long suspected that, close to the Sun, this dust would be heated to high temperatures by powerful sunlight, turning it into a gas and creating a dust-free region around the Sun. But no one had ever observed it.

For the first time, Parker Solar Probe's imagers saw the cosmic dust begin to thin out. Because WISPR -- Parker Solar Probe's imaging instrument, led by the Naval Research Lab -- looks out the side of the spacecraft, it can see wide swaths of the corona and solar wind, including regions closer to the Sun. These images show dust starting to thin a little over 7 million miles from the Sun, and this decrease in dust continues steadily to the current limits of WISPR's measurements at a little over 4 million miles from the Sun.

"This dust-free zone was predicted decades ago, but has never been seen before," said Russ Howard, principal investigator for the WISPR suite -- short for Wide-field Imager for Solar Probe -- at the Naval Research Laboratory in Washington, D.C. "We are now seeing what's happening to the dust near the Sun."

At the rate of thinning, scientists expect to see a truly dust-free zone starting a little more than 2-3 million miles from the Sun -- meaning Parker Solar Probe could observe the dust-free zone as early as 2020, when its sixth flyby of the Sun will carry it closer to our star than ever before.

Putting space weather under a microscope

Parker Solar Probe's measurements have given us a new perspective on two types of space weather events: energetic particle storms and coronal mass ejections.

Tiny particles -- both electrons and ions -- are accelerated by solar activity, creating storms of energetic particles. Events on the Sun can send these particles rocketing out into the solar system at nearly the speed of light, meaning they reach Earth in under half an hour and can impact other worlds on similarly short time scales. These particles carry a lot of energy, so they can damage spacecraft electronics and even endanger astronauts, especially those in deep space, outside the protection of Earth's magnetic field -- and the short warning time for such particles makes them difficult to avoid.
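The "under half an hour" figure is simple travel-time arithmetic over the Sun-Earth distance. A quick sketch, assuming 93 million miles and an illustrative particle speed of half the speed of light (the article gives no exact speeds):

    distance = 93e6 * 1609.34   # Sun-Earth distance, metres
    c = 3.0e8                   # speed of light, m/s
    for fraction in (1.0, 0.5):
        minutes = distance / (fraction * c) / 60
        print(f"{fraction:.1f}c -> {minutes:.0f} min")  # ~8 min at c, ~17 min at 0.5c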

Understanding exactly how these particles are accelerated to such high speeds is crucial. But even though they zip to Earth in as little as a few minutes, that's still enough time for the particles to lose the signatures of the processes that accelerated them in the first place. By whipping around the Sun at just a few million miles away, Parker Solar Probe can measure these particles just after they've left the Sun, shedding new light on how they are released.

Already, Parker Solar Probe's ISʘIS instruments, led by Princeton University, have measured several never-before-seen energetic particle events -- events so small that all trace of them is lost before they reach Earth or any of our near-Earth satellites. These instruments have also measured a rare type of particle burst with a particularly high number of heavier elements -- suggesting that both types of events may be more common than scientists previously thought.

"It's amazing -- even at solar minimum conditions, the Sun produces many more tiny energetic particle events than we ever thought," said David McComas, principal investigator for the Integrated Science Investigation of the Sun suite, or IS?IS, at Princeton University in New Jersey. "These measurements will help us unravel the sources, acceleration, and transport of solar energetic particles and ultimately better protect satellites and astronauts in the future."

Data from the WISPR instruments also provided unprecedented detail on structures in the corona and solar wind -- including coronal mass ejections, billion-ton clouds of solar material that the Sun sends hurtling out into the solar system. CMEs can trigger a range of effects on Earth and other worlds, from sparking auroras to inducing electric currents that can damage power grids and pipelines. WISPR's unique perspective, looking alongside such events as they travel away from the Sun, has already shed new light on the range of events our star can unleash.

"Since Parker Solar Probe was matching the Sun's rotation, we could watch the outflow of material for days and see the evolution of structures," said Howard. "Observations near Earth have made us think that fine structures in the corona segue into a smooth flow, and we're finding out that's not true. This will help us do better modeling of how events travel between the Sun and Earth."

As Parker Solar Probe continues on its journey, it will make 21 more close approaches to the Sun at progressively closer distances, culminating in three orbits a mere 3.83 million miles from the solar surface.

Read more at Science Daily

Dec 4, 2019

Migraine headaches? Consider aspirin for treatment and prevention

Migraine headache is the third most common disease in the world, affecting about 1 in 7 people. More prevalent than diabetes, epilepsy and asthma combined, migraine headaches are among the most common and potentially debilitating disorders encountered by primary health care providers. Migraines are also associated with an increased risk of stroke.

There are effective prescription medications available to treat acute migraine headaches as well as to prevent recurrent attacks. Nonetheless, in the United States many patients are not adequately treated for reasons that include limited access to health care providers and lack of health insurance or high co-pays, which make expensive medications of proven benefit unaffordable. The rates of uninsured or underinsured individuals have been estimated to be 8.5 percent nationwide and 13 percent in Florida. Furthermore, for all patients, the prescription drugs may be poorly tolerated or contraindicated.

Researchers from Florida Atlantic University's Schmidt College of Medicine have proposed aspirin as a possible option for consideration by primary care providers who treat the majority of patients with migraine. Their review includes evidence from 13 randomized trials of the treatment of migraine in 4,222 patients and tens of thousands of patients in prevention of recurrent attacks.

Their findings, published in the American Journal of Medicine, suggest that high-dose aspirin, in doses from 900 to 1,300 milligrams given at the onset of symptoms, is an effective and safe treatment option for acute migraine headaches. In addition, some but not all randomized trials suggest the possibility that daily aspirin in doses from 81 to 325 milligrams may be an effective and safe treatment option for the prevention of recurrent migraine headaches.

"Our review supports the use of high dose aspirin to treat acute migraine as well as low dose daily aspirin to prevent recurrent attacks," said Charles H. Hennekens, M.D., Dr.PH, corresponding author, first Sir Richard Doll Professor and senior academic advisor in FAU's Schmidt College of Medicine. "Moreover, the relatively favorable side effect profile of aspirin and extremely low costs compared with other prescription drug therapies may provide additional clinical options for primary health care providers treating acute as well as recurrent migraine headaches."

Common symptoms of migraine include a headache that often begins as a dull pain and then grows into a throbbing pain, which can be incapacitating and often occurs with nausea and vomiting, and sensitivity to sound, light and smell. Migraines can last anywhere from four to 72 hours and may occur anywhere from several times a week to only once a year.

"Migraine headaches are among the most common and potentially debilitating disorders encountered by primary health care providers," said Bianca Biglione, first author and a second-year medical student in FAU's Schmidt College of Medicine. "In fact, about 1 in 10 primary care patients present with headache and three out of four are migraines. Aspirin is readily available without a prescription, is inexpensive, and based on our review, was shown to be effective in many migraine patients when compared with alternative more expensive therapies."

Approximately 36 million Americans suffer from migraine headaches and the cause of this disabling disorder is not well understood. There is a higher prevalence in women (18 percent) than men (9 percent). In women, the prevalence is highest during childbearing age.

Read more at Science Daily

Fake news feels less immoral to share when we've seen it before

People who repeatedly encounter a fake news item may feel less and less unethical about sharing it on social media, even when they don't believe the information, research indicates.

In a series of experiments involving more than 2,500 people, Daniel A. Effron, a London Business School associate professor of organizational behavior, and Medha Raj, a PhD student at the University of Southern California, found that seeing a fake headline just once leads individuals to temper their disapproval of the misinformation when they see it a second, third, or fourth time.

The findings, published in Psychological Science, have important implications for policymakers and social media companies trying to curb the spread of misinformation online, Effron says.

"We suggest that efforts to fight misinformation should consider how people judge the morality of spreading it, not just whether they believe it," he says.

Across five experiments, Effron and Raj asked online survey participants to rate how unethical or acceptable they thought it would be to publish a fake headline, and how likely they would be to "like," share, and block or unfollow the person who posted it.

As they expected, the researchers found that participants rated headlines they had seen more than once as less unethical to publish than headlines they were shown for the first time. Participants also said they were more likely to "like" and share a previously seen headline and less likely to block or unfollow the person who posted it. What's more, they did not rate previously seen headlines as significantly more accurate than new ones.

"Thus, our main results cannot be explained by a tendency to misremember false headlines as true," the researchers write.

Effron and Raj note that efforts to curtail misinformation typically focus on helping people distinguish fact from fiction. Facebook, for example, has tried informing users when they try to share news that fact-checkers have flagged as false. But such strategies may fail if users feel more comfortable sharing misinformation they know is fake when they have seen it before.

The researchers theorize that repeating misinformation lends it a "ring of truthfulness" that can increase people's tendency to give it a moral pass, regardless of whether they believe it. Merely imagining misinformation as if it were true can have a similar effect. Effron's earlier research shows that people are more likely to excuse a blatant falsehood after imagining how it could have been true if the past had been different.

"The results should be of interest to citizens of contemporary democracies," Effron adds. "Misinformation can stoke political polarization and undermine democracy, so it is important for people to understand when and why it spreads."

Read more at Science Daily

A common drug could help restore limb function after spinal cord injury

Long-term treatment with gabapentin, a commonly prescribed drug for nerve pain, could help restore upper limb function after a spinal cord injury, new research in mice suggests.

In the study, mice treated with gabapentin regained roughly 60 percent of forelimb function in a skilled walking test, compared to restoration of approximately 30 percent of forelimb function in mice that received a placebo.

The drug blocks activity of a protein that has a key role in the growth process of axons, the long, slender extensions of nerve cell bodies that transmit messages. The protein stops axon growth at times when synapses form, allowing transmission of information to another nerve cell.

The research showed that gabapentin blocks the protein from putting on its brakes, which effectively allowed axons to grow longer after injury.

"There is some spontaneous recovery in untreated mice, but it's never complete. The treated mice still have deficits, but they are significantly better," said senior author Andrea Tedeschi, assistant professor of neuroscience at The Ohio State University.

"This research has translational implications because the drug is clinically approved and already prescribed to patients," he said. "I think there's enough evidence here to reconsider how we use this drug in the clinic. The implication of our finding may also impact other neurological conditions such as brain injury and stroke."

The regained function in mice occurred after four months of treatment -- the equivalent of about nine years in adult humans.
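
(A rough check of that conversion, using illustrative lifespan figures rather than numbers from the study: if a laboratory mouse lives roughly three years and a human roughly 80, the lifespan ratio is about 27, so four months of mouse treatment scaled by that factor corresponds to roughly 107 months, or about nine years, of human time.)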

"We really have to consider that rebuilding neuronal circuits, especially in an adult central nervous system, takes time. But it can happen," said Wenjing Sun, research assistant professor of neuroscience at Ohio State and first author of the publication.

The study is published in the Journal of Clinical Investigation.

The spinal cord injury in these mice is located near the top of the spine. Humans with this type of injury generally lose enough sensation and movement to require assistance with daily living tasks.

After receiving gabapentin for four months, the treated mice were better able to move across a horizontal ladder and spread their forelimb toes than untreated mice. When the researchers used a special technique to silence neurons in the repair pathway they had targeted, there was no difference in functional recovery between treated and untreated mice.

"Now we can comfortably say that whatever we see in terms of structural and functional alterations of this motor pathway is really meaningful in promoting recovery in these mice," Tedeschi said.

Tedeschi noted that in this study, treatment with gabapentin occurred much earlier than is typical in human medicine, when it is prescribed to treat existing neuropathic pain and other neurological conditions.

"Gabapentin is given when the nervous system is already having issues associated with maladaptive plasticity that hinders normal function. We are giving it much, much earlier, when the nervous system may be more responsive to programming an adaptive repair process," he said.

A retrospective study of European medical data published in 2017 showed that individuals who had received anticonvulsants -- gabapentin or a similar drug -- early after spinal cord injury regained motor function. It was not a clinical trial, but the analysis showed an association between taking a class of drugs called gabapentinoids and regaining muscle strength.

Plenty of questions remain: how and when to adjust the amount of gabapentin used for treatment, and whether the drug could be combined with other interventions used to promote repair of an injured spinal cord at chronic stages. But testing the effectiveness of the drug in larger animal models is a logical next step prior to embarking on clinical trials, Tedeschi said.

"With all the evidence and mechanistic insight we provide, I feel like we are in a better situation to start planning a more translational type of research," he said. "It's the right time to try."

Tedeschi's research focuses on neurons in the corticospinal tract -- specifically motor neurons that carry signals from the central nervous system to the body telling muscles to move. These cells are particularly important in controlling voluntary movement, which is impaired in cervical spinal cord injuries modeled in the study.

This work builds upon the recent discovery of the regulatory role of a neuronal receptor called alpha2delta2 in controlling axons' ability to grow. Tedeschi and colleagues have determined that alpha2delta2 facilitates synapse formation by putting the brakes on axon growth, an essential step during the development of the central nervous system.

The researchers discovered in the current study that after a cervical spinal cord injury, affected motor neurons above the spine increased the expression of this receptor, interfering with axons' ability to regrow. If axon repair doesn't go as expected and neuronal circuits are reorganized improperly, individuals with spinal cord injury may experience uncontrolled movement and pain.

"When neuronal circuits need to be rebuilt after injury, we need to down-regulate the expression of the receptor so axons can re-engage in an active growth program. And we found that it's doing exactly the opposite," said Tedeschi, also a member of Ohio State's Chronic Brain Injury Discovery Theme.

"Because this receptor can be pharmacologically blocked through administration of clinically approved drugs called gabapentinoids -- for example, gabapentin and pregabalin -- that's a very powerful target that you can modulate as long as you take the drug."

Read more at Science Daily

Mystery of how early animals survived ice age

How did life survive the most severe ice age? A McGill University-led research team has found the first direct evidence that glacial meltwater provided a crucial lifeline to eukaryotes during Snowball Earth, when the oceans were cut off from life-giving oxygen, answering a question puzzling scientists for years.

In a new study published in the Proceedings of the National Academy of Sciences, researchers studied iron-rich rocks left behind by glacial deposits in Australia, Namibia, and California to get a window into the environmental conditions during the ice age. Using geological maps and clues from locals, they hiked to rock outcrops, navigating challenging trails to track down the rock formations.

By examining the chemistry of the iron formations in these rocks, the researchers were able to estimate the amount of oxygen in the oceans around 700 million years ago and better understand the effects this would have had on all oxygen-dependent marine life, including the earliest animals like simple sponges.

"The evidence suggests that although much of the oceans during the deep freeze would have been uninhabitable due to a lack of oxygen, in areas where the grounded ice sheet begins to float there was a critical supply of oxygenated meltwater. This trend can be explained by what we call a 'glacial oxygen pump'; air bubbles trapped in the glacial ice are released into the water as it melts, enriching it with oxygen," says Maxwell Lechte, a postdoctoral researcher in the Department of Earth and Planetary Sciences under the supervision of Galen Halverson at McGill University.

Around 700 million years ago, the Earth experienced the most severe ice age of its history, threatening the survival of much of the planet's life. Previous research has suggested that oxygen-dependent life may have been restricted to meltwater puddles on the surface of the ice, but this study provides new evidence of oxygenated marine environments.

"The fact that the global freeze occurred before the evolution of complex animals suggests a link between Snowball Earth and animal evolution. These harsh conditions could have stimulated their diversification into more complex forms," says Lechte, who is also the study's lead author.

Lechte points out that while the findings focus on the availability of oxygen, primitive eukaryotes would also have needed food to survive the harsh conditions of the ice age. Further research is needed to explore how these environments might have sustained a food web. A starting point might be modern ice environments that host complex ecosystems today.

"This study actually solves two mysteries about the Snowball Earth at once. It not only provides explanation for how early animals may have survived global glaciation, but also eloquently explains the return of iron deposits in the geological record after an absence of over a billion years," says Professor Galen Halverson.

Read more at Science Daily

Dec 3, 2019

Cracking 60-year-old mystery of Sun's magnetic waves

A Queen's University Belfast scientist has led an international team to a ground-breaking discovery: why the Sun's magnetic waves strengthen and grow as they emerge from its surface. The finding could help to solve the mystery of how the Sun's corona maintains its multi-million-degree temperatures.

For more than 60 years, observations of the Sun have shown that its magnetic waves grow in strength as they leave the interior, but until now there has been no solid observational evidence as to why this is the case.

The corona's high temperatures have also always been a mystery. Usually the closer we are to a heat source, the warmer we feel. However, this is the opposite of what seems to happen on the Sun -- its outer layers are warmer than the heat source at its surface.

Scientists have accepted for a long time that magnetic waves channel energy from the Sun's vast interior energy reservoir, which is powered by nuclear fusion, up into the outer regions of its atmosphere. Therefore, understanding how the wave motion is generated and spread throughout the Sun is of huge importance to researchers.

The team, which was led by Queen's, included 13 scientists, spanning five countries and 11 research institutes including University of Exeter; Northumbria University; the European Space Agency; Instituto de Astrofísica de Canarias, Spain; University of Oslo, Norway; the Italian Space Agency and California State University Northridge, USA.

The experts formed a consortium called "Waves in the Lower Solar Atmosphere (WaLSA)" to carry out the research and used advanced high-resolution observations from the National Science Foundation's Dunn Solar Telescope, New Mexico, to study the waves.

Dr David Jess from the School of Mathematics and Physics at Queen's led the team of experts. He explains: "This new understanding of wave motion may help scientists uncover the missing piece in the puzzle of why the outer layers of the Sun are hotter than its surface, despite being further from the heat source.

"By breaking the Sun's light up into its basic colours, we were able to examine the behaviour of certain elements from the periodic table within its atmosphere, including silicon (formed close to the Sun's surface), calcium and helium (formed in the chromosphere where the wave amplification is most apparent).

"The variations in the elements allowed the speeds of the Sun's plasma to be uncovered. The timescales over which they evolve were benchmarked, which allowed the wave frequencies of the Sun to be recorded. This is similar to how a complex musical ensemble is deconstructed into basic notes and frequencies by visualising its musical score."

The team then used supercomputers to analyse the data through simulations. They found that the wave amplification process can be attributed to the formation of an 'acoustic resonator', where significant changes in temperature between the surface of the Sun and its outer corona create boundaries that are partially reflective and act to trap the waves, allowing them to intensify and dramatically grow in strength.

The experts also found that the thickness of the resonance cavity -- the distance between the significant temperature changes -- is one of the main factors governing the characteristics of the detected wave motion.
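
To give a feel for how cavity thickness shapes the trapped waves, one can use the idealized resonator relation f_n ≈ n c_s / (2L), where c_s is the sound speed and L the cavity thickness; the sketch below plugs in round, illustrative numbers rather than values taken from the study.

# Toy acoustic-resonator estimate: resonant frequencies of an idealized cavity
c_s = 10.0e3   # assumed sound speed in the cavity, m/s (illustrative)
L = 1.5e6      # assumed cavity thickness, m (roughly 1500 km, illustrative)
for n in (1, 2, 3):
    f = n * c_s / (2.0 * L)  # n-th resonant frequency, Hz
    print(f"mode {n}: {f * 1e3:.1f} mHz, period about {1.0 / f:.0f} s")
# The lowest mode works out to about 3.3 mHz (a period of roughly five minutes),
# and a thicker cavity (larger L) would push all of the resonant frequencies lower.

The only point of the sketch is the direction of the dependence: changing the separation between the partially reflective temperature boundaries changes which wave frequencies the cavity favours, which is the kind of control the authors describe.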

Dr Jess comments: "The effect that we have found through the research is similar to how an acoustic guitar changes the sound it emits through the shape of its hollow body. If we think of this analogy we can see how the waves captured in the Sun can grow and change as they exit its surface and move towards the outer layers and exterior."

Read more at Science Daily

How does language emerge?

How the languages of the world emerged is largely a mystery. Considering that it might have taken millennia, it is intriguing to see how deaf people can create novel sign languages spontaneously. Observations have shown that when deaf strangers are brought together in a community, they come up with their own sign language in a remarkably short amount of time. The most famous example of this is Nicaraguan Sign Language, which emerged in the 1980s. Interestingly, children played an important role in the development of these novel languages. However, how exactly this happened has not been documented, as Manuel Bohn describes: "We know relatively little about how social interaction becomes language. This is where our new study comes in."

In a series of studies, researchers at the Leipzig Research Centre for Early Childhood Development and the Max Planck Institute for Evolutionary Anthropology attempted to recreate exactly this process. The idea had been around for quite some time, says Gregor Kachel. But there was a problem: how do you get children to communicate with each other without them simply talking to each other? The solution emerged in Skype conversations between the two researchers from Germany and their colleague Michael Tomasello in the US. In the study, children were invited to stay in two different rooms and a Skype connection was established between them. After a brief familiarization with the set-up, the researchers sneakily turned off the sound and watched as the children found new ways of communicating that went beyond spoken language.

The children's task was to describe an image with different motifs in a coordination game. With concrete things -- like a hammer or a fork -- children quickly found a solution by imitating the corresponding action (e.g. eating) in a gesture. But the researchers repeatedly challenged the children with new, more abstract pictures. For example, they introduced a white sheet of paper as a picture. The depicted "nothing" is difficult to imitate. Kachel describes how two children nevertheless mastered this task: "The sender first tried all sorts of different gestures, but her partner let her know that she did not know what was meant. Suddenly our sender pulled her T-shirt to the side and pointed to a white dot on her coloured T-shirt. The two had a real breakthrough: of course! White! Like the white paper! Later, when the roles were switched, the recipient didn't have a white spot on her T-shirt, but she nevertheless took the same approach: she pulled her T-shirt to the side and pointed to it. Immediately her partner knew what to do." Within a very short time, the two had established a sign for an abstract concept. In the course of the study, the images to be depicted became more and more complex, which was also reflected in the gestures that the children produced. In order to communicate, for example, an interaction between two animals, children invented separate gestures for actors and actions and began to combine them -- thus creating a kind of small local grammar.

How does a language emerge? Based on the present study, the following steps appear plausible: first, people create reference to actions and objects via signs that resemble things. The prerequisite for this is a common ground of experience between interaction partners. Partners also coordinate by imitating each other such that they use the same signs for the same things. The signs thus gain interpersonal and eventually conventional meaning. Over time, the relationships between the signs and things become more abstract and the meaning of the individual signs more specific. Grammatical structures are gradually introduced when there is a need to communicate more complex facts. However, the most remarkable aspect of the current studies is that these processes can be observed under controlled circumstances and within 30 minutes.

Read more at Science Daily

Eating in sync with biological clock could replace problematic diabetes treatment

Type 2 diabetics inject themselves with insulin, a hormone that regulates the movement of sugar into liver, muscle and fat cells, up to four times a day. But insulin injections are linked to weight gain and the loss of control of blood sugar levels. This triggers a vicious cycle of higher insulin doses, continuous weight gain, a higher incidence of cardiovascular disease and other complications.

A new Tel Aviv University study finds that a starch-rich breakfast consumed early in the morning coupled with a small dinner could replace insulin injections and other diabetes medications for many diabetics.

"The traditional diabetic diet specifies six small meals spread throughout the day. But our research proposes shifting the starch-rich calories to the early hours of the day. This produces a glucose balance and improved glycemic control among type 2 diabetics," explains Prof. Daniela Jakubowicz of TAU's Sackler Faculty of Medicine and Wolfson Medical Center's Diabetes Unit. "We believe that through this regimen it will be possible for diabetics to significantly reduce or even stop the injections of insulin, and most of antidiabetic medications, to achieve excellent control of glucose levels."

Prof. Jakubowicz is the lead author of the study, the result of a collaboration with Prof. Julio Wainstein and Dr. Zohar Landau of Wolfson Medical Center's Diabetes Unit and Prof. Oren Froy and Dr. Shani Tsameret of the Hebrew University of Jerusalem. The research was published in Diabetes Care in December.

According to the new research, our metabolism and biological clock are optimized for eating in the morning and for fasting during the evening and night, when we are supposed to be asleep. "But the usual diet recommended for type 2 diabetes consists of several small meals evenly distributed throughout the day -- for example, three meals and three snacks daily, including a snack before going to sleep to prevent a drop in sugar levels during the night," Prof. Jakubowicz says.

"But the '6M-diet,' as this is called, has not been effective for sugar control, so diabetics require additional medication and insulin. And insulin injections lead to weight gain, which further increases blood sugar levels," Prof. Jakubowicz adds.

The researchers studied 29 participants with type 2 diabetes and compared a new "3M-diet," more in alignment with our biological clock, with a control group on the traditional 6M-diet. The experimental 3M-diet comprises a meal of bread, fruits and sweets in the early hours of the morning; a substantial lunch; and a small dinner specifically lacking starches, sweets and fruits.

The group on the traditional 6M-diet did not lose weight and did not experience any improvement of sugar levels, requiring an increase in medication and insulin doses. But the group on the 3M-diet not only lost weight but also experienced substantially improved sugar levels.

"Their need for diabetic medication, especially for insulin doses, dipped substantially. Some were even able to stop using insulin altogether," adds Prof. Jakubowicz. "In addition, the 3M-diet improved the expression of biological clock genes. This suggests that the 3M-diet is not only more effective in controlling diabetes. It may also prevent many other complications such as cardiovascular disease, aging and cancer, which are all regulated by the biological clock genes."

Read more at Science Daily

Bacterial communities 'hitchhiking' on marine plastic trash

Millions of tons of plastic trash are fouling the world's ocean, most of it tiny pieces of microplastic less than a quarter-inch in size. Even the smallest marine animals can ingest these microplastics, potentially threatening their survival.

Marine microplastics aren't floating solo, either -- they quickly pick up a thin coating of bacteria and other microbes, a biofilm known as "The Plastisphere." These biofilms can influence the microplastics' fate -- causing them to sink or float, or breaking them down into even tinier bits, for example. They can even make the plastic smell or taste like food to some marine organisms. But very little is known about what kinds of microbes are in the Plastisphere, and how they interact with one another and the plastic.

Now, using an innovative microscopy method developed at the Marine Biological Laboratory (MBL), Woods Hole, scientists have revealed the structure of the microbial communities coating microplastic samples from a variety of ocean sites. The team, led by Linda Amaral-Zettler (who coined the term "Plastisphere"), Jessica Mark Welch, and Cathleen Schlundt, reports its results this week in Molecular Ecology Resources.

The MBL team built upon a fluorescence imaging technique developed by Mark Welch and colleagues to literally see the spatial organization of microbes on the plastic samples. They did so by designing probes that fluorescently lit up and targeted major, known bacterial groups in the Plastisphere.

"We now have a toolkit that enables us to understand the spatial structure of the Plastisphere and, combined with other methods, a better future way to understand the Plastisphere's major microbial players, what they are doing, and their impact on the fate of plastic litter in the ocean," said Amaral-Zettler, a MBL Fellow from the NIOZ Royal Netherlands Institute for Sea Research and the University of Amsterdam.

The scientists saw diatoms and bacteria colonizing the microplastics, dominated in all cases by three phyla: Proteobacteria, Cyanobacteria, and Bacteroidetes. Spatially, the Plastisphere microbial communities were heterogeneously mixed, providing the first glimpse of bacterial interactions on marine microplastics.

Mark Welch and colleagues have previously applied their imaging technology to study microbial communities in the human mouth and in the digestive tract of cuttlefish and vertebrates.

Read more at Science Daily

Dec 2, 2019

Global levels of biodiversity could be lower than we think, new study warns

Biodiversity across the globe could be in a worse state than previously thought as current biodiversity assessments fail to take into account the long-lasting impact of abrupt land changes, a new study has warned.

The study by PhD graduate Dr Martin Jung, Senior Lecturer in Geography Dr Pedram Rowhani and Professor of Conservation Science Jörn Scharlemann, all at the University of Sussex, shows that fewer species and fewer individuals are observed at sites that have been disturbed by an abrupt land change in past decades.

The authors warn that areas subjected to deforestation or agricultural intensification can take at least ten years to recover, and in the meantime show reduced species richness and abundance.

With current biodiversity assessments failing to take into account the impacts of past land changes, the researchers believe that the natural world could be in a far worse state than currently thought.

Lead author, Dr Martin Jung said: "These findings show that recent abrupt land changes, like deforestation or intensification through agriculture, can cause even more impactful and long-lasting damage to biodiversity than previously thought.

"Our study shows that it can take at least ten or more years for areas which have undergone recent abrupt land changes to recover to levels comparable to undisturbed sites. This only strengthens the argument to limit the impacts of land change on biodiversity with immediate haste."

The study combined global data on biodiversity from the PREDICTS database, one of the largest databases of terrestrial plants, fungi and animals across the world, with quantitative estimates of abrupt land change detected using images from NASA's Landsat satellites from 1982 to 2015.

Comparing numbers of plants, fungi and animals at 5,563 disturbed sites with those at 10,102 undisturbed sites across the world from Africa to Asia, the researchers found that biodiversity remains affected by a land change event for several years after it has occurred, due to a lag effect.

Species richness and abundance were found to be 4.2% and 2% lower, respectively, at sites where an abrupt land change had occurred.

In addition, the impacts on species were found to be greater if land changes had occurred more recently, and caused greater changes in vegetation cover. At sites that had land changes in the last five years, there were around 6.6% fewer species observed.

However, at sites where a land change had taken place 10 or more years ago, species richness and abundance were indistinguishable from sites without a past land change in the same period, indicating that biodiversity can recover after such disturbances.
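
At its simplest, the headline figures are percentage differences in average richness and abundance between disturbed and undisturbed sites; a minimal sketch of that comparison is below, using hypothetical counts rather than the PREDICTS data and ignoring the statistical controls (for sampling method, region and so on) used in the actual analysis.

import statistics
# Hypothetical species-richness counts per surveyed site
undisturbed = [34, 41, 29, 38, 36, 40]
disturbed = [31, 39, 28, 37, 35, 38]  # sites with a recent abrupt land change
mean_undisturbed = statistics.mean(undisturbed)
mean_disturbed = statistics.mean(disturbed)
percent_lower = 100.0 * (mean_undisturbed - mean_disturbed) / mean_undisturbed
print(f"species richness is {percent_lower:.1f}% lower at the disturbed sites")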

Dr Jung explained: "For us, the results clearly indicate that regional and global biodiversity assessments need to consider looking back at the past in order to have more accurate results in the present.

"We've shown that remotely-sensed satellite data can assist in doing this in a robust way globally. Our framework can also be applied to habitat restoration and conservation prioritization assessments."

Read more at Science Daily

How ancient microbes created massive ore deposits, set stage for early life

New research in Science Advances is uncovering the vital role that Precambrian-eon microbes may have played in two of the early Earth's biggest mysteries.

University of British Columbia (UBC) researchers, and collaborators from the universities of Alberta, Tübingen, Autònoma de Barcelona and the Georgia Institute of Technology, found that ancestors of modern bacteria cultured from an iron-rich lake in the Democratic Republic of Congo could have been key to keeping Earth's dimly lit early climate warm, and to forming the world's largest iron ore deposits billions of years ago.

The bacteria have special chemical and physical features that in the complete absence of oxygen allow them to convert energy from sunlight into rusty iron minerals and into cellular biomass. The biomass ultimately causes the production of the potent greenhouse gas methane by other microbes.

"Using modern geomicrobiological techniques, we found that certain bacteria have surfaces which allow them to expel iron minerals, making it possible for them to export these minerals to the seafloor to make ore deposits," said Katharine Thompson, lead author of the study and PhD student in the department of microbiology and immunology.

"Separated from their rusty mineral products, these bacteria then go on to feed other microbes that make methane. That methane is what likely kept Earth's early atmosphere warm, even though the sun was much less bright than today."

This is a possible explanation for the 'faint young Sun' paradox, famously posed by astronomer Carl Sagan. The paradox is that there were liquid oceans on early Earth, yet heat budgets calculated from the early Sun's luminosity and modern atmospheric chemistry imply Earth should have been entirely frozen. A frozen Earth would not have supported much life. The idea that a methane-rich atmosphere formed in connection with large-scale iron ore deposits and early life was first proposed by University of Michigan atmospheric scientist James Walker in 1987. The new study provides strong physical evidence to support the theory and finds that microscale bacterial-mineral interactions were likely responsible.
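
The arithmetic behind the paradox can be sketched with the standard planetary energy-balance formula T = [S(1 - A) / (4σ)]^(1/4), where S is the solar flux reaching Earth, A the albedo and σ the Stefan-Boltzmann constant; the numbers below are common textbook values, not figures from this paper.

# Energy-balance sketch of the faint-young-Sun problem (illustrative textbook values)
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S_NOW = 1361.0     # present-day solar constant, W m^-2
ALBEDO = 0.3       # assumed planetary albedo

def equilibrium_temperature(solar_constant):
    return (solar_constant * (1.0 - ALBEDO) / (4.0 * SIGMA)) ** 0.25

print(equilibrium_temperature(S_NOW))         # about 255 K; today's greenhouse effect adds ~33 K
print(equilibrium_temperature(0.75 * S_NOW))  # early Sun at ~75% brightness: about 237 K
# Even adding a modern-strength greenhouse effect (~33 K) to 237 K leaves the surface
# below freezing, which is why extra greenhouse gases such as methane are invoked.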

"The fundamental knowledge we're gaining from studies using modern geomicrobiological tools and techniques is transforming our view of Earth's early history and the processes that led to a planet habitable by complex life including humans," said senior author of the paper, Sean Crowe, Canada Research Chair in Geomicrobiology and associate professor at UBC.

"This knowledge of the chemical and physical processes through which bacteria interact with their surroundings can also be used to develop and design new processes for resource recovery, novel building and construction materials, and new approaches to treating disease."

Read more at Science Daily

Why do we freeze when startled? New study in flies points to serotonin

A Columbia University study in fruit flies has identified serotonin as a chemical that triggers the body's startle response, the automatic deer-in-the-headlights reflex that freezes the body momentarily in response to a potential threat. Today's study reveals that when a fly experiences an unexpected change to its surroundings, such as a sudden vibration, release of serotonin helps to literally -- and temporarily -- stop the fly in its tracks.

These findings, published today in Current Biology, offer broad insight into the biology of the startle response, a ubiquitous, yet mysterious, phenomenon that has been observed in virtually every animal studied to date, from flies to fish to people.

"Imagine sitting in your living room with your family and -- all of a sudden -- the lights go out, or the ground begins to shake," said Richard Mann, PhD, a principal investigator at Columbia's Mortimer B. Zuckerman Mind Brain Behavior Institute and the paper's senior author. "Your response, and that of your family, will be the same: You will stop, freeze and then move to safety. With this study, we show in flies that a rapid release of the chemical serotonin in their nervous system drives that initial freeze. And because serotonin also exists in people, these findings shed light on what may be going on when we get startled as well."

In the brain, serotonin is most closely associated with regulating mood and emotion. But previous research on flies and vertebrates has shown it can also affect the speed of an animal's movement. The Columbia researchers' initial goal was to more fully understand how the chemical accomplished this.

The team first analyzed fruit fly steps using FlyWalker, an apparatus developed by Dr. Mann and Columbia physicist Szabolcs Marka, PhD, to track an insect's steps on a special type of glass. After monitoring how the flies moved, the scientists manipulated the levels of serotonin -- and another chemical called dopamine -- in the fly's ventral nerve cord (VNC), which is analogous to the vertebrate spinal cord.

Their initial results revealed that activating neurons that produce serotonin in the VNC slows flies down, while silencing those same neurons speeds flies up. Additional experiments showed that serotonin levels could impact the insects' walking speed under a wide variety of conditions, including different temperatures, when the flies were hungry, or while they walked upside down, all situations that normally affect walking speed.

"We witnessed serotonin's biggest effects when the flies experienced rapid environmental changes," said Clare Howard, PhD, the paper's first author. "In other words, when they were startled."

To further investigate, the research team devised two scenarios to elicit a fly's startle response. In the first, they turned the lights off: a total blackout for the insects. For the second, they simulated an earthquake.

To accomplish this, the scientists partnered with Tanya Tabachnik, Director of Advanced Instrumentation at Columbia's Zuckerman Institute. Tabachnik's team of machinists and engineers works with scientists to design and build customized systems for their research. For this study, they created a miniature, fly-sized arena perched atop specialized vibrating motors. Adjusting the motors' strength produced the desired earthquake effect. When the researchers exposed the flies to either the blackout or earthquake scenarios, they also manipulated the fly's ability to produce serotonin.

"We found that when a fly is startled in these scenarios, serotonin acts like an emergency brake; its release is needed for them to freeze, and that part of this response may be a result of stiffening both sides of the animal's leg joints," said Dr. Mann, who is also the Higgins Professor of Biochemistry and Molecular Biophysics (in Systems Biology) at Columbia's Vagelos College of Physicians and Surgeons. "This co-contraction could cause the brief pause in walking, after which the insect begins to move."

"We think this pause is important," added Dr. Howard, "It could allow the fly's nervous system to gather the information about this sudden change and decide how it should respond."

Interestingly, even though the flies' response in both scenarios was an immediate pause, their subsequent walking speeds differed significantly.

"After being startled in the blackout scenario, the fly's gait was slow and deliberate," Dr. Howard said. "But the earthquake caused the flies to walk faster after the initial pause."

While these findings are specific to fruit flies, the ubiquity of serotonin and the startle response provides clues as to the chemical and molecular processes that occur when more complex animals, including people, get startled.

Going forward, the researchers hope to further investigate serotonin's role in movement, as well as what other factors may be at play.

Read more at Science Daily

The coldest reaction

The coldest chemical reaction in the known universe took place in what appears to be a chaotic mess of lasers. The appearance deceives: Deep within that painstakingly organized chaos, in temperatures millions of times colder than interstellar space, Kang-Kuen Ni achieved a feat of precision. Forcing two ultracold molecules to meet and react, she broke and formed the coldest bonds in the history of molecular couplings.

"Probably in the next couple of years, we are the only lab that can do this," said Ming-Guang Hu, a postdoctoral scholar in the Ni lab and first author on their paper published today in Science. Five years ago, Ni, the Morris Kahn Associate Professor of Chemistry and Chemical Biology and a pioneer of ultracold chemistry, set out to build a new apparatus that could achieve the lowest temperature chemical reactions of any currently available technology. But they couldn't be sure their intricate engineering would work.

Now, they not only performed the coldest reaction yet, they discovered their new apparatus can do something even they did not predict. In such intense cold -- 500 nanokelvin or just a few millionths of a degree above absolute zero -- their molecules slowed to such glacial speeds, Ni and her team could see something no one has been able to see before: the moment when two molecules meet to form two new molecules. In essence, they captured a chemical reaction in its most critical and elusive act.

Chemical reactions are responsible for literally everything: breathing, cooking, digesting, creating energy, pharmaceuticals, and household products like soap. So, understanding how they work at a fundamental level could help researchers design combinations the world has never seen. With an almost infinite number of new combinations possible, these new molecules could have endless applications from more efficient energy production to new materials like mold-proof walls and even better building blocks for quantum computers.

In her previous work, Ni used colder and colder temperatures to work this chemical magic: forging molecules from atoms that would otherwise never react. Cooled to such extremes, atoms and molecules slow to a quantum crawl, their lowest possible energy state. There, Ni can manipulate molecular interactions with utmost precision. But even she could only see the start of her reactions: two molecules go in, but then what? What happened in the middle and the end was a black hole only theories could try to explain.

Chemical reactions occur in just millionths of a billionth of a second, better known in the scientific world as femtoseconds. Even today's most sophisticated technology can't capture something so short-lived, though some come close. In the last twenty years, scientists have used ultra-fast lasers like fast-action cameras, snapping rapid images of reactions as they occur. But they can't capture the whole picture. "Most of the time," Ni said, "you just see that the reactants disappear and the products appear in a time that you can measure. There was no direct measurement of what actually happened in these chemical reactions." Until now.

Ni's ultracold temperatures force reactions to a comparatively numbed speed. "Because [the molecules] are so cold," Ni said, "now we kind of have a bottleneck effect." When she and her team reacted two potassium rubidium molecules -- chosen for their pliability -- the ultracold temperatures forced the molecules to linger in the intermediate stage for microseconds. Microseconds -- mere millionths of a second -- may seem short, but that's millions of times longer than usual and long enough for Ni and her team to investigate the phase when bonds break and form, in essence, how one molecule turns into another.

With this intimate vision, Ni said she and her team can test theories that predict what happens in a reaction's black hole to confirm if they got it right. Then, her team can craft new theories, using actual data to more precisely predict what happens during other chemical reactions, even those that take place in the mysterious quantum realm.

Read more at Science Daily

Dec 1, 2019

Vision: Not seeing the trees for the wood

Researchers from the Netherlands Institute for Neuroscience have shown how it is possible that objects stand out less when they are surrounded by similar objects. This surround-suppression effect is caused by feedback from higher visual brain areas. The results of this research are important for a better understanding of the way in which the brain transforms incoming light into a cohesive image. The paper has been published in the scientific journal Current Biology.

The brain area responsible for processing vision is located at the back of the brain. One of the most important parts of this area, the primary visual cortex, is where a visual stimulus first reaches the cortex. Nerve cells in this area respond to objects within a very small part of the visual field. So when you look at a specific object, the nerve cells in the primary visual cortex are activated and you see this object. "But when this object is surrounded by similar objects, the cells are less active. So really what happens is that you don't see the trees for the wood," says Alexander Heimel, group leader at the Netherlands Institute for Neuroscience.

SURROUND-SUPPRESSION EFFECT

"The theory had previously yielded the idea that this surround-suppression effect was the result of signals from higher visual brain areas. But until recently there was not much scientific evidence for this," says principal researcher Joris Vangeneugden, aios at Maastricht University. In order to find out whether it really was a matter of higher visual brain areas signaling, the researchers measured mouse brain activity while the mouse was looking at images of different sizes. At the same time, the researchers managed to pause the higher visual areas for a couple of seconds. It turned out that the activity in the primary visual cortex remained high for the larger images when these higher visual areas were paused, while this did not happen when they were active. The suppression of the surroundings thus decreased. This shows that the higher areas do indeed provide some sort of feedback to the primary visual cortex. "They tell the primary visual cortex that it should focus on a small individual object, not on everything there is to see," says Heimel.

VISUAL PROSTHESIS

Understanding this step is necessary to understand, eventually, how the brain transforms the light that enters via our eyes into a perception that makes us understand what we see. "An understanding of how our brain does this is essential for the development of prosthetics that will make blind people see again. Merely ensuring that light reaches the brain does not always suffice; what happens after that is even more important," says Vangeneugden.

From Science Daily