Jul 20, 2019

Music can be a viable alternative to medications in reducing anxiety before anesthesia

Music is a viable alternative to sedative medications in reducing patient anxiety prior to an anesthesia procedure, according to a Penn Medicine study published today in the journal Regional Anesthesia & Pain Medicine.

A peripheral nerve block procedure is a type of regional anesthesia -- done in the preoperative area under ultrasound guidance -- that blocks sensations of pain from a specific area of the body. The procedure is routinely performed for a variety of outpatient orthopedic surgeries, such as hip and knee arthroscopies and elbow or hand surgeries. To reduce anxiety, which can lead to prolonged recovery and an increase in postoperative pain, patients commonly take sedative medications, like midazolam, prior to the nerve block procedure. Yet, the medications can have side effects, including breathing issues and paradoxical effects like hostility and agitation. In this study, researchers found a track of relaxing music to be similarly effective to the intravenous form of midazolam in reducing a patient's anxiety prior to the procedure.

"Our findings show that there are drug-free alternatives to help calm a patient before certain procedures, like nerve blocks," said the study's lead author Veena Graff, MD, an assistant professor of Clinical Anesthesiology and Critical Care. "We've rolled out a new process at our ambulatory surgical center to provide patients who want to listen to music with access to disposable headphones. Ultimately, our goal is to offer music as an alternative to help patients relax during their perioperative period."

While research has shown music can help reduce a patient's anxiety prior to surgery, previous studies have primarily focused on music vs. an oral form of sedative medications, which are not routinely used in the preoperative setting. In this study -- the first to compare music medicine with an intravenous form of sedative medication -- researchers aimed to measure the efficacy of music in lowering a patient's anxiety prior to conducting a peripheral nerve block.

The team randomly assigned 157 adults to receive one of two options three minutes prior to the peripheral nerve block: either an injection of 1-2 mg of midazolam, or a pair of noise-canceling headphones playing Marconi Union's "Weightless" -- an eight-minute song, created in collaboration with sound therapists, with carefully arranged harmonies, rhythms, and bass lines designed specifically to calm listeners down. Researchers evaluated levels of anxiety before and after the use of each method and found similar changes in anxiety levels in both groups.
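The comparison described above boils down to measuring anxiety before and after the intervention in each randomized group and comparing the changes. Below is a minimal sketch of that kind of analysis, using invented scores and a simple two-sample t-test; the trial's actual anxiety instrument and statistical approach may well differ.

```python
# Illustrative sketch only: compare pre/post anxiety changes between two
# randomized groups. The scores and the t-test are assumptions for
# demonstration, not the published study's data or methods.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical pre/post anxiety scores for each arm (e.g., on a 6-24 scale).
music_pre, music_post = rng.normal(14, 3, 80), rng.normal(11, 3, 80)
midaz_pre, midaz_post = rng.normal(14, 3, 77), rng.normal(11, 3, 77)

music_change = music_post - music_pre
midaz_change = midaz_post - midaz_pre

t, p = stats.ttest_ind(music_change, midaz_change)
print(f"mean change (music):     {music_change.mean():.2f}")
print(f"mean change (midazolam): {midaz_change.mean():.2f}")
print(f"two-sample t-test p-value: {p:.3f}")
```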

However, the team noted that patients who received midazolam reported higher levels of satisfaction with their overall experience and fewer issues with communication. Researchers attribute these findings to a number of factors, including the use of noise-canceling headphones, the fact that the volume of the music was not standardized, and the fact that patients could not choose the music themselves.

From Science Daily

Sea level rise: West Antarctic ice collapse may be prevented by snowing ocean water onto it

Antarctica illustration.
The ice sheet covering West Antarctica is at risk of sliding off into the ocean. While further ice-sheet destabilisation in other parts of the continent may be limited by a reduction of greenhouse gas emissions, the slow, yet inexorable loss of West Antarctic ice is likely to continue even after climate warming is stabilised. A collapse might take hundreds of years but will raise sea levels worldwide by more than three meters.

A team of researchers from the Potsdam Institute for Climate Impact Research (PIK) is now scrutinising a daring way of stabilising the ice sheet: Generating trillions of tons of additional snowfall by pumping ocean water onto the glaciers and distributing it with snow cannons. This would mean unprecedented engineering efforts and a substantial environmental hazard in one of the world's last pristine regions -- to prevent long-term sea level rise for some of the world's most densely populated areas along coastlines from the US to China.

"The fundamental trade-off is whether we as humanity want to sacrifice Antarctica to safe the currently inhabited coastal regions and the cultural heritage that we have built and are building on our shores. It is about global metropolises, from New York to Shanghai, which in the long term will be below sea level if nothing is done" explains Anders Levermann, physicist at the Potsdam Institute for Climate Impact Research (PIK) and Columbia University and one of the authors of the study. "The West Antarctic Ice Sheet is one of the tipping elements in our climate system. Ice loss is accelerating and might not stop until the West Antarctic ice sheet is practically gone."

Unprecedented measures to stabilise the ice sheet

Warm ocean currents have reached the Amundsen Sea Sector of West Antarctica -- a region comprising several glaciers that are prone to instability due to their topographic configuration. Underwater melting of these glaciers triggered their speed-up and retreat. This is already now responsible for the largest ice loss from the continent and provides an accelerating contribution to global sea level rise. In their study, the researchers employ computer simulations to project the dynamic ice loss into the future. They confirm earlier studies suggesting that even strong reduction of greenhouse gas emissions may not prevent the collapse of the West Antarctic ice sheet.

"So we investigated what could stop a potential collapse in our simulations and increased the snowfall in the destabilised region far beyond observations," says PIK co-author Johannes Feldmann. "In fact, we find that an awful lot of snow can indeed push the ice sheet back towards a stable regime and stop the instability. In practice, this could be realized by an enormous redisposition of water masses -- pumped out of the ocean and snowed onto the ice sheet at a rate of several hundred billion tons per year over a few decades."

A tremendous trade-off between hazards and hopes

"We are fully aware of the disruptive character such an intervention would have," adds Feldmann. Uplifting, desalinating and heating the ocean water as well as powering the snow canons would require an amount of electric power in the order of several ten thousand high-end wind turbines. "Putting up such a wind farm and the further infrastructure in the Amundsen Sea and the massive extraction of ocean water itself would essentially mean losing a unique natural reserve. Further, the harsh Antarctic climate makes the technical challenges difficult to anticipate and hard to handle while the potential hazardous impacts to the region are likely to be devastating." Thus the risks and costs of such an unprecedented endeavour must be weighted very carefully against its potential benefits. "Also, our study does not consider future human-made global warming. Hence this gigantic endeavour only makes sense if the Paris Climate Agreement is kept and carbon emissions are reduced fast and unequivocally."

Read more at Science Daily

Jul 19, 2019

Rising CO2, climate change projected to reduce availability of nutrients worldwide

One of the biggest challenges to reducing hunger and undernutrition around the world is producing foods that provide not only enough calories but also enough of the necessary nutrients. New research finds that, over the next 30 years, climate change and increasing carbon dioxide (CO2) could significantly reduce the availability of critical nutrients such as protein, iron, and zinc, compared with a future without these changes. The total impacts of climate change shocks and elevated levels of CO2 in the atmosphere are estimated to reduce growth in global per capita availability of protein, iron, and zinc by 19.5%, 14.4%, and 14.6%, respectively.

"We've made a lot of progress reducing undernutrition around the world recently but global population growth over the next 30 years will require increasing the production of foods that provide sufficient nutrients," explained Senior Scientist at the International Food Policy Research Institute (IFPRI) and study co-author Timothy Sulser. "These findings suggest that climate change could slow progress on improvements in global nutrition by simply making key nutrients less available than they would be without it."

The study, "A modeling approach combining elevated atmospheric CO2 effects on protein, iron and zinc availability with projected climate change impacts on global diets," [LINK] was co-authored by an international group of researchers and published in the peer-reviewed journal, Lancet Planetary Health. The study represents the most comprehensive synthesis of the impacts of elevated CO2 and climate change on the availability of nutrients in the global food supply to date.

Using the IMPACT global agriculture sector model along with data from the Global Expanded Nutrient Supply (GENuS) model and two data sets on the effects of CO2 on nutrient content in crops, researchers projected per capita availability of protein, iron, and zinc out to 2050.
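The projection logic described above amounts to combining projected per-capita food consumption with each food's nutrient content, then applying a penalty to crops whose nutrient concentrations fall under elevated CO2. The sketch below illustrates that accounting for a single nutrient; it is not the IMPACT or GENuS model, and every number in it is an invented placeholder.

```python
# Minimal sketch of the nutrient-availability accounting, with invented numbers.
foods = {
    # food: (projected g consumed per capita per day, mg zinc per 100 g,
    #        fractional nutrient loss under elevated CO2)
    "wheat":      (250.0, 2.9, 0.06),
    "rice":       (150.0, 1.2, 0.03),
    "vegetables": (200.0, 0.4, 0.03),
    "beef":       ( 40.0, 4.8, 0.00),   # animal products assumed unaffected here
}

def zinc_per_capita(apply_co2_penalty: bool) -> float:
    """Sum per-capita zinc supply (mg/day) across foods, optionally with CO2 losses."""
    total = 0.0
    for grams, mg_per_100g, co2_loss in foods.values():
        content = mg_per_100g * ((1 - co2_loss) if apply_co2_penalty else 1.0)
        total += grams / 100.0 * content
    return total

base = zinc_per_capita(apply_co2_penalty=False)
with_co2 = zinc_per_capita(apply_co2_penalty=True)
print(f"zinc availability without CO2 effect: {base:.2f} mg/day")
print(f"zinc availability with CO2 effect:    {with_co2:.2f} mg/day "
      f"({100 * (base - with_co2) / base:.1f}% lower)")
```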

Improvements in technology and market effects are projected to increase nutrient availability over current levels by 2050, but these gains are substantially diminished by the negative impacts of rising concentrations of carbon dioxide.

While higher levels of CO2 can boost photosynthesis and growth in some plants, previous research has also found they reduce the concentration of key micronutrients in crops. The new study finds that wheat, rice, maize, barley, potatoes, soybeans, and vegetables are all projected to suffer nutrient losses of about 3% on average by 2050 due to elevated CO2 concentration.

The effects are not likely to be felt evenly around the world, however, and many countries currently experiencing high levels of nutrient deficiency are also projected to be more affected by lower nutrient availability in the future.

Nutrient reductions are projected to be particularly severe in South Asia, the Middle East, Africa South of the Sahara, North Africa, and the former Soviet Union -- regions largely comprised of low- and middle-income countries where levels of undernutrition are generally higher and diets are more vulnerable to direct impacts of changes in temperature and precipitation triggered by climate change.

"In general, people in low- and middle-income countries receive a larger portion of their nutrients from plant-based sources, which tend to have lower bioavailability than animal-based sources," said Robert Beach, Senior Economist and Fellow at RTI International and lead author of the study.

This means that many people with already relatively low nutrient intake will likely become more vulnerable to deficiencies in iron, zinc, and protein as crops lose their nutrients. Many of these regions are also the ones expected to fuel the largest growth in populations and thus requiring the most growth in nutrient availability.

The impact on individual crops can also have disproportionate effects on diets and health. Significant nutrient losses in wheat have especially widespread implications. "Wheat accounts for a large proportion of diets in many parts of the world, so any changes in its nutrient concentrations can have substantial impact on the micronutrients many people receive," added Beach.

Protein, iron, and zinc availability in wheat are projected to be reduced by up to 12% by 2050 in all regions. People will likely experience the largest decreases in protein availability from wheat in places where wheat consumption is particularly high, including the former Soviet Union, Middle East, North Africa, and eastern Europe.

In South Asia, where the population's iron intake already sits well below the recommended level -- India exhibits the highest prevalence of anemia in the world -- iron availability is projected to remain inadequate. What's more, elevated carbon dioxide levels push the average availability of zinc in the region below the threshold of recommended nutrient intake.

Although the study's models were limited to 2050, Sulser added, "extending the analysis through the second half of this century, when climate change is expected to have even stronger impacts, would result in even greater reductions in nutrient availability."

Researchers also emphasized the need for further work to build upon their findings, including additional study of climate impacts on animal sources, such as poultry, livestock, and fisheries, crops' nutritional composition, nutrient deficiencies resulting from short-term climate shocks, and technologies that could mitigate reductions in nutrient availability.

Quantifying the potential health impacts for individuals also requires a consideration of the many factors beyond food consumption -- including access to clean water, sanitation, and education -- that influence nutrition and health outcomes.

Read more at Science Daily

The unpopular truth about biases toward people with disabilities

Needing to ride in a wheelchair can put the brakes on myriad opportunities -- some less obvious than one might think. New research from Michigan State University sheds light on the bias people have toward people with disabilities, known as "ableism," and how it shifts over time.

Contrary to popular belief, the findings suggest that biases toward people with disabilities increase with age and over time, but that people are less likely to show how they really feel publicly.

"Disabilities are a sensitive, uncomfortable topic for many people to talk about. Few are willing to acknowledge a bias toward people with disabilities," said William Chopik, MSU assistant professor of psychology and senior author. "Because this is so understudied, the goal of our research was to characterize why -- and which types of -- people hold higher biases against those with disabilities."

The research, published in the Journal of Social Issues, is the largest of its kind, using data from 300,000 participants gathered over 13 years. Participants ranged in age from 18 to 90 years old, and 15% classified themselves as having a disability.

Authors Jenna Harder, Victor Keller and Chopik used data from Project Implicit, a platform that allows users to learn and measure biases anonymously. The platform defined a disability as "some sort of physical, mental or emotional limitation" and asked a series of questions measuring feelings about people with disabilities. The researchers also measured how much contact participants had with the disability community using a scale of one to seven, one being "knowing someone" and seven being "having constant contact" with a person with a disability.

The researchers used the surveys to measure implicit and explicit attitudes. Harder explained that implicit attitudes are thoughts or feelings that happen automatically and are hard to control, suppress or regulate. Explicit attitudes, she said, are the things people consciously agree with; they are more controllable because they are how people express or portray their opinions publicly. One can think through what one is about to say and filter it if necessary, she said.

The researchers found that respondents' implicit bias increased over time and with age, meaning that they had less-favorable automatic feelings toward people with disabilities. But when participants were asked explicitly how much they preferred people with disabilities relative to abled people, their responses became more positive with time and age, meaning that they outwardly portrayed positive opinions about people with disabilities.
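One simple way to see the divergence described above is to fit separate trends for implicit and explicit scores against age (or survey year). The sketch below does this on invented data merely to show two measures moving in opposite directions; it does not use Project Implicit's data or scoring.

```python
# Illustrative sketch only: opposite trends of implicit and explicit bias with age.
import numpy as np

rng = np.random.default_rng(1)
age = rng.uniform(18, 90, 5000)
implicit = 0.30 + 0.002 * age + rng.normal(0, 0.2, age.size)   # assumed to rise with age
explicit = 0.40 - 0.003 * age + rng.normal(0, 0.2, age.size)   # assumed to fall with age

imp_slope = np.polyfit(age, implicit, 1)[0]
exp_slope = np.polyfit(age, explicit, 1)[0]
print(f"implicit bias trend: {imp_slope:+.4f} per year of age")
print(f"explicit bias trend: {exp_slope:+.4f} per year of age")
```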

"This is a big mystery because people outwardly say they feel less biased, but in actuality the implicit attitude has been getting stronger as time goes on," Chopik said. "It's not popular to express negative opinions about people with disabilities, so perhaps they feel inclined say nicer things publicly instead. Changes in explicit attitudes do not always lead to changes in implicit prejudice -- sometimes becoming more aware of a prejudice might increase implicit prejudice."

The findings also revealed that women felt less implicit bias, and that people who had contact with the disabled population had lower prejudice.

"Some of our findings related to women align with stereotypes: when you look at how men and women compare on bias, women are more compassionate toward stigmatized groups," Harder said.

"Gender was one of the most consistent predictors in this study, supporting theories that women are particularly receptive to people who they perceive as needing help."

Chopik explained that lower prejudice from people who had contact with disabled people was consistent with theories related to interactions with other stigmatized groups.

"As you interact more with a stigmatized group, you can potentially have more positive experiences with them, which changes your attitudes," he said. "You start with a certain bias, but over time those biases are challenged and your attitude changes because you have the chance to develop positive associations with the group and see them in a different light."

Data gathered from disabled participants showed feelings of warmth among their own community and a more positive attitude toward their peers. The more visible a disability -- like needing a wheelchair or a walker -- the stronger the positive attitude toward the disability community was.

Chopik emphasized the lack of research on ableism and hopes to encourage more participation from academia.

Read more at Science Daily

Jurassic fossil shows how early mammals could swallow like their modern descendants

The 165-million-year-old fossil of Microdocodon gracilis, a tiny, shrew-like animal, shows the earliest example of modern hyoid bones in mammal evolution.

The hyoid bones link the back of the mouth, or pharynx, to the openings of the esophagus and the larynx. The hyoids of modern mammals, including humans, are arranged in a "U" shape, similar to the saddle seat of a children's swing, suspended by jointed segments from the skull. This apparatus helps us transport and swallow chewed food and liquid -- a crucial function on which our livelihood depends.

Mammals as a whole are far more sophisticated than other living vertebrates in chewing up food and swallowing it one small lump at a time, instead of gulping down huge bites or whole prey like an alligator.

"Mammals have become so diverse today through the evolution of diverse ways to chew their food, weather it is insects, worms, meat, or plants. But no matter how differently mammals can chew, they all have to swallow in the same way," said Zhe-Xi Luo, PhD, a professor of organismal biology and anatomy at the University of Chicago and the senior author of a new study of the fossil, published this week in Science.

"Essentially, the specialized way for mammals to chew and then swallow is all made possible by the agile hyoid bones at the back of the throat," Luo said.

'A pristine, beautiful fossil'

This modern hyoid apparatus is mobile and allows the throat muscles to control the intricate functions to transport and swallow chewed food or drink fluids. Other vertebrates also have hyoid bones, but their hyoids are simple and rod-like, without mobile joints between segments. They can only swallow food whole or in large chunks.

When and how this unique hyoid structure first appeared in mammals, however, has long been in question among paleontologists. In 2014, Chang-Fu Zhou, PhD, from the Paleontological Museum of Liaoning in China, the lead author of the new study, found a new fossil of Microdocodon preserved with delicate hyoid bones in the famous Jurassic Daohugou site of northeastern China. Soon afterwards, Luo and Thomas Martin from the University of Bonn, Germany, met up with Zhou in China to study the fossil.

"It is a pristine, beautiful fossil. I was amazed by the exquisite preservation of this tiny fossil at the first sight. We got a sense that it was unusual, but we were puzzled about what was unusual about it," Luo said. "After taking detailed photographs and examining the fossil under a microscope, it dawned on us that this Jurassic animal has tiny hyoid bones much like those of modern mammals."

This new insight gave Luo and his colleagues added context on how to study the new fossil. Microdocodon is a docodont, from an extinct lineage of near relatives of mammals from the Mesozoic Era called mammaliaforms. Previously, paleontologists anticipated that hyoids like this had to be there in all of these early mammals, but it was difficult to identify the delicate bones. After finding them in Microdocodon, Luo and his collaborators have since found similar fossilized hyoid structures in other Mesozoic mammals.

"Now we are able for the first time to address how the crucial function for swallowing evolved among early mammals from the fossil record," Luo said. "The tiny hyoids of Microdocodon are a big milestone for interpreting the evolution of mammalian feeding function."

New insights on mammal evolution as a whole

Luo also worked with postdoctoral scholar Bhart-Anjan Bhullar, PhD, now on the faculty at Yale University, and April Neander, a scientific artist and expert on CT visualization of fossils at UChicago, to study casts of Microdocodon and reconstruct how it lived.

The jaw and middle ear of modern mammals are developed from (or around) the first pharyngeal arch, structures in a vertebrate embryo that develop into other recognizable bones and tissues. Meanwhile, the hyoids are developed separately from the second and the third pharyngeal arches. Microdocodon has a primitive middle ear still attached to the jaw, as in cynodonts and other forerunners of mammals -- unlike the ear of modern mammals. Yet its hyoids are already like those of modern mammals.

"Hyoids and ear bones are all derivatives of the primordial vertebrate mouth and gill skeleton, with which our earliest fishlike ancestors fed and respired," Bhullar said. "The jointed, mobile hyoid of Microdocodon coexists with an archaic middle ear -- still attached to the lower jaw. Therefore, the building of the modern mammal entailed serial repurposing of a truly ancient system."

The tiny, shrew-like creature likely weighed only 5 to 9 grams, with a slender body, and an exceptionally long tail. The dimensions of its limb bones match up with those of modern tree-dwellers.

Read more at Science Daily

How mammals' brains evolved to distinguish odors is nothing to sniff at

Cat smelling flowers.
The world is filled with millions upon millions of distinct smells, but how mammals' brains evolved to tell them apart is something of a mystery.

Now, two neuroscientists from the Salk Institute and UC San Diego have discovered that at least six types of mammals -- from mice to cats -- distinguish odors in roughly the same way, using circuitry in the brain that's evolutionarily preserved across species.

"The study yields insights into organizational principles underpinning brain circuitry for olfaction in mammals that may be applied to other parts of the brain and other species," says Charles Stevens, distinguished professor emeritus in Salk's Neurobiology Laboratory and coauthor of the research published in the July 18, 2019 issue of Current Biology.

In brief, the study reveals that the size of each of the three components of the neural network for olfaction scales in about the same way for each species, starting with receptors in the nose that transmit signals to a cluster of neurons in the front of the brain called the olfactory bulb, which, in turn, relays the signals to a "higher functioning" region for odor identification called the piriform cortex.

"These three stages scale with each other, with the relationship of the number of neurons in each stage the same across species," says Shyam Srinivasan, assistant project scientist with UC San Diego's Kavli Institute for Brain and Mind, and the paper's coauthor. "So, if you told me the number of neurons in the nose, I could predict the number in the piriform cortex or the bulb."

The current study builds on research by the same duo, published in 2018, which described how mouse brains process and distinguish odors using what's known as "distributed circuits." Unlike the visual system, for example, where information is transmitted in an orderly manner to specific parts of the visual cortex, the researchers discovered that the olfactory system in mice relies on a combination of connections distributed across the piriform cortex.

Following that paper, Stevens and Srinivasan sought to determine if the distributed neural circuitry revealed in mice is similar in other mammals. For the current work, the researchers analyzed mammal brains of varying sizes and types. Their calculations, plus previous studies over the past few years, were used to estimate brain volumes. Stevens and Srinivasan used a variety of microscopy techniques that let them visualize different types of neurons that form synapses (connections) in the olfactory circuitry.

"We couldn't count every neuron, so we did a survey," says Srinivasan. "The idea is that you take samples from different represented areas, so any irregularities are caught."

The new study revealed that the average number of synapses connecting each functional unit of the olfactory bulb (a glomerulus) to neurons in the piriform cortex is invariant across species.

"It was remarkable to see how these were conserved," says Stevens.

Specifically, identification of individual odors is linked to the strength and combination of firing neurons in the circuit that can be likened to music from a piano whose notes spring from the depression of multiple keys to create chords, or the arrangement of letters that form the words on this page.

"The discrimination of odors is based on the firing rate, the electric pulse that travels down the neuron's axon," says Srinivasan. "One odor, say for coffee, may elicit a slow response in a neuron while the same neuron may respond to chocolate at a faster rate."

This coding scheme for olfaction differs from the codes used in other parts of the brain.

"We showed that the connectivity parameters and the relationship between different stages of the olfactory circuit are conserved across mammals, suggesting that evolution has used the same design for the circuit across species, but just changed the size to fit the animals' environmental niche," says Stevens.

In the future, Stevens plans to examine other regions of the brain in search of other distributed circuits whose function is based on similar coding found in this study.

Read more at Science Daily

Jul 18, 2019

What makes some people more receptive to the idea of being vaccinated against infectious disease?

Fear, trust, and the likelihood of exposure are three leading factors that influence whether people are willing to be vaccinated against a virulent disease, according to a new study in the journal Heliyon, published by Elsevier.

Following the highly-publicized 2014 outbreak of Ebola in Africa and anticipating the possibility of a future Ebola outbreak in the United States, a 2014 CNN/ORC poll asked a random sample of 1,018 adults if they would take an anti-Ebola vaccination if and when it became available. About half of the participants reported that they would, while half expressed hesitation or refusal, even if vaccination services for Ebola were available to them.

In the current study, investigators conducted a secondary analysis of that data to examine the factors contributing to vaccination receptivity vs. hesitation. They found that three factors primarily influenced receptivity: a general fear orientation; trust in government to contain a crisis; and the relative chance of being exposed to the pathogen. Interestingly, the effectiveness and safety of a vaccine itself was not among the factors influencing receptivity.

"Facing a raising number of epidemics that create public health dangers, our findings indicate that vaccine hesitancy is associated with social factors that are independent of the perceived effectiveness of vaccines. Willingness to take vaccination is positively associated with a generalized sense of fear, trust in the government's ability to control an outbreak of the disease, and expectation of a potential Ebola outbreak that is imminent and proximate," explains one of the study's investigators, Kent P. Schwirian, PhD, Emeritus Professor of Sociology, The Ohio State University, Columbus, OH, USA.

Professor Schwirian elaborated on how these three factors shape the willingness of half of the sample population to engage in the protective behavior of vaccination.

  • General Fear Orientation. Respondents expressed fear not only of being infected, but also more generally in terms of their outlook on life and how they perceive things are going overall in society today. The more than 60 percent who reported being somewhat or very scared about events in the US today were much more willing to consider an anti-Ebola vaccination than individuals who did not report this anxiety.
  • Trust in Government. People who expressed confidence in the US government's ability to prevent an Ebola outbreak were much more willing to take the anti-Ebola vaccine than individuals who lacked confidence in the government to do so.
  • Exposure Expectancy of an Ebola Outbreak. While approximately 80 percent of the respondents thought that it was somewhat or highly likely that an Ebola outbreak would happen fairly soon in the US, most people thought that the outbreak would not happen in their local community or family. However, the closer in proximity they thought the outbreak would be to them, the more willing they were to take the anti-Ebola vaccination.
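Taken together, the three factors above suggest a simple predictive framing: receptivity rises with fear, trust, and the expected proximity of an outbreak. The sketch below expresses that as a logistic model with arbitrary placeholder weights chosen only to match the reported direction of each association; it is not the study's model and the coefficients are not its estimates.

```python
# Illustrative logistic model of vaccination receptivity; all weights are assumptions.
import math

def receptivity_probability(fear: float, trust: float, proximity: float) -> float:
    """fear, trust, proximity each scaled 0-1; returns an illustrative P(willing)."""
    w_fear, w_trust, w_prox, bias = 1.2, 1.5, 1.0, -1.8   # placeholder weights
    logit = bias + w_fear * fear + w_trust * trust + w_prox * proximity
    return 1.0 / (1.0 + math.exp(-logit))

print(receptivity_probability(fear=0.8, trust=0.7, proximity=0.6))  # high on all three
print(receptivity_probability(fear=0.1, trust=0.2, proximity=0.1))  # low on all three
```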

Gustavo S. Mesch, PhD, Professor of Sociology and Rector, University of Haifa, Israel, the study's other investigator, recommends reexamining the research questions with more current data. "Our life and death struggle against lethal microbes is ultimately fought at the local level. Unless local hospitals and health care personnel are prepared to fight and ready to go, we are at a major disadvantage in attempting to save lives," he cautions. "Confirming what percentage of the population would opt-in or out of vaccines, and the central role of trust in the government, would help public health officials plan their responses." He added that the results also showed other factors that can be validated and explored, particularly that of older respondents who were more likely to be vaccine receptive, as were those with less formal education.

Vaccination is the primary public health response to the growing number of infectious diseases that infect the world's population. At the same time, a growing anti-vax movement has spawned a small but vocal faction of the population, spreading hesitancy despite widespread evidence of vaccine efficacy and safety. The reluctance or refusal to be vaccinated or to have one's children vaccinated was identified by the World Health Organization as one of the top ten global health threats of 2019. Understanding the factors that contribute to vaccination compliance or hesitancy is vital for controlling disease outbreaks.

Read more at Science Daily

Correcting historic sea surface temperature measurements

Researchers found a "statistical nightmare" in comparing ocean temperature readings taken on different vessels in different time periods.
Something odd happened in the oceans in the early 20th century. The North Atlantic and Northeast Pacific appeared to warm twice as much as the global average while the Northwest Pacific cooled over several decades.

Atmospheric and oceanic models have had trouble accounting for these differences in temperature changes, leading to a mystery in climate science: why did the oceans warm and cool at such different rates in the early 20th century?

Now, research from Harvard University and the UK's National Oceanography Centre points to an answer both as mundane as a decimal point truncation and as complicated as global politics. Part history, part climate science, this research corrects decades of data and suggests that ocean warming occurred in a much more homogenous way.

The research is published in Nature.

Humans have been measuring and recording the sea surface temperature for centuries. Sea surface temperatures helped sailors verify their course, find their bearings, and predict stormy weather.

Until the 1960s, most sea surface temperature measurements were taken by dropping a bucket into the ocean and measuring the temperature of the water inside.

The National Oceanic and Atmospheric Administration (NOAA) and the National Science Foundation's National Center for Atmospheric Research (NCAR) maintain a collection of sea surface temperature readings dating back to the early 19th Century. The database contains more than 155 million observations from fishing, merchant, research and navy ships from all over the world. These observations are vital to understanding changes in ocean surface temperature over time, both natural and anthropogenic.

They are also a statistical nightmare.

How do you compare, for example, the measurements of a British Man-of-War from 1820 to a Japanese fishing vessel from 1920 to a U.S. Navy ship from 1950? How do you know what kind of buckets were used, and how much they were warmed by sunshine or cooled by evaporation while being sampled?

For example, a canvas bucket left on a deck for three minutes under typical weather conditions can cool by 0.5 degrees Celsius more than a wooden bucket measured under the same conditions. Given that global warming during the 20th Century was about 1 degree Celsius, the biases associated with different measurement protocols require careful accounting.

"There are gigabytes of data in this database and every piece has a quirky story," said Peter Huybers, Professor of Earth and Planetary Sciences and of Environmental Science and Engineering at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and senior author of the paper. "The data is rife with peculiarities."

A lot of research has been done to identify and adjust for these peculiarities. In 2008, for example, researchers found that a 0.3-degree Celsius jump in sea surface temperatures in 1945 was the result of measurements taken from engine room intakes. Even with these corrections, however, the data is far from perfect and there are still unexplained changes in sea surface temperature.

In this research, Huybers and his colleagues proposed a comprehensive approach to correcting the data, using a new statistical technique that compares measurements taken by nearby ships.

"Our approach looks at the differences in sea surface temperature measurements from distinct groups of ships when they pass nearby, within 300 kilometers and two days of one another," said Duo Chan, a graduate student in the Harvard Graduate School of Arts and Sciences and first author of the paper. "Using this approach, we found 17.8 million near crossings and identified some big biases in some groups."

The researchers focused on data from 1908 to 1941, broken down by the country of origin of the ship and the "decks," a term stemming from the fact that marine observations were stored using decks of punch cards. One deck includes observations from both Robert Falcon Scott's and Ernest Shackleton's voyages to the Antarctic.

"These data have made a long journey from the original logbooks to the modern archive and difficult choices were made to fit the available information onto punch cards or a manageable number of magnetic tape reels," said Elizabeth Kent, a co-author from the UK National Oceanography Centre. "We now have both the methods and the computer power to reveal how those choices have affected the data, and also pick out biases due to variations in observing practice by different nations, bringing us closer to the real historical temperatures."

The researchers found two new key causes of the warming discrepancies in the North Pacific and North Atlantic.

The first had to do with changes in Japanese records. Prior to 1932, most records of sea surface temperature from Japanese vessels in the North Pacific came from fishing vessels. This data, spread across several different decks, was originally recorded in whole-degrees Fahrenheit, then converted to Celsius, and finally rounded to tenths-of-a-degree.

However, in the lead-up to World War II, more and more Japanese readings came from naval ships. These data were stored in a different deck and when the U.S. Air Force digitized the collection, they truncated the data, chopping off the tenths-of-a-degree digits and recording the information in whole-degree Celsius.

Unrecognized effects of this truncation largely explain the rapid cooling apparent in previous estimates of Pacific sea surface temperatures between 1935 and 1941, said Huybers. After correcting for the bias introduced by truncation, the warming in the Pacific is much more uniform.
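The direction of the bias is easy to demonstrate: for sea surface temperatures above freezing, chopping off the tenths digit always lowers the recorded value, by about half a degree Celsius on average, whereas rounding introduces essentially no systematic offset. The short sketch below shows this on simulated temperatures; the values are illustrative, not drawn from the archive.

```python
# Demonstration of the digitization bias: truncation vs. rounding of positive SSTs.
import numpy as np

rng = np.random.default_rng(42)
true_sst = rng.uniform(5.0, 25.0, 100_000)         # plausible positive SSTs, deg C

truncated = np.trunc(true_sst)                     # tenths digit chopped off
rounded = np.round(true_sst, 1)                    # stored to tenths of a degree

print(f"mean bias from truncation: {np.mean(truncated - true_sst):+.3f} C")
print(f"mean bias from rounding:   {np.mean(rounded - true_sst):+.3f} C")
```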

While Japanese data holds the key to warming in the Pacific in the early 20th century, it's German data that plays the most important role in understanding sea surface temperatures in the North Atlantic during the same time.

In the late 1920s, German ships began providing a majority of data in the North Atlantic. Most of these measurements are collected in one deck, which, when compared to nearby measurements, is significantly warmer. When adjusted, the warming in the North Atlantic becomes more gradual.

With these adjustments, the researchers found that rates of warming across the North Pacific and North Atlantic become much more similar and have a warming pattern closer to what would be expected from rising greenhouse gas concentrations. However, discrepancies still remain and the overall rate of warming found in the measurements is still faster than predicted by model simulations.

Read more at Science Daily

Scientists discover how mosquito brains integrate diverse sensory cues to find a host

Mosquito biting.
For female mosquitoes, finding their next meal is all about smelling and seeing.

Through behavioral experiments and real-time recording of the female mosquito brain, a team of scientists, led by researchers at the University of Washington, has discovered how the mosquito brain integrates signals from two of its sensory systems -- visual and olfactory -- to identify, track and home in on a potential host for her next blood meal.

Their findings, published July 18 in the journal Current Biology, indicate that, when the mosquito's olfactory system detects certain chemical cues, they trigger changes in the mosquito brain that initiate a behavioral response: The mosquito begins to use her visual system to scan her surroundings for specific types of shapes and fly toward them, presumably associating those shapes with potential hosts.

Only female mosquitoes feed on blood, and these results give scientists a much-needed glimpse of the sensory-integration process that the mosquito brain uses to locate a host. Scientists can use these findings to help develop new methods for mosquito control and reduce the spread of mosquito-borne diseases.

This study focused on the olfactory cue that triggers the hunt for a host: carbon dioxide, or CO2. For mosquitoes, smelling CO2 is a telltale sign that a potential meal is nearby.

"Our breath is just loaded with CO2," said corresponding author Jeffrey Riffell, a UW professor of biology. "It's a long-range attractant, which mosquitoes use to locate a potential host that could be more than 100 feet away."

That potential host could be a person or another warm-blooded animal. Prior research by Riffell and his collaborators has shown that smelling CO2 can "prime" the mosquito's visual system to hunt for a host. In this new research, they measure how CO2 triggers precise changes in mosquito flight behavior and visualize how the mosquito brain responds to combinations of olfactory and visual cues.

The team collected data from approximately 250 individual mosquitoes during behavioral trials conducted in a small circular arena, about 7 inches in diameter. A 360-degree LED display framed the arena and a tungsten wire tether in the middle held each mosquito. An optical sensor below the insect collected data about mosquito wingbeats, an air inlet and vacuum line streamed odors into the arena, and the LED display showed different types of visual stimuli.

The team tested how tethered Aedes aegypti mosquitoes responded to visual stimuli as well as puffs of CO2-rich air. They found that, in the arena, one-second puffs of air containing 5% CO2 -- just above the 4.5% CO2 air emitted by humans -- prompted the mosquitoes to beat their wings faster. Some visual elements like a fast-moving starfield had little effect on mosquito behavior. But if the arena showed a horizontally moving bar, mosquitoes beat their wings faster and attempted to steer in the same direction. This response was more pronounced if researchers introduced a puff of CO2 before showing the bar.
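The behavioral readout described above is essentially a before-and-after comparison of wingbeat frequency around the delivery of a CO2 puff. The sketch below shows one way such a comparison could be made on a single trial, using a synthetic wingbeat trace and invented numbers; it is not the team's analysis pipeline.

```python
# Illustrative analysis sketch only: mean wingbeat frequency before vs. after a CO2 puff.
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(0.0, 20.0, 0.01)                    # 20 s trial sampled at 100 Hz
puff_time = 10.0                                  # 1-s CO2 puff delivered here

# Synthetic wingbeat-frequency trace: baseline ~500 Hz, rising after the puff.
wbf = 500 + 5 * rng.standard_normal(t.size)
wbf[t >= puff_time] += 40 * np.exp(-(t[t >= puff_time] - puff_time) / 3.0)

before = wbf[(t >= puff_time - 3) & (t < puff_time)].mean()
after = wbf[(t >= puff_time) & (t < puff_time + 3)].mean()
print(f"mean wingbeat frequency before puff: {before:.1f} Hz")
print(f"mean wingbeat frequency after puff:  {after:.1f} Hz (+{after - before:.1f})")
```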

To get a clear picture of how smelling CO2 first affected flight behavior, they analyzed their data using a mathematical model of housefly flight behavior.

"We found that CO2 influences the mosquito's ability to turn toward an object that isn't directly in their flight path," said Riffell. "When they smell the CO2, they essentially turn toward the object in their visual field faster and more readily than they would without CO2."

The researchers repeated the arena experiments with a genetically modified Aedes aegypti strain created by Riffell and co-author Omar Akbari, an assistant professor at the University of California, San Diego. Cells in these mosquitoes glow fluorescent green if they contain high levels of calcium ions -- including neurons of the central nervous system when they are actively firing. In the arena, the researchers removed a small portion of the mosquito skull and used a microscope to view neuronal activity in sections of the brain in real time.

The team focused on 59 "regions of interest" that showed especially high calcium ion levels in the lobula, a part of the mosquito brain's optic lobe. If the mosquito was shown a horizontal bar, two-thirds of those regions lit up, indicating increased neuronal firing in response to the visual stimulus. When the researchers introduced a puff of CO2 first and then showed the horizontal bar, 23% of the regions had even higher activity than before -- indicating that the CO2 odor prompted a larger-magnitude response in these areas of the brain that control vision.

The researchers tried the reverse experiment -- seeing if a horizontal bar triggered increased firing in the parts of the mosquito brain that control smell -- but saw no response.

"Smell triggers vision, but vision does not trigger the sense of smell," said Riffell.

Their findings align with the general picture of mosquito senses. The mosquito sense of smell operates at long distances, picking up scents more than 100 feet away. But their eyesight is most effective for objects 15 to 20 feet away, according to Riffell.

"Olfaction is a long-range sense for mosquitoes, while vision is for intermediate-range tracking," said Riffell. "So, it makes sense that we see an odor -- in this case CO2 -- affecting parts of the mosquito brain that control vision, and not the reverse."

Read more at Science Daily

200 times faster than ever before: The speediest quantum operation yet

A group of scientists led by 2018 Australian of the Year Professor Michelle Simmons has achieved the first two-qubit gate between atom qubits in silicon -- a major milestone on the team's quest to build an atom-scale quantum computer. The pivotal piece of research was published today in the journal Nature.

A two-qubit gate is the central building block of any quantum computer -- and the UNSW team's version of it is the fastest that's ever been demonstrated in silicon, completing an operation in 0.8 nanoseconds, which is ~200 times faster than other existing spin-based two-qubit gates.

In the Simmons group's approach, a two-qubit gate is an operation between two electron spins -- comparable to the role that classical logic gates play in conventional electronics. For the first time, the team was able to build a two-qubit gate by placing two atom qubits closer together than ever before, and then -- in real time -- controllably observing and measuring their spin states.

The team's unique approach to quantum computing requires not only the placement of individual atom qubits in silicon but all the associated circuitry to initialise, control and read-out the qubits at the nanoscale -- a concept that requires such exquisite precision it was long thought to be impossible. But with this major milestone, the team is now positioned to translate their technology into scalable processors.

Professor Simmons, Director of the Centre of Excellence for Quantum Computation and Communication Technology (CQC2T) and founder of Silicon Quantum Computing Pty Ltd., says the past decade of previous results perfectly set the team up to shift the boundaries of what's thought to be "humanly possible."

"Atom qubits hold the world record for the longest coherence times of a qubit in silicon with the highest fidelities," she says. "Using our unique fabrication technologies, we have already demonstrated the ability to read and initialise single electron spins on atom qubits in silicon with very high accuracy. We've also demonstrated that our atomic-scale circuitry has the lowest electrical noise of any system yet devised to connect to a semiconductor qubit.

"Optimising every aspect of the device design with atomic precision has now allowed us to build a really fast, highly accurate two-qubit gate, which is the fundamental building block of a scalable, silicon-based quantum computer.

"We've really shown that it is possible to control the world at the atomic scale -- and that the benefits of the approach are transformational, including the remarkable speed at which our system operates."

UNSW Science Dean, Professor Emma Johnston AO, says this key paper further shows just how ground-breaking Professor Simmons' research is.

"This was one of Michelle's team's final milestones to demonstrate that they can actually make a quantum computer using atom qubits. Their next major goal is building a 10-qubit quantum integrated circuit -- and we hope they reach that within 3-4 years."

Getting up and close with qubits -- engineering with a precision of just thousand-millionths of a metre

Using a scanning tunnelling microscope to precision-place and encapsulate phosphorus atoms in silicon, the team first had to work out the optimal distance between two qubits to enable the crucial operation.

"Our fabrication technique allows us to place the qubits exactly where we want them. This allows us to engineer our two-qubit gate to be as fast as possible," says study lead co-author Sam Gorman from CQC2T.

"Not only have we brought the qubits closer together since our last breakthrough, but we have learnt to control every aspect of the device design with sub-nanometer precision to maintain the high fidelities."

Observing and controlling qubit interactions in real-time

The team was then able to measure how the qubits' states evolved in real time. And, most excitingly, the researchers showed how to control the interaction strength between two electrons on the nanosecond timescale.

"Importantly, we were able to bring the qubit's electrons closer or further apart, effectively turning on and off the interaction between them, a prerequisite for a quantum gate," says other lead co-author Yu He.

"The tight confinement of the qubit's electrons, unique to our approach, and the inherently low noise in our system enabled us to demonstrate the fastest two qubit gate in silicon to date."

"The quantum gate we demonstrated, the so-called SWAP gate, is also ideally suited to shuttle quantum information between qubits -- and, when combined with a single qubit gate, allows you to run any quantum algorithm."

A thing of physical impossibility? Not anymore

Professor Simmons says that this is the culmination of two decades' worth of work.

"This is a massive advance: to be able to control nature at its very smallest level so that we can create interactions between two atoms but also individually talk to each one without disturbing the other is incredible. A lot of people thought this would not be possible," she says.

"The promise has always been that if we could control the qubit world at this scale, they would be fast, and they sure are!"

What are qubits?


In Professor Michelle Simmons' approach, quantum bits (or qubits) are made from electrons hosted on phosphorus atoms in silicon. Creating qubits by precisely positioning and encapsulating individual phosphorus atoms within a silicon chip is a unique Australian approach that Professor Simmons' team has been leading globally. These types of qubits are a promising platform for large-scale quantum computers, thanks to their long-lasting stability.

Read more at Science Daily

Jul 17, 2019

Supernova observation first of its kind using NASA satellite

Supernova illustration.
When NASA's Transiting Exoplanet Survey Satellite launched into space in April 2018, it did so with a specific goal: to search the universe for new planets.

But in recently published research, a team of astronomers at The Ohio State University showed that the survey, nicknamed TESS, could also be used to monitor a particular type of supernova, giving scientists more clues about what causes white dwarf stars to explode -- and about the elements those explosions leave behind.

"We have known for years that these stars explode, but we have terrible ideas of why they explode," said Patrick Vallely, lead author of the study and an Ohio State astronomy graduate student. "The big thing here is that we are able to show that this supernova isn't consistent with having a white dwarf (take mass) directly from a standard star companion and explode into it -- the kind of standard idea that had led to people trying to find hydrogen signatures in the first place. That is, because the TESS light curve doesn't show any evidence of the explosion slamming into the surface of a companion, and because the hydrogen signatures in the SALT spectra don't evolve like the other elements, we can rule out that standard model."

Their research, detailed in the Monthly Notices of the Royal Astronomical Society, represents the first published findings about a supernova observed using TESS, and adds new insights to long-held theories about the elements left behind after a white dwarf star explodes into a supernova.

Those elements have long troubled astronomers.

A white dwarf explodes into a specific type of supernova, a Type Ia, after gathering mass from a nearby companion star and growing too big to remain stable, astronomers believe. But if that is true, then the explosion should, astronomers have theorized, leave behind trace elements of hydrogen, a crucial building block of stars and the entire universe. (White dwarf stars, by their nature, have already burned through their own hydrogen and so would not be a source of hydrogen in a supernova.)

But until this TESS-based observation of a supernova, astronomers had never seen those hydrogen traces in the explosion's aftermath: This supernova is the first of its type in which astronomers have measured hydrogen. That hydrogen, first reported by a team from the Observatories of the Carnegie Institution for Science, could change the nature of what astronomers know about white dwarf supernovae.

"The most interesting thing about this particular supernova is the hydrogen we saw in its spectra (the elements the explosion leaves behind)," Vallely said. "We've been looking for hydrogen and helium in the spectra of this type of supernova for years -- those elements help us understand what caused the supernova in the first place."

The hydrogen could mean that the white dwarf consumed a nearby star. In that scenario, the second star would be a normal star in the middle of its lifespan -- not a second white dwarf. But when astronomers measured the light curve from this supernova, the curve indicated that the second star was in fact a second white dwarf. So where did the hydrogen come from?

Professor of Astronomy Kris Stanek, Vallely's adviser at Ohio State and a co-author on this paper, said it is possible that the hydrogen came from a companion star -- a standard, regular star -- but he thinks it is more likely that the hydrogen came from a third star that happened to be near the exploding white dwarf and was consumed in the supernova by chance.

"We would think that because we see this hydrogen, it means that the white dwarf consumed a second star and exploded, but based on the light curve we saw from this supernova, that might not be true," Stanek said.

"Based on the light curve, the most likely thing that happened, we think, is that the hydrogen might be coming from a third star in the system," Stanek added. "So the prevailing scenario, at least at Ohio State right now, is that the way to make a Type Ia (pronounced 1-A) supernova is by having two white dwarf stars interacting -- colliding even. But also having a third star that provides the hydrogen."

For the Ohio State research, Vallely, Stanek and a team of astronomers from around the world combined data from TESS, a 10-centimeter-diameter telescope, with data from the All-Sky Automated Survey for Supernovae (ASAS-SN for short.) ASAS-SN is led by Ohio State and is made up of small telescopes around the world watching the sky for supernovae in far-away galaxies.

TESS, by comparison, is designed to search the skies for planets orbiting nearby stars in our galaxy -- and to provide data much more quickly than previous satellite telescopes. That means that the Ohio State team was able to use data from TESS to see what was happening around the supernova in the first moments after it exploded -- an unprecedented opportunity.

The team combined data from TESS and ASAS-SN with data from the South African Large Telescope to evaluate the elements left behind in the supernova's wake. They found both hydrogen and helium there, two indicators that the exploding star had somehow consumed a nearby companion star.

"What is really cool about these results is, when we combine the data, we can learn new things," Stanek said. "And this supernova is the first exciting case of that synergy."

The supernova this team observed was a Type Ia, a type of supernova that can occur when two stars orbit one another -- what astronomers call a binary system. In some cases of a Type I supernova, one of those stars is a white dwarf.

A white dwarf has burned off all its nuclear fuel, leaving behind only a very hot core. (White dwarf temperatures exceed 100,000 degrees Kelvin -- nearly 200,000 degrees Fahrenheit.) Unless the star grows bigger by stealing bits of energy and matter from a nearby star, the white dwarf spends the next billion years cooling down before turning into a lump of black carbon.

But if the white dwarf and another star are in a binary system, the white dwarf slowly takes mass from the other star until, eventually, the white dwarf explodes into a supernova.

Type Ia supernovae are important for space science -- they help astronomers measure distance in space, and help them calculate how quickly the universe is expanding (a discovery so important that it won the Nobel Prize in Physics in 2011).

"These are the most famous type of supernova -- they led to dark energy being discovered in the 1990s," Vallely said. "They are responsible for the existence of so many elements in the universe. But we don't really understand the physics behind them that well. And that's what I really like about combining TESS and ASAS-SN here, that we can build up this data and use it to figure out a little more about these supernovae."

Scientists broadly agree that the companion star leads to a white dwarf supernova, but the mechanism of that explosion, and the makeup of the companion star, are less clear.

This finding, Stanek said, provides some evidence that the companion star in this type of supernova is likely another white dwarf.

Read more at Science Daily

Out of Africa and into an archaic human melting pot

Earth illustration, centered on Africa.
Genetic analysis has revealed that the ancestors of modern humans interbred with at least five different archaic human groups as they moved out of Africa and across Eurasia.

While two of the archaic groups are currently known -- the Neandertals and their sister group the Denisovans from Asia -- the others remain unnamed and have only been detected as traces of DNA surviving in different modern populations. Island Southeast Asia appears to have been a particular hotbed of diversity.

Published in the Proceedings of the National Academy of Sciences (PNAS), researchers from the University of Adelaide's Australian Centre for Ancient DNA (ACAD) have mapped the location of past "mixing events" (analysed from existing scientific literature) by contrasting the levels of archaic ancestry in the genomes of present-day populations around the world.

"Each of us carry within ourselves the genetic traces of these past mixing events," says first author Dr João Teixeira, Australian Research Council Research Associate, ACAD, at the University of Adelaide. "These archaic groups were widespread and genetically diverse, and they survive in each of us. Their story is an integral part of how we came to be.

"For example, all present-day populations show about 2% of Neandertal ancestry which means that Neandertal mixing with the ancestors of modern humans occurred soon after they left Africa, probably around 50,000 to 55,000 years ago somewhere in the Middle East."

But as the ancestors of modern humans travelled further east they met and mixed with at least four other groups of archaic humans.

"Island Southeast Asia was already a crowded place when what we call modern humans first reached the region just before 50,000 years ago," says Dr Teixeira. "At least three other archaic human groups appear to have occupied the area, and the ancestors of modern humans mixed with them before the archaic humans became extinct."

Using additional information from reconstructed migration routes and fossil vegetation records, the researchers have proposed there was a mixing event in the vicinity of southern Asia between the modern humans and a group they have named "Extinct Hominin 1."

Other interbreeding occurred with groups in East Asia, in the Philippines, the Sunda shelf (the continental shelf that used to connect Java, Borneo and Sumatra to mainland East Asia), and possibly near Flores in Indonesia, with another group they have named "Extinct Hominin 2."

"We knew the story out of Africa wasn't a simple one, but it seems to be far more complex than we have contemplated," says Dr Teixeira. "The Island Southeast Asia region was clearly occupied by several archaic human groups, probably living in relative isolation from each other for hundreds of thousands of years before the ancestors of modern humans arrived.

Read more at Science Daily

Vast majority of dietary supplements don't improve heart health or put off death, study finds

Dietary supplements.
In a massive new analysis of findings from 277 clinical trials using 24 different interventions, Johns Hopkins Medicine researchers say they have found that almost all vitamin, mineral and other nutrient supplements or diets cannot be linked to longer life or protection from heart disease.

Although they found that most of the supplements or diets were not associated with any harm, the analysis showed possible health benefits only from a low-salt diet, omega-3 fatty acid supplements and possibly folic acid supplements for some people. Researchers also found that supplements combining calcium and vitamin D may in fact be linked to a slightly increased stroke risk.

Results of the analysis were published on July 8 in Annals of Internal Medicine.

Surveys by the Centers for Disease Control and Prevention show that 52% of Americans take at least one vitamin or other dietary/nutritional supplement daily. As a nation, Americans spend $31 billion each year on such over-the-counter products. An increasing number of studies -- including this new one from Johns Hopkins -- have failed to prove health benefits from most of them.

"The panacea or magic bullet that people keep searching for in dietary supplements isn't there," says senior author of the study Erin D. Michos, M.D., M.H.S., associate director of preventive cardiology at the Ciccarone Center for the Prevention of Cardiovascular Disease and associate professor of medicine at the Johns Hopkins University School of Medicine. "People should focus on getting their nutrients from a heart-healthy diet, because the data increasingly show that the majority of healthy adults don't need to take supplements."

For the current study, the researchers used data from 277 randomized clinical trials that evaluated 16 vitamins or other supplements and eight diets for their association with mortality or heart conditions including coronary heart disease, stroke, and heart attack. Altogether, the trials included data gathered on 992,129 research participants worldwide.

The vitamin and other supplements reviewed included: antioxidants, β-carotene, vitamin B-complex, multivitamins, selenium, vitamin A, vitamin B3/niacin, vitamin B6, vitamin C, vitamin E, vitamin D alone, calcium alone, calcium and vitamin D together, folic acid, iron and omega-3 fatty acid (fish oil). The diets reviewed were a Mediterranean diet, a reduced saturated fat (less fats from meat and dairy) diet, modified dietary fat intake (less saturated fat or replacing calories with more unsaturated fats or carbohydrates), a reduced fat diet, a reduced salt diet in healthy people and those with high blood pressure, an increased alpha linolenic acid (ALA) diet (nuts, seeds and vegetable oils), and an increased omega-6 fatty acid diet (nuts, seeds and vegetable oils). The evidence for each intervention's impact was also ranked as high, moderate, low or very low.

The majority of the supplements, including multivitamins, selenium, vitamin A, vitamin B6, vitamin C, vitamin E, vitamin D alone, calcium alone and iron, showed no link to an increased or decreased risk of death or to heart health.

In the three studies of 3,518 people that looked at a low-salt diet in people with healthy blood pressure, there were 79 deaths. The researchers found a 10% decrease in the risk of death in these people, an association they classified as backed by moderate-strength evidence.

In the five studies in which 3,680 participants with high blood pressure were put on a low-salt diet, the risk of death due to heart disease decreased by 33%; there were 674 heart disease deaths during the study periods. The researchers also classified this intervention as having moderate evidence of an impact.

Forty-one studies with 134,034 participants evaluated the possible impact of omega-3 fatty acid supplements. In this group, 10,707 people had events such as a heart attack or stroke indicating heart disease. Overall, these studies suggested that supplement use was linked to an 8 percent reduction in heart attack risk and a 7 percent reduction in coronary heart disease compared to those not on the supplements. The researchers ranked evidence for a beneficial link to this intervention as low.

Based on 25 studies in 25,580 healthy people, data also showed that folic acid was linked to a 20 percent reduced risk of stroke. Some 877 participants had strokes during the trials. The authors graded evidence for a link to beneficial effects as low.

The authors point out that the studies suggesting the greatest impact of folic acid supplementation on reducing stroke risk took place in China, where cereals and grains aren't fortified with folic acid like they are in the U.S. Thus, they say, this apparent protective effect may not be applicable in regions where most people get enough folic acid in their diet.

Twenty studies evaluated the combination of calcium with vitamin D in a supplement. Of the 42,072 research participants, 3,690 had strokes during the trials, and taken together the researchers say this suggests a 17% increased risk for stroke. The risk evidence was ranked as moderate. There was no evidence that calcium or vitamin D taken alone had any health risks or benefits.
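
For readers unfamiliar with how figures like "a 10% decrease in the risk of death" or "a 17% increased risk for stroke" are derived, the sketch below shows the basic relative-risk arithmetic; the event counts here are hypothetical placeholders, not the trial data summarized above.

    # Illustrative relative-risk calculation (hypothetical counts, not study data).
    def relative_risk(events_treated, n_treated, events_control, n_control):
        risk_treated = events_treated / n_treated
        risk_control = events_control / n_control
        return risk_treated / risk_control

    # Example: 36 deaths among 1,760 people on the intervention vs.
    # 43 deaths among 1,758 controls (made-up numbers).
    rr = relative_risk(36, 1760, 43, 1758)
    print(f"relative risk = {rr:.2f}, i.e. a {100 * (1 - rr):.0f}% lower risk")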

"Our analysis carries a simple message that although there may be some evidence that a few interventions have an impact on death and cardiovascular health, the vast majority of multivitamins, minerals and different types of diets had no measurable effect on survival or cardiovascular disease risk reduction," says lead author Safi U. Khan, M.D., an assistant professor of Medicine at West Virginia University.

Read more at Science Daily

Joshua trees facing extinction

Joshua Tree National Park, California.
They outlived mammoths and saber-toothed tigers. But without dramatic action to reduce climate change, new research shows Joshua trees won't survive much past this century.

UC Riverside scientists wanted to verify earlier studies predicting global warming's deadly effect on the namesake trees that millions flock to see every year in Joshua Tree National Park. They also wanted to learn whether the trees are already in trouble.

Using multiple methods, the study arrived at several possible outcomes. In the best-case scenario, major efforts to reduce heat-trapping gasses in the atmosphere would save 19 percent of the tree habitat after the year 2070. In the worst case, with no reduction in carbon emissions, the park would retain a mere 0.02 percent of its Joshua tree habitat.

The team's findings were published recently in Ecosphere. Project lead Lynn Sweet, a UCR plant ecologist, said she hopes the study inspires people to take protective environmental action. "The fate of these unusual, amazing trees is in all of our hands," she said. "Their numbers will decline, but how much depends on us."

To answer their questions about whether climate change is already having an effect, a large group of volunteers helped the team gather data about more than 4,000 trees.

They found that Joshua trees have been migrating to higher elevation parts of the park with cooler weather and more moisture in the ground. In hotter, drier areas, the adult trees aren't producing as many younger plants, and the ones they do produce aren't surviving.

Joshua trees as a species have existed since the Pleistocene era, about 2.5 million years ago, and individual trees can live up to 300 years. One of the ways adult trees survive so long is by storing large reserves of water to weather droughts.

Younger trees and seedlings aren't capable of holding reserves in this way though, and the most recent, 376-week-long drought in California left the ground in some places without enough water to support new young plants. As the climate changes, long periods of drought are likely to occur with more frequency, leading to issues with the trees like those already observed.

An additional finding of this study is that in the cooler, wetter parts of the park the biggest threat other than climate change is fire. Fewer than 10 percent of Joshua trees survive wildfires, which have been exacerbated in recent years by smog from car and industrial exhaust. The smog deposits nitrogen on the ground, which in turn feeds non-native grasses that act as kindling for wildfires.

As a partner on this project, the U.S. Park Service is using this information to mitigate fire risk by removing the invasive plants.

"Fires are just as much a threat to the trees as climate change, and removing grasses is a way park rangers are helping to protect the area today," Sweet said. "By protecting the trees, they're protecting a host of other native insects and animals that depend on them as well."

UCR animal ecologist and paper co-author Cameron Barrows conducted a similar research project in 2012, which also found Joshua tree populations would decline, based on models assuming a temperature rise of three degrees. However, this newer study considered a climate change scenario using twice as many variables, including soil-water estimates, rainfall, soil types, and more. In addition, Barrows said on-the-ground observations were essential to verifying the climate models this newer team had constructed.

Quoting the statistician George Box, Barrows said, "All models are wrong, but some are useful." Barrows went on to say, "Here, the data we collected outdoors showed us where our models gave us the most informative glimpse into the future of the park."

For this study, the UC Riverside Center for Conservation Biology partnered with Earthwatch Institute to recruit the volunteer scientists. Barrows and Sweet both recommend joining such organizations as a way to help find solutions to the park's problems.

Read more at Science Daily

Jul 16, 2019

Gaia starts mapping the galactic bar in the Milky Way

The second release of data from the Gaia star-mapping satellite, published in 2018, has been revolutionising many fields of astronomy. The unprecedented catalogue contains the brightness, positions, distance indicators and motions across the sky for more than one billion stars in our Milky Way galaxy, along with information about other celestial bodies.

This is just the beginning. While the second release is based on the first twenty-two months of Gaia's surveys, the satellite has been scanning the sky for five years, and will keep doing so at least until 2022. New data releases planned in coming years will steadily improve measurements as well as provide extra information that will enable us to chart our home galaxy and delve into its history like never before.

Meanwhile, a team of astronomers have combined the latest Gaia data with infrared and optical observations performed from ground and space to provide a preview of what future releases of ESA's stellar surveyor will reveal.

"We looked in particular at two of the stellar parameters contained in the Gaia data: the surface temperature of stars and the 'extinction', which is basically a measure of how much dust there is between us and the stars, obscuring their light and making it appear redder," says Friedrich Anders ICCUB member and lead author of the new study.

"These two parameters are interconnected, but we can estimate them independently by adding extra information obtained by peering through the dust with infrared observations," continues the expert.

The team combined the second Gaia data release with several infrared surveys using a computer code called StarHorse, developed by co-author Anna Queiroz and other collaborators. The code compares the observations with stellar models to determine the surface temperature of stars, the extinction and an improved estimate of the distance to the stars.
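
The sketch below is not the StarHorse code itself; it only illustrates, with made-up numbers, the kind of comparison described here: matching observed magnitudes against a grid of stellar models to pick a best-fitting temperature, extinction and distance.

    # Toy model-grid fit (illustrative only; StarHorse is a far more sophisticated
    # Bayesian code with many more inputs).
    import numpy as np

    # Hypothetical grid: each row is (T_eff [K], extinction A_V [mag], distance [pc])
    # plus the apparent magnitudes that model predicts in two bands.
    model_grid = np.array([
        # T_eff,  A_V,  dist,   G_mag, K_mag
        [5800.0,  0.1,  1000.0, 14.2,  12.6],
        [4500.0,  0.8,  2500.0, 16.9,  13.8],
        [6300.0,  0.3,  1800.0, 15.1,  13.5],
    ])

    observed = np.array([15.0, 13.4])  # observed magnitudes in the two bands (made up)
    sigma = np.array([0.05, 0.08])     # assumed photometric uncertainties

    # Chi-square comparison of each model's predicted magnitudes with the observations.
    chi2 = np.sum(((model_grid[:, 3:] - observed) / sigma) ** 2, axis=1)
    best = model_grid[np.argmin(chi2)]
    print(f"best fit: T_eff = {best[0]:.0f} K, A_V = {best[1]:.1f} mag, d = {best[2]:.0f} pc")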

As a result, the astronomers obtained a much better determination of the distances to about 150 million stars -- in some cases, the improvement is 20% or more. This enabled them to trace the distribution of stars across the Milky Way to much greater distances than was possible with the original Gaia data alone.

"With the second Gaia data release, we could probe a radius around the Sun of about 6500 light years, but with our new catalogue, we can extend this 'Gaia sphere' by three or four times, reaching out to the centre of the Milky Way," explains co-author Cristina Chiappini from Leibniz Institute for Astrophysics Potsdam, Germany, where the project was coordinated. At the centre of our galaxy, the data clearly reveals a large, elongated feature in the three-dimensional distribution of stars: the galactic bar.

"We know the Milky Way has a bar, like other barred spiral galaxies, but so far we only had indirect indications from the motions of stars and gas, or from star counts in infrared surveys. This is the first time that we see the galactic bar in three-dimensional space, based on geometric measurements of stellar distances," says Friedrich Anders.

"Ultimately, we are interested in galactic archaeology: we want to reconstruct how the Milky Way formed and evolved, and to do so we have to understand the history of each and every one of its components," adds Cristina Chiappini.

"It is still unclear how the bar -- a large amount of stars and gas rotating rigidly around the centre of the galaxy -- formed, but with Gaia and other upcoming surveys in the next years we are certainly on the right path to figure it out," notes the researcher.

The team is looking forward to the next data release from the Apache Point Observatory Galaxy Evolution Experiment (APOGEE-2), as well as upcoming facilities such as the 4-metre Multi-Object Survey Telescope (4MOST) at the European Southern Observatory in Chile and the WEAVE (WHT Enhanced Area Velocity Explorer) survey at the William Herschel Telescope (WHT) in La Palma (Canary Islands).

The third Gaia data release, currently planned for 2021, will include greatly improved distance determinations for a much larger number of stars, and is expected to enable progress in our understanding of the complex region at the centre of the Milky Way.

"With this study, we can enjoy a taster of the improvements in our knowledge of the Milky Way that can be expected from Gaia measurements in the third data release," explains co-author Anthony Brown of Leiden University (the Netherlands).

Read more at Science Daily

Breakthrough material could lead to cheaper, more widespread solar panels and electronics

Imagine printing electronic devices using a simple inkjet printer -- or even painting a solar panel onto the wall of a building.

Such technology would slash the cost of manufacturing electronic devices and enable new ways to integrate them into our everyday lives. Over the last two decades, a type of material called organic semiconductors, made out of molecules or polymers, has been developed for such purposes. But some properties of these materials pose a major hurdle that limits their widespread use.

"In these materials, an electron is usually bound to its counterpart, a missing electron known as 'hole,' and can't move freely," said Wai-Lun Chan, associate professor of physics & astronomy at the University of Kansas. "So-called 'free electrons,' which wander freely in the material and conduct electricity, are rare and can't be generated readily by light absorption. This impedes the use of these organic materials in applications like solar panels because panels built with these materials often have poor performance."

Because of this problem, Chan said "freeing the electrons" has been a focus in developing organic semiconductors for solar cells, light sensors and many other optoelectronic applications.

Now, two physics research groups at KU, led by Chan and Hui Zhao, professor of physics & astronomy, have effectively generated free electrons from organic semiconductors when combined with a single atomic layer of molybdenum disulfide (MoS2), a recently discovered two-dimensional (2D) semiconductor.

The introduced 2D layer allows the electrons to escape from "holes" and move freely. The findings have just been published in the Journal of the American Chemical Society, a leading journal in chemistry and interfacing areas of science.

Over the last few years, many researchers have been investigating how free charges can be generated effectively from hybrid organic-2D interfaces.

"One of the prevailing assumptions is free electrons can be generated from the interface as long as electrons can be transferred from one material to another in a relatively short period of time -- less than one-trillionth of a second," Chan said. "However, my graduate students Tika Kafle and Bhupal Kattel and I have found the presence of the ultrafast electron transfer in itself is not sufficient to guarantee the generation of free electrons from the light absorption. That's because the 'holes' can prevent the electrons from moving away from the interface. Whether the electron can be free from this binding force depends on the local energy landscape near the interface."

Chan said the energy landscape of the electrons could be seen as a topographic map of a mountain.

"A hiker chooses his path based on the height contour map," he said. "Similarly, the motion of the electron at the interface between the two materials is controlled by the electron energy landscape near the interface."

Chan and Zhao's findings will help develop general principles of how to design the "landscape" to free the electrons in such hybrid materials.

The discovery was made by combining two highly complementary experimental tools based on ultrafast lasers, time-resolved photoemission spectroscopy in Chan's lab and transient optical absorption in Zhao's lab. Both experimental setups are located in the basement of the Integrated Science Building.

In the time-resolved photoemission spectroscopy experiment, Kafle used an ultrashort laser pulse that only exists for 10 quadrillionths (10⁻¹⁴) of a second to trigger the motion of electrons. The advantage of using such a short pulse is that the researcher knows precisely the starting time of the electron's journey. Kafle then used another ultrashort laser pulse to hit the sample again at an accurately controlled time relative to the first pulse. This second pulse is energetic enough to kick out these electrons from the sample. By measuring the energy of these electrons (now in a vacuum) and using the principle of energy conservation, the researchers were able to figure out the energy of the electrons before they were kicked out, and thus reveal the journey of these electrons since they were hit by the first pulse. This technique resolved the energy of the excited electrons as they moved across the interface after the light absorption. Because only electrons near the front surface of the sample can be released by the second pulse, the position of the electron relative to the interface is also revealed with atomic precision.
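
A minimal sketch of the energy-conservation bookkeeping described above, as it is commonly done in time-resolved photoemission; the photon energy and work function used here are placeholder values, not the actual experimental parameters.

    # Recover the electron's energy before emission from the measured kinetic energy
    # (placeholder numbers; not the actual experimental parameters).
    probe_photon_eV = 6.2       # assumed energy of the second ("kick-out") photon
    work_function_eV = 4.5      # assumed work function of the sample surface
    measured_kinetic_eV = 2.5   # kinetic energy measured once the electron is in vacuum

    # Energy conservation: E_kinetic = E_initial + photon energy - work function,
    # so the electron's energy above the Fermi level before emission is:
    energy_above_fermi_eV = measured_kinetic_eV + work_function_eV - probe_photon_eV
    print(f"electron sat about {energy_above_fermi_eV:.1f} eV above the Fermi level")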

In the transient optical absorption measurements, Peng Yao (a visiting student) and KU graduate Peymon Zereshki, both supervised by Zhao, also used a two-pulse technique, with the first pulse initiating the electron motion in the same way. However, in their measurements, the second pulse does the trick of monitoring electrons by detecting the fraction of the second pulse that is reflected from the sample, instead of kicking out the electrons.

Read more at Science Daily

Maternal secrets of our earliest ancestors unlocked

Extended parental care is considered one of the hallmarks of human evolution. A stunning new research result published today in Nature reveals for the first time the parenting habits of one of our earliest extinct ancestors.

Analysis of more than two-million-year-old teeth from Australopithecus africanus fossils found in South Africa has revealed that infants were breastfed continuously from birth to about one year of age. Nursing appears to have continued in a cyclical pattern in the infants' early years; seasonal changes and food shortages led mothers to supplement gathered foods with breastmilk. An international research team led by Dr Renaud Joannes-Boyau of Southern Cross University, and by Dr Luca Fiorenza and Dr Justin W. Adams from Monash University, published the details of their research into the species in Nature today.

"For the first time, we gained new insight into the way our ancestors raised their young, and how mothers had to supplement solid food intake with breastmilk when resources were scarce," said geochemist Dr Joannes-Boyau from the Geoarchaeology and Archaeometry Research Group (GARG) at Southern Cross University.

"These finds suggest for the first time the existence of a long-lasting mother-infant bond in Australopithecus. This makes us to rethink on the social organisations among our earliest ancestors," said Dr Fiorenza, who is an expert in the evolution of human diet at the Monash Biomedicine Discovery Institute (BDI).

"Fundamentally, our discovery of a reliance by Australopithecus africanus mothers to provide nutritional supplementation for their offspring and use of fallback resources highlights the survival challenges that populations of early human ancestors faced in the past environments of South Africa," said Dr Adams, an expert in hominin palaeoecology and South African sites at the Monash BDI.

For decades there has been speculation about how early ancestors raised their offspring. With this study, the research team has opened a new window into our enigmatic evolutionary history.

Australopithecus africanus lived from about two to three million years ago during a period of major climatic and ecological change in South Africa, and the species was characterised by a combination of human-like and retained ape-like traits. While the first fossils of Australopithecus were found almost a century ago, scientists have only now been able to unlock the secrets of how they raised their young, using specialised laser sampling techniques to vaporise microscopic portions on the surface of the tooth. The gas containing the sample is then analysed for chemical signatures with a mass spectrometer, enabling researchers to develop microscopic geochemical maps which can tell the story of the diet and health of an individual over time. Dr Joannes-Boyau conducted the analyses at the Geoarchaeology and Archaeometry Research Group at Southern Cross University in Lismore, NSW, and at the Icahn School of Medicine at Mount Sinai in New York.

Teeth grow similarly to trees; they form by adding layer after layer of enamel and dentine tissues every day. Thus, teeth are particularly valuable for reconstructing the biological events occurring during the early period of life of an individual, simply because they preserve precise temporal changes and chemical records of key elements incorporated in the food we eat.

By developing micro geochemical maps, we are able to 'read' successive bands of daily signal in teeth, which provide insights into food consumption and stages of life. Previously the team had revealed the nursing behaviour of our closest evolutionary relatives, the Neanderthals. With this latest study, the international team has analysed teeth that are more than ten times older than those of Neanderthals.

"We can tell from the repetitive bands that appear as the tooth developed that the fall back food was high in lithium, which is believed to be a mechanism to reduce protein deficiency in infants more prone to adverse effect during growth periods," Dr Joannes-Boyau said.

"This likely reduced the potential number of offspring, because of the length of time infants relied on a supply of breastmilk. The strong bond between mothers and offspring for a number of years has implications for group dynamics, the social structure of the species, relationships between mother and infant and the priority that had to be placed on maintaining access to reliable food supplies," he said.

"This finding underscores the diversity, variability and flexibility in habitats and adaptive strategies these australopiths used to obtain food, avoid predators, and raise their offspring," Dr Adams emphasised.

"This is the first direct proof of maternal roles of one of our earliest ancestors and contributes to our understanding of the history of family dynamics and childhood," concluded Dr Fiorenza.

Read more at Science Daily

Australian bee sting vaccine trial holds promise against allergic reactions

Most people have probably been stung by a bee, and while it can be painful, it's especially dangerous for the many people who are at risk of suffering a life-threatening allergic reaction.

Australian researchers have successfully completed a human trial on a vaccine designed to eliminate the risk of a severe allergic reaction to European honeybee stings.

The clinical trial at Flinders University and the Royal Adelaide Hospital included 27 adults with a history of allergic reactions to bee stings.

The vaccine used in the trial contained a unique sugar-based ingredient called an adjuvant, developed in Australia, which is designed to help the body neutralise the bee venom at a faster rate.

Professor Nikolai Petrovsky says the adjuvant used to enhance the bee sting vaccines has now been successfully given to over a thousand individuals across a range of different vaccines, including in the current bee sting allergy trial.

"Our technology is like adding a turbocharger to a car and in this case makes the bee allergy vaccine much more powerful, allowing the immune system to better neutralise the bee venom and prevent allergic symptoms," says Professor Petrovsky.

Associate Professor Robert Heddle, lead investigator in the trial, says the aim was to see if the Advax adjuvant would safely speed up and improve bee sting immunotherapy.

"The results of the study were very promising and confirmed the safety of this approach to improving bee sting immunotherapy."

Dr Anthony Smith, an investigator in the trial, says while a commercial bee venom therapy is already available, it requires patients to have more than 50 injections over a three-year period to build up their immune system.

"The current treatment option for serious bee venom allergies is lengthy and cumbersome, so I hope this enhanced bee venom therapy brings about faster, but longer lasting protection to bee stings for allergic individuals."

The Advax adjuvant which enhances the bee sting vaccines was developed in Adelaide by Vaxine Pty Ltd and has also been used to develop vaccines for seasonal and pandemic influenza, hepatitis, malaria, Alzheimer's disease, cancer and other diseases.

From Science Daily

Jul 15, 2019

Healthy lifestyle may offset genetic risk of dementia

Living a healthy lifestyle may help offset a person's genetic risk of dementia, according to new research.

The study, led by the University of Exeter, was simultaneously published today in JAMA and presented at the Alzheimer's Association International Conference 2019 in Los Angeles. The research found that the risk of dementia was 32 per cent lower in people with a high genetic risk if they had followed a healthy lifestyle, compared to those who had an unhealthy lifestyle.

Participants with high genetic risk and an unfavourable lifestyle were almost three times more likely to develop dementia compared to those with a low genetic risk and favourable lifestyle.

Joint lead author Dr Elżbieta Kuźma, at the University of Exeter Medical School, said: "This is the first study to analyse the extent to which you may offset your genetic risk of dementia by living a healthy lifestyle. Our findings are exciting as they show that we can take action to try to offset our genetic risk for dementia. Sticking to a healthy lifestyle was associated with a reduced risk of dementia, regardless of the genetic risk."

The study analysed data from 196,383 adults of European ancestry aged 60 and older from UK Biobank. The researchers identified 1,769 cases of dementia over a follow-up period of eight years. The team grouped the participants into those with high, intermediate and low genetic risk for dementia.

To assess genetic risk, the researchers looked at previously published data and identified all known genetic risk factors for Alzheimer's disease. Each genetic risk factor was weighted according to the strength of its association with Alzheimer's disease.
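
The study's exact scoring is not reproduced here, but a weighted genetic risk score of this kind is usually built along the lines of the sketch below: each risk variant's allele count is multiplied by a weight reflecting the published strength of its association, and the products are summed. The variant names and weights are invented for illustration.

    # Illustrative weighted genetic risk score (invented variants and weights,
    # not the study's actual scoring).
    published_weights = {   # weight ~ strength of association (e.g. a log odds ratio)
        "variant_A": 1.20,
        "variant_B": 0.15,
        "variant_C": 0.08,
    }

    def genetic_risk_score(allele_counts):
        """allele_counts: number of risk alleles (0, 1 or 2) carried at each variant."""
        return sum(published_weights[v] * n for v, n in allele_counts.items())

    person = {"variant_A": 1, "variant_B": 2, "variant_C": 0}
    print(f"weighted genetic risk score: {genetic_risk_score(person):.2f}")  # 1.20 + 0.30 = 1.50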

To assess lifestyle, researchers grouped participants into favourable, intermediate and unfavourable categories based on their self-reported diet, physical activity, smoking and alcohol consumption. The researchers considered no current smoking, regular physical activity, healthy diet and moderate alcohol consumption as healthy behaviours. The team found that living a healthy lifestyle was associated with a reduced dementia risk across all genetic risk groups.

Joint lead author Dr David Llewellyn, from the University of Exeter Medical School and the Alan Turing Institute, said: "This research delivers a really important message that undermines a fatalistic view of dementia. Some people believe it's inevitable they'll develop dementia because of their genetics. However it appears that you may be able to substantially reduce your dementia risk by living a healthy lifestyle."

Read more at Science Daily

Warming climate intensifies summer drought in parts of US

Climate change is amplifying the intensity and likelihood of heatwaves during severe droughts in the southern plains and southwest United States, according to a new study by a University of Arkansas researcher.

Linyin Cheng, assistant professor of geosciences, used data from the National Center for Atmospheric Research's Community Earth System Model to study summer droughts that occurred both before and after the Industrial Revolution. Cheng and colleagues from the National Oceanic and Atmospheric Administration and universities in China and Colorado ran simulations to assess how, and by how much, human-induced climate change affects summer heatwaves in the contiguous United States. The study was published in the Journal of Climate.

The researchers found that in places with low moisture in the soil, such as the southern plains and southwest, higher temperatures brought about by climate change led to an increased "coupling" of land and atmosphere, which further increased the severity of heatwaves. In places with more moisture in the soil, such as the northeast, they found no appreciable coupling and therefore no contribution to heatwave intensification.

"Our analysis of climate simulation finds that summertime drought-heatwave relationships change significantly over the southern and southwest U.S. due to human-made climate change since the late 19th century," said Cheng. "By contrast, the drought-heatwave relationship over northern U.S regions undergoes little change in the warmed climate."

The findings raise the idea of a self-reinforcing climate loop: as a region's climate becomes more arid due to climate change, droughts become hotter, further reducing soil moisture.

"Overall, these results indicate that strengthened land-atmosphere feedback is a significant physical driver for increasing occurrences of drought-related extreme heatwaves, particularly over the semi-arid and arid regions of the United States," the report states.

From Science Daily

Quantum logic clock returns to top performance

The quantum logic clock -- perhaps best known for showing you age faster if you stand on a stool -- has climbed back to the leading performance echelons of the world's experimental atomic clocks.

Physicists at the National Institute of Standards and Technology (NIST) have been quietly upgrading their quantum logic clock design for the past eight years, mainly to reduce errors from unwanted motion of the single aluminum ion (electrically charged atom) that provides the clock "ticks."

As described in Physical Review Letters, the quantum logic clock's systematic uncertainty (how closely the clock represents the ion's natural vibrations, or frequency) is 9.5×10⁻¹⁹, the best of any clock worldwide. This means the logic clock would now neither gain nor lose one second in 33 billion years, which is about two-and-a-half times the estimated age of the universe.
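
The 33-billion-year figure follows from the quoted fractional uncertainty by simple arithmetic, roughly as follows:

    # Rough check: time for a clock with fractional uncertainty 9.5e-19 to be off by one second.
    fractional_uncertainty = 9.5e-19
    seconds_per_year = 365.25 * 24 * 3600          # about 3.16e7 seconds
    seconds_to_drift_one_second = 1 / fractional_uncertainty
    years = seconds_to_drift_one_second / seconds_per_year
    print(f"about {years / 1e9:.0f} billion years")  # roughly 33 billion years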

In this metric, it now outpaces both NIST clocks using neutral atoms trapped in lattices of laser beams, the ytterbium lattice clock and the strontium lattice clock.

"The logic clock's performance is not surprising to me," project leader David Leibrandt said. "Ion clocks are naturally better isolated from the environment -- which is the source of inaccuracy for atomic clocks -- than lattice clocks are. It's important to distinguish between precision and stability on this point. People expect that lattice clocks should perform the best in stability, and they currently do. Our newest quantum logic clock is the world leader in precision but not stability."

The logic clock's stability (how long it takes to measure the time) is 1.2×10⁻¹⁵ for a 1-second measurement, which is near the best achieved by a single ion clock but about 10 times worse than both NIST lattice clocks.

The quantum logic clock got its nickname because it borrows logical decision-making techniques from experimental quantum computing. Aluminum is an exceptionally stable source of clock ticks, vibrating between two energy levels over a million billion times per second, but its properties are not easily manipulated or detected with lasers. So, logic operations with a partner magnesium ion are used to cool the aluminum and to signal its ticks.

Back in 2010, NIST's quantum logic clock had the best performance of any experimental atomic clock. The clock also attracted attention for 2010 demonstrations of "time dilation" aspects of Einstein's theories of relativity: that time passes faster at higher elevations but more slowly when you move faster.

Since then, NIST's lattice clocks have been continually leapfrogging each other in performance, giving the impression of a race to identify a single winner. In fact, all the clocks are useful for research purposes and are possible contenders for future time standards or other applications.

The international definition of the second (in the International System of Units, or SI) has been based on the cesium atom since 1967, so cesium remains the "ruler" for official timekeeping. The logic clock is one contender for a future time standard to be selected by the international scientific community. NIST scientists are working on several different types of experimental clocks, each based on different atoms and offering its own advantages. All these experimental clocks are based on optical frequencies, which are higher than the microwave frequencies used in today's timekeeping standards based on cesium.

Several technical advances enabled the improved performance of the logic clock, including a new ion trap design that reduced heat-induced ion motion, enabling operation near the desirable ground state, or lowest motional energy level. In addition, a lower frequency was used to operate the ion trap, reducing unwanted ion motion caused by the electric field used to trap the ions. Finally, improved quantum control has reduced the uncertainty of measurements of frequency shifts due to ion motion.

The clock's precision was determined by measuring and adding up the frequency shifts caused by nine different effects. Stability was measured by comparison to NIST's ytterbium lattice clock.
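
The paper's full uncertainty budget is not reproduced here, but the bookkeeping described, measuring each systematic frequency shift and combining them into an overall correction and uncertainty, typically looks like the sketch below. The shift values are made up, and combining uncertainties in quadrature is the usual convention rather than a detail taken from this article.

    # Illustrative systematic-shift budget in fractional frequency units (made-up values).
    import math

    # Each entry: (shift, uncertainty) for one systematic effect.
    budget = {
        "excess micromotion":   (-2.0e-19, 3.0e-19),
        "blackbody radiation":  (-1.0e-19, 1.5e-19),
        "second-order Doppler": (-4.0e-19, 2.0e-19),
    }

    total_shift = sum(shift for shift, _ in budget.values())
    total_uncertainty = math.sqrt(sum(u ** 2 for _, u in budget.values()))
    print(f"total correction: {total_shift:.1e}; combined uncertainty: {total_uncertainty:.1e}")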

Read more at Science Daily