Nov 30, 2019

We love coffee, tea, chocolate and soft drinks so much that caffeine is literally in our blood

Scientists at Oregon State University may have proven how much people love coffee, tea, chocolate, soda and energy drinks as they validated their new method for studying how different drugs interact in the body.

In conducting mass spectrometry research, Richard van Breemen and Luying Chen worked with various biomedical suppliers to purchase 18 batches of supposedly pure human blood serum pooled from multiple donors. Biomedical suppliers get their blood from blood banks, which pass along inventory that's nearing its expiration date.

All 18 batches tested positive for caffeine. Also, in many of the samples the researchers found traces of cough medicine and an anti-anxiety drug. The findings point to the potential for contaminated blood transfusions, and also suggest that blood used in research isn't necessarily pure.

"From a 'contamination' standpoint, caffeine is not a big worry for patients, though it may be a commentary on current society," said Chen, a Ph.D. student. "But the other drugs being in there could be an issue for patients, as well as posing a problem for those of us doing this type of research because it's hard to get clean blood samples."

The study was published in the Journal of Pharmaceutical and Biomedical Analysis.

In addition to caffeine, the research also involved testing pooled serum for alprazolam, an anti-anxiety medicine sold under the trade name Xanax; dextromethorphan, an over-the-counter cough suppressant; and tolbutamide, a medicine used to treat type 2 diabetes.

All of the pooled serum was free of tolbutamide, but eight samples contained dextromethorphan and 13 contained alprazolam -- possibly meaning that if you ever need a blood transfusion, your odds of also receiving caffeine, cough medicine and an anti-anxiety drug are pretty good.

"The study leads you in that direction, though without doing a comprehensive survey of vendors and blood banks we can only speculate on how widespread the problem is," said van Breemen, the director of OSU's Linus Pauling Institute. "Another thing to consider is that we found drugs that we just happened to be looking for in doing the drug interaction assay validation -- how many others are in there too that we weren't looking for?"

The purpose of the study by Chen and van Breemen was to test a new method for evaluating the potential for interactions between botanical dietary supplements and drug metabolism.

The method involves rapid protein precipitation and ultra-high-pressure liquid chromatography and is being used to support clinical studies. In the clinical studies, participants take a drug cocktail along with a botanical supplement -- hops, licorice or red clover -- to see if the supplement causes any of the drugs to be metabolized differently than they otherwise would be.

"Botanicals basically contain natural products with drug-like activities," van Breemen said. "Just as a drug may alter the drug-metabolizing enzymes, so can natural products. It can become a real problem when someone takes a botanical supplement and is also on prescription drugs -- how do those two interact? It's not straightforward or necessarily predictable, thus the need for methods to look for these interactions. The odd thing in this case was finding all the tainted blood."

Read more at Science Daily

Ultrafast quantum simulations: A new twist to an old approach

Billions of tiny interactions occur between thousands of particles in every piece of matter in the blink of an eye. Simulating these interactions in their full dynamics was long considered out of reach, but has now been made possible by new work from researchers at Oxford and Warwick.

In doing so, they have paved the way for new insights into the complex mutual interactions between particles in extreme environments, such as those at the heart of large planets or in laser-driven nuclear fusion.

Researchers at the University of Warwick and the University of Oxford have developed a new way to simulate quantum systems of many particles that allows for the investigation of the dynamic properties of quantum systems fully coupled to slowly moving ions.

Effectively, they have made the simulation of the quantum electrons so fast that it can run for extremely long times without restrictions, making the effect of the electrons' motion on the movement of the slow ions visible.

Reported in the journal Science Advances, the work is based on a long-known alternative formulation of quantum mechanics (Bohm dynamics), which the scientists have now extended to make the dynamics of large quantum systems tractable.

Many quantum phenomena have been studied for single or just a few interacting particles, as large complex quantum systems overpower scientists' theoretical and computational capabilities to make predictions. This is complicated by the vast difference in the timescales on which the different particle species act: ions evolve thousands of times more slowly than electrons due to their larger mass. To overcome this problem, most methods involve decoupling electrons and ions and ignoring the dynamics of their interactions -- but this severely limits our knowledge of quantum dynamics.

To develop a method that allows scientists to account for the full electron-ion interactions, the researchers revived an old alternative formulation of quantum mechanics developed by David Bohm. In quantum mechanics, one needs to know the wave function of a particle. It turns out that describing it by a mean trajectory and a phase, as done by Bohm, is very advantageous. However, it took an additional suite of approximations and many tests to speed up the calculations as dramatically as required. Indeed, the new method demonstrated a speed-up of more than a factor of 10,000 (four orders of magnitude) while remaining consistent with previous calculations for static properties of quantum systems.
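
For context, the Bohm (or Madelung) formulation rewrites the wave function as an amplitude and a phase; in standard textbook notation (a sketch of the general idea, not the authors' specific approximation scheme):

```latex
\psi(\mathbf{r},t) = R(\mathbf{r},t)\, e^{iS(\mathbf{r},t)/\hbar},
\qquad \mathbf{v} = \frac{\nabla S}{m},
\qquad Q = -\frac{\hbar^2}{2m}\,\frac{\nabla^2 R}{R}
```

Particles follow trajectories with velocity v = ∇S/m, and all quantum effects enter through the "quantum potential" Q, which supplements the classical forces in an otherwise Newtonian equation of motion. Approximating Q efficiently is presumably where methods of this kind gain their speed.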

The new approach was then applied to a simulation of warm dense matter, a state between solids and hot plasmas that is known for its inherent coupling of all particle types and the need for a quantum description. In such systems, both the electrons and the ions can have excitations in the form of waves, and the two kinds of waves influence each other. Here, the new approach showed its strength: it determined the influence of the quantum electrons on the waves of the classical ions, while the static properties it produced were shown to agree with previous data.

Many-body quantum systems are at the core of many scientific problems, ranging from the complex biochemistry in our bodies to the behaviour of matter inside large planets, as well as technological challenges like high-temperature superconductivity and fusion energy -- a breadth that demonstrates the possible range of applications of the new approach.

Prof Gianluca Gregori (Oxford), who led the investigation, said: "Bohm quantum mechanics has often been treated with skepticism and controversy. In its original formulation, however, this is just a different reformulation of quantum mechanics. The advantage in employing this formalism is that different approximations become simpler to implement and this can increase the speed and accuracy of simulations involving many-body systems."

Dr Dirk Gericke from the University of Warwick, who assisted in the design of the new computer code, said: "With this huge increase of numerical efficiency, it is now possible to follow the full dynamics of fully interacting electron-ion systems. This new approach thus opens new classes of problems for efficient solutions, in particular, where either the system is evolving or where the quantum dynamics of the electrons has a significant effect on the heavier ions or the entire system."

Read more at Science Daily

Nov 29, 2019

Nine climate tipping points now 'active,' warn scientists

More than half of the climate tipping points identified a decade ago are now "active," a group of leading scientists have warned.

This threatens the loss of the Amazon rainforest and the great ice sheets of Antarctica and Greenland, which are currently undergoing measurable and unprecedented changes much earlier than expected.

This "cascade" of changes sparked by global warming could threaten the existence of human civilisations.

Evidence is mounting that these events are more likely and more interconnected than was previously thought, leading to a possible domino effect.

In an article in the journal Nature, the scientists call for urgent action to reduce greenhouse gas emissions to prevent key tipping points, warning of a worst-case scenario of a "hothouse," less habitable planet.

"A decade ago we identified a suite of potential tipping points in the Earth system, now we see evidence that over half of them have been activated," said lead author Professor Tim Lenton, director of the Global Systems Institute at the University of Exeter.

"The growing threat of rapid, irreversible changes means it is no longer responsible to wait and see. The situation is urgent and we need an emergency response."

Co-author Johan Rockström, director of the Potsdam Institute for Climate Impact Research, said: "It is not only human pressures on Earth that continue rising to unprecedented levels.

"It is also that as science advances, we must admit that we have underestimated the risks of unleashing irreversible changes, where the planet self-amplifies global warming.

"This is what we now start seeing, already at 1°C global warming.

"Scientifically, this provides strong evidence for declaring a state of planetary emergency, to unleash world action that accelerates the path towards a world that can continue evolving on a stable planet."

In the commentary, the authors propose a formal way to calculate a planetary emergency as risk multiplied by urgency.

Tipping point risks are now much higher than earlier estimates suggested, while urgency reflects how quickly we must act relative to the time left to intervene.

Exiting the fossil fuel economy is unlikely before 2050, but with global temperature already 1.1°C above pre-industrial levels, Earth is likely to cross the 1.5°C guardrail by 2040. The authors conclude this alone defines an emergency.
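
As we read the commentary, the proposed calculation can be written compactly (symbols follow the authors' stated definitions):

```latex
E = R \times U = p \times D \times \frac{\tau}{T}
```

where E is the emergency, risk R is the probability p of a tipping event multiplied by the damage D it would cause, and urgency U is the reaction time τ divided by the intervention time T left to avoid a bad outcome. The situation is an emergency when both factors are high; if τ/T exceeds 1, the time needed to react is longer than the time left to act.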

Nine active tipping points:

  1. Arctic sea ice
  2. Greenland ice sheet
  3. Boreal forests
  4. Permafrost
  5. Atlantic Meridional Overturning Circulation
  6. Amazon rainforest
  7. Warm-water corals
  8. West Antarctic Ice Sheet
  9. Parts of East Antarctica

The collapse of major ice sheets on Greenland, West Antarctica and part of East Antarctica would commit the world to around 10 metres of irreversible sea-level rise.

Reducing emissions could slow this process, allowing more time for low-lying populations to move.

The rainforests, permafrost and boreal forests are examples of biosphere tipping points that, if crossed, result in the release of additional greenhouse gases, amplifying warming.

Despite most countries having signed the Paris Agreement, pledging to keep global warming well below 2°C, current national emissions pledges -- even if they are met -- would lead to 3°C of warming.

Although future tipping points and the interplay between them are difficult to predict, the scientists argue: "If damaging tipping cascades can occur and a global tipping point cannot be ruled out, then this is an existential threat to civilization.

"No amount of economic cost-benefit analysis is going to help us. We need to change our approach to the climate problem."

Professor Lenton added: "We might already have crossed the threshold for a cascade of inter-related tipping points.

"However, the rate at which they progress, and therefore the risk they pose, can be reduced by cutting our emissions."

Read more at Science Daily

Unique sled dogs helped the Inuit thrive in the North American Arctic

Inuit sled dogs have changed little since people migrated with them to the North American Arctic across the Bering Strait from Siberia, according to researchers who have examined DNA from dogs spanning that period. The legacy of these Inuit dogs survives today in Arctic sled dogs, making them one of the last remaining descendant populations of indigenous, pre-European dog lineages in the Americas.

The latest research is the result of nearly a decade's work by University of California, Davis, researchers in anthropology and veterinary genetics, who analyzed the DNA of hundreds of dogs' ancient skeletal remains to determine that the Inuit dog had significantly different DNA than other Arctic dogs, including malamutes and huskies.

The article, "Specialized sledge dogs accompanied the Inuit dispersal across the North American Arctic," was published Wednesday in the Proceedings of the Royal Society B: Biological Sciences. From UC Davis, authors include Christyann Darwent, professor of anthropology; Ben Sacks, adjunct professor and director of the Mammalian Ecology and Conservation Unit, Veterinary Genetics Laboratory, School of Veterinary Medicine; and Sarah Brown, a postdoctoral researcher. Lead author Carly Ameen is an archaeologist from the University of Exeter; Tatiana Feuerborn is with the Globe Institute in Denmark and Centre for Palaeogenetics in Sweden; and Allowen Evin is at the CNRS, Université de Montpellier, Institut des Sciences de l'Evolution in Montpellier, France. The list of authors includes many others from a large number of collaborating institutions.

Qimmiit (dogs in Inuktitut) were viewed by the Inuit as particularly well-suited to long-distance hauling of people and their goods across the Arctic and to consuming local resources, such as sea mammals, for food.

The unique group of dogs helped the Inuit conquer the tough terrain of the North American Arctic 2,000 years ago, researchers said. Inuit dogs are the direct ancestors of modern Arctic sled dogs, and although their appearance has continued to change over time, they continue to play an important role in Arctic communities.

Experts examined the DNA from 921 dogs and wolves that lived during the last 4,500 years. Analysis of the DNA, together with the locations and time periods in which the remains were recovered archaeologically, shows that dogs from Inuit sites first occupied around 2,000 years ago were genetically different from the dogs already in the region.

According to Sacks, "the genetic profiles of ancient dogs of the American Arctic dating to 2,000 years ago were nearly identical to those of older dogs from Siberia, but contrasted starkly with those of more ancient dogs in the Americas, providing an unusually clear and definitive picture of the canine replacement event that coincided with the expansion of Thule peoples across the American Arctic two millennia ago."

Read more at Science Daily

Researchers study chickens, ostriches, penguins to learn how flight feathers evolved

If you took a careful look at the feathers on a chicken, you'd find many different forms within the same bird -- even within a single feather. The diversity of feather shapes and functions expands vastly when you consider the feathers of birds ranging from ostriches to penguins to hummingbirds. Now, researchers reporting in the journal Cell on November 27 have taken a multidisciplinary approach to understanding how all those feathers get made.

"We always wonder how birds can fly and in different ways," says corresponding author Cheng-Ming Chuong of the University of Southern California, Los Angeles. "Some soar like eagles, while others require rapid flapping of wings like hummingbirds." Some birds, including ostriches and penguins, don't fly at all.

"Such differences in flight styles are largely due to the characteristics of their flight feathers," Chuong adds. "We wanted to learn how flight feathers are made so we can understand nature better and learn principles of bioinspired architecture."

In the new study, the researchers put together a multidisciplinary team to look at feathers in many different ways, from their biophysical properties to the underlying molecular biology that allows their formation from stem cells in the skin. They examined the feathers of flightless ostriches, short-distance flying chickens, soaring ducks and eagles, and high-frequency flying sparrows. They studied the extremes by including hummingbirds and penguins. To better understand how feathers have evolved and changed over evolutionary time, the team also looked to feathers that are nearly 100 million years old, found embedded and preserved in amber in Myanmar.

Based on their findings, the researchers explain that feathers' modular structure allowed birds to adapt over evolutionary time, helping them to succeed in the many different environments in which birds live today. Their structure also allows for the specialization of feathers in different parts of an individual bird's body.

The flight feather is made of two highly adaptable architectural modules: the central shaft, or rachis, and the peripheral vane. The rachis is a composite beam made of a porous medulla, which keeps feathers light, surrounded by a rigid cortex that adds strength. The team's studies show that these two components of the rachis allow for highly flexible designs that enabled birds to fly or otherwise get around in different ways. The researchers also revealed the underlying molecular signals, including Bmp and Ski, that guide the development of those design features.

Attached to the rachis is the feather vane. The vane is the part of the feather made up of many soft barbs that zip together. The researchers report that the vane develops using principles akin to paper cutting. As such, a single epithelial sheet produces a series of diverse, branched designs with individual barbs, each bearing many tiny hooklets that hold the vane together into a plane using a Velcro-like mechanism. Their studies show that gradients in another signaling pathway (Wnt2b) play an important role in the formation of those barbs.

To look back in time, the researchers studied recently discovered amber fossils, allowing them to explore delicate, three-dimensional feather structures. Their studies show that ancient feathers had the same basic architecture but with more primitive characteristics. For instance, adjacent barbs formed the vane with overlapping barbules, without the Velcro-like, hooklet mechanism found in living birds.

"We've learned how a simple skin can be transformed into a feather, how a prototypic feather structure can be transformed into downy, contour, or flight feathers, and how a flight feather can be modulated to adapt to different flight modes required for different living environments," Chuong says. "In every corner and at different morphological scales, we were amazed at how the elegant adaption of the prototype architecture can help different birds to adapt to different new environments."

The researchers say that, in addition to helping to understand how birds have adapted over time, they hope these bioinspired architectural principles they've uncovered can be useful in future technology design. They note that composite materials of the future could contribute toward the construction of light but robust flying drones, durable and resilient wind turbines, or better medical implants and prosthetic devices.

Read more at Science Daily

Ostrich eggshell beads reveal 10,000 years of cultural interaction across Africa

Ostrich eggshell beads are some of the oldest ornaments made by humankind, and in Africa they can be found dating back at least 50,000 years. Previous research in southern Africa has shown that the beads increased in size about 2,000 years ago, when herding populations first entered the region. In the current study, researchers Jennifer Miller and Elizabeth Sawchuk revisit this idea with an expanded dataset and evaluate the hypothesis in a new region where it has never before been tested.

Review of old ideas, analysis of old collections

To conduct their study, the researchers recorded the diameters of 1,200 ostrich eggshell beads unearthed from 30 sites in Africa dating to the last 10,000 years. Many of these bead measurements were taken from decades-old unstudied collections, and so are being reported here for the first time. This new data increases the number of published bead diameter measurements from fewer than 100 to over 1,000, and reveals new trends that oppose longstanding beliefs.

The ostrich eggshell beads reflect different responses to the introduction of herding between eastern and southern Africa. In southern Africa, new bead styles appear alongside signs of herding, but do not replace the existing forager bead traditions. On the other hand, beads from the eastern Africa sites showed no change in style with the introduction of herding. Although eastern African bead sizes are consistently larger than those from southern Africa, the larger southern African herder beads fall within the eastern African forager size range, hinting at contact between these regions as herding spread. "These beads are symbols that were made by hunter-gatherers from both regions for more than 40,000 years," says lead author Jennifer Miller, "so changes -- or lack thereof -- in these symbols tells us how these communities responded to cultural contact and economic change."

Ostrich eggshell beads tell the story of ancient interaction

The story told by ostrich eggshell beads is more nuanced than previously believed. Contact with outside groups of herders likely introduced new bead styles along with domesticated animals, but the archaeological record suggests the incoming influence did not overwhelm existing local traditions. The existing customs were not replaced with new ones; rather they continued and incorporated some of the new elements.

In eastern Africa, studied here for the first time, there was no apparent change in bead style with the arrival of herding groups from the north. This may be because local foragers adopted herding while retaining their bead-making traditions, because migrant herders possessed similar traditions prior to contact, and/or because incoming herders adopted local styles. "In the modern world, migration, cultural contact, and economic change often create tension," says Sawchuk, "ancient peoples experienced these situations too, and the patterns in cultural objects like ostrich eggshell beads give us a chance to study how they navigated these experiences."

The researchers hope that this work inspires a renewed interest into ostrich eggshell beads, and recommend that future studies present individual bead diameters rather than a single average of many. Future research should also investigate questions related to manufacture, chemical identification, and the effects of taphonomic processes and wear on bead diameter. "This study shows that examining old collections can generate important findings without new excavation," says Miller, "and we hope that future studies will take advantage of the wealth of artifacts that have been excavated but not yet studied."
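
The recommendation to report individual diameters rather than site averages is easy to appreciate with a toy example: a single mean per group hides exactly the kind of range overlap described above. A minimal sketch in Python, using made-up numbers rather than the study's data:

```python
import numpy as np

# Hypothetical bead diameters in mm -- illustrative values, not the study's data.
eastern_forager = np.array([5.9, 6.1, 6.3, 6.8, 7.0, 7.4])
southern_herder = np.array([6.0, 6.2, 6.4, 6.6])
southern_forager = np.array([4.9, 5.1, 5.3, 5.6])

# A single average per group obscures the overlap; reporting ranges
# (or full distributions) makes the herder/forager comparison visible.
for name, d in [("eastern forager", eastern_forager),
                ("southern herder", southern_herder),
                ("southern forager", southern_forager)]:
    print(f"{name}: mean {d.mean():.2f} mm, range {d.min():.1f}-{d.max():.1f} mm")
```

In this toy example, the southern herder beads are larger than the southern forager beads yet fall inside the eastern forager range -- the pattern the authors interpret as a hint of contact between regions.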

From Science Daily

Nov 28, 2019

Stem cell therapy helps broken hearts heal in unexpected way

Stem cell therapy helps hearts recover from a heart attack, although not for the biological reasons originally proposed two decades ago that today are the basis of ongoing clinical trials. This is the conclusion of a Nov. 27 study in Nature that shows an entirely different way that heart stem cells help the injured heart -- not by replacing damaged or dead heart cells as initially proposed.

The study reports that injecting living or even dead heart stem cells into the injured hearts of mice triggers an acute inflammatory process, which in turn generates a wound healing-like response to enhance the mechanical properties of the injured area.

Mediated by macrophage cells of the immune system, the secondary healing process provided a modest benefit to heart function after heart attack, according to Jeffery Molkentin, PhD, principal investigator, director of Molecular Cardiovascular Microbiology at Cincinnati Children's Hospital Medical Center and a professor of the Howard Hughes Medical Institute (HHMI).

"The innate immune response acutely altered cellular activity around the injured area of the heart so that it healed with a more optimized scar and improved contractile properties," Molkentin said. "The implications of our study are very straight forward and present important new evidence about an unsettled debate in the field of cardiovascular medicine."

The new paper builds on a 2014 study published by the same research team, also in Nature. As in that earlier study, the current paper shows that injecting c-kit positive heart stem cells into damaged hearts as a strategy to regenerate cardiomyocytes doesn't work. The findings prompted Molkentin and his colleagues to conclude that there is a need to "re-evaluate the current planned cell therapy based clinical trials to ask how this therapy might really work."

An Unexpected Discovery

The study worked with two types of heart stem cells currently used in the clinical trials -- bone marrow mononuclear cells and cardiac progenitor cells. As the researchers went through the process of testing and re-verifying their data under different conditions, they were surprised to discover that in addition to the two types of stem cells, injecting dead cells or even an inert chemical called zymosan also provided benefit to the heart by optimizing the healing process. Zymosan is a substance known to induce an innate immune response.

Researchers reported that the stem cell and zymosan therapies tested in this study altered immune cell responses that significantly decreased the formation of extracellular matrix connective tissue in the injured areas, while also improving the mechanical properties of the scar itself. The authors concluded: "injected hearts produced a significantly greater change in passive force over increasing stretch, a profile that was more like uninjured hearts."

Molkentin and his colleagues also found that stem cells and other therapeutic substances like zymosan have to be injected directly into the heart tissue surrounding the area of infarction injury. This is in contrast to most past human clinical trials, which for patient safety reasons simply injected stem cells into the circulatory system.

"Most of the current trials were also incorrectly designed because they infuse cells into the vasculature," Molkentin explained. "Our results show that the injected material has to go directly into the heart tissue flanking the infarct region. This is where the healing is occurring and where the macrophages can work their magic."

The researchers also noted an interesting finding involving zymosan, a chemical compound that binds with select pattern recognition receptors to cause an acute innate immune response. Using zymosan to treat injured hearts in mice resulted in a slightly greater and longer-lasting benefit on injured tissues than injecting stem cells or dead cell debris.

Looking to the Future

Molkentin said he and other collaborating scientists will follow up the findings by looking for ways to leverage the healing properties of the stem cells and compounds they tested.

For example, considering how heart stem cells, cell debris and zymosan all triggered an acute innate immune response involving macrophages in the current paper, Molkentin explained they will test a theory that harnesses the selective healing properties of macrophages. This includes polarizing or biologically cueing macrophages to have only healing-like properties.

Read more at Science Daily

A new theory for how black holes and neutron stars shine bright

For decades, scientists have speculated about the origin of the electromagnetic radiation emitted from celestial regions that host black holes and neutron stars -- the most mysterious objects in the universe.

Astrophysicists believe that this high-energy radiation -- which makes neutron stars and black holes shine bright -- is generated by electrons that move at nearly the speed of light, but the process that accelerates these particles has remained a mystery.

Now, researchers at Columbia University have presented a new explanation for the physics underlying the acceleration of these energetic particles.

In a study published in the December issue of The Astrophysical Journal, astrophysicists Luca Comisso and Lorenzo Sironi employed massive supercomputer simulations to calculate the mechanisms that accelerate these particles. They concluded that the particles' energization results from the interaction between chaotic motion and the reconnection of super-strong magnetic fields.

"Turbulence and magnetic reconnection -- a process in which magnetic field lines tear and rapidly reconnect -- conspire together to accelerate particles, boosting them to velocities that approach the speed of light," said Luca Comisso, a postdoctoral research scientist at Columbia and first author on the study.

"The region that hosts black holes and neutron stars is permeated by an extremely hot gas of charged particles, and the magnetic field lines dragged by the chaotic motions of the gas, drive vigorous magnetic reconnection," he added. "It is thanks to the electric field induced by reconnection and turbulence that particles are accelerated to the most extreme energies, much higher than in the most powerful accelerators on Earth, like the Large Hadron Collider at CERN."

When studying turbulent gas, scientists cannot predict chaotic motion precisely. Dealing with the mathematics of turbulence is difficult, and it constitutes one of the seven "Millennium Prize" mathematical problems. To tackle this challenge from an astrophysical point of view, Comisso and Sironi designed extensive supercomputer simulations -- among the world's largest ever done in this research area -- to solve the equations that describe the turbulence in a gas of charged particles.

"We used the most precise technique -- the particle-in-cell method -- for calculating the trajectories of hundreds of billions of charged particles that self-consistently dictate the electromagnetic fields. And it is this electromagnetic field that tells them how to move," said Sironi, assistant professor of astronomy at Columbia and the study's principal investigator.

Sironi said that the crucial point of the study was to identify the role magnetic reconnection plays within the turbulent environment. The simulations showed that reconnection is the key mechanism selecting the particles that will subsequently be accelerated by the turbulent magnetic fields up to the highest energies.

The simulations also revealed that particles gained most of their energy by bouncing randomly at extremely high speed off the turbulent fluctuations. When the magnetic field is strong, this acceleration mechanism is very rapid. But the strong fields also force the particles to travel in curved paths, and in doing so, they emit electromagnetic radiation.

"This is indeed the radiation emitted around black holes and neutron stars that make them shine, a phenomenon we can observe on Earth," Sironi said.

The ultimate goal, the researchers said, is to get to know what is really going on in the extreme environment surrounding black holes and neutron stars, which could shed additional light on fundamental physics and improve our understanding of how our Universe works.

They plan to connect their work even more firmly with observations, by comparing their predictions with the electromagnetic spectrum emitted from the Crab Nebula, the most intensely studied bright remnant of a supernova (a star that violently exploded in the year 1054). This will be a stringent test for their theoretical explanation.

Read more at Science Daily

Humans co-evolved with immune-related diseases -- and it's still happening

Some of the same mutations allowing humans to fend off deadly infections also make us more prone to certain inflammatory and autoimmune diseases, such as Crohn's disease. In a Review published November 27 in the journal Trends in Immunology, researchers describe how ancestral origins impact the likelihood that people of African or Eurasian descent might develop immune-related diseases. The authors also share evidence that the human immune system is still evolving depending on a person's location or lifestyle.

"In the past, people's lifespans were much shorter, so some of these inflammatory and autoimmune diseases that can appear in the second half of life were not so relevant," says first author Jorge Dominguez-Andres (@dominjor), a postdoctoral researcher at Radboud Institute for Molecular Life Science in the Netherlands. "Now that we live so much longer, we can see the consequences of infections that happened to our ancestors."

One of the body's best defenses against infectious diseases is inflammation. Dominguez-Andres and senior author Mihai Netea, a Radboud University immunologist and evolutionary biologist, compiled data from genetics, immunology, microbiology, and virology studies and identified how the DNA of people from communities commonly infected with bacterial or viral diseases was altered in ways that favor inflammation. While these changes made it more difficult for certain pathogens to infect these communities, they were also associated with the emergence -- over time -- of new inflammatory diseases such as Crohn's disease, lupus, and inflammatory bowel disease.

"There seems to be a balance. Humans evolve to build defenses against diseases, but we are not able to stop disease from happening, so the benefit we obtain on one hand also makes us more sensitive to new diseases on the other hand," says Dominguez-Andres. "Today, we are suffering or benefiting from defenses built into our DNA by our ancestors' immune systems fighting off infections or growing accustomed to new lifestyles."

For example, the malaria parasite Plasmodium sp. has infected African populations for millions of years. Because of this, evolutionary processes have selected people with DNA that favors resistance to infections by causing more inflammation in the body. In doing so, this has also contributed to making modern Africans prone to developing cardiovascular diseases, such as atherosclerosis, later in life.

Dominguez-Andres and Netea also write about how the early-human ancestors of Eurasians lived in regions inhabited by Neanderthals and interbred with them. Today, people carrying remnants of Neanderthal DNA can be more resistant to HIV-1 and 'staph' infections, but are also more likely to develop allergies, asthma, and hay fever.

The negative side effects of changes in each population's immune systems are a relatively recent finding. "We know a few things about what is happening at the genetic level in our ancestry, but we need more powerful technology. So, next generation sequencing is bursting now and allowing us to study the interplay between DNA and host responses at much deeper levels," says Dominguez-Andres. "So, we are obtaining a much more comprehensive point of view."

These technologies are also revealing how our immune systems are evolving in real time because of modern lifestyle changes. African tribes that still engage in hunting have greater bacterial gut diversity than urbanized African-Americans who eat store-bought foods. Also, changes in hygiene patterns over the last two centuries have improved sanitation, drinking water, and garbage collection, leading to reduced exposure to infectious pathogens relative to earlier times. As humans move toward processed foods and stricter hygiene standards, their bodies adapt by developing what researchers call "diseases of civilization," such as type 2 diabetes.

Read more at Science Daily

Laboratory-evolved bacteria switch to consuming carbon dioxide for growth

Over the course of several months, researchers in Israel created Escherichia coli strains that consume CO2 for energy instead of organic compounds. This achievement in synthetic biology highlights the incredible plasticity of bacterial metabolism and could provide the framework for future carbon-neutral bioproduction. The work appears November 27th in the journal Cell.

"Our main aim was to create a convenient scientific platform that could enhance CO2 fixation, which can help address challenges related to sustainable production of food and fuels and global warming caused by CO2 emissions," says senior author Ron Milo, at systems biologist at the Weizmann Institute of Science. "Converting the carbon source of E. coli, the workhorse of biotechnology, from organic carbon into CO2 is a major step towards establishing such a platform."

The living world is divided into autotrophs that convert inorganic CO2 into biomass and heterotrophs that consume organic compounds. Autotrophic organisms dominate the biomass on Earth and supply much of our food and fuels. A better understanding of the principles of autotrophic growth and methods to enhance it is critical for the path to sustainability.

A grand challenge in synthetic biology has been to generate synthetic autotrophy within a model heterotrophic organism. Despite widespread interest in renewable energy storage and more sustainable food production, past efforts to engineer industrially relevant heterotrophic model organisms to use CO2 as the sole carbon source have failed. Previous attempts to establish autocatalytic CO2 fixation cycles in model heterotrophs always required the addition of multi-carbon organic compounds to achieve stable growth.

"From a basic scientific perspective, we wanted to see if such a major transformation in the diet of bacteria -- from dependence on sugar to the synthesis of all their biomass from CO2 -- is possible," says first author Shmuel Gleizer (@GleizerShmuel), a Weizmann Institute of Science postdoctoral fellow. "Beyond testing the feasibility of such a transformation in the lab, we wanted to know how extreme an adaptation is needed in terms of the changes to the bacterial DNA blueprint."

In the Cell study, the researchers used metabolic rewiring and lab evolution to convert E. coli into autotrophs. The engineered strain harvests energy from formate, which can be produced electrochemically from renewable sources. Because formate is an organic one-carbon compound that does not serve as a carbon source for E. coli growth, it does not support heterotrophic pathways. The researchers also engineered the strain to produce non-native enzymes for carbon fixation and reduction and for harvesting energy from formate. But these changes alone were not enough to support autotrophy because E. coli's metabolism is adapted to heterotrophic growth.

To overcome this challenge, the researchers turned to adaptive laboratory evolution as a metabolic optimization tool. They inactivated central enzymes involved in heterotrophic growth, rendering the bacteria more dependent on autotrophic pathways for growth. They also grew the cells in chemostats with a limited supply of the sugar xylose -- a source of organic carbon -- to inhibit heterotrophic pathways. The initial supply of xylose for approximately 300 days was necessary to support enough cell proliferation to kick-start evolution. The chemostat also contained plenty of formate and a 10% CO2 atmosphere.

In this environment, there is a large selective advantage for autotrophs that produce biomass from CO2 as the sole carbon source compared with heterotrophs that depend on xylose as a carbon source for growth. Using isotopic labeling, the researchers confirmed that the evolved isolated bacteria were truly autotrophic, i.e., CO2 and not xylose or any other organic compound supported cell growth.
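
The selection logic of the chemostat can be illustrated with a deliberately crude back-of-the-envelope model; the growth and dilution rates below are invented for illustration, and the real experiment ran for months with far richer dynamics:

```python
# Toy chemostat selection sketch with invented rates (per hour).
# Autotrophs grow on abundant CO2 and formate; heterotrophs depend on
# scarce xylose, so their growth rate falls below the dilution rate.
# Carrying capacity and resource depletion are ignored for simplicity.
dilution = 0.05
mu_autotroph, mu_heterotroph = 0.08, 0.03
autotroph, heterotroph = 1e3, 1e9        # starting cell counts

for hour in range(24 * 60):              # 60 days
    autotroph *= 1 + (mu_autotroph - dilution)
    heterotroph *= 1 + (mu_heterotroph - dilution)

frac = autotroph / (autotroph + heterotroph)
print(f"autotroph fraction after 60 days: {frac:.6f}")
```

The point is only that once a lineage grows on CO2 faster than dilution washes cells out, while heterotrophs cannot, takeover of the culture is a matter of time; the isotopic labeling then confirmed the takeover at the level of carbon atoms.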

"In order for the general approach of lab evolution to succeed, we had to find a way to couple the desired change in cell behavior to a fitness advantage," Milo says. "That was tough and required a lot of thinking and smart design."

By sequencing the genome and plasmids of the evolved autotrophic cells, the researchers discovered that as few as 11 mutations were acquired through the evolutionary process in the chemostat. One set of mutations affected genes encoding enzymes linked to the carbon fixation cycle. The second category consisted of mutations found in genes commonly observed to be mutated in previous adaptive laboratory evolution experiments, suggesting that they are not necessarily specific to autotrophic pathways. The third category consisted of mutations in genes with no known role.

"The study describes, for the first time, a successful transformation of a bacterium's mode of growth. Teaching a gut bacterium to do tricks that plants are renowned for was a real long shot," Gleizer says. "When we started the directed evolutionary process, we had no clue as to our chances of success, and there were no precedents in the literature to guide or suggest the feasibility of such an extreme transformation. In addition, seeing in the end the relatively small number of genetic changes required to make this transition was surprising."

The authors say that one major study limitation is that the consumption of formate by bacteria releases more CO2 than is consumed through carbon fixation. In addition, more research is needed before it's possible to discuss the scalability of the approach for industrial use.

In future work, the researchers will aim to supply energy through renewable electricity to address the problem of CO2 release, determine whether ambient atmospheric conditions could support autotrophy, and try to narrow down the most relevant mutations for autotrophic growth.

Read more at Science Daily

Nov 27, 2019

Unravelling the venomous bite of an endangered mammal

Researchers from Liverpool School of Tropical Medicine (LSTM) and ZSL (Zoological Society of London) have worked with a team of scientists from institutions across the globe to uncover the truth behind the origin of venom in some very unusual mammals.

As outlined in a paper published today in PNAS, the team focused their attention on an unusual endangered species known as the Hispaniolan solenodon (Solenodon paradoxus) -- a member of the eulipotyphlan order of mammals, an ancient group of insectivores that also includes hedgehogs, moles and shrews. Obtaining venom from wild solenodons and unravelling the genetic blueprint of this species enabled the identification of the proteins that make up their venom, revealing that it consists of multiple kallikrein-1 serine proteases. Analysis of these toxins showed they are probably used by solenodons to cause drops in blood pressure in the vertebrate prey on which they sometimes feed.

To their surprise, the team found that the kallikrein-1 serine protease toxins found in solenodon venom had evolved in parallel with those detected in the venom of distantly related venomous shrews. The same building blocks of venom have therefore evolved convergently within solenodons and shrews, despite their divergence from one another over 70 million years ago -- when dinosaurs still walked the earth.

Professor Nick Casewell from LSTM's Centre for Snakebite Research & Interventions is lead author on the paper. He said: "These particular proteins are present in the salivary glands of many mammals. Through our research we are able to demonstrate that they have been independently co-opted for a toxic role in the oral venom systems of both solenodons and shrews. These findings represent a fascinating example of how evolution can funnel novel adaptations down repeatable pathways".

The Hispaniolan solenodon is only found on the Caribbean island of Hispaniola (made up of the Dominican Republic and Haiti), and is considered one of the most Evolutionarily Distinct and Globally Endangered (EDGE) mammals by ZSL's EDGE of Existence programme. It is one of the few venomous mammals, producing toxic saliva that it injects into its prey through unique grooves in its lower incisor teeth (which gives the solenodon its name). Solenodons are some of the last few surviving Caribbean land mammals and are threatened today by habitat loss and predation from introduced dogs and cats.

Professor Samuel Turvey, from ZSL's Institute of Zoology and who jointly led the project, said: "This study highlights how little we know about one of the world's most fascinating animals. Unravelling details of the solenodon's previously unstudied venom system helps us to understand the mechanisms behind convergent evolution -- and demonstrates the importance of conserving the world's remarkable EDGE species."

Read more at Science Daily

Researchers use machine learning tools to reveal how memories are coded in the brain

NUS researchers have made a breakthrough in the field of cognitive computational neuroscience, by discovering a key aspect of how the brain encodes short-term memories.

The researchers working in The N.1 Institute for Health at the National University of Singapore (NUS), led by Assistant Professor Camilo Libedinsky from the Department of Psychology at NUS, and Senior Lecturer Shih-Cheng Yen from the Innovation and Design Programme in the Faculty of Engineering at NUS, discovered that a population of neurons in the brain's frontal lobe contain stable short-term memory information within dynamically-changing neural activity.

This discovery may have far-reaching consequences in understanding how organisms have the ability to perform multiple mental operations simultaneously, such as remembering, paying attention and making a decision, using a brain of limited size.

The results of this study were published in the journal Nature Communications on 1 November 2019.

Mapping short-term memory in the frontal lobe

In the human brain, the frontal lobe plays an important role in processing short-term memories. Short-term memory has a low capacity to retain information. "It can usually only hold six to eight items. Think for example about our ability to remember a phone number for a few seconds -- that uses short-term memory," Asst Prof Libedinsky explained.

Here, the NUS researchers studied how the frontal lobe represents short-term memory information by measuring the activity of many neurons. Previous results from the researchers had shown that if a distraction was presented during the memory maintenance period, it changed the code used by frontal lobe neurons that encode the memory.

"This was counterintuitive since the memory was stable but the code changed. In this study, we solved this riddle," Asst Prof Libedinsky said. Employing tools derived from machine learning, the researchers showed that stable information can be found within the changing neural population code.

This means that the NUS team demonstrated that memory information can be read out from a population of neurons that morphs their code after a distractor is presented.
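
Analyses of this kind are commonly implemented as cross-temporal decoding: a classifier is trained on the population activity at one time (say, before the distractor) and tested at another (after it). Below is a minimal sketch on synthetic data -- an illustration of the general technique, not the study's code or data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic data: trials x neurons x time bins, with one remembered item
# (0 or 1) per trial. A stable signal is embedded in a subset of neurons
# on top of time-varying noise, mimicking stable information carried by
# a dynamically changing population code.
rng = np.random.default_rng(0)
n_trials, n_neurons, n_bins = 200, 50, 10
X = rng.standard_normal((n_trials, n_neurons, n_bins))
y = rng.integers(0, 2, n_trials)
X[y == 1, :10, :] += 0.5                 # memory signal in 10 neurons

train_bin, test_bin = 2, 8               # e.g., before vs. after a distractor
clf = LogisticRegression(max_iter=1000)
clf.fit(X[:150, :, train_bin], y[:150])  # train on one time bin
acc = clf.score(X[150:, :, test_bin], y[150:])  # test on another
print(f"cross-temporal decoding accuracy: {acc:.2f}")
```

If accuracy stays above chance when the training and testing bins differ, memory information is readable across the code change; a drop to chance would instead indicate a complete reorganization of the code.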

Next steps

This simple finding has broader implications, suggesting that a single neural population may contain multiple independent types of information that do not interfere with each other. "This may be an important property of organisms that display cognitive flexibility," Asst Prof Libedinsky explained.

Read more at Science Daily

Helper protein worsens diabetic eye disease

In a recent study using mice, lab-grown human retinal cells and patient samples, Johns Hopkins Medicine scientists say they found evidence of a new pathway that may contribute to degeneration of the light sensitive tissue at the back of the eye. The findings, they conclude, bring scientists a step closer to developing new drugs for a central vision-destroying complication of diabetes that affects an estimated 750,000 Americans.

The Johns Hopkins research team focused on diabetic macular edema, a form of swelling and inflammation that occurs in people with diabetes when blood vessels in the eye leak their fluids into the portion of the retina that controls detailed vision.

Current therapies for this disease block the protein VEGF, which contributes to abnormal blood vessel growth. However, because the treatment is not adequate for more than half of patients with diabetic macular edema, investigators have long suspected that more factors drive vision loss in these patients.

In the new study, the Johns Hopkins researchers say they found compelling evidence that angiopoietin-like 4 is at play in macular edema. The signaling protein is already well known to be a blood vessel growth factor with roles in heart disease, cancer and metabolic diseases, of which diabetes is one.

A report on the findings was published Sept. 23 in The Journal of Clinical Investigation.

Akrit Sodhi, M.D., Ph.D., associate professor of ophthalmology at the Johns Hopkins University School of Medicine and the Johns Hopkins Wilmer Eye Institute, in collaboration with Silvia Montaner, Ph.D., M.P.H., at the University of Maryland, led the research team and was intrigued by angiopoietin-like 4 after finding, in previous studies, elevated levels of this protein in the eyes of people with a variety of vision-related diseases.

In the new study, Sodhi and his team found that angiopoietin-like 4 acts both independent of, and synergistically with, VEGF activity, and they identified a potential way to block it.

The investigators made their discoveries by exposing human blood vessel tissue cells grown in the lab to low levels of VEGF and angiopoietin-like 4. Knowing that low levels of these factors individually did not generally create an effect, the researchers were surprised to find that in combination, low-level VEGF and low-level angiopoietin-like 4 had a synergistic effect on vascular cell permeability, doubling the leakage from retinal vessels in mice.

"This told us that you can have subthreshold levels of both molecules, where neither alone is enough to do anything, but together, produce a huge effect," says Sodhi.

The amplifying effect led the researchers to believe that VEGF and angiopoietin-like 4 might share a protein receptor within vascular cells.

However, similar experiments revealed that angiopoietin-like 4 also increases blood vessel formation independently of VEGF. "This could explain why some patients continue to experience vision loss despite treatment with current anti-VEGF therapies," says Sodhi.

To test this, the team looked to see whether the angiopoietin-like 4 protein bound to one of VEGF's receptors in lab-grown human vascular cells. They found that angiopoietin-like 4 did not bind to the classic VEGF receptor that is a target of current anti-VEGF medicines, but another less studied one called neuropilin.

With the newly identified receptor, the researchers next sought to learn whether a lab-grown version of the receptor could block angiopoietin-like 4 before it was able to interact with blood vessel cells.

To do that, they injected a soluble fragment of the neuropilin receptor into the eyes of mice pharmacologically treated to mimic human diabetes, a condition that produces a twofold increase in retinal vascular leakage. The treated diabetic mice showed approximately half the blood vessel leakage of mice that did not receive the treatment, making them similar to nondiabetic mice.

To further explore the new receptor-based treatment's potential value for human patients, the researchers grew human blood vessel cells in the lab in fluid samples collected from the eyes of patients with diabetic macular edema, to replicate the conditions and growth factors found naturally inside of the patients' eyes.

One group of such cells was exposed to the soluble receptor neuropilin. The researchers say they observed a marked decrease in the diabetic macular edema cells treated with the receptor compared to untreated cells.

"This gives us some confidence that this approach will work in human eyes as well," says Sodhi, although he cautions that clinical use of a treatment based on their findings will require many more years of research.

Next, the researchers hope to take a look at the molecular interactions between angiopoietin-like 4 and the neuropilin receptor. Doing so, says Sodhi, will allow them to create a refined match that can bind up as much vision-threatening angiopoietin-like 4 in the eye as possible.

Sodhi also hopes the team's discovery will have value in treating cancer and cardiovascular disease, the courses of which also are influenced by uncontrolled blood vessel growth.

Read more at Science Daily

What keeps cells in shape? New research points to two types of motion

The health of cells is maintained, in part, by two types of movement of their nucleoli, a team of scientists has found. This dual motion within surrounding fluid, it reports, adds to our understanding of what contributes to healthy cellular function and points to how its disruption could affect human health.

"Nucleolar malfunction can lead to disease, including cancer," explains Alexandra Zidovska, an assistant professor in New York University's Department of Physics and the senior author of the study, which appears in the journal eLife. "Thus, understanding the processes responsible for the maintenance of nucleolar shape and motion might help in the creation of new diagnostics and therapies for certain human afflictions."

Recent discoveries have shown that some cellular compartments don't have membranes, which were previously seen as necessary to hold a cell together. Researchers have since sought to understand the forces that maintain the integrity of these building blocks of life absent these membranes.

What has been observed is the liquid-like nature of this behavior. Specifically, these compartments act as liquid droplets made of a material that does not mix with the fluid around them -- similar to oil and water. This process, known as liquid-liquid phase separation, has now been established as one of the key cellular organizing principles.

In their study, the researchers focused on the best-known example of such a cellular liquid droplet: the nucleolus, which resides inside the cell nucleus and is vital to the cell's protein synthesis.

"While the liquid-like nature of the nucleolus has been studied before, its relationship with the surrounding liquid is not known," explains Zidovska, who co-authored the study with Christina Caragine, an NYU doctoral student, and Shannon Haley, an undergraduate in NYU's College of Arts and Science at the time of the work and now a doctoral student at the University of California at Berkeley. "This relationship is particularly intriguing considering the surrounding liquid -- the nucleoplasm -- contains the entire human genome."

Yet how the two fluids interact with each other has remained unclear.

To better understand this dynamic, the scientists examined the motion and fusion of human nucleoli in live human cells, while monitoring their shape, size, and smoothness of their surface. The method for studying the fusion of the nucleolar droplets was created by the team in 2018 and reported in the journal Physical Review Letters.

Their latest study showed two types of nucleolar pair movements or "dances": an unexpected correlated motion prior to their fusion and separate independent motion. Moreover, they found that the smoothness of the nucleolar interface is susceptible to both changes in gene expression and the packing state of the genome, which surrounds the nucleoli.

Read more at Science Daily

Nov 26, 2019

Extra-terrestrial impacts may have triggered 'bursts' of plate tectonics

When -- and how -- Earth's surface evolved from a hot, primordial mush into a rocky planet continually resurfaced by plate tectonics remain some of the biggest unanswered questions in earth science research. Now a new study, published in Geology, suggests this earthly transition may in fact have been triggered by extra-terrestrial impacts.

"We tend to think of the Earth as an isolated system, where only internal processes matter," says Craig O'Neill, director of Macquarie University's Planetary Research Centre. "Increasingly, though, we're seeing the effect of solar system dynamics on how the Earth behaves."

Modelling simulations and comparisons with lunar impact studies have revealed that following Earth's accretion about 4.6 billion years ago, Earth-shattering impacts continued to shape the planet for hundreds of millions of years. Although these events appear to have tapered off over time, spherule beds -- distinctive layers of round particles condensed from rock vaporized during an extra-terrestrial impact -- found in South Africa and Australia suggest the Earth experienced a period of intense bombardment about 3.2 billion years ago, roughly the same time the first indications of plate tectonics appear in the rock record.

This coincidence caused O'Neill and co-authors Simone Marchi, William Bottke, and Roger Fu to wonder whether these circumstances could be related. "Modelling studies of the earliest Earth suggest that very large impacts -- more than 300 km in diameter -- could generate a significant thermal anomaly in the mantle," says O'Neill. This appears to have altered the mantle's buoyancy enough to create upwellings that, according to O'Neill, "could directly drive tectonics."

But the sparse evidence found to date from the Archaean -- the period of time spanning 4.0 to 2.5 billion years ago -- suggests that mostly smaller impacts less than 100 km in diameter occurred during this interval. To determine whether these more modest collisions were still large and frequent enough to initiate global tectonics, the researchers used existing techniques to expand the Middle Archaean impact record and then developed numerical simulations to model the thermal effects of these impacts on Earth's mantle.

The results indicate that during the Middle Archaean, 100-kilometer-wide impacts (about 30 km wider than the much younger Chicxulub crater) were capable of weakening Earth's rigid, outermost layer. This, says O'Neill, could have acted as a trigger for tectonic processes, especially if Earth's exterior was already "primed" for subduction.

"If the lithosphere were the same thickness everywhere, such impacts would have little effect," states O'Neill. But during the Middle Archean, he says, the planet had cooled enough for the mantle to thicken in some spots and thin in others. The modelling showed that if an impact were to happen in an area where these differences existed, it would create a point of weakness in a system that already had a large contrast in buoyancy -- and ultimately trigger modern tectonic processes.

Read more at Science Daily

Rapamycin may slow skin aging

The search for youthfulness typically turns to lotions, supplements, serums and diets, but there may soon be a new option joining the fray. Rapamycin, an FDA-approved drug normally used to prevent organ rejection after transplant surgery, may also slow aging in human skin, according to a study from Drexel University College of Medicine researchers published in Geroscience.

Basic science studies have previously used the drug to slow aging in mice, flies and worms, but the current study is the first to show an effect on aging in human tissue, specifically skin. When delivered topically, rapamycin reduced signs of aging, including wrinkles, sagging and uneven skin tone.

"As researchers continue to seek out the elusive 'fountain of youth' and ways to live longer, we're seeing growing potential for use of this drug," said senior author Christian Sell, PhD, an associate professor of Biochemistry and Molecular Biology at the College of Medicine. "So, we said, let's try skin. It's a complex organism with immune, nerve cells, stem cells -- you can learn a lot about the biology of a drug and the aging process by looking at skin."

In the current Drexel-led study, 13 participants over age 40 applied rapamycin cream every 1-2 days to one hand and a placebo to the other hand for eight months. The researchers checked on subjects after two, four, six and eight months, including conducting a blood test and a biopsy at the six- or eight-month mark.

After eight months, most of the rapamycin-treated hands showed increases in collagen protein and statistically significant reductions in p16 protein, a key marker of skin cell aging. Skin with lower levels of p16 has fewer senescent cells, which are associated with skin wrinkles. Beyond cosmetic effects, higher levels of p16 can lead to dermal atrophy, a common condition in seniors that is associated with fragile skin that tears easily, slow healing after cuts and increased risk of infection or complications after an injury.

So how does rapamycin work? Rapamycin blocks the appropriately named "target of rapamycin" (TOR), a protein that acts as a mediator in metabolism, growth and aging of human cells. Rapamycin's capacity to improve human health beyond outward appearance becomes clearer on a closer look at the p16 protein, which marks a stress response that human cells undergo when damaged but that also serves as a way of preventing cancer. When cells carry a mutation that would otherwise create a tumor, this response helps prevent the tumor by slowing the cell cycle process. Instead of creating a tumor, it contributes to the aging process.

"When cells age, they become detrimental and create inflammation," said Sell. "That's part of aging. These cells that have undergone stress are now pumping out inflammatory markers."

In addition to its use in preventing organ rejection, rapamycin is currently prescribed (in higher doses than used in this study) for the rare lung disease lymphangioleiomyomatosis, and as an anti-cancer drug. The current Drexel study shows a second life for the drug in low doses, including new applications for studying whether rapamycin can increase human lifespan or improve human performance.

Rapamycin -- first discovered in the 1970s in bacteria found in the soil of Easter Island -- also reduces stress in the cell by attacking cancer-causing free radicals in the mitochondria.

In previous studies, the team used rapamycin in cell cultures, which reportedly improved cell function and slowed aging.

In 1996, a study in Cell showed that using rapamycin to block TOR proteins in yeast cultures made the yeast cells smaller but extended their lifespan.

"If you ramp the pathway down you get a smaller phenotype," said Sell. "When you slow growth, you seem to extend lifespan and help the body repair itself -- at least in mice. This is similar to what is seen in calorie restriction."

Read more at Science Daily

First recording of a blue whale's heart rate

Encased in a neon orange plastic shell, a collection of electronic sensors bobbed along the surface of the Monterey Bay, waiting to be retrieved by Stanford University researchers. A lunchbox-sized speck in the vast waters, it held cargo of outsized importance: the first-ever recording of a blue whale's heart rate.

This device was fresh off a daylong ride on Earth's largest species -- a blue whale. Four suction cups had secured the sensor-packed tag near the whale's left flipper, where it recorded the animal's heart rate through electrodes embedded in the center of two of the suction feet. The details of this tag's journey and the heart rate it delivered were published Nov. 25 in Proceedings of the National Academy of Sciences.

"We had no idea that this would work and we were skeptical even when we saw the initial data. With a very keen eye, Paul Ponganis -- our collaborator from the Scripps Institution of Oceanography -- found the first heart beats in the data," said Jeremy Goldbogen, assistant professor of biology in the School of Humanities Sciences at Stanford and lead author of the paper. "There were a lot of high fives and victory laps around the lab."

Analysis of the data suggests that a blue whale's heart is already working at its limit, which may explain why blue whales have never evolved to be bigger. The data also suggest that some unusual features of the whale's heart might help it perform at these extremes. Studies like this add to our fundamental knowledge of biology and can also inform conservation efforts.

"Animals that are operating at physiological extremes can help us understand biological limits to size," said Goldbogen. "They may also be particularly susceptible to changes in their environment that could affect their food supply. Therefore, these studies may have important implications for the conservation and management of endangered species like blue whales."

Penguins to whales

A decade ago, Goldbogen and Ponganis measured the heart rates of diving emperor penguins in Antarctica's McMurdo Sound. For years after, they wondered whether a similar task could be accomplished with whales.

"I honestly thought it was a long shot because we had to get so many things right: finding a blue whale, getting the tag in just the right location on the whale, good contact with the whale's skin and, of course, making sure the tag is working and recording data," said Goldbogen.

The tag performed well on smaller, captive whales, but getting it near a wild blue whale's heart is a different task. For one thing, wild whales aren't trained to flip belly-up. For another, blue whales have accordion-like skin on their underside that expands during feeding, and one such gulp could pop the tag right off.

"We had to put these tags out without really knowing whether or not they were going to work," recalled David Cade, a recent graduate of the Goldbogen Lab who is a co-author of the paper and who placed the tag on the whale. "The only way to do it was to try it. So we did our best."

Cade stuck the tag on his first attempt and, over time, it slid into a position near the flipper where it could pick up the heart's signals. The data it captured showed striking extremes.

When the whale dove, its heart rate slowed, reaching an average minimum of about four to eight beats per minute -- with a low of two beats per minute. At the bottom of a foraging dive, where the whale lunged and consumed prey, the heart rate increased to about 2.5 times the minimum, then slowly decreased again. Once the whale got its fill and began to surface, the heart rate increased. The highest heart rate -- 25 to 37 beats per minute -- occurred at the surface, where the whale was breathing and restoring its oxygen levels.
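
Putting those numbers together (a back-of-the-envelope illustration, not a figure from the paper): a 2.5-fold rise over the four-to-eight-beat minimum implies roughly 10 to 20 beats per minute during lunges, still well below the surface rate.

    # Illustrative arithmetic from the figures quoted above (not from the paper).
    min_bpm = (4, 8)                   # average minimum heart rate during dives
    lunge_factor = 2.5                 # rise while lunge-feeding at depth
    lunge_bpm = tuple(b * lunge_factor for b in min_bpm)
    print(lunge_bpm)                   # (10.0, 20.0) -- under the 25-37 bpm surface rate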

An elastic heart

These data were intriguing because the whale's highest heart rate almost outpaced predictions, while the lowest heart rate was about 30 to 50 percent lower than predicted. The researchers think that the surprisingly low heart rate may be explained by a stretchy aortic arch -- part of the heart that moves blood out to the body -- which, in the blue whale, slowly contracts to maintain some additional blood flow in between beats. Meanwhile, the impressively high rates may depend on subtleties in the heart's movement and shape that prevent the pressure waves of each beat from disrupting blood flow.

Looking at the big picture, the researchers think the whale's heart is performing near its limits. This may help explain why no animal has ever been larger than a blue whale -- because the energy needs of a larger body would outpace what the heart can sustain.

Now, the researchers are hard at work adding more capabilities to the tag, including an accelerometer, which could help them better understand how different activities affect heart rate. They also want to try their tag on other members of the rorqual whale group, such as fin whales, humpbacks and minke whales.

"A lot of what we do involves new technology and a lot of it relies on new ideas, new methods and new approaches," said Cade. "We're always looking to push the boundaries of how we can learn about these animals."

Read more at Science Daily

Scientists inch closer than ever to signal from cosmic dawn

Around 12 billion years ago, the universe emerged from a great cosmic dark age as the first stars and galaxies lit up. With a new analysis of data collected by the Murchison Widefield Array (MWA) radio telescope, scientists are now closer than ever to detecting the ultra-faint signature of this turning point in cosmic history.

In a paper posted on the preprint site arXiv and soon to be published in the Astrophysical Journal, researchers present the first analysis of data from a new configuration of the MWA designed specifically to look for the signal of neutral hydrogen, the gas that dominated the universe during the cosmic dark age. The analysis sets a new limit -- the lowest limit yet -- for the strength of the neutral hydrogen signal.

"We can say with confidence that if the neutral hydrogen signal was any stronger than the limit we set in the paper, then the telescope would have detected it," said Jonathan Pober, an assistant professor of physics at Brown University and corresponding author on the new paper. "These findings can help us to further constrain the timing of when the cosmic dark ages ended and the first stars emerged."

The research was led by Wenyang Li, who performed the work as a Ph.D. student at Brown. Li and Pober collaborated with an international group of researchers working with the MWA.

Despite its importance in cosmic history, little is known about the period when the first stars formed, which is known as the Epoch of Reionization (EoR). The first atoms that formed after the Big Bang were positively charged hydrogen ions -- atoms whose electrons were stripped away by the energy of the infant universe. As the universe cooled and expanded, hydrogen atoms reunited with their electrons to form neutral hydrogen. And that's just about all there was in the universe until about 12 billion years ago, when atoms started clumping together to form stars and galaxies. Light from those objects re-ionized the neutral hydrogen, causing it to largely disappear from interstellar space.

The goal of projects like the one happening at MWA is to locate the signal of neutral hydrogen from the dark ages and measure how it changed as the EoR unfolded. Doing so could reveal new and critical information about the first stars -- the building blocks of the universe we see today. But catching any glimpse of that 12-billion-year-old signal is a difficult task that requires instruments with exquisite sensitivity.

When it began operating in 2013, the MWA was an array of 2,048 radio antennas arranged across the remote countryside of Western Australia. The antennas are bundled together into 128 "tiles," whose signals are combined by a supercomputer called the Correlator. In 2016, the number of tiles was doubled to 256, and their configuration across the landscape was altered to improve their sensitivity to the neutral hydrogen signal. This new paper is the first analysis of data from the expanded array.
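
Those figures imply 16 antennas per tile (2,048 / 128); assuming the expanded array keeps the same bundling -- an inference on our part, not something the article states -- 256 tiles would correspond to 4,096 antennas. A quick check:

    # Quick arithmetic from the figures above. The assumption that the expanded
    # array keeps 16 antennas per tile is ours, not the article's.
    antennas_2013, tiles_2013 = 2048, 128
    antennas_per_tile = antennas_2013 // tiles_2013            # 16
    tiles_2016 = 256
    print(antennas_per_tile, tiles_2016 * antennas_per_tile)   # 16 4096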

Neutral hydrogen emits radiation at a wavelength of 21 centimeters. As the universe has expanded over the past 12 billion years, the signal from the EoR is now stretched to about 2 meters, and that's what MWA astronomers are looking for. The problem is there are myriad other sources that emit at the same wavelength -- human-made sources like digital television as well as natural sources from within the Milky Way and from millions of other galaxies.
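
That stretch factor is the cosmological redshift at work: an observed wavelength near 2 meters for a 21-centimeter rest wavelength implies 1 + z of roughly 9.5, or a redshift of about 8.5, squarely in the era EoR experiments target. A minimal sketch of the arithmetic, using the approximate values quoted above:

    # Cosmological redshift stretches wavelengths by a factor of (1 + z).
    rest_wavelength_m = 0.21        # the 21-cm line of neutral hydrogen
    observed_wavelength_m = 2.0     # roughly what the MWA looks for today
    stretch = observed_wavelength_m / rest_wavelength_m        # = 1 + z
    print(f"stretch ~ {stretch:.1f}, z ~ {stretch - 1:.1f}")   # stretch ~ 9.5, z ~ 8.5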

"All of these other sources are many orders of magnitude stronger than the signal we're trying to detect," Pober said. "Even an FM radio signal that's reflected off an airplane that happens to be passing above the telescope is enough to contaminate the data."

To home in on the signal, the researchers use a battery of processing techniques to weed out those contaminants. At the same time, they account for the unique frequency responses of the telescope itself.

"If we look at different radio frequencies or wavelengths, the telescope behaves a little differently," Pober said. "Correcting for the telescope response is absolutely critical for then doing the separation of astrophysical contaminants and the signal of interest."

Those data analysis techniques, combined with the expanded capacity of the telescope itself, resulted in a new upper bound on the EoR signal strength. It is the second consecutive best-limit-to-date analysis released by the MWA and raises hope that the experiment will one day detect the elusive EoR signal.

Read more at Science Daily

Nov 25, 2019

A monkey's balancing act

A new study, which looks specifically at the behaviour of an endangered monkey species, reveals that even in national parks where human presence is reduced and regulated, the animals carry out careful calculations and modify their natural behaviour to balance the pros and cons of living in close proximity to humans.

It reveals the negative impact that consuming human foods can have on the physical health of the monkeys, and highlights the need for new and sustainable conservation programmes to save the growing number of endangered species in their natural habitats.

Barbary macaques are an endangered species of monkey restricted to the forests of Morocco and Algeria, with an introduced population also living on the Rock of Gibraltar. The wild population in North Africa has declined dramatically in recent decades.

The new study, led by Dr Bonaventura Majolo from the University of Lincoln, UK, involved a detailed examination of the effects of human activity on wild Barbary macaques in Ifrane National Park in Morocco.

Dr Majolo said: "When we observe animals in the wild we often talk about a 'landscape of fear'. This term refers to the decisions that animals make when they choose whether or not to avoid an area where the risk of predation is highest; weighing up the risk of attack against the possible rewards to be found there.

"Our study shows that macaques make many behavioural adjustments in response to varying levels of risk and reward, and that the way the macaques respond to human activity is very similar to the way in which they respond to predation risk. We see evidence here that the macaques are capable of great behavioural flexibility as they navigate the problems and the opportunities that sharing space with humans presents."

The researchers followed five groups of Barbary macaques and observed their behaviour and habitat selection over the course of a year. Their findings reveal the true extent of human activity on the monkeys' habitats and choices.

The researchers observed the macaques making significant adjustments to their behaviour and navigating their environment strategically in relation to human activity. They appear to balance food acquisition and risk avoidance -- for example they minimise risk by avoiding areas used by local shepherds and their dogs (which are now among the monkeys' most dangerous predators), and exploit opportunities to receive high-calorie human food by spending time close to roads.

Although being fed by humans may appear to be beneficial for the monkeys, food provisioning in fact has negative impacts on the macaques -- increasing their stress levels, heightening the probability of road injury and death, and having a detrimental impact on their health.

The monkeys' behaviour also shows seasonal trends in correlation with human activities. The macaques avoid herding routes during summer months, when herding activity by the local shepherds is at its peak, and they are more likely to use areas close to roads in the autumn and winter months, when natural food sources are low and the benefits of receiving high calorie human food may exceed the risk of being injured or even killed by road traffic.

The study reveals that the 'home range' of each observed macaque group (the area where a group of monkeys spend most of their time) included some kind of human structure, from roads and paths to picnic areas and farms. They also found that every group's home range overlapped with at least one other's, which the researchers conclude could be a result of declining availability of suitable habitats and food sources, or of direct competition over profitable areas close to roads and safe sleeping sites.

Their findings are published in the scientific journal Animal Conservation.

James Waterman, first author of the paper and a PhD student at Liverpool John Moores University, said: "Even in a national park, the effect of human disturbance on animal life can be considerable, and as our landscapes become increasingly human dominated, many wildlife species must cope with new ecological pressures. The impact of habitat loss and fragmentation, climate change, expanding human infrastructure, hunting and poaching quickly and dramatically alters habitats, forcing wildlife to adjust, move to more suitable areas (if these are available), or ultimately face the threat of extinction.

"This study highlights that it is more important than ever to develop conservation programs that take into account the requirements of all involved, including, but not limited to the wildlife that is ultimately at risk. Programs that fail to do so rarely produce lasting, positive change."

Read more at Science Daily

Bizarre worlds orbiting a black hole?

Theoreticians in two different fields have challenged the conventional wisdom that planets form only around stars like the Sun, proposing instead that thousands of planets could exist around a supermassive black hole.

"With the right conditions, planets could be formed even in harsh environments, such as around a black hole," says Keiichi Wada, a professor at Kagoshima University researching active galactic nuclei which are luminous objects energized by black holes.

According to the latest theories, planets are formed from fluffy dust aggregates in a protoplanetary disk around a young star. But young stars are not the only objects that possess dust disks. In a novel approach, the researchers focused on heavy disks around supermassive black holes in the nuclei of galaxies.

"Our calculations show that tens of thousands of planets with 10 times the mass of the Earth could be formed around 10 light-years from a black hole," says Eiichiro Kokubo, a professor at the National Astronomical Observatory of Japan who studies planet formation. "Around black holes there might exist planetary systems of astonishing scale."

Some supermassive black holes have large amounts of matter around them in the form of a heavy, dense disk. Such a disk can contain as much as a hundred thousand solar masses of dust -- a billion times the dust mass of a protoplanetary disk.
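
Taken together, those two figures imply that a typical protoplanetary disk holds on the order of one ten-thousandth of a solar mass in dust. A quick sanity check of the quoted ratio (illustrative arithmetic, not from the study):

    # Sanity-checking the quoted dust masses (illustrative only).
    blackhole_disk_dust_msun = 1e5     # "a hundred thousand solar masses of dust"
    ratio = 1e9                        # "a billion times" a protoplanetary disk
    print(blackhole_disk_dust_msun / ratio)   # 0.0001 solar masses of dust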

In a low temperature region of a protoplanetary disk, dust grains with ice mantles stick together and evolve into fluffy aggregates. A dust disk around a black hole is so dense that the intense radiation from the central region is blocked and low temperature regions are formed. The researchers applied the planet formation theory to circumnuclear disks and found that planets could be formed in several hundred million years.

Currently there are no techniques to detect these planets around black holes. However, the researchers expect this study to open a new field of astronomy.

From Science Daily

Cannabis reduces headache and migraine pain by nearly half

Inhaled cannabis reduces self-reported headache severity by 47.3% and migraine severity by 49.6%, according to a recent study led by Carrie Cuttler, a Washington State University assistant professor of psychology.

The study, published online recently in the Journal of Pain, is the first to use big data from headache and migraine patients using cannabis in real time. Previous studies have asked patients to recall the effect of cannabis use in the past. There has been one clinical trial indicating that cannabis was better than ibuprofen in alleviating headache, but it used nabilone, a synthetic cannabinoid drug.

"We were motivated to do this study because a substantial number of people say they use cannabis for headache and migraine, but surprisingly few studies had addressed the topic," said Cuttler, the lead author on the paper.

In the WSU study, researchers analyzed archival data from the Strainprint app, which allows patients to track symptoms before and after using medical cannabis purchased from Canadian producers and distributors. The information was submitted by more than 1,300 patients who used the app over 12,200 times to track changes in headache from before to after cannabis use, and another 653 who used the app more than 7,400 times to track changes in migraine severity.

"We wanted to approach this in an ecologically valid way, which is to look at actual patients using whole plant cannabis to medicate in their own homes and environments," Cuttler said. "These are also very big data, so we can more appropriately and accurately generalize to the greater population of patients using cannabis to manage these conditions."

Cuttler and her colleagues saw no evidence that cannabis caused "overuse headache," a pitfall of more conventional treatments, which can make patients' headaches worse over time. However, they did see patients using larger doses of cannabis over time, indicating they may be developing tolerance to the drug.

The study found a small gender difference, with significantly more sessions involving headache reduction reported by men (90.0%) than by women (89.1%). The researchers also noted that cannabis concentrates, such as cannabis oil, produced a larger reduction in headache severity ratings than cannabis flower.

There was, however, no significant difference in pain reduction among cannabis strains that were higher or lower in levels of tetrahydrocannabinol (THC) and cannabidiol (CBD), two of the most commonly studied chemical constituents in cannabis, also known as cannabinoids. Since cannabis is made up of over 100 cannabinoids, this finding suggests that different cannabinoids or other constituents like terpenes may play the central role in headache and migraine relief.

More research is needed, and Cuttler acknowledges the limitations of the Strainprint study since it relies on a self-selected group of people who may already anticipate that cannabis will work to alleviate their symptoms, and it was not possible to employ a placebo control group.

Read more at Science Daily

Babies in the womb may see more than we thought

By the second trimester, long before a baby's eyes can see images, they can detect light.

But the light-sensitive cells in the developing retina -- the thin sheet of brain-like tissue at the back of the eye -- were thought to be simple on-off switches, presumably there to set up the 24-hour, day-night rhythms parents hope their baby will follow.

University of California, Berkeley, scientists have now found evidence that these simple cells actually talk to one another as part of an interconnected network that gives the retina more light sensitivity than once thought, and that may enhance the influence of light on behavior and brain development in unsuspected ways.

In the developing eye, perhaps 3% of ganglion cells -- the cells in the retina that send messages through the optic nerve into the brain -- are sensitive to light and, to date, researchers have found about six different subtypes that communicate with various places in the brain. Some talk to the suprachiasmatic nucleus to tune our internal clock to the day-night cycle. Others send signals to the area that makes our pupils constrict in bright light.

But others connect to surprising areas: the perihabenula, which regulates mood, and the amygdala, which deals with emotions.

In mice and monkeys, recent evidence suggests that these ganglion cells also talk with one another through electrical connections called gap junctions, implying much more complexity in immature rodent and primate eyes than imagined.

"Given the variety of these ganglion cells and that they project to many different parts of the brain, it makes me wonder whether they play a role in how the retina connects up to the brain," said Marla Feller, a UC Berkeley professor of molecular and cell biology and senior author of a paper that appeared this month in the journal Current Biology. "Maybe not for visual circuits, but for non-vision behaviors. Not only the pupillary light reflex and circadian rhythms, but possibly explaining problems like light-induced migraines, or why light therapy works for depression."

Parallel systems in developing retina

The cells, called intrinsically photosensitive retinal ganglion cells (ipRGCs), were discovered only 10 years ago, surprising those like Feller who had been studying the developing retina for nearly 20 years. She played a major role, along with her mentor, Carla Shatz of Stanford University, in showing that spontaneous electrical activity in the eye during development -- so-called retinal waves -- is critical for setting up the correct brain networks to process images later on.

Hence her interest in the ipRGCs that seemed to function in parallel with spontaneous retinal waves in the developing retina.

"We thought they (mouse pups and the human fetus) were blind at this point in development," said Feller, the Paul Licht Distinguished Professor in Biological Sciences and a member of UC Berkeley's Helen Wills Neuroscience Institute. "We thought that the ganglion cells were there in the developing eye, that they are connected to the brain, but that they were not really connected to much of the rest of the retina, at that point. Now, it turns out they are connected to each other, which was a surprising thing."

UC Berkeley graduate student Franklin Caval-Holme combined two-photon calcium imaging, whole-cell electrical recording, pharmacology and anatomical techniques to show that the six types of ipRGCs in the newborn mouse retina link up electrically, via gap junctions, into a retinal network. The researchers found that this network not only detects light but responds to its intensity, which can vary nearly a billionfold.

Gap junction circuits were critical for light sensitivity in some ipRGC subtypes, but not others, providing a potential avenue to determine which ipRGC subtypes provide the signal for specific non-visual behaviors that light evokes.

"Aversion to light, which pups develop very early, is intensity-dependent," suggesting that these neural circuits could be involved in light-aversion behavior, Caval-Holme said. "We don't know which of these ipRGC subtypes in the neonatal retina actually contributes to the behavior, so it will be very interesting to see what role all these different subtypes have."

The researchers also found evidence that the circuit tunes itself in a way that could adapt to the intensity of light, which probably has an important role in development, Feller said.

Read more at Science Daily

The nature of salmonella is changing -- and it's meaner

Salmonella is acting up in Michigan, and it could be a model for what's happening in other states, according to a new Michigan State University study.

The study, appearing in Frontiers in Medicine, documents a substantial uptick in antibiotic-resistant strains and, consequently, longer hospital stays as doctors work to treat the increasingly virulent pathogens.

"If you get a salmonella infection that is resistant to antibiotics today, you are more likely to be hospitalized longer, and it will take you longer to recover," said Shannon Manning, MSU Foundation professor in the Department of Microbiology and Molecular Genetics and senior author of the study. "We need better detection methods at the clinical level to identify resistant pathogens earlier so we can treat them with the right drugs the first time."

Losing a day or more to misdiagnosis or improper treatment allows symptoms to get worse. Doctors might kill off a subpopulation of bacteria that are susceptible, but the ones that are resistant grow stronger, she added.

Salmonella is a diverse group of bacterial pathogens that causes foodborne infections. Infected patients often develop diarrhea, nausea, vomiting and abdominal pain, though some infections are more severe and can be life threatening.

When it comes to treatments, each strain reacts differently to the range of antibiotics available for prescription by doctors. So getting it right the first time is crucial.

Specifically in Michigan, doctors are seeing more strains that are resistant to ampicillin, a common antibiotic prescribed to treat salmonella. Multidrug resistance, or resistance to more than three classes of antibiotics, has also increased in Michigan and could further complicate patient treatment plans.

"We're still uncertain as to why this is happening; it could be that these antibiotics have been overprescribed in human and veterinary medicine and that possessing genes for resistance has allowed these bacteria to grow and thrive in the presence of antibiotics," Manning said. "Each state has its own antibiotic-resistance issues. It's important that the medical profession remains vigilant to ever-changing patterns of resistance in salmonella and other foodborne pathogens, rather than look for a blanket national solution."

Historically, salmonella has affected young children and the elderly, but now there's been a rise in adult cases, suggesting that the epidemiology of the infections has changed in Michigan.

Diving into individual strains of salmonella, the team of scientists found that patients infected with Typhimurium were more likely to have resistant infections, as were patients infected during the fall, winter or spring months.

Another distinction was revealed between the strains affecting people living in rural and urban areas. Enteritis infections tend to be higher in rural areas. This may be attributed to rural residents' exposure to farm animals or untreated sources of water.

Each state's salmonella population has its own personality, so every state's approach to identifying disease drivers and effective treatments should be tailored to reflect these traits.

Read more at Science Daily

Nov 24, 2019

Science underestimated dangerous effects of sleep deprivation

Michigan State University's Sleep and Learning Lab has conducted one of the largest sleep studies to date, revealing that sleep deprivation affects us much more than prior theories have suggested.

Published in the Journal of Experimental Psychology: General, the research is not only one of the largest studies, but also the first to assess how sleep deprivation impacts placekeeping -- or, the ability to complete a series of steps without losing one's place, despite potential interruptions. This study builds on prior research from MSU's sleep scientists to quantify the effect lack of sleep has on a person's ability to follow a procedure and maintain attention.

"Our research showed that sleep deprivation doubles the odds of making placekeeping errors and triples the number of lapses in attention, which is startling," Fenn said. "Sleep-deprived individuals need to exercise caution in absolutely everything that they do, and simply can't trust that they won't make costly errors. Oftentimes -- like when behind the wheel of a car -- these errors can have tragic consequences."

By sharing their findings on the separate effects sleep deprivation has on cognitive function, Fenn and co-authors Michelle Stepan, an MSU doctoral candidate, and Erik Altmann, a professor of psychology, hope that people will acknowledge how significantly their abilities are hindered by a lack of sleep.

"Our findings debunk a common theory that suggests that attention is the only cognitive function affected by sleep deprivation," Stepan said. "Some sleep-deprived people might be able to hold it together under routine tasks, like a doctor taking a patient's vitals. But our results suggest that completing an activity that requires following multiple steps, such as a doctor completing a medical procedure, is much riskier under conditions of sleep deprivation."

The researchers recruited 138 people to participate in the overnight sleep assessment; 77 stayed awake all night and 61 went home to sleep. All participants took two separate cognitive tasks in the evening: one measured reaction time to a stimulus; the other measured a participant's ability to maintain their place in a series of steps without omitting or repeating a step, even after sporadic interruptions. The participants then repeated both tasks in the morning to see how sleep deprivation affected their performance.

"After being interrupted there was a 15% error rate in the evening and we saw that the error rate spiked to about 30% for the sleep-deprived group the following morning," Stepan said. "The rested participants' morning scores were similar to the night before.

"There are some tasks people can do on auto-pilot that may not be affected by a lack of sleep," Fenn said. "However, sleep deprivation causes widespread deficits across all facets of life."

From Science Daily

Universal features of music around the world

Is music really a "universal language"? Two articles in the most recent issue of Science support the idea that music all around the globe shares important commonalities, despite many differences. Researchers led by Samuel Mehr at Harvard University have undertaken a large-scale analysis of music from cultures around the world. Cognitive biologists Tecumseh Fitch and Tudor Popescu of the University of Vienna suggest that human musicality unites all cultures across the planet.

The many musical styles of the world are so different, at least superficially, that music scholars are often sceptical that they have any important shared features. "Universality is a big word -- and a dangerous one," the great Leonard Bernstein once said. Indeed, in ethnomusicology, universality became something of a dirty word. But new research promises to once again revive the search for deep universal aspects of human musicality.

Mehr's team found that all cultures studied make music and use similar kinds of music in similar contexts, with consistent features in each case. For example, dance music is fast and rhythmic, and lullabies soft and slow -- all around the world. Furthermore, all cultures showed tonality: building up a small subset of notes from some base note, just as in the Western diatonic scale. Healing songs tend to use fewer notes, and more closely spaced, than love songs. These and other findings indicate that there are indeed universal properties of music that likely reflect deeper commonalities of human cognition -- a fundamental "human musicality."

In a Science perspective piece in the same issue, University of Vienna researchers Tecumseh Fitch and Tudor Popescu comment on the implications. "Human musicality fundamentally rests on a small number of fixed pillars: hard-coded predispositions, afforded to us by the ancient physiological infrastructure of our shared biology. These 'musical pillars' are then 'seasoned' with the specifics of every individual culture, giving rise to the beautiful kaleidoscopic assortment that we find in world music," Tudor Popescu explains.

"This new research revives a fascinating field of study, pioneered by Carl Stumpf in Berlin at the beginning of the 20th century, but that was tragically terminated by the Nazis in the 1930s," Fitch adds.

As humanity comes closer together, so does our wish to understand what it is that we all have in common -- in all aspects of behaviour and culture. The new research suggests that human musicality is one of these shared aspects of human cognition. "Just as European countries are said to be 'United In Diversity', so too the medley of human musicality unites all cultures across the planet," concludes Tudor Popescu.

From Science Daily