May 18, 2019

Jawless fish take a bite out of the blood-brain barrier

Lamprey mouth.
A jawless parasitic fish could help lead the way to more effective treatments for multiple brain ailments, including cancer, trauma and stroke.

One major challenge in treating cancers and other disorders of the brain is ensuring that medicines reach their targets. A team of biomedical engineers and clinician-scientists at the University of Wisconsin-Madison and the University of Texas at Austin borrowed molecules from the immune system of the parasitic sea lamprey to deliver anti-cancer drugs directly to brain tumors.

They published their results today (May 15, 2019) in the journal Science Advances.

Unlike most currently used medicines, which target specific features on or inside individual cells in our body's organs and tissues, the lamprey-derived molecules take aim at a different target -- the extracellular matrix, a tangled mesh of proteins and sugars that supports and surrounds all cells in the brain.

The researchers believe the molecules could be adapted and combined with a wide array of other therapies, offering hope to treat numerous brain ailments beyond tumors, such as multiple sclerosis, Alzheimer's disease or even traumatic injuries.

"This set of targeting molecules appears somewhat agnostic to the disease," says Eric Shusta, a professor of chemical and biological engineering at UW-Madison. "We believe it could be applied as a platform technology across multiple conditions."

The technology takes advantage of the fact that many diseases disrupt one of the body's natural defense mechanisms: the blood-brain barrier, which lines the blood vessels of the central nervous system and protects the brain from potential threats such as circulating toxins or pathogens.

Many drugs -- including the lamprey-derived molecules -- cannot reach targets in the brain when they are injected into the bloodstream, because the blood-brain barrier normally prevents large molecules from leaving the blood vessels in the brain.

Yet, in conditions such as brain cancer, stroke, trauma and multiple sclerosis, the barrier becomes leaky in and around the disease locations. A leaky barrier offers a unique point of entry, allowing the matrix-targeting lamprey molecules to access the brain and deliver drugs precisely on target.

"Molecules like this normally couldn't ferry cargo into the brain, but anywhere there's a blood-brain barrier disruption, they can deliver drugs right to the site of pathology," says Shusta.

Knowing that brain tumors often cause the barrier to leak, the researchers linked the lamprey-derived molecules to a Food and Drug Administration-approved chemotherapy called doxorubicin. The treatment prolonged survival in mouse models of glioblastoma, the incurable brain cancer that afflicted Senators John McCain and Ted Kennedy.

The matrix-targeting strategy means a wide variety of therapies could be linked to the lamprey-derived molecules. They could also be combined with techniques that temporarily open the blood-brain barrier at specific brain sites. And it's possible that drugs delivered to the matrix could accumulate to a much higher therapeutic dose than medicines aimed at the inside of cells.

"Similar to water soaking into a sponge, the lamprey molecules will potentially accumulate much more of the drug in the abundant matrix around cells compared to specific delivery to cells," says collaborator John Kuo, a neurosurgeon-scientist and professor of neurosurgery in the Dell Medical School at the University of Texas at Austin.

Additionally, brain cells actively pump out many chemicals -- a useful trick to protect against toxic compounds, but a major headache for achieving effective therapeutic doses for medicines.

Targeting the matrix that surrounds the cells sidesteps that pumping problem.

"This could be a way to hold therapies in place that don't otherwise accumulate well in the brain so they can be more effective," says Ben Umlauf, a postdoctoral scholar in Shusta's group who isolated the lamprey-derived molecules.

Lampreys and humans have similar immune systems. But instead of producing antibodies to neutralize threats (that's how vaccines help protect us against measles), lampreys produce small crescent-shaped defensive molecules called VLRs (variable lymphocyte receptors). To obtain their drug-delivery molecules, the researchers "vaccinated" lampreys with components of the brain extracellular matrix and then hunted through many thousands of VLRs to find one that stuck specifically to the brain matrix.

Importantly, in the mouse studies, the lamprey-derived molecules circulated throughout the body without accumulating in healthy brain tissue or other organs. This targeted delivery is especially important in cancer treatments, since many therapies frequently cause debilitating adverse reactions due to indiscriminate effects on healthy cells.

In the future, the researchers plan to link the matrix-targeting molecules to additional anti-cancer drugs, such as immunotherapy agents that activate a patient's own immune system to destroy tumors.

They also see promise in using the molecules as diagnostic tools to detect blood-brain barrier disruption by linking the matrix binders with probes for advanced imaging with PET scanners or MRI machines.

And because the molecules appear to be quite adaptable, the researchers speculate that many other medicines for the brain could become more effective if they were targeted to the matrix.

"I'm excited about trying this strategy in different disease model systems," says Kuo. "There are several disease processes that disrupt the blood-brain barrier and we could conceive of delivering a variety of different therapies with these molecules."

Read more at Science Daily

Brain's insular cortex processes pain and drives learning from pain

Pain concept (touching cactus).
Acute pain, e.g. hitting your leg against a sharp object, causes an abrupt, unpleasant feeling. In this way, we learn from painful experiences to avoid future harmful situations. This is called "threat learning" and helps animals and humans to survive. But which part of the brain actually warns other parts of the brain of painful events so that threat learning can occur?

We've known for a while that a brain area called the amygdala is important for threat learning. But now, scientists from the lab of Ralf Schneggenburger at EPFL have discovered that the insular cortex sends such "warnings." The insular cortex, folded deep within the lateral sulcus of the brain, is known to code for feelings about our own body. Moreover, neurons in the insular cortex connect to neurons in the amygdala, but the function of this brain connection was previously little studied.

Because the insular cortex is similar in mice and humans, the scientists turned to mice for their study. The researchers used light-activated ion channels that were genetically engineered into specific neurons in the brains of mice. This allowed them to switch off the electrical activity of neurons in the insular cortex by shining brief pulses of laser light during threat-learning behavior.

By switching off the insular cortex during the painful event, the scientists found that the mice showed essentially no fear of a mild electric shock to the foot. In addition, the ability of the mice to learn from the painful event was greatly reduced.

The study demonstrates that, besides informing our brain about bodily states, the insular cortex can send a strong warning signal to other brain areas involved in forming a memory of the unpleasant event. "Because silencing the insular cortex takes away the unpleasant feeling normally associated with a painful event, our study suggests that neurons in the insular cortex cause the subjective feeling of pain, and induce learning about the pain in other brain areas," says Schneggenburger.

"Because of this, activity in the insular cortex could have powerful consequences on shaping brain connectivity in other brain areas, which fits with studies that show aberrant activity in the insular cortex in humans with certain psychiatric diseases. Thus, our study of the neuronal mechanisms of how pain is encoded in the brain -- together with future studies of the underlying plasticity mechanisms -- might be relevant for the development of treatments for psychiatric diseases such as anxiety and post-traumatic stress disorders."

From Science Daily

May 17, 2019

Scientists find new type of cell that helps tadpoles' tails regenerate

Tadpole.
Researchers at the University of Cambridge have uncovered a specialised population of skin cells that coordinate tail regeneration in frogs. These 'Regeneration-Organizing Cells' help to explain one of the great mysteries of nature and may offer clues about how this ability might be achieved in mammalian tissues.

It has long been known that some animals can regrow their tails following amputation -- Aristotle observed this in the fourth century B.C. -- but the mechanisms that support such regenerative potential remain poorly understood.

Using 'single-cell genomics', scientists at the Wellcome Trust/Cancer Research UK Gurdon Institute at the University of Cambridge developed an ingenious strategy to uncover what happens in different tadpole cells when they regenerate their tails.

Recent Cambridge-led advances in next-generation sequencing mean that scientists can now track which genes are turned on (being expressed) throughout a whole organism or tissue, at the resolution of individual cells. This technique, known as 'single-cell genomics', makes it possible to distinguish between cell types in more detail based on their characteristic selection of active genes.

These breakthroughs are beginning to reveal a map of cellular identities and lineages, as well as the factors involved in controlling how cells choose between alternative pathways during embryo development to produce the range of cell types in adults.

Using this technology, Can Aztekin and Dr Tom Hiscock -- under the direction of Dr Jerome Jullien -- made a detailed analysis of cell types involved in regeneration after damage in African clawed frog tadpoles (Xenopus laevis). Details are published today in the journal Science.

Dr Tom Hiscock says: "Tadpoles can regenerate their tails throughout their life; but there is a two-day period at a precise stage in development where they lose this ability. We exploited this natural phenomenon to compare the cell types present in tadpoles capable of regeneration and those no longer capable."

The researchers found that the regenerative response of stem cells is orchestrated by a single sub-population of epidermal (skin) cells, which they termed Regeneration-Organizing Cells, or ROCs.

Can Aztekin says: "It's an astonishing process to watch unfold. After tail amputation, ROCs migrate from the body to the wound and secrete a cocktail of growth factors that coordinate the response of tissue precursor cells. These cells then work together to regenerate a tail of the right size, pattern and cell composition."

In mammals, many tissues such as the skin epidermis, the intestinal epithelium and the blood system undergo constant turnover through life. Cells lost through exhaustion or damage are replenished by stem cells. However, these specialised cells are usually dedicated to tissue sub-lineages, while the ability to regenerate whole organs and tissues has been lost in all but a minority of tissues such as liver and skin.

Read more at Science Daily

Sedimentary, my dear Johnson: Is NASA looking at the wrong rocks for clues to Martian life?

Mars robotic rover illustration.
In 2020, NASA and European-Russian missions will look for evidence of past life on Mars. But while volcanic, igneous rock predominates on the Red Planet, virtually the entire Earth fossil record comes from sedimentary rocks.

Addressing the problem in Frontiers in Earth Science, Swedish scientists have begun compiling evidence of fossilized microbes in underexplored igneous rock environments on Earth, to help guide where to search for a Martian fossil record -- and what to look for.

"We propose a 'volcanic microfossil atlas' to help select target sites for missions seeking evidence of extraterrestrial life, such as the NASA Mars mission 2020 and ExoMars," says lead author Dr. Magnus Ivarsson. "The atlas could also help us recognize what Mars microfossils might look like, by identifying biosignatures associated with different types of fossilized microbes."

Earth's deep biosphere

Ivarsson and colleagues study life buried in deep rock and deep time: fossilized remains of mysterious microbes that have lived up to a kilometer below the deepest ocean floors for as long as 3.5 billion years.

"The majority of the microorganisms on Earth are believed to exist in the deep biosphere of the ocean and continental crust," reveals Ivarsson. "Yet we are just now beginning to explore -- through deep drilling projects -- this hidden biosphere."

In a watery world that never sees sunlight, bacteria, fungi and other microbes have adapted to feed on the igneous rock that surrounds them -- or even on each other. They spread through micro-fractures and cavities, forming complex and extended communities.

"Upon death, the microbial communities become fossilized on the walls of their rocky home. These microfossils can provide a history of microbial life in volcanic rock."

A volcanic microfossil atlas

Crucially, Earth's oceanic crust is geochemically very similar to the volcanic rocks that dominate the Martian landscape.

"Our aim is to be able to use the oceanic crust microfossil record as a model system to guide Martian exploration," Ivarsson explains. "Our review of existing knowledge is an important first step, but a more comprehensive understanding of the deep life is needed to show where and what to search for."

To achieve this, says Ivarsson, we need to collect more data on microfossil appearance and location -- but also, on their chemical composition.

"These fossils often preserve immense morphological detail. For example, we can distinguish broad classes of fungi through the appearance of spores, fruiting bodies, mycelia and other growth states -- or of bacteria, through the presence of cauliflower-like formations, generations of biofilms preserved as laminated sheets, and other characteristic community structures.

"But analysis of lipids and carbon isotopes in microfossils will make it possible to discriminate more precise groups based on their metabolism.

"Altogether this information will help to identify which types of microorganism are most likely to have been preserved on Mars, and which geochemical conditions most favour fossilization."

A fossil record on Mars


The microfossil atlas would therefore also help to determine which samples should be targeted for return to Earth, given the limited payload of the Mars missions.

"Both NASA's Mars 2020 and the ExoMars missions are capable of detecting larger fossilized structures from volcanic rocks, such as mm-sized mineralized fungal mycelia, or larger microstromatolites in open vesicles.

Read more at Science Daily

Earliest evidence of the cooking and eating of starch

Fire in hearth.
New discoveries made at the Klasies River Cave in South Africa's southern Cape, where charred food remains from hearths were found, provide the first archaeological evidence that anatomically modern humans were roasting and eating plant starches, such as those from tubers and rhizomes, as early as 120,000 years ago.

The new research by an international team of archaeologists, published in the Journal of Human Evolution, provides archaeological evidence that has previously been lacking to support the hypothesis that the duplication of the starch digestion genes is an adaptive response to an increased starch diet.

"This is very exciting. The genetic and biological evidence previously suggested that early humans would have been eating starches, but this research had not been done before," says Lead author Cynthia Larbey of the Department of Archaeology at the University of Cambridge. The work is part of a systemic multidisciplinary investigation into the role that plants and fire played in the lives of Middle Stone Age communities.

The interdisciplinary team searched for and analysed undisturbed hearths at the Klasies River archaeological site.

"Our results showed that these small ashy hearths were used for cooking food and starchy roots and tubers were clearly part of their diet, from the earliest levels at around 120,000 years ago through to 65,000 years ago," says Larbey. "Despite changes in hunting strategies and stone tool technologies, they were still cooking roots and tubers."

Professor Sarah Wurz from the School of Geography, Archaeology and Environmental Studies at the University of the Witwatersrand in Johannesburg, South Africa (Wits University) and principal investigator of the site says the research shows that "early human beings followed a balanced diet and that they were ecological geniuses, able to exploit their environments intelligently for suitable foods and perhaps medicines."

By combining cooked roots and tubers as a staple with protein and fats from shellfish, fish, small and large fauna, these communities were able to optimally adapt to their environment, indicating great ecological intelligence as early as 120,000 years ago.

"Starch diet isn't something that happens when we started farming, but rather, is as old as humans themselves," says Larbey. Farming in Africa only started in the last 10,000 years of human existence.

Humans living in South Africa 120,000 years ago formed and lived in small bands.

"Evidence from Klasies River, where several human skull fragments and two maxillary fragments dating 120,000 years ago occur, show that humans living in that time period looked like modern humans of today. However, they were somewhat more robust," says Wurz.

Klasies River is a very famous early human occupation site on the Cape coast of South Africa excavated by Wurz, who, along with Susan Mentzer of the Senckenberg Institute and Eberhard Karls Universität Tübingen, investigated the small (c. 30 cm in diameter) hearths.

Read more at Science Daily

Scientists propose rethinking 'endangered species' definition to save slow-breeding giants

Elephants.
Conservation decisions based on population counts may fail to protect large, slow-breeding animals from irrevocable decline, according to new research coinciding with Endangered Species Day.

"Critical thresholds in so-called vital rates -- such as mortality and fertility rates among males and females of various ages -- can signal an approaching population collapse long before numbers drop below a point of no return," says lead author Dr. Shermin de Silva, President & Founder of Asian elephant conservation charity Trunks & Leaves. "We propose that conservation efforts for Asian elephants and other slow-breeding megafauna be aimed at maintaining their 'demographic safe space': that is, the combination of key vital rates that supports a non-negative growth rate."

A mammoth insight

Published in Frontiers in Ecology and Evolution, the study suggests that a combination of key vital rates governing population growth is a better indicator of a species' viability than short-term trends in population size and distribution.

"History bears this out," argues de Silva. "Genomic studies of the last mammoths isolated on Wrangel Island -- between Russia and Alaska -- have shown that although they were able to persist for thousands of years beyond the extinction of mainland populations with just ~300 individuals, they had accumulated numerous genetic mutations that may have eventually contributed to their extinction."

In other words, populations of megafauna can become biologically inviable long before they disappear, if pushed beyond their 'demographic safe space.'

Females and calves key to saving the Asian elephant

The group applied the 'demographic safe space' concept to the case of the Asian elephant.

"Asian elephants are classified as 'Endangered' under the IUCN Red List because populations are thought to have declined by at least 50% in less than a century," explains de Silva. "There are fewer than 50,000 wild Asian elephants living today."

Studies show that wild Asian elephants breed extremely slowly, the majority producing just one calf in six years or more. Using mathematical modeling, de Silva and colleagues found that near-optimal reproduction and high calf survival are necessary to maintain non-negative population growth in the face of even modestly increased mortality among adult female age classes.
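
To make the idea of a 'demographic safe space' concrete, here is a minimal sketch in Python. It is not the authors' model: the stage structure and every vital rate below are invented for illustration, and the point is only to show how combinations of calf and adult female survival map onto a long-term growth rate.

```python
# Illustrative sketch only (not the study's model); all vital rates are invented.
# We ask which combinations of calf survival and adult female survival keep the
# long-term growth rate (dominant eigenvalue of a stage matrix) at or above 1.
import numpy as np

def growth_rate(calf_survival, adult_survival, fecundity=1/6.0,
                juvenile_survival=0.95, years_as_calf=3, years_as_juvenile=12):
    # Three-stage, female-only model: calf -> juvenile -> adult (Lefkovitch-style).
    t_c = calf_survival * (1 - 1 / years_as_calf)        # survive and remain a calf
    g_c = calf_survival * (1 / years_as_calf)            # survive and become a juvenile
    t_j = juvenile_survival * (1 - 1 / years_as_juvenile)
    g_j = juvenile_survival * (1 / years_as_juvenile)
    A = np.array([[t_c, 0.0, fecundity * 0.5],           # only female calves are counted
                  [g_c, t_j, 0.0],
                  [0.0, g_j, adult_survival]])
    return max(abs(np.linalg.eigvals(A)))                # long-term growth rate (lambda)

for adult_s in (0.99, 0.97, 0.95):
    for calf_s in (0.95, 0.90, 0.80):
        lam = growth_rate(calf_s, adult_s)
        status = "safe" if lam >= 1.0 else "declining"
        print(f"adult survival {adult_s:.2f}, calf survival {calf_s:.2f}: "
              f"lambda = {lam:.3f} ({status})")
```

Combinations that keep lambda at or above 1 fall inside this illustrative 'safe space'; those below 1 imply eventual decline no matter how many animals are alive today.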

The approach shows a clear conservation priority for Asian elephants, a species in which the vast majority of individuals are tuskless.

"Measures to enhance survival of calves, and particularly females, are key to saving the Asian elephant," emphasizes de Silva.

"But while the attention of the world has been focused on the ivory trade, for critically endangered Asian elephant populations the greatest threat is habitat loss -- followed by illegal trade in live animals and parts.

"Habitat loss can create something known as 'extinction debt' by slowing down birth rates and increasing mortality rates. For slow breeding long-lived species, even incremental changes make a big difference, but their longevity can obscure the risk of extinction."

A demographic safe space for all megafauna

Conservation efforts for other large, slow-breeding species -- such as giraffes, rhinos, Bactrian camels and eastern gorillas -- could also benefit from modelling the interaction between vital rates. Data on these species in the wild are scarce yet urgently needed, the authors suggest.

"Rather than rely on simple population counts or estimates of near-term extinction probability, we urge that conservation resources for slow-breeding megafauna also be invested in identifying demographic tipping points and how to maintain populations within their safe spaces.

Read more at Science Daily

May 16, 2019

Neanderthals and modern humans diverged at least 800,000 years ago, research on teeth shows

Neanderthal vs human skull
Neanderthals and modern humans diverged at least 800,000 years ago, substantially earlier than indicated by most DNA-based estimates, according to new research by a UCL academic.

The research, published in Science Advances, analysed dental evolutionary rates across different hominin species, focusing on early Neanderthals. It shows that the teeth of hominins from Sima de los Huesos, Spain -- ancestors of the Neanderthals -- diverged from the modern human lineage earlier than previously assumed.

Sima de los Huesos is a cave site in the Atapuerca Mountains, Spain, where archaeologists have recovered fossils of almost 30 people. Previous studies date the site to around 430,000 years ago (Middle Pleistocene), making it one of the oldest and largest collections of human remains discovered to date.

Dr Aida Gomez-Robles (UCL Anthropology), said: "Any divergence time between Neanderthals and modern humans younger than 800,000 years ago would have entailed an unexpectedly fast dental evolution in the early Neanderthals from Sima de los Huesos."

"There are different factors that could potentially explain these results, including strong selection to change the teeth of these hominins or their isolation from other Neanderthals found in mainland Europe. However, the simplest explanation is that the divergence between Neanderthals and modern humans was older than 800,000 years. This would make the evolutionary rates of the early Neanderthals from Sima de los Huesos roughly comparable to those found in other species."

Modern humans share a common ancestor with Neanderthals, the extinct species that was our closest prehistoric relative. However, the details of when and how the two lineages diverged are a matter of intense debate within the anthropological community.

Ancient DNA analyses have generally indicated that both lineages diverged around 300,000 to 500,000 years ago, which has strongly influenced the interpretation of the hominin fossil record.

This divergence time, however, is not compatible with the anatomical and genetic Neanderthal similarities observed in the hominins from Sima de los Huesos. The Sima fossils are considered likely Neanderthal ancestors based on both anatomical features and DNA analysis.

Dr Gomez-Robles said: "Sima de los Huesos hominins are characterised by very small posterior teeth (premolars and molars) that show multiple similarities with classic Neanderthals. It is likely that the small and Neanderthal-looking teeth of these hominins evolved from the larger and more primitive teeth present in the last common ancestor of Neanderthals and modern humans."

Dental shape has evolved at very similar rates across all hominin species, including those with very expanded and very reduced teeth. This new study examined the time at which Neanderthals and modern humans would have to have diverged for the evolutionary rate of the early Neanderthals from Sima de los Huesos to be similar to the rates observed in other hominins.

The research used quantitative data to measure the evolution of dental shape across hominin species assuming different divergent times between Neanderthals and modern humans, and accounting for the uncertainty about the evolutionary relationships between different hominin species.
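
The underlying arithmetic can be sketched very simply. The numbers below are invented placeholders rather than the study's data, and the cutoff that emerges near 0.8 million years is by construction; the sketch only shows why a younger divergence forces an implausibly fast rate of dental change in the Sima lineage.

```python
# Illustrative only: invented stand-in numbers, not the study's measurements.
SIMA_AGE_MA = 0.43    # age of the Sima de los Huesos fossils, ~430,000 years
SHAPE_CHANGE = 0.4    # dental-shape divergence from the inferred ancestor (arbitrary units)
TYPICAL_RATE = 1.1    # rate seen in other hominin lineages (same units per million years)

for divergence_ma in (0.5, 0.6, 0.8, 1.0, 1.2):
    time_available = divergence_ma - SIMA_AGE_MA     # Myr between the split and the fossils
    implied_rate = SHAPE_CHANGE / time_available     # how fast teeth must have changed
    verdict = "plausible" if implied_rate <= TYPICAL_RATE else "implausibly fast"
    print(f"divergence at {divergence_ma:.1f} Ma: implied rate {implied_rate:.2f} ({verdict})")
```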

"The Sima people's teeth are very different from those that we would expect to find in their last common ancestral species with modern humans, suggesting that they evolved separately over a long period of time to develop such stark differences."

Read more at Science Daily

Galaxy blazes with new stars born from close encounter

New image of irregular galaxy NGC 4485 captured by Hubble's Wide Field Camera 3 (WFC3) and Advanced Camera for Surveys (ACS).
The irregular galaxy NGC 4485 shows all the signs of having been involved in a hit-and-run accident with a bypassing galaxy. Rather than destroying the galaxy, the chance encounter is spawning a new generation of stars, and presumably planets.

The right side of the galaxy is ablaze with star formation, shown in the plethora of young blue stars and star-incubating pinkish nebulas. The left side, however, looks intact. It contains hints of the galaxy's previous spiral structure, which, at one time, was undergoing normal galactic evolution.

The larger culprit galaxy, NGC 4490, is off the bottom of the frame. The two galaxies sideswiped each other millions of years ago and are now 24,000 light-years apart. The gravitational tug-of-war between them created rippling patches of higher-density gas and dust within both galaxies. This activity triggered a flurry of star formation.

This galaxy is a nearby example of the kind of cosmic bumper-car activity that was more common billions of years ago when the universe was smaller and galaxies were closer together.

NGC 4485 lies 25 million light-years away in the northern constellation Canes Venatici (the Hunting Dogs).

This new image, captured by Hubble's Wide Field Camera 3 (WFC3), provides further insight into the complexities of galaxy evolution.

The Hubble Space Telescope is a project of international cooperation between NASA and ESA (European Space Agency). NASA's Goddard Space Flight Center in Greenbelt, Maryland, manages the telescope. The Space Telescope Science Institute (STScI) in Baltimore, Maryland, conducts Hubble science operations. STScI is operated for NASA by the Association of Universities for Research in Astronomy in Washington, D.C.

From Science Daily

Nearly a quarter of West Antarctic ice is now unstable

Antarctica.
By combining 25 years of European Space Agency satellite altimeter measurements and a model of the regional climate, the UK Centre for Polar Observation and Modelling (CPOM) have tracked changes in snow and ice cover across the continent.

A team of researchers, led by Professor Andy Shepherd from the University of Leeds, found that Antarctica's ice sheet has thinned by up to 122 metres in places, with the most rapid changes occurring in West Antarctica where ocean melting has triggered glacier imbalance.

This means that the affected glaciers are unstable as they are losing more mass through melting and iceberg calving than they are gaining through snowfall.

The team found that the pattern of glacier thinning has not been static. Since 1992, the thinning has spread across 24% of West Antarctica and over the majority of its largest ice streams -- the Pine Island and Thwaites Glaciers -- which are now losing ice five times faster than they were at the start of the survey.

The study, published today in Geophysical Research Letters, used over 800 million measurements of the Antarctic ice sheet height recorded by the ERS-1, ERS-2, Envisat, and CryoSat-2 satellite altimeter missions between 1992 and 2017 and simulations of snowfall over the same period produced by the RACMO regional climate model.

Together, these measurements allow changes in the ice sheet height to be separated into those due to weather patterns, such as less snowfall, and those due to longer term changes in climate, such as increasing ocean temperatures that eat away ice.

Lead author and CPOM Director Professor Andy Shepherd explained: "In parts of Antarctica the ice sheet has thinned by extraordinary amounts, and so we set out to show how much was due to changes in climate and how much was due to weather."

To do this, the team compared the measured surface height change to the simulated changes in snowfall, and where the discrepancy was greater they attributed its origin to glacier imbalance.

They found that fluctuations in snowfall tend to drive small changes in height over large areas for a few years at a time, but the most pronounced changes in ice thickness are signals of glacier imbalance that have persisted for decades.
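
The attribution step can be illustrated with a short sketch. This is not CPOM's actual processing chain, and the numbers are invented stand-ins for gridded values: the idea is simply to subtract the height change that simulated snowfall alone would produce from the measured height change, and to attribute any residual larger than a weather-noise tolerance to glacier imbalance.

```python
# Illustrative sketch of the attribution logic; all values are invented.
# Real inputs would come from the ERS/Envisat/CryoSat-2 altimetry record
# and snowfall simulated by the RACMO regional climate model.
import numpy as np

measured_dh = np.array([-0.10, 0.05, -1.80, -0.30, 0.20])   # m/yr, from altimetry
snowfall_dh = np.array([-0.12, 0.08, -0.10, -0.05, 0.18])   # m/yr, snowfall-driven change
noise_level = 0.15                                           # m/yr, tolerance for weather/noise

residual = measured_dh - snowfall_dh            # the part snowfall cannot explain
imbalanced = np.abs(residual) > noise_level     # flag persistent dynamical thinning/thickening

for i, (r, flag) in enumerate(zip(residual, imbalanced)):
    state = "glacier imbalance" if flag else "weather-driven fluctuation"
    print(f"cell {i}: residual {r:+.2f} m/yr -> {state}")
```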

Professor Shepherd added: "Knowing how much snow has fallen has really helped us to detect the underlying change in glacier ice within the satellite record. We can see clearly now that a wave of thinning has spread rapidly across some of Antarctica's most vulnerable glaciers, and their losses are driving up sea levels around the planet.

"Altogether, ice losses from East and West Antarctica have contributed 4.6 mm to global sea level rise since 1992."

Read more at Science Daily

Bedbugs evolved more than 100 million years ago

Bedbug.
Bedbugs -- some of the most unwanted human bed-mates -- have been parasitic companions with other species aside from humans for more than 100 million years, walking the earth at the same time as dinosaurs.

Work by an international team of scientists, including the University of Sheffield, compared the DNA of dozens of bedbug species in order to understand the evolutionary relationships within the group as well as their relationship with humans.

The team discovered that bedbugs are older than bats -- a mammal that people had previously believed to be their first host 50-60 million years ago. Bedbugs in fact evolved around 50 million years earlier.

Bedbugs rank high among the list of most unwanted human bedfellows but until now, little was known about when they first originated.

Experts have now discovered that the evolutionary history of bed bugs is far more complex than previously thought and the critters were actually in existence during the time of dinosaurs. More research is needed to find out what their host was at that time, although current understanding suggests it's unlikely they fed on the blood of dinosaurs. This is because bed bugs and all their relatives feed on animals that have a "home" -- such as a bird's nest, an owl's burrow, a bat's roost or a human's bed -- a mode of life that dinosaurs don't seem to have adopted.

The team spent 15 years collecting samples from wild sites and museums around the world, dodging bats and buffaloes in African caves infected with Ebola and climbing cliffs to collect from bird nests in South East Asia.

Professor Mike Siva-Jothy from the University of Sheffield's Department of Animal and Plant Sciences, who was part of the team, said: "To think that the pests that live in our beds today evolved more than 100 million years ago and were walking the earth side by side with dinosaurs, was a revelation. It shows that the evolutionary history of bed bugs is far more complex than we previously thought."

Dr Steffen Roth from the University Museum Bergen in Norway, who led the study, added: "The first big surprise we found was that bedbugs are much older than bats, which everyone assumed to be their first host. It was also unexpected to see that evolutionary older bedbugs were already specialised on a single host type, even though we don't know what the host was at the time when T. rex walked the earth."

The study also reveals that a new species of bedbug conquers humans about every half a million years. Moreover, when bedbugs changed hosts, they didn't always become specialised on the new host, and many maintained the ability to jump back to their original host. This demonstrates that while some bedbugs become specialists, others remain generalists, jumping from host to host.

Professor Klaus Reinhardt, a bedbug researcher from Dresden University in Germany, who co-led the study, said: "These species are the ones we can reasonably expect to be the next ones drinking our blood, and it may not even take half a million years, given that the many more humans, livestock and pets living on earth now provide lots more opportunities."

The team also found that the two major bedbug pests of humans -- the common and the tropical bedbug -- are much older than humans. This contrasts with other evidence that the evolution of ancient humans caused the split of other human parasites into new species.

Professor Mike Siva-Jothy from the University of Sheffield, added: "These findings will help us better understand how bedbugs evolved the traits that make them effective pests -- that will also help us find new ways of controlling them."

Read more at Science Daily

From Earth's deep mantle, scientists find a new way volcanoes form

Volcanic eruption.
Far below Bermuda's pink sand beaches and turquoise tides, geoscientists have discovered the first direct evidence that material from deep within Earth's mantle transition zone -- a layer rich in water, crystals and melted rock -- can percolate to the surface to form volcanoes.

Scientists have long known that volcanoes form when tectonic plates (traveling on top of the Earth's mantle) converge, or as the result of mantle plumes that rise from the core-mantle boundary to make hotspots at Earth's crust. But obtaining evidence that material emanating from the mantle's transition zone -- between 250 and 400 miles (440-660 km) beneath our planet's crust -- can cause volcanoes to form is new to geologists.

"We found a new way to make volcanoes. This is the first time we found a clear indication from the transition zone deep in the Earth's mantle that volcanoes can form this way," said senior author Esteban Gazel, associate professor in the Department of Earth and Atmospheric Sciences at Cornell University. The research published in Nature.

"We were expecting our data to show the volcano was a mantle plume formation -- an upwelling from the deeper mantle -- just like it is in Hawaii," Gazel said. But 30 million years ago, a disturbance in the transition zone caused an upwelling of magma material to rise to the surface, forming a now-dormant volcano under the Atlantic Ocean and then forming Bermuda.

Using a 2,600-foot (over 700-meter) core sample -- drilled in 1972, housed at Dalhousie University, Nova Scotia -- co-author Sarah Mazza of the University of Münster, in Germany, assessed the cross-section for isotopes, trace elements, evidence of water content and other volatile material. The assessment provided a geologic, volcanic history of Bermuda.

"I first suspected that Bermuda's volcanic past was special as I sampled the core and noticed the diverse textures and mineralogy preserved in the different lava flows," Mazza said. "We quickly confirmed extreme enrichments in trace element compositions. It was exciting going over our first results ... the mysteries of Bermuda started to unfold."

From the core samples, the group detected geochemical signatures from the transition zone, which included larger amounts of water encased in the crystals than were found in subduction zones. Water in subduction zones recycles back to Earth's surface. There is enough water in the transition zone to form at least three oceans, according to Gazel, but it is the water that helps rock to melt in the transition zone.

The geoscientists developed numerical models with Robert Moucha, associate professor of Earth sciences at Syracuse University, to discover a disturbance in the transition zone that likely forced material from this deep mantle layer to melt and percolate to the surface, Gazel said.

Despite more than 50 years of isotopic measurements in oceanic lavas, the peculiar and extreme isotopes measured in the Bermuda lava core had not been observed before. Yet, these extreme isotopic compositions allowed the scientists to identify the unique source of the lava.

"If we start to look more carefully, I believe we're going to find these geochemical signatures in more places," said co-author Michael Bizimis, associate professor at the University of South Carolina.

Gazel explained that this research provides a new connection between the transition zone layer and volcanoes on the surface of Earth. "With this work we can demonstrate that the Earth's transition zone is an extreme chemical reservoir," said Gazel. "We are just now beginning to recognize its importance in terms of global geodynamics and even volcanism."

Said Gazel: "Our next step is to examine more locations to determine the difference between geological processes that can result in intraplate volcanoes and determine the role of the mantle's transition zone in the evolution of our planet."

Read more at Science Daily

May 15, 2019

Species facing climate change could find help in odd place: Urban environments

Baltimore Checkerspot butterfly.
When it comes to wildlife conservation efforts, urban environments could be far more helpful than we think, according to new research. A study published today in Ecology shows that animals move faster through 'low quality' habitats -- evidence that could change the way conservationists think about managing landscapes to help species move in response to climate change. In light of the recent UN report indicating that 1 million species are threatened with extinction, the study provides a framework for definitive action to help preserve many species at risk. The work was carried out by researchers at Tufts University, University of Liverpool, Washington State University and the University of Ottawa.

For landscapes to facilitate range expansion, there is a balance to be struck between promoting movement with low-quality habitat (places where a species can survive, but does not have all the resources it needs to complete its life cycle) and promoting population growth with high-quality habitat. The researchers conclude that low-quality habitats that meet a minimum standard could actually provide a benefit as conduits for movement.

The underlying behaviour that explains this surprising result is that when animals find themselves in an inhospitable area they tend to make longer and straighter movements. As long as they do not die in this area, their arrival at another breeding area will tend to be quicker. The researchers used data from 78 species in 70 studies to show that in 73 percent of cases, animals moved faster through 'lower-quality' habitats. To illustrate what this principle means on the ground, the team used mathematical models to calculate rates of range expansion across a variety of landscapes for an exemplar species -- the Baltimore checkerspot butterfly. They showed that range expansion is fastest through landscapes composed of around 15 percent high-quality habitat and 85 percent unsuitable habitat.
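
A toy calculation captures the trade-off, though it is not the study's model and every parameter below is invented. In a classic Fisher-type invasion, front speed scales as 2*sqrt(r*D): population growth r requires high-quality habitat, while effective movement D is highest in low-quality habitat, where animals make longer, straighter moves. Sweeping the high-quality fraction p shows the fastest expansion at an intermediate value.

```python
# Toy illustration of the movement-versus-growth trade-off; all numbers invented.
import math

R_MAX = 1.0      # maximum per-capita growth rate (1/yr), reached with ample habitat
K_HALF = 0.05    # habitat fraction giving half the maximum growth rate
D_MATRIX = 10.0  # movement (diffusion) rate through low-quality 'matrix' (km^2/yr)
D_HABITAT = 1.0  # movement rate within high-quality habitat (km^2/yr)

def spread_speed(p):
    r = R_MAX * p / (p + K_HALF)               # growth saturates as habitat is added
    d = D_HABITAT * p + D_MATRIX * (1.0 - p)   # landscape-averaged movement rate
    return 2.0 * math.sqrt(r * d)              # Fisher-KPP front speed

best_p = max((p / 100.0 for p in range(1, 100)), key=spread_speed)
print(f"fastest expansion at ~{best_p:.0%} high-quality habitat "
      f"({spread_speed(best_p):.2f} km/yr with these toy numbers)")
```

With these made-up parameters the optimum lands near 20 percent high-quality habitat; the exact figure depends entirely on the assumed growth and movement rates, not on the study's data.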

"At landscape scales, 15% high-quality habitat is still more than currently exists in most ecosystems. Nonetheless, our findings point to the potential of using suburban and even urban greenspace as underappreciated areas that could facilitate range shifts, if green spaces such as lawns were converted to native plant gardens, which have high conservation potential for insects and other wildlife species," said lead author Elizabeth Crone, a professor of biology at Tufts University.

Jenny Hodgson, co-author of the study and lecturer in evolution, ecology and behaviour at the University of Liverpool added: "This could offer a new perspective of flexibility for landscape planners: they needn't worry if they can't create unbroken tracts of high-quality wildlife habitat, instead they can create strategic 'stepping stones' in urban and agricultural areas. However, the stepping stones need to provide resources for breeding, not just temporary food resources."

The researchers hope their study will make designers of city and suburban environments start to think differently about their approach, by providing a starting point to assess the consequences of landscape structure in the management of wildlife, regardless of whether the goal is to enhance or restrict the potential for range expansion.

"Nearly all high-profile studies about biodiversity conservation have focused on documenting the patterns of species habitat use and movement. We feel that more insights are gained by considering the mechanisms behind these patterns," said co-author Cheryl Schultz, professor of conservation biology at Washington State University. "In this case, our discovery that lower quality habitats assist species movement to better habitats sets up a more realistic and achievable objective for urban landscapers, and provides an important complement to conservation efforts focused on preserving large tracts of natural areas and high-quality habitat."

Read more at Science Daily

Amount of carbon stored in forests reduced as climate warms

Altai Mountains, Russia.
Accelerated tree growth caused by a warming climate does not necessarily translate into enhanced carbon storage, an international study suggests.

The team, led by the University of Cambridge, found that as temperatures increase, trees grow faster, but they also tend to die younger. When these fast-growing trees die, the carbon they store is returned to the carbon cycle.

The results, reported in the journal Nature Communications, have implications for global carbon cycle dynamics. As the Earth's climate continues to warm, tree growth will continue to accelerate, but the length of time that trees store carbon, the so-called carbon residence time, will diminish.

During photosynthesis, trees and other plants absorb carbon dioxide from the atmosphere and use it to build new cells. Long-lived trees, such as pines from high elevations and other conifers found across the high-northern latitude boreal forests, can store carbon for many centuries.

"As the planet warms, it causes plants to grow faster, so the thinking is that planting more trees will lead to more carbon getting removed from the atmosphere," said Professor Ulf Büntgen from Cambridge's Department of Geography, the study's lead author. "But that's only half of the story. The other half is one that hasn't been considered: that these fast-growing trees are holding carbon for shorter periods of time."

Büntgen uses the information contained in tree rings to study past climate conditions. Tree rings are as distinctive as fingerprints: the width, density and anatomy of each annual ring contains information about what the climate was like during that particular year. By taking core samples from living trees and disc samples of dead trees, researchers are able to reconstruct how the Earth's climate system behaved in the past and understand how ecosystems were, and are, responding to temperature variation.

For the current study, Büntgen and his collaborators from Germany, Spain, Switzerland and Russia, sampled more than 1100 living and dead mountain pines from the Spanish Pyrenees and 660 Siberian larch samples from the Russian Altai: both high-elevation forest sites that have been undisturbed for thousands of years. Using these samples, the researchers were able to reconstruct the total lifespan and juvenile growth rates of trees that were growing during both industrial and pre-industrial climate conditions.

The researchers found that harsh, cold conditions cause tree growth to slow, but they also make trees stronger, so that they can live to a great age. Conversely, trees growing faster during their first 25 years die much sooner than their slow-growing relatives. This negative relationship remained statistically significant for samples from both living and dead trees in both regions.

The idea of a carbon residence time was first hypothesised by co-author Christian Körner, Emeritus Professor at the University of Basel, but this is the first time that it has been confirmed by data.

The relationship between growth rate and lifespan is analogous to the relationship between heart rate and lifespan seen in the animal kingdom: animals with quicker heart rates tend to grow faster but have shorter lives on average.

Read more at Science Daily

Chewing gums reveal the oldest Scandinavian human DNA

Birch trees.
The first humans who settled in Scandinavia more than 10,000 years ago left their DNA behind in ancient chewing gums, which are masticated lumps made from birch bark pitch. This is shown in a new study conducted at Stockholm University and published in Communications Biology.

There are few human bones of this age, close to 10,000 years old, in Scandinavia, and not all of them have preserved enough DNA for archaeogenetic studies. In fact, the DNA from these newly examined chewing gums is the oldest human DNA sequenced from this area so far. The DNA derived from three individuals, two females and one male, creates an exciting link between material culture and human genetics.

Ancient chewing gums now offer an alternative source of human DNA and possibly a good proxy for human bones in archaeogenetic studies. The investigated pieces come from Huseby Klev, an early Mesolithic hunter-fisher site on the Swedish west coast. The site was excavated in the early 1990s, but at that time it was not possible to analyse ancient human DNA at all, let alone from non-human tissue. The masticates were made out of birch bark tar and used as glue in tool production and other types of technology during the Stone Age.

"When Per Persson and Mikael Maininen proposed to look for hunter-gatherer DNA in these chewing gums from Huseby Klev we were hesitant but really impressed that archaeologists took care during the excavations and preserved such fragile material," says Natalija Kashuba, who was affiliated to The Museum of Cultural History in Oslo when she performed the experiments in cooperation with Stockholm University.

"It took some work before the results overwhelmed us, as we understood that we stumbled into this almost 'forensic research', sequencing DNA from these mastic lumps, which were spat out at the site some 10,000 years ago!" says Natalija Kashuba. Today Natalija is a Ph.D. student at Uppsala University.

Exciting link between material culture and human genetics

The results show that, genetically, the individuals whose DNA was found share close genetic affinity to other hunter-gatherers in Sweden and to early Mesolithic populations from Ice Age Europe. However, the tools produced at the site were part of a lithic technology brought to Scandinavia from the East European Plain, modern-day Russia. This scenario of a cultural and genetic influx into Scandinavia along two routes was proposed in earlier studies, and these ancient chewing gums provide an exciting link directly between the tools and materials used and human genetics.

Emrah Kirdök at Stockholm University conducted the computational analyses of the DNA. "Demography analysis suggests that the genetic composition of Huseby Klev individuals show more similarity to western hunter-gatherer populations than eastern hunter-gatherers," he says.

Read more at Science Daily

Dolphin ancestor's hearing was more like hoofed mammals than today's sea creatures

Dolphins.
Vanderbilt University paleontologists are looking into the evolutionary origins of the whistles and squeaks that dolphins and porpoises make -- part of the rare echolocation ability that allows them to effectively navigate their dark environment.

The team, one of the first in the world to examine the ability's origins, used a small CT scanner to look inside a 30-million-year-old ear bone fossil from a specimen resembling Olympicetus avitus. This member of the toothed whale family, in a branch that died out before modern dolphins and porpoises appeared, lived in what is now the state of Washington. The CT scan revealed cochlear coiling with more turns than in animals with echolocation, indicating hearing more similar to the cloven-hoofed, terrestrial mammals dolphins came from than the sleek sea creatures they are today.

"The simple theory is that there was one origin for echolocation in dolphins, and we'd find it in their 30-million-year-old ancestor," said Rachel A. Racicot, who completed the research as a visiting scholar at Vanderbilt. "Now, we believe it didn't evolve just once in this lineage, but more than once and in more than one lineage -- at least in xenorophids, which are extinct, and somewhere along the line to the Odontoceti crown group that still survives."

Because echolocation is useful for navigating dark waters, natural selection likely came into play with its development in the branch that survived, she said. The findings appear May 15 in The Royal Society journal Biology Letters.

Racicot will join Vanderbilt's Earth and Environmental Sciences Department after spending a year working in Germany. Her co-author, Assistant Professor of Earth and Environmental Sciences Simon A.F. Darroch, installed the CT scanner, which works the same way as those used in medicine and allows for internal examination of fossils without damaging them.

Learning echolocation's origins also can help preserve modern creatures that use it, Darroch said, by understanding how they're perceiving sound from ship engines, oil drills and other machinery. Confusion over those sounds may be causing mass stranding events, and solving the mystery could lead to methods of steering species such as the vaquita, a small porpoise on the brink of extinction in the Gulf of California, away from boats and nets.

"If we develop correlates for the shapes of the inner ear and how that corresponds to hearing frequencies, we can extrapolate those methods without capturing animals and bombarding them with sounds that don't work," Darroch said.

First, according to Racicot and Darroch, paleontologists will have to find and scan a much larger sampling of all the toothed whale group's ancestors and those of rare modern species.

Read more at Science Daily

May 14, 2019

Evolutionary backing found in analysis of mammalian vertebrae

Two-toed sloth.
Differences in numbers of vertebrae are most extreme in mammals which do not rely on running and leaping, such as those adapted to suspensory locomotion like apes and sloths, a team of anthropologists has concluded in a study appearing in the journal Nature Ecology & Evolution.

Previous research had posited that running speed specifically determines variation in vertebral numbers -- a conclusion not supported by the new work.

"The classic body plan of many mammals is built on a mobile back and this body plan is conserved regardless of running speed," explains New York University anthropologist Scott Williams, the paper's senior author. "More specifically, we find that a particular type of locomotor behavior -- suspensory locomotion, which involves hanging below tree branches, rather than speed -- is associated with increases in variation in numbers of vertebrae across mammals."

The work centers on an effort to better understand why certain aspects of mammals remain consistent over time -- a phenomenon known as evolutionary stasis.

Despite the diversity evolution has yielded, there remain consistencies across a wide range of distantly related organisms. Of particular note is the number of neck (cervical) and back (thoracic and lumbar) vertebrae of mammals.

"Nearly all mammals have the same number of cervical vertebrae, no matter how long or short their necks are -- humans, giraffes, mice, whales, and platypuses all have exactly seven cervical vertebrae," explains co-author Jeff Spear, an NYU doctoral student.

In fact, the majority of mammals possess 19 or 20 thoracic and lumbar vertebrae, for a total of 26 or 27 "CTL" vertebrae (for "cervical, thoracic, and lumbar" vertebrae). There is little variation in these numbers, either within species or across different species -- or even different species separated by over 160 million years of evolution. Humans, with 24 CTL, are one of the exceptions.

Earlier work had hypothesized that fast running constrains the number of CTL vertebrae in mammals and that slower mammals are freer to vary their CTL numbers, suggesting an association between speed and vertebrae count. However, this conclusion was based on data from a limited sample of mammalian diversity.

In order to learn what causes evolutionary stasis, and why there are exceptions, such as in humans, the scientists aimed to create a clearer picture using a larger and more diverse sample of mammals and phylogenetic methods -- those that account for evolutionary relatedness in analyses.

In their study, they counted the vertebrae of thousands of individuals for nearly 300 species of mammals. The researchers then compared variation in the number of CTL vertebrae to traits such as speed, habitat, locomotion, spine mobility, posture, and limb use.
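
The shape of that comparison can be sketched in a few lines. The species and counts below are invented stand-ins (the actual study counted thousands of individuals across nearly 300 species and used phylogenetic methods), but the logic is the same: tabulate CTL counts per individual, then compare within-species variability between locomotor groups.

```python
# Illustrative sketch only; species names and vertebral counts are invented.
from statistics import mean, pstdev

ctl_counts = {                      # species -> CTL counts of sampled individuals
    "tiger (runner)":      [27, 27, 27, 27, 27],
    "opossum (runner)":    [26, 27, 27, 27, 26],
    "gibbon (suspensory)": [24, 25, 23, 24, 26],
    "sloth (suspensory)":  [29, 27, 30, 28, 26],
}

def group_variability(keyword):
    # average within-species standard deviation for one locomotor group
    sds = [pstdev(counts) for name, counts in ctl_counts.items() if keyword in name]
    return mean(sds)

print(f"mean within-species SD, runners:    {group_variability('runner'):.2f}")
print(f"mean within-species SD, suspensory: {group_variability('suspensory'):.2f}")
```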

The analyses showed no association between variation in vertebral counts and running speed. Instead, increased variation was primarily driven by animals adapted to suspensory and other "antipronograde" behaviors, where limbs are held in tension during slow climbing, clambering, and suspension.

This observation led the researchers to hypothesize that the classic body plan of certain mammals -- therian mammals, which give birth to live young -- is built on a mobile back and that this body plan is conserved regardless of running speed.

This was based on an existing understanding of genetic activity relevant to vertebrae.

"Changes in types of vertebrae are determined by Hox genes -- the genes that organize animal bodies along the head-tail axis, ensuring that your eyes go on your face and your legs go at the base of your torso," explains Spear. "But changes in Hox gene expression sometimes creates vertebrae that are intermediate in type, which can impinge the mobility of the spine."

For animals following the ancestral body plan, from possums to tigers, departures from the ancestral types of vertebrae in the back create a risk of inefficient locomotion and are weeded out by natural selection, he adds.

Read more at Science Daily

Small, hardy planets most likely to survive death of their stars

White dwarf illustration.
Small, hardy planets packed with dense elements have the best chance of avoiding being crushed and swallowed up when their host star dies, new research from the University of Warwick has found.

Astrophysicists from the Astronomy and Astrophysics Group have modelled the chances of different planets being destroyed by tidal forces when their host stars become white dwarfs and have determined the most significant factors that decide whether they avoid destruction.

Their 'survival guide' for exoplanets could help astronomers locate potential exoplanets around white dwarf stars, as a new generation of even more powerful telescopes is being developed to search for them. Their research is published in the Monthly Notices of the Royal Astronomical Society.

Most stars like our own Sun will eventually run out of fuel, shrink and become white dwarfs. Some orbiting bodies that aren't destroyed in the maelstrom caused when the star blasts away its outer layers will then be subjected to shifts in tidal forces as the star collapses and becomes super-dense. The gravitational forces exerted on any orbiting planets would be intense and would potentially drag them into new orbits, even pushing some further out in their solar systems.

By modelling the effects of a white dwarf's change in gravity on orbiting rocky bodies, the researchers have determined the most likely factors that will cause a planet to move within the star's 'destruction radius'; the distance from the star where an object held together only by its own gravity will disintegrate due to tidal forces. Within the destruction radius a disc of debris from destroyed planets will form.
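
The study's tidal model is considerably more sophisticated, but the basic notion of a destruction radius can be illustrated with the classical rigid-body Roche limit. The sketch below uses textbook white dwarf values (0.6 solar masses, roughly Earth-sized) and an Earth-like rock density; none of these numbers come from the paper.

```python
# Illustrative only: the classical rigid-body Roche limit as a stand-in for
# the "destruction radius". The study's tidal model is far more detailed;
# the white-dwarf parameters below are typical textbook values.
import math

M_SUN = 1.989e30          # kg
R_EARTH = 6.371e6         # m (a white dwarf is roughly Earth-sized)

wd_mass = 0.6 * M_SUN     # typical white dwarf mass
wd_radius = 1.0 * R_EARTH # assumed white dwarf radius
wd_density = wd_mass / (4.0 / 3.0 * math.pi * wd_radius**3)

planet_density = 5500.0   # kg/m^3, roughly Earth-like rock

# Rigid-body Roche limit: d = R_wd * (2 * rho_wd / rho_planet)**(1/3)
destruction_radius = wd_radius * (2.0 * wd_density / planet_density) ** (1.0 / 3.0)

print(f"White dwarf mean density: {wd_density:.3e} kg/m^3")
print(f"Approximate destruction radius: {destruction_radius / 1e6:.0f} thousand km")
```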

Although a planet's survival is dependent on many factors, the models reveal that the more massive the planet, the more likely that it will be destroyed through tidal interactions.

But destruction is not certain based on mass alone: low-viscosity exo-Earths are easily swallowed even if they reside at separations within five times the distance between the centre of the white dwarf and its destruction radius. Saturn's moon Enceladus -- often described as a 'dirty snowball' -- is a good example of a homogeneous, very low-viscosity body.

High-viscosity exo-Earths are easily swallowed only if they reside at distances within twice the separation between the centre of the white dwarf and its destruction radius. These planets would be composed entirely of a dense core of heavier elements, with a composition similar to the 'heavy metal' planet recently discovered by another team of University of Warwick astronomers. That planet has avoided engulfment because it is as small as an asteroid.

Dr Dimitri Veras, from the University of Warwick's Department of Physics, said: "The paper is one of the first-ever dedicated studies investigating tidal effects between white dwarfs and planets. This type of modelling will have increasing relevance in upcoming years, when additional rocky bodies are likely to be discovered close to white dwarfs."

"Our study, while sophisticated in several respects, only treats homogenous rocky planets that are consistent in their structure throughout. A multi-layer planet, like Earth, would be significantly more complicated to calculate but we are investigating the feasibility of doing so too."

Distance from the star, like the planet's mass, has a robust correlation with survival or engulfment. There will always be a safe distance from the star and this safe distance depends on many parameters. In general, a rocky homogenous planet which resides at a location from the white dwarf which is beyond about one-third of the distance between Mercury and the Sun is guaranteed to avoid being swallowed from tidal forces.

Dr Veras said: "Our study prompts astronomers to look for rocky planets close to -- but just outside of -- the destruction radius of the white dwarf. So far observations have focussed on this inner region, but our study demonstrates that rocky planets can survive tidal interactions with the white dwarf in a way which pushes the planets slightly outward."

Read more at Science Daily

Treats might mask animal intelligence

Rat with cheese.
Rewards are frequently used to promote learning, but rewards may actually mask true knowledge, finds a new Johns Hopkins University study with rodents and ferrets.

The findings, published May 14 in Nature Communications, show a distinction between knowledge and performance, and provide insight into how environment can affect the two.

"Most learning research focuses on how humans and other animals learn 'content' or knowledge. Here, we suggest that there are two parallel learning processes: one for content and one for context, or environment. If we can separate how these two pathways work, perhaps we can find ways to improve performance," says Kishore Kuchibhotla, an assistant professor in The Johns Hopkins University's department of psychological and brain sciences and the study's lead author.

While researchers have known that the presence of reinforcement, or reward, can change how animals behave, it's been unclear exactly how rewards affect learning versus performance.

An example of the difference between learning and performance, Kuchibhotla explains, is the difference between a student studying and knowing the answers at home, and a student demonstrating that knowledge on a test at school.

"What we know at any given time can be different than what we show; the ability to access that knowledge in the right environment is what we're interested in," he says.

To investigate what animals know in hopes of better understanding learning, Kuchibhotla and the research team trained mice, rats and ferrets on a series of tasks, and measured how accurately they performed the tasks with and without rewards.

For the first experiment, the team trained mice to lick for water through a lick tube after hearing one tone, and to not lick after hearing a different, unrewarded tone. It takes mice two weeks to learn this in the presence of the water reward. At a time point early in learning, around days 3-5, the mice performed the task at chance levels (about 50%) when the lick tube/reward was present. When the team removed the lick tube entirely on these early days, however, the mice performed the task at more than 90% accuracy. The mice, therefore, seemed to understand the task many days before they expressed knowledge in the presence of a reward.
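
To get a feel for how far 90 percent accuracy sits from chance on a two-choice task, here is a back-of-the-envelope binomial check. The trial counts are invented for illustration and are not the study's statistics.

```python
# Rough illustration (invented trial counts, not the study's statistics):
# how unlikely ~90% correct responses would be if the mice were guessing
# at chance (50%) on a go/no-go lick task.
from math import comb

def binomial_p_at_least(successes: int, trials: int, p: float = 0.5) -> float:
    """Exact probability of observing >= `successes` correct trials by chance."""
    return sum(comb(trials, k) * p**k * (1 - p) ** (trials - k)
               for k in range(successes, trials + 1))

trials = 100                  # hypothetical number of tone presentations
correct_with_reward = 52      # ~chance performance when the lick tube is present
correct_without_reward = 90   # ~90% accuracy when the reward is removed

for label, correct in [("reward present", correct_with_reward),
                       ("reward absent", correct_without_reward)]:
    p = binomial_p_at_least(correct, trials)
    print(f"{label}: {correct}/{trials} correct, chance probability = {p:.2g}")
```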

To confirm this finding with other tasks and animals, the team also had mice press a lever for water when they heard a certain tone; prompted rats to look for food in a cup if they heard a tone, but not if a light appeared before the tone; had rats press a lever for sugar water when a light was presented before a tone; had rats press a lever for sugar water when they heard a certain tone; and prompted ferrets to differentiate between two different sounds for water. In all experiments, the animals performed better when rewards weren't available.

"Rewards, it seems, help improve learning incrementally, but can mask the knowledge animals have actually attained, particularly early in learning," says Kuchibhotla. Furthermore, the finding that all animals' performance improved across the board without rewards, suggest that variability in learning rates may be due to differences in the animals' sensitivity to reward context rather than differences in intelligence.

The dissociation between learning and performance, the researchers suggest, may someday help us isolate the root causes of poor performance. While the study involved only rodents and ferrets, Kuchibhotla says it may be possible to someday help animals and humans alike better access content when they need it if the right mechanisms within the brain can be identified and manipulated.

Read more at Science Daily

Room for thought: Brain region that watches for walls identified

Illustration of human brain highlighting the occipital lobe.
To move through the world, you need a sense of your surroundings, especially of the constraints that restrict your movement: the walls, ceiling and other barriers that define the geometry of the navigable space around you. And now, a team of neuroscientists has identified an area of the human brain dedicated to perceiving this geometry. This brain region encodes the spatial constraints of a scene at lightning-fast speeds and likely contributes to our instant sense of our surroundings, orienting us in space so we can avoid bumping into things, figure out where we are and navigate safely through our environment.

This research, published today in Neuron, sets the stage for understanding the complex computations our brains do to help us get around. Led by scientists at Columbia University's Mortimer B. Zuckerman Mind Brain Behavior Institute and Aalto University in Finland, the work is also relevant to the development of artificial intelligence technology aimed at mimicking the visual powers of the human brain.

"Vision gives us an almost instant sense where we are in space, and in particular of the geometry of the surfaces -- the ground, the walls -- which constrain our movement. It feels effortless, but it requires the coordinated activity of multiple brain regions," said Nikolaus Kriegeskorte, PhD, a principal investigator at Columbia's Zuckerman Institute and the paper's senior author. "How neurons work together to give us this sense of our surroundings has remained mysterious. With this study, we are a step closer to solving that puzzle."

To figure out how the brain perceives the geometry of its surroundings, the research team asked volunteers to look at images of different three-dimensional scenes. An image might depict a typical room, with three walls, a ceiling and a floor. The researchers then systematically changed the scene: by removing the wall, for instance, or the ceiling. Simultaneously, they monitored participants' brain activity through a combination of two cutting-edge brain-imaging technologies at Aalto's neuroimaging facilities in Finland.

"By doing this repeatedly for each participant as we methodically altered the images, we could piece together how their brains encoded each scene," Linda Henriksson, PhD, the paper's first author and a lecturer in neuroscience and biomedical engineering at Aalto University.

Our visual system is organized into a hierarchy of stages. The first stage actually lies outside the brain, in the retina, which can detect simple visual features. Subsequent stages in the brain have the power to detect more complex shapes. By processing visual signals through multiple stages -- and by repeated communications between the stages -- the brain forms a complete picture of the world, with all its colors, shapes and textures.

In the cortex, visual signals are first analyzed in an area called the primary visual cortex. They are then passed to several higher-level cortical areas for further analyses. The occipital place area (OPA), an intermediate-level stage of cortical processing, proved particularly interesting in the brain scans of the participants.

"Previous studies had shown that OPA neurons encode scenes, rather than isolated objects," said Dr. Kriegeskorte, who is also a professor of psychology and neuroscience and director of cognitive imaging at Columbia. "But we did not yet understand what aspect of the scenes this region's millions of neurons encoded."

After analyzing the participants' brain scans, Drs. Kriegeskorte and Henriksson found that OPA activity reflected the geometry of the scenes. The OPA activity patterns reflected the presence or absence of each scene component -- the walls, the floor and the ceiling -- conveying a detailed picture of the overall geometry of the scene. However, the activity patterns did not depend on the components' appearance -- the textures of the walls, floor and ceiling -- suggesting that the region ignores surface appearance and focuses solely on surface geometry. The brain region appeared to perform all the computations needed to get a sense of a room's layout extremely fast: in just 100 milliseconds.
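
The paper's analyses are far richer than this, but the logic of reading out the presence or absence of a scene component from distributed activity patterns can be sketched with a toy decoder on simulated data. Everything below -- the number of voxels, the noise model, the nearest-centroid classifier -- is an illustrative assumption, not the study's method.

```python
# Toy sketch (simulated data, not the study's analysis): reading out the
# presence or absence of one scene component (say, a left wall) from
# noisy "OPA-like" activity patterns with a nearest-centroid decoder.
import numpy as np

rng = np.random.default_rng(0)
n_voxels, n_trials = 50, 40

# Assume a fixed pattern component that is added whenever the wall is present
wall_pattern = rng.normal(0, 1, n_voxels)

def simulate_trials(wall_present: bool) -> np.ndarray:
    base = rng.normal(0, 1, (n_trials, n_voxels))
    return base + wall_pattern if wall_present else base

train_present, train_absent = simulate_trials(True), simulate_trials(False)
test_present, test_absent = simulate_trials(True), simulate_trials(False)

centroid_present = train_present.mean(axis=0)
centroid_absent = train_absent.mean(axis=0)

def decode(patterns: np.ndarray) -> np.ndarray:
    """Label each trial by its nearer class centroid (True = wall present)."""
    d_present = np.linalg.norm(patterns - centroid_present, axis=1)
    d_absent = np.linalg.norm(patterns - centroid_absent, axis=1)
    return d_present < d_absent

accuracy = np.mean(np.concatenate([decode(test_present),
                                   ~decode(test_absent)]))
print(f"Decoding accuracy for wall presence: {accuracy:.0%}")
```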

"The speed with which our brains sense the basic geometry of our surroundings is an indication of the importance of having this information quickly," said Dr. Henriksson. "It is key to knowing whether you're inside or outside, or what might be your options for navigation."

The insights gained in this study were possible through the joint use of two complementary imaging technologies: functional magnetic resonance imaging (fMRI) and magnetoencephalography (MEG). fMRI measures local changes in blood oxygen levels, which reflect local neuronal activity. It can reveal detailed spatial activity patterns at a resolution of a couple of millimeters, but it is not very precise in time, as each fMRI measurement reflects the average activity over five to eight seconds. By contrast, MEG measures magnetic fields generated by the brain. It can track activity with millisecond temporal precision, but does not give as spatially detailed a picture.

"When we combine these two technologies, we can address both where the activity occurs and how quickly it emerges." said Dr. Henriksson, who collected the imaging data at Aalto University.

Moving forward, the research team plans to incorporate virtual reality technology to create more realistic 3D environments for participants to experience. They also plan to build neural network models that mimic the brain's ability to perceive the environment.

"We would like to put these things together and build computer vision systems that are more like our own brains, systems that have specialized machinery like what we observe here in the human brain for rapidly sensing the geometry of the environment," said Dr. Kriegeskorte.

Read more at Science Daily

May 13, 2019

Matter around a young star helps astronomers explore our stellar history

Orion Nebula.
Astronomers have mapped the substance aluminum monoxide (AlO) in a cloud around a distant young star -- Orion Source I. The finding clarifies some important details about how our solar system, and ultimately we, came to be. The cloud's limited distribution suggests that AlO gas rapidly condenses into solid grains, which hints at what an early stage of our solar system's evolution looked like.

Professor Shogo Tachibana of the UTokyo Organization for Planetary and Space Science has a passion for space. From small things like meteorites to enormous things like stars and nebulae -- huge clouds of gas and dust in space -- he is driven to explore our solar system's origins.

"I have always wondered about the evolution of our solar system, of what must have taken place all those billions of years ago," he said. "This question leads me to investigate the physics and chemistry of asteroids and meteorites."

Space rocks of all kinds greatly interest astronomers as these rocks can remain largely unchanged since the time our sun and planets formed from a swirling cloud of gas and dust. They contain records of the conditions at that time -- generally considered to be 4.56 billion years ago -- and their properties such as composition can tell us about these early conditions.

"On my desk is a small piece of the Allende meteorite, which fell to Earth in 1969. It's mostly dark but there are some scattered white inclusions (foreign bodies enclosed in the rock), and these are important," continued Tachibana. "These speckles are calcium and aluminum-rich inclusions (CAIs), which were the first solid objects formed in our solar system."

Minerals present in CAIs indicate that our young solar system must have been extremely hot. Physical techniques for dating these minerals reveal a fairly specific age for the solar system. However, Tachibana and colleagues wished to expand on the details of this stage of evolution.

"There are no time machines to explore our own past, so we wanted to see a young star that could share traits with our own," said Tachibana. "With the Atacama Large Millimeter/submillimeter Array (ALMA), we found the emission lines -- a chemical fingerprint -- for AlO in outflows from the circumstellar disk (gas and dust surrounding a star) of the massive young star candidate Orion Source I. It's not exactly like our sun, but it's a good start."

ALMA was the ideal tool as it offers extremely high resolution and sensitivity to reveal the distribution of AlO around the star. No other instrument can presently make such observations.

"Thanks to ALMA, we discovered the distribution of AlO around a young star for the first time. The distribution of AlO is limited to the hot region of the outflow from the disk. This implies that AlO rapidly condenses as solid grains -- similar to CAIs in our solar system," explained Tachibana. "This data allows us to place tighter constraints on hypotheses that describe our own stellar evolution. But there's still much work to do."

Read more at Science Daily

Research on repetitive worm behavior may have implications for understanding human disease

Caenorhabditis elegans.
Repetition can be useful if you're trying to memorize a poem, master a guitar riff, or just cultivate good habits. When this kind of behavior becomes compulsive, however, it can get in the way of normal life -- an impediment sometimes observed in psychiatric illnesses like Tourette's syndrome and autism spectrum disorders. Now, Rockefeller scientists have identified a brain circuit that underlies repetition in worms, a finding that may ultimately shed light on similar behavior in humans.

Studying the microscopic roundworm C. elegans, the researchers found that defects in one protein cause animals to reorient themselves over and over again. Described in Nature Communications, these observations are bolstered by previous research in mice, and suggest that similar mechanisms may drive repetitive behavior in a range of animals, including humans.

Chemical cleanup

The scientists initially set out to understand how astrocytes, star-shaped cells found in mammalian brains, help neurons do their job. Astrocytes are thought to be responsible for, among other things, disposing of excess neurochemicals at synapses, the connections between neurons. This task is vital because if chemicals are not removed in a timely fashion, they can stimulate neurons in unexpected ways, disrupting normal brain function. To better understand this process, Menachem Katz, a research associate in the lab of Shai Shaham, looked to C. elegans CEPsh glial cells, which he suspected to be the worm equivalents of astrocytes.

Confirming this suspicion, Katz, Shaham, and their colleagues used mRNA sequencing to show that mouse astrocytes and CEPsh glia have similar genetic signatures. Among other commonalities, both cell types produce the protein GLT-1, the mammalian version of which is responsible for clearing the chemical glutamate away from synapses. This finding, says Shaham, afforded the researchers a unique opportunity to define how astrocytes and GLT-1 work.

"Scientists have been trying to understand the functions of astrocytes for many years, and in mammals it's not easy because these cells are essential for keeping neurons alive," he says. "But in C. elegans there are only four CEPsh glial cells, and they are not required for neuron survival. This allowed us to investigate the specific roles of glutamate transporters, without worrying about the side effects of neuron sickness."

To do so, the researchers created C. elegans lacking GLT-1. Surprisingly, this depletion did not result in glutamate accumulation at synapses, as was expected. Instead, the worms exhibited oscillations in synaptic glutamate levels -- and a peculiar behavioral defect.

"These animals changed their direction at a crazy rate. They just kept moving forward and going back, moving forward and going back," says Shaham, the Richard E. Salomon Family Professor. "And when we analyzed this behavior, we discovered that they did so in a really interesting pattern."

Turn, turn, turn

It's perfectly normal for C. elegans to change course every now and then. Typically, the worm reorients itself about once every 90 seconds. But worms lacking GLT-1, the researchers found, took this action to the extreme: at 90-second intervals the animals executed not one reversal, but bursts of them. "It's as if once they start the action, they can't stop repeating it," says Katz.

Further experiments revealed that removal of the glutamate receptor MGL-2 blocked both the repetitive reversals and the synaptic glutamate oscillations. The researchers concluded that when glutamate is not efficiently cleared, the chemical stimulates MGL-2, which in turn triggers the release of yet more glutamate. This process then repeats in a loop, and every time glutamate is released, it activates the neuron responsible for initiating reversals.
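
A toy discrete-time model can make this proposed loop concrete: periodic glutamate release triggers a reversal, MGL-2 activation releases extra glutamate, and GLT-1-dependent clearance determines whether the loop shuts off after one reversal or keeps firing in a burst. All parameters below are invented for illustration; this is not a fitted model of the worm's circuit.

```python
# Toy sketch (invented parameters, not a fitted model) of the proposed loop:
# periodic glutamate release triggers a reversal; MGL-2 activation releases
# extra glutamate, and GLT-1 normally clears it before the loop can repeat.
def simulate(glt1_present: bool, steps: int = 200) -> int:
    clearance = 0.9 if glt1_present else 0.4  # fraction of glutamate cleared per step
    feedback = 0.2                            # extra release via MGL-2 when above threshold
    threshold = 0.5                           # glutamate level that triggers a reversal
    glutamate, reversal_steps = 0.0, 0
    for t in range(steps):
        if t % 30 == 0:                       # baseline release (~every "90 seconds")
            glutamate += 1.0
        if glutamate > threshold:             # reversal neuron activated
            reversal_steps += 1
            glutamate += feedback             # MGL-2-driven release sustains the loop
        glutamate *= (1.0 - clearance)        # GLT-1-dependent clearance
    return reversal_steps

print("GLT-1 present:", simulate(True), "reversal-triggering steps")
print("GLT-1 absent: ", simulate(False), "reversal-triggering steps")
```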

"These findings suggest a simple model for how repetition can occur in worms," says Katz. "And, it turns out, this model may hold up in more complex nervous systems."

Indeed, past experiments have shown that GLT-1 mutations cause repetitive grooming in mice, and that compounds blocking the mouse version of MGL-2 eliminate similar behavior in other contexts. Taken together with the new findings in C. elegans, this research suggests that abnormal glutamate secretion may underlie repetitive behaviors across the animal kingdom -- raising the possibility that they may be relevant to understanding pathological repetition in humans.

Consistent with this idea, human genetics studies have found mutations associated with glutamate signaling in patients with obsessive compulsive disorder and autism spectrum disorders, both of which can be accompanied by repetitive behavior.

Read more at Science Daily

Tomato pan-genome makes bringing flavor back easier

Tomatoes.
Almost everyone agrees that most store-bought tomatoes don't have much flavor. Now, scientists from the Agricultural Research Service (ARS) and the Boyce Thompson Institute (BTI) may have spotlighted the solution in a paper just published in Nature Genetics.

Molecular biologist James Giovannoni with the ARS Plant, Soil and Nutrition Research Laboratory and BTI bioinformatics scientist Zhangjun Fei, both in Ithaca, New York, have finished constructing the pan-genome for the cultivated tomato and its wild relatives, mapping almost 5,000 previously undocumented genes.

A genome is a biological map of an organism's genes and their functions. But a genome is usually built from a single variety, which then serves as a reference genome for the rest of the species. This pan-genome includes all of the genes from 725 different cultivated and closely related wild tomatoes, revealing 4,873 genes that were absent from the original reference genome.
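
Conceptually, a pan-genome is just the union of gene content across many accessions, with a presence/absence profile for each gene. The toy sketch below, with invented gene and accession names, shows how genes missing from a single reference genome surface once many genomes are pooled.

```python
# Toy illustration of the pan-genome idea (invented gene and accession names):
# pool gene content across many accessions, then flag genes that are present
# somewhere in the species but absent from the single reference genome.
reference_genes = {"geneA", "geneB", "geneC"}

accession_genes = {
    "heirloom_1":  {"geneA", "geneB", "geneC", "geneE"},
    "modern_1":    {"geneA", "geneB", "geneC"},
    "wild_pimp_1": {"geneA", "geneB", "geneD", "geneE"},
    "wild_pimp_2": {"geneA", "geneC", "geneD"},
}

# The pan-genome is the union of gene content across all accessions
pan_genome = set().union(*accession_genes.values()) | reference_genes
novel_genes = pan_genome - reference_genes

# Presence/absence profile: in what fraction of accessions does each novel gene occur?
for gene in sorted(novel_genes):
    freq = sum(gene in genes for genes in accession_genes.values()) / len(accession_genes)
    print(f"{gene}: absent from reference, present in {freq:.0%} of accessions")
```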

While cultivated tomatoes have a wide range of physical and metabolic variation, there have been several severe bottlenecks during their domestication and breeding. This means today's tomatoes have a narrow genetic base. The pan-genome helps identify what additional genes beyond the reference might be available for crop breeding and improvement.

In modern times, breeders have concentrated on traits such as yield, shelf life, disease resistance and stress tolerance, traits that have been economically important to growers. Tomatoes are one of the most eaten vegetables -- although they actually are fruit botanically -- with a worldwide annual production of 182 million tons, worth more than $60 billion.

U.S. tomato consumption per capita was 20.3 pounds for fresh tomatoes in 2017 plus an additional 73.3 pounds of processed tomatoes eaten per person. Tomatoes are the second most consumed vegetable in the United States after potatoes.

"One of the most important discoveries from constructing this pan-genome is a rare form of a gene labeled TomLoxC, which mostly differs in the version of its DNA gene promoter. The gene influences fruit flavor by catalyzing the biosynthesis of a number of lipid (fat)-involved volatiles -- compounds that evaporate easily and contribute to aroma," explained Giovannoni.

In addition, the researchers found a new role of TomLoxC. It also facilitates production of a group of apocarotenoids -- organic chemicals derived from carotenoids including vitamin A precursors -- that work as signaling molecules influencing a variety of responses in plants including environmental stresses. The compounds also have a variety of floral and fruity odors that are important in tomato taste.

The rare version of TomLoxC was found in only 2 percent of older or heirloom cultivated large tomato varieties, although the version was present in 91 percent of currant-sized wild tomatoes, primarily Solanum pimpinellifolium, the wild predecessor of the cultivated tomato. It is becoming more common in newer varieties.

"It appears that there may have been strong selection pressure against or at least no selection for the presence of this version of TomLoxC early in the domestication of tomatoes," Giovannoni added. "The increase in prevalence of this form in modern tomatoes likely reflects breeders' renewed interest in improved flavor."

With the availability of this wide array of specific genetic information, breeders should be able to work quickly to increase the flavor of store bought, mass production tomatoes while preserving the traits that make them an economically advantageous crop.

"These novel genes discovered from the tomato pan-genome added substantial information to the tomato genome repertoire and provide additional opportunities for tomato improvement. The presence and absence profiles of these genes in different tomato populations have shed important lights on how human selection of desired traits have reshaped the tomato genomes," said Fei.

Read more at Science Daily

Shrinking moon may be generating moonquakes

The Moon.
The Moon is shrinking as its interior cools, getting more than about 150 feet (50 meters) skinnier over the last several hundred million years. Just as a grape wrinkles as it shrinks down to a raisin, the Moon gets wrinkles as it shrinks. Unlike the flexible skin on a grape, the Moon's surface crust is brittle, so it breaks as the Moon shrinks, forming "thrust faults" where one section of crust is pushed up over a neighboring part.

"Our analysis gives the first evidence that these faults are still active and likely producing moonquakes today as the Moon continues to gradually cool and shrink," said Thomas Watters, senior scientist in the Center for Earth and Planetary Studies at the Smithsonian's National Air and Space Museum in Washington. "Some of these quakes can be fairly strong, around five on the Richter scale."

These fault scarps resemble small stair-step shaped cliffs when seen from the lunar surface, typically tens of yards (meters) high and extending for a few miles (several kilometers). Astronauts Eugene Cernan and Harrison Schmitt had to zig-zag their lunar rover up and over the cliff face of the Lee-Lincoln fault scarp during the Apollo 17 mission that landed in the Taurus-Littrow valley in 1972.

Watters is lead author of a study that reanalyzed data from four seismometers placed on the Moon by Apollo astronauts, using an algorithm, or mathematical program, developed to pinpoint quake locations detected by a sparse seismic network; the algorithm gave better estimates of moonquake locations. Seismometers are instruments that measure the shaking produced by quakes, recording the arrival time and strength of various quake waves to estimate a location, called an epicenter. The study was published May 13 in Nature Geoscience.

Astronauts placed the instruments on the lunar surface during the Apollo 11, 12, 14, 15, and 16 missions. The Apollo 11 seismometer operated for only three weeks, but the four remaining instruments recorded 28 shallow moonquakes -- the type expected to be produced by these faults -- from 1969 to 1977. The quakes ranged from about 2 to around 5 on the Richter scale.

Using the revised location estimates from the new algorithm, the team found that eight of the 28 shallow quakes were within 30 kilometers (18.6 miles) of faults visible in lunar images. This is close enough to tentatively attribute the quakes to the faults, since modeling by the team shows that this is the distance over which strong shaking is expected to occur, given the size of these fault scarps. Additionally, the new analysis found that six of the eight quakes happened when the Moon was at or near its apogee, the farthest point from Earth in its orbit. This is where additional tidal stress from Earth's gravity causes a peak in the total stress, making slip-events along these faults more likely.

"We think it's very likely that these eight quakes were produced by faults slipping as stress built up when the lunar crust was compressed by global contraction and tidal forces, indicating that the Apollo seismometers recorded the shrinking Moon and the Moon is still tectonically active," said Watters. The researchers ran 10,000 simulations to calculate the chance of a coincidence producing that many quakes near the faults at the time of greatest stress. They found it is less than 4 percent. Additionally, while other events, such as meteoroid impacts, can produce quakes, they produce a different seismic signature than quakes made by fault slip events.

Other evidence that these faults are active comes from highly detailed images of the Moon by NASA's Lunar Reconnaissance Orbiter (LRO) spacecraft. The Lunar Reconnaissance Orbiter Camera (LROC) has imaged over 3,500 of the fault scarps. Some of these images show landslides or boulders at the bottom of relatively bright patches on the slopes of fault scarps or nearby terrain. Weathering from solar and space radiation gradually darkens material on the lunar surface, so brighter areas indicate regions that are freshly exposed to space, as expected if a recent moonquake sent material sliding down a cliff. Examples of fresh boulder fields are found on the slopes of a fault scarp in the Vitello cluster and examples of possible bright features are associated with faults that occur near craters Gemma Frisius C and Mouchez L. Other LROC fault images show tracks from boulder falls, which would be expected if the fault slipped and the resulting quake sent boulders rolling down the cliff slope. These tracks are evidence of a recent quake because they should be erased relatively quickly, in geologic time scales, by the constant rain of micrometeoroid impacts on the Moon. Boulder tracks near faults in Schrödinger basin have been attributed to recent boulder falls induced by seismic shaking.

Additionally, one of the revised moonquake epicenters is just 13 kilometers (8 miles) from the Lee-Lincoln scarp traversed by the Apollo 17 astronauts. The astronauts also examined boulders and boulder tracks on the slope of North Massif near the landing site. A large landslide on South Massif that covered the southern segment of the Lee-Lincoln scarp is further evidence of possible moonquakes generated by fault slip events.

"It's really remarkable to see how data from nearly 50 years ago and from the LRO mission has been combined to advance our understanding of the Moon while suggesting where future missions intent on studying the Moon's interior processes should go," said LRO Project Scientist John Keller of NASA's Goddard Space Flight Center in Greenbelt, Maryland.

Since LRO has been photographing the lunar surface since 2009, the team would like to compare pictures of specific fault regions from different times to see if there is any evidence of recent moonquake activity. Additionally, "Establishing a new network of seismometers on the lunar surface should be a priority for human exploration of the Moon, both to learn more about the Moon's interior and to determine how much of a hazard moonquakes present," said co-author Renee Weber, a planetary seismologist at NASA's Marshall Space Flight Center in Huntsville, Alabama.

The Moon isn't the only world in our solar system experiencing some shrinkage with age. Mercury has enormous thrust faults -- up to about 600 miles (1,000 kilometers) long and over a mile (3 kilometers) high -- that are significantly larger relative to its size than those on the Moon, indicating it shrank much more than the Moon. Since rocky worlds expand when they heat up and contract as they cool, Mercury's large faults reveal that it was likely hot enough to be completely molten after its formation. Scientists trying to reconstruct the Moon's origin wonder whether the same happened to the Moon, or if instead it was only partially molten, perhaps with a magma ocean over a more slowly heating deep interior. The relatively small size of the Moon's fault scarps is in line with the more subtle contraction expected from a partially molten scenario.

Read more at Science Daily