Feb 18, 2021

World's oldest DNA reveals how mammoths evolved

 An international team led by researchers at the Centre for Palaeogenetics in Stockholm has sequenced DNA recovered from mammoth remains that are up to 1.2 million years old. The analyses show that the Columbian mammoth that inhabited North America during the last ice age was a hybrid between the woolly mammoth and a previously unknown genetic lineage of mammoth. In addition, the study provides new insights into when and how fast mammoths became adapted to cold climate. These findings are published today in Nature.

Around one million years ago there were no woolly or Columbian mammoths, as they had not yet evolved. This was the time of their predecessor, the ancient steppe mammoth. Researchers have now managed to analyse the genomes from three ancient mammoths, using DNA recovered from mammoth teeth that had been buried for 0.7-1.2 million years in the Siberian permafrost.

This is the first time that DNA has been sequenced and authenticated from million-year-old specimens, and extracting the DNA from the samples was challenging. The scientists found that only minute amounts of DNA remained in the samples and that the DNA was degraded into very small fragments.

"This DNA is incredibly old. The samples are a thousand times older than Viking remains, and even pre-date the existence of humans and Neanderthals," says senior author Love Dalén, a Professor of evolutionary genetics at the Centre for Palaeogenetics in Stockholm.

The age of the specimens was determined using both geological data and the molecular clock. Both these types of analyses showed that two of the specimens are more than one million years old, whereas the third is roughly 700 thousand years old and represents one of the earliest known woolly mammoths.
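The arithmetic behind a molecular-clock estimate is simple: assuming substitutions accumulate at a roughly constant rate, an age follows from dividing the observed genetic divergence by that rate. A minimal sketch, with illustrative placeholder numbers rather than the study's actual values:

```python
# A strict molecular clock assumes substitutions accumulate at a
# constant rate, so age = divergence / rate. The inputs below are
# illustrative placeholders, not values from the mammoth study.

def molecular_clock_age(substitutions_per_site: float,
                        rate_per_site_per_year: float) -> float:
    """Estimate an age (in years) from accumulated genetic divergence."""
    return substitutions_per_site / rate_per_site_per_year

# 0.012 substitutions per site at a rate of 1e-8 per site per year
# implies an age on the order of 1.2 million years.
age = molecular_clock_age(0.012, 1e-8)
print(round(age))  # 1200000
```

In practice such estimates require calibration points and error models, which is why the study cross-checked the molecular dates against geological data.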

An unexpected origin of the Columbian mammoth

Analyses of the genomes showed that the oldest specimen, which was approximately 1.2 million years old, belonged to a previously unknown genetic lineage of mammoth. The researchers refer to this as the Krestovka mammoth, based on the locality where it was found. The results show that the Krestovka mammoth diverged from other Siberian mammoths more than two million years ago.

"This came as a complete surprise to us. All previous studies have indicated that there was only one species of mammoth in Siberia at that point in time, called the steppe mammoth. But our DNA analyses now show that there were two different genetic lineages, which we here refer to as the Adycha mammoth and the Krestovka mammoth. We can't say for sure yet, but we think these may represent two different species," says the study's lead author Tom van der Valk.

The researchers also suggest that it was mammoths belonging to the Krestovka lineage that colonised North America some 1.5 million years ago. In addition, the analyses show that the Columbian mammoth that inhabited North America during the last ice age was a hybrid. Roughly half of its genome came from the Krestovka lineage and the other half from the woolly mammoth.

"This is an important discovery. It appears that the Columbian mammoth, one of the most iconic Ice Age species of North America, evolved through a hybridisation that took place approximately 420 thousand years ago," says co-lead author Patrícia Pec?nerova?.

Evolution and adaptation in the woolly mammoth

The second million-year-old genome, from the Adycha mammoth, appears to have been ancestral to the woolly mammoth. The researchers could therefore compare its genome with the genome from one of the earliest known woolly mammoths that lived 0.7 million years ago, as well as with mammoth genomes that are only a few thousand years old. This made it possible to investigate how mammoths became adapted to a life in cold environments and to what extent these adaptations evolved during the speciation process.

The analyses showed that gene variants associated with life in the Arctic, such as hair growth, thermoregulation, fat deposits, cold tolerance and circadian rhythms, were already present in the million-year-old mammoth, long before the origin of the woolly mammoth. These results indicate that most adaptations in the mammoth lineage happened slowly and gradually over time.

"To be able to trace genetic changes across a speciation event is unique. Our analyses show that most cold adaptations were present already in the ancestor of the woolly mammoth, and we find no evidence that natural selection was faster during the speciation process," says co-lead author David Di?ez-del-Molino.

Future research

The new results open the door for a broad array of future studies on other species. About one million years ago was a period when many species expanded across the globe. This was also a time period of major changes in climate and sea levels, as well as the last time that Earth's magnetic poles changed places. Because of this, the researchers think that genetic analyses on this time scale have great potential to explore a wide range of scientific questions.

Read more at Science Daily

New crystalline form of ice

 Three years ago, chemists found evidence for the existence of a new variety of ice. Until then, 18 types of crystalline ice were known. The team now reports on the elucidation of the crystal structure of ice XIX using neutron diffraction.

Ice is a very versatile material. In snowflakes or ice cubes, the oxygen atoms are arranged hexagonally. This ice form is called ice one (ice I). "Strictly speaking, however, these are not actually perfect crystals, but disordered systems in which the water molecules are randomly oriented in different spatial directions," explains Thomas Loerting from the Institute of Physical Chemistry at the University of Innsbruck, Austria. Including ice I, 18 crystalline forms of ice had been known until now, which differ in the arrangement of their atoms. The different types of ice, known as polymorphs, form depending on pressure and temperature and have very different properties. For example, their melting points differ by several hundred degrees Celsius. "It's comparable to diamond and graphite, both of which are made of pure carbon," the chemist explains.

Icy variety

When conventional ice I is cooled strongly, the hydrogen atoms can arrange themselves periodically in addition to the oxygen atoms if the experiment is conducted correctly. Below minus 200 degrees Celsius, this can lead to the formation of so-called ice XI, in which all water molecules are ordered according to a specific pattern. Such ordered ice forms differ from the disordered parental forms, especially in their electrical properties.

In the current work, the Innsbruck chemists deal with the parent form ice VI, which is formed at high pressure, for example in the Earth's mantle. Like hexagonal ice, this high-pressure form of ice is not a completely ordered crystal. More than 10 years ago, researchers at the University of Innsbruck produced a hydrogen-ordered variant of this ice, which found its way into textbooks as ice XV. By changing the manufacturing process, three years ago Thomas Loerting's team succeeded for the first time in creating a second ordered form for ice VI. To do this, the scientists significantly slowed down the cooling process and increased the pressure to around 20 kbar. This enabled them to arrange the hydrogen atoms in a second way in the oxygen lattice and produce ice XIX.

"We found clear evidence at that time that it is a new ordered variant, but we were not able to elucidate the crystal structure." Now his team has succeeded in doing just that using the gold standard for structure determination -- neutron diffraction.

Crystal structure solved

For the clarification of the crystal structure, an essential technical hurdle had to be overcome. In an investigation using neutron diffraction, it is necessary to replace the light hydrogen in water with deuterium ("heavy hydrogen"). "Unfortunately, this also changes the time scales for ordering in the ice manufacturing process," says Loerting. "But Ph.D. student Tobias Gasser then had the crucial idea of adding a few percent of normal water to the heavy water -- which turned out to speed up the ordering immensely." With the ice obtained in this way, the Innsbruck scientists were finally able to measure neutron data on the high-resolution HRPD instrument at the Rutherford Appleton Laboratory in England and painstakingly solve the crystal structure of ice XIX. This required finding the best crystal structure out of several thousand candidates from the measured data -- much like searching for a needle in a haystack. A Japanese research group confirmed the Innsbruck result in another experiment under different pressure conditions. Both papers have now been published jointly in Nature Communications.
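The "needle in a haystack" step -- scoring thousands of candidate structures against the measured data -- can be illustrated as a residual minimisation. This is only a toy sketch under simplified assumptions; real structure determination uses full Rietveld refinement against the neutron diffraction pattern, not a bare least-squares comparison like this:

```python
# Toy sketch: pick the candidate whose simulated diffraction pattern
# best matches the measured one, scored by sum of squared residuals.
# Patterns are represented here as plain lists of intensities; real
# refinement works on full diffraction profiles with many parameters.

def best_candidate(measured, candidates):
    """Return the index of the candidate with the smallest residual."""
    def residual(pattern):
        return sum((m - p) ** 2 for m, p in zip(measured, pattern))
    return min(range(len(candidates)), key=lambda i: residual(candidates[i]))

measured = [1.0, 0.5, 0.2]
candidates = [[0.9, 0.6, 0.1], [1.0, 0.5, 0.25], [0.2, 0.2, 0.9]]
print(best_candidate(measured, candidates))  # 1
```

Repeating such a comparison over several thousand symmetry-allowed candidates is what makes the search so painstaking.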

Six ice forms discovered in Innsbruck

While conventional ice and snow are abundant on Earth, no other forms are found on the surface of our planet -- except in research laboratories. However, the high-pressure forms ice VI and ice VII are found as inclusions in diamonds and have therefore been added to the list of minerals by the International Mineralogical Association (IMA). Many varieties of water ice are formed in the vastness of space under special pressure and temperature conditions. They are found, for example, on celestial bodies such as Jupiter's moon Ganymede, which is covered by layers of different ice varieties.

Ice XV and ice XIX represent the first sibling pair in ice physics in which the oxygen lattice is the same but the pattern in which the hydrogen atoms are ordered differs. "This also means that for the first time it will now be possible to realize the transition between two ordered ice forms in experiments," Thomas Loerting is pleased to report. Since the 1980s, researchers at the University of Innsbruck, Austria, have been responsible for the discovery of four crystalline as well as two amorphous ice forms.

Read more at Science Daily

Wolves, dogs and dingoes, oh my

 Dogs are generally considered the first domesticated animal, while their ancestor is generally considered to be the wolf; where the Australian dingo fits into this framework, however, is still debated, according to a retired Penn State anthropologist.

"Indigenous Australians understood that there was something different about the dingoes and the colonial dogs," said Pat Shipman, retired adjunct professor of anthropology, Penn State. "They really are, I think, different animals. They react differently to humans. A lot of genetic and behavioral work has been done with wolves, dogs and dingoes. Dingoes come out somewhere in between."

Wolves, dogs and dingoes are all canids -- members of the family Canidae. In most animals, hybridization between closely related species either does not happen or, as with female horses and male donkeys, produces usually infertile offspring such as mules. However, many canid species, including wolves, dingoes and dogs, can interbreed and produce fertile offspring, which makes defining species boundaries in canids more difficult.

Domestic dogs came to the Australian continent in 1788 with the first 11 ships of convicts, but dingoes were already there, as were aboriginal Australians, who had arrived on the continent about 65,000 years ago. A large portion of dingoes in Australia today have domestic dog in their ancestry, but according to fossil evidence dingoes came to Australia at least 4,000 years ago. Shipman believes the arrival may have been even earlier, but no older fossils have yet been found.

"Part of the reason I'm so fascinated with dingoes is that if you see a dingo through American eyes you say, 'that's a dog,'" said Shipman. "In evolutionary terms, dingoes give us a glimpse of what started the domestication process."

Shipman reports her analysis of wolves, dogs and dingoes in a January 2021 special issue of the Anatomical Record.

Dingoes, and the closely related New Guinea singing dogs, look like the default definition of dog, but they are not dogs.

"There is a basic doggy look to dingoes," said Shipman.

Genetically and behaviorally they differ from dogs and are more like wolves in their inability to digest starches and their relationships with humans.

Most domestic dogs evolved along with humans as humans became agriculturalists and moved to a diet containing large amounts of starch, whether from maize, rice, potatoes or wheat. Their genome changed to allow the digestion of these starches. Dingoes, like wolves, have very few of the genes for starch digestion.

While indigenous Australians stole dingo puppies from their dens and raised them, these puppies generally left human homes at maturity and went off to breed and raise offspring. The ability to closely bond with humans is limited in dingoes, although present in dogs. Native Australians also did not manipulate dingo breeding, which is a hallmark of domestication.

Dingoes are also well-adapted to the Australian outback and fare well in that environment. Domestic dogs that become feral do not survive well in the outback.

"Aboriginal Australians were not well-regarded as holders of knowledge or special skill when Europeans came to the continent," said Shipman. "So, no one thought to ask them about dingoes. Even recently, asking aboriginals for their scientific or behavioral knowledge really was not common."

However, aboriginal Australians have a long history of living with dingoes in their lives. Many people argue that dingoes are just dogs -- strange dogs, but just dogs, said Shipman. But, according to aboriginals, dingoes are not dogs.

Read more at Science Daily

A 'twisted elevator' could be key to understanding neurological diseases

 A University of Sydney-led international team of scientists has revealed the shape of one of the most important molecular machines in our cells -- the glutamate transporter -- helping to explain how our brain cells communicate with one another.

Glutamate transporters are tiny proteins on the surface of all our cells that switch on and off the chemical signals that play a big role in making sure all cell-to-cell communication runs smoothly. They are also involved in nerve signalling, metabolism, and learning and memory.

The researchers captured the transporters in exquisite detail using cryogenic electron microscopy (cryo-EM), showing they look like a 'twisted elevator' embedded in the cell membrane.

This world-first discovery opens up a new field of study: whether defects in the transporters could be behind neurological diseases such as Alzheimer's disease.

The results of the research have been published in Nature.

"The first time I saw the image was amazing. It revealed so much about how this transporter works and explained years of previous research," says PhD student Ichia Chen who was lead author on the study.

A multitasking transporter

The researchers were able to 'photograph' the structure of the glutamate transporter by analysing thousands of images of the protein trapped in a thin layer of ice, using cryo-EM, the highly sensitive form of microscopy that made this research possible.

Cryo-EM can make visible what is invisible to the naked eye, using electron beams to photograph biological molecules.

The results also confirm suspicions the researchers had for some time that the glutamate transporters were multi-taskers.

"Using Cryo-EM, we have uncovered for the first time just how these transporters can multitask -- carrying out the dual functions of moving chemicals (like glutamate) across the cell membrane while also allowing water and chloride ions to move through at the same time,"said senior author Professor Renae Ryan from the School of Medical Sciences, Faculty of Medicine and Health.

"These molecular machines use a really cool twisting, elevator-like mechanism to move their cargo across the cell membrane. But they also have an additional function where they can allow water and chloride ions to move across the cell membrane. We have been studying these dual functions for quite some time, but we could never explain how the transporters did this until now. Using a combination of techniques including cryo-EM and computer simulations, we captured this rare state, where we can observe both functions happening at the same time."

"Understanding how the molecular machines in our cells work enables us to interpret defects in these machines in disease states and also gives us clues as to how we might target these machines with therapeutics,"says Professor Ryan.

Key to bridging the gap in diseases

Mapping out in detail the structure of the glutamate transporter could be a crucial tool for researchers in understanding how our bodies work, and the mechanism behind some diseases.

Defects in the glutamate transporter have been linked to many neurological diseases such as Alzheimer's disease and stroke.

These include rare diseases such as episodic ataxia, a condition that impairs movement and causes periodic paralysis, and that results from an uncontrolled leak of chloride through the glutamate transporter in brain cells.

"Understanding the glutamate transporter structure, which controls the normal flow of chloride, could help design drugs that can 'plug up' the chloride channel in episodic ataxia," says co-lead author Dr Qianyi Wu.

Result of teamwork

The paper was the result of seven years of work from researchers in Australia and the United States.

The work also highlights the importance and potential of high-resolution microscopy to understanding biological processes.

Read more at Science Daily

Feb 17, 2021

The smallest galaxies in our universe bring more about dark matter to light

Our universe is dominated by a mysterious matter known as dark matter. Its name comes from the fact that dark matter does not absorb, reflect or emit electromagnetic radiation, making it difficult to detect.

Now, a team of researchers has used stellar kinematics to investigate the strength of dark matter self-interactions in the smallest galaxies in the universe.

"We discovered that the strength of dark matter is quite small, suggesting that dark matter does not easily scatter together," said professor Kohei Hayashi, lead author of the study.

Much is unknown about dark matter, but theoretical and experimental research, from particle physics to astronomy, is elucidating more about it little by little.

One prominent theory surrounding dark matter is the "self-interacting dark matter (SIDM)" theory. It posits that dark matter distributions in galactic centers become less dense because of the self-scattering of dark matter.

However, supernova explosions, which occur toward the end of a massive star's life, can also form less dense distributions. This makes it challenging to distinguish whether it is the supernova explosion or the nature of dark matter that causes a less dense distribution of dark matter.

To clarify this, Hayashi and his team focused on ultra-faint dwarf galaxies, which contain so few stars that the influence of supernova explosions is negligible.

Their findings showed that dark matter is dense at the center of the galaxy, challenging the basic premise of SIDM. Images from the dwarf galaxy Segue 1 revealed high dark matter density at the center of the galaxy, and that scattering is limited.

"Our study showed how useful stellar kinematics in ultra-faint dwarf galaxies are for testing existing theories on dark matter," noted Hayashi. "Further observations using next-generation wide-field spectroscopic surveys with the Subaru Prime Focus Spectrograph, will maximize the chance of obtaining dark matter's smoking gun."

From Science Daily

On the quest for other Earths

In the search for planets capable of sustaining life, an international research team with members from ETH has taken a significant step forward. As the researchers reported recently in the journal Nature Communications, they found signs of a Neptune-sized planet in the Alpha Centauri star system, a mere 4.4 light years away from Earth. This exoplanet is located in a zone that may offer suitable conditions for life. The team was able to collect data with unprecedented sensitivity, thus registering even very weak signals.

Earth is a disruptive factor

Thanks to the new process, the researchers have advanced one step closer to a major goal of exoplanet research: the discovery of Earth-like planets capable of supporting life. Direct imaging of planets delivers information about the composition of their atmospheres and possibly even signs of life. To date, however, direct measurements have mostly found exoplanets that are larger than Jupiter and orbit far away from very young host stars. In other words, these planets fall outside the habitable zone where liquid water could form.

One reason that the search for Earth-like planets has so far proved fruitless is that it has been conducted in the near-infrared range, even though Earth-like planets that might have water are brightest in the mid-infrared range. Yet it is precisely in that range that measurements with normal telescopes are difficult, because that is where the Earth and its atmosphere are also at their brightest. This means the faint signals from exoplanets are lost in particularly strong background noise.

100 hours of observations


As reported in their study, the researchers have now been able to overcome this difficulty and take measurements in the mid-infrared range. They used the Very Large Telescope at the European Southern Observatory in Chile to examine Alpha Centauri stars A and B, logging nearly 100 hours over the course of a month. "Keeping the telescope pointed at the same star for such a long time is highly unusual," explains Anna Boehle, a postdoc in ETH Professor Sascha Quanz's group. As second author of the study, Boehle was heavily involved in evaluating the data. "We assessed more than five million images," she says.

To be able to detect the faint signals from potential planets, the researchers not only processed a huge volume of data, they also employed two sophisticated measurement techniques: one was to use a new deformable secondary telescope mirror, which made it possible to correct for distortions in the light coming through the Earth's atmosphere; and the other was to use a coronagraph to alternately block the light from each of the stars in turn at very short intervals. This let the scientists further reduce signal noise while examining the surroundings of both stars.

Signs of a planet


"Our findings indicate that in principle, this process enables us to discover smaller terrestrial planets capable of hosting life," Boehle explains, "and it represents a clear improvement over previous observation methods." Indeed, in their data the researchers found a light signal that may originate from a Neptune-sized planet. Boehle says, "Whether or not this signal is actually from a planet requires further study. To that end, we plan to combine the infrared measurements with other measurement methods."

From Science Daily

One in five has a mutation that provides superior resilience to cold

 Almost one in five people lacks the protein α-actinin-3 in their muscle fibres. Researchers at Karolinska Institutet in Sweden now show that more of the skeletal muscle of these individuals comprises slow-twitch muscle fibres, which are more durable and energy-efficient and provide better tolerance to low temperatures than fast-twitch muscle fibres. The results are published in the scientific journal The American Journal of Human Genetics.

Skeletal muscle comprises fast-twitch (white) fibres that fatigue quickly and slow-twitch (red) fibres that are more resistant to fatigue. The protein α-actinin-3, which is found only in fast-twitch fibres, is absent in almost 20 per cent of people -- almost 1.5 billion individuals -- due to a mutation in the gene that codes for it. In evolutionary terms, the presence of the mutated gene increased when humans migrated from Africa to the colder climates of central and northern Europe.

"This suggests that people lacking α-actinin-3 are better at keeping warm and, energy-wise, at enduring a tougher climate, but there hasn't been any direct experimental evidence for this before," says Håkan Westerblad, professor of cellular muscle physiology at the Department of Physiology and Pharmacology, Karolinska Institutet. "We can now show that the loss of this protein gives a greater resilience to cold and we've also found a possible mechanism for this."

For the study, 42 healthy men between the ages of 18 and 40 were asked to sit in cold water (14 °C) until their body temperature had dropped to 35.5 °C. During cold water immersion, researchers measured muscle electrical activity with electromyography (EMG) and took muscle biopsies to study the protein content and fibre-type composition.

The results showed that the skeletal muscle of people lacking α-actinin-3 contains a larger proportion of slow-twitch fibres. On cooling, these individuals were able to maintain their body temperature in a more energy-efficient way. Rather than activating fast-twitch fibres, which results in overt shivering, they increased the activation of slow-twitch fibres that produce heat by increasing baseline contraction (tonus).

"The mutation probably gave an evolutionary advantage during the migration to a colder climate, but in today's modern society this energy-saving ability might instead increase the risk of diseases of affluence, which is something we now want to turn our attention to," says Professor Westerblad.

Another interesting question is how the lack of α-actinin-3 affects the body's response to physical exercise.

"People who lack α-actinin-3 rarely succeed in sports requiring strength and explosiveness, while a tendency towards greater capacity has been observed in these people in endurance sports," he explains.

Read more at Science Daily

Study links prolonged sedentary time to distractibility in adults with obesity, overweight

 Scientists used accelerometers to track daily activity levels for a week in 89 adults with obesity or overweight and, in a series of tests, measured their ability to multitask and maintain their attention despite distractions. The study revealed that individuals who spent more sedentary time in bouts lasting 20 minutes or more were less able to overcome distractions.
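Identifying "prolonged" bouts in minute-by-minute accelerometer output is, at its core, a run-length computation. The sketch below assumes a common convention (fewer than 100 activity counts per minute counts as sedentary; 20 consecutive sedentary minutes make a prolonged bout); the study's actual processing pipeline is not described here and may differ:

```python
# Hedged sketch: total minutes spent in prolonged sedentary bouts,
# computed from minute-level accelerometer counts. The cutoff of
# <100 counts/min and the 20-minute bout length are assumed
# conventions, not necessarily the study's exact parameters.

def prolonged_sedentary_minutes(counts_per_min, sedentary_cutoff=100,
                                min_bout_len=20):
    """Sum the lengths of sedentary runs lasting >= min_bout_len minutes."""
    total = 0
    run = 0
    for c in counts_per_min:
        if c < sedentary_cutoff:
            run += 1
        else:
            if run >= min_bout_len:
                total += run
            run = 0
    if run >= min_bout_len:  # a bout may run to the end of the recording
        total += run
    return total

# 25 sedentary minutes, a 5-minute walk, then 10 more sedentary minutes:
# only the 25-minute run qualifies as a prolonged bout.
data = [50] * 25 + [500] * 5 + [50] * 10
print(prolonged_sedentary_minutes(data))  # 25
```

Summing only the long runs, rather than all sedentary minutes, is what distinguishes "prolonged sedentary time" from total sedentary time.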

Reported in the International Journal of Obesity, the research adds to the evidence linking sedentary behaviors and cognition, said University of Illinois Urbana-Champaign kinesiology and community health professor Dominika Pindus, who led the work on the paper.

"Several studies have examined the relationship between different types of sedentary behaviors such as TV viewing and cognitive functions in children and adults," Pindus said. "The relationships they observed varied with the type of sedentary behavior. These studies primarily measured sedentary behaviors during leisure time."

Research has found that regularly sitting for extended periods is linked to increased mortality and cardiovascular disease, Pindus said. People who do not engage in at least 60 minutes per day of moderate-to-vigorous physical activity and sit for eight hours or more have an increased health risk. Other studies suggest that bouts of prolonged sitting lasting 20 minutes or more negatively affect levels of blood sugar after a meal.

"Few studies, however, have examined the relationship between prolonged sedentary time and cognitive functions," Pindus said. To address this gap in research, she and her colleagues focused on the associations between objectively measured, prolonged sedentary time and cognition in adults 25-45 years old with obesity or overweight.

"We know from previous research that people with obesity or overweight don't do as well on certain types of cognitive tasks," Pindus said. "These tasks engage executive functions -- cognitive functions that are important for reasoning and staying focused on a goal."

Some studies have found that long-term physical activity interventions in preadolescent children or older adults can improve those functions.

"But we don't have much data on how prolonged sedentary time is linked to executive functions in working-age people with obesity or overweight," she said. "If we can show how sedentary time and physical activity in everyday life relate to executive functions in those individuals, we may be able to design more targeted lifestyle interventions to improve cognition in this population."

The researchers collected baseline information for all participants, tested their cognitive ability and calculated each person's body mass index and percent body fat. Participants wore accelerometers on their waists during waking hours for seven days. They also completed cognitive tasks and measures of brain function in a laboratory setting.

"We used EEG recordings to measure electrical potentials that are generated in the brain while participants engaged in tasks that challenged them to focus, ignore distractions and flexibly switch attention between tasks," Pindus said. A controller connected to a computer allowed participants to respond to problems while the speed and accuracy of their responses was recorded.

A statistical analysis of participants' sedentariness in relation to their speed and accuracy on a task that measures distractibility found a relationship between the two, Pindus said.

"Our key finding was that people who spent more time in prolonged sedentary bouts were more easily distracted," she said.

More research is needed to determine how the structure of a person's sedentary time influences cognition, Pindus said.

Read more at Science Daily

How the 'noise' in our brain influences our behavior

 The brain's neural activity is irregular, changing from one moment to the next. To date, this apparent "noise" has been thought to be due to random natural variations or measurement error. However, researchers at the Max Planck Institute for Human Development have shown that this neural variability may provide a unique window into brain function. In a new Perspective article out now in the journal Neuron, the authors argue that researchers need to focus more on neural variability to fully understand how behavior emerges from the brain.

When neuroscientists investigate the brain, its activity seems to vary all the time. Sometimes activity is higher or lower, rhythmic or irregular. Whereas averaging brain activity has served as a standard way of visualizing how the brain "works," the irregular, seemingly random patterns in neural signals have often been disregarded. Strikingly, such irregularities in neural activity appear regardless of whether single neurons or entire brain regions are assessed. Brains simply always appear "noisy," prompting the question of what such moment-to-moment neural variability may reveal about brain function.

Across a host of studies over the past 10 years, researchers from the Lifespan Neural Dynamics Group (LNDG) at the Max Planck Institute for Human Development and the Max Planck UCL Centre for Computational Psychiatry and Ageing Research have systematically examined the brain's "noise," showing that neural variability has a direct influence on behavior. In a new Perspective article published in the journal Neuron, the LNDG in collaboration with the University of Lübeck highlights what is now substantial evidence supporting the idea that neural variability represents a key, yet under-valued dimension for understanding brain-behavior relationships. "Animals and humans can indeed adapt successfully to environmental demands, but how can such behavioral success emerge in the face of neural variability? We argue that neuroscientists must grapple with the possibility that behavior may emerge because of neural variability, not in spite of it," says Leonhard Waschke, first author of the article and LNDG postdoctoral fellow.

A recent LNDG study published in the journal eLife exemplifies the direct link between neural variability and behavior. Participants' brain activity was measured via electroencephalography (EEG) while they responded to faint visual targets. When people were told to detect as many visual targets as possible, neural variability generally increased, whereas it was downregulated when participants were asked to avoid mistakes. Crucially, those who were better able to adapt their neural variability to these task demands performed better on the task. "The better a brain can regulate its 'noise,' the better it can process unknown information and react to it. Traditional ways of analyzing brain activity simply disregard this entire phenomenon," says LNDG postdoctoral fellow Niels Kloosterman, first author of this study and co-author of the article in Neuron.
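As a rough illustration of what "regulating variability" means in signal terms, moment-to-moment variability can be quantified as the standard deviation of a signal within successive time windows. This is only a schematic; the study used more sophisticated variability measures:

```python
# Hedged sketch: windowed standard deviation as a simple stand-in for
# moment-to-moment neural variability. Real analyses use richer
# measures, but the idea -- tracking how 'noisy' a signal is over
# time -- is the same.

import statistics

def windowed_variability(signal, window=50):
    """Standard deviation of the signal in consecutive non-overlapping
    windows; higher values indicate a more variable ('noisier') signal."""
    return [statistics.stdev(signal[i:i + window])
            for i in range(0, len(signal) - window + 1, window)]

# A flat segment followed by a more irregular one: variability rises.
flat = [0.0, 0.1] * 25    # 50 samples, low variability
noisy = [0.0, 1.0] * 25   # 50 samples, high variability
v = windowed_variability(flat + noisy)
print(v[1] > v[0])  # True
```

In the eLife experiment the interesting quantity was how such a variability measure moved up or down with task demands, not its absolute level.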

The LNDG continues to demonstrate the importance of neural variability for successful human behavior in an ongoing series of studies. Whether one is asked to process a face, remember an object, or solve a complex task, the ability to modulate moment-to-moment variability seems to be required for optimal cognitive performance. "Neuroscientists have seen this 'noise' in the brain for decades but haven't understood what it means. A growing body of work by our group and others highlights that neural variability may indeed serve as an indispensable signal of behavioral success in its own right. With the increasing availability of tools and approaches to measure neural variability, we are excited that such a hypothesis is now immediately testable," says Douglas Garrett, Senior Research Scientist and LNDG group leader. In the next phases of their research, the group plans to examine whether neural variability and behavior can be optimized through brain stimulation, behavioral training, or medication.

From Science Daily

Feb 16, 2021

First humans in Tasmania must have seen spectacular auroras

 Drilling a 270,000-year-old core from a Tasmanian lake has provided the first Australian record of a major global event in which the Earth's magnetic field 'switched' -- and the opportunity to establish a precedent for developing new paleomagnetic dating tools for Australian archaeology and paleosciences.

"This is the first study of this kind in Australia since pioneering studies in the 1980s," said author Dr Agathe Lisé-Pronovost, a McKenzie Fellow from the School of Earth Sciences at the University of Melbourne.

"Just two lakes in north-east Australia previously provided such a 'full-vector' record, where both the past directions and the past intensity of the Earth's magnetic field are obtained from the same cores."

Published in the journal Quaternary Geochronology, the study, 'Chronostratigraphy of a 270-ka sediment record from Lake Selina, Tasmania: Combining radiometric, geomagnetic and climatic dating', details how drilling the 5.5-metre-long Lake Selina core established that 41,000 years ago, people in Tasmania must have seen spectacular auroras when the Earth's magnetic field flipped and, for a few thousand years, north was south and south was north.

"During the geomagnetic 'excursion', the strength of the Earth's magnetic field almost vanished," said Dr Lisé-Pronovost.

"This would lead to a big increase in cosmic and solar particles bombarding our planet because the magnetic field normally acts like a shield.

"We don't know when the next geomagnetic excursion will happen, but if one was to occur today, satellites would be rendered useless, smartphone navigation apps would fail, and there would be major disruptions of power distribution systems."

Research leading to that discovery got underway in 2014, when Dr Lisé-Pronovost travelled to a small sub-alpine lake in western Tasmania with a team led by Associate Professor Michael-Shawn Fletcher, where a makeshift floating platform rigged to two inflatable rafts was used to drill down into the sediment.

With the core containing a climate, vegetation, and paleomagnetic record of the area, the team first worked to accurately date its layers, finding evidence of the ecosystem changes that occurred as Tasmanian Aboriginal people arrived 43,000 years ago and managed the land over thousands of years. Abrupt changes since the arrival of Europeans 200 years ago are also evident.

"Magnetic particles are eroded from rocks, making their way to a lake by wind or water, and settle down on the lake bottom," said Dr Lisé-Pronovost.

"The magnetic particles act like tiny compass needles, aligning with the Earth's magnetic field. As these particles accumulate and become buried, they become locked in place, leaving a history of the Earth's magnetic field. The deeper we drill, the further back in time we go."

It's hoped the research will pave the way for more studies of past geomagnetic field behavior from Australian lakes and other geological materials, such as lava flows, cave deposits and fired archaeological artefacts, with the aim of developing new paleomagnetic dating tools, improving models of the Earth's magnetic field and, one day, perhaps predicting the next geomagnetic excursion.

Read more at Science Daily

Drinking, smoking, and drug use linked to premature heart disease in the young

 Recreational drinking, smoking, and drug use are linked to premature heart disease in young people, particularly younger women, finds research published online in the journal Heart.

Those who regularly use 4 or more substances are 9 times as likely to be affected, the findings indicate.

The number of new cases of heart disease (atherosclerotic cardiovascular disease) has been increasing in young adults, but the potential role of recreational substance use isn't entirely clear.

To probe this further, the researchers explored whether the recreational use of tobacco, cannabis, alcohol, and illicit drugs, such as amphetamine and cocaine, might be linked to prematurely and extremely prematurely furred up arteries.

They drew on information supplied to the 2014-2015 nationwide Veterans Affairs Healthcare database and the Veterans with premaTure AtheroscLerosis (VITAL) registry.

Extremely premature heart disease was defined as an 'event', such as a heart attack, angina, or stroke before the age of 40, while premature heart disease was defined as an event before the age of 55 in men and before the age of 65 in women.

In all, there were 135,703 people with premature heart disease and 7,716 with extremely premature heart disease. They were compared with more than 1.1 million patients who didn't have premature heart disease.

Recreational use of any substance was independently associated with a higher likelihood of premature and extremely premature heart disease.

Patients with premature heart disease were more likely to smoke (63% vs 41%), drink (32% vs 15%), and to use cocaine (13% vs 2.5%), amphetamines (3% vs 0.5%), and cannabis (12.5% vs 3%).

After accounting for potentially influential factors, such as high blood pressure, diabetes, and high cholesterol, those who smoked tobacco were nearly twice as likely to have premature heart disease while those who drank recreationally were 50% more likely to do so.

Cocaine users were almost 2.5 times as likely to have premature heart disease, while those who used amphetamines were nearly 3 times as likely to do so. Cannabis users were more than 2.5 times as likely to have premature heart disease while those using other drugs were around 2.5 times as likely to do so.

The higher the number of substances used recreationally, the greater the risk of premature heart disease, ranging from a doubling in risk with the use of 1 substance to a 9-fold heightened risk for those using 4 or more.
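As a rough illustration of how such group comparisons translate into odds ratios, here is a minimal sketch using the rounded prevalences quoted above. These are unadjusted figures, so they differ from the study's adjusted estimates, which control for factors such as high blood pressure, diabetes, and high cholesterol:

```python
def odds_ratio(p_cases, p_controls):
    """Unadjusted odds ratio from two prevalence proportions."""
    odds_cases = p_cases / (1 - p_cases)
    odds_controls = p_controls / (1 - p_controls)
    return odds_cases / odds_controls

# Rounded prevalences quoted above: (premature heart disease, comparison group)
substances = {
    "tobacco":      (0.63, 0.41),
    "alcohol":      (0.32, 0.15),
    "cocaine":      (0.13, 0.025),
    "amphetamines": (0.03, 0.005),
    "cannabis":     (0.125, 0.03),
}

for name, (p_cases, p_controls) in substances.items():
    print(f"{name}: unadjusted OR = {odds_ratio(p_cases, p_controls):.1f}")
```

For smoking, for instance, the unadjusted odds ratio works out to roughly 2.5; the "nearly twice as likely" figure reported above is the estimate after adjustment.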

Similar trends were observed among those who had extremely premature heart disease, with recreational substance use associated with 1.5 to 3 times higher odds of heart disease.

The associations were even stronger among women with premature and extremely premature heart disease than among similarly affected men.

This is an observational study, and as such can't establish causality. And the researchers acknowledge that they were unable to gather information on other potentially influential factors, such as the dose and duration of recreational substance use.

In a linked editorial, Dr Anthony Wayne Orr of LSU Health Shreveport, Louisiana, points out that use of cocaine and methamphetamine has been associated with faster cell ageing and neurocognitive decline, with higher than average loss of grey matter.

And epidemiological studies suggest that 1 in 5 young adults misuse several substances and that these 'polysubstance users' often start using at younger ages, and so have worse health over the long term, he says.

The growing body of published research on these issues "suggests the need for a nationwide education campaign on the potential long-term damage being done to the cardiovascular system in patients with substance use disorders," he argues.

These people need to be aware of the long term consequences for their health beyond the risk of an overdose, while doctors should screen patients with a history of substance misuse, he says.

Read more at Science Daily

Answer quickly to be believed

 When people pause before replying to a question, even for just a few seconds, their answers are perceived to be less sincere and credible than if they had replied immediately, according to research published by the American Psychological Association.

And the longer the hesitation, the less sincere the response appears.

"Evaluating other people's sincerity is a ubiquitous and important part of social interactions," said lead author Ignazio Ziano, PhD, of Grenoble Ecole de Management. "Our research shows that response speed is an important cue on which people base their sincerity inferences."

The research was published in the Journal of Personality and Social Psychology.

Researchers conducted a series of experiments involving more than 7,500 individuals from the United States, the United Kingdom and France. Participants either listened to an audio snippet, viewed a video or read an account of a person responding to a simple question (e.g., did they like a cake a friend made or had they stolen money from work). In each scenario, the response time varied from immediate to a 10-second delay. Participants then rated the sincerity of the response on a sliding scale.

Across all 14 experiments, participants consistently rated delayed responses as less sincere regardless of the question, whether it was a harmless one about cake or a more serious one about committing a crime.

A few conditions reduced this effect, the researchers found. For example, if the answer was considered socially undesirable, such as saying, "No, I don't like it" when a friend asks if you like their cake, response speed did not seem to matter much; the answer was considered sincere whether it was fast or slow. The researchers also found that if people thought a slower response was due to mental effort (for instance, having to think back if you had stolen candy 10 years ago), response speed had a smaller effect.

The findings have wide implications, according to Ziano. "Whenever people are interacting, they are judging each other's sincerity. These results can be applied to a wide range of interactions, going from workplace chit-chat to couples and friends bickering," he said. "Further, in job interviews and in court hearings and trials, people are often tasked with judgments of sincerity. Here, too, response speed could play a part."

For example, he said, imagine a hiring manager asking two job candidates, named Ann and Barb, whether they really know the programming language JavaScript, as they claim. Ann says yes immediately, while Barb replies yes after three seconds.

"Our results suggest that in this situation, the hiring manager is more likely to believe Ann than Barb, and therefore more likely to hire Ann," said Ziano. "In general, whenever there is a response that requires an answer, such as in a job interview, delayed responses can be perceived as less sincere."

Another area where response time may be important is jury reactions to testimony in court.

"It would be unfair for the responder, such as a crime suspect, if the response delay was misattributed to thought suppression or answer fabrication when it was in fact caused by a different factor, such as simply being distracted or thoughtful," said Ziano.

The final experiment found that explicitly instructing participants to ignore response delays reduced, but did not completely remove, the effect of delayed responses on judgments of sincerity or guilt.

Read more at Science Daily

Unlocking the mystery behind skeletal aging

 Researchers from the UCLA School of Dentistry have identified the role a critical enzyme plays in skeletal aging and bone loss, putting them one step closer to understanding the complex biological mechanisms that lead to osteoporosis, the bone disease that afflicts some 200 million people worldwide.

The findings from their study in mice, published online in the journal Cell Stem Cell, could hold an important key to developing more effective treatments for osteoporosis and improving the lives of an aging population, they say.

Cells in the bone marrow known as mesenchymal stem cells serve as the building blocks of the body's skeletal tissues, but whether these stem cells ultimately develop into bone or fat tissues is controlled in part by what are known as epigenetic factors -- molecules that regulate genes, silencing some and activating others.

The UCLA researchers, led by distinguished professor Dr. Cun-Yu Wang, chair of oral biology at the dentistry school, demonstrated that when the epigenetic factor KDM4B is absent from mesenchymal stem cells, these cells are far more likely to differentiate into fat cells than bone cells, resulting in an unhealthy imbalance that exacerbates skeletal aging and leads to brittle bones and fractures over time.

"We know that bone loss comes with age, but the mechanisms behind extreme cases such as osteoporosis have, up until recently, been very vague," said Dr Wang, the study's corresponding author and the Dr. No-Hee Park Professor of Dentistry at UCLA. "In this study, we built on more than seven years of research managed by my postdoctoral scholar and lead author Dr. Peng Deng in the hope that we can eventually prevent skeletal aging and osteoporosis."

While scientists have long understood the cellular pathway involved in bone tissue formation, the role of epigenetic factors has been murkier. Previous research by Wang, Deng and others had identified that the enzyme KDM4B plays an important epigenetic role in bone formation, but they were unsure of how its absence might affect the processes of bone formation and bone loss.

To test this, the research team created a mouse model in which KDM4B was absent or removed in several different scenarios. They found that the removal of the enzyme pushed mesenchymal stem cells to create more fat instead of bone tissue, leading to bone loss over time, which mimics skeletal aging.

In one important scenario, the scientists examined stem cell senescence, or deterioration and exhaustion -- the natural process by which mesenchymal stem cells stop rejuvenating or creating more of themselves over time. The team unexpectedly found that senescence, which leads to natural skeletal aging, was characterized by a loss of KDM4B.

In addition to age, other environmental factors are thought to reduce bone quality and exacerbate bone loss, including a high-fat diet. The team demonstrated that a loss of KDM4B significantly promoted bone loss and the accumulation of marrow fat in mice placed on a high-fat diet.

Finally, the team showed that parathyroid hormone, an anabolic drug approved by the U.S. Food and Drug Administration for the treatment of aging-related bone loss, helps to maintain the pool of mesenchymal stem cells in aging mice in a KDM4B-dependent manner.

The results not only confirm the critical role KDM4B plays in mesenchymal stem cell fate decision, skeletal aging and osteoporosis, but they show that the loss of KDM4B exacerbates bone loss under a number of conditions and, surprisingly, that KDM4B controls the ability of mesenchymal stem cells to self-renew. This study is the first in vivo research to demonstrate that the loss of an epigenetic factor promotes adult stem cell deterioration and exhaustion in skeletal aging.

The findings, the researchers say, hold promise for the eventual development of strategies to reverse bone-fat imbalance, as well as for new prevention and treatment methods that address skeletal aging and osteoporosis by rejuvenating adult stem cells.

"The work of Dr. Wang, his lab members and collaborators provides new molecular insight into the changes associated with skeletal aging," said Dr. Paul Krebsbach, dean of the UCLA School of Dentistry. "These findings are an important step towards what may lead to more effective treatment for the millions of people who suffer from bone loss and osteoporosis."

Read more at Science Daily

Climate change likely drove the extinction of North America's largest animals

 A new study published in Nature Communications suggests that the extinction of North America's largest mammals was not driven by overhunting by rapidly expanding human populations following their entrance into the Americas. Instead, the findings, based on a new statistical modelling approach, suggest that populations of large mammals fluctuated in response to climate change, with drastic decreases of temperatures around 13,000 years ago initiating the decline and extinction of these massive creatures. Still, humans may have been involved in more complex and indirect ways than simple models of overhunting suggest.

Before around 10,000 years ago, North America was home to many large and exotic creatures, such as mammoths, gigantic ground-dwelling sloths, larger-than-life beavers, and huge armadillo-like creatures known as glyptodons. But by around 10,000 years ago, most of North America's animals weighing over 44 kg, also known as megafauna, had disappeared. Researchers from the Max Planck Extreme Events Research Group in Jena, Germany, wanted to find out what led to these extinctions. The topic has been intensely debated for decades, with most researchers arguing that human overhunting, climate change, or some combination of the two was responsible. With a new statistical approach, the researchers found strong evidence that climate change was the main driver of extinction.

Overhunting vs. climate change

Since the 1960s, it has been hypothesized that, as human populations grew and expanded across the continents, the arrival of specialized "big-game" hunters in the Americas some 14,000 years ago rapidly drove many giant mammals to extinction. The large animals did not possess the appropriate anti-predator behaviors to deal with a novel, highly social, tool-wielding predator, which made them particularly easy to hunt. According to proponents of this "overkill hypothesis," humans took full advantage of the easy-to-hunt prey, devastating the animal populations and carelessly driving the giant creatures to extinction.

Not everyone agrees with this idea, however. Many scientists have argued that there is too little archaeological evidence to support the idea that megafauna hunting was persistent or widespread enough to cause extinctions. Instead, significant climatic and ecological changes may have been to blame.

Around the time of the extinctions (between 15,000 and 12,000 years ago), there were two major climatic changes. The first was a period of abrupt warming that began around 14,700 years ago, and the second was a cold snap around 12,900 years ago during which the Northern Hemisphere returned to near-glacial conditions. One or both of these important temperature swings, and their ecological ramifications, have been implicated in the megafauna extinctions.

"A common approach has been to try to determine the timing of megafauna extinctions and to see how they align with human arrival in the Americas or some climatic event," says Mathew Stewart, co-lead author of the study. "However, extinction is a process -- meaning that it unfolds over some span of time -- and so to understand what caused the demise of North America's megafauna, it's crucial that we understand how their populations fluctuated in the lead up to extinction. Without those long-term patterns, all we can see are rough coincidences."

'Dates as data'

To test these conflicting hypotheses, the authors used a new statistical approach developed by W. Christopher Carleton, the study's other co-lead author, and published last year in the Journal of Quaternary Science. Estimating population sizes of prehistoric hunter-gatherer groups and long-extinct animals cannot be done by counting heads or hooves. Instead, archaeologists and palaeontologists use the radiocarbon record as a proxy for past population sizes. The rationale is that the more animals and humans present in a landscape, the more datable carbon is left behind after they are gone, which is then reflected in the archaeological and fossil records. Unlike established approaches, the new method better accounts for uncertainty in fossil dates.

The major problem with the previous approach is that it blends the uncertainty associated with radiocarbon dates with the process scientists are trying to identify.

"As a result, you can end up seeing trends in the data that don't really exist, making this method rather unsuitable for capturing changes in past population levels. Using simulation studies where we know what the real patterns in the data are, we have been able to show that the new method does not have the same problems. As a result, our method is able to do a much better job capturing through-time changes in population levels using the radiocarbon record," explains Carleton.
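The "dates as data" logic can be sketched in a few lines: treat each radiocarbon date as a noisy draw from the underlying event-time distribution and check whether the recovered density tracks the population curve that generated it. The curve shape, sample size, and dating error below are illustrative inventions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(42)

# Calendar years before present, 15,000 -> 10,000 BP in 10-year steps
years = np.arange(15000, 10000, -10)

# Invented population curve: growth through the warm period, then a
# decline after the cold snap around 12,900 years ago
population = np.where(
    years > 12900,
    np.interp(years, [12900, 15000], [100.0, 20.0]),  # warming phase
    np.interp(years, [10000, 12900], [5.0, 100.0]),   # cooling phase
)

# Each dated find is a draw from the population curve...
true_dates = rng.choice(years, size=2000, p=population / population.sum())

# ...observed with radiocarbon-style measurement error (sigma = 150 years)
observed = true_dates + rng.normal(0.0, 150.0, size=true_dates.size)

# A density of observed dates serves as a crude population proxy; the
# study's method additionally models the dating uncertainty explicitly
hist, edges = np.histogram(observed, bins=25)
peak = 0.5 * (edges[hist.argmax()] + edges[hist.argmax() + 1])
print(f"population proxy peaks near {peak:.0f} years BP")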

North American megafauna extinctions

The authors applied this new approach to the question of the Late Quaternary North American megafauna extinctions. In contrast to previous studies, the new findings show that megafauna populations fluctuated in response to climate change.

"Megafauna populations appear to have been increasing as North America began to warm around 14,700 years ago," states Stewart. "But we then see a shift in this trend around 12,900 years ago as North America began to drastically cool, and shortly after this we begin to see the extinctions of megafauna occur."

And while these findings suggest that the return to near glacial conditions around 12,900 years ago was the proximate cause for the extinctions, the story is likely to be more complicated than this.

"We must consider the ecological changes associated with these climate changes at both a continental and regional scale if we want to have a proper understanding of what drove these extinctions," explains group leader Huw Groucutt, senior author of the study. "Humans also aren't completely off the hook, as it remains possible that they played a more nuanced role in the megafauna extinctions than simple overkill models suggest."

Many researchers have argued that it is an impossible coincidence that megafauna extinctions around the world often happened around the time of human arrival. However, it is important to scientifically demonstrate that there was a relationship, and even if there was, the causes may have been much more indirect (such as through habitat modification) than a killing frenzy as humans arrived in a region.

Read more at Science Daily

Feb 15, 2021

Ancient seashell resonates after 18,000 years

 Almost 80 years after its discovery, a large shell from the ornate Marsoulas Cave in the Pyrenees has been studied by a multidisciplinary team from the CNRS, the Muséum de Toulouse, the Université Toulouse -- Jean Jaurès and the Musée du quai Branly -- Jacques-Chirac (1): it is believed to be the oldest wind instrument of its type. Scientists reveal how it sounds in a study published in the journal Science Advances on 10th February 2021.

The Marsoulas Cave, between Haute-Garonne and Ariège, was the first decorated cave to be found in the Pyrenees. Discovered in 1897, the cave bears witness to the beginning of the Magdalenian (2) culture in this region, at the end of the Last Glacial Maximum. During an inventory of the material from the archaeological excavations, most of which is kept in the Muséum de Toulouse, scientists examined a large Charonia lampas (sea snail) shell, which had been largely overlooked when discovered in 1931.

The tip of the shell is broken, forming a 3.5 cm diameter opening. As this is the hardest part of the shell, the break is clearly not accidental. At the opposite end, the shell opening shows traces of retouching (cutting) and a tomography scan has revealed that one of the first coils is perforated. Finally, the shell has been decorated with a red pigment (hematite), characteristic of the Marsoulas Cave, which indicates its status as a symbolic object.

To confirm the hypothesis that this conch was used to produce sounds, scientists enlisted the help of a horn player, who managed to produce three sounds close to the notes C, C-sharp and D. As the opening is irregular and covered with an organic coating (3), the researchers assume that a mouthpiece was also attached, as is the case for more recent conches in the collection of the Musée du quai Branly -- Jacques Chirac. 3D impressions of the conch will enable this lead to be explored and to verify whether it can be used to produce other notes.
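For reference, those three pitches can be put in numbers. In modern equal temperament with A4 = 440 Hz (a convention the Magdalenian players obviously did not share, and with the octave assumed here purely for illustration), the notes work out as follows:

```python
A4 = 440.0  # Hz, modern concert pitch

def note_freq(semitones_from_a4):
    """Equal-temperament frequency a given number of semitones away from A4."""
    return A4 * 2 ** (semitones_from_a4 / 12)

# C4, C-sharp-4 and D4 sit 9, 8 and 7 semitones below A4
for name, offset in [("C", -9), ("C-sharp", -8), ("D", -7)]:
    print(f"{name}: {note_freq(offset):.1f} Hz")
```

This prints roughly 261.6, 277.2 and 293.7 Hz: three pitches only one semitone apart, spanning barely 32 Hz.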

The first carbon-14 dating of the cave, carried out on a piece of charcoal and a fragment of bear bone from the same archaeological level as the shell, provided a date of around 18,000 years. This makes the Marsoulas conch the oldest wind instrument of its type: to date, only flutes have been discovered in earlier European Upper Palaeolithic contexts; the conches found outside Europe are much more recent.

Read more at Science Daily

Neanderthals and Homo sapiens used identical Nubian technology

 Long held in a private collection, the newly analysed tooth of an approximately 9-year-old Neanderthal child marks the hominin's southernmost known range. Analysis of the associated archaeological assemblage suggests Neanderthals used Nubian Levallois technology, previously thought to be restricted to Homo sapiens.

With a high concentration of cave sites harbouring evidence of past populations and their behaviour, the Levant is a major centre for human origins research. For over a century, archaeological excavations in the Levant have produced human fossils and stone tool assemblages that reveal landscapes inhabited by both Neanderthals and Homo sapiens, making this region a potential mixing ground between populations. Distinguishing these populations by stone tool assemblages alone is difficult, but one technology, the distinct Nubian Levallois method, is argued to have been produced only by Homo sapiens.

In a new study published in Scientific Reports, researchers from the Max Planck Institute for the Science of Human History teamed up with international partners to re-examine the fossil and archaeological record of Shukbah Cave. Their findings extend the southernmost known range of Neanderthals and suggest that our now-extinct relatives made use of a technology previously argued to be a trademark of modern humans. This study marks the first time the lone human tooth from the site has been studied in detail, in combination with a major comparative study examining the stone tool assemblage.

"Sites where hominin fossils are directly associated with stone tool assemblages remain a rarity -- but the study of both fossils and tools is critical for understanding hominin occupations of Shukbah Cave and the larger region," says lead author Dr Jimbob Blinkhorn, formerly of Royal Holloway, University of London and now with the Pan-African Evolution Research Group (Max Planck Institute for the Science of Human History).

Shukbah Cave was first excavated in the spring of 1928 by Dorothy Garrod, who reported a rich assemblage of animal bones and Mousterian-style stone tools cemented in breccia deposits, often concentrated in well-marked hearths. She also identified a large, unique human molar. However, the specimen was kept in a private collection for most of the 20th century, prohibiting comparative studies using modern methods. The recent re-identification of the tooth at the Natural History Museum in London has led to new detailed work on the Shukbah collections.

"Professor Garrod immediately saw how distinctive this tooth was. We've examined the size, shape and both the external and internal 3D structure of the tooth, and compared that to Holocene and Pleistocene Homo sapiens and Neanderthal specimens. This has enabled us to clearly characterise the tooth as belonging to an approximately 9 year old Neanderthal child," says Dr. Clément Zanolli, from Université de Bordeaux. "Shukbah marks the southernmost extent of the Neanderthal range known to date," adds Zanolli.

Although Homo sapiens and Neanderthals shared the use of a wide suite of stone tool technologies, Nubian Levallois technology has recently been argued to have been exclusively used by Homo sapiens. The argument has been made particularly in southwest Asia, where Nubian Levallois tools have been used to track human dispersals in the absence of fossils.

"Illustrations of the stone tool collections from Shukbah hinted at the presence of Nubian Levallois technology so we revisited the collections to investigate further. In the end, we identified many more artefacts produced using the Nubian Levallois methods than we had anticipated," says Blinkhorn. "This is the first time they've been found in direct association with Neanderthal fossils, which suggests we can't make a simple link between this technology and Homo sapiens."

"Southwest Asia is a dynamic region in terms of hominin demography, behaviour and environmental change, and may be particularly important to examine interactions between Neanderthals and Homo sapiens," adds Prof Simon Blockley, of Royal Holloway, University of London. "This study highlights the geographic range of Neanderthal populations and their behavioural flexibility, but also issues a timely note of caution that there are no straightforward links between particular hominins and specific stone tool technologies."

Read more at Science Daily

The comet that killed the dinosaurs

 It was tens of miles wide and forever changed history when it crashed into Earth about 66 million years ago.

The Chicxulub impactor, as it's known, left behind a crater off the coast of Mexico that spans 93 miles and goes 12 miles deep. Its devastating impact brought the reign of the dinosaurs to an abrupt and calamitous end by triggering their sudden mass extinction, along with the end of almost three-quarters of the plant and animal species then living on Earth.

The enduring puzzle has always been where the asteroid or comet that set off the destruction originated, and how it came to strike the Earth. And now a pair of Harvard researchers believe they have the answer.

In a study published in Scientific Reports, Avi Loeb, Frank B. Baird Jr. Professor of Science at Harvard, and Amir Siraj '21, an astrophysics concentrator, put forth a new theory that could explain the origin and journey of this catastrophic object and others like it.

Using statistical analysis and gravitational simulations, Loeb and Siraj show that a significant fraction of long-period comets originating from the Oort cloud, a sphere of debris at the edge of the solar system, can be bumped off course by Jupiter's gravitational field and sent close to the sun, whose tidal force breaks the comets apart. This increases the rate of comets like Chicxulub (pronounced Chicks-uh-lub), because the fragments cross the Earth's orbit and hit the planet once every 250 to 730 million years or so.

"Basically, Jupiter acts as a kind of pinball machine," said Siraj, who is also co-president of Harvard Students for the Exploration and Development of Space and is pursuing a master's degree at the New England Conservatory of Music. "Jupiter kicks these incoming long-period comets into orbits that bring them very close to the sun."

It's because of this that long-period comets, which take more than 200 years to orbit the sun, are called sun grazers, he said.

"When you have these sun grazers, it's not so much the melting that goes on, which is a pretty small fraction relative to the total mass, but the comet is so close to the sun that the part that's closer to the sun feels a stronger gravitational pull than the part that is farther from the sun, causing a tidal force," he said. "You get what's called a tidal disruption event and so these large comets that come really close to the sun break up into smaller comets. And basically, on their way out, there's a statistical chance that these smaller comets hit the Earth."

The calculations from Loeb and Siraj's theory increase the chances of long-period comets impacting Earth by a factor of about 10, and show that about 20 percent of long-period comets become sun grazers. That finding falls in line with research from other astronomers.

The pair claim that their new rate of impact is consistent with the age of Chicxulub, providing a satisfactory explanation for its origin and other impactors like it.

"Our paper provides a basis for explaining the occurrence of this event," Loeb said. "We are suggesting that, in fact, if you break up an object as it comes close to the sun, it could give rise to the appropriate event rate and also the kind of impact that killed the dinosaurs."

Loeb and Siraj's hypothesis might also explain the makeup of many of these impactors.

"Our hypothesis predicts that other Chicxulub-size craters on Earth are more likely to correspond to an impactor with a primitive (carbonaceous chondrite) composition than expected from the conventional main-belt asteroids," the researchers wrote in the paper.

This is important because a popular theory on the origin of Chicxulub holds that the impactor is a fragment of a much larger asteroid from the main belt, the asteroid population between the orbits of Mars and Jupiter. Only about a tenth of all main-belt asteroids have a carbonaceous chondrite composition, while it's assumed most long-period comets do. Evidence found at the Chicxulub crater and other similar craters suggests the impactors had a carbonaceous chondrite composition.

This includes an object that hit about 2 billion years ago and left the Vredefort crater in South Africa, which is the largest confirmed crater in Earth's history, and the impactor that left the Zhamanshin crater in Kazakhstan, which is the largest confirmed crater within the last million years.

The researchers say that this composition evidence supports their model, and that the ages of these impacts support their calculated impact rates, both for Chicxulub-sized tidally disrupted comets and for smaller ones like the impactor that made the Zhamanshin crater. If produced the same way, they say those smaller fragments would strike Earth once every 250,000 to 730,000 years.

Loeb and Siraj say their hypothesis can be tested by further studying these craters, others like them, and even ones on the surface of the moon to determine the composition of the impactors. Space missions sampling comets can also help.

Aside from the composition of comets, the new Vera Rubin Observatory in Chile may be able to observe the tidal disruption of long-period comets after it becomes operational next year.

"We should see smaller fragments coming to Earth more frequently from the Oort cloud," Loeb said. "I hope that we can test the theory by having more data on long-period comets, get better statistics, and perhaps see evidence for some fragments."

Loeb said understanding this is not just crucial to solving a mystery of Earth's history but could prove pivotal if such an event were to threaten the planet again.

Read more at Science Daily

Commuters are inhaling unacceptably high levels of carcinogens

 A new study finds that California's commuters are likely inhaling chemicals at levels that increase the risk for cancer and birth defects.

As with most chemicals, the poison is in the amount. Under a certain threshold of exposure, even known carcinogens are not likely to cause cancer. Once you cross that threshold, the risk for disease increases.

Governmental agencies tend to regulate that threshold in workplaces. However, private spaces such as the interior of our cars and living rooms are less studied and less regulated.

Benzene and formaldehyde -- both used in automobile manufacturing -- are known to cause cancer at or above certain levels of exposure and are Prop. 65-listed chemicals.

New UC Riverside research shows that the average commuter in California is exceeding the threshold for exposure, breathing in unacceptably high levels of both chemicals.

Both benzene and formaldehyde are carcinogens, and benzene carries the additional risk of reproductive and developmental toxicity.

"These chemicals are very volatile, moving easily from plastics and textiles to the air that you breathe," said David Volz, UCR professor of environmental toxicology.

The study, published in the journal Environment International, calculated the daily dose of benzene and formaldehyde being inhaled by drivers with commutes of at least 20 minutes per day.

It found that up to 90% of the population in Los Angeles, San Diego, Orange, Santa Clara, and Alameda counties have at least a 10% chance of exceeding the cancer-risk threshold from inhaling the chemicals, based on 30-minute average commute times.

"Of course, there is a range of exposure that depends on how long you're in the car, and how much of the compounds your car is emitting," said Aalekhya Reddam, a graduate student in the Volz laboratory, and lead author of the study.

Previously, Volz and Reddam studied commuter exposure to a flame retardant called TDCIPP or chlorinated tris, and found that longer commute times increased exposure to that carcinogen as well.

They set out on this study wanting to understand the risk of that compound relative to other chemicals introduced during car manufacturing.

Reddam advises commuters to keep the windows open during their rides if possible. "At least with some air flow, you'd be diluting the concentration of these chemicals inside your car," she said.

Read more at Science Daily

Feb 14, 2021

How a single gene alteration may have separated modern humans from predecessors

 As a professor of pediatrics and cellular and molecular medicine at University of California San Diego School of Medicine, Alysson R. Muotri, PhD, has long studied how the brain develops and what goes wrong in neurological disorders. For almost as long, he has also been curious about the evolution of the human brain -- what changed that makes us so different from preceding Neanderthals and Denisovans, our closest evolutionary relatives, now extinct?

Evolutionary studies rely heavily on two tools -- genetics and fossil analysis -- to explore how a species changes over time. But neither approach can reveal much about brain development and function because brains do not fossilize, Muotri said. There is no physical record to study.

So Muotri decided to try stem cells, a tool not often applied in evolutionary reconstructions. Stem cells, the self-renewing precursors of other cell types, can be used to build brain organoids -- "mini brains" in a laboratory dish. Muotri and colleagues have pioneered the use of stem cells to compare humans to other primates, such as chimpanzees and bonobos, but until now a comparison with extinct species was not thought possible.

In a study published February 11, 2021 in Science, Muotri's team catalogued the differences between the genomes of diverse modern human populations and the Neanderthals and Denisovans, who lived during the Pleistocene Epoch, approximately 2.6 million to 11,700 years ago. Mimicking an alteration they found in one gene, the researchers used stem cells to engineer "Neanderthal-ized" brain organoids.

"It's fascinating to see that a single base-pair alteration in human DNA can change how the brain is wired," said Muotri, senior author of the study and director of the UC San Diego Stem Cell Program and a member of the Sanford Consortium for Regenerative Medicine. "We don't know exactly how and when in our evolutionary history that change occurred. But it seems to be significant, and could help explain some of our modern capabilities in social behavior, language, adaptation, creativity and use of technology."

The team initially found 61 genes that differed between modern humans and our extinct relatives. One of these altered genes -- NOVA1 -- caught Muotri's attention because it's a master gene regulator, influencing many other genes during early brain development. The researchers used CRISPR gene editing to engineer modern human stem cells with the Neanderthal-like mutation in NOVA1. Then they coaxed the stem cells into forming brain cells and ultimately Neanderthal-ized brain organoids.

Brain organoids are little clusters of brain cells formed by stem cells, but they aren't exactly brains (for one, they lack connections to other organ systems, such as blood vessels). Yet organoids are useful models for studying genetics, disease development and responses to infections and therapeutic drugs. Muotri's team has even optimized the brain organoid-building process to achieve organized electrical oscillatory waves similar to those produced by the human brain.

The Neanderthal-ized brain organoids looked very different than modern human brain organoids, even to the naked eye. They had a distinctly different shape. Peering deeper, the team found that modern and Neanderthal-ized brain organoids also differ in the way their cells proliferate and how their synapses -- the connections between neurons -- form. Even the proteins involved in synapses differed. And electrical impulses displayed higher activity at earlier stages, but didn't synchronize in networks in Neanderthal-ized brain organoids.

According to Muotri, the neural network changes in Neanderthal-ized brain organoids parallel the way newborn non-human primates acquire new abilities more rapidly than human newborns.

"This study focused on only one gene that differed between modern humans and our extinct relatives. Next we want to take a look at the other 60 genes, and what happens when each, or a combination of two or more, are altered," Muotri said.

"We're looking forward to this new combination of stem cell biology, neuroscience and paleogenomics. The ability to apply the comparative approach of modern humans to other extinct hominins, such as Neanderthals and Denisovans, using brain organoids carrying ancestral genetic variants is an entirely new field of study."

To continue this work, Muotri has teamed up with Katerina Semendeferi, professor of anthropology at UC San Diego and study co-author, to co-direct the new UC San Diego Archealization Center, or ArchC.

Read more at Science Daily

'Gamechanger' drug for treating obesity cuts body weight by 20 percent

 One third (35%) of people who took a new drug for treating obesity lost more than one-fifth (greater than or equal to 20%) of their total body weight, according to a major global study involving UCL researchers.

The findings from the large-scale international trial, published today in the New England Journal of Medicine, are being hailed as a "gamechanger" for improving the health of people with obesity and could play a major part in helping the UK to reduce the impact of diseases, such as COVID-19.

The drug, semaglutide, works by hijacking the body's own appetite regulating system in the brain leading to reduced hunger and calorie intake.

Rachel Batterham, Professor of Obesity, Diabetes and Endocrinology who leads the Centre for Obesity Research at UCL and the UCLH Centre for Weight Management, is one of the principal authors on the paper which involved almost 2,000 people in 16 countries.

Professor Batterham (UCL Medicine) said: "The findings of this study represent a major breakthrough for improving the health of people with obesity. Three quarters (75%) of people who received semaglutide 2.4mg lost more than 10% of their body weight and more than one-third lost more than 20%. No other drug has come close to producing this level of weight loss -- this really is a gamechanger. For the first time, people can achieve through drugs what was only possible through weight-loss surgery."

Professor Batterham added: "The impact of obesity on health has been brought into sharp focus by COVID-19 where obesity markedly increases the risk of dying from the virus, as well as increasing the risk of many life-limiting serious diseases including heart disease, type 2 diabetes, liver disease and certain types of cancers. This drug could have major implications for UK health policy for years to come."

The average participant in the trial lost 15.3kg (nearly 3 stone); this was accompanied by reductions in risk factors for heart disease and diabetes, such as waist circumference, blood fats, blood sugar and blood pressure and reported improvements in their overall quality of life.

The trial's UK Chief Investigator, Professor John Wilding (University of Liverpool) said: "This is a significant advance in the treatment of obesity. Semaglutide is already approved and used clinically at a lower dose for treatment of diabetes, so as doctors we are already familiar with its use. For me this is particularly exciting as I was involved in very early studies of GLP1 (when I worked at the Hammersmith Hospital in the 1990s we were the first to show in laboratory studies that GLP1 affected appetite), so it is good to see this translated into an effective treatment for people with obesity."

With evidence from this trial, semaglutide has been submitted for regulatory approval as a treatment for obesity to the National Institute for Health and Care Excellence (NICE), the European Medicines Agency (EMA) and the US Food and Drug Administration (FDA).

About the trial

The Phase III 'STEP'* randomised controlled trial involved 1,961 adults who were either overweight or had obesity (average weight 105kg/16.5 stone; body mass index 38kg/m2), and took place at 129 sites in 16 countries across Asia, Europe, North America, and South America.

Participants took a 2.4mg dose of semaglutide (or matching placebo) weekly via subcutaneous (under the skin) injection, similar to the way people with diabetes inject insulin. Overall, 94.3% of participants completed the 68-week study, which started in autumn 2018.

Those taking part also received individual face-to-face or phone counselling sessions from registered dietitians every four weeks to help them adhere to the reduced-calorie diet and increased physical activity, providing guidance, behavioural strategies and motivation. Additionally, participants received incentives such as kettle bells or food scales to mark progress and milestones.

In those taking semaglutide, the average weight loss was 15.3kg (nearly three stone), with a reduction in BMI of -5.54. The placebo group observed an average weight loss of 2.6kg (0.4 stone) with a reduction in BMI of -0.92.
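The reported BMI reductions follow directly from the definition of body mass index (weight in kilograms divided by height in metres squared). A minimal sketch checking the arithmetic, with the average participant height inferred from the reported averages (105 kg at BMI 38 kg/m²), which is an assumption for illustration rather than a figure from the trial:

```python
# BMI is defined as weight (kg) divided by height (m) squared.
def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

# Height inferred from the reported averages: 105 kg at BMI 38 kg/m^2
# (an illustrative assumption, not a figure stated in the trial).
height = (105 / 38) ** 0.5  # ~1.66 m

# Average weight loss of 15.3 kg on semaglutide:
delta_bmi = bmi(105, height) - bmi(105 - 15.3, height)
print(round(delta_bmi, 2))  # ~5.54, matching the reported BMI reduction
```

The same calculation applied to the placebo group's 2.6 kg loss gives a BMI reduction of roughly 0.94, close to the reported -0.92.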

Those who had taken semaglutide also saw reductions in risk factors for heart disease and diabetes, such as waist circumference, blood fats, blood sugar and blood pressure and reported improvements in their overall quality of life.

About the drug

Semaglutide is clinically approved to be used for patients with type 2 diabetes, though it is typically prescribed in much lower doses of 1mg.

The drug is structurally similar to (and mimics) the human glucagon-like peptide-1 (GLP-1) hormone, which is released into the blood from the gut after meals.

GLP-1 induces weight loss by reducing hunger, increasing feelings of fullness and thereby helping people eat less and reduce their calorie intake.

Read more at Science Daily

Hubble uncovers concentration of small black holes

 Globular clusters are extremely dense stellar systems, in which stars are packed closely together. They are also typically very old -- the globular cluster that is the focus of this study, NGC 6397, is almost as old as the Universe itself. It resides 7800 light-years away, making it one of the closest globular clusters to Earth. Because of its very dense nucleus, it is known as a core-collapsed cluster.

When Eduardo Vitral and Gary A. Mamon of the Institut d'Astrophysique de Paris set out to study the core of NGC 6397, they expected to find evidence for an "intermediate-mass" black hole (IMBH). These are smaller than the supermassive black holes that lie at the cores of large galaxies, but larger than stellar-mass black holes formed by the collapse of massive stars. IMBHs are the long-sought "missing link" in black hole evolution and their mere existence is hotly debated, although a few candidates have been found.

To look for the IMBH, Vitral and Mamon analysed the positions and velocities of the cluster's stars. They did this using previous estimates of the stars' proper motions from Hubble images of the cluster spanning several years, in addition to proper motions provided by ESA's Gaia space observatory, which precisely measures the positions, distances and motions of stars. Knowing the distance to the cluster allowed the astronomers to translate the proper motions of these stars into velocities.
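Translating proper motions into velocities uses the standard astrometric relation v = 4.74 × μ × d, where μ is the proper motion in arcseconds per year and d is the distance in parsecs. A minimal sketch, with the cluster distance converted from the 7,800 light-years quoted above; the 1 mas/yr proper motion is an illustrative value, not a measurement from the study:

```python
# Transverse velocity from proper motion: v [km/s] = 4.74 * mu [arcsec/yr] * d [pc].
# The 4.74 factor converts AU per year into km per second.
LY_PER_PC = 3.2616  # light-years per parsec

def transverse_velocity(mu_mas_per_yr: float, distance_pc: float) -> float:
    mu_arcsec = mu_mas_per_yr / 1000.0  # milliarcseconds -> arcseconds
    return 4.74 * mu_arcsec * distance_pc

d_pc = 7800 / LY_PER_PC              # NGC 6397: ~2,392 pc
v = transverse_velocity(1.0, d_pc)   # a 1 mas/yr proper motion (illustrative)
print(round(v, 1))                   # ~11.3 km/s at the cluster's distance
```

This is why knowing the cluster's distance matters: the same angular motion on the sky corresponds to a larger physical velocity the farther away the star is.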

"Our analysis indicated that the orbits of the stars are close to random throughout the globular cluster, rather than systematically circular or very elongated," explained Mamon.

"We found very strong evidence for invisible mass in the dense central regions of the cluster, but we were surprised to find that this extra mass is not point-like but extended to a few percent of the size of the cluster," added Vitral.

This invisible component could only be made up of the remnants (white dwarfs, neutron stars, and black holes) of massive stars whose inner regions collapsed under their own gravity once their nuclear fuel was exhausted. The stars progressively sank to the cluster's centre after gravitational interactions with nearby less massive stars, leading to the small extent of the invisible mass concentration. Using the theory of stellar evolution, the scientists concluded that the bulk of the unseen concentration is made of stellar-mass black holes, rather than white dwarfs or neutron stars that are too faint to observe.

Two recent studies had also proposed that stellar remnants and in particular, stellar-mass black holes, could populate the inner regions of globular clusters.

"Our study is the first finding to provide both the mass and the extent of what appears to be a collection of mostly black holes in a core-collapsed globular cluster," said Vitral.

"Our analysis would not have been possible without having both the Hubble data to constrain the inner regions of the cluster and the Gaia data to constrain the orbital shapes of the outer stars, which in turn indirectly constrain the velocities of foreground and background stars in the inner regions," added Mamon, attesting to an exemplary international collaboration.

Read more at Science Daily

Lemurs show there's no single formula for lasting love

 Humans aren't the only mammals that form long-term bonds with a single, special mate -- some bats, wolves, beavers, foxes and other animals do, too. But new research suggests the brain circuitry that makes love last in some species may not be the same in others.

The study, appearing Feb. 12 in the journal Scientific Reports, compares monogamous and promiscuous species within a closely related group of lemurs, distant primate cousins of humans from the island of Madagascar.

Red-bellied lemurs and mongoose lemurs are among the few species in the lemur family tree in which male-female partners stick together year after year, working together to raise their young and defend their territory.

Once bonded, pairs spend much of their waking hours grooming each other or huddled side by side, often with their tails wrapped around each other's bodies. Males and females of these species spend a third of a lifetime with the same mate. The same cannot be said of their closest relatives, who change partners often.

To biologists, monogamy is somewhat of a mystery. That's in part because in many animal groups it's rare. While around 90% of bird species practice some form of fidelity to one partner, only 3% to 5% of mammals do. The vast majority of the roughly 6,500 known species of mammals have open relationships, so to speak.

"It's an uncommon arrangement," said lead author Nicholas Grebe, a postdoctoral associate in professor Christine Drea's lab at Duke University.

Which raises a question: what makes some species biologically inclined to pair up for the long haul while others play the field?

Studies over the last 30 years in rodents point to two hormones released during mating, oxytocin and vasopressin, suggesting that the key to lasting love may lie in differences in how they act on the brain.

Some of the first clues came from influential research on prairie voles, small mouse-like mammals that, unlike most rodents, mate for life. When researchers compared the brains of monogamous prairie voles with their promiscuous counterparts, montane voles and meadow voles, they found that prairie voles had more "docking sites" for these hormones, particularly in parts of the brain's reward system.

Since these "cuddle chemicals" were found to enhance male-female bonds in voles, researchers have long wondered if they might work the same way in humans.

That's why the Duke-led team turned to lemurs. Despite being our most distant primate relatives, lemurs are a closer genetic match to humans than voles are.

The researchers used an imaging technique called autoradiography to map binding sites for oxytocin and vasopressin in the brains of 12 lemurs that had died of natural causes at the Duke Lemur Center.

The animals represented seven species: monogamous red-bellied and mongoose lemurs along with five promiscuous species in the same genus.

"They're really the only comparable natural experiment to look for biological signatures of monogamy in primates," Grebe said.

Comparing the brain imaging results in lemurs with previous results in voles and monkeys revealed some noticeable differences in the density and distribution of hormone receptors. In other words, oxytocin and vasopressin appear to act on different parts of the brain in lemurs -- which means they may also have different effects, depending on their target cell's location.

But within lemurs, the researchers were surprised to find few consistent differences between monogamous species and promiscuous ones.

"We don't see evidence of a pair-bond circuit" akin to that found in rodent brains, Grebe said.

As a next step, the team is looking at how lemur couples behave toward each other if the actions of oxytocin are blocked, by feeding them an antagonist that temporarily prevents oxytocin from binding to its receptors in the brain.

So what can lemurs teach us about love? The authors say their findings caution against drawing simple conclusions based on rodent experiments about how human social behaviors came to be.

Oxytocin may be the "potion of devotion" for voles, but it may be the combined actions and interactions of multiple brain chemicals, along with ecological factors, that create long-lasting bonds in lemurs and other primates, including humans, Grebe said.

"There are probably a number of different ways through which monogamy is instantiated within the brain, and it depends on what animals we're looking at," Grebe said. "There's more going on than we originally thought."

Read more at Science Daily