Apr 3, 2020

Homo naledi juvenile remains offer clues to how our ancestors grew up

A partial skeleton of Homo naledi represents a rare case of an immature individual, shedding light on the evolution of growth and development in human ancestry, according to a study published April 1, 2020 in the open-access journal PLOS ONE by Debra Bolter of Modesto Junior College in California and the University of the Witwatersrand, Johannesburg, and colleagues.

Much research has gone into the evolution of ancient hominins -- human relatives and ancestors -- but little is known about their growth and development. Most hominin fossils represent adult individuals, and remains of developmentally young hominins are rare. This has left a gap in our understanding of how our ancient relatives grew from young into adults, and how modern human growth patterns evolved.

In this study, Bolter and colleagues examined fossils from the Dinaledi Chamber of the Rising Star Cave System in South Africa. This site is famous for providing abundant remains of the hominin Homo naledi, including individuals ranging from infants to adults. These fossils date to the late Middle Pleistocene, between 335,000 and 226,000 years ago, possibly overlapping in time with the earliest members of our own species. The team identified a collection of arm and leg bones and a partial jaw as the remains of a single young individual designated DH7.

The bones and teeth of DH7 were not fully developed and display a mixture of maturity patterns seen in modern humans and earlier hominins. DH7 is estimated to be similar in its developmental stage to immature specimens of other fossil hominins between 8 and 11 years old at death. The authors note, however, that if Homo naledi had a slower growth rate like modern humans, DH7 might have been as old as 15. Further study is needed to assess how Homo naledi grew and where it fits into the evolution of human growth and development.

Bolter adds: "The rare juvenile Homo naledi partial skeleton will shed light on whether this extinct species is more human-like in its development, or more primitive. The findings help reconstruct the selective pressures that shaped extended maturity in our own species."

From Science Daily

People tune out facts and trust their guts in medical emergencies

A study conducted by two associate professors of marketing at The University of Texas at Arlington shows that people are more likely to base decisions on anecdotal information instead of facts when they feel anxious and vulnerable.

Traci Freling and Ritesh Saini, both in the College of Business, published "When poignant stories outweigh cold hard facts: A meta-analysis of the anecdotal bias" in Organizational Behavior and Human Decision Processes.

"We found that people are more likely to consider personal anecdotes than fact-based information, especially when it deals with medical emergencies," Freling said. "This has a high importance in the current environment, where everyone is concerned about the coronavirus."

Freling said people are more likely to listen to personal stories instead of facts because emotions run high during medical emergencies like the COVID-19 pandemic.

"They are especially dismissive of facts if the incident is something they personally experienced," Freling said. "Specifically, we show that when an issue is health-related, personally relevant or highly threatening, then decision-making is compromised and people tend to rely on anecdotes."

Freling pointed to the run on toilet paper during the COVID-19 pandemic as one example of decisions not based on facts. This example illustrates how consumers who feel vulnerable to a particular problem may rely more heavily on subjective, anecdotal information instead of objective, statistical facts to make decisions.

Former UTA faculty member Zhiyong Yang, now a professor at the University of North Carolina-Greensboro, and two graduate students contributed to the analysis.

The research also revealed that when emotional engagement is low, statistical evidence weighs more heavily.

"Primarily, when there is low-threat severity or it's a non-health issue, people tend to take cold, hard facts into account rather than personal accounts and stories," Freling said.

Additionally, Saini noted that people make "more fact-based decisions when choosing for others, but become surprisingly irrational when choosing for self."

Elten Briggs, chair of the Department of Marketing, said Freling and Saini's analysis could have implications for decision-making processes in business and industry, especially during medical crises.

Read more at Science Daily

When three species of human ancestor walked the Earth

An international team, including Arizona State University researcher Gary Schwartz, has unearthed the earliest known skull of Homo erectus, the first of our ancestors to be nearly human-like in anatomy and aspects of behavior.

Years of painstaking excavation at the fossil-rich site of Drimolen, nestled within the Cradle of Humankind (a UNESCO World Heritage site located just 40 kilometers, or around 25 miles, northwest of Johannesburg in South Africa), have resulted in the recovery of several new and important fossils. The skull, attributed to Homo erectus, is securely dated to two million years old.

Published this week in Science, the findings of the international team of nearly 30 scientists from five countries detail this skull -- the most ancient fossil Homo erectus known -- and other fossils from the site, and discuss how these new finds are forcing us to rewrite a part of our species' evolutionary history.

The high-resolution dating of Drimolen's fossil deposits demonstrates that the new skull pre-dates Homo erectus specimens from other sites within and outside of Africa by at least 100,000 to 200,000 years, confirming an African origin for the species.

The skull, reconstructed from more than 150 separate fragments, is of an individual likely aged between three and six years old, giving scientists a rare glimpse into childhood growth and development in these early human ancestors.

Additional fossils recovered from Drimolen belong to a different species -- in fact, a different genus of ancient human altogether -- the more heavily built, robust human ancestor Paranthropus robustus, known to also occur at several nearby cave sites preserving fossils of the same geological age. A third, distinctive species, Australopithecus sediba, is known from two-million-year-old deposits of an ancient cave site virtually down the road from Drimolen.

"Unlike the situation today, where we are the only human species, two million years ago our direct ancestor was not alone," said project director and lead researcher from La Trobe University in Australia, Andy Herries.

Gary Schwartz, a paleoanthropologist and research associate with ASU's Institute of Human Origins, participated in the excavations and recovery of the new cranium, and as an expert in the evolution of growth and development, is continuing his work with the research team to analyze the many infant and juvenile specimens found at the site.

"What is really exciting is the discovery that during this same narrow time slice, at just around two million years ago, there were three very different types of ancient human ancestors roaming the same small landscape," said Schwartz.

"We don't yet know whether they interacted directly, but their presence raises the possibility that these ancient fossil humans evolved strategies to divvy up the landscape and its resources in some way to enable them to live in such close proximity." Schwartz is also an Associate Professor in the School of Human Evolution and Social Change.

The ability to date Drimolen's ancient cave deposits with such a high degree of precision, using a range of different dating techniques, allowed the team to address important broader questions about human evolution in this region of Africa.

Paper coauthor Justin Adams from Monash University (Australia), a specialist in reconstructing paleohabitats based on the animals preserved at fossil sites, said the discovery now allows researchers to address what role changing habitats, resources, and the unique biological adaptations of early Homo erectus may have played in the eventual extinction of Australopithecus sediba in South Africa.

"The discovery of the earliest Homo erectus marks a milestone for South African fossil heritage," says project codirector and University of Johannesburg doctoral student Stephanie Baker.

Fieldwork will continue at Drimolen, expanding the excavations to include even more ancient components of the cave and to provide a more in-depth glimpse at the forces shaping human evolution in this part of the African continent.

Read more at Science Daily

Lucy had an ape-like brain

A new study led by paleoanthropologists Philipp Gunz and Simon Neubauer from the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, reveals that Lucy's species Australopithecus afarensis had an ape-like brain. However, the protracted brain growth suggests that -- as is the case in humans -- infants may have had a long dependence on caregivers.

The species Australopithecus afarensis inhabited East Africa more than three million years ago, and occupies a key position in the hominin family tree, as it is widely accepted to be ancestral to all later hominins, including the human lineage. "Lucy and her kind provide important evidence about early hominin behavior. They walked upright, had brains that were around 20 percent larger than those of chimpanzees, and may have used sharp stone tools," explains senior author Zeresenay Alemseged from the University of Chicago, who directs the Dikika field project in Ethiopia, where the skeleton of an Australopithecus child was found in the year 2000. "Our new results show how their brains developed, and how they were organized," adds Alemseged.

To study brain growth and organization in Australopithecus afarensis, the researchers scanned the fossil cranium of the Dikika child using synchrotron microtomography at the European Synchrotron Radiation Facility (ESRF) in Grenoble, France. With the help of this state-of-the-art technology, researchers can determine the age at death with a precision of a few weeks.

In addition, seven other well-preserved fossil crania from the Ethiopian sites Dikika and Hadar were scanned using high-resolution conventional tomography. Several years of painstaking fossil reconstruction, and counting of dental growth lines, yielded an exceptionally preserved brain imprint of the Dikika child, a precise age at death, new endocranial volume estimates, and previously undetected endocranial features of well-known Australopithecus fossils.

These data shed new light on two questions that have been controversial: Is there evidence for human-like brain reorganization in Australopithecus afarensis? Was the pattern of brain growth in A. afarensis more similar to that of chimpanzees or that of humans?

Extended childhood

Contrary to previous claims, the endocranial imprints of Australopithecus afarensis reveal an ape-like brain organization, and no features derived towards humans. However, a comparison of infant and adult endocranial volumes nevertheless indicates more human-like protracted brain growth in Australopithecus afarensis, likely critical for the evolution of a long period of childhood learning in hominins.

The brains of modern humans are not only much larger than those of our closest living ape relatives, they are also organized differently, and take longer to grow and mature. For example, compared with chimpanzees, modern human infants learn longer at the expense of being entirely dependent on parental care for longer periods of time. Together, these characteristics are important for human cognition and social behavior, but their evolutionary origins remain unclear. Brains do not fossilize, but as the brain grows and expands before and after birth, the tissues surrounding its outer layer leave an imprint in the bony braincase. Based on these endocasts the researchers could measure endocranial volume, and infer key aspects of brain organization from impressions of brain convolutions in the skull.

Differences in brain organization

A key difference between apes and humans involves the organization of the brain's parietal and occipital lobes. "In all ape brains, a well-defined lunate sulcus approximates the anterior boundary of the primary visual cortex of the occipital lobes," explains co-author Dean Falk from Florida State University, a specialist in interpreting endocranial imprints. Some have previously argued that structural changes of the brain resulted in a more backwards (human-like) placement of the lunate sulcus on endocasts of australopiths, and eventually to the disappearance of a clear endocranial impression in humans. Hypothetically, such brain reorganization in australopiths could have been linked to behaviors that were more complex than those of their great ape relatives (e.g., tool manufacture, mentalizing, and vocal communication). Unfortunately, the lunate sulcus typically does not reproduce well on endocasts, so there is unresolved controversy about its position in australopiths.

The exceptionally well preserved endocast of the Dikika child has an unambiguous impression of a lunate sulcus in an ape-like position. Likewise, the computed tomographic scans reveal a previously undetected impression of an ape-like lunate sulcus in a well-known fossil of an adult Australopithecus individual from Hadar (A.L. 162-28). Contrary to previous claims, the researchers did not find evidence for brain reorganization in any Australopithecus afarensis endocast that preserves detailed sulcal impressions.

Virtual dental histology

In infants, synchrotron computed tomographic scans of the dentition make it possible to determine an individual's age at death by counting dental growth lines. Similar to the growth rings of a tree, virtual sections of a tooth reveal incremental growth lines reflecting the body's internal rhythm. Studying the fossilized teeth of the Dikika infant, the team's dental experts Paul Tafforeau (ESRF), Adeline Le Cabec (ESRF/Max Planck Institute for Evolutionary Anthropology), and Tanya Smith (Griffith University) calculated an age at death of 861 days (2.4 years).
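The conversion from the counted growth lines to the reported age is simple arithmetic; a minimal sketch, assuming a standard average year length of 365.25 days:

```python
# Convert the Dikika infant's counted daily dental growth lines (861 days,
# as reported in the study) into an approximate age in years.
days_at_death = 861
days_per_year = 365.25  # average year length, including leap years

age_years = days_at_death / days_per_year
print(round(age_years, 1))  # 2.4
```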

"After seven years of work, we finally had all the puzzle pieces to study the evolution of brain growth," says lead author Philipp Gunz: "The age at death of the Dikika child and its endocranial volume, the endocranial volumes of the best-preserved adult Australopithecus afarensis fossils, and comparative data of more than 1600 modern humans and chimpanzees."

Protracted brain growth

The pace of dental development of the Dikika infant was broadly comparable to that of chimpanzees and therefore faster than in modern humans. However, given that the brains of Australopithecus afarensis adults were roughly 20 percent larger than those of chimpanzees, the Dikika child's small endocranial volume suggests a prolonged period of brain development relative to chimpanzees. "Even a conservative comparison of the Dikika infant to small-statured and small-brained adults like Lucy suggests that brain growth in Australopithecus afarensis was protracted as in humans today," explains Simon Neubauer.

Read more at Science Daily

Apr 2, 2020

Climate change may be making migration harder by shortening nightingales' wings

The Common Nightingale, known for its beautiful song, breeds in Europe and parts of Asia and migrates to sub-Saharan Africa every winter. A new study published in The Auk: Ornithological Advances suggests that natural selection driven by climate change is causing these iconic birds to evolve shorter wings, which might make them less likely to survive their annual migration.

Complutense University of Madrid's Carolina Remacha and Javier Pérez-Tris and their colleagues analyzed twenty years of data on wing shape variation and survival in two populations of nightingales from central Spain. They found that nightingales' average wing length relative to their body size has decreased over the past two decades, becoming less optimal for migration. Shorter-winged birds were less likely to return to their breeding grounds after their first round-trip to Africa. But if this change in wing length is negatively affecting survival, what is driving it?

The "migratory gene package" hypothesis predicts that a suite of adaptations related to migration -- including a long wingspan as well as a higher resting metabolic rate, larger clutch size, and shorter lifespan -- may all be controlled by a set of genes that are linked so that selective pressures on one trait also affect the others. In recent decades, the timing of spring has shifted in central Spain and summer droughts have become longer and more intense, leaving nightingales with a shorter window in which to raise their young. This means the most successful birds may be those that lay smaller clutches of eggs, giving them fewer young to care for. And if natural selection is favoring smaller clutches, it may simultaneously push nightingales away from all of the linked traits in the "migratory gene package."

Natural selection on clutch size that inadvertently leads to shorter wings and, therefore, reduced survival is an example of "maladaptation," where organisms' responses to changing conditions end up being harmful instead of helpful. "There is much evidence that climate change is having an effect on migratory birds, changing their arrival and laying dates and their physical features over the last few decades," says lead author Carolina Remacha. "If we are to fully understand how bird populations adapt to new environments in order to help them tackle the challenges of a rapidly changing world, it is important to call attention to the potential problems of maladaptive change."

From Science Daily

Trial drug can significantly block early stages of COVID-19 in engineered human tissues

An international team led by University of British Columbia researcher Dr. Josef Penninger has found a trial drug that effectively blocks the cellular door SARS-CoV-2 uses to infect its hosts.

The findings, published today in Cell, hold promise as a treatment capable of stopping early infection of the novel coronavirus that, as of April 2, has affected more than 981,000 people and claimed the lives of 50,000 people worldwide.

The study provides new insights into key aspects of SARS-CoV-2, the virus that causes COVID-19, and its interactions on a cellular level, as well as how the virus can infect blood vessels and kidneys.

"We are hopeful our results have implications for the development of a novel drug for the treatment of this unprecedented pandemic," says Penninger, professor in UBC's faculty of medicine, director of the Life Sciences Institute and the Canada 150 Research Chair in Functional Genetics at UBC.

"This work stems from an amazing collaboration among academic researchers and companies, including Dr. Ryan Conder's gastrointestinal group at STEMCELL Technologies in Vancouver, Nuria Montserrat in Spain, Drs. Haibo Zhang and Art Slutsky from Toronto and especially Ali Mirazimi's infectious biology team in Sweden, who have been working tirelessly day and night for weeks to better understand the pathology of this disease and to provide breakthrough therapeutic options."

ACE2 -- a protein on the surface of the cell membrane -- is now at centre-stage in this outbreak as the key receptor for the spike glycoprotein of SARS-CoV-2. In earlier work, Penninger and colleagues at the University of Toronto and the Institute of Molecular Biology in Vienna first identified ACE2, and found that in living organisms, ACE2 is the key receptor for SARS, the viral respiratory illness recognized as a global threat in 2003. His laboratory also went on to link the protein to both cardiovascular disease and lung failure.

While the COVID-19 outbreak continues to spread around the globe, the absence of a clinically proven antiviral therapy or a treatment specifically targeting the critical SARS-CoV-2 receptor ACE2 on a molecular level has meant an empty arsenal for health care providers struggling to treat severe cases of COVID-19.

"Our new study provides very much needed direct evidence that a drug -- called APN01 (human recombinant soluble angiotensin-converting enzyme 2 -- hrsACE2) -- soon to be tested in clinical trials by the European biotech company Apeiron Biologics, is useful as an antiviral therapy for COVID-19," says Dr. Art Slutsky, a scientist at the Keenan Research Centre for Biomedical Science of St. Michael's Hospital and professor at the University of Toronto who is a collaborator on the study.

In cell cultures analyzed in the current study, hrsACE2 reduced the coronavirus load by a factor of 1,000 to 5,000. In engineered replicas of human blood vessels and kidneys -- organoids grown from human stem cells -- the researchers demonstrated that the virus can directly infect and replicate itself in these tissues. This provides important information about the development of the disease and helps explain why severe cases of COVID-19 present with multi-organ failure and evidence of cardiovascular damage. Clinical-grade hrsACE2 also reduced SARS-CoV-2 infection in these engineered human tissues.

"Using organoids allows us to test in a very agile way treatments that are already being used for other diseases, or that are close to being validated. In these moments in which time is short, human organoids save the time that we would spend to test a new drug in the human setting," says Núria Montserrat, ICREA professor at the Institute for Bioengineering of Catalonia in Spain.

Read more at Science Daily

Discovery of life in solid rock deep beneath sea may inspire new search for life on Mars

Newly discovered single-celled creatures living deep beneath the seafloor have given researchers clues about how they might find life on Mars. These bacteria were discovered living in tiny cracks inside volcanic rocks after researchers persisted over a decade of trial and error to find a new way to examine the rocks.

Researchers estimate that the rock cracks are home to a community of bacteria as dense as that of the human gut, about 10 billion bacterial cells per cubic centimeter (0.06 cubic inch). In contrast, the average density of bacteria living in mud sediment on the seafloor is estimated to be 100 cells per cubic centimeter.

"I am now almost over-expecting that I can find life on Mars. If not, it must be that life relies on some other process that Mars does not have, like plate tectonics," said Associate Professor Yohey Suzuki from the University of Tokyo, referring to the movement of land masses around Earth most notable for causing earthquakes. Suzuki is first author of the research paper announcing the discovery, published in Communications Biology.

Magic of clay minerals

"I thought it was a dream, seeing such rich microbial life in rocks," said Suzuki, recalling the first time he saw bacteria inside the undersea rock samples.

Undersea volcanoes spew out lava at approximately 1,200 degrees Celsius (2,200 degrees Fahrenheit), which eventually cracks as it cools down and becomes rock. The cracks are narrow, often less than 1 millimeter (0.04 inch) across. Over millions of years, those cracks fill up with clay minerals, the same clay used to make pottery. Somehow, bacteria find their way into those cracks and multiply.

"These cracks are a very friendly place for life. Clay minerals are like a magic material on Earth; if you can find clay minerals, you can almost always find microbes living in them," explained Suzuki.

The microbes identified in the cracks are aerobic bacteria, meaning they use a process similar to how human cells make energy, relying on oxygen and organic nutrients.

"Honestly, it was a very unexpected discovery. I was very lucky, because I almost gave up," said Suzuki.

Cruise for deep ocean samples

Suzuki and his colleagues discovered the bacteria in rock samples that he helped collect in late 2010 during the Integrated Ocean Drilling Program (IODP). IODP Expedition 329 took a team of researchers from the tropical island of Tahiti in the middle of the Pacific Ocean to Auckland, New Zealand. The research ship anchored above three locations along the route across the South Pacific Gyre and used a metal tube 5.7 kilometers long to reach the ocean floor. Then, a drill cut down 125 meters below the seafloor and pulled out core samples, each about 6.2 centimeters across. The first 75 meters beneath the seafloor were mud sediment and then researchers collected another 40 meters of solid rock.

Depending on the location, the rock samples were estimated to be 13.5 million, 33.5 million and 104 million years old. The collection sites were not near any hydrothermal vents or sub-seafloor water channels, so researchers are confident the bacteria arrived in the cracks independently rather than being forced in by a current. The rock core samples were also sterilized with an artificial seawater wash and a quick burn to prevent surface contamination, a process Suzuki compares to making aburi (flame-seared) sushi.

At that time, the standard way to find bacteria in rock samples was to chip away the outer layer of the rock, then grind the center of the rock into a powder and count cells out of that crushed rock.

"I was making loud noises with my hammer and chisel, breaking open rocks while everyone else was working quietly with their mud," he recalled.

How to slice a rock

Over the years, continuing to hope that bacteria might be present but unable to find any, Suzuki decided he needed a new way to look specifically at the cracks running through the rocks. He found inspiration in the way pathologists prepare ultrathin slices of body tissue samples to diagnose disease. Suzuki decided to coat the rocks in a special epoxy to support their natural shape so that they wouldn't crumble when he sliced off thin layers.

These thin sheets of solid rock were then washed with dye that stains DNA and placed under a microscope.

The bacteria appeared as glowing green spheres tightly packed into tunnels that glowed orange, surrounded by black rock. That orange glow comes from clay mineral deposits, the "magic material" giving bacteria an attractive place to live.

Whole genome DNA analysis identified the different species of bacteria that lived in the cracks. Samples from different locations had similar, but not identical, species of bacteria. Rocks at different locations are different ages, which may affect what minerals have had time to accumulate and therefore what bacteria are most common in the cracks.

Suzuki and his colleagues speculate that the clay mineral-filled cracks concentrate the nutrients that the bacteria use as fuel. This might explain why the density of bacteria in the rock cracks is eight orders of magnitude greater than the density of bacteria living freely in mud sediment where seawater dilutes the nutrients.
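The "eight orders of magnitude" figure follows directly from the two densities quoted earlier (about 10 billion cells per cubic centimeter in the rock cracks versus about 100 in mud sediment); a minimal check:

```python
import math

# Cell densities reported in the article, in cells per cubic centimeter.
density_rock_cracks = 10_000_000_000  # ~10 billion, comparable to the human gut
density_mud_sediment = 100

ratio = density_rock_cracks / density_mud_sediment
print(int(math.log10(ratio)))  # 8 -> eight orders of magnitude
```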

From the ocean floor to Mars

The clay minerals filling cracks in deep ocean rocks are likely similar to the minerals that may be in rocks now on the surface of Mars.

"Minerals are like a fingerprint for what conditions were present when the clay formed. Neutral to slightly alkaline levels, low temperature, moderate salinity, iron-rich environment, basalt rock -- all of these conditions are shared between the deep ocean and the surface of Mars," said Suzuki.

Suzuki's research team is beginning a collaboration with NASA's Johnson Space Center to design a plan to examine rocks collected from the Martian surface by rovers. Ideas include keeping the samples locked in a titanium tube and using a CT (computed tomography) scanner, a type of 3D X-ray, to look for life inside clay mineral-filled cracks.

Read more at Science Daily

Traces of ancient rainforest in Antarctica point to a warmer prehistoric world

Researchers have found evidence of rainforests near the South Pole 90 million years ago, suggesting the climate was exceptionally warm at the time.

A team from the UK and Germany discovered forest soil from the Cretaceous period within 900 km of the South Pole. Their analysis of the preserved roots, pollen and spores shows that the world at that time was a lot warmer than previously thought.

The discovery and analysis were carried out by an international team of researchers led by geoscientists from the Alfred Wegener Institute Helmholtz Centre for Polar and Marine Research in Germany and including Imperial College London researchers. Their findings are published today in Nature.

Co-author Professor Tina van de Flierdt, from the Department of Earth Science & Engineering at Imperial, said: "The preservation of this 90-million-year-old forest is exceptional, but even more surprising is the world it reveals. Even during months of darkness, swampy temperate rainforests were able to grow close to the South Pole, revealing an even warmer climate than we expected."

The work also suggests that the carbon dioxide (CO2) levels in the atmosphere were higher than expected during the mid-Cretaceous period, 115-80 million years ago, challenging climate models of the period.

The mid-Cretaceous was the heyday of the dinosaurs but was also the warmest period in the past 140 million years, with temperatures in the tropics as high as 35 degrees Celsius and sea level 170 metres higher than today.

However, little was known about the environment south of the Antarctic Circle at this time. Now, researchers have discovered evidence of a temperate rainforest in the region, such as would be found in New Zealand today. This was despite a four-month polar night, meaning for a third of every year there was no life-giving sunlight at all.

The presence of the forest suggests average temperatures were around 12 degrees Celsius and that there was unlikely to be an ice cap at the South Pole at the time.

The evidence for the Antarctic forest comes from a core of sediment drilled into the seabed near the Pine Island and Thwaites glaciers in West Antarctica. One section of the core, which would have originally been deposited on land, caught the researchers' attention with its strange colour.

The team CT-scanned the section of the core and discovered a dense network of fossil roots, which was so well preserved that they could make out individual cell structures. The sample also contained countless traces of pollen and spores from plants, including the first remnants of flowering plants ever found at these high Antarctic latitudes.

To reconstruct the environment of this preserved forest, the team assessed the climatic conditions under which the plants' modern descendants live, as well as analysing temperature and precipitation indicators within the sample.

They found that the annual mean air temperature was around 12 degrees Celsius; roughly two degrees warmer than the mean temperature in Germany today. Average summer temperatures were around 19 degrees Celsius; water temperatures in the rivers and swamps reached up to 20 degrees; and the amount and intensity of rainfall in West Antarctica were similar to those in today's Wales.

To get these conditions, the researchers conclude that 90 million years ago the Antarctic continent was covered with dense vegetation, there were no land-ice masses on the scale of an ice sheet in the South Pole region, and the carbon dioxide concentration in the atmosphere was far higher than previously assumed for the Cretaceous.

Read more at Science Daily

Apr 1, 2020

Regular exercise benefits immunity -- even in isolation

Being in isolation without access to gyms and sports clubs should not mean people stop exercising, according to a new study from researchers at the University of Bath. Keeping up regular, daily exercise at a time when much of the world is going into isolation will play an important role in helping to maintain a healthy immune system.

The analysis, published in the international journal Exercise Immunology Review by leading physiologists Dr James Turner and Dr John Campbell of the University of Bath's Department for Health, considers the effect of exercise on our immune function.

Over the last four decades, many studies have investigated how exercise affects the immune system. It is widely agreed that regular moderate intensity exercise is beneficial for immunity, but a view held by some is that more arduous exercise can suppress immune function, leading to an 'open-window' of heightened infection risk in the hours and days following exercise.

In a benchmark study in 2018, this 'open window' hypothesis was challenged by Dr Campbell and Dr Turner. They reported in a review article that the theory was not well supported by scientific evidence, summarising that there is limited reliable evidence that exercise suppresses immunity, concluding instead that exercise is beneficial for immune function.

They say that, in the short term, exercise can help the immune system find and deal with pathogens, and in the long term, regular exercise slows down changes that happen to the immune system with ageing, therefore reducing the risk of infections.

In a new article, published this month, leading experts, including Dr Turner and Dr Campbell, debated whether the immune system can change in a negative or positive way after exercise, and whether or not athletes get more infections than the general population. The article concludes that infections are more likely to be linked to inadequate diet, psychological stress, insufficient sleep, travel and importantly, pathogen exposure at social gathering events like marathons -- rather than the act of exercising itself.

Author Dr James Turner from the Department for Health at the University of Bath explains: "Our work has concluded that there is very limited evidence for exercise directly increasing the risk of becoming infected with viruses. In the context of coronavirus and the conditions we find ourselves in today, the most important consideration is reducing your exposure from other people who may be carrying the virus. But people should not overlook the importance of staying fit, active and healthy during this period. Provided it is carried out in isolation -- away from others -- then regular, daily exercise will help better maintain the way the immune system works -- not suppress it."

Co-author, Dr John Campbell added: "People should not fear that their immune system will be suppressed by exercise placing them at increased risk of Coronavirus. Provided exercise is carried out according to latest government guidance on social distancing, regular exercise will have a tremendously positive effect on our health and wellbeing, both today and for the future."

Regular moderate intensity aerobic exercise, such as walking, running or cycling is recommended, with the aim of achieving 150 minutes per week. Longer, more vigorous exercise would not be harmful, but if capacity to exercise is restricted due to a health condition or disability, the message is to 'move more' and that 'something is better than nothing'. Resistance exercise has clear benefits for maintaining muscles, which also helps movement.

Read more at Science Daily

Understanding brain tumors in children

Medulloblastomas are among the most common malignant brain tumours affecting children. They spread from the cerebellum to the surrounding tissue and can also spread to other parts of the central nervous system via the cerebrospinal fluid. Because these tumours grow rapidly, physicians do not have much time to find a suitable treatment.

Researchers from EMBL, together with colleagues from Hopp Children's Cancer Center Heidelberg (KiTZ), the German Cancer Consortium, and St. Jude Children's Research Hospital have conducted the most comprehensive medulloblastoma-related genetic investigation to date. "We analysed the genome and tumour genome of 800 children, adolescents, and adults with medulloblastoma and compared the genetic data with data from healthy individuals," explains lead author Dr. Sebastian Waszak from EMBL, who was also part of the EMBL led Pan-Cancer project.

In characterising the molecular properties of medulloblastoma, the scientists hope to be able to recommend other treatment options besides standard therapies, and to develop new therapies with a focus on the mode of action. In analysing the healthy and mutated genome, they came across a particularly striking hereditary difference in children and young people with brain tumours in the so-called Sonic Hedgehog medulloblastoma subgroup.

A hereditary genetic defect in 15 percent of cases meant that tumours were no longer able to produce the elongator complex protein 1 (ELP1). This protein is involved in ensuring that other proteins are properly assembled and folded in line with the genetic code. The latest findings show that, without ELP1, much of the protein production in tumours is disturbed: "The assembly and folding of larger proteins in particular does not function properly any more, and the accumulation of these non-functioning or malfunctioning proteins places the cells under permanent stress," says KiTZ Director Dr. Stefan Pfister. "Hundreds of proteins are misregulated in this way, including proteins that are important for nerve cell development."

By analysing the genomes of some of the parents and grandparents of study participants, the researchers also established that this novel genetic disease is hereditary. "That makes this the most common congenital genetic defect associated with medulloblastoma to date," says Dr. Jan Korbel, a co-author of the study and group leader at EMBL Heidelberg. Sebastian Waszak, now a group leader at the Norwegian node of the Nordic EMBL Partnership for Molecular Medicine, adds: "The latest results show that around 40 percent of children and young people who suffer from this subtype of medulloblastoma have a congenital genetic predisposition for it. That is a much higher proportion than we had assumed."

Read more at Science Daily

Hubble finds best evidence for elusive mid-sized black hole

Black hole illustration
Astronomers have found the best evidence for the perpetrator of a cosmic homicide: a black hole of an elusive class known as "intermediate-mass," which betrayed its existence by tearing apart a wayward star that passed too close.

Weighing in at about 50,000 times the mass of our Sun, the black hole is smaller than the supermassive black holes (at millions or billions of solar masses) that lie at the cores of large galaxies, but larger than stellar-mass black holes formed by the collapse of a massive star.

These so-called intermediate-mass black holes (IMBHs) are a long-sought "missing link" in black hole evolution. Though there have been a few other IMBH candidates, researchers consider these new observations the strongest evidence yet for mid-sized black holes in the universe.

It took the combined power of two X-ray observatories and the keen vision of NASA's Hubble Space Telescope to nail down the cosmic beast.

"Intermediate-mass black holes are very elusive objects, and so it is critical to carefully consider and rule out alternative explanations for each candidate. That is what Hubble has allowed us to do for our candidate," said Dacheng Lin of the University of New Hampshire, principal investigator of the study. The results are published on March 31, 2020, in The Astrophysical Journal Letters.

The story of the discovery reads like a Sherlock Holmes story, involving the meticulous step-by-step case-building necessary to catch the culprit.

Lin and his team used Hubble to follow up on leads from NASA's Chandra X-ray Observatory and ESA's (the European Space Agency) X-ray Multi-Mirror Mission (XMM-Newton). In 2006 these satellites detected a powerful flare of X-rays, but they could not determine whether it originated from inside or outside of our galaxy. Researchers attributed it to a star being torn apart after coming too close to a gravitationally powerful compact object, like a black hole.

Surprisingly, the X-ray source, named 3XMM J215022.4−055108, was not located in a galaxy's center, where massive black holes normally would reside. This raised hopes that an IMBH was the culprit, but first another possible source of the X-ray flare had to be ruled out: a neutron star in our own Milky Way galaxy, cooling off after being heated to a very high temperature. Neutron stars are the crushed remnants of an exploded star.

Hubble was pointed at the X-ray source to resolve its precise location. Deep, high-resolution imaging provides strong evidence that the X-rays emanated not from an isolated source in our galaxy, but from a distant, dense star cluster on the outskirts of another galaxy -- just the type of place astronomers expected to find an IMBH. Previous Hubble research has shown that the mass of a black hole in the center of a galaxy is proportional to the mass of that host galaxy's central bulge. In other words, the more massive the galaxy, the more massive its black hole. Therefore, the star cluster that is home to 3XMM J215022.4−055108 may be the stripped-down core of a lower-mass dwarf galaxy that was gravitationally and tidally disrupted by its close interactions with its current, larger host galaxy.

IMBHs have been particularly difficult to find because they are smaller and less active than supermassive black holes; they do not have readily available sources of fuel, nor as strong a gravitational pull to draw stars and other cosmic material which would produce telltale X-ray glows. Astronomers essentially have to catch an IMBH red-handed in the act of gobbling up a star. Lin and his colleagues combed through the XMM-Newton data archive, searching hundreds of thousands of observations to find one IMBH candidate.

The X-ray glow from the shredded star allowed astronomers to estimate the black hole's mass at about 50,000 solar masses, based on both the X-ray luminosity and the shape of the spectrum. "This is much more reliable than using X-ray luminosity alone as typically done before for previous IMBH candidates," said Lin. "The reason why we can use the spectral fits to estimate the IMBH mass for our object is that its spectral evolution showed that it has been in the thermal spectral state, a state commonly seen and well understood in accreting stellar-mass black holes."

This object isn't the first to be considered a likely candidate for an intermediate-mass black hole. In 2009 Hubble teamed up with NASA's Swift observatory and ESA's XMM-Newton to identify what is interpreted as an IMBH, called HLX-1, located towards the edge of the galaxy ESO 243-49. It too is in the center of a young, massive cluster of blue stars that may be a stripped-down dwarf galaxy core. The X-rays come from a hot accretion disk around the black hole. "The main difference is that our object is tearing a star apart, providing strong evidence that it is a massive black hole, instead of a stellar-mass black hole as people often worry about for previous candidates including HLX-1," Lin said.

Read more at Science Daily

Oldest ever human genetic evidence clarifies dispute over our ancestors

DNA illustration
Genetic information from an 800,000-year-old human fossil has been retrieved for the first time. The results from the University of Copenhagen shed light on one of the branching points in the human family tree, reaching much further back in time than previously possible.

An important advancement in human evolution studies has been achieved after scientists retrieved the oldest human genetic data set from an 800,000-year-old tooth belonging to the hominin species Homo antecessor.

The findings by scientists from the University of Copenhagen (Denmark), in collaboration with colleagues from the CENIEH (National Research Center on Human Evolution) in Burgos, Spain, and other institutions, are published April 1st in Nature.

"Ancient protein analysis provides evidence for a close relationship between Homo antecessor, us (Homo sapiens), Neanderthals, and Denisovans. Our results support the idea that Homo antecessor was a sister group to the group containing Homo sapiens, Neanderthals, and Denisovans," says Frido Welker, Postdoctoral Research Fellow at the Globe Institute, University of Copenhagen, and first author on the paper.

Reconstructing the human family tree

By using a technique called mass spectrometry, researchers sequenced ancient proteins from dental enamel, and confidently determined the position of Homo antecessor in the human family tree.

The new molecular method, palaeoproteomics, developed by researchers at the Faculty of Health and Medical Sciences, University of Copenhagen, enables scientists to retrieve molecular evidence to accurately reconstruct human evolution from further back in time than ever before.

The human and chimpanzee lineages split from each other about 7-9 million years ago. Scientists have long sought to better understand the evolutionary relationships between our species and the others, all now extinct, in the human lineage.

"Much of what we know so far is based either on the results of ancient DNA analysis, or on observations of the shape and the physical structure of fossils. Because of the chemical degradation of DNA over time, the oldest human DNA retrieved so far is dated at no more than approximately 400,000 years," says Enrico Cappellini, Associate Professor at the Globe Institute, University of Copenhagen, and leading author on the paper.

"Now, the analysis of ancient proteins with mass spectrometry, an approach commonly known as palaeoproteomics, allow us to overcome these limits," he adds.

Theories on human evolution

The fossils analyzed by the researchers were found by palaeoanthropologist José María Bermúdez de Castro and his team in 1994 in stratigraphic level TD6 from the Gran Dolina cave site, one of the archaeological and paleontological sites of the Sierra de Atapuerca, Spain.

Initial observations led researchers to conclude that Homo antecessor was the last common ancestor of modern humans and Neanderthals, a conclusion based on the physical shape and appearance of the fossils. In the years since, the exact relationship between Homo antecessor and other human groups, like ourselves and Neanderthals, has been intensely debated among anthropologists.

Although the hypothesis that Homo antecessor could be the common ancestor of Neanderthals and modern humans is very difficult to fit into the evolutionary scenario of the genus Homo, new findings in TD6 and subsequent studies revealed several characters shared among the human species found in Atapuerca and the Neanderthals. In addition, new studies confirmed that the facial features of Homo antecessor are very similar to those of Homo sapiens and very different from those of the Neanderthals and their more recent ancestors.

"I am happy that the protein study provides evidence that the Homo antecessor species may be closely related to the last common ancestor of Homo sapiens, Neanderthals, and Denisovans. The features shared by Homo antecessor with these hominins clearly appeared much earlier than previously thought. Homo antecessor would therefore be a basal species of the emerging humanity formed by Neanderthals, Denisovans, and modern humans," adds José María Bermúdez de Castro, Scientific Co-director of the excavations in Atapuerca and co-corresponding author on the paper.

World-class expertise

Findings like these are made possible through an extensive collaboration between different research fields: from paleoanthropology to biochemistry, proteomics and population genomics.

Retrieval of ancient genetic material from the rarest fossil specimens requires top-quality expertise and equipment. This is the reason behind the now ten-year-long strategic collaboration between Enrico Cappellini and Jesper Velgaard Olsen, Professor at the Novo Nordisk Foundation Center for Protein Research, University of Copenhagen and co-author on the paper.

"This study is an exciting milestone in palaeoproteomics. Using state of the art mass spectrometry, we determine the sequence of amino acids within protein remains from Homo antecessor dental enamel. We can then compare the ancient protein sequences we 'read' to those of other hominins, for example Neanderthals and Homo sapiens, to determine how they are genetically related," says Jesper Velgaard Olsen.

Read more at Science Daily

Mar 31, 2020

How at risk are you of getting a virus on an airplane?

Fair or not, airplanes have a reputation for germs. However, there are ways to minimize the risks.

Prior research based on the group movements of humans and animals suggests three simple rules:

  • move away from those that are too close.
  • move toward those that are far away.
  • match the direction of movement of your neighbors.

This research is especially relevant to air travel, where there is an increased risk of contagious infection or disease, as with the recent worldwide outbreak of the coronavirus that causes COVID-19.

"Airlines use several zones in boarding," said Ashok Srinivasan, a professor in the Department of Computer Science at the University of West Florida. "When boarding a plane, people are blocked and forced to stand near the person putting luggage in the bin -- people are very close to each other. This problem is exacerbated when many zones are used. Deplaning is much smoother and quicker -- there isn't as much time to get infected."

Srinivasan is the principal investigator of new research on pedestrian dynamics models that has recently been used in the analysis of procedures to reduce the risk of disease spread in airplanes. The research was published in the journal PLOS ONE in March 2020.

For many years scientists have relied on the SPED (Self Propelled Entity Dynamics) model, a social force model that treats each individual as a point particle, analogous to an atom in molecular dynamics simulations. In such simulations, the attractive and repulsive forces between atoms govern their movement. The SPED model repurposes molecular dynamics code, replacing atoms with humans.

"[The SPED model] changes the values of the parameters that govern interactions between atoms so that they reflect interactions between humans, while keeping the functional form the same," Srinivasan said.
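The social-force idea behind SPED can be sketched in a few lines. This is not the actual SPED code (which builds on full molecular dynamics software); it is a minimal, hypothetical 1-D illustration in which a driving force relaxes each pedestrian toward a desired walking speed, while an exponential repulsion, playing the role of the interatomic force, keeps neighbors apart. All parameter values are illustrative.

```python
import numpy as np

def social_force_step(x, v, dt=0.05, v_desired=1.0, tau=0.5, A=2.0, B=0.3):
    """One update of a minimal 1-D social-force model of pedestrians in an aisle.

    x, v : positions (m) and velocities (m/s) of pedestrians along the aisle
    tau  : relaxation time toward the desired walking speed
    A, B : strength and range of the exponential repulsion between neighbors
    """
    # Driving force: relax toward the desired speed, like a damped particle.
    f = (v_desired - v) / tau
    # Pairwise exponential repulsion, analogous to short-range atomic forces.
    dx = x[:, None] - x[None, :]          # dx[i, j] = x_i - x_j
    np.fill_diagonal(dx, np.inf)          # no self-interaction (exp(-inf) = 0)
    f += np.sum(A * np.exp(-np.abs(dx) / B) * np.sign(dx), axis=1)
    v_new = np.clip(v + f * dt, 0.0, None)  # pedestrians don't walk backward
    return x + v_new * dt, v_new

# Five pedestrians queued 0.5 m apart, starting at rest.
x = np.arange(5) * 0.5
v = np.zeros(5)
for _ in range(200):
    x, v = social_force_step(x, v)
```

Because the front pedestrian feels only forward repulsion while those behind are held back, gaps open up as the queue starts moving, which is the qualitative behavior a boarding simulation needs to reproduce.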

Srinivasan and his colleagues used the SPED model to analyze the risk of an Ebola outbreak in 2015, which was widely covered in news outlets around the world. However, one limitation of the SPED model is that it is slow -- which makes it difficult to make timely decisions. Answers are needed fast in situations such as an outbreak like COVID-19.

The researchers decided there was a need for a model that could simulate the same applications as SPED, while being much faster. They proposed the CALM model (for constrained linear movement of individuals in a crowd). CALM produces similar results to SPED, but is not based on MD code. In other words, CALM was designed to run fast.

Like SPED, CALM was designed to simulate movement in narrow, linear passageways. The results of their research show that CALM performs almost 60 times faster than the SPED model. Apart from the performance gain, the researchers also modeled additional pedestrian behaviors.

"The CALM model overcame the limitations of SPED where real time decisions are required," Srinivasan said.

Computational Work Using Frontera

The scientists designed the CALM model from scratch so it could run efficiently on computers, especially on GPUs (graphics processing units).

For their research, Srinivasan and colleagues used Frontera, the #5 most powerful supercomputer in the world and the fastest academic supercomputer, according to the November 2019 rankings of the Top500 organization. Frontera is located at the Texas Advanced Computing Center and supported by the National Science Foundation.

"Once Blue Waters started being phased out, Frontera was the natural choice, given that it was the new NSF-funded flagship machine," Srinivasan said. "One question you have is whether you have generated a sufficient number of scenarios to cover the range of possibilities. We check this by generating histograms of quantities of interest and seeing if the histogram converges. Using Frontera, we were able to perform sufficiently large simulations that we now know what a precise answer looks like."

In practice, it isn't feasible to make precise predictions due to inherent uncertainties, especially at the early stages of an epidemic -- this is what makes the computational aspect of this research challenging.

"We needed to generate a large number of possible scenarios to cover the range of possibilities. This makes it computationally intensive," Srinivasan said.
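The convergence check Srinivasan describes can be sketched as follows. The scenario generator below is a stand-in (a skewed random draw, not a pedestrian simulation); the point is the test itself: compare the histogram from one batch of scenarios against the histogram after adding many more, and declare convergence when it barely changes.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_scenario(rng):
    # Placeholder for one pedestrian-dynamics run: a hypothetical
    # "number infected" from a skewed distribution stands in for the
    # quantity of interest that a real simulation would produce.
    return rng.poisson(rng.gamma(shape=2.0, scale=1.5))

def histogram(samples, bins):
    counts, _ = np.histogram(samples, bins=bins, density=True)
    return counts

bins = np.arange(0, 21)
small = [simulate_scenario(rng) for _ in range(1_000)]
large = small + [simulate_scenario(rng) for _ in range(9_000)]

# If the histogram has converged, adding more scenarios barely changes it.
drift = np.max(np.abs(histogram(small, bins) - histogram(large, bins)))
converged = drift < 0.02
```

In practice the batch sizes and the drift threshold would be tuned to the precision the decision requires; the structure of the check stays the same.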

The team validated their results by examining disembarkation times on three different types of airplanes. Since a single simulation doesn't capture the variety of human movement patterns, they performed simulations with 1,000 different combinations of values and compared it to the empirical data.

Using Frontera's GPU subsystem, the researchers were able to get the computation time down to 1.5 minutes. "Using the GPUs turned out to be a fortunate choice because we were able to deploy these simulations in the COVID-19 emergency. The GPUs on Frontera are a means of generating answers fast."

But Wait -- Models Don't Capture Extreme Events?

In terms of general preparation, Srinivasan wants people to understand that scientific models often don't capture extreme events accurately.

Though there have been thorough empirical studies on several flights to understand human behavior and the cleanliness of surfaces and air, a major infection outbreak is an extreme event -- data from typical situations may not capture it.

There are about 100,000 flights on an average day. A very low probability event could lead to frequent infection outbreaks just because the number of flights is so large. Although models have predicted infection transmission in planes as unlikely, there have been several known outbreaks.

Srinivasan offers an example.

"It's generally believed that infection spread in planes happens two rows in front and back of the index patient," he said. "During the SARS outbreak in 2002, on the few flights with infection spread, this was mostly true. However, a single outbreak accounted for more than half the cases, and half of the infected were seated farther than two rows away on that flight. One might be tempted to look at this outbreak as an outlier. But the 'outlier' had the most impact, and so people farther than two rows away accounted for a significant number of people infected with SARS on flights."

Currently, with regard to COVID-19, the typical infected person is believed to sicken 2.5 others. However, there have been communities where a single 'super-spreader' infected a large number of people and played the driving role in an outbreak. The impact of such extreme events, and the difficulty in modeling them accurately, makes prediction difficult, according to Srinivasan.
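The gap between an average of 2.5 secondary cases and super-spreader-driven outbreaks can be illustrated with an overdispersed offspring distribution. The dispersion value below is a common modeling choice for super-spreading, not a figure from this article:

```python
import numpy as np

rng = np.random.default_rng(42)
R0, k = 2.5, 0.1   # mean secondary cases; small k means strong super-spreading

def secondary_cases(n, rng):
    # Negative-binomial offspring distribution with mean R0 = n*(1-p)/p:
    # most cases infect nobody, while a rare few infect dozens.
    return rng.negative_binomial(n=k, p=k / (k + R0), size=n)

draws = secondary_cases(100_000, rng)
mean_cases = draws.mean()                        # close to R0 = 2.5
share_zero = (draws == 0).mean()                 # most index cases infect no one
top_1pct = np.sort(draws)[-1_000:].sum() / draws.sum()  # transmission share of top 1%
```

Even though the mean stays near 2.5, the top one percent of cases account for a disproportionate share of all transmission, which is why averaging over typical flights can miss the outbreaks that matter most.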

"In our approach, we don't aim to accurately predict the actual number of cases," Srinivasan said. "Rather, we try to identify vulnerabilities in different policy or procedural options, such as different boarding procedures on a plane. We generate a large number of possible scenarios that could occur and examine whether one option is consistently better than the other. If it is, then it can be considered more robust. In a decision-making setting, one may wish to choose the more robust option, rather than rely on expected values from predictions."

Read more at Science Daily

What are you looking at? 'Virtual' communication in the age of social distancing

From health care to education to media, social distancing across the globe due to coronavirus (COVID-19) has created the need to conduct business "virtually" using Skype, web conferencing, FaceTime and any other means available. With this expansive use of mobile and video devices, now more than ever, it is important to understand how the use of these technologies may impact communication. But are all forms of online communication alike?

In a first-of-its-kind study, neuroscientists from Florida Atlantic University demonstrate that a person's gaze is altered during telecommunication if they think that the person on the other end of the conversation can see them. People are very sensitive to the gaze direction of others, and even 2-day-old infants prefer faces where the eyes are looking directly back at them. The phenomenon known as "gaze cueing," a powerful signal for orienting attention, is a mechanism that likely plays a role in the developmentally and socially important wonder of "shared" or "joint" attention, where a number of people attend to the same object or location. The ability to do this is what makes humans unique among primates.

Throughout almost all of human history, conversations were generally conducted face-to-face, so people knew where their conversational partner was looking and vice versa. Now, with virtual communication, that assumption no longer holds -- sometimes people communicate with both cameras on while other times only the speaker may be visible. The researchers set out to determine whether being observed affects people's behavior during online communication.

For the study, published in the journal Attention, Perception & Psychophysics, co-authors Elan Barenholtz, Ph.D., an associate professor of psychology, a member of the Center for Complex Systems and Brain Sciences in FAU's Charles E. Schmidt College of Science and a member of FAU's Brain Institute (I-BRAIN), and Michael H. Kleiman, Ph.D., a postdoctoral researcher at FAU, compared fixation behavior in 173 participants under two conditions: one in which the participants believed they were engaging in a real-time interaction and one in which they knew they were watching a pre-recorded video.

The researchers wanted to know if face fixation would increase in the real-time condition based on the social expectation of facing one's speaker in order to get attention or if it would lead to greater face avoidance, based on social norms as well as the cognitive demands of encoding the conversation.

Similarly, they wanted to know where participants would fixate on the face. Would it be the eyes more in the real-time condition because of social demands to make eye contact with one's speaker? Or, in the pre-recorded condition, where the social demands to make eye contact are eliminated, would participants spend more time looking at the mouth in order to encode the conversation, which is consistent with previous studies showing greater mouth fixations during an encoding task.

Results of the study showed that participants fixated on the whole face in the real-time condition and significantly less in the pre-recorded condition. In the pre-recorded condition, time spent fixating on the mouth was significantly greater compared to the real-time condition.

There were no significant differences in time spent fixating on the eyes between the real-time and the pre-recorded conditions. These findings may suggest that participants are more comfortable looking directly at the mouth of a speaker -- which has previously been found to be optimal for encoding speech -- when they think that no one is watching them.

To simulate a live interaction, the researchers convinced participants that they were engaging in a real-time, two-way video interaction (it was actually pre-recorded) in which they could be seen and heard by the speaker, as well as a pre-recorded interaction where they knew the video was previously recorded and therefore the speaker could not see their behavior.

"Because gaze direction conveys so much socially relevant information, one's own gaze behavior is likely to be affected by whether one's eyes are visible to a speaker," said Barenholtz. "For example, people may intend to signal that they are paying more attention to a speaker by fixating their face or eyes during a conversation. Conversely, extended eye contact also can be perceived as aggressive and therefore noticing one's eyes could lead to reduced direct fixation of another's face or eyes. Indeed, people engage in avoidant eye movements by periodically breaking and reforming eye contact during conversations."

There was a highly significant tendency for participants engaging in perceived real-time interaction to display greater avoidant fixation behavior, which supports the idea that social contexts draw fixations away from the face compared to when social context is not a factor. When the face was fixated, attention was directed toward the mouth for the greater percentage of time in the pre-recorded condition versus the real-time condition. The lack of difference in time spent fixating the eyes suggests that the additional mouth fixations in the pre-recorded condition did not come at the cost of reduced eye fixation and must have derived from reduced fixations elsewhere on the face.

Comparisons between total fixation durations of the eyes versus the mouth were calculated for both the real-time and pre-recorded conditions, with the eyes of both conditions being significantly more fixated than the mouth. Gender, age, cultural background, and native language did not have an influence on fixation behavior across conditions.

Read more at Science Daily

The placebo effect and psychedelic drugs: Tripping on nothing?

There has been a lot of recent interest in the use of psychedelic drugs to treat depression. A new study from McGill suggests that, in the right context, some people may experience psychedelic-like effects from placebos alone. The researchers reported some of the strongest placebo effects (these are effects from "fake" medication) on consciousness in the literature relating to psychedelic drugs. Indeed, 61% of the participants in the experiment reported some effect after consuming the placebo.

"The study reinforces the power of context in psychedelic settings. With the recent re-emergence of psychedelic therapy for disorders such as depression and anxiety, clinicians may be able to leverage these contextual factors to obtain similar therapeutic experiences from lower doses, which would further improve the safety of the drugs," said Jay Olson, a Ph.D. candidate in McGill's Department of Psychiatry and the lead author on the research paper that was recently published in Psychopharmacology.

Setting the mood

Participants, who were expecting to take part in a study of the effects of drugs on creativity, spent four hours together in a room that had been set up to resemble a psychedelic party, with paintings, coloured lights and a DJ. To make the context seem credible and hide the deception, the study also involved ten research assistants in white lab coats, psychiatrists, and a security guard.

The 33 participants had been told they were being given a drug which resembled the active ingredient in psychedelic mushrooms and that they would experience changes in consciousness over the 4-hour period. In reality, everyone consumed a placebo. Among the participants were several actors who had been trained to slowly act out the effects of the ostensible drug. The researchers thought that this would help convince the participants that everyone had consumed a psychedelic drug and might lead them to experience placebo effects.

Strong effects for a placebo

When asked near the end of the study, the majority (61%) of the participants reported some effect of the drug, ranging from mild changes to effects resembling taking a moderate or high dose of an actual drug, though there was considerable individual variation. For example, several participants stated that they saw the paintings on the walls "move" or "reshape" themselves. Others described themselves as feeling "heavy... as if gravity [had] a stronger hold," and one had a "come down" before another "wave" hit her. Several participants reported being certain that they had taken a psychedelic drug.

Read more at Science Daily

A Martian mash up: Meteorites tell story of Mars' water history

In Jessica Barnes' palm is an ancient, coin-sized mosaic of glass, minerals and rocks as thick as a strand of wool fiber. It is a slice of Martian meteorite, known as Northwest Africa 7034 or Black Beauty, that was formed when a huge impact cemented together various pieces of Martian crust.

Barnes is an assistant professor of planetary sciences in the University of Arizona Lunar and Planetary Laboratory. She and her team chemically analyzed the Black Beauty meteorite and the infamous Allan Hills 84001 meteorite -- controversial in the 1990s for allegedly containing Martian microbes -- to reconstruct Mars' water history and planetary origins.

Their analysis, published today in Nature Geoscience, showed that Mars likely received water from at least two vastly different sources early in its history. The variability the researchers found implies that Mars, unlike Earth and the moon, never had an ocean of magma completely encompassing the planet.

"These two different sources of water in Mars' interior might be telling us something about the kinds of objects that were available to coalesce into the inner, rocky planets," Barnes said. Two distinct planetesimals with vastly different water contents could have collided and never fully mixed. "This context is also important for understanding the past habitability and astrobiology of Mars."

Reading the Water

"A lot of people have been trying to figure out Mars' water history," Barnes said. "Like, where did water come from? How long was it in the crust (surface) of Mars? Where did Mars' interior water come from? What can water tell us about how Mars formed and evolved?"

Barnes and her team were able to piece together Mars' water history by looking for clues in two types, or isotopes, of hydrogen. One hydrogen isotope contains one proton in its nucleus; this is sometimes called "light hydrogen." The other isotope is called deuterium, which contains a proton and a neutron in the nucleus; this is sometimes referred to as "heavy hydrogen." The ratio of these two hydrogen isotopes signals to a planetary scientist the processes and possible origins of water in the rocks, minerals and glasses in which they're found.

Meteorite Mystery

For about 20 years, researchers have been recording the isotopic ratios from Martian meteorites, and their data were all over the place. There seemed to be little trend, Barnes said.

Water locked in Earth rocks is what's called unfractionated, meaning it doesn't deviate much from the standard reference value of ocean water -- a 1:6,420 ratio of heavy to light hydrogen. Mars' atmosphere, on the other hand, is heavily fractionated -- it is mostly populated by deuterium, or heavy hydrogen, likely because the solar wind stripped away the light hydrogen. Measurements from Martian meteorites -- many of which were excavated from deep within Mars by impact events -- ran the gamut between Earth and Mars' atmosphere measurements.
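Planetary scientists usually express these comparisons in delta notation, which reports how far a sample's heavy-to-light hydrogen ratio deviates from the ocean-water standard. The sketch below uses the 1:6,420 standard ratio from the article; the sample ratios are illustrative placeholders, not measured values from the study.

```python
# Delta-D notation: a sample's D/H ratio expressed as a per-mil
# deviation from the ocean-water standard (VSMOW).

VSMOW_D_H = 1.0 / 6420.0  # standard heavy-to-light hydrogen ratio of ocean water


def delta_d(sample_d_h: float) -> float:
    """Delta-D in per mil: how far a sample's D/H deviates from VSMOW."""
    return (sample_d_h / VSMOW_D_H - 1.0) * 1000.0


# An unfractionated, Earth-like rock sits near 0 per mil...
earth_like = delta_d(1.0 / 6420.0)

# ...while a deuterium-enriched sample (here a hypothetical ratio five
# times the standard, in the range reported for Mars' atmosphere)
# plots far above it, around +4000 per mil.
atmosphere_like = delta_d(5.0 / 6420.0)
```

On this scale, the Martian meteorite measurements described above would scatter between the near-zero Earth-like value and the strongly positive atmospheric value.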

Barnes' team set out to investigate the hydrogen isotope composition of the Martian crust specifically by studying samples they knew originated from the crust: the Black Beauty and Allan Hills meteorites. Black Beauty was especially helpful because it's a mashup of surface material from many different points in Mars' history.

"This allowed us to form an idea of what Mars' crust looked like over several billions of years," Barnes said.

The isotopic ratios of the meteorite samples fell about midway between the value for Earth rocks and Mars' atmosphere. When the researchers compared their findings with previous studies, including results from the Curiosity rover, it appeared that this had been the case for most of Mars' 4-billion-plus-year history.

"We thought, ok this is interesting, but also kind of weird," Barnes said. "How do we explain this dichotomy where the Martian atmosphere is being fractionated, but the crust is basically staying the same over geological time?"

Barnes and her colleagues also grappled with trying to explain why the crust seemed so different from the Martian mantle, the rock layer that lies below it.

"If you try and explain this fairly constant isotopic ratio of Mars' crust, you really can't use the atmosphere to do that," Barnes said. "But we know how crusts are formed. They're formed from molten material from the interior that solidifies on the surface."

"The prevailing hypothesis before we started this work was that the interior of Mars was more Earthlike and unfractionated, and so the variability in hydrogen isotope ratios within Martian samples was due to either terrestrial contamination or atmospheric implantation as it made its way off Mars," Barnes said.

The idea that Mars' interior was Earthlike in composition came from one study of a Martian meteorite thought to have originated from the mantle -- the interior between the planet's core and its surface crust.

However, Barnes said, "Martian meteorites basically plot all over the place, and so trying to figure out what these samples are actually telling us about water in the mantle of Mars has historically been a challenge. The fact that our data for the crust was so different prompted us to go back through the scientific literature and scrutinize the data."

The researchers found that two geochemically different types of Martian volcanic rocks -- enriched shergottites and depleted shergottites -- contain water with different hydrogen isotope ratios. Enriched shergottites contain more deuterium than the depleted shergottites, which are more Earth-like.

"It turns out that if you mix different proportions of hydrogen from these two kinds of shergottites, you can get the crustal value," Barnes said.

Read more at Science Daily

Mar 30, 2020

How animals understand numbers influences their chance of survival

While they can't pick out precise numbers, animals can comprehend that more is, well, more. From birds to bees and wolves to frogs, animals use numbers to hunt, find a mate, return to their home, and more -- and researchers believe that this ability to process and represent numbers, known as numerical competence, plays an important role in how animals make these decisions and influences an animal's chance of survival. In a Review publishing March 30 in the journal Trends in Ecology and Evolution, Andreas Nieder, a neurobiologist at the University of Tuebingen, Germany, explores the current literature on how different animal species comprehend numbers and the impact on their survival, arguing that we won't fully understand the influence of numerical competence unless we study it directly.

"Interestingly, we know now that numerical competence is present on almost every branch on the animal tree of life," says Nieder, who works with different animal species to explore how trained animals discriminate and represent numbers as well as how numbers are represented in the brain. "Different groups of animals obviously developed this trait independently from other lineages and that strongly indicates that it has to be of adaptive value. So the capability to discriminate numbers has to have a strong survival benefit and reproduction benefit."

Honeybees, for instance, can remember the number of landmarks they pass when searching for food in order to find their way back to the hive. "The last common ancestor between honeybees and us primates lived about 600 million years ago," he says. "But still, they evolved numerical competence that, in many respects, is comparable to vertebrate numerical competence."

This can also be seen in animals choosing a larger amount of food over a small amount or in animals forming hunting alliances. Wolves are more likely to hunt successfully if they have the right number of wolves in their pack for the size of their prey: with prey like elk and moose, only around six to eight wolves are needed, while hunting bison requires a pack of nine to thirteen. Their prey also use this concept to protect themselves from predators -- elk tend to live in smaller herds, which rarely have encounters with wolves, or gather in large herds to reduce the chance of any individual becoming prey. "So obviously they are assessing the number of individuals in their groups for their everyday life situations," Nieder says.

Furthermore, it has been shown that numerical competence even plays a role in attracting a mate. For example, male frogs sing "advertisement" calls to attract females. The females, listening for the complexity of their calls, choose the male that sings the most "chucks" in their mating call. Even once they've attracted a mate, species like the mealworm beetle and the cowbird use numerical competence to increase the likelihood of having offspring.

Despite these many examples of numerical competence in animals, this subject has not gotten many first-hand studies. "Many of these behavioral findings in the wild have usually been collected as by-products or accidental findings of other research questions," says Nieder.

Researchers do have some sense of the rules that govern numerical competence in animals, including that they count approximately rather than specifically and that two numbers need to be more different for them to tell them apart as those numbers get bigger -- and it does seem apparent that those abilities are adaptive. However, Nieder argues that more research needs to be done to fully understand the selective pressures and fitness payoffs of numerical competence.
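The ratio-dependent discrimination described above (larger quantities must differ by a larger absolute amount to be told apart) can be sketched as a threshold on the ratio of the two quantities. The 0.8 threshold below is an illustrative Weber fraction, not a figure from the review.

```python
# Approximate number discrimination: whether two quantities can be told
# apart depends on their ratio, not their absolute difference.


def can_discriminate(a: int, b: int, weber_fraction: float = 0.8) -> bool:
    """True if the smaller/larger ratio is at or below the threshold."""
    small, large = sorted((a, b))
    return small / large <= weber_fraction


can_discriminate(4, 5)    # True: a difference of 1 suffices at small numbers
can_discriminate(40, 41)  # False: the same difference of 1 fails at larger numbers
can_discriminate(40, 50)  # True: the same 4:5 ratio works at any magnitude
```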

Read more at Science Daily

How social media makes it difficult to identify real news

There's a price to pay when you get your news and political information from the same place you find funny memes and cat pictures, new research suggests.

The study found that people viewing a blend of news and entertainment on a social media site tended to pay less attention to the source of content they consumed -- meaning they could easily mistake satire or fiction for real news.

People who viewed content that was clearly separated into categories -- such as current affairs and entertainment -- didn't have the same issues evaluating the source and credibility of content they read.

The findings show the dangers of people getting their news from social media sites like Facebook or Twitter, said study author George Pearson, a senior lecturer and research associate in communication at The Ohio State University.

"We are drawn to these social media sites because they are one-stop shops for media content, updates from friends and family, and memes or cat pictures," Pearson said.

"But that jumbling of content makes everything seem the same to us. It makes it harder for us to distinguish what we need to take seriously from that which is only entertainment."

The study appears online in the journal New Media & Society.

For the study, Pearson created a fictional social media site called "Link Me." The 370 participants saw four webpages with either two or four posts each. Each post consisted of a headline and short paragraph summarizing the story, as well as information on the source of the post.

The sources were designed to be either high or low credibility, based on their name and description. (The sources' credibility was tested in a previous study to make sure people understood.)

For example, one high-credibility source was called "Washington Daily News" and was described as a "professional news organization renowned for high-quality and objective journalism."

One low-credibility source in the study was called "Hot Moon" and described as "a collective of nonprofessional writers."

All posts were based on real articles or public social media posts taken from Reddit or Tumblr.

After viewing the site, participants were asked a variety of questions. Pearson was most interested in whether they paid more attention to the posts about current affairs topics than those in other categories, such as entertainment.

"That would suggest that they were paying attention to the sources of the posts and understanding what was news and what was not," Pearson said.

The results showed that when the content was not grouped by distinct topics -- in other words, news posts appeared on the same page with entertainment posts -- participants reported paying less attention to the source of the content.

"They were less likely to verify source information to ensure that it was a credible source," he said.

That may be one reason why satirical and other types of fake news get shared by people who evidently think it is real, Pearson said.

For example, in 2018 the website React365 posted an article about a cruise ship disaster in Mexico that killed at least 32 people. The article generated more than 350,000 engagements on Facebook.

The misinformation was quickly debunked by Snopes.com, which noted that React365's homepage clearly showed it was a prank website where people could upload their own fictitious stories.

Pearson said one of the problems is that many social media sites present content in the exact same way, no matter the source.

"There is no visual distinction on Facebook between something from the New York Times and something from a random blog. They all have the same color scheme, same font," he said.

One solution would be for social media companies to develop tools to distinguish content.

But until that happens, it is up to users to pay more attention to where their news is coming from -- as difficult as that may be, Pearson said.

Read more at Science Daily

Lessons from the Spanish flu: Early restrictions lowered disease, mortality rates

Large events are cancelled, restaurants and non-essential businesses are closed, and in many states, residents have been asked to shelter in place, all to limit the spread and impact of the COVID-19 virus. But are strict and early isolation and other preventative mandates really effective in minimizing the spread and impact of a disease outbreak?

Stefan E. Pambuccian, MD, a Loyola Medicine cytologist, surgical pathologist and professor and vice chair of the Department of Pathology and Laboratory Medicine at Loyola University Chicago Stritch School of Medicine, has reviewed published data and research from three papers dating back to the 1918-19 Spanish flu pandemic, which infected one-fifth to one-third of the world's population and killed 50 million people.

According to the data and analysis, cities that adopted early, broad isolation and prevention measures -- closing of schools and churches, banning of mass gatherings, mandated mask wearing, case isolation and disinfection/hygiene measures -- had lower disease and mortality rates. These cities included San Francisco, St. Louis, Milwaukee and Kansas City, which collectively had 30% to 50% lower disease and mortality rates than cities that enacted fewer and later restrictions. One analysis showed that these cities also had greater delays in reaching peak mortality, and the duration of these measures correlated with a reduced total mortality burden.

"The stricter the isolation policies, the lower the mortality rate," says Dr. Pambuccian. He studied the Spanish flu, including prevention measures and outcomes, to help develop standards for staffing and safety in the cytology lab, where infectious diseases like the COVID-19 virus are diagnosed and studied at the cellular level. His broader article appeared online this week in the Journal of the American Society of Cytopathology.

Like today, not everyone in 1918 and 1919 thought the strict measures were appropriate or effective at the time.

An estimated 675,000 people died in the U.S. from the Spanish flu, "and there was skepticism that these policies were actually working," says Dr. Pambuccian. "But they obviously did make a difference."

In 1918, the world was still at war "with overcrowded barracks," and much of the U.S. lived with "poverty, poor nutrition, poor hygiene, household/community-level crowding, and a lack of preparation of the population and decision makers due to cognitive inertia and poor medical and insufficient nursing care," says Dr. Pambuccian.

Read more at Science Daily

New research sheds light on potentially negative effects of cannabis

Coughing fits, anxiety and paranoia are three of the most common adverse reactions to cannabis, according to a recent study by Washington State University researchers.

The researchers surveyed more than 1,500 college students on the type and frequency of adverse reactions they had experienced while using cannabis for their study in the Journal of Cannabis Research. They also collected information on the students' demographics, personality traits, cannabis use patterns and motives for using the drug.

"There's been surprisingly little research on the prevalence or frequency of various adverse reactions to cannabis and almost no research trying to predict who is more likely to experience these types of adverse reactions," said Carrie Cuttler, assistant professor of psychology and an author on the paper. "With the legalization of cannabis in Washington and 10 other states, we thought it would be important to document some of this information so that more novice users would have a better sense of what types of adverse reactions they may experience if they use cannabis."

More than 50% of the study participants reported having experienced coughing fits, anxiety and/or paranoia while using cannabis. On the other end of the spectrum, the three least-common reported reactions were fainting/passing out, non-auditory/visual hallucinations and cold sweats.

The researchers found the most frequently occurring adverse reactions were coughing fits, chest/lung discomfort and body humming, which a subset of the study group reported experiencing approximately 30-40% of the time they used cannabis.

Panic attacks, fainting and vomiting were considered the most distressing of the 26 possible adverse reactions.

"It is worth noting even the most distressing reactions to cannabis were only rated between moderately' and quite distressing," Cuttler said. "This suggests cannabis users do not, in general, find acute adverse reactions to cannabis to be severely distressing."

The least distressing reactions were reported to be body humming, numbness and feeling off balance/unsteady, the researchers found.

The study showed less frequent users are more likely to report negative effects. Additionally, individuals who reported using cannabis to try to fit in with friends, displayed cannabis use disorder symptoms or had anxiety sensitivity -- a tendency to imagine the worst possible outcome -- were more likely to report adverse reactions as well as experiencing a greater amount of distress.

"Interestingly, we didn't find that quantity of use during a single session predicted very much in terms of whether or not a person was going to have a bad reaction," Cuttler said. "It was the people who smoke on a less frequent basis who tend to have these bad experiences more often."

Moving forward, Cuttler hopes the results of the study will be put to use by doctors, medical cannabis distributors and even budtenders to give people a better idea of what could go wrong when they get high.

Read more at Science Daily

Mar 29, 2020

Some COVID-19 patients still have coronavirus after symptoms disappear

In a new study, researchers found that half of the patients they treated for mild COVID-19 infection still had coronavirus for up to eight days after symptoms disappeared. The research letter was published online in the American Thoracic Society's American Journal of Respiratory and Critical Care Medicine.

In "Time Kinetics of Viral Clearance and Resolution of Symptoms in Novel Coronavirus Infection," Lixin Xie, MD, Lokesh Sharma, PhD, and co-authors report on a study of 16 patients with COVID-19, who were treated and released from the Treatment Center of PLA General Hospital in Beijing between January 28 and Feb. 9, 2020. Patients studied had a median age of 35.5 years.

Researchers collected throat swab samples from all patients on alternate days and analyzed them. Patients were discharged after their recovery and confirmation of negative viral status by at least two consecutive polymerase chain reaction (PCR) tests.

"The most significant finding from our study is that half of the patients kept shedding the virus even after resolution of their symptoms," said co-lead author Dr. Sharma, instructor of medicine, Section of Pulmonary, Critical Care & Sleep Medicine, Department of Medicine, Yale School of Medicine. "More severe infections may have even longer shedding times."

The primary symptoms in these patients included fever, cough, pain in the pharynx (pharyngalgia) and difficult or labored breathing (dyspnea). Patients were treated with a range of medications.

The time from infection to onset of symptoms (incubation period) was five days among all but one patient. The average duration of symptoms was eight days, while the length of time patients remained contagious after the end of their symptoms ranged from one to eight days. Two patients had diabetes and one had tuberculosis, neither of which affected the timing of the course of COVID-19 infection.

"If you had mild respiratory symptoms from COVID-19 and were staying at home so as not to infect people, extend your quarantine for another two weeks after recovery to ensure that you don't infect other people," recommended corresponding author Lixin Xie, MD, professor, College of Pulmonary and Critical Care Medicine, Chinese PLA General Hospital, Beijing.

The authors had a special message for the medical community: "COVID-19 patients can be infectious even after their symptomatic recovery, so treat the asymptomatic/recently recovered patients as carefully as symptomatic patients."

The researchers emphasized that all of these patients had milder infections and recovered from the disease, and that the study looked at a small number of patients. They noted that it is unclear whether similar results would hold true for more vulnerable patients such as the elderly, those with suppressed immune systems and patients on immunosuppressive therapies.

Read more at Science Daily

In Earth's largest extinction, land animal die-offs began long before marine extinction

The mass extinction at the end of the Permian Period 252 million years ago -- one of the great turnovers of life on Earth -- appears to have played out differently and at different times on land and in the sea, according to newly redated fossil beds from South Africa and Australia.

New ages for fossilized vertebrates that lived just after the demise of the fauna that dominated the late Permian show that the ecosystem changes began hundreds of thousands of years earlier on land than in the sea, eventually resulting in the demise of up to 70% of terrestrial vertebrate species. The later marine extinction, in which nearly 95% of ocean species disappeared, may have occurred over the time span of tens of thousands of years.

Though most scientists believe that a series of volcanic eruptions, occurring in large pulses over a period of a million years in what is now Siberia, were the primary cause of the end-Permian extinction, the lag between the land extinction in the Southern Hemisphere and the marine extinction in the Northern Hemisphere suggests different immediate causes.

"Most people thought that the terrestrial collapse started at the same time as the marine collapse, and that it happened at the same time in the Southern Hemisphere and in the Northern Hemisphere," said paleobotanist Cindy Looy, University of California, Berkeley, associate professor of integrative biology. "The fact that the big changes were not synchronous in the Northern and Southern hemispheres has a big effect on hypotheses for what caused the extinction. An extinction in the ocean does not, per se, have to have the same cause or mechanism as an extinction that happened on land."

Members of Looy's lab have conducted experiments on living plants to determine whether a collapse of Earth's protective ozone layer may have irradiated and wiped out plant species. Other global changes -- a warming climate, a rise in carbon dioxide in the atmosphere and an increase in ocean acidification -- also occurred around the end of the Permian period and the beginning of the Triassic and likely contributed.

On land, the end-Permian extinction of vertebrates is best documented in Gondwana, the southern half of the supercontinent known as Pangea that eventually separated into the continents we know today as Antarctica, Africa, South America and Australia. There, in the South African Karoo Basin, populations of large herbivores, or plant eaters, shifted from the Daptocephalus assemblage to the Lystrosaurus assemblage. These groups are now extinct.

In the ocean, the extinction is best documented in the Northern Hemisphere, in particular by Chinese fossils. The end-Permian extinction is perhaps best associated with the demise of trilobites.

To improve on previous dates for the land extinction, an international team of scientists, including Looy, conducted uranium-lead dating of zircon crystals in a well-preserved volcanic ash deposit from the Karoo Basin. Looy, who is also a curator of paleobotany at the campus's Museum of Paleontology and curator of gymnosperms at the University and Jepson Herbaria, confirmed that sediments from several meters above the dated layer were devoid of Glossopteris pollen, evidence that these seed ferns, which used to dominate late Permian Gondwanan floras, became extinct around that time.

At 252.24 million years old, the zircons -- microscopic silicate crystals that form in rising magma inside volcanoes and are spewed into the atmosphere during eruptions -- are 300,000 years older than dates obtained for the confirmed Permian-Triassic (P-T) boundary in China. This means that the sediment layer assumed to contain the P-T boundary in South Africa was actually at least 300,000 years too old.

Dates for an ash deposit in Australia, just above the layers that document the initial plant extinction, similarly came in almost 400,000 years older than thought. That work was published in January by Christopher Fielding and colleagues at the University of Nebraska in Lincoln.

"The Karoo Basin is the poster child for the end-Permian vertebrate turnover, but until recently, it was not well-dated," Looy said. "Our new zircon date shows that the base of the Lystrosaurus zone predates the marine extinction with several hundred thousand years, similar to the pattern in Australia. This means that both the floral and faunal turnover in Gondwana is out of sync with the Northern Hemisphere marine biotic crisis.

"For some years now, we have known that -- in contrast to the marine mass extinction -- the pulses of disturbance of life on land continued deep into the Triassic Period. But that the start of the terrestrial turnover happened so long before the marine extinction was a surprise."

Read more at Science Daily