Jun 5, 2021

Did heat from impacts on asteroids provide the ingredients for life on Earth?

A research group from Kobe University has demonstrated that the heat generated by the impact of a small astronomical body could enable aqueous alteration and organic solid formation to occur on the surface of an asteroid. They achieved this by first conducting high-velocity impact cratering experiments using an asteroid-like target material and measuring the post-impact heat distribution around the resulting crater. From these results, they then established a rule-of-thumb for maximum temperature and the duration of the heating, and developed a heat conduction model from this.

The research group consisted of the following members from Kobe University's Graduate School of Science: Lecturer YASUI Minami, TAZAWA Taku (a second-year master's student at the time of the research), HASHIMOTO Ryohei (then a fourth-year undergraduate in the Faculty of Science) and Professor ARAKAWA Masahiko, in addition to JAXA Space Exploration Center's Associate Senior Researcher OGAWA Kazunori (who was a technical specialist at Kobe University at the time of the study).

These results have expanded the spatial and temporal range over which the necessary conditions for aqueous alteration and organic solid formation could occur. This is expected to significantly increase the number of prospective astronomical bodies that could have brought water and the origins of life to Earth.

These research results were published in the British scientific journal Communications Earth and Environment (Nature Publishing Group) on May 18, 2021.

Main Points
 

  • The researchers used porous gypsum as an imitation asteroid and inserted multiple thermocouples inside it. They conducted high-velocity impact experiments on this target at impact speeds of 1 km/s and over, and succeeded in measuring how the temperature around the resulting crater changed over time shortly after impact.
  • This revealed that, regardless of the impact speed and the projectile's size and density, the maximum temperature and the duration of heating depended only on the dimensionless distance (the distance from the impact point scaled by the crater radius).
  • Using the above results, the researchers calculated how the heat distribution around a crater formed on an asteroid's surface changes over time. These calculations suggested that, at distances within 2 astronomical units (au) of the Sun, aqueous alteration can occur if the crater has a radius of over 20 km, and organic solid formation can be supported by craters of over 1 km.
  • These findings will enable an increased number of astronomical bodies to be considered as candidates for the source of the water and organic substances necessary for the beginning of life on Earth.


Research Background

It is believed that the water and organic substances necessary for life to begin on Earth were the result of a comet or asteroid impacting the planet. Minerals and organic substances that have experienced aqueous alteration have been discovered in meteorites (which originate from asteroids), providing proof that these bodies once contained water. However, a heat source is necessary for the chemical reactions that cause aqueous alteration and organic solid formation inside asteroids.

One sufficiently strong heat source is the radioactive decay heating of 26Al (aluminum-26), a short-lived radioactive nuclide found inside rocks. However, because of the short half-life of 26Al (720,000 years), this radioactive heating could only have driven aqueous alteration and organic solid formation on asteroid parent bodies at the very beginning of the solar system's history.

In recent years, the theory that the impact heat generated when a small astronomical body hits an asteroid could also be a viable heat source has started to gain attention. However, it was not known how much heat is generated for a given set of impactor characteristics (size, density, impact speed), nor how far within the asteroid this heat is transmitted. Until now, no studies had experimentally investigated this heat generation and propagation process to determine whether aqueous alteration and organic solid formation would be possible.

Research Methodology

This research group conducted laboratory experiments to investigate the relationship between the impact heat generated on an asteroid (as a result of a small astronomical body's impact) and the characteristics of the impact. For the target, they used gypsum (a porous mineral composed of calcium sulfate dihydrate) to imitate an asteroid. They fired projectiles at the target at high impact velocities of between 1 km/s and 5 km/s using Kobe University's two-stage horizontal gas gun. Multiple thermocouples were set in the gypsum target in order to measure the temperature changes after impact. In this series of experiments, the researchers varied the projectiles' size, density and impact speed, as well as the thermocouples' positions, in order to investigate how the temperature history depends on the characteristics of the impact.

From the measured temperature histories, the research group determined the maximum temperature and the duration of heating, and looked at how these related to the impact characteristics. By using the dimensionless distance, obtained by normalizing the distance from the impact point (where the projectile hit the target) by the crater radius, they successfully determined how the maximum temperature and its duration vary with the impact characteristics and derived a rule of thumb for this.

They then constructed a heat conduction model incorporating this rule of thumb, which enabled them to calculate the heat distribution around a crater formed on an asteroid's surface. The research group checked the numerical results from the heat conduction model against data on the temperatures and durations required for aqueous alteration and organic solid formation, obtained from past analyses of meteorites.
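To make the scaling and the conduction step concrete, the sketch below combines a dimensionless-distance temperature profile with a generic one-dimensional explicit finite-difference diffusion step. It is not the authors' model: the thermal diffusivity, crater radius, initial temperature profile and grid are placeholder assumptions chosen only for illustration.

```python
import numpy as np

# Minimal heat-conduction sketch (explicit 1D finite differences).
# NOT the authors' model: diffusivity, crater radius, grid and the initial
# temperature profile below are placeholder assumptions for illustration only.
kappa = 1e-6                      # thermal diffusivity [m^2/s], assumed rock-like
R_crater = 1000.0                 # assumed crater radius [m]
n = 500                           # grid points out to 5 crater radii
dx = 5 * R_crater / n
dt = 0.4 * dx**2 / kappa          # below the explicit stability limit of 0.5

x = np.linspace(0.0, 5 * R_crater, n)
x_dimless = x / R_crater          # dimensionless distance, as in the study
T = 300.0 + 700.0 * np.exp(-x_dimless**2)   # assumed post-impact profile [K]

for _ in range(1000):             # march the diffusion equation forward
    T[1:-1] += kappa * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    T[0], T[-1] = T[1], 300.0     # symmetric centre, fixed far-field temperature

years = 1000 * dt / 3.156e7
print(f"peak temperature after ~{years:.0f} years: {T.max():.0f} K")
```

Even this crude sketch illustrates how slowly conduction erases a kilometre-scale thermal anomaly, which is the kind of sustained heating the study's model compares against the conditions needed for aqueous alteration.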

These results showed that aqueous alteration could occur if a crater with a radius of over 20 km was formed within 2 au of the Sun. In addition, they estimated that even a small crater with a 100 m radius on an asteroid within 4 au could heat up to 100°C, meaning that it could support organic solid formation. Most asteroids are located within 4 au. The researchers also found that if a crater with a radius of over 1 km is formed within 2 au, the area around the crater can heat up to 0°C (the temperature at which ice becomes water), thus enabling organic solids to be formed.

Further Developments

It is thought that radioactive decay heating of 26Al triggers the chemical reactions for aqueous alteration and organic solid formation on asteroids. However, this heating can only occur near the core of comparatively large asteroids that are tens of kilometers in diameter. Furthermore, it is said that this could have only occurred within a million years after the Sun's formation due to the short half-life of 26Al. On the other hand, collisions between asteroids still occur today, and it is possible that such collisions heat up the surface of even small asteroids, provided that the impact does not destroy the asteroid itself. In other words, these research results show that the potential for asteroids to support aqueous alteration and organic solid formation is temporally and spatially far greater than previously thought. This will contribute towards an increased number of astronomical bodies being considered as candidates that brought the water and organic substances for the beginning of life on Earth.

Read more at Science Daily

Computer simulations of the brain can predict language recovery in stroke survivors

At Boston University, a team of researchers is working to better understand how language and speech are processed in the brain, and how to best rehabilitate people who have lost their ability to communicate due to brain damage caused by a stroke, trauma, or another type of brain injury. This type of language loss is called aphasia, a long-term neurological disorder caused by damage to the part of the brain responsible for language production and processing, and it affects over a million people in the US.

"It's a huge problem," says Swathi Kiran, director of BU's Aphasia Research Lab, and College of Health & Rehabilitation Sciences: Sargent College associate dean for research and James and Cecilia Tse Ying Professor in Neurorehabilitation. "It's something our lab is working to tackle at multiple levels."

For the last decade, Kiran and her team have studied the brain to see how it changes as people's language skills improve with speech therapy. More recently, they've developed new methods to predict a person's ability to improve even before they start therapy. In a new paper published in Scientific Reports, Kiran and collaborators at BU and the University of Texas at Austin report they can predict language recovery in Hispanic patients who speak both English and Spanish fluently -- a group of aphasia patients particularly at risk of long-term language loss -- using sophisticated computer models of the brain. They say the breakthrough could be a game changer for the field of speech therapy and for stroke survivors impacted by aphasia.

"This [paper] uses computational modeling to predict rehabilitation outcomes in a population of neurological disorders that are really underserved," Kiran says. In the US, Hispanic stroke survivors are nearly two times less likely to be insured than all other racial or ethnic groups, Kiran says, and therefore they experience greater difficulties in accessing language rehabilitation. On top of that, oftentimes speech therapy is only available in one language, even though patients may speak multiple languages at home, making it difficult for clinicians to prioritize which language a patient should receive therapy in.

"This work started with the question, 'If someone had a stroke in this country and [the patient] speaks two languages, which language should they receive therapy in?'" says Kiran. "Are they more likely to improve if they receive therapy in English? Or in Spanish?"

This first-of-its-kind technology addresses that need by using sophisticated neural network models that simulate the brain of a bilingual person who is language impaired, and that brain's response to therapy in English and in Spanish. The models can then identify the optimal language to target during treatment and forecast how well a person will recover their language skills after therapy. The researchers found that the models predicted treatment effects accurately in the treated language, meaning these computational tools could guide healthcare providers to prescribe the best possible rehabilitation plan.

"There is more recognition with the pandemic that people from different populations -- whether [those be differences of] race, ethnicity, different disability, socioeconomic status -- don't receive the same level of [healthcare]," says Kiran. "The problem we're trying to solve here is, for our patients, health disparities at their worst; they are from a population that, the data shows, does not have great access to care, and they have communication problems [due to aphasia]."

As part of this work, the team is examining how recovery in one language impacts recovery of the other -- will learning the word "dog" in English lead to a patient recalling the word "perro," the word for dog in Spanish?

"If you're bilingual you may go back and forth between languages, and what we're trying to do [in our lab] is use that as a therapy piece," says Kiran.

Clinical trials using this technology are already underway, which will soon provide an even clearer picture of how the models can potentially be implemented in hospital and clinical settings.

Read more at Science Daily

Jun 4, 2021

Front-row view reveals exceptional cosmic explosion

Scientists have gained the best view yet of the brightest explosions in the universe: A specialised observatory in Namibia has recorded the most energetic radiation and longest gamma-ray afterglow of a so-called gamma-ray burst (GRB) to date. The observations with the High Energy Stereoscopic System (H.E.S.S.) challenge the established idea of how gamma-rays are produced in these colossal stellar explosions which are the birth cries of black holes, as the international team reports in the journal Science.

"Gamma-ray bursts are bright X-ray and gamma-ray flashes observed in the sky, emitted by distant extragalactic sources," explains DESY scientist Sylvia Zhu, one of the authors of the paper. "They are the biggest explosions in the universe and associated with the collapse of a rapidly rotating massive star to a black hole. A fraction of the liberated gravitational energy feeds the production of an ultrarelativistic blast wave. Their emission is divided into two distinct phases: an initial chaotic prompt phase lasting tens of seconds, followed by a long-lasting, smoothly fading afterglow phase."

On 29 August 2019 the satellites Fermi and Swift detected a gamma-ray burst in the constellation of Eridanus. The event, catalogued as GRB 190829A according to its date of occurrence, turned out to be one of the nearest gamma-ray bursts observed so far, with a distance of about one billion lightyears. For comparison: The typical gamma-ray burst is about 20 billion lightyears away. "We were really sitting in the front row when this gamma-ray burst happened," explains co-author Andrew Taylor from DESY. The team caught the explosion's afterglow immediately when it became visible to the H.E.S.S. telescopes. "We could observe the afterglow for several days and to unprecedented gamma-ray energies," reports Taylor.

The comparatively short distance to this gamma-ray burst allowed detailed measurements of the afterglow's spectrum, which is the distribution of "colours" or photon energies of the radiation, in the very-high energy range. "We could determine GRB 190829A's spectrum up to an energy of 3.3 tera-electronvolts, that's about a trillion times as energetic as the photons of visible light," explains co-author Edna Ruiz-Velasco from the Max Planck Institute for Nuclear Physics in Heidelberg. "This is what's so exceptional about this gamma-ray burst -- it happened in our cosmic backyard where the very-high-energy photons were not absorbed in collisions with background light on their way to Earth, as it happens over larger distances in the cosmos."
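As a quick sanity check of the "about a trillion times" comparison, one can divide 3.3 tera-electronvolts by the energy of a typical visible-light photon; the 2.5 eV figure used below (green light, roughly 500 nm) is an assumed illustrative value, not one quoted in the study.

```python
# Order-of-magnitude check of the "about a trillion times" comparison.
E_grb_eV = 3.3e12       # 3.3 tera-electronvolts, from the article
E_visible_eV = 2.5      # typical visible-light photon (~500 nm), assumed value
print(f"energy ratio: {E_grb_eV / E_visible_eV:.1e}")   # ~1.3e12, about a trillion
```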

The team could follow the afterglow up to three days after the initial explosion. The result came as a surprise: "Our observations revealed curious similarities between the X-ray and very-high energy gamma-ray emission of the burst's afterglow," reports Zhu. Established theories assume that the two emission components must be produced by separate mechanisms: the X-ray component originates from ultra-fast electrons that are deflected in the strong magnetic fields of the burst's surroundings. This "synchrotron" process is quite similar to how particle accelerators on Earth produce bright X-rays for scientific investigations.

However, according to existing theories it seemed very unlikely that even the most powerful explosions in the universe could accelerate electrons enough to directly produce the observed very-high-energy gamma rays. This is due to a "burn-off limit," which is determined by the balance of acceleration and cooling of particles within an accelerator. Producing very-high energy gamma-rays requires electrons with energies well beyond the burn-off limit. Instead, current theories assume that in a gamma-ray burst, fast electrons collide with synchrotron photons and thereby boost them to gamma-ray energies in a process dubbed synchrotron self-Compton.

But the observations of GRB 190829A's afterglow now show that both components, X-ray and gamma ray, faded in sync. Also, the gamma-ray spectrum clearly matched an extrapolation of the X-ray spectrum. Together, these results are a strong indication that X-rays and very-high-energy gamma rays in this afterglow were produced by the same mechanism. "It is rather unexpected to observe such remarkably similar spectral and temporal characteristics in the X-ray and very-high energy gamma-ray energy bands, if the emission in these two energy ranges had different origins," says co-author Dmitry Khangulyan from Rikkyo University in Tokyo. This poses a challenge for the synchrotron self-Compton origin of the very-high energy gamma-ray emission.

Read more at Science Daily

Puppies are wired to communicate with people

Dogs may have earned the title "man's best friend" because of how good they are at interacting with people. Those social skills may be present shortly after birth rather than learned, a new study by University of Arizona researchers suggests.

Published today in the journal Current Biology, the study also finds that genetics may help explain why some dogs perform better than others on social tasks such as following pointing gestures.

"There was evidence that these sorts of social skills were present in adulthood, but here we find evidence that puppies -- sort of like humans -- are biologically prepared to interact in these social ways," said lead study author Emily Bray, a postdoctoral research associate in the UArizona School of Anthropology in the College of Social and Behavioral Sciences.

Bray has spent the last decade conducting research with dogs in collaboration with California-based Canine Companions, a service dog organization serving clients with physical disabilities. She and her colleagues hope to better understand how dogs think and solve problems, which could have implications for identifying dogs that would make good service animals.

To better understand biology's role in dogs' abilities to communicate with humans, Bray and her collaborators looked at how 375 of the organization's 8-week-old budding service dogs, which had little previous one-on-one interaction with humans, performed on a series of tasks designed to measure their social communication skills.

Because the researchers knew each puppy's pedigree -- and therefore how related they were to one another -- they were also able to look at whether inherited genes explain differences in dogs' abilities. Genetics explained more than 40% of the variation in puppies' abilities to follow human pointing gestures, as well as variation in how long they engaged in eye contact with humans during a task designed to measure their interest in people.

"People have been interested in dogs' abilities to do these kinds of things for a long time, but there's always been debate about to what extent is this really in the biology of dogs, versus something they learn by palling around with humans," said study co-author Evan MacLean, assistant professor of anthropology and director of the Arizona Canine Cognition Center at the University of Arizona. "We found that there's definitely a strong genetic component, and they're definitely doing it from the get-go."

At the time of the study, the puppies were still living with their littermates and had not yet been sent to live with a volunteer puppy raiser. Therefore, their interactions with humans had been limited, making it unlikely that the behaviors were learned, Bray said.

The researchers engaged the puppies in four different tasks. In one task, an experimenter hid a treat beneath one of two overturned cups and pointed to it to see if the puppy could follow the gesture. To ensure that the pups weren't just following their noses, a treat was also taped to the inside of both cups. In another version of the task, puppies watched as the researchers placed a yellow block next to the correct cup, instead of pointing, to indicate where the puppy should look for the food.

The other two tasks were designed to observe puppies' propensity to look at human faces. In one task, the researchers spoke to the puppies in "dog-directed speech," reciting a script in the sort of high-pitched voice people sometimes use when talking to a baby. They then measured how long the puppy held a gaze with the human. In the final task -- a so-called "unsolvable task" -- researchers sealed a treat inside a closed container and presented it to the puppy, then measured how often the puppy looked to the human for help opening the container.

While many of the puppies were responsive to humans' physical and verbal cues, very few looked to humans for help with the unsolvable task. That suggests that while puppies may be born knowing how to respond to human-initiated communication, the ability to initiate communication on their own may come later.

"In studies of adult dogs, we find a tendency for them to look to humans for help, especially when you look at adult dogs versus wolves. Wolves are going to persist and try to independently problem solve, whereas dogs are more likely to look to the social partner for help," Bray said. "In puppies, this help-seeking behavior didn't really seem to be part of their repertoire yet."

In many ways, that mirrors what we see in human children's development, Bray said.

"If you think about language learning, children can understand what we're saying to them before they can physically produce the words," she said. "It's potentially a similar story with puppies; they are understanding what is being socially conveyed to them, but the production of it on their end is probably going to take a little bit longer, developmentally."

MacLean said the next step will be to see if researchers can identify the specific genes that may contribute to dogs' capacity to communicate with humans.

Read more at Science Daily

Age doesn't affect perception of 'speech-to-song illusion'

A strange thing sometimes happens when we listen to a spoken phrase again and again: It begins to sound like a song.

This phenomenon, called the "speech-to-song illusion," can offer a window into how the mind operates and give insight into conditions that affect people's ability to communicate, like aphasia and aging people's decreased ability to recall words.

Now, researchers from the University of Kansas have published a study in PLOS ONE examining if the speech-to-song illusion happens in adults who are 55 or older as powerfully as it does with younger people.

The KU team recruited 199 participants electronically on Amazon's Mechanical Turk (MTurk), a website used to conduct research in the field of psychology. The subjects listened to a sound file that exemplified the speech-to-song illusion, then completed surveys relating to three different studies.

"In the first study, we just played them the canonical stimulus made by the researcher that discovered this illusion -- if that can't create the illusion, then nothing can," said co-author Michael Vitevitch, professor of psychology at KU. "Then we simply asked people, 'Did you experience the illusion or not?' There was no difference with age in the number of people that said yes or no."

While the researchers hypothesized fewer older people would perceive the illusion than younger people, the study showed no difference due to age.

While older and younger people perceived the speech-to-song illusion at the same rates, in the second study investigators sought to discover if older people experienced it less powerfully.

"We thought maybe 'yes or no' was too coarse of a measurement, so let's try to use a five-point rating scale," Vitevitch said. "Maybe older adults would rate it as being a little bit more speech-like and younger adults will rate it as being more song-like and you'll see it on this five-point scale, maybe. But there was no difference in the numbers with the younger and older adults."

In the third study, Vitevitch wanted to see if older adults perhaps experience the illusion more slowly than younger people.

"We thought maybe it's not the strength of the illusion that's different but maybe it's when the illusion occurred," he said. "So, we did a final study and asked people to click a button on the screen when their perception shifted from speech to song -- we thought maybe older adults would need a few more repetitions for it to switch over. But we got the same number for both younger adults and older."

Vitevitch's co-authors were KU undergraduate researchers Hollie Mullin, Evan Norkey and Anisha Kodwani, as well as Nichol Castro of the University at Buffalo.

According to Vitevitch, the findings might translate to good news for older adults.

"We have this common misconception that everything goes downhill cognitively as we age," said the KU researcher. "That's not the case. There are some things that do get worse with age, but there are some things that actually get better with age, and some things that stay consistent with age -- in the case of this illusion, you're going to get equally suckered whether you're an older adult or a younger adult."

In another aspect of the research, the investigators found people with musical training experienced the speech-to-song illusion at similar rates as people with no background in music.

"There's a debate about whether musicians or musically trained people experienced the illusion more or less or sooner or more strongly," Vitevitch said. "We looked at it and there was really no difference there either. Musicians and non-musically trained people experience this at about the same rates and have the same sort of experience. The amount of musical training didn't matter. It was just amazingly consistent however we looked at it."

Read more at Science Daily

Novel antibody drug wakes up the body's defense system in advanced-stage cancer

Researchers at the University of Turku, Finland, showed that the antibody treatment reactivates the immune defense in patients with advanced-stage cancer. The treatment alters the function of the body's phagocytes and facilitates extensive activation of the immune system.

The immune defense is the body's own defense system equipped to combat cancer. However, cancer learns to hide from immune attacks and harnesses this system to promote its own growth. Therefore, it would be beneficial to be able to return the immune defense back to restricting the advancement of cancer.

Macrophages, a type of white blood cell, are central in the fight against cancer. Cancer educates macrophages to subdue the defense system and renders many treatments targeting the immune system ineffective.

Academy Research Fellow Maija Hollmén's research group has searched for means of altering the activity of macrophages in order to direct the immune defense to attack cancer. The antibody bexmarilimab, developed based on this research and in collaboration with Faron Pharmaceuticals, is currently undergoing clinical trials in patients. Hollmén's group has studied the changes occurring in the defense systems of patients with cancer following antibody treatment.

"In the majority of patients, the antibody treatment activated killer T cells, which are the body's strike force against cancer. Additionally, the antibody treatment successfully lowered the suppressive potential of macrophage precursors travelling in the blood circulation. The patients also showed increases in certain mediators of inflammation and types of white blood cell in the blood," describes Hollmén.

"The activation of the killer T cells is a very promising demonstration of the antibody's capability to boost the defense system against cancer. The treated patients had very advanced and poorly treatable cancers, which highlights the significance of the results," says Doctoral Candidate Jenna Rannikko.

Bexmarilimab May Benefit Patients for Whom Current Treatment Options Are Ineffective

The research also yielded new information on the mode of action of bexmarilimab. The antibody binds the molecule Clever-1 present on macrophages and alters its function.

Clever-1 transports material that the body no longer needs into macrophages to be degraded. Objects disposed of in this manner are swept under the rug, in a manner of speaking. This kind of concealment is beneficial for the body's natural balance and helps to avoid stirring the immune defense unnecessarily.

"However, cells originating from cancer should be detected. When the antibody is used to block Clever-1 from performing its cleaning job, it facilitates the activation of cells of the immune defense. This in part leads to the waking up of the T cells in patients," describes Doctoral Candidate Miro Viitala.

There is demand for treatments that boost the activity of the immune defense since the current options on the market only help some patients.

"Bexmarilimab's mode of action is different from the drug treatments against cancer currently on the market. Therefore, it can be beneficial for patients for whom current treatment options are ineffective," concludes Postdoctoral Researcher Reetta Virtakoivu.

Read more at Science Daily

Jun 2, 2021

Turbulence in interstellar gas clouds reveals multi-fractal structures

In interstellar dust clouds, turbulence must first dissipate before a star can form through gravity. A German-French research team has now discovered that the kinetic energy of the turbulence comes to rest in a region that is very small on cosmic scales, ranging from one to several light-years in extent. The group also arrived at new results regarding the mathematical method: previously, the turbulent structure of the interstellar medium was described as self-similar -- or fractal. The researchers found that it is not enough to describe the structure mathematically as a single fractal, a self-similar structure like the one known from the Mandelbrot set. Instead, they combined several different fractals, so-called multifractals. The new methods can thus be used to resolve and represent structural changes in astronomical images in detail. Applications in other scientific fields, such as atmospheric research, are also possible.
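As a rough illustration of what a single fractal exponent captures (and what a multifractal analysis generalizes to a whole spectrum of exponents), the sketch below estimates a box-counting dimension for a toy binary map. It is not the GENESIS pipeline; the random test pattern and box sizes are assumptions chosen purely for demonstration.

```python
import numpy as np

# Illustrative only: a single box-counting dimension on a toy binary map.
# NOT the GENESIS multifractal pipeline; a multifractal analysis generalizes
# this single exponent to a whole spectrum of scaling exponents.
rng = np.random.default_rng(0)
size = 256
field = rng.random((size, size)) < 0.1      # toy "cloud" mask, assumed data

def box_count(mask, box):
    """Count box x box cells that contain at least one filled pixel."""
    s = mask.shape[0] // box
    trimmed = mask[:s * box, :s * box]
    blocks = trimmed.reshape(s, box, s, box).any(axis=(1, 3))
    return blocks.sum()

boxes = np.array([2, 4, 8, 16, 32])
counts = np.array([box_count(field, b) for b in boxes])
# The fractal dimension is minus the slope of log(count) against log(box size).
dim = -np.polyfit(np.log(boxes), np.log(counts), 1)[0]
print(f"box-counting dimension of the toy map: {dim:.2f}")
```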

The German-French programme GENESIS (Generation of Structures in the Interstellar Medium) is a cooperation between the University of Cologne's Institute for Astrophysics, LAB at the University of Bordeaux and Geostat/INRIA Institute Bordeaux. In a highlight publication of the journal Astronomy & Astrophysics, the research team presents the new mathematical methods to characterize turbulence using the example of the Musca molecular cloud in the constellation of Musca.

Stars form in huge interstellar clouds composed mainly of molecular hydrogen -- the energy reservoir of all stars. This material has a low density, only a few thousand to several tens of thousands of particles per cubic centimetre, but a very complex structure with condensations in the form of 'clumps' and 'filaments', and eventually 'cores' from which stars form by gravitational collapse of the matter.

The spatial structure of the gas in and around clouds is determined by many physical processes, one of the most important of which is interstellar turbulence. This arises when energy is transferred from large scales, such as galactic density waves or supernova explosions, to smaller scales. Turbulence is known from flows in which a liquid or gas is 'stirred', but can also form vortices and exhibit brief periods of chaotic behaviour, called intermittency. However, for a star to form, the gas must come to rest, i.e., the kinetic energy must dissipate. After that, gravity can exert enough force to pull the hydrogen clouds together and form a star. Thus, it is important to understand and mathematically describe the energy cascade and the associated structural change.

From Science Daily

Young T. rexes had a powerful bite, capable of exerting one-sixth the force of an adult

Jack Tseng loves bone-crunching animals -- hyenas are his favorite -- so when paleontologist Joseph Peterson discovered fossilized dinosaur bones that had teeth marks from a juvenile Tyrannosaurus rex, Tseng decided to try to replicate the bite marks and measure how hard those kids could actually chomp down.

Last year, he and Peterson made a metal replica of a scimitar-shaped tooth of a 13-year-old juvie T. rex, mounted it on a mechanical testing frame commonly used in engineering and materials science, and tried to crack a cow leg bone with it.

Based on 17 successful attempts to match the depth and shape of the bite marks on the fossils -- he had to toss out some trials because the fresh bone slid around too much -- he determined that a juvenile could have exerted up to 5,641 newtons of force, somewhere between the jaw forces exerted by a hyena and a crocodile.

Compare that to the bite force of an adult T. rex -- about 35,000 newtons -- or to the puny biting power of humans: 300 newtons.
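A quick arithmetic check of these comparisons, and of the "one-sixth" figure in the headline, using only the numbers quoted above:

```python
# Ratios behind the comparison (figures as quoted in the article).
juvenile_N = 5641     # juvenile T. rex bite force estimate [newtons]
adult_N = 35000       # adult T. rex [newtons]
human_N = 300         # human [newtons]
print(f"juvenile / adult: {juvenile_N / adult_N:.2f}")   # ~0.16, about one-sixth
print(f"juvenile / human: {juvenile_N / human_N:.0f}x")  # roughly 19x a human bite
```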

Previous bite force estimates for juvenile T. rexes -- based on reconstruction of the jaw muscles or from mathematically scaling down the bite force of adult T. rexes -- were considerably less, about 4,000 newtons.

Why does it matter? Bite force measurements can help paleontologists understand the ecosystem in which dinosaurs -- or any extinct animal -- lived, which predators were powerful enough to eat which prey, and what other predators they competed with.

"If you are up to almost 6,000 newtons of bite force, that places them in a slightly different weight class," said Tseng, UC Berkeley assistant professor of integrative biology. "By really refining our estimates of juvenile bite force, we can more succinctly place them in a part of the food web and think about how they may have played the role of a different kind of predator from their larger, adult parents."

The study reveals that juvenile T. rexes, while not yet able to crush bones like their 30- or 40-year-old parents, were developing their biting techniques and strengthening their jaw muscles to be able to do so once their adult teeth came in.

"This actually gives us a little bit of a metric to help us gauge how quickly the bite force is changing from juvenile to adulthood, and something to compare with how the body is changing during that same period of time," said Peterson, a professor at the University of Wisconsin in Oshkosh and a paleopathologist -- a specialist on the injuries and deformities visible in fossil skeletons. "Are they already crushing bone? No, but they are puncturing it. It allows us to get a better idea of how they are feeding, what they are eating. It is just adding more to that full picture of how animals like tyrannosaurs lived and grew and the roles that they played in that ecosystem."

Tseng, Peterson and graduate student Shannon Brink of East Carolina University in Greenville, North Carolina, will publish their findings this week in the journal PeerJ.

Teeth marks galore, but who was the biter?

Experiments using metal casts of dinosaur teeth to match observed bite marks are rare, not because bite marks on dinosaur fossils are rare, but because the identity of the biter is seldom clear.

Two dinosaur fossils that Peterson excavated years earlier from the Hell Creek Formation of eastern Montana, however, proved ideal for such an experiment. One, the skull of a juvenile T. rex, had a healed bite mark on its face. "What, other than another T. rex, would be able to chomp another T. rex and puncture its skull?" he reasoned. Tyrannosaurs, like crocodiles today, played rough, and the wound was likely from a fight over food or territory.

In addition, the puncture holes in the skull, which had healed, were the size and shape of juvenile T. rex teeth, and the spacing fit a juvenile's tooth gap. Juvenile T. rexes have teeth that are oval in cross section: more knife-like, presumably to cut and tear flesh. Adult T. rexes have teeth with round cross sections: more like posts, to crush bone. Both juveniles and adults could replace lost or broken teeth from spares buried in the jaw that emerged once the socket was empty.

Because skull bone is harder than other bone, Peterson said, matching these holes with punctures made by the metal tooth in a cow bone provided an upper limit to the bite force.

The other fossil was a tail vertebra from a plant-eating, duckbilled dinosaur, an Edmontosaurus. It had two puncture marks from teeth that matched those of a juvenile T. rex. Peterson said that T. rex was the only predator around at that time -- the late Cretaceous Period, more than 66 million years ago -- that could have bitten that hard on the tailbone of a duckbill. The juvenile likely punctured the bone when chomping down on a meaty part of the tail of the already dead animal.

Because vertebrae are softer, experimentally creating similar punctures in a cow bone gave the researchers a lower limit on bite force.

Tseng employed a testing technique that was used in 2010 by researchers who measured the bite force of a much older and smaller dinosaur from the early Cretaceous: a Deinonychus, made famous under a different name -- Velociraptor -- in the 1993 movie Jurassic Park. Its bite force was between 4,000 and 8,000 newtons.

Tseng, then at the University at Buffalo in New York, and Peterson made a replica of a juvenile T. rex tooth from the middle of the jaw using a dental-grade cobalt chromium alloy, which is much harder than dinosaur tooth enamel, Tseng said.

They then mounted the metal tooth in a mechanical testing frame and pushed it slowly, at a millimeter per second, into a fresh-frozen and thawed humerus of a cow. Bones are easier to fracture at low speed than with a rapid chomp. Because the middle of the humerus has a thicker cortex than the bone near the joint ends, the middle was used to replicate the facial punctures. The ends were used to simulate the vertebra punctures.

"What we did, an actualistic study, is to say, 'Let's actually stab the thing with a tooth and see what it does,'" Peterson said. "What we are finding is that our estimates are slightly different than other models, but they are within a close enough range -- we are on the same page."

Tseng emphasized that there is no one number describing the bite force of any animal: it depends on how the creature bites and adjusts the prey in its mouth for the best leverage.

"They probably were not just chomping down. If you look at modern predators, even reptilian predators, sometimes there is adjustment. Maybe they are finding the most mechanically advantageous place, or the strongest tooth to make their bite," said Tseng, who is a 2004 graduate of UC Berkeley's Department of Integrative Biology and an assistant curator in the University of California Museum of Paleontology. "Presumably, there is some tuning involved before they make that bite, so they can literally take the best bite forward to make that kill or to damage whatever they are trying to get into."

Nevertheless, the measurements are a start in charting the increase in tyrannosaurs' bite force as they mature, similar to how paleontologists have charted T. rex size and weight with age.

"Just as you can do a growth curve for such an organism, you can also do a strength curve for their bite force -- what was their bite force at 12 or 13 years old, what was it at 30, 35 or 40 years old. And what does that potentially mean about the role that those animals played in that ecosystem at the time?" Peterson said. "What's cool about finding bite marks in bone from a juvenile tyrannosaur is that it tells us that at 13 years old, they weren't capable of crushing bone yet, but they were already trying, they were puncturing bone, pretty deep. They are probably building up their strength as they get older."

Tseng, whose primary interest is mammals, is eager to resume studies interrupted by the pandemic to measure the bite force of various living and extinct animals in order to infer the ecosystem niches of predators no longer alive. For those creatures, fossils are all that paleontologists have, in order to "interpret behavior and breathe some life into these extinct animals," said Peterson.

Read more at Science Daily

How an elephant's trunk manipulates air to eat and drink

New research from the Georgia Institute of Technology finds that elephants dilate their nostrils in order to create more space in their trunks, allowing them to store up to nine liters of water. They can also suck up three liters per second -- a speed 50 times faster than a human sneeze (150 meters per second/330 mph).

The Georgia Tech College of Engineering study sought to better understand the physics of how elephants use their trunks to move and manipulate air, water, food and other objects. They also sought to learn if the mechanics could inspire the creation of more efficient robots that use air motion to hold and move things.

While octopuses use jets of water to move and archer fish shoot water above the surface to catch insects, the Georgia Tech researchers found that elephants are the only animals able to use suction both on land and underwater.

The paper, "Suction feeding by elephants," is published in the Journal of the Royal Society Interface.

"An elephant eats about 400 pounds of food a day, but very little is known about how they use their trunks to pick up lightweight food and water for 18 hours, every day," said Georgia Tech mechanical engineering Ph.D. student Andrew Schulz, who led the study. "It turns out their trunks act like suitcases, capable of expanding when necessary."

Schulz and the Georgia Tech team worked with veterinarians at Zoo Atlanta, studying elephants as they ate various foods. For large rutabaga cubes, for example, the animal grabbed and collected them. It sucked up smaller cubes and made a loud vacuuming sound, or the sound of a person slurping noodles, before transferring the vegetables to its mouth.

To learn more about suction, the researchers gave elephants a tortilla chip and measured the applied force. Sometimes the animal pressed down on the chip and breathed in, suspending the chip on the tip of its trunk without breaking it. It was similar to a person inhaling a piece of paper onto their mouth. Other times the elephant applied suction from a distance, drawing the chip to the edge of its trunk.

"An elephant uses its trunk like a Swiss Army Knife," said David Hu, Schulz's advisor and a professor in Georgia Tech's George W. Woodruff School of Mechanical Engineering. "It can detect scents and grab things. Other times it blows objects away like a leaf blower or sniffs them in like a vacuum."

By watching elephants inhale liquid from an aquarium, the team was able to time the durations and measure volume. In just 1.5 seconds, the trunk sucked up 3.7 liters, the equivalent of 20 toilets flushing simultaneously.

An ultrasonic probe was used to take trunk wall measurements and see how the trunk's inner muscles work. By contracting those muscles, the animal dilates its nostrils up to 30 percent. This decreases the thickness of the walls and expands nasal volume by 64 percent.
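A back-of-the-envelope check shows how a modest dilation can produce a much larger volume gain, treating each nasal passage as a cylinder of fixed length; that geometry is a simplifying assumption for illustration, not part of the study's measurements.

```python
# Back-of-the-envelope check, treating each nasal passage as a cylinder of
# fixed length (a simplifying assumption; the real geometry is more complex).
# A radial dilation of x scales cross-sectional area, and hence volume, by (1 + x)**2.
for dilation in (0.28, 0.30):
    volume_gain = (1 + dilation) ** 2 - 1
    print(f"{dilation:.0%} dilation -> ~{volume_gain:.0%} more volume")
# Roughly 64-69% extra volume, in line with the 64% expansion reported above.
```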

"At first it didn't make sense: an elephant's nasal passage is relatively small and it was inhaling more water than it should," said Schulz. "It wasn't until we saw the ultrasonographic images and watched the nostrils expand that we realized how they did it. Air makes the walls open, and the animal can store far more water than we originally estimated."

Based on the pressures applied, Schulz and the team suggest that elephants inhale at speeds that are comparable to Japan's 300-mph bullet trains.

Schulz said these unique characteristics have applications in soft robotics and conservation efforts.

Read more at Science Daily

Scientists learn what fuels the 'natural killers' of the immune system

Despite a name straight from a Tarantino movie, natural killer (NK) cells are your allies when it comes to fighting infections and cancer. If T cells are like a team of specialist doctors in an emergency room, NK cells are the paramedics: They arrive first on the scene and perform damage control until reinforcements arrive.

Part of our innate immune system, which dispatches these first responders, NK cells are primed from birth to recognize and respond to danger. Learning what fuels NK cells is an active area of research in immunology, with important clinical implications.

"There's a lot of interest right now in NK cells as a potential target of immunotherapy," says Joseph Sun, an immunologist in the Sloan Kettering Institute. "The more we can understand what drives these cells, the better we can program them to fight disease."

First in Line

Previous work from researchers at MSK and elsewhere has shown that T cells rely on aerobic glycolysis to carry out their protective functions. But whether NK cells depend on this form of metabolism to power their own activities was not known.

Because Dr. Sun and his colleagues studied NK cells in animals instead of a dish, they could establish what type of metabolism NK cells use and compare it to T cells in a natural setting. They found that NK cells ramp up aerobic glycolysis about five days prior to when T cells respond with their own glycolytic surge.

"This fits with the idea that NK cells are innate immune cells that are really critical for mounting a rapid response," Dr. Sheppard says.

The findings are relevant to ongoing efforts to use NK cells as immunotherapy in people with cancer and other conditions. In particular, they have implications for using NK cells as a form of cell therapy -- when cells are grown outside a patient and then infused back into the patient's blood.

"If you're growing these cells in a dish and you push them to divide too rapidly, they may not have as much potential to undergo aerobic glycolysis when you put them into a patient," Dr. Sheppard says.

The takeaway for researchers designing clinical trials is this: They must find a balance between encouraging NK cells to multiply and preserving their stamina. These NK cells are the paramedics of our immune system, so it's important to keep them speedy and responsive.

Read more at Science Daily

Jun 1, 2021

Newly discovered African 'climate seesaw' drove human evolution

While it is widely accepted that climate change drove the evolution of our species in Africa, the exact character of that climate change and its impacts are not well understood. Glacial-interglacial cycles strongly impact patterns of climate change in many parts of the world, and were also assumed to regulate environmental changes in Africa during the critical period of human evolution over the last ~1 million years. The ecosystem changes driven by these glacial cycles are thought to have stimulated the evolution and dispersal of early humans.

A paper published in Proceedings of the National Academy of Sciences (PNAS) this week challenges this view. Dr. Kaboth-Bahr and an international group of multidisciplinary collaborators identified ancient El Niño-like weather patterns as the drivers of major climate changes in Africa. This allowed the group to re-evaluate the existing climatic framework of human evolution.

Walking with the rain

Dr. Kaboth-Bahr and her colleagues integrated 11 climate archives from all across Africa covering the past 620 thousand years to generate a comprehensive spatial picture of when and where wet or dry conditions prevailed over the continent. "We were surprised to find a distinct climatic east-west 'seesaw' very akin to the pattern produced by the weather phenomenon El Niño, which today profoundly influences precipitation distribution in Africa," explains Dr. Kaboth-Bahr, who led the study.

The authors infer that the effects of the tropical Pacific Ocean on the so-called "Walker Circulation" -- a belt of convection cells along the equator that impact the rainfall and aridity of the tropics -- were the prime driver of this climate seesaw. The data clearly shows that the wet and dry regions shifted between the east and west of the African continent on timescales of approximately 100,000 years, with each of the climatic shifts being accompanied by major turnovers in flora and mammal fauna.

"This alternation between dry and wet periods appeared to have governed the dispersion and evolution of vegetation as well as mammals in eastern and western Africa," explains Dr. Kaboth-Bahr. "The resultant environmental patchwork was likely to have been a critical component of human evolution and early demography as well."

The scientists are keen to point out that although climate change was certainly not the sole factor driving early human evolution, the new study nevertheless provides a novel perspective on the tight link between environmental fluctuations and the origin of our early ancestors.

"We see many species of pan-African mammals whose distributions match the patterns we identify, and whose evolutionary history seems to articulate with the wet-dry oscillations between eastern and western Africa," adds Dr. Eleanor Scerri, one of the co-authors and an evolutionary archaeologist at the Max Planck Institute for the Science of Human History in Germany. "These animals preserve the signals of the environments that humans evolved in, and it seems likely that our human ancestors may have been similarly subdivided across Africa as they were subject to the same environmental pressures."

Ecotones: the transitional regions between different ecological zones

The scientists' work suggests that a seesaw-like pattern of rainfall alternating between eastern and western Africa probably had the effect of creating critically important ecotonal regions -- the buffer zones between different ecological zones, such as grassland and forest.

"Ecotones provided diverse, resource-rich and stable environmental settings thought to have been important to early modern humans," adds Dr. Kaboth-Bahr. "They certainly seem to have been important to other faunal communities."

To the scientists, this suggests that Africa's interior regions may have been critically important for fostering long-term population continuity. "We see the archaeological signatures of early members of our species all across Africa," says Dr. Scerri, "but innovations come and go and are often re-invented, suggesting that our deep population history saw a constant saw-tooth like pattern of local population growth and collapse. Ecotonal regions may have provided areas for longer term population continuity, ensuring that the larger human population kept going, even if local populations often went extinct."

Read more at Science Daily

Californian smoke drifted as far as Europe in 2020 and caused heavy clouding of the sun

The smoke from the extreme forest fires on the US West Coast in September 2020 travelled over many thousands of kilometres to Central Europe, where it continued to affect the atmosphere for days afterwards. A comparison of ground and satellite measurements now shows: the forest fire aerosol disturbed the free troposphere over Leipzig in Germany as never before. An evaluation by an international research team led by the Leibniz Institute for Tropospheric Research (TROPOS) revealed an extraordinary optical thickness on 11 September 2020, which attenuated sunlight by a third. The study, published in Geophysical Research Letters, is the first publication to show that ESA's novel Aeolus satellite can reliably measure not only global wind profiles but also aerosols in the atmosphere, as demonstrated by comparing Aeolus measurements with lidar measurements from the ground. The Centre National de Recherches Météorologiques (CNRM) of the University of Toulouse, the German Aerospace Center (DLR) and the European Space Agency (ESA) were involved in the study.

Since August 2018, a new type of research satellite has been orbiting the Earth, named after a Greek wind god -- Aeolus. The aim of Aeolus is to actively measure wind from space and thus improve weather forecasting. On board this satellite of the European Space Agency (ESA) is the "Atmospheric Laser Doppler Instrument" (ALADIN), a high-performance laser. ALADIN is the first instrument in space that can actively measure vertical profiles of wind speed. It uses the principle of a light radar (lidar, short for "LIght Detection And Ranging"). A signal is emitted and the reflection provides information about location and distance. The Doppler effect is then used to measure the wind speed at different heights in the atmosphere. To validate the laser measurements in space, they are compared with laser measurements from the ground. Several research groups from Germany are involved in this effort within the framework of the EVAA initiative (Experimental Validation and Assimilation of Aeolus observations). TROPOS, for example, measures with its lidar devices every Friday evening and Sunday morning when the Aeolus satellite flies over Leipzig. The data from ground and space can then be compared. On 11 September 2020, this resulted in the rare coincidence that the extraordinary plume of smoke from the Californian forest fires could be measured over Leipzig simultaneously from the ground and from space.
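In numbers, the Doppler-lidar principle is simple: backscattered light is frequency-shifted by twice the line-of-sight wind speed divided by the laser wavelength. The sketch below uses an assumed 355 nm ultraviolet wavelength for ALADIN and an arbitrary example wind speed; both figures are illustrative assumptions, not values reported in the study.

```python
# Doppler wind lidar in one formula: delta_f = 2 * v_los / wavelength
# (the factor 2 accounts for the round trip of the backscattered light).
# The 355 nm ALADIN wavelength and the 10 m/s wind are assumed example values.
wavelength_m = 355e-9   # assumed ALADIN ultraviolet laser wavelength [m]
v_los = 10.0            # example line-of-sight wind speed [m/s]
delta_f_Hz = 2 * v_los / wavelength_m
print(f"Doppler shift for {v_los:.0f} m/s: {delta_f_Hz / 1e6:.0f} MHz")  # ~56 MHz
```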

"Using revolutionary laser technology, Aeolus is currently the only satellite in the world that can measure profiles of horizontal wind speed as well as the backscatter and extinction of aerosols and clouds independently. The satellite thus provides valuable information on the radiative properties of these smoke aerosols," emphasises Dr Sebastian Bley of TROPOS, who has been involved in the Aeolus project at the European Space Agency's (ESA) ESRIN research centre for the past three years. "It is expected that this unique configuration will contribute to improved predictions of such global smoke dispersion but also of weather in general."

In September 2020, the heat from the extreme forest fires on the US West Coast transported the smoke to high altitudes. Once high up, it was then transported with the jet stream across North America and the Atlantic to Europe. In Leipzig, Germany, the smoke layer appeared at an altitude of around 12 kilometres on the morning of 11 September 2020 and sank to an altitude of around 5 kilometres in the course of the day. This is shown by the data from the PollyXT lidar at TROPOS. Lidar measurements in Leipzig confirmed the strong attenuation of the direct sunlight on this Friday: "It was -- measured by the Aerosol Optical Thickness (AOT) -- the strongest influence of forest fire aerosol on the free troposphere above Leipzig ever observed since the beginning of regular lidar observations in 1997," reports Dr Holger Baars from TROPOS. "The free troposphere is the region of the atmosphere in which the weather takes place but the direct influence from the ground is low. We were able to estimate an average mass concentration of forest fire aerosol of 8 micrograms per cubic metre between 4 and 11 km altitude. At the peak it was even 22 micrograms per cubic metre -- that's quite remarkable for these altitudes." Saturday and Sunday were hazy days despite cloudless skies. The UV index of the Federal Office for Radiation Protection (BfS), among others, also showed how strongly the smoke layers dampened solar radiation in Saxony: the TROPOS station in Melpitz near Torgau registered about a quarter less UV radiation at noon on 12 September 2020 than would have been possible under a clear sky. The unusual state of the atmosphere was particularly striking at sunset, with a distinctive milky-yellow light.
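For intuition, "attenuated sunlight by a third" can be converted into an optical thickness with a simple Beer-Lambert estimate for the direct solar beam at overhead sun; neglecting the actual solar zenith angle is a simplifying assumption for illustration, not part of the study's analysis.

```python
import math

# Beer-Lambert for the direct beam at overhead sun: transmission = exp(-tau).
# Ignoring the solar zenith angle is a simplifying assumption for illustration.
transmission = 2 / 3                       # "attenuated sunlight by a third"
tau = -math.log(transmission)
print(f"implied aerosol optical thickness: {tau:.2f}")   # ~0.4
```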

The researchers were able to confirm the origin of the smoke using a computer model: the backward simulation proves that the air masses that arrived at noon on 11 September at an altitude of 8.5 km above Leipzig originated from the west coast of North America, where intense fires had taken place days before. The frequency and intensity of fires in California continued to increase during the first week of September, as satellite images show. Slightly weaker fires were observed in Oregon, Washington and Montana. "Due to the prevailing winds, the travel time of the smoke from the US West Coast to Europe was only about 3 to 4 days. The air masses even covered the approximately 3000 kilometres across the Atlantic Ocean between Newfoundland and Ireland at high speed in only one day (9 September)," explains Martin Radenz from TROPOS.

Read more at Science Daily

Scientists say active early learning shapes the adult brain

An enhanced learning environment during the first five years of life shapes the brain in ways that are apparent four decades later, say Virginia Tech and University of Pennsylvania scientists writing in the June edition of the Journal of Cognitive Neuroscience.

The researchers used structural brain imaging to detect the developmental effects of linguistic and cognitive stimulation that began when the infants were six weeks old. The influence of an enriched environment on brain structure had previously been demonstrated in animal studies, but this is the first experimental study to find a similar result in humans.

"Our research shows a relationship between brain structure and five years of high-quality, educational and social experiences," said Craig Ramey, professor and distinguished research scholar with Fralin Biomedical Research Institute at VTC and principal investigator of the study. "We have demonstrated that in vulnerable children who received stimulating and emotionally supportive learning experiences, statistically significant changes in brain structure appear in middle age."

The results support the idea that early environment influences the brain structure of individuals growing up with multi-risk socioeconomic challenges, said Martha Farah, director of the Center for Neuroscience and Society at Penn and first author of the study.

"This has exciting implications for the basic science of brain development, as well as for theories of social stratification and social policy," Farah said.

The study follows children who have continuously participated in the Abecedarian Project, an early intervention program initiated by Ramey in Chapel Hill, North Carolina, in 1971 to study the effects of educational, social, health, and family support services on high-risk infants.

Both the comparison and treatment groups received extra health care, nutrition, and family support services; however, beginning at six weeks of age, the treatment group also received five years of high-quality educational support, five days a week, 50 weeks a year.

When scanned, the Abecedarian study participants were in their late 30s to early 40s, offering the researchers a unique look at how childhood factors affect the adult brain.

"People generally know about the potentially large benefits of early education for children from very low resource circumstances," said co-author Sharon Landesman Ramey, professor and distinguished research scholar at Fralin Biomedical Research Institute. "The new results reveal that biological effects accompany the many behavioral, social, health, and economic benefits reported in the Abecedarian Project. This affirms the idea that positive early life experiences contribute to later positive adjustment through a combination of behavioral, social, and brain pathways."

During follow-up examinations, structural MRI scans of the brains of 47 study participants were conducted at the Fralin Biomedical Research Institute Human Neuroimaging Lab. Of those, 29 individuals had been in the group that received the educational enrichment focused on promoting language, cognition, and interactive learning.

The other 18 individuals received the same robust health, nutritional, and social service supports provided to the educational treatment group, plus whatever community childcare or other learning their parents arranged. The two groups were well matched on a variety of factors such as maternal education, head circumference at birth, and age at scanning.

Analyzing the scans, the researchers looked at brain size as a whole, including the cortex, the brain's outermost layer, as well as five regions selected for their expected connection to the intervention's stimulation of children's language and cognitive development.

Those included the left inferior frontal gyrus and left superior temporal gyrus, which may be relevant to language, and the right inferior frontal gyrus and bilateral anterior cingulate cortex, relevant to cognitive control. A fifth, the bilateral hippocampus, was added because its volume is frequently associated with early life adversity and socioeconomic status.

The researchers determined that those in the early education treatment group had increased size of the whole brain, including the cortex.

Several specific cortical regions also appeared larger, according to study co-authors Read Montague, professor and director of the Human Neuroimaging Lab and Computational Psychiatry Unit at the Fralin Biomedical Research Institute, and Terry Lohrenz, research assistant professor and member of the institute's Human Neuroimaging Laboratory.

The scientists noted that the intervention's effects on brain structure were substantially greater for males than for females. The reasons for this are not known and were surprising, since boys and girls showed generally comparable positive behavioral and educational effects from their early enriched education. The current study cannot adequately explain the sex differences.

"When we launched this project in the 1970s, the field knew more about how to assess behavior than it knew about how to assess brain structure," Craig Ramey said. "Because of advances in neuroimaging technology and through strong interdisciplinary collaborations, we were able to measure structural features of the brain. The prefrontal cortex and areas associated with language were definitely affected; and to our knowledge, this is the first experimental evidence on a link between known early educational experiences and long-term changes in humans."

"We believe that these findings warrant careful consideration and lend further support to the value of ensuring positive learning and social-emotional support for all children -- particularly to improve outcomes for children who are vulnerable to inadequate stimulation and care in the early years of life," Craig Ramey said.

Read more at Science Daily

Mass of human chromosomes measured

The mass of human chromosomes has been measured for the first time.

The mass of human chromosomes, which contain the instructions for life in nearly every cell of our bodies, has been measured with X-rays for the first time in a new study led by UCL researchers.

For the study, published in Chromosome Research, researchers used a powerful X-ray beam at the UK's national synchrotron facility, Diamond Light Source, to determine the number of electrons in a spread of 46 chromosomes, which they then used to calculate the chromosomes' mass.
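
The key step is turning a measured electron count into a mass. For the light elements that dominate biological material, there are roughly two daltons of matter per electron, so a rough conversion can be sketched as below; the 2.0 daltons-per-electron figure and the example electron count are illustrative assumptions, not the study's actual calibration:

```python
# Rough sketch of the electrons-to-mass step: the light elements that make up
# most biological material (H, C, N, O, P) contribute roughly two daltons of
# mass per electron, so an electron count can be converted into an approximate
# mass. The 2.0 Da-per-electron figure and the example electron count are
# illustrative assumptions, not the study's calibration.

DALTON_G = 1.66054e-24   # grams per dalton
DA_PER_ELECTRON = 2.0    # assumed average for light biological elements

def mass_picograms(n_electrons: float) -> float:
    """Approximate mass in picograms implied by a measured electron count."""
    grams = n_electrons * DA_PER_ELECTRON * DALTON_G
    return grams * 1e12

# Example: about 7e13 electrons would correspond to roughly 230 pg of material.
print(f"{mass_picograms(7e13):.0f} pg")
```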

They found that the chromosomes were about 20 times heavier than the DNA they contained -- a much larger mass than previously expected, suggesting there might be missing components yet to be discovered.

As well as DNA, chromosomes consist of proteins that serve a variety of functions, from reading the DNA to regulating processes of cell division to tightly packaging two-metre strands of DNA into our cells.

Senior author Professor Ian Robinson (London Centre for Nanotechnology at UCL) said: "Chromosomes have been investigated by scientists for 130 years but there are still parts of these complex structures that are poorly understood.

"The mass of DNA we know from the Human Genome Project, but this is the first time we have been able to precisely measure the masses of chromosomes that include this DNA.

"Our measurement suggests the 46 chromosomes in each of our cells weigh 242 picograms (trillionths of a gram). This is heavier than we would expect, and, if replicated, points to unexplained excess mass in chromosomes."

In the study, researchers used a method called X-ray ptychography, which involves stitching together the diffraction patterns that occur as the X-ray beam passes through the chromosomes, to create a highly sensitive 3D reconstruction. The fine resolution was possible because the beam deployed at Diamond Light Source is billions of times brighter than the Sun (i.e., a very large number of photons pass through the sample at any given time).

The chromosomes were imaged in metaphase, just before they were about to divide into two daughter cells. This is when packaging proteins wind up the DNA into very compact, precise structures.

Archana Bhartiya, a PhD student at the London Centre for Nanotechnology at UCL and lead author of the paper, said: "A better understanding of chromosomes may have important implications for human health.

"A vast amount of study of chromosomes is undertaken in medical labs to diagnose cancer from patient samples. Any improvements in our abilities to image chromosomes would therefore be highly valuable."

Each human cell, at metaphase, normally contains 23 pairs of chromosomes, or 46 in total. Within these are four copies of 3.5 billion base pairs of DNA.
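
Taking these figures at face value, the DNA alone accounts for only a small fraction of the measured 242 picograms. A quick estimate, assuming an average base-pair mass of about 650 daltons, reproduces the same order-of-magnitude gap the researchers describe:

```python
# Quick consistency check using the figures quoted in the article: four copies
# of 3.5 billion base pairs at metaphase, an assumed average base-pair mass of
# about 650 daltons, and the measured total of 242 pg.

DALTON_G = 1.66054e-24   # grams per dalton
BP_MASS_DA = 650         # assumed average mass of one DNA base pair, in daltons

base_pairs = 4 * 3.5e9                                   # four copies of 3.5 billion bp
dna_mass_pg = base_pairs * BP_MASS_DA * DALTON_G * 1e12  # about 15 pg
measured_pg = 242

print(f"DNA alone: about {dna_mass_pg:.0f} pg")
# Roughly 16x, the same ballpark as the article's "about 20 times heavier".
print(f"Measured mass is about {measured_pg / dna_mass_pg:.0f}x the DNA mass")
```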

Read more at Science Daily

May 30, 2021

Dark energy survey releases most precise look at the universe's evolution

In 29 new scientific papers, the Dark Energy Survey examines the largest-ever maps of galaxy distribution and shapes, extending more than 7 billion light-years across the Universe. The extraordinarily precise analysis, which includes data from the survey's first three years, contributes to the most powerful test of the current best model of the Universe, the standard cosmological model. However, hints remain from earlier DES data and other experiments that matter in the Universe today is a few percent less clumpy than predicted.

New results from the Dark Energy Survey (DES) use the largest-ever sample of galaxies observed over nearly one-eighth of the sky to produce the most precise measurements to date of the Universe's composition and growth.

DES images the night sky using the 570-megapixel Dark Energy Camera on the National Science Foundation's Víctor M. Blanco 4-meter Telescope at Cerro Tololo Inter-American Observatory (CTIO) in Chile, a Program of NSF's NOIRLab. One of the most powerful digital cameras in the world, the Dark Energy Camera was designed specifically for DES. It was funded by the Department of Energy (DOE) and was built and tested at DOE's Fermilab.

Over the course of six years, from 2013 to 2019, DES used 30% of the time on the Blanco Telescope and surveyed 5000 square degrees -- almost one-eighth of the entire sky -- in 758 nights of observation, cataloging hundreds of millions of objects. The results announced today draw on data from the first three years -- 226 million galaxies observed over 345 nights -- to create the largest and most precise maps yet of the distribution of galaxies in the Universe at relatively recent epochs. The DES data were processed at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign.
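
The "almost one-eighth of the entire sky" figure follows from the area of the full celestial sphere, which is about 41,253 square degrees. A quick check:

```python
import math

# Quick check of the "almost one-eighth of the entire sky" figure: the full
# celestial sphere covers 4*pi steradians, or about 41,253 square degrees.

full_sky_sq_deg = 4 * math.pi * (180 / math.pi) ** 2  # about 41,253 deg^2
survey_sq_deg = 5000

print(f"{survey_sq_deg / full_sky_sq_deg:.3f} of the sky")  # about 0.121, close to 1/8
```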

"NOIRLab is a proud host for and member of the DES collaboration," said Steve Heathcote, CTIO Associate Director. "Both during and after the survey, the Dark Energy Camera has been a popular choice for community and Chilean astronomers."

At present the Dark Energy Camera is used for programs covering a huge range of science including cosmology. The Dark Energy Camera science archive, including DES Data Release 2 on which these results are based, is curated by the Community Science and Data Center (CSDC), a Program of NSF's NOIRLab. CSDC provides software systems, user services, and development initiatives to connect and support the scientific missions of NOIRLab's telescopes, including the Blanco telescope at CTIO.

Since DES studied nearby galaxies as well as those billions of light-years away, its maps provide both a snapshot of the current large-scale structure of the Universe and a view of how that structure has evolved over the past 7 billion years.

Ordinary matter makes up only about 5% of the Universe. Dark energy, which cosmologists hypothesize drives the accelerating expansion of the Universe by counteracting the force of gravity, accounts for about 70%. The remaining 25% is dark matter, whose gravitational influence binds galaxies together. Both dark matter and dark energy remain invisible. DES seeks to illuminate their nature by studying how the competition between them shapes the large-scale structure of the Universe over cosmic time.

To quantify the distribution of dark matter and the effect of dark energy, DES relied mainly on two phenomena. First, on large scales galaxies are not distributed randomly throughout space but rather form a weblike structure that is due to the gravity of dark matter. DES measured how this cosmic web has evolved over the history of the Universe. The galaxy clustering that forms the cosmic web in turn revealed regions with a higher density of dark matter.

Second, DES detected the signature of dark matter through weak gravitational lensing. As light from a distant galaxy travels through space, the gravity of both ordinary and dark matter in the foreground can bend its path, as if through a lens, resulting in a distorted image of the galaxy as seen from Earth. By studying how the apparent shapes of distant galaxies are aligned with each other and with the positions of nearby galaxies along the line of sight, DES scientists were able to infer the clumpiness of the dark matter in the Universe.
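
In practice, "studying how apparent shapes are aligned" means measuring statistical correlations between galaxy ellipticities as a function of their separation on the sky. The toy estimator below, which uses synthetic shapes, a flat-sky approximation and a brute-force pair count, is meant only to illustrate the kind of statistic involved; it is not the DES measurement pipeline, which adds calibration, weighting and tomography:

```python
import numpy as np

# Toy shear two-point correlation: how correlated are galaxy shapes as a
# function of their angular separation? Synthetic positions and ellipticities,
# a flat-sky approximation and a brute-force pair count; an illustration of
# the kind of statistic involved, not the DES measurement pipeline.

rng = np.random.default_rng(0)
n = 2000
ra = rng.uniform(0.0, 5.0, n)    # right ascension, degrees (small toy patch)
dec = rng.uniform(0.0, 5.0, n)   # declination, degrees
e = rng.normal(0.0, 0.26, n) + 1j * rng.normal(0.0, 0.26, n)  # complex ellipticities

bins = np.linspace(0.05, 2.0, 11)        # angular separation bins, degrees
xi_plus = np.zeros(len(bins) - 1)
counts = np.zeros(len(bins) - 1)

for i in range(n):
    dx = ra[i + 1:] - ra[i]
    dy = dec[i + 1:] - dec[i]
    sep = np.hypot(dx, dy)
    prod = np.real(e[i] * np.conj(e[i + 1:]))  # rotation-invariant xi_plus contribution
    idx = np.digitize(sep, bins) - 1
    for k in range(len(bins) - 1):
        in_bin = idx == k
        xi_plus[k] += prod[in_bin].sum()
        counts[k] += in_bin.sum()

xi_plus /= np.maximum(counts, 1)
print(xi_plus)  # near zero for pure shape noise; cosmic shear appears as a small positive signal
```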

To test cosmologists' current model of the Universe, DES scientists compared their results with measurements from the European Space Agency's orbiting Planck observatory. Planck used light known as the cosmic microwave background to peer back to the early Universe, just 400,000 years after the Big Bang. The Planck data give a precise view of the Universe 13 billion years ago, and the standard cosmological model predicts how the dark matter should evolve to the present.

Combined with earlier results, DES provides the most powerful test to date of the current best model of the Universe, and the results are consistent with the predictions of the standard model of cosmology. However, hints remain from DES and several previous galaxy surveys that the Universe today is a few percent less clumpy than predicted.

Ten regions of the sky were chosen as "deep fields" that the Dark Energy Camera imaged repeatedly throughout the survey. Stacking those images together allowed the scientists to glimpse more distant galaxies. The team then used the redshift information from the deep fields to calibrate the rest of the survey region. This and other advancements in measurements and modeling, coupled with a threefold increase in data compared to the first year, enabled the team to pin down the density and clumpiness of the Universe with unprecedented precision.

DES concluded its observations of the night sky in 2019. With the experience gained from analyzing the first half of the data, the team is now prepared to handle the complete dataset. The final DES analysis is expected to paint an even more precise picture of the dark matter and dark energy in the Universe.

The DES collaboration consists of over 400 scientists from 25 institutions in seven countries.

"The collaboration is remarkably young. It's tilted strongly in the direction of postdocs and graduate students who are doing a huge amount of this work," said DES Director and spokesperson Rich Kron, who is a Fermilab and University of Chicago scientist. "That's really gratifying. A new generation of cosmologists are being trained using the Dark Energy Survey."

Read more at Science Daily

Declining biodiversity in wild Amazon fisheries threatens human diet

A new study of dozens of wild fish species commonly consumed in the Peruvian Amazon says that people there could suffer major nutritional shortages if ongoing losses in fish biodiversity continue. Furthermore, the increasing use of aquaculture and other substitutes may not compensate. The research has implications far beyond the Amazon, since the diversity and abundance of wild-harvested foods is declining in rivers and lakes globally, as well as on land. Some 2 billion people globally depend on non-cultivated foods; inland fisheries alone employ some 60 million people, and provide the primary source of protein for some 200 million. The study appears this week in the journal Science Advances.

The authors studied the vast, rural Loreto department of the Peruvian Amazon, where most of the 800,000 inhabitants eat fish at least once a day, or an average of about 52 kilograms (115 pounds) per year. This is their primary source not only of protein, but fatty acids and essential trace minerals including iron, zinc and calcium. Unfortunately, it is not enough; a quarter of all children are malnourished or stunted, and more than a fifth of women of child-bearing age are iron deficient.

Threats to Amazon fisheries, long a mainstay for both indigenous people and modern development, are legion: new hydropower dams that pen in big migratory fish (some travel thousands of miles from Andes headwaters to the Atlantic estuary and back); soil erosion into rivers from deforestation; toxic runoff from gold mines; and over-exploitation by fishermen themselves, who are struggling to feed fast-growing populations. In Loreto, catch tonnages are stagnating; some large migratory species are already on the decline, and others may be on the way. It is the same elsewhere; globally, a third of freshwater fish species are threatened with extinction, and 80 are already known to be extinct, according to the World Wildlife Fund.

Different species of animals and plants contain different ratios of nutrients, so biodiversity is key to adequate human nutrition, say the researchers. "If fish decline, the quality of the diet will decline," said the study's senior coauthor, Shahid Naeem, director of Columbia University's Earth Institute Center for Environmental Sustainability. "Things are definitely declining now, and they could be on the path to crashing eventually."

To study the region's fish, the study's lead author, then-Columbia PhD student Sebastian Heilpern, made numerous shopping trips to the bustling Belén retail market in the provincial capital of Iquitos. He also visited the city's Amazon River docks, where wholesale commerce begins at 3:30 in the morning. He and another student bought multiple specimens of as many different species as they could find, ending up with 56 of the region's 60-some main food species. These included modest-size scale fish known locally as ractacara and yulilla; saucer-shaped palometa (related to piranha); and giant catfish six feet long or more. (The researchers settled for chunks of the biggest ones.)

The fish were flown on ice to a government lab in Lima, where each species was analyzed for protein, fatty acids and trace minerals. The researchers then plotted the nutritional value of each species against its probability of surviving various kinds of ongoing environmental degradation. From this, they drew up multiple scenarios of how people's future diet would be affected as various species dropped out of the mix.
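
The scenario analysis can be pictured as a simple simulation: remove species one at a time, most vulnerable first, and track the nutrient content of the remaining catch. The sketch below uses invented species data and ignores the study's probabilistic treatment and consumption weighting; it only illustrates the logic:

```python
# Toy version of the extinction-scenario logic described above: remove species
# in order of vulnerability and track the average nutrient content of the
# remaining catch. The species names, nutrient values and vulnerability scores
# below are invented for illustration; the study's actual analysis is
# probabilistic and weighted by consumption.

species = [
    # (name, vulnerability 0-1, iron mg/100 g, omega-3 g/100 g)  (invented values)
    ("large migratory catfish", 0.9, 2.5, 0.3),
    ("palometa",                0.5, 1.2, 1.1),
    ("ractacara",               0.4, 1.0, 0.8),
    ("yulilla",                 0.3, 0.9, 0.9),
]

# Assume the most vulnerable species disappear first.
remaining = sorted(species, key=lambda s: s[1], reverse=True)

while remaining:
    iron = sum(s[2] for s in remaining) / len(remaining)
    omega3 = sum(s[3] for s in remaining) / len(remaining)
    print(f"{len(remaining)} species left: iron {iron:.2f} mg, omega-3 {omega3:.2f} g per 100 g")
    remaining.pop(0)  # the most vulnerable remaining species drops out next
```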

Overall, the biomass of fish caught has remained stable in recent years. However, large migratory species, the most vulnerable to human activities, make up a shrinking portion of the catch, and as they disappear they are being replaced by smaller local species. Most fish contain about the same amount of protein, so this has not affected the protein supply. And, the researchers found, many smaller fish in fact contain higher levels of omega-3 fatty acids, so their takeover may actually increase those supplies. On the other hand, as the species composition shifts toward smaller fish, supplies of iron and zinc are already going down, and will continue to decline, they say.

"Like any other complex system, you see a tradeoff," said Heilpern. "Some things are going up while other things are going down. But that only lasts up to a point." Exactly which species will fill the gaps left when others decline is difficult to predict -- but the researchers project that the overall nutritional value of the catch will nosedive around the point where 40 of the 60 food species become scarce or extinct. "You have a tipping point, where the species that remain can be really lousy," said Heilpern.

One potential solution: in many places around the world where wild foods including fish and bush meat (such as monkeys and lizards) are declining, people are turning increasingly to farm-raised chicken and aquaculture -- a trend encouraged by the World Bank and other powerful organizations. This is increasingly the case in Loreto. But in a separate study published in March, Heilpern, Naeem and their colleagues show that this, too, is undermining human nutrition.

The researchers observed that chicken production in the region grew by about three quarters from 2010 to 2016, and aquaculture nearly doubled. But in analyzing the farmed animals' nutritional values, they found that they typically offer poorer nutrition than a diverse mix of wild fish. In particular, the move to chicken and aquaculture will probably exacerbate the region's already serious iron deficiencies, and limit supplies of essential fatty acids, they say. "Because no single species can offer all key nutrients, a diversity of species is needed to sustain nutritionally adequate diets," they write.

Besides this, chicken farming and aquaculture exert far more pressure on the environment than fishing. In addition to encouraging the clearing of forests to produce feed for the animals, animal farming produces more greenhouse gases and introduces fertilizers and other pollutants into nearby waters, says Heilpern.

"Inland fish are fundamental for nutrition in many low-income and food-deficit countries, and of course landlocked countries," said John Valbo Jørgensen, a Rome-based expert on inland fisheries with the UN Food and Agriculture Organization. "Many significant inland fisheries, including those of Peru, take place in remote areas with poor infrastructure and limited inputs. It will not be feasible to replace those fisheries with farmed animals including fish."

Heilpern is now working with the Wildlife Conservation Society to produce an illustrated guide to the region's fish, including their nutritional values, in hopes of promoting a better understanding of their value among both fishermen and consumers.

Read more at Science Daily