Mar 12, 2022

Scientists announce discovery of supermassive binary black holes

A team of researchers from Purdue University and other institutions has discovered a supermassive black hole binary system, one of only two such systems known. The two black holes, which orbit each other, likely weigh the equivalent of 100 million suns each. One of them powers a massive jet that moves outward at very close to the speed of light. The system is so far away that the visible light seen today was emitted 8.8 billion years ago.

The two are only between 200 AU and 2,000 AU apart (one AU is the distance from the Earth to the sun), at least 10 times closer than the only other known supermassive binary black hole system.

The close separation is significant because such systems are expected to merge eventually. That event will release a massive amount of energy in the form of gravitational waves, causing ripples in space in every direction (and oscillations in matter) as the waves pass through.

Finding systems like this is also important for understanding the processes by which galaxies formed and how they ended up with massive black holes at their centers.

Methods


Researchers serendipitously discovered the system when they noticed a repeating sinusoidal pattern in its radio brightness over time, based on data taken after 2008. A subsequent search of historical data revealed that the system was also varying in the same manner in the late 1970s to early 1980s. That type of variation is exactly what researchers would expect if the jetted emission from one black hole is modulated by the Doppler effect due to its orbital motion as it swings around the other black hole.
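
As a rough illustration of the kind of periodicity analysis this implies, the Python sketch below fits a sinusoid to a simulated radio light curve; the flux values, period and noise level are invented for illustration and are not the team's data or pipeline.

```python
# Hypothetical sketch: fit a sinusoid to a radio light curve to test for
# periodic, Doppler-like modulation. Not the researchers' actual pipeline.
import numpy as np
from scipy.optimize import curve_fit

def sinusoid(t, amplitude, period, phase, baseline):
    """Simple flux model: baseline + amplitude * sin(2*pi*t/period + phase)."""
    return baseline + amplitude * np.sin(2.0 * np.pi * t / period + phase)

# Simulated observations: a ~2-year periodic variation plus noise (made-up values).
rng = np.random.default_rng(0)
t_years = np.sort(rng.uniform(0.0, 13.0, 200))        # observation epochs since 2008
true_flux = sinusoid(t_years, 0.3, 2.0, 0.5, 1.0)      # Jy, illustrative only
observed = true_flux + rng.normal(0.0, 0.05, t_years.size)

# Fit the model; a clean sinusoid with a stable period over decades would be
# consistent with orbital Doppler modulation of the jetted emission.
popt, pcov = curve_fit(sinusoid, t_years, observed, p0=[0.2, 2.0, 0.0, 1.0])
amplitude, period, phase, baseline = popt
print(f"best-fit period ~ {period:.2f} years, amplitude ~ {amplitude:.2f} Jy")
```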

Read more at Science Daily

Scientists make leap forward for genetic sequencing

In a paper published today in Science Advances, researchers in the Department of Chemistry and the Department of Physics & Astronomy at the University of California, Irvine revealed new details about a key enzyme that makes DNA sequencing possible. The finding is a leap toward the era of personalized medicine, in which doctors will be able to design treatments based on the genomes of individual patients.

"Enzymes make life possible by catalyzing chemical transformations that otherwise would just take too long for an organism," said Greg Weiss, UCI professor of chemistry and a co-corresponding author of the new study. "One of the transformations we're really interested in is essential for all life on the planet -- it's the process by which DNA is copied and repaired."

The molecule the UCI-led team studied is an enzyme called Taq, a name derived from Thermus aquaticus, the microorganism in which it was first discovered. Taq replicates DNA. Polymerase chain reaction (PCR), the technique with thousands of uses from forensics to COVID-19 tests, takes advantage of Taq.

The UCI-led team found that Taq, as it helps make new copies of DNA, behaves completely unlike what scientists previously thought. Instead of behaving like a well-oiled, efficient machine continuously churning out DNA copies, the enzyme, Weiss explained, acts like an indiscriminate shopper who cruises the aisles of a store, throwing everything they see into the shopping cart.

"Instead of carefully selecting each piece to add to the DNA chain, the enzyme grabs dozens of misfits for each piece added successfully," said Weiss. "Like a shopper checking items off a shopping list, the enzyme tests each part against the DNA sequence it's trying to replicate."

It's well known that Taq rejects any wrong items that land in its proverbial shopping cart -- that rejection is the key, after all, to successfully duplicating a DNA sequence. What's surprising in the new work is just how frequently Taq rejects correct bases. "It's the equivalent of a shopper grabbing half a dozen identical cans of tomatoes, putting them in the cart, and testing all of them when only one can is needed."

The take-home message: Taq is much, much less efficient at doing its job than it could be.

The find is a leap toward revolutionizing medical care, explained Philip Collins, a professor in the UCI Department of Physics & Astronomy who's a co-corresponding author of the new research. That's because if scientists understand how Taq functions, then they can better understand just how accurate a person's sequenced genome truly is.

"Every single person has a slightly different genome," said Collins, "with different mutations in different places. Some of those are responsible for diseases, and others are responsible for absolutely nothing. To really get at whether these differences are important or healthcare -- for properly prescribing medicines -- you need to know the differences accurately."

"Scientists don't know how these enzymes achieve their accuracy," said Collins, whose lab created the nano-scale devices for studying Taq's behavior. "How do you guarantee to a patient that you've accurately sequenced their DNA when it's different from the accepted human genome? Does the patient really have a rare mutation," asks Collins, "or did the enzyme simply make a mistake?"

"This work could be used to develop improved versions of Taq that waste less time while making copies of DNA," Weiss said.

The impacts of the work don't stop at medicine; every scientific field that relies on accurate DNA sequencing stands to benefit from a better understanding of how Taq works. In interpreting evolutionary histories using ancient DNA, for example, scientists rely on assumptions about how DNA changes over time, and those assumptions rely on accurate genetic sequencing.

"We've entered the century of genomic data," said Collins. "At the beginning of the century we unraveled the human genome for the very first time, and we're starting to understand organisms and species and human history with this newfound information from genomics, but that genomic information is only useful if it's accurate."

Read more at Science Daily

Mar 11, 2022

Magnetic reconnection breakthrough may help predict space weather

A West Virginia University postdoctoral researcher in the Department of Physics and Astronomy has made a breakthrough in the study of magnetic reconnection that could help predict space storms before they wreak havoc on Earth's satellite and power grid systems.

Peiyun Shi's research is the first of its kind in a laboratory setting and is part of the PHASMA project, a complex experiment composed of advanced diagnostics, electromagnets and lab-created plasma designed to reveal new details about how the universe functions.

For his experiment, Shi uses a laser-based diagnostic to probe plasma. Laser beams are directed into the plasma, and the light scatters off electrons. The way the light scatters gives insight into how fast the electrons are moving. And because the plasma is more than 10,000 degrees Fahrenheit, the lasers allow particles to be measured without a physical probe or thermometer, which would melt at such high temperatures.

According to Shi, the technique is analogous to the Doppler effect, which is an increase or decrease in the frequency of sound or light waves emanating from a source as an observer moves towards or away from the source.
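
A minimal sketch of that Doppler idea, assuming a simple non-relativistic shift and made-up wavelengths rather than actual PHASMA numbers:

```python
# Illustrative only: recover a line-of-sight speed from a Doppler-style
# wavelength shift, the basic idea behind laser-scattering diagnostics.
# The probe wavelength and shift below are invented, not PHASMA data.
SPEED_OF_LIGHT = 2.998e8  # m/s

def line_of_sight_speed(rest_wavelength_nm, observed_wavelength_nm):
    """Non-relativistic Doppler estimate: v ~ c * (delta lambda / lambda)."""
    shift = observed_wavelength_nm - rest_wavelength_nm
    return SPEED_OF_LIGHT * shift / rest_wavelength_nm

# Example: a 532 nm probe beam scattered back at 532.5 nm.
v = line_of_sight_speed(532.0, 532.5)
print(f"inferred electron speed ~ {v:.2e} m/s")  # ~2.8e5 m/s for this made-up shift
```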

Shi's findings were published in Physical Review Letters.

"It's like a radar gun for particles," said Earl Scime, director of the WVU Center for Kinetic Experiment, Theory and Integrated Computation Physics and Oleg D. Jefimenko professor of physics. According to Scime, similar studies are only able to determine the average properties of the electrons, but with the technology available as part of the PHASMA project, Shi is able to measure the actual speeds of the electrons.

"Our work proves to the fundamental plasma community that advanced laser diagnostics can measure important kinetic features not accessible to any other conventional diagnostics," Shi said. "This is essential for understanding various plasma physics processes and for complementing modern satellite observations. It's a great privilege to work on such a promising project with a fantastic team here, and the productive collaboration with Paul Cassak and his graduate M. Hasan Barbhuiya is also critical for this work and much appreciated."

This research has a big impact on broader issues such as predicting space weather events. Magnetic reconnection plays a major role in how eruptions of plasma occur on the sun. Those eruptions can produce solar flares, which increase X-ray and ultraviolet emissions and pose a threat to astronauts aboard the International Space Station. The eruptions can also hurl large masses of plasma through space that slam into the Earth's magnetosphere. Those space storms can play havoc with satellite and power grid systems on Earth.

"Every time we understand more about magnetic reconnection, it has applications from space weather to thermonuclear fusion, to a basic understanding of how the universe works," Scime said.

The PHASMA project -- officially the PHAse Space MApping experiment -- is the focus of the WVU Center for KINETIC Plasma Physics, the Center for Kinetic Experiment, Theory and Integrated Computation Plasma Physics.

PHASMA is designed to make three-dimensional measurements of the motion of the ions and electrons in a plasma at very small scales and is the only facility in the world capable of performing these detailed measurements.

Read more at Science Daily

Heat stress for cattle may cost billions by century's end, study finds

Climate change poses a potentially devastating economic threat to low-income cattle farmers in poor countries due to increasing heat stress on the animals. Globally, by the end of this century those producers may face financial losses of $15 billion to $40 billion annually.

Farmers in tropical regions -- including large parts of South America, Asia and Africa -- are likely to suffer significantly, particularly when compared with producers in the world's wealthier temperate zones, according to a study by an international team of scientists and economists published in the Lancet Planetary Health.

The researchers, including Mario Herrero, professor of sustainable food systems at Cornell University, and lead author Philip Thornton of the International Livestock Research Institute and CGIAR, published "Impacts of Heat Stress on Global Cattle Production During the 21st Century."

Escalating demand for livestock products in low- and middle-income countries, along with steadily increasing global average temperatures, is an uncomfortable mix, the researchers said. If livestock are to adapt to new thermal environments and increase their productivity, infrastructural investments or adjustments -- such as switching to more heat-tolerant cattle breeds, and improving shade, ventilation and cooling systems -- will be required.

In the paper's high greenhouse-gas emission scenario, cattle production losses from heat stress are estimated to be $39.94 billion annually, or 9.8% of the value of production of meat and milk from cattle in 2005 -- the scientists' baseline year. The low-emission scenario projects production losses at $14.9 billion annually, or 3.7% of the 2005 value.
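
As a quick consistency check, dividing each projected annual loss by its stated share of the 2005 baseline recovers the implied value of global cattle meat and milk production in that year; a short Python sketch of the arithmetic:

```python
# Back-of-envelope check of the figures quoted above: the annual loss divided
# by the loss share implies the 2005 baseline value of cattle meat and milk.
high_loss_billion, high_share = 39.94, 0.098   # high-emission scenario
low_loss_billion,  low_share  = 14.9,  0.037   # low-emission scenario

print(f"implied 2005 baseline (high scenario): ~${high_loss_billion / high_share:,.0f} billion")
print(f"implied 2005 baseline (low scenario):  ~${low_loss_billion / low_share:,.0f} billion")
# Both work out to roughly $400 billion, so the two scenarios are internally consistent.
```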

By the end of the century, dairy and beef production in the United States is projected to decline by 6.8%, while India -- a major dairy-producing country -- is projected to lose more than 45% of its dairy production because of increasing heat stress.

"Resource-poor farmers in low-income countries depend heavily on their livestock for their livelihoods," Thornton said. "The adaptation needs are even higher in these countries, and those farmers are the ones where the hit is even more severe."

With climate change, farming and sustainability in mind, Herrero said there is a need to create equitable adaptation practices "by design and to think intentionally on reaching the vulnerable sectors of global society. We cannot just hope that the poor will not be affected."

Read more at Science Daily

Newly identified softshell turtle lived alongside T. rex and Triceratops

A newly described softshell turtle that lived in North Dakota 66.5 million years ago at the end of the Cretaceous Period, just before the end-Cretaceous mass extinction, is one of the earliest known species of the genus, according to new research shared in the journal Cretaceous Research.

Hutchemys walkerorum lived during a period when large and well-known dinosaurs also roamed Earth, including Tyrannosaurus rex and Triceratops. The find adds important information to scientists' understanding of softshell turtles more broadly, including the potential effects of the end-Cretaceous mass extinction, which took place in this same time period, on their evolution.

Steven Jasinski, who recently completed his Ph.D. in Penn's Department of Earth and Environmental Sciences in the School of Arts & Sciences, led the research, collaborating with advisor Peter Dodson of the School of Veterinary Medicine and Penn Arts & Sciences. The research team included Andrew Heckert and Ciara Sailar of Appalachian State University and Asher Lichtig and Spencer Lucas of the New Mexico Museum of Natural History and Science.

Hutchemys walkerorum belongs to a particular group of softshell turtles in the family Trionychidae called plastomenines. These turtles are similar to the softshell turtles that exist today, although the plastron of plastomenine turtles -- the bones covering the stomach and abdominal area -- is more strongly sutured together and often larger and more robust than in other softshell turtles.

Plastomenines lived during the Cretaceous and Paleogene periods, around 80 million to 50 million years ago. Members of the group first appear in the fossil record during the Late Cretaceous, and a single species continues into the Eocene Epoch, 50 million years ago, but their diversity peaked around the Cretaceous-Paleogene boundary.

"Until recently we didn't understand these softshell turtles very well," says Jasinski. "However, we are starting to get more information on this extinct group of turtles and further understanding their evolution, including how they dealt with the mass extinction."

The fossil specimen of the new species, a partial carapace -- the bones that cover the back and what people think of as a turtle's "shell" -- was discovered in 1975 in southwestern North Dakota. A field crew from Appalachian State University led by Frank K. McKinney and John E. Callahan collected the specimen, along with a specimen of Triceratops, that summer. The fossil turtle specimen remained at Appalachian State until 2013, when Heckert discussed it with Jasinski, a master's student at East Tennessee State University at the time.

Research started in earnest around that time and continued as Jasinski was at Penn for his doctoral studies. Based on the structure of the specimen, he and colleagues determined this fossil belonged to a genus of turtles from the American West known as Hutchemys. Hutchemys walkerorum represents one of the rare occurrences of these turtles prior to the mass extinction event that brought the Age of Dinosaurs to an end. It also represents the easternmost occurrence of the genus during the Cretaceous Period.

"With this study we gain further insight into winners and losers during the cataclysm that ended the Age of Dinosaurs," says Dodson. "The mighty dinosaurs fell, and the lowly turtle survived."

A phylogenetic analysis, comparing the new species with other known trionychids, or softshell turtles, gave the scientists a better understanding of the group's evolutionary relationships. Their analysis placed Hutchemys walkerorum with other known species of Hutchemys and several other turtles in a distinct group of derived plastomenines, which they named Plastomenini. In addition, the researchers found a group of early trionychids, placing them in a newly established subfamily, Kuhnemydinae. Kuhnemydines are fossil species from Asia, and the team's analysis suggests the family Trionychidae originated in Asia before migrating to North America sometime in the Late Cretaceous.

The researchers' investigations also led them to another new classification in the Trionychidae family, a subfamily they named Chitrainae. This group encompasses modern softshell turtles, including the narrow-headed and giant softshell turtles found in southern Asia.

The species name walkerorum honors Greg and Susan Walker, whose philanthropy created The Greg and Susan Walker Endowment in 2006. Under the terms of that gift, students in the Department of Earth and Environmental Science (EES) may apply for funds to undertake research projects for which no other source of support is immediately available.

"The Greg and Susan Walker Endowment awards research support, typically for projects costing up to $5,000, in response to proposals submitted to the endowment through the Department of Earth and Environmental Science," says Robert Giegengack, professor emeritus.

"The professors and advisors who approve the endowment do an awesome job in helping the students thrive," says Joan Buccilli, an administrator in the EES department who assists students seeking support. "However, I really do feel I have the best job, getting to navigate through their awards with them and getting to see firsthand how excited they are and what they have accomplished."

Jasinski was awarded Walker Research Grant funds for this project as well as others describing new species of dinosaurs, turtles, dogs, and investigations of dinosaurs and carnivorous mammals. "The Walkers' generous support helped me get the most out of my time while at Penn," says Jasinski, "and I know they were vital to the research of other students as well. This was one of the major reasons we wanted to name this new species in their honor."

Read more at Science Daily

New window system allows for long-term studies of brain activity

Bilal Haider is studying how multiple areas of the brain work together for visual perception. This could help researchers understand if neural activity "traffic jams" underlie all kinds of visual impairments: from running a red light when visual attention is elsewhere, to shedding light on the autism-affected brain.

To do this kind of work, researchers need a reliable "map" of all the visual brain areas with specific coordinates for each unique brain. Drawing the map requires monitoring and recording data from an active, working brain, which usually means creating a window in the skull to watch blood flow activity.

Haider's team has developed a better approach -- a new kind of window that's more stable and allows for longer-term studies. The assistant professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University explains how in a paper published in February in Scientific Reports, an open-access journal from Nature Portfolio.

To get a clear image of the brain's visual network, Haider's lab uses an established technique called blood flow imaging, which tracks oxygen in the blood, measuring the active and inactive areas of a mouse brain while the animal views visual stimuli. To capture a strong blood flow signal, researchers typically create a cranial window by thinning the skull or removing a piece of it altogether. These procedures can diminish stability in the awake, pulsing brain -- detrimental conditions for delicate electrophysiological measurements made in the same visual areas after imaging.

"Standard windows give really good pictures of the vasculature," Haider said. "But the downside is, if you're working with an animal learning how to perform a sophisticated task that requires weeks of training, and you want to do neural recordings from the brain later, that area has been compromised if the skull is missing or thinned out."

The team's new cranial window system allows for high-quality blood flow imaging and stable electrical recordings for weeks or even months. The secret is a surgical glue called Vetbond -- which contains cyanoacrylate, the same compound that's in Krazy Glue -- and a tiny glass window.

Basically, a thin layer of the glue is applied to the skull with a micropipette and a curved glass coverslip is placed on top of that. The cyanoacrylate creates a "transparent skull" effect. Haider's team developed the new window system and then vetted the accuracy of the resulting visual brain maps.

"Sometimes the simplest things work. The glue creates a barrier allowing all of the normal physiological processes underneath to carry on, but leaving the bone transparent," Haider said. "It's like putting a protector on your smartphone. The protector is over the glass surface, but everything underneath stays crystal clear and functioning."

Haider's approach will help his team accomplish their larger goals -- to measure the activity of neurons in the brain's visual pathways and understand how neural traffic jams diminish our visual attention, and how these processes may contribute to visual impairments in people with autism. It's work that's getting a boost, thanks to recent support from the Simons Foundation Autism Research Initiative.

Haider said proper study of brain function requires repeatable measurements of neural activity, so he has made the new window system publicly available.

Read more at Science Daily

Mar 10, 2022

196 lasers help scientists recreate the conditions inside gigantic galaxy clusters

Galaxies rarely live alone. Instead, dozens to thousands are drawn together by gravity, forming vast clusters that are the largest objects in the universe.

"Galaxy clusters are one of the most awe-inspiring things in the universe," said Prof. Emeritus Don Lamb, a University of Chicago astrophysicist and co-author on a new paper published March 9 -- one that may point the way towards solving a decades-long mystery.

Scientists have long known that the hydrogen gas in galaxy clusters is searingly hot -- about 10 million kelvins, or roughly the same temperature as the center of the sun -- which is so hot that hydrogen atoms cannot exist. Instead the gas is a plasma consisting of protons and electrons.

But a puzzle persists: There is no straightforward explanation for why or how the gas stays so hot. According to the normal rules of physics, it should have cooled within the age of the universe. But it hasn't.

The challenge for anyone trying to solve this puzzle is that you can't exactly create these kinds of powerfully hot and magnetic conditions in your backyard.

However, there is now one place on Earth where you can: the most energetic laser facility in the world. The National Ignition Facility at Lawrence Livermore National Laboratory is able to create such extreme conditions -- though only for a tiny fraction of a second in a volume the size of a dime.

Scientists from UChicago, the University of Oxford, and the University of Rochester worked together to use the National Ignition Facility -- located in Livermore, California -- to create conditions similar to the hot gas in gigantic galaxy clusters. "The experiments conducted at the NIF are literally out of this world," said Jena Meinecke, who was the first author on the paper.

The scientists focused 196 lasers onto a single tiny target, creating a white-hot plasma with intense magnetic fields that exists for a few billionths of a second.

This was long enough for them to determine that instead of a uniform temperature, there were hot and cold spots in the plasma.

This dovetails with one of the theories that has been proposed for how heat is trapped inside galaxy clusters. Normally, heat would be easily distributed as electrons collide with each other. But the tangled magnetic fields inside the plasma can affect these electrons, causing them to spiral along the direction of magnetic fields -- which can prevent them from evenly distributing and dispersing their energy.
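
To see why magnetized electrons conduct heat poorly across field lines, a rough order-of-magnitude sketch in Python is given below; the temperature, field strength and mean free path are illustrative cluster-like textbook values, not numbers from the paper or the experiment.

```python
# Rough order-of-magnitude sketch of why magnetized electrons transport heat
# poorly across field lines. The field strength and mean free path below are
# illustrative cluster-like values, not numbers from the experiment or paper.
import math

k_B = 1.381e-23      # J/K
m_e = 9.109e-31      # kg
q_e = 1.602e-19      # C

T = 1.0e7                # K, ~10 million kelvin cluster plasma
B = 1.0e-10              # T, ~1 microgauss magnetic field (assumed)
mean_free_path = 3.0e19  # m, ~1 kpc Coulomb mean free path (assumed)

v_thermal = math.sqrt(k_B * T / m_e)          # thermal electron speed
gyroradius = m_e * v_thermal / (q_e * B)      # Larmor radius r = m*v / (q*B)

print(f"thermal speed ~ {v_thermal:.2e} m/s")
print(f"gyroradius    ~ {gyroradius:.2e} m")
print(f"gyroradius / mean free path ~ {gyroradius / mean_free_path:.1e}")
# The gyroradius (hundreds of km) is vastly smaller than the collisional mean
# free path, so electrons spiral tightly along field lines instead of carrying
# heat freely in all directions.
```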

In fact, in the experiment they saw that the conduction of energy was suppressed by more than a factor of 100.

"This is an incredibly exciting result because we've been able to show that what astrophysicists have proposed is on the right track," said Lamb, the Robert A. Millikan Distinguished Service Professor Emeritus in Astronomy and Astrophysics.

"This is indeed an astonishing result," added study co-author University of Rochester Prof. Petros Tzeferacos, who oversaw computer simulations of the complicated experiment. "The simulations were key to untangling the physics at play in the turbulent, magnetized plasma, but the level of thermal transport suppression was beyond what we expected."

The simulations were done with a computer code called FLASH, which was developed at the University of Chicago and is now hosted at the University of Rochester's Flash Center for Computational Science, led by Tzeferacos. The code allows scientists to simulate their laser experiments in exquisite detail before they do them, so that they can achieve the results they seek.

This is critical because the scientists only get a precious few shots at the facility -- if something goes wrong, there's no redo. And because the experiment conditions only last nanoseconds, the scientists have to make sure they make the measurements they need at exactly the right time. This means everything has to be precisely plotted out far ahead of time.

"It's a challenge when you're at the very extremes of what can be done, but that's where the frontier is," said Lamb.

More questions remain about the physics of galaxy clusters, however. Though the hot and cold spots are solid evidence for the impact of magnetic fields on the cooling of the hot gas in galaxy clusters, further experiments are needed to understand exactly what is happening. The group is planning its next round of experiments at NIF later this year.

For the moment, though, they're happy to have shed light on why the gas in galaxy clusters is still hot even after billions of years.

Read more at Science Daily

Giant impact crater in Greenland occurred a few million years after dinosaurs went extinct

Danish and Swedish researchers have dated the enormous Hiawatha impact crater, a 31 km-wide meteorite crater buried under a kilometer of Greenlandic ice. The dating ends speculation that the meteorite impacted after the appearance of humans and opens up a new understanding of Earth's evolution in the post-dinosaur era.

Ever since 2015, when researchers at the University of Copenhagen's GLOBE Institute discovered the Hiawatha impact crater in northwestern Greenland, uncertainty about the crater's age has been the subject of considerable speculation. Could the asteroid have slammed into Earth as recently as 13,000 years ago, when humans had long populated the planet? Could its impact have catalyzed a nearly 1,000-year period of global cooling known as the Younger Dryas?

New analyses performed on grains of sand and rocks from the Hiawatha impact crater by the Natural History Museum of Denmark and the GLOBE Institute at the University of Copenhagen, as well as the Swedish Museum of Natural History in Stockholm, demonstrate that the answer is no. The Hiawatha impact crater is far older. In fact, a new study published in the journal Science Advances today reports its age to be 58 million years old.

"Dating the crater has been a particularly tough nut to crack, so it's very satisfying that two laboratories in Denmark and Sweden, using different dating methods arrived at the same conclusion. As such, I'm convinced that we've determined the crater's actual age, which is much older than many people once thought," says Michael Storey of the Natural History Museum of Denmark.

"Determining the new age of the crater surprised us all. In the future, it will help us investigate the impact's possible effect on climate during an important epoch of Earth's history" says Dr. Gavin Kenny of the Swedish Museum of Natural History.

As one of those who helped discover the Hiawatha impact crater in 2015, Professor Nicolaj Krog Larsen of the GLOBE Institute at the University of Copenhagen is pleased that the crater's exact age is now confirmed.

"It is fantastic to now know its age. We've been working hard to find a way to date the crater since we discovered it seven years ago. Since then, we have been on several field trips to the area to collect samples associated with the Hiawatha impact," says Professor Larsen

Age revealed by laser beams and grains of sand

No kilometer-thick ice sheet draped Northwest Greenland when the Hiawatha asteroid rammed into Earth's surface, releasing several million times more energy than an atomic bomb. At the time, the Arctic was covered with a temperate rainforest where wildlife abounded and temperatures of 20 degrees Celsius were the norm. Eight million years earlier, an even larger asteroid had struck present-day Mexico, causing the extinction of Earth's dinosaurs.

The asteroid smashed into Earth, leaving a thirty-one-kilometer-wide, one-kilometer-deep crater. The crater is big enough to contain the entire city of Washington D.C. Today, the crater lies beneath the Hiawatha Glacier in Northwest Greenland. Rivers flowing from the glacier supplied the researchers with sand and rocks that were superheated by the impact 58 million years ago.

The sand was analyzed at the Natural History Museum of Denmark by heating the grains with a laser until they released argon gas, whereas the rock samples were analyzed at the Swedish Museum of Natural History using uranium-lead dating of the mineral zircon.
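
For a sense of how such radiometric ages follow from the decay law, here is a hedged Python sketch using the standard uranium-238 decay constant; the daughter-to-parent ratio is invented to yield roughly 58 million years, and the labs' actual data reduction (isochrons, corrections, argon irradiation factors) is far more involved.

```python
# Illustrative decay-law age calculation of the kind underlying U-Pb dating.
# The measured ratio below is invented to give ~58 million years; the real
# analyses are far more involved than this closed-system toy example.
import math

LAMBDA_U238 = 1.55125e-10   # 1/yr, decay constant of uranium-238

def age_from_daughter_parent_ratio(daughter_over_parent, decay_constant=LAMBDA_U238):
    """Age t from the closed-system decay law: D/P = exp(lambda * t) - 1."""
    return math.log(1.0 + daughter_over_parent) / decay_constant

# A 206Pb/238U ratio of ~0.00904 corresponds to roughly 58 million years.
print(f"age ~ {age_from_daughter_parent_ratio(0.00904) / 1e6:.1f} million years")
```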

Read more at Science Daily

Florida's 76,000 stormwater ponds emit more carbon than they store

When you look at ponds, you might see birds and fish, but you probably don't think about carbon. As it turns out, Florida's 76,000 ponds store plenty of carbon, and a lot of it escapes into the atmosphere.

In fact, ponds lose more carbon via gas than they store in the muck, a new University of Florida study found.

"That finding means some ponds are doing us an ecosystem 'disservice,'" said Mary Lusk, a UF/IFAS assistant professor of soil and water sciences. "Globally, we expect that as urbanization continues, there will be more and more of these small human-made ponds in urban landscapes."

This research will inform scientists' attempts to estimate how much carbon is entering the atmosphere from these ponds on a regional basis, said Lusk, a faculty member at the UF/IFAS Gulf Coast Research and Education Center.

"Then, once people start to understand that better, we hope they will take stormwater ponds into account for policies related to carbon control," she said. "Stormwater ponds are everywhere in Florida. But they are understudied in terms of how they are impacting local ecosystems. Because they are human-made parts of the landscape, they sort of get overlooked, and people might assume they're not very important ecologically."

The sheer number of ponds compelled Lusk to study if they could have larger environmental effects than people think. She originally wanted to focus on nitrogen and phosphorus in ponds, but one of her graduate students, Audrey Goeckner, wanted to study carbon.

"When I learned that I had the chance to work in stormwater ponds, similar to what I had grown up in around in my neighborhood, I immediately asked myself, well what about these little urban ponds? How do they compare to other aquatic ecosystems?" asked Goeckner, now a Ph.D. student in soil and water sciences on the main UF/IFAS campus in Gainesville. "Turns out that despite their small size,

they can rapidly store and process carbon, which adds up when you consider how many of them exist in developed landscapes and how many continue to be built."

For the study, done as part of her master's thesis at GCREC, Goeckner designed a way to measure the amount of carbon leaving ponds. Although Goeckner studied ponds in Manatee County, her findings hold implications for pond carbon emissions globally.

Goeckner took two canoes (attached to each other) into the ponds. She and a lab technician each sat in one canoe to balance the weight. Goeckner then collected muck from the bottom of the ponds and measured the depth of muck above a sandy layer of sediment, indicating when the pond was constructed and the amount of organic carbon stored in it.

That's how Goeckner found the amount of carbon buried in the ponds.

Secondly, she modified a chamber that is normally used to measure greenhouse gases -- carbon dioxide and methane -- from soil. Instead, Goeckner used the chamber to measure these gases from the surface of the ponds.

She quantified how much of these gases escapes from the ponds each year and then compared the carbon stored in the pond muck with the carbon lost as gas. As a result, scientists now know that ponds give off more carbon than they store and that the amount lost changes over the lifetime of a pond.
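
A minimal sketch of that comparison for a single pond, with all fluxes, burial rates and the pond area as hypothetical placeholders rather than values from the study:

```python
# Minimal sketch of the comparison described above: annual gaseous carbon loss
# versus annual carbon burial for a single pond. All numbers are hypothetical
# placeholders, not measurements from the study.
def annual_carbon_balance(co2_flux_g_c_per_m2_yr, ch4_flux_g_c_per_m2_yr,
                          burial_g_c_per_m2_yr, pond_area_m2):
    """Return (emitted_kg_C, buried_kg_C) for one pond over one year."""
    emitted = (co2_flux_g_c_per_m2_yr + ch4_flux_g_c_per_m2_yr) * pond_area_m2 / 1000.0
    buried = burial_g_c_per_m2_yr * pond_area_m2 / 1000.0
    return emitted, buried

emitted_kg, buried_kg = annual_carbon_balance(
    co2_flux_g_c_per_m2_yr=300.0,   # hypothetical CO2-C flux from chamber measurements
    ch4_flux_g_c_per_m2_yr=40.0,    # hypothetical CH4-C flux
    burial_g_c_per_m2_yr=150.0,     # hypothetical burial rate from sediment cores
    pond_area_m2=4000.0,
)
print(f"emitted ~ {emitted_kg:.0f} kg C/yr, buried ~ {buried_kg:.0f} kg C/yr")
print(f"net source to the atmosphere: {emitted_kg > buried_kg}")
```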

As Florida continues to grow, it's going to become more urbanized. With new development often comes new stormwater ponds, which are not as good at storing carbon as older ones, Lusk said.

As ponds age, their sediment and biogeochemical properties may increase the amount of carbon that is stored rather than emitted as a gas, Goeckner said. That translates to better storage efficiency for the organic carbon that enters the water.

Read more at Science Daily

Eating protein from a greater variety of sources may lower risk of high blood pressure

Eating a balanced diet including protein from a greater variety of sources may help adults lower the risk of developing high blood pressure, according to new research published today in Hypertension, a peer-reviewed journal of the American Heart Association.

Nearly half of the U.S. population has hypertension, or high blood pressure -- one of the leading contributors to cardiovascular disease. When left untreated, high blood pressure damages the circulatory system and is a significant contributing factor to heart attack, stroke and other health conditions.

"Nutrition may be an easily accessible and effective measure to fight against hypertension. Along with fat and carbohydrates, protein is one of the three basic macronutrients," said study author Xianhui Qin, M.D., of the National Clinical Research Center for Kidney Disease at Nanfang Hospital, Southern Medical University in Guangzhou, China.

There is a strong association between poor diet quality and an increased risk of cardiovascular disease and death from cardiovascular disease. In its 2021 dietary guidance to improve cardiovascular health, the American Heart Association advises people to eat healthy sources of protein, mostly from plants; these may include seafood and low-fat or fat-free dairy products and, if desired, lean cuts and unprocessed forms of meat or poultry. The American Heart Association recommends eating one to two servings, or 5.5 ounces, of protein daily.

The study authors analyzed health information for nearly 12,200 adults living in China who were part of at least 2 out of 7 rounds of the China Health and Nutrition Survey from 1997 to 2015 (surveys taken every 2-4 years). Participants' initial survey was used as a baseline, while data from their last round was used as a follow-up for comparison. Participants were an average age of 41 years, and 47% were men. The survey measured dietary intake in three consecutive 24-hour dietary recalls and a household food inventory. A trained interviewer collected 24-hour dietary information over 3 days in the same week during each round of the survey.

Participants were given a protein "variety score" based on the number of different sources of protein eaten out of 8 reported: whole grains, refined grains, processed red meat, unprocessed red meat, poultry, fish, egg and legumes. One point was given for each source of protein, with a maximum variety score of 8. The researchers then evaluated the association for new onset hypertension in relation to the protein variety score.
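
As a concrete illustration, the variety score reduces to counting distinct protein sources out of the eight tracked; a minimal Python sketch (not the authors' code):

```python
# Sketch of the protein "variety score" as described above: one point per
# distinct protein source reported, out of the 8 listed, for a maximum of 8.
PROTEIN_SOURCES = {
    "whole grains", "refined grains", "processed red meat", "unprocessed red meat",
    "poultry", "fish", "egg", "legumes",
}

def protein_variety_score(foods_eaten):
    """Count how many of the 8 tracked protein sources appear in the diet."""
    return len(PROTEIN_SOURCES & set(foods_eaten))

# Example: a diet reporting fish, poultry, legumes, egg and whole grains scores 5.
print(protein_variety_score(["fish", "poultry", "legumes", "egg", "whole grains"]))  # 5
```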

New-onset hypertension was defined as systolic (top number) blood pressure greater than or equal to 140 mm Hg and/or diastolic (bottom number) blood pressure greater than or equal to 90 mm Hg, taking blood pressure-lowering medicine, or self-reporting that a physician diagnosed high blood pressure since their last survey visit. Average time to follow-up was 6 years.
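
The new-onset definition can likewise be expressed as a simple rule; in the sketch below the field names are hypothetical stand-ins for the study's actual variables:

```python
# Sketch of the new-onset hypertension definition described above. Field names
# are hypothetical; the study's actual variable coding is not shown in the article.
def is_new_onset_hypertension(systolic_mm_hg, diastolic_mm_hg,
                              on_bp_medication, physician_diagnosis_since_last_visit):
    """True if SBP >= 140, or DBP >= 90, or on BP-lowering medicine, or newly diagnosed."""
    return (
        systolic_mm_hg >= 140
        or diastolic_mm_hg >= 90
        or on_bp_medication
        or physician_diagnosis_since_last_visit
    )

print(is_new_onset_hypertension(138, 92, False, False))  # True: diastolic above the 90 mm Hg cutoff
print(is_new_onset_hypertension(125, 80, False, False))  # False: below both thresholds, no medication or diagnosis
```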

The analysis found:

  • More than 35% of the nearly 12,200 participants developed new-onset hypertension during follow-up.
  • Compared to participants with the lowest variety score for protein intake (less than 2), those with the highest variety score (4 or higher) had a 66% lower risk of developing high blood pressure.
  • For each of the 8 protein types, there was a window of consumption amount where the risk of hypertension was lower. Researchers described this as the appropriate level of consumption.
  • When total quantity of protein intake was considered, the amount consumed was divided into five categories (quintiles), from least to most intake. People who ate the least amount of total protein and those who ate most protein had the highest risk for new onset of hypertension.


"The heart health message is that consuming a balanced diet with proteins from various different sources, rather than focusing on a single source of dietary protein, may help to prevent the development of high blood pressure," Qin said.

A limitation of the study is its observational design. Because researchers used prior health information, they could not definitively prove protein intake of any kind or quantity caused or prevented new-onset hypertension.

Read more at Science Daily

Mar 9, 2022

Black hole billiards in the centers of galaxies

Researchers provide the first plausible explanation for why one of the most massive black hole pairs observed to date by gravitational waves also seemed to merge on a non-circular orbit. Their suggested solution, now published in Nature, involves a chaotic triple drama inside a giant disk of gas around a supermassive black hole in a galaxy far, far away.

Black holes are among the most fascinating objects in the Universe, but our knowledge of them is still limited -- especially because they do not emit any light. Until a few years ago, light was our main source of knowledge about the universe and its black holes. That changed in 2015, when the Laser Interferometer Gravitational-Wave Observatory (LIGO) made its breakthrough observation of gravitational waves from the merger of two black holes.

"But how and where in our Universe do such black holes form and merge? Does it happen when nearby stars collapse and both turn into black holes, is it through close chance encounters in star clusters, or is it something else? These are some of the key questions in the new era of Gravitational Wave Astrophysics," says Assist. Prof. Johan Samsing from the Niels Bohr Institute at the University of Copenhagen, lead author of the paper.

He and his collaborators may have now provided a new piece to the puzzle, which possibly solves the last part of a mystery that astrophysicists have struggled with for the past few years.

Unexpected Discovery in 2019

The mystery dates back to 2019, when an unexpected discovery of gravitational waves was made by the LIGO and Virgo observatories. The event, named GW190521, is understood to be the merger of two black holes that not only were heavier than previously thought physically possible but also appeared to produce a flash of light.

Possible explanations have since been provided for these two characteristics, but the gravitational waves also revealed a third astonishing feature of this event -- namely that the black holes did not orbit each other along a circle in the moments before merging.

"The gravitational wave event GW190521 is the most surprising discovery to date. The black holes' masses and spins were already surprising, but even more surprising was that they appeared not to have a circular orbit leading up to the merger," says co-author Imre Bartos, Prof. at the University of Florida.

But why is a non-circular orbit so unusual and unexpected?

"This is because of the fundamental nature of the gravitational waves emitted, which not only brings the pair of black holes closer for them to finally merge but also acts to circularize their orbit." explains co-author Zoltan Haiman, a Professor at Columbia University.

This observation made many people around the world, including Johan Samsing in Copenhagen, wonder:

"It made me start thinking about how such non-circular (known as "eccentric") mergers can happen with the surprisingly high probability as the observation suggests," says Johan Samsing.

It Takes Three to Tango

A possible answer could be found in the harsh environment at the centers of galaxies, which harbor a giant black hole millions of times the mass of the Sun, surrounded by a flat, rotating disk of gas.

"In these environments the typical velocity and density of black holes is so high that smaller black holes bounce around as in a giant game of billiards and wide circular binaries cannot exist," points out co-author Prof. Bence Kocsis from the University of Oxford.

But as the group further argued, a giant black hole is not enough. "New studies show that the gas disk plays an important role in capturing smaller black holes, which over time move closer to the center and also closer to one another. This not only implies that they meet and form pairs, but also that such a pair might interact with another, third, black hole, often leading to a chaotic tango with three black holes flying around," explains astrophysicist Hiromichi Tagawa from Tohoku University, co-author of the study.

However, all previous studies up to the observation of GW190521 indicated that eccentric black hole mergers should be relatively rare. This naturally raises the question: Why did the already unusual gravitational wave source GW190521 also merge on an eccentric orbit?

Two Dimensional Black Hole Billiards

Everything calculated so far had been based on the notion that black hole interactions take place in three dimensions, as expected in the majority of stellar systems considered to date.

"But then we started thinking about what would happen if the black hole interactions were instead to take place in a flat disk, which is closer to a two-dimensional environment. Surprisingly, we found in this limit that the probability of forming an eccentric merger increases by as much as a 100 times, which leads to about half of all black hole mergers in such disks possibly being eccentric," says Johan Samsing and continues:

"And that discovery fits incredibly well with the observation in 2019, which all in all now points in the direction that the otherwise spectacular properties of this source are not so strange again, if it was created in a flat gas disk surrounding a super massive black hole in a galactic nucleus."

This possible solution also adds to a centuries-old problem in mechanics. "The interaction between three objects is one of the oldest problems in physics, which both Newton, myself, and others have intensely studied. That this now seems to play a crucial role in how black holes merge in some of the most extreme places of our Universe is incredibly fascinating," says co-author Nathan W. Leigh, professor at Universidad de Concepción, Chile.

Black Holes in Gaseous Disks


The theory of the gas disk also fits with other researchers' explanations of the other two puzzling properties of GW190521: the large masses of the black holes could have been reached through successive mergers inside the disk, while the emission of light could originate from the ambient gas.

"We have now shown that there can be a huge difference in the signals emitted from black holes that merge in flat, two-dimensional disks, versus those we often consider in three-dimensional stellar systems, which tells us that we now have an extra tool that we can use to learn about how black holes are created and merge in our Universe, " says Johan Samsing.

Read more at Science Daily

Large mammals can help climate change mitigation and adaptation

When it comes to helping mitigate the effects of climate change by absorbing carbon, flora rather than fauna usually comes to mind. A new study published in Current Biology now explores the role of large wild animals in restoring ecosystems and battling climate change.

Professor Yadvinder Malhi, Environmental Change Institute at the University of Oxford, said:

'Conservation efforts usually focus on either trees and carbon or the broad conservation appeal of large mammals. This study looked at whether it was possible to align these agendas -- under what context could protecting and restoring large animal wildlife help us tackle and adapt to climate change.'

The researchers highlighted three key eco-touchpoints where large animals such as elephants, rhinoceroses, giraffes, whales, bison and moose have the greatest potential to mitigate climate change: carbon stocks, albedo (the ability of surfaces to reflect solar radiation, or energy from the sun) and fire regimes.

When they graze, large herbivores disperse seeds, clear vegetation and fertilise the soil, which helps build more complex and more resilient ecosystems. These activities can maintain and increase carbon stocks in the soil, roots and above-ground parts of plants, helping to reduce CO2 in the atmosphere.

When large animals graze and trample vegetation they can change the habitat from dense shrubs and trees to open mixes of grass and shrubs or trees, which can also reveal snow-covered ground in polar regions. These open habitats tend to be paler (with higher albedo) and reflect more solar radiation into the atmosphere, cooling the Earth's surface, rather than absorbing it and warming the Earth's surface.
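
To put rough numbers on that albedo effect, the short sketch below compares absorbed solar energy for darker and paler surfaces; the albedo values and average irradiance are typical textbook figures, not results from the study.

```python
# Illustrative albedo comparison: how much incoming solar energy is absorbed by
# a dark, shrub-covered surface versus a paler open or snow-covered one. The
# albedo and irradiance values are rough textbook figures, not study data.
def absorbed_power_w_per_m2(incoming_w_per_m2, albedo):
    """Energy absorbed per square meter: the fraction not reflected."""
    return incoming_w_per_m2 * (1.0 - albedo)

incoming = 340.0  # W/m^2, rough global-average solar irradiance (assumed)
for surface, albedo in [("dense shrubs/trees", 0.15),
                        ("open grass", 0.25),
                        ("snow-covered ground", 0.80)]:
    absorbed = absorbed_power_w_per_m2(incoming, albedo)
    print(f"{surface:20s} albedo={albedo:.2f} absorbs ~{absorbed:.0f} W/m^2")
```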

In 2021, global wildfire CO2 emissions reached a record high. When wildfires burn, the carbon stored in trees and vegetation is released into the atmosphere as greenhouse gases. Elephants, rhinoceroses, zebras and other large grazing animals can lessen wildfire risk by browsing on woody vegetation that could otherwise fuel the fires, trampling paths and making other gaps in vegetation that act as firebreaks.

The research, commissioned by wildlife charity Tusk, also looked at how protecting and restoring large animal wildlife could support climate change efforts and found several animal-climate interaction points that could provide 'win-win' opportunities.

In temperate, tropical and subtropical grassland ecosystems, large animals can reduce forest and bush fires, increase albedo and help retain carbon in the vegetation and the soil. Protecting large animal wildlife and their role in these complex ecosystems supports local biodiversity and ecological resilience.

Dr Tonya Lander, Department of Plant Sciences at Oxford University said:

'Animals can also help with localised adaptation to climate change in these environments by diversifying vegetation and increasing habitat heterogeneity. Diversity of species and microhabitats can make the ecosystem as a whole more able to resist climate change, return to a stable state following a climate-related disturbance, or find a new stable state that functions within the changed and changing climate.'

When large herbivores are present in tundra ecosystems, they help to keep down woody plant encroachment which encourages local flowering plants and grasses -- and exposes more of the ground to the cold air. That exposure maintains the permafrost and prevents the carbon in the soil from getting released into the atmosphere. Programmes that rewild bison and other animals into the arctic tundra can play important roles in both conservation and climate change adaptation at a local scale.

In marine ecosystems, whales and other large animals fertilise phytoplankton. Phytoplankton is estimated to capture 37 billion tonnes of CO2 each year and may release particles into the air which can help seed clouds and reflect sunlight into the atmosphere.

Large terrestrial and marine carnivores also affect these processes through their influence on herbivore abundance and behaviour.

Read more at Science Daily

New twist on an 80-year-old biochemical pathway

Every year, thousands of biochemistry majors and medical students around the world learn to memorize the major biochemical pathways that allow cells to function. How these 10 or so pathways are described in textbooks hasn't changed much since the early 20th century, when they were first discovered.

But with the resurgence of interest in cancer metabolism in the past decade, researchers are coming to realize that there is more to a cell's biochemistry than once thought.

The latest plot twist comes from a team of scientists at the Sloan Kettering Institute who report that they have discovered a previously unappreciated metabolic pathway -- an alternate version of the famous Krebs cycle, also known as the tricarboxylic acid (TCA) cycle.

The TCA or Krebs cycle -- named after Hans Krebs, the German-born biochemist who discovered it in 1937 -- is a central hub of cellular metabolism. It is a core part of the process by which cells "burn" sugars to make ATP, the cell's energy-carrying molecule. In its standard form, the cycle occurs entirely in a cell's mitochondria.

"We and other scientists have recognized for a while that there is variation in the degree to which cells use parts of the TCA cycle, suggesting that cells may have multiple ways to meet their fundamental metabolic needs," says Lydia Finley, a cell biologist in SKI who led the team. "Now, with this latest research, we can say there is a complete alternative to the canonical TCA cycle, and we explain how it works."

Implications for Understanding Cancer Cell Metabolism

Through several converging lines of evidence, Dr. Finley's team showed that an alternate version of the TCA cycle takes place partly in the mitochondria and partly in the cytosol. Rather than burning sugar for energy, this alternate version of the TCA cycle allows cells to use the carbons in sugar to build important molecules such as lipids for cell membranes.

Not only that, but a cell's use of one or the other version of the TCA cycle is associated with changes in its identity, the team showed.

These findings, which were reported on March 9, 2022, in Nature, have broad implications for understanding how cells adapt their metabolism to meet changing needs. They also may suggest additional avenues for cancer therapies geared at targeting a tumor's metabolism.

Putting Together the Puzzle Pieces

The new results came out of a productive collaboration in the Finley lab between Gerstner Sloan Kettering graduate student Paige Arnold and Tri-Institutional MD-PhD student Benjamin Jackson.

Arnold had been using carbon-tracing techniques to study the flow of carbons through the TCA cycle in different cell types. She had noticed, for example, that there seemed to be variation in the extent to which cells put their carbons into the TCA cycle versus skipping one part of it.

Around the same time, Jackson was using computational methods to analyze publicly available data from experiments in which the genome-editing tool CRISPR had been used to systematically knock out genes for various enzymes, one at a time, to see what effect this had on cells.

"You would hypothesize that if the TCA cycle were one functional module, then any one of those enzymes should have a relatively similar effect when you remove it," Dr. Finley points out. "What Ben noticed is that's not actually the case."

"The metabolic enzymes seemed to form two separate modules," Jackson says. "This backed up the anecdotal evidence that we were accumulating that there were different parts of the TCA cycle that cells could use or not use."

The CRISPR studies Jackson analyzed were performed in cancer cell lines -- in other words, cells that aren't "normal." Arnold wanted to know if normal cells also engage in this alternative, or noncanonical, cycle. The Finley lab often works with embryonic stem cells, so Arnold had easy access to these normal cells. She traced the flow of carbons through them and found that they also engaged in the noncanonical TCA cycle.

Lessons From 80 Years Ago

These two sets of experiments seemed to confirm that there really was an alternate way to perform the TCA cycle, one that is not in textbooks. But why had Krebs missed it?

To try to answer that question, Arnold decided to review Krebs' original papers from the 1930s and 40s. She found, to her surprise, that Krebs had made his pivotal discoveries in one particular type of tissue: pigeon breast muscle.

"Nobody really talks about that," Arnold says. "But it made us wonder if maybe different cell types have distinct preferences for whether they use the traditional TCA cycle or this alternate version."

She decided to reconstruct Krebs' original experiments, only in a dish rather than in pigeon muscle. She used mouse stem-like muscle cells to grow a muscle fiber precursor called a myotube and then traced the carbons. When she did this, she saw something interesting: "When the cells were still in a more stem-like stage, they seemed to be doing a lot of this noncanonical TCA cycle, similar to embryonic stem cells and cancer cells," Arnold says. "But as soon as the cells had differentiated into myotubes, they immediately switched to the more traditional TCA cycle. This is in keeping with what Krebs saw in pigeon muscle tissue."

To the team, this result suggested a clear link between changes in cell identity and usage of particular biochemical pathways. To test whether the changes in cell fate actually required use of the different pathways, the team performed additional experiments in which they chemically or genetically blocked certain enzymes in the cycles and asked whether the cells could still change their fate. They could not, implying that the switch between pathways is required for the change in cell fate.

To Burn or To Build

Why would a cell opt for a different form of the TCA cycle at all? According to Dr. Finley, the Krebs cycle is really good at maximizing ATP production. It helps cells combust all their nutrients down to carbon dioxide.

"That's great if what you really care about is making ATP," Dr. Finley says. "But if you want to grow, ATP is actually not the limiting reagent. You actually need to retain those carbons to make new biomass. That's what the noncanonical TCA cycle does: It allows you to take carbons from glucose and export them to the cytosol, where they can be used to build other molecules. So, instead of burning the carbon, you get to keep it."

This growth-oriented cycle may have particular relevance to cancer, whose signature characteristic is unlimited growth.

Dr. Finley cautions that their laboratory experiments were all done in a dish rather than in animals. The team is keenly interested in understanding whether and when the noncanonical cycle occurs in vivo, both in normal animals and in tumors.

"That will help us know whether it might be a good cancer drug target," Dr. Finley says.

An Unexpected Opportunity Due to the COVID Pandemic

Dr. Finley thinks that the more researchers begin to look for alternative biochemical pathways, the more they might find. In some ways, their discovery of a noncanonical TCA cycle was facilitated by unplanned downtime in the lab, owing to the COVID-19 pandemic.

As Jackson explains: "I was at home, and we could not come into the lab because of the pandemic. So it became a very fortuitous time to work on this project, to work out all the bugs of the code."

For Arnold, too, the pandemic-related downtime provided a chance to really delve into the historical literature and mull over other labs' data in which she thought she could see evidence of this other cycle operating.

"In the end, the computational work that I did and the model Paige was building came together, and it became a really satisfying collaboration," Jackson says.

Read more at Science Daily

Pig grunts reveal their emotions

We can now decode pigs' emotions. Using thousands of acoustic recordings gathered throughout the lives of pigs, from their births to deaths, an international team of researchers is the first in the world to translate pig grunts into actual emotions across an extended number of conditions and life stages. The research is led by the University of Copenhagen, ETH Zurich and France's National Research Institute for Agriculture, Food and Environment (INRAE), and can be used to improve animal welfare in the future.

Is a pig grunt worth a thousand words? Perhaps so. In a new study, an international team of researchers from Denmark, Switzerland, France, Germany, Norway and the Czech Republic have translated pig grunts into emotions. The findings have been published today in Scientific Reports.

Using more than 7000 audio recordings of pigs, the researchers designed an algorithm that can decode whether an individual pig is experiencing a positive emotion ('happy' or 'excited'), a negative one ('scared' or 'stressed') or somewhere in between. The recordings were collected in a wide range of situations encountered by commercial pigs, both positive and negative, from when they are born until their deaths.

"With this study, we demonstrate that animal sounds provide great insight into their emotions. We also prove that an algorithm can be used to decode and understand the emotions of pigs, which is an important step towards improved animal welfare for livestock," says Associate Professor Elodie Briefer of the University of Copenhagen's Department of Biology at the University of Copenhagen, who co-led the study.

Short grunts are 'happy' grunts

The researchers recorded pig sounds in both commercial and experimental scenarios, which, based on the behavior of the pigs, are associated with either a positive or a negative emotion. Positive situations included, for example, those when piglets suckle from their mothers or when they are reunited with their family after being separated. The emotionally negative situations included, among others, separation, fights between piglets, castration and slaughter.

In experimental stables, the researchers also created various mock scenarios for the pigs, designed to evoke more nuanced emotions in the middle of the spectrum. These included an arena with toys or food and a corresponding arena without any stimuli. The researchers also placed new and unfamiliar objects in the arena for the pigs to interact with. Along the way, the pigs' calls, behavior and heartrates were monitored and recorded when possible.

The researchers then analyzed the more than 7000 audio recordings to see if there was a pattern in the sounds as a function of the emotions, and if they could discern the positive situations and emotions from the negative ones. As already revealed in previous research, the researchers collected more high-frequency calls (such as screams and squeals) in negative situations. At the same time, low-frequency calls (such as barks and grunts) occurred both in situations where the pigs experienced positive or negative emotions.

The situations between the extremes were particularly interesting. With an even more thorough analysis of the sound files, the researchers found a new pattern that revealed what the pigs experienced in certain situations in even greater detail.

"There are clear differences in pig calls when we look at positive and negative situations. In the positive situations, the calls are far shorter, with minor fluctuations in amplitude. Grunts, more specifically, begin high and gradually go lower in frequency. By training an algorithm to recognize these sounds, we can classify 92% of the calls to the correct emotion," explains Elodie Briefer.

Farmers can monitor animal emotions

The study of animal emotions is a relatively new field that has come about over the last 20 years. Today, it is widely accepted that the mental health of livestock is important for their overall well-being. Nevertheless, animal welfare efforts today focus primarily on the physical health of livestock. Indeed, several systems exist that can automatically monitor an animal's physical health for a farmer.

Analogous systems to monitor the mental health of animals have yet to be developed. The researchers of the study hope their algorithm might pave the way for a new platform for farmers to keep an eye on their animals' psychological well-being.

"We have trained the algorithm to decode pig grunts. Now, we need someone who wants to develop the algorithm into an app that farmers can use to improve the welfare of their animals," says Elodie Briefer.

Read more at Science Daily

Mar 8, 2022

Astronomers discover largest molecule yet in a planet-forming disc

Using the Atacama Large Millimeter/submillimeter Array (ALMA) in Chile, researchers at Leiden Observatory in the Netherlands have for the first time detected dimethyl ether in a planet-forming disc. With nine atoms, this is the largest molecule identified in such a disc to date. It is also a precursor of larger organic molecules that can lead to the emergence of life.

"From these results, we can learn more about the origin of life on our planet and therefore get a better idea of the potential for life in other planetary systems. It is very exciting to see how these findings fit into the bigger picture," says Nashanty Brunken, a Master's student at Leiden Observatory, part of Leiden University, and lead author of the study published today in Astronomy & Astrophysics.

Dimethyl ether is an organic molecule commonly seen in star-forming clouds, but had never before been found in a planet-forming disc. The researchers also made a tentative detection of methyl formate, a complex molecule similar to dimethyl ether that is also a building block for even larger organic molecules.

"It is really exciting to finally detect these larger molecules in discs. For a while we thought it might not be possible to observe them," says co-author Alice Booth, also a researcher at Leiden Observatory.

The molecules were found in the planet-forming disc around the young star IRS 48 (also known as Oph-IRS 48) with the help of ALMA, an observatory co-owned by the European Southern Observatory (ESO). IRS 48, located 444 light-years away in the constellation Ophiuchus, has been the subject of numerous studies because its disc contains an asymmetric, cashew-nut-shaped "dust trap." This region, which likely formed as a result of a newly born planet or small companion star located between the star and the dust trap, retains large numbers of millimetre-sized dust grains that can come together and grow into kilometre-sized objects like comets, asteroids and potentially even planets.

Many complex organic molecules, such as dimethyl ether, are thought to arise in star-forming clouds, even before the stars themselves are born. In these cold environments, atoms and simple molecules like carbon monoxide stick to dust grains, forming an ice layer and undergoing chemical reactions, which result in more complex molecules. Researchers recently discovered that the dust trap in the IRS 48 disc is also an ice reservoir, harbouring dust grains covered with this ice rich in complex molecules. It was in this region of the disc that ALMA has now spotted signs of the dimethyl ether molecule: as heating from IRS 48 sublimates the ice into gas, the trapped molecules inherited from the cold clouds are freed and become detectable.

"What makes this even more exciting is that we now know these larger complex molecules are available to feed forming planets in the disc," explains Booth. "This was not known before as in most systems these molecules are hidden in the ice."

The discovery of dimethyl ether suggests that many other complex molecules that are commonly detected in star-forming regions may also be lurking on icy structures in planet-forming discs. These molecules are the precursors of prebiotic molecules such as amino acids and sugars, which are some of the basic building blocks of life.

By studying their formation and evolution, researchers can therefore gain a better understanding of how prebiotic molecules end up on planets, including our own. "We are incredibly pleased that we can now start to follow the entire journey of these complex molecules from the clouds that form stars, to planet-forming discs, and to comets. Hopefully with more observations we can get a step closer to understanding the origin of prebiotic molecules in our own Solar System," says Nienke van der Marel, a Leiden Observatory researcher who also participated in the study.

Read more at Science Daily

Hurricanes and other tropical cyclones linked to rise in U.S. deaths from several major causes

Over recent decades, hurricanes and other tropical cyclones in the U.S. were associated with up to 33.4 percent higher death rates from several major causes in subsequent months.

This is the finding of research from Columbia University Mailman School of Public Health, Colorado State University, Imperial College London, and Harvard T. H. Chan School of Public Health, published in the journal JAMA.

The study exemplifies how far-reaching and varied the hidden costs to life could be from climate-related disasters and climate change.

Until now, there had been a critical knowledge gap about cause-specific tropical cyclone mortality risks from a large-scale study covering the entire U.S. across multiple decades.

After collecting 33.6 million U.S. death records from 1988 to 2018, the researchers used a statistical model to calculate how death rates changed after tropical cyclones and hurricanes (a subset of the strongest tropical cyclones) when compared to equivalent periods in other years.
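The published analysis uses a far more careful statistical model; the toy sketch below only illustrates the underlying comparison, contrasting county death rates in cyclone months with the same calendar months in cyclone-free years. The input file and column names are hypothetical.

# Toy illustration of the underlying comparison, not the study's model:
# county death rates in months hit by a tropical cyclone versus the same
# calendar months in cyclone-free years. File and columns are hypothetical.
import pandas as pd

# Columns: county, year, month, deaths, population, cyclone (0 or 1)
records = pd.read_csv("county_month_deaths.csv")
records["rate"] = records["deaths"] / records["population"]

exposed = records[records["cyclone"] == 1]
baseline = records[records["cyclone"] == 0]

# Baseline rate for each county and calendar month, taken from cyclone-free years.
baseline_rate = (
    baseline.groupby(["county", "month"])["rate"].mean().rename("baseline_rate")
)

merged = exposed.merge(baseline_rate.reset_index(), on=["county", "month"])
excess_pct = 100 * (merged["rate"] / merged["baseline_rate"] - 1)
print(f"Mean excess death rate in cyclone-exposed months: {excess_pct.mean():.1f}%")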

The researchers found the largest overall increase in the month of hurricanes for injuries (33.4 percent), with increases in death rates in the month after tropical cyclones for injuries (3.7 percent), infectious and parasitic diseases (1.8 percent), respiratory diseases (1.3 percent), cardiovascular diseases (1.2 percent), and neuropsychiatric conditions (1.2 percent).

Residents of 1206 counties, covering half of the entire U.S. population, experienced at least one tropical cyclone during the study period. Tropical cyclones were most frequent in eastern and south-eastern coastal counties.

"Recent tropical cyclone seasons -- which have yielded stronger, more active, andlonger-lasting tropical cyclones than previously recorded -- indicate that tropical cyclones will remain an important public health concern," said Robbie Parks, PhD, post-doctoral research scientist at Columbia University Mailman School of Public Health, and first author. "Our results show that tropical cyclones in the U.S. were associated with increases in deaths for several major causes of death, speaking to the 'hidden burden' of climate-related exposures and climate change.

"An outsized proportion of low-income and historically-disadvantaged communities in the United States reside in tropical cyclone-affected areas; understanding the public health consequences of climate-related disasters such as hurricanes and other tropical cyclones is an essential component of environmental justice."

Injury death rate increases in the month of hurricanes were higher for females (46.5 percent) than for males (27.6 percent). Death rate increases were higher for those aged 65 years or older in the month after tropical cyclones (6.4 percent) when compared with younger ages (2.7 percent).

"In the U.S., tropical cyclones, such as hurricanes and tropical storms, have a devastating effect on society, yet a comprehensive assessment of their continuing health impacts had been lacking,"said Marianthi-Anna Kioumourtzoglou, ScD, assistant professor of Environmental Health Sciences at Columbia Mailman School of Public Health, and senior author. "Our study is a first major step in better understanding how cyclones may affect deaths, which provides an essential foundation for improving resilience to climate-related disasters across the days, weeks, months, and years after they wreak destruction."

Read more at Science Daily

New species of extinct vampire-squid-like cephalopod is the first of its kind with 10 functional arms

New research led by scientists at the American Museum of Natural History and Yale shows that the oldest ancestors of the group of animals that includes octopuses and vampire squids had not eight but 10 arms. The study, which describes a new species of vampyropod based on a previously undescribed 328-million-year-old fossil, pushes back the age of the group by nearly 82 million years. The details are published today in the journal Nature Communications.

"This is the first and only known vampyropod to possess 10 functional appendages," said lead author Christopher Whalen, a postdoctoral researcher in the Museum's Division of Paleontology and a National Science Foundation postdoctoral fellow in Yale's Department of Earth & Planetary Sciences.

Vampyropods are soft-bodied cephalopods typically characterized by eight arms and an internalized chitinous shell or fin supports. Because they lack hard structures, Vampyropoda are not well represented in the fossil record. The new study is based on an exceptionally well-preserved vampyropod fossil from the collections of the Royal Ontario Museum (ROM). The fossil was originally discovered in what is now Montana and donated to the ROM in 1988.

Whalen and coauthor Neil Landman, a curator emeritus in the Museum's Division of Paleontology, identified the fossil specimen as a completely new genus and species that dates to about 328 million years old, making it the oldest known vampyropod and extending the fossil record of the group by about 82 million years. In the new study, they also describe its 10 arms -- all with preserved suckers -- corroborating previous scientific arguments that the common ancestor of vampyropods had 10 arms as well.

"The arm count is one of the defining characteristics separating the 10-armed squid and cuttlefish line (Decabrachia) from the eight armed octopus and vampire squid line (Vampyropoda). We have long understood that octopuses achieve the eight arm count through elimination of the two filaments of vampire squid, and that these filaments are vestigial arms," said Whalen. "However, all previously reported fossil vampyropods preserving the appendages only have 8 arms, so this fossil is arguably the first confirmation of the idea that all cephalopods ancestrally possessed ten arms."

Two of the cephalopod's arms appear to have been elongated relative to the other eight arms, and its torpedo-shaped body is reminiscent of today's squids. The fossil was given the name Syllipsimopodi bideni. The genus name is derived from the Greek word "syllípsimos" for "prehensile" and "pódi" for "foot" -- because this is the oldest known cephalopod to develop suckers, allowing the arms, which are modifications of the molluscan foot, to better grasp prey and other objects. The species name is to honor the recently inaugurated (at the time of paper submission) 46th President of the United States, Joseph R. Biden.

"Syllipsimopodi may have filled a niche more similar to extant squids, a midlevel aquatic predator," said Landman. "It is not inconceivable that it might have used its sucker-laden arms to pry small ammonoids out of their shells or ventured more inshore to prey on brachiopods, bivalves, or other shelled marine animals."

Based on the age, characters, and phylogenetic position, the fossil challenges the predominant arguments for vampyropod origins, and the authors propose a new model for coleoid (internally shelled cephalopod) evolution.

Read more at Science Daily

Lead exposure in last century shrank IQ scores of half of Americans, study finds

In 1923, lead was first added to gasoline to help keep car engines healthy. However, automotive health came at the great expense of our own health and well-being.

A new study calculates that exposure to car exhaust from leaded gas during childhood stole a collective 824 million IQ points from more than 170 million Americans alive today, about half the population of the United States.

The findings, from Aaron Reuben, a PhD candidate in clinical psychology at Duke University, and colleagues at Florida State University, suggest that Americans born before 1996 may now be at greater risk for lead-related health problems, such as faster aging of the brain. Leaded gas for cars was banned in the U.S. in 1996, but the researchers say that anyone born before the end of that era, and especially those at the peak of its use in the 1960s and 1970s, had concerningly high lead exposures as children.

The team's paper appeared the week of March 7 in the journal Proceedings of the National Academy of Sciences.

Lead is neurotoxic and can erode brain cells after it enters the body. As such, there is no safe level of exposure at any point in life, health experts say. Young children are especially vulnerable to lead's ability to impair brain development and lower cognitive ability. Unfortunately, no matter our age, our brains are ill-equipped to keep it at bay.

"Lead is able to reach the bloodstream once it's inhaled as dust, or ingested, or consumed in water," Reuben said. "In the bloodstream, it's able to pass into the brain through the blood-brain barrier, which is quite good at keeping a lot of toxicants and pathogens out of the brain, but not all of them."

One major way lead used to invade bloodstreams was through automotive exhaust.

To answer the complex question of how leaded gas use for more than 70 years may have left a permanent mark on human health, Reuben and his co-authors Michael McFarland and Mathew Hauer, both professors of sociology at Florida State University, opted for a fairly simple strategy.

Using publicly available data on U.S. childhood blood-lead levels, leaded-gas use, and population statistics, they determined the likely lifelong burden of lead exposure carried by every American alive in 2015. From this data, they estimated lead's assault on our intelligence by calculating IQ points lost from leaded gas exposure as a proxy for its harmful impact on public health.
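The study's estimates are calibrated against decades of blood-lead surveys and established dose-response research; purely as an illustration of the bookkeeping involved, the sketch below applies an assumed linear dose-response to hypothetical cohort-level data. The slope, reference level, file name and columns are placeholders, not the paper's values.

# Purely illustrative bookkeeping, not the study's calibrated method:
# aggregate IQ points lost under an assumed linear dose-response between
# childhood blood lead and IQ. All constants and inputs are placeholders.
import pandas as pd

ASSUMED_IQ_POINTS_PER_UG_DL = 0.5  # placeholder dose-response slope
REFERENCE_LEVEL_UG_DL = 2.0        # placeholder level treated as minimal exposure

# One row per birth cohort: mean childhood blood lead (ug/dL) and the number
# of people from that cohort still alive in the reference year.
cohorts = pd.read_csv("cohort_blood_lead.csv")

excess = (cohorts["mean_blood_lead_ug_dl"] - REFERENCE_LEVEL_UG_DL).clip(lower=0)
cohorts["iq_points_lost_per_person"] = excess * ASSUMED_IQ_POINTS_PER_UG_DL
total_lost = (cohorts["iq_points_lost_per_person"] * cohorts["alive_population"]).sum()

print(f"Estimated collective IQ points lost: {total_lost / 1e6:.0f} million")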

The researchers were stunned by the results.

"I frankly was shocked," McFarland said. "And when I look at the numbers, I'm still shocked even though I'm prepared for it."

As of 2015, more than 170 million Americans (more than half of the U.S. population) had clinically concerning levels of lead in their blood when they were children, likely resulting in lower IQs and putting them at higher risk for other long-term health impairments, such as reduced brain size, greater likelihood of mental illness, and increased cardiovascular disease in adulthood.

Leaded gasoline consumption rose rapidly in the early 1960s and peaked in the 1970s. As a result, Reuben and his colleagues found that essentially everyone born during those two decades is all but guaranteed to have been exposed to pernicious levels of lead from car exhaust.

Even more startling was lead's toll on intelligence: childhood lead exposure may have blunted America's cumulative IQ score by an estimated 824 million points -- nearly three points per person on average. The researchers calculated that, at its worst, people born in the mid-to-late 1960s may have lost up to six IQ points. Children registering the highest levels of lead in their blood, eight times the current minimum level that triggers clinical concern, fared even worse, potentially losing more than seven IQ points on average.

Dropping a few IQ points may seem negligible, but the authors note that these changes are dramatic enough to potentially shift people with below-average cognitive ability (IQ score less than 85) to being classified as having an intellectual disability (IQ score below 70).

Moving forward, McFarland is analyzing the racial disparities of childhood lead exposure, hoping to highlight the health inequities suffered by Black children, who were exposed more often to lead and in greater quantities than white children.

Reuben's next step will be to examine the long-term consequences of past lead exposure on brain health in old age, based on previous findings that adults with high childhood lead exposure may experience accelerated brain aging.

Read more at Science Daily

Mar 7, 2022

Amazon rainforest is losing resilience: New evidence from satellite data analysis

The Amazon rainforest is likely losing resilience, data analysis from high-resolution satellite images suggests. This is due to stress from a combination of logging and burning -- the influence of human-caused climate change is not clearly determinable so far, but will likely matter greatly in the future. For about three quarters of the forest, the ability to recover from perturbation has been decreasing since the early 2000s, which the scientists see as a warning sign. The new evidence is derived from advanced statistical analysis of satellite data of changes in vegetation biomass and productivity.

"Reduced resilience -- the ability to recover from perturbations like droughts or fires -- can mean an increased risk of dieback of the Amazon rainforest. That we see such a resilience loss in observations is worrying," says Niklas Boers from the Potsdam Institute for Climate Impact Research and the Technical University of Munich, who conducted the study jointly with researchers from the University of Exeter, UK.

"The Amazon rainforest is a home to a unique host of biodiversity, strongly influences rainfall all over South America by way of its enormous evapotranspiration, and stores huge amounts of carbon that could be released as greenhouse gases in the case of even partial dieback, in turn contributing to further global warming," Boers explains. "This is why the rainforest is of global relevance."

"When the tipping itself will be observable, it would be too late"

The Amazon is considered a potential tipping element in the Earth system and a number of studies revealed its vulnerability. "However, computer simulation studies of its future yield quite a range of results," says Boers. "We've therefore been looking into specific observational data for signs of resilience changes during the last decades. We see continuously decreasing rainforest resilience since the early 2000s, but we cannot tell when a potential transition from rainforest to savanna might happen. When it will be observable, it would likely be too late to stop it." The research is part of the project 'Tipping Points in the Earth System' (TiPES) funded by European Union's Horizon 2020 programme.

The team from the Potsdam Institute for Climate Impact Research and the Global Systems Institute of the University of Exeter used stability indicators that had previously already been applied to the Greenland ice sheet and the Atlantic overturning circulation. These statistical indicators aim at predicting the approach of a system towards an abrupt change by identifying a critical slowing down of the system's dynamics, for instance its reaction to weather variability. The analysis of two satellite data sets, representing biomass and the greenness of the forest, revealed the critical slowing down. This critical slowing down can be seen as a weakening of the restoring forces that usually bring the system back to its equilibrium after perturbations.
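A standard way to quantify such critical slowing down, though not necessarily the exact estimator used in this study, is to track the lag-1 autocorrelation of the detrended signal in a sliding window: as recovery from perturbations weakens, consecutive values become more alike and the autocorrelation creeps upward. Below is a minimal sketch, assuming a hypothetical monthly vegetation-index series for one grid cell.

# Generic early-warning indicator, not the study's exact estimator: rising
# lag-1 autocorrelation of the detrended signal suggests critical slowing
# down. The input series "vegetation_index.csv" is hypothetical.
import pandas as pd

# Monthly vegetation index (a biomass or greenness proxy) for one grid cell.
signal = pd.read_csv("vegetation_index.csv", index_col=0).squeeze("columns")

# Remove the slow trend so the indicator reflects fluctuations around it.
trend = signal.rolling(window=24, center=True, min_periods=1).mean()
residual = signal - trend

# Lag-1 autocorrelation in a 10-year (120-month) sliding window; a sustained
# upward drift indicates weakening recovery from perturbations.
indicator = residual.rolling(window=120).apply(lambda x: x.autocorr(lag=1), raw=False)
print(indicator.dropna().tail())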

"A system might seem stable if one is considering only its mean state"

"While a system might seem stable if one is considering only its mean state, taking a closer look at the data with innovative statistical methods can reveal resilience loss," says Chris Boulton from the University of Exeter's Global Systems Institute. "Previous studies based on computer simulations indicated that large parts of the Amazon can be committed to dieback before showing a strong change in the mean state. Our observational analysis now shows that in many areas destabilization indeed seems to be underway already."

To try to determine causes for the loss of resilience that they find in the data, the scientists explored its relation to rainfall in a given area of the Amazon, a region that has recently experienced three 'once in a century' drought events. Drier areas turn out to be more at risk than wetter ones. "This is alarming, as the IPCC models project an overall drying of the Amazon region in response to anthropogenic global warming," says Boers. Another factor is the distance of an area to roads and settlements from where people can access the forest. The data confirms that areas close to human land-use are more threatened.

Read more at Science Daily

Higher risk of temperature-related death if global warming exceeds 2°C

The death rate linked to extreme temperatures will increase significantly under global warming of 2°C, finds a report by researchers from UCL and the University of Reading.

Temperature-related mortality -- where a death is directly linked to climate temperature -- in England and Wales during the hottest days of the year will increase by 42% under a warming scenario of 2°C from pre-industrial levels. This means an increase from present-day levels of around 117 deaths per day, averaged over the 10 hottest days of the year, to around 166 deaths per day. The findings underline the importance of keeping global warming levels to below 2°C.

At the current global warming level of around 1.21°C, the researchers see a slight decrease in temperature-related mortality in winter and a minimal net effect in summer, meaning that overall, the temperature-related mortality rate decreases slightly at this level of warming.

In the paper, published in Environmental Research Letters, the team examined the impact of climate change on temperature-related mortality rates in England and Wales, focusing on the risk from heat in summer and cold in winter. They found that as the global mean temperature increases, temperature-related mortality in summer will increase at a much faster, non-linear rate.

The rate of increase particularly speeds up at 2°C of warming, with a much higher risk appearing beyond 2.5°C. The researchers say that 3°C warming could lead to a 75% increase in mortality risk during heatwaves.

When plotted on a graph, the relationship between temperature and mortality is roughly U-shaped, meaning that at extremely high temperatures, which the population is not used to, the mortality risk increases sharply for each degree rise of daily mean temperature.
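The actual exposure-response curves are fitted from historical mortality records; the schematic below, with entirely made-up coefficients, only illustrates why a U-shaped curve implies a sharply growing risk per extra degree at the hot end.

# Schematic U-shaped temperature-mortality curve with made-up coefficients;
# it is not the fitted exposure-response function from the study.
import math

OPTIMAL_TEMP_C = 17.0  # placeholder minimum-mortality temperature
COLD_SLOPE = 0.01      # placeholder: gentle risk increase per degree C below optimum
HEAT_SLOPE = 0.06      # placeholder: steep risk increase per degree C above optimum

def relative_risk(daily_mean_temp_c: float) -> float:
    """Mortality risk relative to the minimum-mortality temperature."""
    if daily_mean_temp_c >= OPTIMAL_TEMP_C:
        return math.exp(HEAT_SLOPE * (daily_mean_temp_c - OPTIMAL_TEMP_C))
    return math.exp(COLD_SLOPE * (OPTIMAL_TEMP_C - daily_mean_temp_c))

for temp in (5, 17, 25, 28, 31):
    print(f"{temp:>3} degrees C -> relative risk {relative_risk(temp):.2f}")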

The temperature-related mortality rate in winter will continue to decrease, although this doesn't take the side effects of extreme weather -- such as storms -- into account.

Lead author Dr Katty Huang (UCL Civil, Environmental & Geomatic Engineering) said: "The increase in mortality risk under current warming levels is mainly notable during heatwaves, but with further warming, we would see risk rise on average summer days in addition to escalating risks during heatwaves. What this means is that we shouldn't expect past trends of impact per degree of warming to apply in the future. One degree of global warming beyond 2°C would have a much more severe impact on health in England and Wales than one degree warming from pre-industrial levels, with implications for how the NHS can cope."

In England and Wales, temperature is associated with around 9% of total population mortality, meaning that 9% of all deaths during 2021 could be associated with temperature. Most of those deaths are related to the side effects of cold weather.

The team analysed the 2018 UK Climate Projections (UKCP18) with data on present-day temperature and mortality in order to predict changes in temperature-related mortality relative to degrees of global warming.

In order to isolate the effects of global warming on mortality risk, the researchers looked at the potential impact for the current population, without taking into account future changes such as average age and health conditions.

Read more at Science Daily

How does the brain make memories?

In a study led by Cedars-Sinai, researchers have discovered two types of brain cells that play a key role in dividing continuous human experience into distinct segments that can be recalled later. The discovery provides new promise as a path toward development of novel treatments for memory disorders such as dementia and Alzheimer's disease.

The study, part of a multi-institutional BRAIN Initiative consortium funded by the National Institutes of Health and led by Cedars-Sinai, was published in the peer-reviewed journal Nature Neuroscience. As part of ongoing research into how memory works, Ueli Rutishauser, PhD, professor of Neurosurgery, Neurology, and Biomedical Sciences at Cedars-Sinai, and co-investigators looked at how brain cells react as memories are formed.

"One of the reasons we can't offer significant help for somebody who suffers from a memory disorder is that we don't know enough about how the memory system works," said Rutishauser, senior author of the study, adding that memory is foundational to us as human beings.

Human experience is continuous, but psychologists believe, based on observations of people's behavior, that memories are divided by the brain into distinct events, a concept known as event segmentation. Working with 19 patients with drug-resistant epilepsy, Rutishauser and his team were able to study how neurons perform during this process.

Patients participating in the study had electrodes surgically inserted into their brains to help locate the focus of their epileptic seizures, allowing investigators to record the activity of individual neurons while the patients viewed film clips that included cognitive boundaries.

While these boundaries in daily life are nuanced, for research purposes, the investigators focused on "hard" and "soft" boundaries.

"An example of a soft boundary would be a scene with two people walking down a hallway and talking, and in the next scene, a third person joins them, but it is still part of the same overall narrative," said Rutishauser, interim director of the Center for Neural Science and Medicine and the Board of Governors Chair in Neurosciences at Cedars-Sinai.

In the case of a hard boundary, the second scene might involve a completely different set of people riding in a car. "The difference between hard and soft boundaries is in the size of the deviation from the ongoing narrative," Rutishauser said. "Is it a totally different story, or like a new scene from the same story?"

When study participants watched film clips, investigators noted that certain neurons in the brain, which they labeled "boundary cells," increased their activity after both hard and soft boundaries. Another group of neurons, labeled "event cells," increased their activity only in response to hard boundaries, but not soft boundaries.

Rutishauser and his co-investigators theorize that peaks in the activity of boundary and event cells -- which are highest after hard boundaries, when both types of cells fire -- send the brain into the proper state for initiating a new memory.

"A boundary response is kind of like creating a new folder on your computer," said Rutishauser. "You can then deposit files in there. And when another boundary comes around, you close the first folder and create another one."

To retrieve memories, the brain uses boundary peaks as what Rutishauser calls "anchors for mental time travel."

"When you try to remember something, it causes brain cells to fire," Rutishauser said. "The memory system then compares this pattern of activity to all the previous firing peaks that happened shortly after boundaries. If it finds one that is similar, it opens that folder. You go back for a few seconds to that point in time, and things that happened then come into focus."

To test their theory, investigators gave study participants two memory tests.

They first showed participants a series of still images and asked them whether or not they had seen them in the film clips they had viewed. Study participants were more likely to remember images that closely followed a hard or soft boundary, when a new "memory folder" would have been created.

Investigators also showed participants pairs of images from film clips they had viewed and asked which of the images appeared first. Participants had difficulty remembering the correct order of images that appeared on opposite sides of a hard boundary, possibly because the brain had segmented those images into separate memory folders.

Rutishauser said that therapies that improve event segmentation could help patients with memory disorders. Even something as simple as a change in atmosphere can amplify event boundaries, he explained.

"The effect of context is actually quite strong," Rutishauser said. "If you study in a new place, where you have never been before, instead of on your couch where everything is familiar, you will create a much stronger memory of the material."

The research team included postdoctoral fellow Jie Zheng, PhD, and neuroscientist Gabriel Kreiman, PhD, from Boston Children's Hospital; neurosurgeon Taufik A. Valiante, MD, PhD, of the University of Toronto; and Adam Mamelak, MD, professor of Neurosurgery and director of the Functional Neurosurgery Program at Cedars-Sinai.

In follow-up studies, the team plans to test the theory that boundary and event cells activate dopamine neurons when they fire, and that dopamine, a chemical that sends messages between cells, might be used as a therapy to strengthen memory formation.

Rutishauser and his team also noted during this study that when event cells fired in time with one of the brain's internal rhythms, the theta rhythm -- a repetitive pattern of activity linked to learning, memory and navigation -- subjects were better able to remember the order of images they had seen. This is an important new insight because it suggests that deep brain stimulation that adjusts theta rhythms could prove therapeutic for memory disorders.

Read more at Science Daily

Collectors in the prehistoric world recycled old stone tools to preserve the memory of their ancestors

A first-of-its-kind study at Tel Aviv University asks what drove prehistoric humans to collect and recycle flint tools that had been made, used, and discarded by their predecessors. After examining flint tools from one layer at the 500,000-year-old prehistoric site of Revadim in the south of Israel's Coastal Plain, the researchers propose a novel explanation: prehistoric humans, just like us, were collectors by nature and culture. The study suggests that they had an emotional urge to collect old human-made artefacts, mostly as a means for preserving the memory of their ancestors and maintaining their connectedness with place and time.

The study was led by PhD student Bar Efrati and Prof. Ran Barkai of the Jacob M. Alkow Department of Archaeology and Ancient Near Eastern Cultures at TAU's Entin Faculty of Humanities, in collaboration with Dr. Flavia Venditti from the University of Tubingen in Germany and Prof. Stella Nunziante Cesaro from the Sapienza University of Rome, Italy. The paper appeared in the journal Scientific Reports, published by Nature.

Bar Efrati explains that stone tools with two lifecycles have been found at prehistoric sites all over the world, but the phenomenon has never been thoroughly investigated. In the current study the researchers focused on a specific layer at Revadim -- a large, open-air, multi-layered site in the south of Israel's Coastal Plain, dated to about 500,000 years ago. The rich findings at Revadim suggest that this was a popular spot in the prehistoric landscape, revisited over and over again by early humans drawn by an abundance of wildlife, including elephants. Moreover, the area is rich with good-quality flint, and most tools found at Revadim were in fact made of fresh flint.

Bar Efrati: "The big question is: Why did they do it? Why did prehistoric humans collect and recycle actual tools originally produced, used, and discarded by their predecessors, many years earlier? Scarcity of raw materials was clearly not the reason at Revadim, where good-quality flint is easy to come by. Nor was the motivation merely functional, since the recycled tools were neither unusual in form nor uniquely suitable for any specific use."

The key to identifying the recycled tools and understanding their history is the patina -- a chemical coating which forms on flint when it is exposed to the elements for a long period of time. Thus, a discarded flint tool that lay on the ground for decades or centuries accumulated an easily identifiable layer of patina, which is different in both color and texture from the scars of a second cycle of processing that exposed the original color and texture of flint.

In the current study, 49 flint tools with two lifecycles were examined. Produced and used in their first lifecycle, these tools were abandoned, and years later, after accumulating a layer of patina, they were collected, reworked, and used again. The individuals who recycled each tool removed the patina, exposing fresh flint, and shaped a new active edge. Both edges, the old and the new, were examined by the researchers under two kinds of microscopes, and via various chemical analyses, in search of use-wear marks and/or organic residues. In the case of 28 tools, use-wear marks were found on the old and/or new edges, and in 13 tools, organic residues were detected, evidence of contact with animal bones or fat.

Surprisingly, the tools had been used for very different purposes in their two lifecycles -- the older edges primarily for cutting, and the newer edges for scraping (processing soft materials like leather and bone). Another baffling discovery: in their second lifecycle the tools were reshaped in a very specific and minimal manner, preserving the original form of the tool, including its patina, and only slightly modifying the active edge.

Read more at Science Daily