Oct 10, 2020

Previous infection with other types of coronaviruses may lessen severity of COVID-19

 

Previous infection with coronaviruses that cause the "common cold" may decrease the severity of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection, according to results of a new study. Led by researchers at Boston Medical Center and Boston University School of Medicine, the study also demonstrates that the immunity built up from previous non-SARS-CoV-2 coronavirus infections does not prevent individuals from getting COVID-19. Published in the Journal of Clinical Investigation, the findings provide important insight into the immune response against SARS-CoV-2, which could have significant implications for COVID-19 vaccine development.

The COVID-19 pandemic has led to more than 200,000 deaths in the US and more than one million globally. There is a growing body of research looking into specific ways that the SARS-CoV-2 virus affects different populations, including why some infected people remain asymptomatic and what increases one's risk of dying from the infection. A number of vaccines are under development, built on different platforms (mRNA, viral vector), to determine which type will be most effective at preventing SARS-CoV-2 infection.

While SARS-CoV-2 is a relatively new pathogen, there are many other types of coronaviruses that are endemic in humans and can cause the "common cold" and pneumonia. These coronaviruses share some genetic sequences with SARS-CoV-2, and the immune responses from these coronaviruses can cross-react against SARS-CoV-2.

In this study, the researchers looked at electronic medical record data from individuals who had a respiratory panel test (CRP-PCR) result between May 18, 2015 and March 11, 2020. The CRP-PCR detects diverse respiratory pathogens, including the endemic "common cold" coronaviruses. They also examined data from individuals who were tested for SARS-CoV-2 between March 12, 2020 and June 12, 2020. After adjusting for age, gender, body mass index, and diabetes mellitus diagnosis, hospitalized COVID-19 patients who had a previous positive CRP-PCR test result for a coronavirus had significantly lower odds of being admitted to the intensive care unit (ICU), and a trend toward lower odds of requiring mechanical ventilation. The probability of survival was also significantly higher in hospitalized COVID-19 patients with a previous positive test result for a "common cold" coronavirus. However, a previous positive test result for a coronavirus did not prevent someone from getting infected with SARS-CoV-2.
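As a concrete illustration of the kind of covariate-adjusted comparison described above, here is a minimal sketch that fits a logistic regression for ICU admission while controlling for the same four covariates. All data, variable names, and effect sizes are simulated assumptions for illustration, not the study's records:

```python
# Illustrative covariate-adjusted analysis; data and effect sizes are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "prior_ccc": rng.integers(0, 2, n),  # prior "common cold" coronavirus positive (hypothetical)
    "age": rng.normal(60, 15, n),
    "male": rng.integers(0, 2, n),
    "bmi": rng.normal(28, 5, n),
    "diabetes": rng.integers(0, 2, n),
})
# Simulate ICU admission with lower odds for prior exposure (assumed effect).
log_odds = -2 + 0.03 * (df.age - 60) + 0.4 * df.diabetes - 0.7 * df.prior_ccc
df["icu"] = (rng.random(n) < 1 / (1 + np.exp(-log_odds))).astype(int)

model = smf.logit("icu ~ prior_ccc + age + male + bmi + diabetes", data=df).fit(disp=0)
print("adjusted odds ratio:", np.exp(model.params["prior_ccc"]))
print(np.exp(model.conf_int().loc["prior_ccc"]))  # 95% confidence interval
```

Exponentiating the fitted coefficient gives an adjusted odds ratio; values below 1 correspond to lower odds of ICU admission for previously exposed patients.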

"Our results show that people with evidence of a previous infection from a "common cold" coronavirus have less severe COVID-19 symptoms," said Manish Sagar, MD, an infectious diseases physician and researcher at Boston Medical Center, associate professor of medicine and microbiology at Boston University School of Medicine and the study's co-corresponding author. Another interesting finding, the authors note, is that immunity may prevent disease (COVID-19) in ways that are different from preventing infection by SARS-CoV-2. This is demonstrated by the fact that the patient groups had similar likelihoods of infection but differing likelihoods of ending up in the ICU or dying.

"People are routinely infected with coronaviruses that are different from SARS-CoV-2, and these study results could help identify patients at lower and greater risk of developing complications after being infected with SARS-CoV-2," said Joseph Mizgerd, ScD, professor of medicine, microbiology, and biochemistry at Boston University School of Medicine who is the study's co-corresponding author. "We hope that this study can be the springboard for identifying the types of immune responses for not necessarily preventing SARS-CoV-2 infection but rather limiting the damage from COVID-19."

Read more at Science Daily

Nitrous oxide emissions pose an increasing climate threat, study finds

 

Rising nitrous oxide (N2O) emissions are jeopardizing the climate goals of the Paris Agreement, according to a major new study by an international team of scientists.

The growing use of nitrogen fertilizers in the production of food worldwide is increasing atmospheric concentrations of N2O -- a greenhouse gas 300 times more potent than carbon dioxide (CO2) that remains in the atmosphere for more than 100 years.

Published today in the journal Nature, the study was led by Auburn University, in the US, and involved scientists from 48 research institutions in 14 countries -- including the University of East Anglia (UEA) in the UK -- under the umbrella of the Global Carbon Project and the International Nitrogen Initiative.

The aim was to produce the most comprehensive assessment to date of all global sources and sinks of N2O. The findings show N2O emissions are increasing faster than any emission scenario developed by the Intergovernmental Panel on Climate Change (IPCC), consistent with greenhouse gas scenarios that lead to global mean temperature increases well above 3°C from pre-industrial levels. The Paris Agreement aims to limit warming to well below 2°C, and ideally to no more than 1.5°C.

The study points to an alarming trend affecting climate change: N2O has risen 20 per cent from pre-industrial levels -- from 270 parts per billion (ppb) in 1750 to 331 ppb in 2018 -- with the fastest growth observed in the last 50 years due to emissions from human activities.

Prof Hanqin Tian, director of the International Center for Climate and Global Change Research at Auburn University's School of Forestry and Wildlife Sciences, co-led the study.

"The dominant driver of the increase in atmospheric nitrous oxide comes from agriculture, and the growing demand for food and feed for animals will further increase global nitrous oxide emissions," said Prof Tian. "There is a conflict between the way we are feeding people and stabilizing the climate."

Like CO2, N2O is a long-lived greenhouse gas and is also currently the most significant human-induced agent depleting the stratospheric ozone layer, which protects Earth from most of the Sun's harmful ultraviolet radiation.

Lead UK author Dr Parvadha Suntharalingam, of UEA's School of Environmental Sciences, said: "This study presents the most comprehensive and detailed picture to date, of N2O emissions and their impact on climate.

"This new analysis identifies the factors driving the steadily increasing atmospheric levels of N2O, and highlights the urgent need to develop effective mitigation strategies if we are to limit global warming and meet climate goals."

The study presents a comprehensive global N2O inventory that incorporates both natural and human-related sources, and accounts for the interaction between nitrogen additions to the earth system and the biochemical processes that control N2O emissions. It covers 21 natural and human-related sectors between 1980 and 2016.

Human-induced emissions, which are dominated by nitrogen additions to croplands, increased by 30 per cent over the past four decades to 7.3 teragrams of nitrogen per year.
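For scale, figures like these can be translated into CO2-equivalent terms with a back-of-the-envelope calculation. The sketch below uses the roughly 300-fold potency quoted earlier as the global warming potential, which is an approximation rather than the study's own accounting:

```python
# Back-of-the-envelope CO2-equivalent of human-induced N2O emissions.
emissions_tg_n = 7.3           # teragrams of nitrogen per year (figure from the study)
n2o_per_n = 44.013 / 28.014    # molar-mass ratio: convert mass of N to mass of N2O
gwp_n2o = 300                  # approximate 100-year warming potency (assumption)

emissions_tg_n2o = emissions_tg_n * n2o_per_n
co2e_gt_per_yr = emissions_tg_n2o * gwp_n2o / 1000   # 1 gigatonne = 1000 teragrams
print(f"{emissions_tg_n2o:.1f} Tg N2O/yr ~ {co2e_gt_per_yr:.1f} Gt CO2-equivalent/yr")
```

On these rough assumptions, agricultural nitrogen additions alone contribute a few billion tonnes of CO2-equivalent per year.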

The analysis also reveals an emerging N2O-climate 'feedback' resulting from interactions between nitrogen additions to crops for food production and global warming, further enhancing emissions derived from agriculture.

The study found that the largest contributors to global N2O emissions come from East Asia, South Asia, Africa and South America. Emissions from synthetic fertilizers dominate releases in China, India and the US, while emissions from the application of livestock manure as fertilizer dominate releases in Africa and South America. The highest growth rates in emissions are in emerging economies, particularly Brazil, China and India, where crop production and livestock numbers have increased.

However, N2O emissions in Europe decreased in agriculture and the chemical industry. This was due to a combination of factors, including voluntary measures to remove N2O from flue gases in the nylon industry, the introduction of an emissions trading scheme, and a move in many Western European countries toward more efficient fertilizer use to reduce environmental impacts such as pollution of groundwater and surface water. Policies on nitrogen fertilizer usage were also introduced.

Study co-leader Dr Josep 'Pep' Canadell, of the Commonwealth Scientific and Industrial Research Organisation (CSIRO) in Australia, is executive director of the Global Carbon Project. He said: "This new analysis calls for a full-scale rethink in the ways we use and abuse nitrogen fertilizers globally and urges us to adopt more sustainable practices in the way we produce food, including the reduction of food waste."

Read more at Science Daily

Oct 9, 2020

Oldest monkey fossils outside of Africa found

Three fossils found in a lignite mine in southeastern Yunnan Province, China, are about 6.4 million years old, indicate monkeys existed in Asia at the same time as apes, and are probably the ancestors of some of the modern monkeys in the area, according to an international team of researchers.

"This is significant because they are some of the very oldest fossils of monkeys outside of Africa," said Nina G. Jablonski, Evan Pugh University Professor of Anthropology, Penn State. "It is close to or actually the ancestor of many of the living monkeys of East Asia. One of the interesting things from the perspective of paleontology is that this monkey occurs at the same place and same time as ancient apes in Asia."

The researchers, who included Jablonski and long-time collaborator Xueping Ji, department of paleoanthropology, Yunnan Institute of Cultural Relics and Archaeology, Kunming, China, studied the fossils unearthed from the Shuitangba lignite mine that has yielded many fossils. They report that "The mandible and proximal femur were found in close proximity and are probably of the same individual," in a recent issue of the Journal of Human Evolution. Also uncovered slightly lower was a left calcaneus -- heel bone -- reported by Dionisios Youlatos, Aristotle University of Thessaloniki, Greece, in another paper online in the journal, that belongs to the same species of monkey, Mesopithecus pentelicus.

"The significance of the calcaneus is that it reveals the monkey was well adapted for moving nimbly and powerfully both on the ground and in the trees," said Jablonski. "This locomotor versatility no doubt contributed to the success of the species in dispersing across woodland corridors from Europe to Asia."

The lower jawbone and upper portion of the leg bone indicate that the individual was female, according to the researchers. They suggest that these monkeys were probably "jacks of all trades" able to navigate in the trees and on land. The teeth indicate they could eat a wide variety of plants, fruits and flowers, while apes eat mostly fruit.

"The thing that is fascinating about this monkey, that we know from molecular anthropology, is that, like other colobines (Old World monkeys), it had the ability to ferment cellulose," said Jablonski. "It had a gut similar to that of a cow."

These monkeys are successful because they can eat low-quality food high in cellulose and obtain sufficient energy by fermenting the food and using the fatty acids that the bacteria make available. A similar pathway is used by ruminant animals like cows, deer and goats.

"Monkeys and apes would have been eating fundamentally different things," said Jablonski. "Apes eat fruits, flowers, things easy to digest, while monkeys eat leaves, seeds and even more mature leaves if they have to. Because of this different digestion, they don't need to drink free water, getting all their water from vegetation."

These monkeys do not have to live near bodies of water and can survive periods of dramatic climatic change.

"These monkeys are the same as those found in Greece during the same time period," said Jablonski. "Suggesting they spread out from a center somewhere in central Europe and they did it fairly quickly. That is impressive when you think of how long it takes for an animal to disperse tens of thousands of kilometers through forest and woodlands."

While there is evidence that the species began in Eastern Europe and moved out from there, the researchers say the exact patterns are unknown, but they do know the dispersal was rapid, in evolutionary terms. During the end of the Miocene when these monkeys were moving out of Eastern Europe, apes were becoming extinct or nearly so, everywhere except in Africa and parts of Southeast Asia.

Read more at Science Daily

Signals from distant stars connect optical atomic clocks across Earth for the first time

 Using radio telescopes observing distant stars, scientists have connected optical atomic clocks on different continents. The results were published in the scientific journal Nature Physics by an international collaboration between 33 astronomers and clock experts at the National Institute of Information and Communications Technology (NICT, Japan), the Istituto Nazionale di Ricerca Metrologica (INRIM, Italy), the Istituto Nazionale di Astrofisica (INAF, Italy), and the Bureau International des Poids et Mesures (BIPM, France).

The BIPM in Sèvres near Paris routinely calculates the international time recommended for civil use (UTC, Coordinated Universal Time) from the comparison of atomic clocks via satellite communications. However, the satellite connections that are essential to maintaining a synchronized global time have not kept up with the development of new atomic clocks: optical clocks that use lasers interacting with ultracold atoms to give a very refined ticking. "To take the full benefit of optical clocks in UTC, it is important to improve worldwide clock comparison methods," said Gérard Petit, physicist at the Time Department at BIPM.

In this new research, highly energetic extragalactic radio sources replace satellites as the source of reference signals. The group of SEKIDO Mamoru at NICT designed two special radio telescopes, one deployed in Japan and the other in Italy, to realize the connection using the technique of Very Long Baseline Interferometry (VLBI). These telescopes are capable of observations over a large bandwidth, while antenna dishes of just 2.4 meters in diameter keep them transportable. "We want to show that broadband VLBI has potential to be a powerful tool not only for geodesy and astronomy, but also for metrology," commented SEKIDO. To reach the required sensitivity, the small antennas worked in tandem with a larger 34 m radio telescope in Kashima, Japan during the measurements taken from October 14, 2018 to February 14, 2019. For the Kashima radio telescope, these were among the last observations before the telescope was irreparably damaged by Typhoon Faxai in September 2019.

The goal of the collaboration was to connect two optical clocks in Italy and Japan, separated by a baseline distance of 8700 km. These clocks load hundreds of ultra-cold atoms in an optical lattice, an atomic trap engineered with laser light. The clocks use different atomic species: ytterbium for the clock at INRIM and strontium at NICT. Both are candidates for a future redefinition of the second in the International System of Units (SI). "Today, the new generation of optical clocks is pushing to review the definition of the second. The road to a redefinition must face the challenge of comparing clocks globally, at the intercontinental scale, with better performances than today," said Davide Calonico, head of the "Quantum Metrology and Nanotechnology" division and coordinator of the research at INRIM.

The connection is possible by observing quasars billions of light-years away: radio sources powered by black holes weighing millions of solar masses, but so distant that they can be considered fixed points in the sky. The telescopes aim at a different quasar every few minutes to compensate for the effects of the atmosphere. "We observed the signal not from satellites, but from cosmic radio sources," commented IDO Tetsuya, director of the "Space-Time Standards Laboratory" and coordinator of the research at NICT. "VLBI may allow us in Asia to access the UTC relying on what we can prepare by ourselves," IDO added.
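The principle can be sketched in a few lines: a plane wavefront from a distant quasar reaches the two stations with a purely geometric delay of at most the baseline length divided by the speed of light, and any residual, unmodeled delay in the correlated signal reflects the offset between the two local clocks. The geometry and numbers below are illustrative, not the experiment's actual configuration:

```python
# Toy VLBI time transfer: recover a clock offset as the residual between the
# measured and geometrically modeled delay. All numbers are illustrative.
import numpy as np

c = 299_792_458.0                            # speed of light, m/s
baseline = np.array([6.5e6, 5.0e6, 3.0e6])   # station B minus station A, metres (~8700 km)
source = np.array([0.3, 0.5, 0.81])          # unit vector toward the quasar
source /= np.linalg.norm(source)

modeled_delay = baseline @ source / c        # what the geometry predicts
measured_delay = modeled_delay + 2.5e-9      # what the correlator reports (toy value)

clock_offset = measured_delay - modeled_delay
print(f"maximum geometric delay: {np.linalg.norm(baseline) / c * 1e3:.1f} ms")
print(f"recovered clock offset:  {clock_offset * 1e9:.2f} ns")
```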

Antennas like the transportable ones used in these measurements can be installed directly at the laboratories developing optical clocks around the world. According to SEKIDO, "a global optical clock network connected by VLBI may be realized by collaboration between the international communities of metrology and geodesy, just like the broadband VLBI network of the VLBI Global Observing System (VGOS) has already been established," while Petit commented: "Waiting for long-distance optical links, this research shows that there is still much to gain from radio links, where VLBI with transportable antennas can complement the Global Navigation Satellite Systems and telecommunication satellites."

Read more at Science Daily

Bone Loss: Perforated bone tissue from too little sugar

 Could something as simple as a certain type of sugar water be medicine for perforated bones, and even bone marrow cancer itself?

Inside our bodies are some jellyfish-like cells that actually eat away at our bones. Every year, they eat about ten per cent of the bone mass in our body. Fortunately, other cells usually follow and build up new bone.

We undergo a kind of continuous remodelling and repair that enables most of us to traipse around with strong, dependable legs and arms.

In people with bone marrow cancer, the bone-eating cells run amok. They become too numerous and eat too much. The bone-building gang doesn't have time to rebuild the bone mass, despite overtime and long shifts. Bone tissue gets gobbled up.

People with bone marrow cancer often end up with perforated bones, a condition that is very painful to live with. They sometimes experience collapsed vertebrae or suffer broken bones just from turning over in bed.

For decades, scientists around the world have been scratching their heads and wondering what the cause could be. Various theories have been launched, but researchers have not reached a consensus on the main cause.

Bone marrow cancer remains incurable: available treatments can prolong life, but they cannot cure the disease.

Now Therese Standal and her research group at the Centre of Molecular Inflammation Research (CEMIR) at the Norwegian University of Science and Technology (NTNU) have discovered a piece of the puzzle that looks very promising.

They have come to the conclusion that the cause of the bone destruction is too little sugar. We're not talking about the sugar we eat in our cakes and biscuits, but sugar that resides in a substance that is important for the immune system.

To get to the bottom of how sugar is related to bone loss, we need to get into the bone marrow. This is the soft tissue that fills the cavity inside our bones.

Within the bone marrow are plasma cells. When bacteria or viruses enter the body, the plasma cells begin their job of getting rid of the invaders. They produce antibodies, which are sent out via the blood, ready to do battle.

So far so good, but in people with bone marrow cancer, far too much of one type of antibody is produced. Things run amok here, too. The antibody that the cancer makes is also completely useless. It doesn't knock out either the cold or the flu but just takes up too much space and displaces other types of antibodies.

"I thought simply. If people with bone marrow cancer have too much of the antibody and too many bone-eating cells, then they must be connected," Standal says.

The search for an answer gobbled a lot of her working hours for almost five years. The hard work was fortunately not in vain, and has led to a completely new and fundamental understanding.

This is how Standal arrived at the answer:

The vast majority of patients with bone marrow cancer develop perforated bones, but not all. Standal asked nicely, and received samples from patients with bone loss. She also took samples from patients without this kind of bone loss.

The researchers extracted antibodies from the samples and cultured bone-eating cells in the laboratory.

When Standal exposed the bone-eating cells to antibodies from the patients with bone perforations, she discovered that the number of bone-eating cells increased.

When she exposed the bone-eating cells to antibodies from the patients without bone perforations, she discovered that the number of bone-eating cells did not increase.

"Why that was the case became the next interesting thing to figure out," Standal says.

The antibody carries a type of sugar that "decorates" it, in a way. The sugar has an effect on how the antibody works. Standal found her way to Manfred Wuhrer at the Center for Proteomics and Metabolomics of the Leiden University Medical Center in the Netherlands. He is a specialist in this type of sugar, and Standal sent the samples to him.

He found that individuals with bone loss were missing two sugar molecules at the end of a long chain inside the antibody.

"There was too little sugar," says Standal.

But this answer wasn't sufficient, either.

Although a difference was detected between the two groups, the researchers could not confirm that the missing sugar molecules were the reason patients developed more bone-eating cells. Several further experiments had to be conducted.

The research team went to the lab and put more sugar on the antibody. This did not lead to more bone-eating cells. Standal also did the opposite, removing sugar from the antibody. This did lead to more bone-eating cells.

The researchers then had sufficient test results to show that too little sugar can be decisive for the number of bone-eating cells. But this is not enough in medical research -- at least not if the goal is to use the knowledge to make medicine for humans.

The next step involved animal experiments with mice that have bone marrow cancer. The mice were divided into two groups and were given two different types of sugar water. In theory, one type of sugar water would lead to more sugar on the antibody.

"The theory actually worked. The mice that received this type of sugar water had smaller perforations in their bone tissue. They also developed less cancer," says Standal.

Now she has to carry out more animal experiments to move forward on the path towards a treatment that can give patients with bone marrow cancer a better life.

Read more at Science Daily

Genomic study reveals evolutionary secrets of banyan tree

 

The banyan fig tree Ficus microcarpa is famous for its aerial roots, which sprout from branches and eventually reach the soil. The tree also has a unique relationship with a wasp that has coevolved with it and is the only insect that can pollinate it.

In a new study, researchers identify regions in the banyan fig's genome that promote the development of its unusual aerial roots and enhance its ability to signal its wasp pollinator.

The study, published in the journal Cell, also identifies a sex-determining region in a related fig tree, Ficus hispida. Unlike F. microcarpa, which produces aerial roots and bears male and female flowers on the same tree, F. hispida produces distinct male and female trees and no aerial roots.

Understanding the evolutionary history of Ficus species and their wasp pollinators is important because their ability to produce large fruits in a variety of habitats makes them a keystone species in most tropical forests, said Ray Ming, a plant biology professor at the University of Illinois, Urbana-Champaign who led the study with Jin Chen, of the Chinese Academy of Sciences. Figs are known to sustain at least 1,200 bird and mammal species. Fig trees were among the earliest domesticated crops and appear as sacred symbols in Hinduism, Buddhism and other spiritual traditions.

The relationship between figs and wasps also presents an intriguing scientific challenge. The body shapes and sizes of the wasps correspond exactly to those of the fig fruits, and each species of fig produces a unique perfume to attract its specific wasp pollinator.

To better understand these evolutionary developments, Ming and his colleagues analyzed the genomes of the two fig species, along with that of a wasp that pollinates the banyan tree.

"When we sequenced the trees' genomes, we found more segmental duplications in the genome of the banyan tree than in F. hispida, the fig without the aerial roots," Ming said. "Those duplicated regions account for about 27% of the genome."

The duplications increased the number of genes involved in the synthesis and transport of auxins, a class of hormones that promote plant growth. The duplicated regions also contained genes involved in plant immunity, nutrition and the production of volatile organic compounds that signal pollinators.

"The levels of auxin in the aerial roots are five times higher than in the leaves of trees with or without aerial roots," Ming said. The elevated auxin levels appear to have triggered aerial root production. The duplicated regions also include genes that code for a light receptor that accelerates auxin production.

When they studied the genome of the fig wasp and compared it with those of other related wasps, the researchers observed that the wasps were retaining and preserving genes for odorant receptors that detect the same smelly compounds the fig trees produce. These genomic signatures are a signal of coevolution between the fig trees and the wasps, the researchers report.

Ming and his colleagues also discovered a Y chromosome-specific gene that is expressed only in male plants of F. hispida and three other fig species that produce separate male and female plants, a condition known as dioecy.

Read more at Science Daily

Oct 8, 2020

Moon's magnetic crust research sees scientists debunk long-held theory

 New international research into the Moon provides scientists with insights as to how and why its crust is magnetised, essentially 'debunking' one of the previous longstanding theories.

Australian researcher and study co-author Dr Katarina Miljkovic, from the Curtin Space Science and Technology Centre, located within the School of Earth and Planetary Sciences at Curtin University, explained how the new research, published by Science Advances, expands on decades of work by other scientists.

"There are two long term hypotheses associated with why the Moon's crust might be magnetic: One is that the magnetisation is the result of an ancient dynamo in the lunar core, and the other is that it's the result of an amplification of the interplanetary magnetic field, created by meteoroid impacts," Dr Miljkovic said.

"Our research is a deep numerical study that challenges that second theory -- the impact-related magnetisation -- and it essentially 'debunks' it. We found that meteoroid impact plasmas interact much more weakly with the Moon compared to the magnetisation levels obtained from the lunar crust.

"This finding leads us to conclude that a core dynamo is the only plausible source of the magnetisation of the Moon's crust."

To carry out her portion of the research, Dr Miljkovic provided the team with numerical estimates of the vapour formation that occurred during large meteoroid impact bombardment on the Moon approximately 4 billion years ago.

"When we look at the Moon with the naked eye, we can see these large craters caused by ancient meteoroid impacts. They are now filled with volcanic maria, or seas, causing them to look darker on the surface," Dr Miljkovic said.

"During these impact events, the meteoroids hit the Moon at a very high speed, causing displacement, melting, and vaporisation of the lunar crust.

"My work calculated the mass and thermal energy of the vapour emitted during these impacts. That was then used as input for further calculations and investigation of the behaviour of the ambient magnetic field at the Moon, following these large impact events.

"Basically, we made a much more inclusive, high fidelity and high-resolution investigation that led to debunking of the older hypothesis."

The study's lead researcher Dr Rona Oran, a research scientist in the Department of Earth, Atmospheric and Planetary Sciences (EAPS) at the Massachusetts Institute of Technology (MIT), said the impact simulations, combined with plasma simulations, harness the latest developments in scientific codes and computing power and allowed the team to perform the first simulations that could realistically capture and test this long-proposed mechanism.

Using such tools was key to allowing the team to look at many different scenarios, and in this way to rule out this mechanism under any feasible conditions that could have existed during the impacts. This refutation could have important implications for determining what did magnetise the Moon, and even other objects in the solar system with unexplained magnetised crusts.

Read more at Science Daily

New solar panel design could lead to wider use of renewable energy

Designing solar panel surfaces in a checkerboard pattern increases their ability to absorb light by 125 per cent, a new study says.

Researchers say the breakthrough could lead to the production of thinner, lighter and more flexible solar panels that could be used to power more homes and be used in a wider range of products.

The study -- led by researchers from the University of York and conducted in partnership with NOVA University of Lisbon (CENIMAT-i3N) -- investigated how different surface designs affected the absorption of sunlight in solar cells, which together form solar panels.

Scientists found that the checkerboard design improved diffraction, which enhanced the probability of light being absorbed; the absorbed light is then used to generate electricity.
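The role of diffraction can be illustrated with the standard grating equation: a periodic surface texture redirects light into discrete orders inside the silicon, and orders steeper than the total-internal-reflection angle stay trapped until absorbed. The period, wavelength and refractive index below are illustrative choices, not the study's parameters:

```python
# Which diffraction orders a periodic texture excites inside silicon,
# from the grating equation n_si * sin(theta_m) = m * wavelength / period.
import numpy as np

wavelength = 1.0e-6   # vacuum wavelength, m (near silicon's band edge)
period = 1.2e-6       # texture period, m (assumed)
n_si = 3.5            # approximate refractive index of silicon

tir_angle = np.degrees(np.arcsin(1 / n_si))   # critical angle, ~16.6 deg for silicon/air
for m in range(1, 6):
    s = m * wavelength / (n_si * period)
    if s <= 1:
        theta = np.degrees(np.arcsin(s))
        trapped = "trapped" if theta > tir_angle else "escapes"
        print(f"order {m}: {theta:5.1f} deg inside the cell -> {trapped}")
```

Light diffracted beyond the total-internal-reflection angle bounces inside the cell until it is absorbed, which is how a surface texture can boost absorption in a thin device.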

The renewable energy sector is constantly looking for new ways to boost the light absorption of solar cells in lightweight materials that can be used in products from roof tiles to boat sails and camping equipment.

Solar grade silicon -- used to create solar cells -- is very energy intensive to produce, so creating slimmer cells and changing the surface design would make them cheaper and more environmentally friendly.

Dr Christian Schuster from the Department of Physics said: "We found a simple trick for boosting the absorption of slim solar cells. Our investigations show that our idea actually rivals the absorption enhancement of more sophisticated designs -- while also absorbing more light deep in the plane and less light near the surface structure itself.

"Our design rule meets all relevant aspects of light-trapping for solar cells, clearing the way for simple, practical, and yet outstanding diffractive structures, with a potential impact beyond photonic applications.

"This design offers potential to further integrate solar cells into thinner, flexible materials and therefore create more opportunity to use solar power in more products."

The study suggests the design principle could have an impact not only in the solar cell and LED sectors but also in applications such as acoustic noise shields, wind-break panels, anti-skid surfaces, biosensing applications and atomic cooling.

Dr Schuster added: "In principle, we would deploy ten times more solar power with the same amount of absorber material: ten times thinner solar cells could enable a rapid expansion of photovoltaics, increase solar electricity production, and greatly reduce our carbon footprint.

"In fact, as refining the silicon raw material is such an energy-intensive process, ten times thinner silicon cells would not only reduce the need for refineries but also cost less, hence empowering our transition to a greener economy."

Read more at Science Daily

Silk fibers improve bioink for 3D-printed artificial tissues and organs

 How do you test, in early-stage research, whether a potential pharmaceutical effectively targets a human tumor, organ, or some other part of the body? How do you grow a new hand or some other body part? Researchers are in the early stages of using 3D cell printing technology to make developments like these happen. A standard way -- currently unavailable -- to fix the cells in place after printing would help researchers avoid having to 'reinvent the wheel' in every new investigation.

In a study recently published in Materials Today Bio, researchers from Osaka University have used silk nanofibers obtained by mechanical disintegration to enhance the printing process without damaging the cells or cell assemblies. An attractive point of silk for this application is that silk is believed to be a safe material for humans. This development will help bring 3D cell printing research out of the laboratory and into real-world biomedical use.

To obtain the fibers, the researchers started with virgin silk, then removed the protein sericin from it because this protein causes inflammation in patients. Next, the researchers ground the remaining biocompatible material into nanofibers. The fibers can be sterilized for medical use with common laboratory equipment, without being damaged.

"Our silk fibers are excellent additives to bioink cell printing media," says lead author Shinji Sakai. "They are compatible with many media, such as those containing gelatin, chitosan, or hyaluronic acid, giving them a broad range of potential applications."

The main purpose of the fibers was to ensure that the cells in the bioink retained their 3D positioning after printing without damaging the cells. The fibers fulfill this purpose by enhancing the integrity of the bioink and minimizing the damaging high mechanical stresses often placed on cells during printing.

"Various mechanical experiments say the same thing: the nanofibers enhanced the properties of the printing media," explains Professor Sakai. "For example, Young's modulus -- a measure of stiffness -- increased several-fold and remained enhanced for over a month."

The fibers help printed configurations retain their structural integrity after printing. For example, a nose-shaped configuration retained its shape only when printed with bioink containing the silk fibers. Over 85% of the cells in the bioink remained alive after a week in the printed bioink with or without the added fibers, indicating that adding the fibers did not damage the cells.

Read more at Science Daily

New research explores how super flares affect planets' habitability

 

Ultraviolet light from giant stellar flares can destroy a planet's habitability. New research from the University of North Carolina at Chapel Hill will help astrobiologists understand how much radiation planets experience during super flares and whether life could exist on worlds beyond our solar system.

Super flares are bursts of energy that are 10 to 1,000 times larger than the biggest flares from the Earth's sun. These flares can bathe a planet in an amount of ultraviolet light large enough to doom the chances of life surviving there.

Researchers from UNC-Chapel Hill have for the first time measured the temperature of a large sample of super flares from stars, and the flares' likely ultraviolet emissions. Their findings, published Oct. 5 ahead of print in Astrophysical Journal, will allow researchers to put limits on the habitability of planets that are targets of upcoming planet-finding missions.

"We found planets orbiting young stars may experience life-prohibiting levels of UV radiation, although some micro-organisms might survive," said lead study author Ward S. Howard, a doctoral student in the Department of Physics and Astronomy at UNC-Chapel Hill.

Howard and colleagues at UNC-Chapel Hill used the UNC-Chapel Hill Evryscope telescope array and NASA's Transiting Exoplanet Survey Satellite (TESS) to simultaneously observe the largest sample of super flares.

The team's research expands upon previous work that has largely focused on flare temperatures and radiation from only a handful of super flares from a few stars. In expanding the research, the team discovered a statistical relationship between the size of a super flare and its temperature. The temperature predicts the amount of radiation that potentially precludes on-surface life.

Super flares typically emit most of their UV radiation during a rapid peak lasting only five to 15 minutes. The simultaneous Evryscope and TESS observations were obtained at two-minute intervals, ensuring multiple measurements were taken during the peak of each super flare.

This is the first time the temperatures of such a large sample of super flares have ever been studied. The frequency of observations allowed the team to determine how long super flares can cook orbiting planets with intense UV radiation.
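Why flare temperature matters so much follows from blackbody physics: the hotter the flare, the larger the fraction of its output that falls in the biologically damaging ultraviolet. The sketch below integrates the Planck function over the UV-C band for a few temperatures; the band limits and temperatures are illustrative choices, not the study's values:

```python
# Fraction of a blackbody's output emitted in the UV-C band (100-280 nm).
import numpy as np
from scipy.integrate import quad
from scipy.constants import h, c, k, sigma

def planck(lam, T):
    """Blackbody spectral radiance at wavelength lam (m) and temperature T (K)."""
    return (2 * h * c**2 / lam**5) / np.expm1(h * c / (lam * k * T))

for T in (6000, 10_000, 20_000):   # Sun-like photosphere vs. hot flare plasma
    uv_band, _ = quad(planck, 100e-9, 280e-9, args=(T,))
    total = sigma * T**4 / np.pi   # Planck function integrated over all wavelengths
    print(f"T = {T:6d} K: {100 * uv_band / total:6.2f}% of output in 100-280 nm")
```

Hotter flares emit a disproportionately large UV fraction, which is why the measured flare temperatures translate directly into estimated radiation doses for orbiting planets.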

The flares observed have already informed the TESS Extended Mission, which aims to discover thousands of exoplanets in orbit around the brightest dwarf stars in the sky. TESS is now targeting high-priority flare stars from the UNC-Chapel Hill sample for more frequent observations.

Read more at Science Daily

Oct 7, 2020

Astronomers turn up the heavy metal to shed light on star formation

 

Astronomers from The University of Western Australia's node of the International Centre for Radio Astronomy Research (ICRAR) have developed a new way to study star formation in galaxies from the dawn of time to today.

"Stars can be thought of as enormous nuclear-powered processing plants," said lead researcher Dr Sabine Bellstedt, from ICRAR.

"They take lighter elements like hydrogen and helium, and, over billions of years, produce the heavier elements of the periodic table that we find scattered throughout the Universe today.

"The carbon, calcium and iron in your body, the oxygen in the air you breathe, and the silicon in your computer all exist because a star created these heavier elements and left them behind," Bellstedt said.

"Stars are the ultimate element factories in the Universe."

Understanding how galaxies formed stars billions of years ago requires the very difficult task of using powerful telescopes to observe galaxies many billions of light-years away in the distant Universe.

However, nearby galaxies are much easier to observe. Using the light from these local galaxies, astronomers can forensically piece together the history of their lives (called their star-formation history). This allows researchers to determine how and when they formed stars in their infancy, billions of years ago, without struggling to observe galaxies in the distant Universe.

Traditionally, astronomers studying star formation histories assumed the overall metallicity -- or amount of heavy elements -- in a galaxy doesn't change over time.

But when they used these models to pinpoint when stars in the Universe should have formed, the results didn't match up with what they were seeing through their telescopes.

"The results not matching up with our observations is a big problem," Bellstedt said. "It tells us we're missing something."

"That missing ingredient, it turns out, is the gradual build-up of heavy metals within galaxies over time."

Using a new algorithm to model the energy and wavelengths of light coming from almost 7000 nearby galaxies, the researchers succeeded in reconstructing when most of the stars in the Universe formed -- in agreement with telescope observations for the first time.

The designer of the new code -- known as ProSpect -- is Associate Professor Aaron Robotham from ICRAR's University of Western Australia node.

"This is the first time we've been able to constrain how the heavier elements in galaxies change over time based on our analysis of these 7000 nearby galaxies," Robotham said.

"Using this galactic laboratory on our own doorstep gives us lots of observations to test this new approach, and we're very excited that it works.

"With this tool, we can now dissect nearby galaxies to determine the state of the Universe and the rate at which stars form and mass grows at any stage over the past 13 billion years.

"It's absolutely mind-blowing stuff."

This work also confirms an important theory about when most of the stars in the Universe formed.

"Most of the stars in the Universe were born in extremely massive galaxies early on in cosmic history -- around three to four billion years after the Big Bang," Bellstedt said.

"Today, the Universe is almost 14 billion years old, and most new stars are being formed in much smaller galaxies."

Based on this research, the next challenge for the team will be to expand the sample of galaxies being studied using this technique, in an effort to understand when, where and why galaxies die and stop forming new stars.

Read more at Science Daily

There's a reason bacteria stay in shape

 Fat bacteria? Skinny bacteria? From our perspective on high, they all seem to be about the same size. In fact, they are.

Precisely why has been an open question, according to Rice University chemist Anatoly Kolomeisky, who now has a theory.

A primal mechanism in bacteria that keeps them in their personal Goldilocks zones -- that is, just right -- appears to depend on two random means of regulation, growth and division, that cancel each other out. The same mechanism may give researchers a new perspective on disease, including cancer.

The "minimal model" by Kolomeisky, Rice postdoctoral researcher and lead author Hamid Teimouri and Rupsha Mukherjee, a former research assistant at Rice now at the Indian Institute of Technology Gandhinagar, appears in the American Chemical Society's Journal of Physical Chemistry Letters.

"Everywhere we see bacteria, they more or less have the same sizes and shapes," Kolomeisky said. "It's the same for the cells in our tissues. This is a signature of homeostasis, where a system tries to have physiological parameters that are almost the same, like body temperature or our blood pressure or the sugar level in our blood.

"Nature likes to have these parameters in a very narrow range so that living systems can work the most efficiently," he said. "Deviations from these parameters are a signature of disease."

Bacteria are models of homeostasis, sticking to a narrow distribution of sizes and shapes. "But the explanations we have so far are not good," Kolomeisky said. "As we know, science does not like magic. But something like magic -- thresholds -- is proposed to explain it."

For bacteria, he said, there is no threshold. "Essentially, there's no need for one," he said. "There are a lot of underlying biochemical processes, but they can be roughly divided into two stochastic chemical processes: growth and division. Both are random, so our problem was to explain why these random phenomena lead to a very deterministic outcome."

The Rice lab specializes in theoretical modeling that explains biological phenomena including genome editing, antibiotic resistance and cancer proliferation. Teimouri said the highly efficient chemical coupling between growth and division in bacteria was far easier to model.

"We assumed that, at typical proliferation conditions, the number of division and growth protein precursors are always proportional to the cell size," he said. T

he model predicts when bacteria will divide, allowing them to optimize their function. The researchers said it agrees nicely with experimental observations and noted manipulating the formula to knock bacteria out of homeostasis proved their point. Increasing the theoretical length of post-division bacteria, they said, simply leads to faster rates of division, keeping their sizes in check.

"For short lengths, growth dominates, again keeping the bacteria to the right size," Kolomeisky said.

The same theory doesn't necessarily apply to larger organisms, he said. "We know that in humans, there are many other biochemical pathways that might regulate homeostasis, so the problem is more complex."

However, the work may give researchers new perspective on the proliferation of diseased cells and the mechanism that forces, for instance, cancer cells to take on different shapes and sizes.

Read more at Science Daily

Mammals share gene pathways that allow zebrafish to grow new eyes

 Working with fish, birds and mice, Johns Hopkins Medicine researchers report new evidence that some animals' natural capacity to regrow neurons is not missing, but is instead inactivated in mammals. Specifically, the researchers found that some genetic pathways that allow many fish and other cold-blooded animals to repair specialized eye neurons after injury remain present in mammals as well, but are turned off, blocking regeneration and healing.

A description of the study, published online by the journal Science on Oct. 1, offers a better understanding of how genes that control regeneration are conserved across species, as well as how they function. This may help scientists develop ways to grow cells that are lost due to hereditary blindness and other neurodegenerative diseases.

"Our research overall indicates that the potential for regeneration is there in mammals, including humans, but some evolutionary pressure has turned it off," says Seth Blackshaw, Ph.D., professor of neuroscience at the Johns Hopkins University School of Medicine. "In fact, regeneration seems to be the default status, and the loss of that ability happened at multiple points on the evolutionary tree," he says.

For the study, Blackshaw's team focused on supportive cells in the back of the eye. In zebrafish, a standard laboratory model whose genome has been well defined, these cells, known as Müller glia, respond to damage and repair the light-sensitive retina by growing new central nervous system cells called neurons. In addition to regrowing eye tissue, zebrafish's regenerative abilities extend to other body parts, including fins, tails and some internal organs.

The retina is a good testing ground for mapping genetic activity, explains Blackshaw, because it contains structures common to other cells in the nervous system. In previous studies, moreover, scientists have found that the genetic networks in the retina are well conserved across species, so comparisons among fish, birds, mice and even humans are possible.

For the new experiments, the Johns Hopkins researchers created retinal injuries in zebrafish, chickens and mice. Then they used high-powered microscopes and a previously developed gene mapping tool to observe how the supportive Müller glia cells responded.

Blackshaw said the team was surprised to find, immediately after the injury, that the cells in each of the three species behaved the same way: They entered an "active state" characterized by the activation of specific genes, some of which control inflammation.

This active state, says Blackshaw, primarily helps to contain the injury and send signals to immune system cells to combat foreign invaders such as bacteria, or to clean up broken tissue.

Beyond that step, however, the species' responses diverged.

In zebrafish, active Müller glia began turning on a network of transcription factors that control which genes are 'on' and 'off.' In the current experiment, the NFI transcription factors activated genes that are linked to cell maturity, sending the Müller glia cells back in developmental time to a more primitive state, which then allows them to develop into many different cell types. The Müller glia then "differentiated" into new cells to replace the ones lost to injury.

In contrast, the research team saw that chickens with damaged retinas activate only some of the transcription factor 'gene control switches' that are turned on in zebrafish. Thus, chickens have much less capability to create new Müller glia and other neurons in the eye following injury.

Finally, the researchers looked at the injury response in mice. Mice share the vast majority of their DNA with humans, and their eyes are similar to human eyes. The researchers found that injured Müller glia in mice remained in the first "active" state for several days, much longer than the eight to 12 hours that zebrafish are in this state, and yet never acquired the ability to make new neurons.

Müller glia in all three species also express high levels of nuclear factor I (NFI) transcription factors, but rapidly turn them off following injury. In mice, however, the NFI genes are turned back on soon thereafter, and actively block the Müller glia from generating neurons.

The researchers found, to their surprise, they say, that the same genes that allowed the zebrafish cells to regenerate were "primed and ready to go" in the mouse eye, but that the "on" transcription factor was never activated. Instead, the NFI factors actively block the cells' regenerative potential.

Blackshaw suspects that animals with a higher potential to develop disease in brain and other neurological tissue may have lost this capability over evolutionary time to help protect and stabilize other brain cells. "For example, we know that certain viruses, bacteria and even parasites can infect the brain. It could be disastrous if infected brain cells were allowed to grow and spread the infection through the nervous system," says Blackshaw.

Read more at Science Daily

Nobel Prize in Chemistry 2020: CRISPR/Cas9 method for genome editing


The Royal Swedish Academy of Sciences has decided to award the Nobel Prize in Chemistry 2020 to Emmanuelle Charpentier, Max Planck Unit for the Science of Pathogens, Berlin, Germany, and Jennifer A. Doudna, University of California, Berkeley, USA "for the development of a method for genome editing."

Genetic scissors: a tool for rewriting the code of life

Emmanuelle Charpentier and Jennifer A. Doudna have discovered one of gene technology's sharpest tools: the CRISPR/Cas9 genetic scissors. Using these, researchers can change the DNA of animals, plants and microorganisms with extremely high precision. This technology has had a revolutionary impact on the life sciences, is contributing to new cancer therapies and may make the dream of curing inherited diseases come true.

Researchers need to modify genes in cells if they are to find out about life's inner workings. This used to be time-consuming, difficult and sometimes impossible work. Using the CRISPR/Cas9 genetic scissors, it is now possible to change the code of life over the course of a few weeks.

"There is enormous power in this genetic tool, which affects us all. It has not only revolutionised basic science, but also resulted in innovative crops and will lead to ground-breaking new medical treatments," says Claes Gustafsson, chair of the Nobel Committee for Chemistry.

As so often in science, the discovery of these genetic scissors was unexpected. During Emmanuelle Charpentier's studies of Streptococcus pyogenes, one of the bacteria that cause the most harm to humanity, she discovered a previously unknown molecule, tracrRNA. Her work showed that tracrRNA is part of bacteria's ancient immune system, CRISPR/Cas, that disarms viruses by cleaving their DNA.

Charpentier published her discovery in 2011. The same year, she initiated a collaboration with Jennifer Doudna, an experienced biochemist with vast knowledge of RNA. Together, they succeeded in recreating the bacteria's genetic scissors in a test tube and simplifying the scissors' molecular components so they were easier to use.

In an epoch-making experiment, they then reprogrammed the genetic scissors. In their natural form, the scissors recognise DNA from viruses, but Charpentier and Doudna proved that they could be controlled so that they can cut any DNA molecule at a predetermined site. Where the DNA is cut, it is then easy to rewrite the code of life.
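In practice, "a predetermined site" means any stretch of DNA lying next to the short motif Cas9 requires: for the Streptococcus pyogenes enzyme, an "NGG" immediately downstream of a 20-letter target, with the cut falling about 3 base pairs upstream of that motif. The sketch below scans a made-up sequence for candidate target sites; it illustrates the targeting rule rather than any real guide-design software:

```python
# Scan a DNA string for SpCas9 target sites: a 20-nt protospacer followed by
# an "NGG" PAM, with the blunt cut ~3 bp upstream of the PAM. Toy sequence.
import re

def find_cas9_sites(dna: str, guide_len: int = 20):
    """Yield (protospacer, cut_position) for every NGG PAM with room for a guide."""
    dna = dna.upper()
    for m in re.finditer(r"(?=([ACGT]GG))", dna):   # lookahead finds overlapping PAMs
        pam_start = m.start(1)
        if pam_start >= guide_len:
            protospacer = dna[pam_start - guide_len:pam_start]
            yield protospacer, pam_start - 3

example = "TTGACGTACCGGAATTCGATCGTACGGTTAGCATGCAATGG"   # made-up sequence
for guide, cut in find_cas9_sites(example):
    print(f"guide {guide} -> cut before position {cut}")
```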

Since Charpentier and Doudna discovered the CRISPR/Cas9 genetic scissors in 2012, their use has exploded. This tool has contributed to many important discoveries in basic research, and plant researchers have been able to develop crops that withstand mould, pests and drought. In medicine, clinical trials of new cancer therapies are underway, and the dream of being able to cure inherited diseases is about to come true. These genetic scissors have taken the life sciences into a new epoch and, in many ways, are bringing the greatest benefit to humankind.

Read more at Science Daily

Oct 6, 2020

6,500-year-old copper workshop uncovered in the Negev Desert's Beer Sheva

 A new study by Tel Aviv University and the Israel Antiquities Authority indicates that a workshop for smelting copper ore once operated in the Neveh Noy neighborhood of Beer Sheva, the capital of the Negev Desert. The study, conducted over several years, began in 2017 in Beer Sheva when the workshop was first uncovered during an Israel Antiquities Authority emergency archeological excavation to safeguard threatened antiquities.

The new study also shows that the site may have made the first use in the world of a revolutionary apparatus: the furnace.

The study was conducted by Prof. Erez Ben-Yosef, Dana Ackerfeld, and Omri Yagel of the Jacob M. Alkow Department of Archeology and Ancient Near Eastern Civilizations at Tel Aviv University, in conjunction with Dr. Yael Abadi-Reiss, Talia Abulafia, and Dmitry Yegorov of the Israel Antiquities Authority and Dr. Yehudit Harlavan of the Geological Survey of Israel. The results of the study were published online on September 25, 2020, in the Journal of Archaeological Science: Reports.

According to Ms. Abulafia, Director of the excavation on behalf of the Israel Antiquities Authority, "The excavation revealed evidence for domestic production from the Chalcolithic period, about 6,500 years ago. The surprising finds include a small workshop for smelting copper with shards of a furnace -- a small installation made of clay in which copper ore was smelted -- as well as a lot of copper slag."

Although metalworking was already in evidence in the Chalcolithic period, the tools used were still made of stone. (The word "chalcolithic" itself is a combination of the Greek words for "copper" and "stone.") An analysis of the isotopes of ore remnants in the furnace shards shows that the raw ore was brought to the Neveh Noy neighborhood from Wadi Faynan, located in present-day Jordan, a distance of more than 100 kilometers from Beer Sheva.

During the Chalcolithic period, when copper was first refined, the process was carried out far from the mines, unlike the prevalent historical model in which furnaces were built near the mines for both practical and economic reasons. The scientists hypothesize that the reason was to keep the technology secret.

"It's important to understand that the refining of copper was the high-tech of that period. There was no technology more sophisticated than that in the whole of the ancient world," Prof. Ben-Yosef says. "Tossing lumps of ore into a fire will get you nowhere. You need certain knowledge for building special furnaces that can reach very high temperatures while maintaining low levels of oxygen."

Prof. Ben-Yosef notes that the archeology of the land of Israel shows evidence of the Ghassulian culture. The culture was named for Tulaylât al-Ghassûl, the archeological site in Jordan where the culture was first identified. This culture, which spanned the region from the Beer Sheva Valley to present-day southern Lebanon, was unusual for its artistic achievements and ritual objects, as evidenced by the copper objects discovered at Nahal Mishmar and now on display at the Israel Museum in Jerusalem.

According to Prof. Ben-Yosef, the people who lived in the area of the copper mines traded with members of the Ghassulian culture from Beer Sheva and sold them the ore, but they were themselves incapable of reproducing the technology. Even among the Ghassulian settlements along Wadi Beer Sheva, copper was refined by experts in special workshops. A chemical analysis of remnants indicates that every workshop had its own special "recipe" which it did not share with its competitors. It would seem that, in that period, Wadi Beer Sheva was filled with water year-round, making the location convenient for smelting copper where the furnaces and other apparatus were made of clay.

Prof. Ben-Yosef further notes that, even within Chalcolithic settlements that possessed both stone and copper implements, the secret of the gleaming metal was held by the very few members of an elite. "At the beginning of the metallurgical revolution, the secret of metalworking was kept by guilds of experts. All over the world, we see metalworkers' quarters within Chalcolithic settlements, like the neighborhood we found in Beer Sheva."

The study discusses the question of the extent to which this society was hierarchical or socially stratified, as society was not yet urbanized. The scientists feel that the findings from Neveh Noy strengthen the hypothesis of social stratification. Society seems to have consisted of a clearly defined elite possessing expertise and professional secrets, which preserved its power by being the exclusive source for the shiny copper. The copper objects were not made to be used, instead serving some ritual purpose and thus possessing symbolic value. The copper axe, for example, wasn't used as an axe. It was an artistic and/or cultic object modeled along the lines of a stone axe. The copper objects were probably used in rituals while the everyday objects in use continued to be of stone.

"At the first stage of humankind's copper production, crucibles rather than furnaces were used," says Prof. Ben-Yosef. "This small pottery vessel, which looks like a flower pot, is made of clay. It was a type of charcoal-based mobile furnace. Here, at the Neveh Noy workshop that the Israel Antiquities Authority uncovered, we show that the technology was based on real furnaces. This provides very early evidence for the use of furnaces in metallurgy and it raises the possibility that the furnace was invented in this region.

Read more at Science Daily

Trans-Neptunian object Arrokoth: Flattening of a snowman

 The trans-Neptunian object Arrokoth, also known as Ultima Thule, which NASA's space probe New Horizons passed on New Year's Day 2019, may have changed its shape significantly in the first 100 million years since its formation. In today's issue of the journal Nature Astronomy, researchers led by the Chinese Academy of Sciences and the Max Planck Institute for Solar System Research (MPS) suggest that the current shape of Arrokoth, which resembles a flattened snowman, could be of evolutionary origin due to volatile outgassing. Their calculations help to understand what the current state of bodies from the edge of the Solar System may teach us about their original properties.

The many millions of bodies populating the Kuiper Belt beyond Neptune's orbit are yet to reveal many of their secrets. NASA's spacecraft New Horizons sent the first images from the outermost edge of the solar system to Earth: of the dwarf planet Pluto in the summer of 2015, and, three and a half years later, of the trans-Neptunian object Arrokoth, about 30 kilometers in size. Not yet officially named at the time, the body was nicknamed Ultima Thule, in reference to the northernmost land point on Earth. It is, after all, the body furthest from the Sun ever visited and imaged by a human-made probe.

Arrokoth's strange shape in particular caused a sensation in the days after the fly-by. The body is a contact binary, believed to be the result of a low-velocity merger of two separate bodies that formed close together. It is composed of two connected lobes, of which the smaller one is slightly flattened and the larger one strongly so, creating the impression of a squashed snowman. In their current publication, the researchers from China, Germany, and the USA investigate how this shape came to be. A pronounced bi-lobed shape is also known from some comets. However, there is no other known body that is as flat as Arrokoth. Did Arrokoth already look like this when it was created? Or did its shape develop gradually?

"We like to think of the Kuiper Belt as a region where time has more or less stood still since the birth of the Solar System," explains Dr. Ladislav Rezac from MPS, one of the two first authors of the current publication. More than four billion kilometers away from the Sun, the bodies of the Kuiper Belt have remained frozen and unchanged, so is the common belief. New Horizon's images of Arrokoth challenge this idea by its apparently smooth surface without signs of frequent cratering events and by its peculiar, flattened shape. Scientists assume that the Solar System was formed 4.6 billion years ago from a disk of dust: the particles from this nebula agglomerated into ever larger clumps; these clumps collided and merged into even larger bodies. "There is as yet no explanation as to how a body as flat as Arrokoth could emerge from this process," says Rezac.

Another possibility would be that Arrokoth had a more ordinary shape to begin with. It may have started as a merger between a spherical and an oblate body at the time of its creation and only gradually become flattened. Earlier studies suggest that during the formation of the Solar System, the region where Arrokoth is located could have been a distinct environment in the cold, dust-shaded mid-plane of the outer nebula. The low temperatures enabled volatiles such as carbon monoxide and methane to freeze onto dust grains and be incorporated into planetesimals. When the nebular dust cleared after Arrokoth's formation, solar illumination would have raised its temperature and hence rapidly driven off the condensed volatiles. Arrokoth's strange shape would then be a natural outcome of a favorable combination of its large obliquity, small eccentricity and the variation of its mass-loss rate with solar flux, resulting in nearly symmetric erosion between the northern and southern hemispheres.

"For a body to change its shape as extremely as Arrokoth, its rotational axis needs to be oriented in a special way," Rezac explains. Unlike Earth's rotational axis, Arrokoth's is almost parallel to the orbital plane. During its 298 year orbit around the Sun, one polar region of Arrokoth faces the Sun continuously for nearly half the time while the other faces away. Regions at equator and lower latitudes are dominated by diurnal variations year round. "This causes the poles to heat up the most, so that frozen gases escape from there most efficiently resulting in a strong mass loss," says Dr. Yuhui Zhao from the Purple Mountain Observatory of the Chinese Academy of Sciences. The flattening process most likely occurred early in the evolution history of the body and proceeded rather quickly on a timescale of about one to 100 million years during the presence of super volatile ices in the near subsurface layers. In addition, the scientists self-consistently demonstrated that the induced torques would play a negligible role in the planetesimal's spin state change during the mass loss phase.

Read more at Science Daily

Dog brains do not prefer faces

 Even though dogs gaze into man's eyes, dog brains may not process faces as human brains do. A new study from JNeurosci suggests that the canine visual system is organized differently: the face network found in primates may not extend to all mammals.

Faces constitute a critical part of communication for humans and other primates, so much so that faces have a special status in their visual system. Areas in the face network, like the fusiform face area, activate specifically to faces. Dogs care about faces, too, but they may not have face areas.

Bunford, Hernández-Pérez et al. used fMRI to compare the brain activity of humans and pet dogs as they watched brief videos of other humans and dogs. Human brains showed a preference for faces, meaning that some visual areas had greater activity in response to a face than to the back of the head. A subset of these regions also displayed species preference, with increased activity in response to viewing a human over a dog. In contrast, dog brains showed only species preference: visual areas had greater activity in response to seeing a dog over a human, and no difference in activity between seeing a face and the back of the head.

From Science Daily

How malaria parasites withstand a fever's heat

 Even when a person suffering from malaria is burning up with fever and too sick to function, the tiny blood-eating parasites lurking inside them continue to flourish, relentlessly growing and multiplying as they gobble up the host's red blood cells.

The single-celled Plasmodium parasites that cause 200 million cases of malaria each year can withstand feverish temperatures that make their human hosts miserable. And now, a Duke University-led team is beginning to understand how they do it.

Assistant professor of chemistry Emily Derbyshire and colleagues have identified a lipid-protein combo that springs into action to gird the parasite's innards against heat shock.

Understanding how the malaria parasite protects its cells against heat stress and other onslaughts could lead to new ways to fight resistant strains, which have evolved ways to survive the drugs traditionally used to kill them, the researchers say.

Nearly half of the world's population is at risk of contracting malaria. The disease kills 400,000 people a year, most of them children.

Long before the cause of malaria was identified, the disease's harrowing fevers were well known. References to them have been found on 5,000-year-old clay tablets from ancient Mesopotamia. The Greek poet Homer wrote about their misery. Hippocrates too.

The Duke team, collaborating with professor of biological engineering Jacquin Niles at the Massachusetts Institute of Technology, wanted to know how the malaria parasites inside a person's body make it through these fevers unscathed.

When the parasites enter a person's bloodstream through the bite of an infected mosquito, the temperature around them jumps from the balmy mid-70s Fahrenheit of the mosquito to 98.6 degrees in the human. The human host's body temperature can then rocket to 105 degrees or higher before dropping back down to normal two to six hours later, a roller coaster pattern that repeats itself every two to three days.

"It's like going from room temperature water to a hot tub," said first author Kuan-Yi Lu, who earned his Ph.D. in molecular genetics and microbiology in Derbyshire's lab at Duke.

For the paper, published Sept. 25 in the journal eLife, Lu spent hundreds of hours peering at parasites under the microscope, trying to figure out what happens inside them when temperatures seesaw.

To mimic malarial fever in the lab, the researchers placed malaria-infected red blood cells in an incubator heated to 104 degrees Fahrenheit for six hours before bringing them back down to normal body temperature, 98.6 degrees.
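
Since lab protocols are usually written in Celsius, a quick conversion of the temperatures quoted in this piece may help; the snippet below is illustrative arithmetic only, not part of the study.

```python
def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32.0) * 5.0 / 9.0

# Temperatures from the infection cycle and experiment described above.
for label, f in [("inside the mosquito (~mid-70s)", 75.0),
                 ("normal human body temperature", 98.6),
                 ("fever-mimicking incubation", 104.0)]:
    print(f"{label}: {f:5.1f} F = {f_to_c(f):4.1f} C")
# The 104 F incubation corresponds to 40 C, against a 37 C baseline.
```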

They found that when temperatures rise, the parasites produce more of a lipid molecule called phosphatidylinositol 3-phosphate, or PI(3)P.

This substance builds up in the outer wall of a tiny sac inside the parasite's cells called the food vacuole -- the protist's version of a gut. There, it recruits and binds to another molecule, a heat shock protein called Hsp70, and together they help shore up the food vacuole's outer walls.

Without this lipid-protein boost, the team found that heat can make the food vacuole start to leak, unleashing its acidic contents into the gel-like fluid that fills the cell and possibly even digesting the parasite from the inside.

The findings are important because they could help researchers make the most of existing malaria drugs.

Previous research has shown that malaria parasites with higher-than-normal PI(3)P levels are more resistant to artemisinins, the leading class of antimalarials. Since artemisinins were first introduced in the 1970s, partial resistance has been increasingly reported in parts of Southeast Asia, raising fears that we may be losing one of our best weapons against the disease.

But the Duke-led study raises the possibility that new combination therapies for malaria -- artemisinins combined with other drugs that reduce the parasite's PI(3)P lipid levels and disrupt the food vacuole's membrane -- could be a way to re-sensitize resistant parasites, breaking down their defenses so the malaria treatments we already have are effective again.

"If there is an alternative way to increase the permeability of the digestive vacuole, it could make the digestive vacuole more accessible to those drugs again," Lu said.

The findings also suggest caution in giving malaria patients ibuprofen for fever if they're already taking artemisinin-based compounds, Derbyshire said. That's because artemisinins kill malaria parasites by damaging their cell's survival machinery, including the machinery that makes PI(3)P. If artemisinins suppress PI(3)P levels, and thereby make malaria parasites more vulnerable to heat stress, then fever reducers could prolong the time it takes for artemisinin-based drugs to kill the parasites, as some reports have suggested.

Read more at Science Daily

Nobel Prize in Physics 2020: Discoveries about black holes

 

The Royal Swedish Academy of Sciences has decided to award the Nobel Prize in Physics 2020 with one half to Roger Penrose, University of Oxford, UK, "for the discovery that black hole formation is a robust prediction of the general theory of relativity" and the other half jointly to Reinhard Genzel, Max Planck Institute for Extraterrestrial Physics, Garching, Germany and University of California, Berkeley, USA and Andrea Ghez, University of California, Los Angeles, USA "for the discovery of a supermassive compact object at the centre of our galaxy."

Black holes and the Milky Way's darkest secret

Three Laureates share this year's Nobel Prize in Physics for their discoveries about one of the most exotic phenomena in the universe, the black hole. Roger Penrose showed that the general theory of relativity leads to the formation of black holes. Reinhard Genzel and Andrea Ghez discovered that an invisible and extremely heavy object governs the orbits of stars at the centre of our galaxy. A supermassive black hole is the only currently known explanation.

Roger Penrose used ingenious mathematical methods in his proof that black holes are a direct consequence of Albert Einstein's general theory of relativity. Einstein did not himself believe that black holes really exist, these super-heavyweight monsters that capture everything that enters them. Nothing can escape, not even light.

In January 1965, ten years after Einstein's death, Roger Penrose proved that black holes really can form and described them in detail; at their heart, black holes hide a singularity in which all the known laws of nature cease. His groundbreaking article is still regarded as the most important contribution to the general theory of relativity since Einstein.

Reinhard Genzel and Andrea Ghez each lead a group of astronomers that, since the early 1990s, has focused on a region called Sagittarius A* at the centre of our galaxy. The orbits of the brightest stars closest to the middle of the Milky Way have been mapped with increasing precision. The measurements of these two groups agree, with both finding an extremely heavy, invisible object that pulls on the jumble of stars, causing them to rush around at dizzying speeds. Around four million solar masses are packed together in a region no larger than our solar system.
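
The "four million solar masses" figure can be checked with nothing more than Kepler's third law. The sketch below uses rough orbital elements for S2, the best-studied star circling Sagittarius A* (ballpark values from the literature, not from this announcement): with the semi-major axis in astronomical units and the period in years, the enclosed mass in solar masses is simply a³/P².

```python
# Kepler's third law in solar-system units: M [solar masses] = a**3 / P**2,
# with a in AU and P in years. Approximate elements of the star S2:
a_au = 980.0  # semi-major axis, AU (assumed, roughly the published value)
p_yr = 16.0   # orbital period, years (assumed, roughly the published value)

m_solar = a_au**3 / p_yr**2
print(f"enclosed mass ~ {m_solar:.1e} solar masses")  # ~3.7e6, i.e. ~4 million
```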

Using the world's largest telescopes, Genzel and Ghez developed methods to see through the huge clouds of interstellar gas and dust to the centre of the Milky Way. Stretching the limits of technology, they refined new techniques to compensate for distortions caused by the Earth's atmosphere, building unique instruments and committing themselves to long-term research. Their pioneering work has given us the most convincing evidence yet of a supermassive black hole at the centre of the Milky Way.

Read more at Science Daily

Oct 5, 2020

Some planets may be better for life than Earth

 Earth is not necessarily the best planet in the universe. Researchers have identified two dozen planets outside our solar system that may have conditions more suitable for life than our own. Some of these orbit stars that may be better than even our sun.

A study led by Washington State University scientist Dirk Schulze-Makuch, recently published in the journal Astrobiology, details characteristics of potential "superhabitable" planets, which include those that are older, a little larger, slightly warmer and possibly wetter than Earth. Life could also more easily thrive on planets that circle more slowly changing stars with longer lifespans than our sun.

The 24 top contenders for superhabitable planets are all more than 100 light years away, but Schulze-Makuch said the study could help focus future observation efforts, such as those from NASA's James Webb Space Telescope, the LUVOIR space observatory and the European Space Agency's PLATO space telescope.

"With the next space telescopes coming up, we will get more information, so it is important to select some targets," said Schulze-Makuch, a professor with WSU and the Technical University in Berlin. "We have to focus on certain planets that have the most promising conditions for complex life. However, we have to be careful to not get stuck looking for a second Earth because there could be planets that might be more suitable for life than ours."

For the study, Schulze-Makuch, a geobiologist with expertise in planetary habitability, teamed up with astronomers Rene Heller of the Max Planck Institute for Solar System Research and Edward Guinan of Villanova University to identify superhabitability criteria and search among the 4,500 known exoplanets beyond our solar system for good candidates. Habitability does not mean these planets definitely have life, merely conditions that would be conducive to life.

The researchers selected planet-star systems with probable terrestrial planets orbiting within the host star's liquid water habitable zone from the Kepler Object of Interest Exoplanet Archive of transiting exoplanets.

The sun is the center of our solar system, but it has a relatively short lifespan of less than 10 billion years. Since it took nearly 4 billion years before any form of complex life appeared on Earth, many stars similar to our sun, called G stars, might run out of fuel before complex life can develop.

In addition to looking at systems with cooler G stars, the researchers also looked at systems with K dwarf stars, which are somewhat cooler, less massive and less luminous than our sun. K stars have the advantage of long lifespans of 20 billion to 70 billion years. This would allow orbiting planets to be older and give life more time to advance to the complexity currently found on Earth. However, to be habitable, planets should not be so old that they have exhausted their geothermal heat and lack protective geomagnetic fields. Earth is around 4.5 billion years old, but the researchers argue that the sweet spot for life is a planet between 5 billion and 8 billion years old.

Size and mass also matter. A planet that is 10% larger than the Earth should have more habitable land. One that is about 1.5 times Earth's mass would be expected to retain its interior heating through radioactive decay longer and would also have a stronger gravity to retain an atmosphere over a longer time period.

Water is key to life, and the authors argue that a little more of it would help, especially in the form of moisture, clouds and humidity. A slightly warmer overall climate, with a mean surface temperature about 5 degrees Celsius (about 9 degrees Fahrenheit) greater than Earth's, together with the additional moisture, would also be better for life. This preference for warmth and moisture is seen on Earth in the greater biodiversity of tropical rain forests compared with colder, drier areas.
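
Taken together, the criteria above amount to a checklist that can be scored mechanically. The toy function below encodes them with loosely chosen thresholds; it is an illustration of the idea, not the authors' actual selection procedure, and the field names and cutoffs are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Planet:
    star_type: str       # spectral type of the host star, e.g. "G" or "K"
    age_gyr: float       # system age, billions of years
    radius_earth: float  # planet radius, Earth radii
    mass_earth: float    # planet mass, Earth masses
    mean_temp_c: float   # mean surface temperature, degrees Celsius

def superhabitability_score(p: Planet) -> int:
    """Count how many of the study's qualitative criteria a planet meets."""
    score = 0
    score += p.star_type == "K"             # long-lived K dwarf host
    score += 5.0 <= p.age_gyr <= 8.0        # old enough for complex life,
                                            # young enough to stay geologically active
    score += 1.0 < p.radius_earth <= 1.1    # ~10% larger: more habitable land
    score += 1.0 < p.mass_earth <= 1.5      # retains heat and atmosphere longer
    score += 18.0 <= p.mean_temp_c <= 25.0  # ~5 C warmer than Earth's ~15 C mean
    return score

print(superhabitability_score(Planet("K", 6.0, 1.08, 1.4, 20.0)))  # 5 of 5
```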

None of the 24 top planet candidates meets all the criteria for superhabitable planets, but one has four of the critical characteristics, making it possibly much more comfortable for life than our home planet.

Read more at Science Daily

Gemini South's high-def version of 'A Star is Born'

 NASA's James Webb Space Telescope is still more than a year from launching, but the Gemini South telescope in Chile has provided astronomers a glimpse of what the orbiting observatory should deliver.

Using a wide-field adaptive optics camera that corrects for distortion from Earth's atmosphere, Rice University's Patrick Hartigan and Andrea Isella and Dublin City University's Turlough Downes used the 8.1-meter telescope to capture near-infrared images of the Carina Nebula with the same resolution that's expected of the Webb Telescope.

Hartigan, Isella and Downes describe their work in a study published online this week in Astrophysical Journal Letters. Their images, gathered over 10 hours in January 2018 at the international Gemini Observatory, a program of the National Science Foundation's NOIRLab, show part of a molecular cloud about 7,500 light years from Earth. All stars, including Earth's sun, are thought to form within molecular clouds.

"The results are stunning," Hartigan said. "We see a wealth of detail never observed before along the edge of the cloud, including a long series of parallel ridges that may be produced by a magnetic field, a remarkable almost perfectly smooth sine wave and fragments at the top that appear to be in the process of being sheared off the cloud by a strong wind."

The images show a cloud of dust and gas in the Carina Nebula known as the Western Wall. The cloud's surface is slowly evaporating in the intense glow of radiation from a nearby cluster of massive young stars. The radiation causes hydrogen to glow with near-infrared light, and specially designed filters allowed the astronomers to capture separate images of hydrogen at the cloud's surface and hydrogen that was evaporating.

An additional filter captured starlight reflected from dust, and combining the images allowed Hartigan, Isella and Downes to visualize how the cloud and cluster are interacting. Hartigan has previously observed the Western Wall with other NOIRLab telescopes and said it was a prime choice to follow up with Gemini's adaptive optics system.

"This region is probably the best example in the sky of an irradiated interface," he said. "The new images of it are so much sharper than anything we've previously seen. They provide the clearest view to date of how massive young stars affect their surroundings and influence star and planet formation."

Images of star-forming regions taken from Earth are usually blurred by turbulence in the atmosphere. Placing telescopes in orbit eliminates that problem. And one of the Hubble Space Telescope's most iconic photographs, 1995's "Pillars of Creation," captured the grandeur of dust columns in a star-forming region. But the beauty of the image belied Hubble's weakness for studying molecular clouds.

"Hubble operates at optical and ultraviolet wavelengths that are blocked by dust in star-forming regions like these," Hartigan said.

Because near-infrared light penetrates the outer layers of dust in molecular clouds, near-infrared cameras like the Gemini South Adaptive Optics Imager can see what lies beneath. Unlike traditional infrared cameras, Gemini South's imager uses "a mirror that changes its shape to correct for shimmering in our atmosphere," Hartigan said. The result: photos with roughly 10 times the resolution of images taken from ground-based telescopes that don't use adaptive optics.
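
That factor of roughly ten is about what the diffraction limit predicts. As a back-of-envelope check (the K-band wavelength and the seeing value below are typical assumed numbers, not figures from the study), the best angular resolution a telescope can reach is about 1.22 λ/D:

```python
import math

D = 8.1        # Gemini South primary mirror diameter, meters
lam = 2.12e-6  # a typical near-infrared (K-band) wavelength, meters (assumed)
seeing = 0.7   # typical uncorrected atmospheric seeing, arcseconds (assumed)

RAD_TO_ARCSEC = math.degrees(1.0) * 3600.0
theta = 1.22 * lam / D * RAD_TO_ARCSEC  # diffraction-limited resolution

print(f"diffraction limit ~ {theta:.3f} arcsec")     # ~0.066 arcsec
print(f"gain over seeing  ~ {seeing / theta:.0f}x")  # roughly tenfold
```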

But the atmosphere causes more than blur. Water vapor, carbon dioxide and other atmospheric gases absorb some parts of the near-infrared spectrum before it reaches the ground.

"Many near-infrared wavelengths will only be visible from a space telescope like the Webb," Hartigan said. "But for near-infrared wavelengths that reach Earth's surface, adaptive optics can produce images as sharp as those acquired from space."

The advantages of each technique bode well for the study of star formation, he said.

Read more at Science Daily

Process for regenerating neurons in the eye and brain identified

 The death of neurons, whether in the brain or the eye, can result in a number of human neurodegenerative disorders, from blindness to Parkinson's disease. Current treatments for these disorders can only slow the progression of the illness, because once a neuron dies, it cannot be replaced.

Now, a team of researchers from the University of Notre Dame, Johns Hopkins University, Ohio State University and the University of Florida has identified networks of genes that regulate the process responsible for determining whether neurons will regenerate in certain animals, such as zebrafish.

"This study is proof of principle, showing that it is possible to regenerate retinal neurons. We now believe the process for regenerating neurons in the brain will be similar," said David Hyde, professor in the Department of Biological Sciences at Notre Dame and co-author on the study.

For the study, published in Science, the researchers mapped the genes of animals that have the ability to regenerate retinal neurons. For example, when the retina of a zebrafish is damaged, cells called Müller glia go through a process known as reprogramming. During reprogramming, the Müller glia change their gene expression to become like progenitor cells, the kind of cells used during early development of an organism. These progenitor-like cells can then become any cell type necessary to fix the damaged retina.

Like zebrafish, people also have Müller glia cells. However, when the human retina is damaged, the Müller glia cells respond with gliosis, a process that does not allow them to reprogram.

"After determining the varying animal processes for retina damage recovery, we had to decipher if the process for reprogramming and gliosis were similar. Would the Müller glia follow the same path in regenerating and non-regenerating animals or would the paths be completely different?" said Hyde, who also serves as the Kenna Director of the Zebrafish Research Center at Notre Dame. "This was really important, because if we want to be able to use Müller glia cells to regenerate retinal neurons in people, we need to understand if it would be a matter of redirecting the current Müller glia path or if it would require an entirely different process."

The research team found that the regeneration process only requires the organism to "turn back on" its early development processes. Additionally, researchers were able to show that during zebrafish regeneration, Müller glia also go through gliosis, meaning that organisms that are able to regenerate retinal neurons do follow a similar path to animals that cannot. While the network of genes in zebrafish was able to move Müller glia cells from gliosis into the reprogrammed state, the network of genes in a mouse model blocked the Müller glia from reprogramming.

From there, the researchers were able to modify zebrafish Müller glia into a similar state that blocked reprogramming, and, conversely, to get a mouse model to regenerate some retinal neurons.

Read more at Science Daily

Nobel Prize in Physiology or Medicine 2020: Discovery of Hepatitis C virus

 

The Nobel Assembly at Karolinska Institutet has today decided to award the 2020 Nobel Prize in Physiology or Medicine jointly to Harvey J. Alter, Michael Houghton and Charles M. Rice for the discovery of Hepatitis C virus.

This year's Nobel Prize is awarded to three scientists who have made a decisive contribution to the fight against blood-borne hepatitis, a major global health problem that causes cirrhosis and liver cancer in people around the world.

Harvey J. Alter, Michael Houghton and Charles M. Rice made seminal discoveries that led to the identification of a novel virus, Hepatitis C virus. Prior to their work, the discovery of the Hepatitis A and B viruses had been critical steps forward, but the majority of blood-borne hepatitis cases remained unexplained. The discovery of Hepatitis C virus revealed the cause of the remaining cases of chronic hepatitis and made possible blood tests and new medicines that have saved millions of lives.

Hepatitis -- a global threat to human health

Liver inflammation, or hepatitis, a combination of the Greek words for liver and inflammation, is mainly caused by viral infections, although alcohol abuse, environmental toxins and autoimmune disease are also important causes. In the 1940s, it became clear that there are two main types of infectious hepatitis. The first, named hepatitis A, is transmitted by polluted water or food and generally has little long-term impact on the patient. The second type is transmitted through blood and bodily fluids and represents a much more serious threat, since it can lead to a chronic condition, with the development of cirrhosis and liver cancer. This form of hepatitis is insidious, as otherwise healthy individuals can be silently infected for many years before serious complications arise. Blood-borne hepatitis is associated with significant morbidity and mortality, and causes more than a million deaths per year world-wide, thus making it a global health concern on a scale comparable to HIV-infection and tuberculosis.

An unknown infectious agent

The key to successful intervention against infectious diseases is to identify the causative agent. In the 1960s, Baruch Blumberg determined that one form of blood-borne hepatitis was caused by a virus that became known as Hepatitis B virus, and the discovery led to the development of diagnostic tests and an effective vaccine. Blumberg was awarded the Nobel Prize in Physiology or Medicine in 1976 for this discovery.

At that time, Harvey J. Alter at the US National Institutes of Health was studying the occurrence of hepatitis in patients who had received blood transfusions. Although blood tests for the newly-discovered Hepatitis B virus reduced the number of cases of transfusion-related hepatitis, Alter and colleagues worryingly demonstrated that a large number of cases remained. Tests for Hepatitis A virus infection were also developed around this time, and it became clear that Hepatitis A was not the cause of these unexplained cases.

It was a great source of concern that a significant number of those receiving blood transfusions developed chronic hepatitis due to an unknown infectious agent. Alter and his colleagues showed that blood from these hepatitis patients could transmit the disease to chimpanzees, the only susceptible host besides humans. Subsequent studies also demonstrated that the unknown infectious agent had the characteristics of a virus. Alter's methodical investigations had in this way defined a new, distinct form of chronic viral hepatitis. The mysterious illness became known as "non-A, non-B" hepatitis.

Identification of Hepatitis C virus

Identification of the novel virus was now a high priority. All the traditional techniques for virus hunting were put to use but, in spite of this, the virus eluded isolation for over a decade. Michael Houghton, working for the pharmaceutical firm Chiron, undertook the arduous work needed to isolate the genetic sequence of the virus. Houghton and his co-workers created a collection of DNA fragments from nucleic acids found in the blood of an infected chimpanzee. The majority of these fragments came from the genome of the chimpanzee itself, but the researchers predicted that some would be derived from the unknown virus. On the assumption that antibodies against the virus would be present in blood taken from hepatitis patients, the investigators used patient sera to identify cloned viral DNA fragments encoding viral proteins. Following a comprehensive search, one positive clone was found. Further work showed that this clone was derived from a novel RNA virus belonging to the Flavivirus family and it was named Hepatitis C virus. The presence of antibodies in chronic hepatitis patients strongly implicated this virus as the missing agent.

The discovery of Hepatitis C virus was decisive; but one essential piece of the puzzle was missing: could the virus alone cause hepatitis? To answer this question the scientists had to investigate whether the cloned virus was able to replicate and cause disease. Charles M. Rice, a researcher at Washington University in St. Louis, along with other groups working with RNA viruses, noted a previously uncharacterized region at the end of the Hepatitis C virus genome that they suspected could be important for virus replication. Rice also observed genetic variations in isolated virus samples and hypothesized that some of them might hinder virus replication. Through genetic engineering, Rice generated an RNA variant of Hepatitis C virus that included the newly defined region of the viral genome and was devoid of the inactivating genetic variations. When this RNA was injected into the liver of chimpanzees, virus was detected in the blood and pathological changes resembling those seen in humans with the chronic disease were observed. This was the final proof that Hepatitis C virus alone could cause the unexplained cases of transfusion-mediated hepatitis.

Read more at Science Daily

Oct 4, 2020

Climate change responsible for record sea temperature levels

 Global warming is driving an unprecedented rise in sea temperatures including in the Mediterranean, according to a major new report published by the peer-reviewed Journal of Operational Oceanography.

Data from the European Union's (EU) Copernicus Marine Environment Monitoring Service (CMEMS) will increase concerns about the threat to the world's seas and oceans from climate change.

The Ocean State Report reveals an overall trend globally of surface warming based on evidence from 1993 to 2018, with the largest rise in the Arctic Ocean.

European seas experienced record high temperatures in 2018, a phenomenon which the researchers attribute to extreme weather conditions -- a marine heat wave lasting several months.

In the same year, a large mass of warm water occurred in the northeast Pacific Ocean, according to the report. This was similar to a marine heatwave -- dubbed 'the Blob' -- which was first detected in 2013 and had devastating effects on marine life.

Now the study authors are calling for improved monitoring to provide better data and knowledge. They argue this will help countries progress towards sustainable use of seas and oceans which are an essential source of food, energy and other resources.

Findings from the report confirm record rises in sea temperatures

"Changes to the ocean have impacted on these (ocean) ecosystem services and stretched them to unsustainable limits," says Karina von Schuckmann and Pierre-Yves Le Traon, the report's editors.

"More than ever a long term, comprehensive and systematic monitoring, assessment and reporting of the ocean is required. This is to ensure a sustainable science-based management of the ocean for societal benefit."

The Ocean State Report identifies other major strains on the world's seas and oceans from climate change including acidification caused by carbon dioxide uptake from the atmosphere, sea level rise, loss of oxygen and sea ice retreat.

Long-term evidence of global warming outlined in the report includes a shortening of the Baltic Sea ice-cover season by up to two days over the past 30 years, and an acceleration in the rise of the global mean sea level.

Read more at Science Daily

Plastic-eating enzyme 'cocktail' heralds new hope for plastic waste

 

The scientists who re-engineered the plastic-eating enzyme PETase have now created an enzyme 'cocktail' which can digest plastic up to six times faster.

A second enzyme, found in the same rubbish-dwelling bacterium that lives on a diet of plastic bottles, has been combined with PETase to speed up the breakdown of plastic.

PETase breaks down polyethylene terephthalate (PET) back into its building blocks, creating an opportunity to recycle plastic infinitely and reduce plastic pollution and the greenhouse gases driving climate change.

PET is the most common thermoplastic, used to make single-use drinks bottles, clothing and carpets. It takes hundreds of years to break down in the environment, but PETase can shorten this time to days.

The initial discovery set up the prospect of a revolution in plastic recycling, creating a potential low-energy solution to tackle plastic waste. The team engineered the natural PETase enzyme in the laboratory to be around 20 percent faster at breaking down PET.

Now, the same trans-Atlantic team have combined PETase and its 'partner', a second enzyme called MHETase, to generate much bigger improvements: simply mixing PETase with MHETase doubled the speed of PET breakdown, and engineering a connection between the two enzymes to create a 'super-enzyme' increased this activity by a further three times.
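
Those two quoted speed-ups compound, which is where the "up to six times faster" figure at the top of this piece comes from. A trivial sketch of the arithmetic (the multipliers are the article's round numbers, not measured rates):

```python
# PET-degradation rates relative to PETase alone, per the figures above.
petase_alone = 1.0
mixed = 2.0 * petase_alone  # PETase simply mixed with MHETase: ~2x
chimera = 3.0 * mixed       # covalently linked 'super-enzyme': ~3x the mixture
print(f"chimeric enzyme ~ {chimera:.0f}x PETase alone")  # ~6x
```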

The study is published in the journal Proceedings of the National Academy of Sciences.

The team was co-led by the scientists who engineered PETase, Professor John McGeehan, Director of the Centre for Enzyme Innovation (CEI) at the University of Portsmouth, and Dr Gregg Beckham, Senior Research Fellow at the National Renewable Energy Laboratory (NREL) in the US.

Professor McGeehan said: "Gregg and I were chatting about how PETase attacks the surface of the plastics and MHETase chops things up further, so it seemed natural to see if we could use them together, mimicking what happens in nature.

"Our first experiments showed that they did indeed work better together, so we decided to try to physically link them, like two Pac-men joined by a piece of string.

"It took a great deal of work on both sides of the Atlantic, but it was worth the effort -- we were delighted to see that our new chimeric enzyme is up to three times faster than the naturally evolved separate enzymes, opening new avenues for further improvements."

The original PETase enzyme discovery heralded the first hope that a solution to the global plastic pollution problem might be within grasp, though PETase alone is not yet fast enough to make the process commercially viable to handle the tons of discarded PET bottles littering the planet.

Combining it with a second enzyme, and finding together they work even faster, means another leap forward has been taken towards finding a solution to plastic waste.

PETase and the new combined MHETase-PETase both work by digesting PET plastic, returning it to its original building blocks. This allows for plastics to be made and reused endlessly, reducing our reliance on fossil resources such as oil and gas.

Professor McGeehan used the Diamond Light Source, in Oxfordshire, a synchrotron that uses intense beams of X-rays 10 billion times brighter than the Sun to act as a microscope powerful enough to see individual atoms. This allowed the team to solve the 3D structure of the MHETase enzyme, giving them the molecular blueprints to begin engineering a faster enzyme system.

The new research combined structural, computational, biochemical and bioinformatics approaches to reveal molecular insights into its structure and how it functions. The study was a huge team effort involving scientists at all levels of their careers.

One of the most junior authors, Rosie Graham, a joint Portsmouth CEI-NREL PhD student, said: "My favourite part of research is how the ideas start; whether it's over coffee, on a train commute or when passing in the university corridors, it can really be at any moment.

"It's a really great opportunity to learn and grow as part of this UK-USA collaboration and even more so to contribute another piece of the story on using enzymes to tackle some of our most polluting plastics."

Read more at Science Daily