Dec 4, 2021

Immune system-stimulating nanoparticle could lead to more powerful vaccines

A common strategy to make vaccines more powerful is to deliver them along with an adjuvant -- a compound that stimulates the immune system to produce a stronger response.

Researchers from MIT, the La Jolla Institute for Immunology, and other institutions have now designed a new nanoparticle adjuvant that may be more potent than others now in use. Studies in mice showed that it significantly improved antibody production following vaccination against HIV, diphtheria, and influenza.

"We started looking at this particular formulation and found that it was incredibly potent, better than almost anything else we had tried," says Darrell Irvine, the Underwood-Prescott Professor with appointments in MIT's departments of Biological Engineering and Materials Science and Engineering; an associate director of MIT's Koch Institute for Integrative Cancer Research; and a member of the Ragon Institute of MGH, MIT, and Harvard.

The researchers now hope to incorporate the adjuvant into an HIV vaccine that is currently being tested in clinical trials, with the goal of improving its performance.

Irvine and Shane Crotty, a professor at the Center for Infectious Disease and Vaccine Research at the La Jolla Institute for Immunology, are the senior authors of the study, which appears today in Science Immunology. The lead authors of the paper are Murillo Silva, a former MIT postdoc, and Yu Kato, a staff scientist at the La Jolla Institute.

More powerful vaccines

Although the idea of using adjuvants to boost vaccine effectiveness has been around for decades, there are only a handful of FDA-approved vaccine adjuvants. One is aluminum hydroxide, an aluminum salt that induces inflammation, and another is an oil and water emulsion that is used in flu vaccines. A few years ago, the FDA approved an adjuvant based on saponin, a compound derived from the bark of the Chilean soapbark tree.

Saponin formulated in liposomes is now used as an adjuvant in the shingles vaccine, and saponins are also being used in a cage-like nanoparticle called an immunostimulatory complex (ISCOM) in a Covid-19 vaccine that is currently in clinical trials.

Researchers have shown that saponins promote inflammatory immune responses and stimulate antibody production, but how they do that is unclear. In the new study, the MIT and La Jolla team wanted to figure out how the adjuvant exerts its effects, and to see if they could make it more potent.

They designed a new type of adjuvant that is similar to the ISCOM adjuvant but also incorporates a molecule called MPLA, which is a toll-like receptor agonist. When these molecules bind to toll-like receptors on immune cells, they promote inflammation. The researchers call their new adjuvant SMNP (saponin/MPLA nanoparticles).

"We expected that this could be interesting because saponin and toll-like receptor agonists are both adjuvants that have been studied separately and shown to be very effective," Irvine says.

The researchers tested the adjuvant by injecting it into mice along with a few different antigens, or fragments of viral proteins. These included two HIV antigens, as well as diphtheria and influenza antigens. They compared the adjuvant to several other approved adjuvants and found that the new saponin-based nanoparticle elicited a stronger antibody response than any of the others.

One of the HIV antigens that they used is an HIV envelope protein nanoparticle, which presents many copies of the gp120 antigen that is present on the HIV viral surface. This antigen recently completed initial testing in phase 1 clinical trials. Irvine and Crotty are part of the Consortium for HIV/AIDS Vaccine Development at the Scripps Research Institute, which ran that trial. The researchers now hope to develop a way to manufacture the new adjuvant at large scale so it can be tested along with an HIV envelope trimer in another clinical trial beginning next year. Clinical trials that combine envelope trimers with the traditional vaccine adjuvant aluminum hydroxide are also underway.

"Aluminum hydroxide is safe but not particularly potent, so we hope that (the new adjuvant) would be an interesting alternative to elicit neutralizing antibody responses in people," Irvine says.

Rapid flow

When vaccines are injected into the arm, they travel through lymph vessels to the lymph nodes, where they encounter and activate B cells. The research team found that the new adjuvant speeds up the flow of lymph to the nodes, helping the antigen to get there before it starts to break down. It does this in part by stimulating immune cells called mast cells, which previously were not known to be involved in vaccine responses.

"Getting to the lymph nodes quickly is useful because once you inject the antigen, it starts slowly breaking down. The sooner a B cell can see that antigen, the more likely it's fully intact, so that B cells are targeting the structure as it will be present on the native virus," Irvine says.

Additionally, once the vaccine reaches the lymph nodes, the adjuvant causes a layer of cells called macrophages, which act as a barrier, to die off quickly, making it easier for the antigen to get into the nodes.

Another way that the adjuvant helps boost immune responses is by activating inflammatory cytokines that drive a stronger response. The TLR agonist that the researchers included in the adjuvant is believed to amplify that cytokine response, but the exact mechanism for that is not known yet.

This kind of adjuvant could also be useful for any other kind of subunit vaccine, which consists of fragments of viral proteins or other molecules. In addition to their work on HIV vaccines, the researchers are also working on a potential Covid-19 vaccine, along with J. Christopher Love's lab at the Koch Institute. The new adjuvant also appears to help stimulate T cell activity, which could make it useful as a component of cancer vaccines, which aim to stimulate the body's own T cells to attack tumors.

Read more at Science Daily

Daytime meals may reduce health risks linked to night shift work

A small clinical trial supported by the National Institutes of Health has found that eating during the nighttime -- like many shift workers do -- can increase glucose levels, while eating only during the daytime might prevent the higher glucose levels now linked with a nocturnal work life. The findings, the study authors said, could lead to novel behavioral interventions aimed at improving the health of shift workers -- grocery stockers, hotel workers, truck drivers, first responders, and others -- who past studies show may be at an increased risk for diabetes, heart disease, and obesity.

The new study, which the researchers noted is the first to demonstrate the beneficial effect of this type of meal timing intervention in humans, appears online in the journal Science Advances. It was funded primarily by the National Heart, Lung, and Blood Institute (NHLBI), part of NIH.

"This is a rigorous and highly controlled laboratory study that demonstrates a potential intervention for the adverse metabolic effects associated with shift work, which is a known public health concern," said Marishka Brown, Ph.D., director of the NHLBI's National Center on Sleep Disorders Research. "We look forward to additional studies that confirm the results and begin to untangle the biological underpinnings of these findings."

For the study, the researchers enrolled 19 healthy young participants (seven women and 12 men). After a preconditioning routine, the participants were randomly assigned to a 14-day controlled laboratory protocol involving simulated night work conditions with one of two meal schedules. One group ate during the nighttime to mimic a meal schedule typical among night workers, and one group ate during the daytime.

The researchers then evaluated the effects of these meal schedules on the participants' internal circadian rhythms -- the internal process that regulates not just the sleep-wake cycle, but also the 24-hour cycle of virtually all bodily functions, including metabolism.

The researchers found that nighttime eating boosted glucose levels -- a risk factor for diabetes -- while restricting meals to the daytime prevented this effect. Specifically, average glucose levels for those who ate at night increased by 6.4% during the simulated night work, while those who ate during the daytime showed no significant increases.

"This is the first study in humans to demonstrate the use of meal timing as a countermeasure against the combined negative effects of impaired glucose tolerance and disrupted alignment of circadian rhythms resulting from simulated night work," said study leader Frank A.J.L. Scheer, Ph.D., professor of medicine at Harvard Medical School and director of the Medical Chronobiology Program at Brigham & Women's Hospital in Boston.

The researchers said that the mechanisms behind the observed effects are complex. They believe that the nighttime eating effects on glucose levels during simulated night work are caused by circadian misalignment. That corresponds to the mistiming between the central circadian "clock" (located in the brain's hypothalamus) and behavioral sleep/wake, light/dark, and fasting/eating cycles, which can influence peripheral "clocks" throughout the body. The current study shows that, in particular, mistiming of the central circadian clock with the fasting/eating cycles plays a key role in boosting glucose levels. The work further suggests the beneficial effects of daytime eating on glucose levels during simulated night work may be driven by better alignment between these central and peripheral "clocks."

Read more at Science Daily

Dec 3, 2021

Stellar cocoon with organic molecules at the edge of our galaxy

For the first time, astronomers have detected a newborn star and its surrounding cocoon of complex organic molecules at the edge of our Galaxy, in a region known as the extreme outer Galaxy. The discovery, which revealed the hidden chemical complexity of our Universe, appears in a paper in The Astrophysical Journal.

The scientists from Niigata University (Japan), Academia Sinica Institute of Astronomy and Astrophysics (Taiwan), and the National Astronomical Observatory of Japan used the Atacama Large Millimeter/submillimeter Array (ALMA) in Chile to observe a newborn star (protostar) in the WB89-789 region, located in the extreme outer Galaxy. A variety of carbon-, oxygen-, nitrogen-, sulfur-, and silicon-bearing molecules, including complex organic molecules containing up to nine atoms, were detected. This is the first time such a protostar, along with its associated cocoon of chemically rich molecular gas, has been detected at the edge of our Galaxy.

The ALMA observations reveal that various kinds of complex organic molecules, such as methanol (CH3OH), ethanol (C2H5OH), methyl formate (HCOOCH3), dimethyl ether (CH3OCH3), formamide (NH2CHO), propanenitrile (C2H5CN), etc., are present even in the primordial environment of the extreme outer Galaxy. Such complex organic molecules potentially act as the feedstock for larger prebiotic molecules.

Interestingly, the relative abundances of complex organic molecules in this newly discovered object closely resemble those found in similar objects in the inner Galaxy. The observations suggest that complex organic molecules form with similar efficiency even at the edge of our Galaxy, where the environment is very different from the solar neighborhood.

It is believed that the outer part of our Galaxy still harbors a primordial environment that existed in the early epoch of galaxy formation. The environmental characteristics of the extreme outer Galaxy, e.g., low abundance of heavy elements, small or no perturbation from Galactic spiral arms, are very different from those seen in the present-day solar neighborhood. Because of its unique characteristics, the extreme outer Galaxy is an excellent laboratory to study star formation and the interstellar medium in the past Galactic environment.

"With ALMA we were able to see a forming star and the surrounding molecular cocoon at the edge of our Galaxy," says Takashi Shimonishi, an astronomer at Niigata University, Japan, and the paper's lead author. "To our surprise, a variety of abundant complex organic molecules exists in the primordial environment of the extreme outer Galaxy. The interstellar conditions to form the chemical complexity might have persisted since the early history of the Universe," Shimonishi adds.

"These observations have revealed that complex organic molecules can be efficiently formed even in low-metallicity environments like the outermost regions of our Galaxy. This finding provides an important piece of the puzzle to understand how complex organic molecules are formed in the Universe," says Kenji Furuya, an astronomer at the National Astronomical Observatory of Japan, and the paper's co-author.

Read more at Science Daily

Combined heat and power as a platform for clean energy systems

The state of Georgia could dramatically reduce its greenhouse gas emissions, while creating new jobs and a healthier public, if more of its energy-intensive industries and commercial buildings were to utilize combined heat and power (CHP), according to the latest research from Georgia Tech's School of Public Policy.

The paper, digitally available now and in print on December 15 in the journal Applied Energy, finds that CHP -- or cogeneration -- could measurably reduce Georgia's carbon footprint while creating green jobs. Georgia ranks 8th among all 50 states for total net electricity generation and 11th for total carbon dioxide emissions, according to data from the U.S. Energy Information Administration.

"There is an enormous opportunity for CHP to save industries money and make them more competitive, while at the same time reducing air pollution, creating jobs and enhancing public health," said principal investigator Marilyn Brown, Regents and Brook Byers professor of Sustainable Systems at Georgia Tech's School of Public Policy.

Benefiting the Environment, Economy, and Public Health

The research finds that if Georgia added CHP systems to the 9,374 sites that are suitable for cogeneration, it could reduce carbon emissions in Georgia by 13%. Bringing CHP to just 34 of Georgia's industrial plants, each with 25 megawatts of electricity capacity, could reduce greenhouse gas emissions by 2%. The study authors, using modeling tools they developed, note that this "achievable" level of CHP adoption could add 2,000 jobs to the state; full deployment could support 13,000 new jobs.

According to Brown, CHP systems can be 85 to 90% efficient, compared with 45 to 60% efficiency for traditional heat and power systems. CHP also has an advantage over renewable electricity from solar and wind, which offer only intermittent power.
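
For a rough sense of what those efficiency figures imply, the sketch below compares the primary fuel needed to deliver the same useful energy at the 85% and 50% efficiencies quoted above; the demand figure is an illustrative assumption, not a value from the study.

# Illustrative primary-fuel comparison based on the efficiency ranges quoted above.
# The useful-energy demand is a made-up example, not a figure from the study.
useful_energy_mwh = 25.0          # combined electricity and heat demand (assumed)
chp_efficiency = 0.85             # from the 85-90% range for CHP
conventional_efficiency = 0.50    # from the 45-60% range for traditional systems

fuel_chp = useful_energy_mwh / chp_efficiency                    # ~29.4 MWh of fuel
fuel_conventional = useful_energy_mwh / conventional_efficiency  # 50.0 MWh of fuel
print(f"fuel saved with CHP: {1 - fuel_chp / fuel_conventional:.0%}")  # ~41%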

CHP technologies co-produce electricity and useful heat or cooling, resulting in ultra-high system efficiencies, cleaner air, and more affordable energy. Georgia industries that would profit from CHP include chemical, textile, pulp and paper, and food production. Large commercial buildings, campuses, and military bases also could benefit from CHP. By utilizing both electricity and heat from a single onsite source, the energy system is more reliable, resilient, and efficient.

CHP can meet the same needs at higher efficiency using less overall energy, while reducing peak demand on a region's utility-operated power grid, Brown explained. In addition, if there is an outage or disruption in a community's power grid, companies with their own onsite electricity sources can continue to have power.

Calculating CHP Costs and Benefits per Plant

The researchers used a database of every Georgia industrial site to determine which facilities operated or could operate a CHP system. They then identified the appropriate type of CHP system for plants without one. To help assess whether a CHP system was a financially sound investment, they developed a model to estimate the benefits and costs of each system, factoring in the cost to install the equipment, operations and maintenance, fuel expenses, and financing. The result was an estimated "net present value" for each system that reflected the present value of future costs and benefits, Brown explained.
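
A minimal sketch of that kind of net-present-value screen is shown below. All dollar figures, the discount rate, and the project lifetime are illustrative assumptions; none of them come from the study or its model.

# Hypothetical net-present-value screen for a candidate CHP installation.
# All dollar figures, the discount rate, and the lifetime are illustrative assumptions.
install_cost = 30_000_000           # up-front capital cost, $
annual_om_and_fuel = 2_500_000      # yearly operations, maintenance, and fuel, $
annual_energy_savings = 6_000_000   # yearly avoided electricity and heat purchases, $
discount_rate = 0.07
lifetime_years = 20

npv = -install_cost
for year in range(1, lifetime_years + 1):
    net_cash_flow = annual_energy_savings - annual_om_and_fuel
    npv += net_cash_flow / (1 + discount_rate) ** year

print(f"Estimated NPV: ${npv:,.0f}")  # positive NPV suggests the investment pays off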

The paper also used data analytics to predict the economic and health benefits of CHP for Georgians. Plants converting to cogeneration could boost the state's clean energy workforce by 2,000 to 13,000 jobs depending on how widely it's adopted, Brown said. Currently, the state has about 2,600 jobs in electric vehicle manufacturing and fewer than 5,000 in the solar industry, according to the 11th Annual National Solar Jobs Census 2020.

In addition to job growth, CHP adoption could lead to dramatic health benefits for the state's more vulnerable residents, Brown emphasized. "We're displacing more polluting electricity when companies generate their own from waste heat," she noted.

The study estimates nearly $150 million in reduced health costs and ecological damages in 2030 in the "achievable" scenario for CHP, with nearly $1 billion in health and ecological benefits if every Georgia plant identified in the study adopted CHP.

"The public health improvements are gigantic -- that's a lot of lives saved, as well as childhood asthma and heart problems avoided," Brown said.

Georgia Tech's research was sponsored by Drawdown Georgia, a statewide initiative focused on scaling market-ready, high-impact climate solutions in Georgia this decade. The organization has identified a roadmap of 20 solutions, including electricity solutions such as CHP.

The impact of CHP could be dramatic considering that electricity generation accounts for nearly 37% of Georgia's energy-related carbon dioxide emissions, according to findings Brown and other researchers published earlier this year in the journal Environmental Management.

Identifying Ideal CHP Sites

Georgia Tech researchers identified numerous industrial sites in Georgia that could use combined heat and power. Ideal locations include universities, military bases, and large industrial sites such as paper mills, chemical plants, and food processing facilities. Georgia's number-one industry is agriculture, with chemicals and wood products among the state's top manufacturing sectors.

"I find Georgia's potential to take advantage of existing industrial and commercial facilities to build CHP plants very interesting," said study co-author Valentina Sanmiguel, a 2020 master's graduate of the School of Public Policy in sustainable energy and environmental management. "I hope both industries and policymakers in Georgia realize the benefits that cogeneration has on the environment, the economy and society and take action to implement CHP in the state at a greater scale."

Dissecting Hurdles to Adoption

Despite the advantages of CHP, there remain hurdles to its adoption. For one, establishing these facilities is capital-intensive, ranging from tens of millions of dollars for a campus CHP plant to hundreds of millions for a large plant at an industrial site. Once built, these facilities require their own workforce to operate, explained Brown.

"The cost-competitiveness of CHP systems depends significantly on two factors -- whether they are customer or utility-owned, and the type of rate tariff they operate under," said Brown.

In the paper, Georgia Tech cited three ways to improve the business case for CHP: clean energy portfolio standards, regulatory reform, and financial incentives such as tax credits.

Those approaches have worked well in North Carolina, noted Isaac Panzarella, director of the Department of Energy Southeast CHP Technical Assistance Partnership and assistant director for technical services at the North Carolina Clean Energy Technology Center at North Carolina State University. The university, where Panzarella is based, recently installed its second CHP facility on campus.

North Carolina, he added, has a policy that supports the use of renewable energy. Along with solar and wind, North Carolina embraced converting waste from swine and poultry-feeding operations into renewable energy.

"It's taken a long time, but finally there are more and more of these digester or biomass operations, using CHP to generate electricity and thermal energy from those waste resources," he said.

While Georgia Tech is not yet operating a CHP system, the Campus Sustainability Committee is currently examining options for lessening the university's energy footprint.

"Georgia Tech seeks to leverage Dr. Brown's important research, and the deep faculty expertise at Georgia Tech in climate solutions, as we advance the development of a campus-wide Carbon Neutrality Plan and Campus Master Plan in 2022," said Anne Rogers, associate director, Office of Campus Sustainability. "The Campus Master and Carbon Neutrality Plan will provide a roadmap to implementing sustainable infrastructure solutions to advance Georgia Tech's strategic goals."

Read more at Science Daily

Most dog breeds highly inbred

Dog breeds are often recognized for distinctive traits -- the short legs of a dachshund, wrinkled face of a pug, spotted coat of a Dalmatian. Unfortunately, the genetics that give various breeds their particular attributes are often the result of inbreeding.

In a recent study published in Canine Medicine and Genetics, an international team of researchers led by University of California, Davis, veterinary geneticist Danika Bannasch shows that the majority of canine breeds are highly inbred, contributing to an increase in disease and health care costs throughout their lifespan.

"It's amazing how inbreeding seems to matter to health," Bannasch said. "While previous studies have shown that small dogs live longer than large dogs, no one had previously reported on morbidity, or the presence of disease. This study revealed that if dogs are of smaller size and not inbred, they are much healthier than larger dogs with high inbreeding."

Inbreeding affects health

The average inbreeding, based on genetic analysis across 227 breeds, was close to 25% -- the equivalent of sharing the same genetic material with a full sibling. These levels are well above what would be considered safe for either humans or wild animal populations. In humans, high levels of inbreeding (3-6%) have been associated with increased prevalence of complex diseases as well as other conditions.
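
As a point of reference for the 25% figure, Wright's path-counting formula gives exactly that value as the inbreeding coefficient of the offspring of a full-sibling mating, which is one way to read the comparison above. The short sketch below works through that standard textbook calculation; it is background genetics, not an analysis from the study.

# Wright's inbreeding coefficient: F = sum over common ancestors of (1/2)**(n1 + n2 + 1),
# where n1 and n2 are the path lengths from each parent up to the shared ancestor
# (assuming the ancestors themselves are not inbred).
def inbreeding_coefficient(paths):
    return sum(0.5 ** (n1 + n2 + 1) for n1, n2 in paths)

# Offspring of full siblings: two common ancestors (the grandparents),
# each one generation above each parent.
full_sib_paths = [(1, 1), (1, 1)]
print(inbreeding_coefficient(full_sib_paths))  # 0.25, i.e. the 25% quoted above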

"Data from other species, combined with strong breed predispositions to complex diseases like cancer and autoimmune diseases, highlight the relevance of high inbreeding in dogs to their health," said Bannasch, who also serves as the Maxine Adler Endowed Chair in Genetics at the UC Davis School of Veterinary Medicine.

The researchers partnered with Wisdom Health Genetics, a world leader in pet genetics, to obtain the largest sample size possible for analysis. Wisdom Health's database is the largest dog DNA database in the world, helping researchers collect data from 49,378 dogs across 227 breeds -- primarily from European sources.

Some breeds more inbred

So, what makes a dog breed more inbred than others? Bannasch explained that it's often a combination of a small founding population followed by strong selection for particular traits in a breed -- often based on looks rather than purpose. While she has always had an interest in the population structure of some of these breeds, she became particularly interested in the Danish-Swedish farmdog several years ago. She fell in love with their compact size, disposition and intelligence, and ended up importing one from Sweden.

Bannasch discovered that Danish-Swedish farmdogs have a low level of inbreeding based on their history of a relatively large founding population of 200, and being bred for function, rather than a strong artificial selection for looks. And according to the insurance health data on breeds collected from Agria Insurance Sweden and hosted online by the International Partnership for Dogs, the farmdog is one of the healthiest breeds.

The study also revealed a significant difference in morbidity between brachycephalic (short skull and snout) and non-brachycephalic breeds. While that finding wasn't unexpected, the researchers removed brachycephalic breeds from the final analysis on effects of inbreeding on health.

Preserving genetic diversity

In the end, Bannasch said she isn't sure there is a way out of inbred breeds. People have recognized that creating matches based solely on pedigrees is misleading. The inbreeding calculators don't go back far enough in a dog's genetic line, and that method doesn't improve overall high levels of population inbreeding.

There are other measures that can be taken to preserve the genetic diversity and health of a breed, she said. They include careful management of breeding populations to avoid additional loss of existing genetic diversity, through breeder education and monitoring of inbreeding levels enabled by direct genotyping technologies.

Outcrosses are being proposed or have already been carried out for some breeds and conditions as a measure to increase genetic diversity, but care must be taken to consider if these will effectively increase overall breed diversity and therefore reduce inbreeding, Bannasch said. In particular, in the few breeds with low inbreeding levels, every effort should be made to maintain the genetic diversity that is present.

Read more at Science Daily

Whether people inform themselves or remain ignorant is due to three factors

People choose whether to seek or avoid information about their health, finances and personal traits based on how they think it will make them feel, how useful it is, and if it relates to things they think about often, finds a new study by UCL researchers.

Most people fall into one of three 'information-seeking types': those who mostly consider the impact of information on their feelings when deciding whether to get informed, those who mostly consider how useful information will be for making decisions, and those who mostly seek information about issues they think about often, according to the findings published in Nature Communications.

Co-lead author Professor Tali Sharot (UCL Psychology & Language Sciences and Max Planck UCL Centre for Computational Psychiatry and Ageing Research) said: "Vast amounts of information are now available to individuals. This includes everything from information about your genetic make-up to information about social issues and the economy. We wanted to find out: how do people decide what they want to know? And why do some people actively seek out information, for example about COVID vaccines, financial inequality and climate change, and others don't?

"The information people decide to expose themselves to has important consequences for their health, finance and relationships. By better understanding why people choose to get informed, we could develop ways to convince people to educate themselves."

The researchers conducted five experiments with 543 research participants, to gauge what factors influence information-seeking.

In one of the experiments, participants were asked how much they would like to know about health information, such as whether they had an Alzheimer's risk gene or a gene conferring a strong immune system. In another experiment, they were asked whether they wanted to see financial information, such as exchange rates or what income percentile they fall into, and in another one, whether they would have liked to learn how their family and friends rated them on traits such as intelligence and laziness.

Later, participants were asked how useful they thought the information would be, how they expected it would make them feel, and how often they thought about each subject matter in question.

The researchers found that people choose to seek information based on these three factors: expected utility, emotional impact, and whether it was relevant to things they thought of often. This three-factor model best explained decisions to seek or avoid information compared to a range of other alternative models tested.
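
As a toy illustration of what a three-factor model of this kind looks like, the sketch below fits one weight per motive with a simple linear regression. The structure and every number here are hypothetical, invented for illustration; this is not the authors' model or their data.

import numpy as np

# Hypothetical ratings for five pieces of information: expected usefulness,
# expected emotional impact, and how often the topic is thought about (all 0-10).
X = np.array([
    [8, 2, 5],
    [3, 7, 6],
    [5, 5, 9],
    [2, 1, 2],
    [9, 8, 8],
], dtype=float)
# Hypothetical willingness-to-know scores reported for the same items.
y = np.array([6.0, 5.5, 7.5, 1.5, 9.0])

# Fit one weight per motive by ordinary least squares; the relative weights
# indicate which of the three factors dominates for this (made-up) participant.
weights, *_ = np.linalg.lstsq(np.column_stack([X, np.ones(len(y))]), y, rcond=None)
print("utility, affect, frequency weights:", weights[:3].round(2))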

Some participants repeated the experiments a couple of times, months apart. The researchers found that most people prioritise one of the three motives (feelings, usefulness, frequency of thought) over the others, and their specific tendency remained relatively stable across time and domains, suggesting that what drives each person to seek information is 'trait-like'.

In two experiments, participants also filled out a questionnaire gauging their general mental health. The researchers found that, when people sought information about their own traits, those who mostly wanted to know about traits they thought about often reported better mental health.

Co-lead author and PhD student Christopher Kelly (UCL Psychology & Language Sciences and Max Planck UCL Centre for Computational Psychiatry and Ageing Research) said: "By understanding people's motivations to seek information, policy makers may be able to increase the likelihood that people will engage with and benefit from vital information. For example, if policy makers highlight the potential usefulness of their message and the positive feelings that it may elicit, they may improve the effectiveness of their message."

Read more at Science Daily

Dec 2, 2021

Lunar radar data uncovers new clues about moon’s ancient past

The dusty surface of the moon -- immortalized in images of Apollo astronauts' lunar footprints -- formed as the result of asteroid impacts and the harsh environment of space breaking down rock over millions of years. An ancient layer of this material, covered by periodic lava flows and now buried under the lunar surface, could provide new insight into the moon's deep past, according to a team of scientists.

"Using careful data processing, we found interesting new evidence that this buried layer, called paleoregolith, may be much thicker than previously expected," said Tieyuan Zhu, assistant professor of geophysics at Penn State. "These layers have been undisturbed since their formation and could be important records for determining early asteroid impact and volcanic history of the moon."

The team, led by Zhu, conducted new analysis of radar data collected by China's Chang'e 3 mission in 2013, which performed the first direct ground radar measurements on the moon.

The researchers identified a thick layer of paleoregolith, roughly 16 to 30 feet deep, sandwiched between two layers of lava rock believed to be 2.3 and 3.6 billion years old. The findings suggest the paleoregolith formed much faster than previous estimates of 6.5 feet per billion years, the scientists said.
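
The "much faster" claim follows directly from the numbers quoted: a 16-to-30-foot layer deposited between lava flows dated about 3.6 and 2.3 billion years ago implies a rate several times the older 6.5-feet-per-billion-years estimate. The back-of-the-envelope arithmetic below is ours, not a calculation from the paper.

# Implied regolith accumulation rate between the two bounding lava flows.
thickness_ft = (16, 30)              # observed paleoregolith thickness, feet
interval_gyr = 3.6 - 2.3             # time between the lava flows, billion years
rates = [t / interval_gyr for t in thickness_ft]
print([round(r, 1) for r in rates])  # roughly 12 to 23 feet per billion years
print("previous estimate: 6.5 feet per billion years")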

The moon has experienced volcanic activity throughout its history, depositing lava rock on the surface. Over time, the rock breaks down into dust and soil, called regolith, with repeated asteroid impacts and space weathering, only to be buried by subsequent lava flows, the scientists said.

"Lunar scientists count craters on the moon and use computer models to determine the rate that regolith is produced," Zhu said. "Our findings provide a constraint on what happened between two and three billion years ago. This is the very unique contribution of this work."

Previous studies have examined the dataset, created when the Yutu rover sent electromagnetic pulses into the lunar underground and listened as they echoed back. Zhu said his team developed a four-step data processing flow to enhance the signal and suppress noise in the data.

The scientists observed changes in polarity as the electromagnetic pulses traveled down through the dense lava rock and the paleoregolith, allowing the team to distinguish between the different layers.

"Our paper is really providing the first geophysical evidence to see this electromagnetic permittivity changed from a small value for the paleoregolith to a large value for the lava flows," Zhu said. "We discovered this polarity change in the data and created a detailed geophysical image of the subsurface up to a few hundred meters depth."

The results may indicate higher meteoric activity in the solar system during this period billions of years ago, according to the team, who recently reported their findings in the journal Geophysical Research Letters.

Zhu said the data processing tools may have use for interpreting similar data collected during future missions to the moon, Mars or elsewhere in the solar system. His team is now working with machine learning technology to further improve the findings.

Read more at Science Daily

Climate modeling confirms historical records showing rise in hurricane activity

When forecasting how storms may change in the future, it helps to know something about their past. Judging from historical records dating back to the 1850s, hurricanes in the North Atlantic have become more frequent over the last 150 years.

However, scientists have questioned whether this upward trend is a reflection of reality, or simply an artifact of lopsided record-keeping. If 19th-century storm trackers had access to 21st-century technology, would they have recorded more storms? This inherent uncertainty has kept scientists from relying on storm records, and the patterns within them, for clues to how climate influences storms.

A new MIT study published today in Nature Communications has used climate modeling, rather than storm records, to reconstruct the history of hurricanes and tropical cyclones around the world. The study finds that North Atlantic hurricanes have indeed increased in frequency over the last 150 years, similar to what historical records have shown.

In particular, major hurricanes, and hurricanes in general, are more frequent today than in the past. And those that make landfall appear to have grown more powerful, carrying more destructive potential.

Curiously, while the North Atlantic has seen an overall increase in storm activity, the same trend was not observed in the rest of the world. The study found that the frequency of tropical cyclones globally has not changed significantly in the last 150 years.

"The evidence does point, as the original historical record did, to long-term increases in North Atlantic hurricane activity, but no significant changes in global hurricane activity," says study author Kerry Emanuel, the Cecil and Ida Green Professor of Atmospheric Science in MIT's Department of Earth, Atmospheric, and Planetary Sciences. "It certainly will change the interpretation of climate's effects on hurricanes -- that it's really the regionality of the climate, and that something happened to the North Atlantic that's different from the rest of the globe. It may have been caused by global warming, which is not necessarily globally uniform."

Chance encounters

The most comprehensive record of tropical cyclones is compiled in a database known as the International Best Track Archive for Climate Stewardship (IBTrACS). This historical record includes modern measurements from satellites and aircraft that date back to the 1940s. The database's older records are based on reports from ships and islands that happened to be in a storm's path. These earlier records date back to 1851, and overall the database shows an increase in North Atlantic storm activity over the last 150 years.

"Nobody disagrees that that's what the historical record shows," Emanuel says. "On the other hand, most sensible people don't really trust the historical record that far back in time."

Recently, scientists have used a statistical approach to identify storms that the historical record may have missed. To do so, they consulted all the digitally reconstructed shipping routes in the Atlantic over the last 150 years and mapped these routes over modern-day hurricane tracks. They then estimated the chance that a ship would encounter or entirely miss a hurricane's presence. This analysis found a significant number of early storms were likely missed in the historical record. Accounting for these missed storms, they concluded that there was a chance that storm activity had not changed over the last 150 years.
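
A minimal sketch of that kind of detection-probability estimate: treat a storm as "observable" if any digitized ship position passed within some radius of its track. The coordinates, the radius, and the flat-Earth distance approximation below are all invented for illustration and are not the published method.

import math

def detectable(storm_track, ship_positions, radius_km=200.0):
    """Return True if any point on the storm track lies within radius_km
    of any recorded ship position (crude flat-Earth distance for illustration)."""
    km_per_deg = 111.0
    for (slat, slon) in storm_track:
        for (plat, plon) in ship_positions:
            dlat = (slat - plat) * km_per_deg
            dlon = (slon - plon) * km_per_deg * math.cos(math.radians(slat))
            if math.hypot(dlat, dlon) <= radius_km:
                return True
    return False

# Hypothetical storm track and ship positions (degrees latitude, longitude).
storm = [(15.0, -45.0), (18.0, -50.0), (22.0, -55.0)]
ships = [(17.5, -49.5), (30.0, -70.0)]
print(detectable(storm, ships))  # True: one ship passed close to the track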

But Emanuel points out that hurricane paths in the 19th century may have looked different from today's tracks. What's more, the scientists may have missed key shipping routes in their analysis, as older routes have not yet been digitized.

"All we know is, if there had been a change (in storm activity), it would not have been detectable, using digitized ship records," Emanuel says "So I thought, there's an opportunity to do better, by not using historical data at all."

Seeding storms

Instead, he estimated past hurricane activity using dynamical downscaling -- a technique that his group developed and has applied over the last 15 years to study climate's effect on hurricanes. The technique starts with a coarse global climate simulation and embeds within this model a finer-resolution model that simulates features as small as hurricanes. The combined models are then fed with real-world measurements of atmospheric and ocean conditions. Emanuel then scatters the realistic simulation with hurricane "seeds" and runs the simulation forward in time to see which seeds bloom into full-blown storms.

For the new study, Emanuel embedded a hurricane model into a climate "reanalysis" -- a type of climate model that combines observations from the past with climate simulations to generate accurate reconstructions of past weather patterns and climate conditions. He used a particular subset of climate reanalyses that only accounts for observations collected from the surface -- for instance from ships, which have recorded weather conditions and sea surface temperatures consistently since the 1850s, as opposed to from satellites, which only began systematic monitoring in the 1970s.

"We chose to use this approach to avoid any artificial trends brought about by the introduction of progressively different observations," Emanuel explains.

He ran an embedded hurricane model on three different climate reanalyses, simulating tropical cyclones around the world over the past 150 years. Across all three models, he observed "unequivocal increases" in North Atlantic hurricane activity.

"There's been this quite large increase in activity in the Atlantic since the mid-19th century, which I didn't expect to see," Emanuel says.

Within this overall rise in storm activity, he also observed a "hurricane drought" -- a period during the 1970s and 80s when the number of yearly hurricanes momentarily dropped. This pause in storm activity can also be seen in historical records, and Emanuel's group proposes a cause: sulfate aerosols, which were byproducts of fossil fuel combustion, likely set off a cascade of climate effects that cooled the North Atlantic and temporarily suppressed hurricane formation.

Read more at Science Daily

Strategies to improve sales of imperfect carrots

Explaining the value of misshapen vegetables -- that they are as healthful as their picture-perfect counterparts and buying them helps reduce food waste -- could help improve sales of "ugly" produce, new research suggests.

The study measured consumers' responses to hypothetical shopping scenarios for carrots. Participants were most open to buying bunches containing imperfect carrots after being presented with both of those marketing messages promoting ugly carrots' personal and societal benefits. Either message alone was not effective at convincing consumers to buy misfit carrots.

Findings also showed that respondents were willing to pay for mixed bunches containing both ugly and standard carrots, at a small discount, as long as misshapen carrots made up no more than 40% of the bunch -- a sign to regulators who set the tolerance level for cosmetic standards that such a practice could be profitable.

One 2018 study in North Carolina suggested that about 41% of unharvested food is edible but unmarketable because of its appearance. The researchers are assessing ways to "win" with ugly foods in the marketplace by testing consumer acceptance of imperfect foods that don't come with a built-in discount -- a tactic used by some brick-and-mortar and online retailers that hasn't had much staying power.

"Any time you codify that cosmetically imperfect produce is somehow lesser, you're stuck selling it for less and therefore you undermine the entire value chain," said senior study author Brian Roe, professor in the Department of Agricultural, Environmental and Development Economics at The Ohio State University.

"We see that once you promote it as being more natural and as reducing wasted food, the discount is less than it otherwise would be -- but there is also a cluster of folks who are actually willing to pay as much or more because they value reducing food waste and they value the fact that it's got just as much nutrition as standard produce."

Roe conducted the study with Danyi Qi and Jerrod Penn of Louisiana State University and Ran Li, an Ohio State PhD student. The research is published online ahead of print in the Journal of Retailing and Consumer Services.

The researchers surveyed 1,300 U.S. residents who shopped and cooked for their households. Participants in the online survey were randomly assigned to receive one or a combination of two marketing messages: ugly carrots' nutritional quality equals that of blemish-free produce, and there are social costs linked to throwing away food with cosmetic flaws.

Participants also selected from images of their preferred 2-pound carrot bunches and price points, with six bunches -- either with or without their greenery attached -- containing 0% to 100% ugly carrots and prices ranging from $2.18 to $1.39 per pound. In another choice test, consumers could select from just two options, a bunch of all standard carrots or all imperfect carrots with or without green leaves attached, in a hypothetical purchase from either a farmers market or a conventional grocery store.

Participants consistently disliked bunches that included any ugly carrots at all, and the amount they were willing to pay for any number of imperfect carrots was always lower than what they'd pay for 100% standard carrots.

But a top contender in terms of profitability for farmers did emerge from the analysis of participant responses: bunches containing 40% ugly carrots and 60% standard carrots with green leaves attached sold at farmers markets where consumers are exposed to the combined marketing messages.

"If you're at a farmers market, you're thinking more holistically, you're not thinking about cosmetic perfection. You expect things to be more 'real,'" Roe said. "So I think then people realize this is what we might expect if we're getting produce directly from a farmer -- there's more room for imperfection because it's probably not interpreted as imperfection. It's interpreted as naturalness."

The research team analyzed the tipping point in consumer willingness to pay that could make harvesting ugly carrots profitable -- an important calculation for farmers who need a positive return on their investment into planting, picking and shipping their crops. The U.S. Department of Agriculture also has a say in the percentage of non-standard produce that can be sent to market -- a limit that may need to be revisited, Roe said.

"We hope these findings will change the viewpoint of the industry. If you want to move into the ugly produce space, you probably need to rebrand it rather than locking in a discount and saying, 'This is ugly food that should be worth less, so let's just lock it in as being an inferior good from the get-go," he said. "There hasn't been a lot of rethinking of standards in light of food waste, so that would be one policy lever that could be re-examined to deal with food waste in the modern era."

Read more at Science Daily

Stem cell-based implants successfully secrete insulin in patients with type 1 diabetes

Interim results from a multicenter clinical trial demonstrate insulin secretion from engrafted cells in patients with type 1 diabetes. The safety, tolerability, and efficacy of the implants, which consisted of pancreatic endoderm cells derived from human pluripotent stem cells (PSCs), were tested in 26 patients. While the insulin secreted by the implants did not have clinical effects in the patients, the data are the first reported evidence of meal-regulated insulin secretion by differentiated stem cells in human patients. The results appear December 2 in the journals Cell Stem Cell and Cell Reports Medicine.

"A landmark has been set. The possibility of an unlimited supply of insulin-producing cells gives hope to people living with type 1 diabetes," says Eelco de Koning of Leiden University Medical Center, a co-author of an accompanying commentary published in Cell Stem Cell. "Despite the absence of relevant clinical effects, this study will remain an important milestone for the field of human PSC-derived cell replacement therapies as it is one of the first to report cell survival and functionality one year after transplantation."

Approximately 100 years following the discovery of the hormone insulin, type 1 diabetes remains a life-altering and sometimes life-threatening diagnosis. The disease is characterized by the destruction of insulin-producing β-cells in the Islets of Langerhans of the pancreas, leading to high levels of the blood sugar glucose.

Insulin treatment lowers glucose concentrations but does not completely normalize them. Moreover, modern insulin delivery systems can be burdensome to wear for long periods, sometimes malfunction, and often lead to long-term complications. While islet replacement therapy could offer a cure because it restores insulin secretion in the body, this procedure has not been widely adopted because donor organs are scarce. These challenges underscore the need for an abundant alternate supply of insulin-producing cells.

The use of human PSCs has made significant progress toward becoming a viable clinical option for the mass production of insulin-producing cells. In 2006, scientists at Novocell (now ViaCyte) reported a multi-stage protocol directing the differentiation of human embryonic stem cells into immature pancreatic endoderm cells. This stepwise protocol manipulating key signaling pathways was based on embryonic development of the pancreas. Follow-up studies showed that these pancreatic endoderm cells were able to mature further and become fully functional when implanted in animal models. Based on these results, clinical trials were started using these pancreatic endoderm cells.

Now two groups report on a phase I/II clinical trial in which pancreatic endoderm cells were placed in non-immunoprotective ("open") macroencapsulation devices, which allowed for direct vascularization of the cells, and implanted under the skin in patients with type 1 diabetes. The use of third-party off-the-shelf cells in this stem cell-based islet replacement strategy required immunosuppressive agents, which protect against graft rejection but can cause major side effects, such as cancer and infections. The participants underwent an immunosuppressive treatment regimen that is commonly used in donor islet transplantation procedures.

In Cell Stem Cell, Timothy Kieffer of the University of British Columbia and his collaborators provided compelling evidence of functional insulin-secreting cells after implantation. PEC-01s -- the drug candidate pancreatic endoderm cells produced by ViaCyte -- survived and matured into glucose-responsive, insulin-secreting cells within 26 weeks after implantation. Over up to one year of follow-up, patients had 20% reduced insulin requirements and spent 13% more time in the target blood glucose range. Overall, the implants were well tolerated, with no severe graft-related adverse events.

"For the first time, we provide evidence that stem cell-derived PEC-01s can mature into glucose-responsive, insulin-producing mature ?-cells in vivo in patients with type 1 diabetes," Kieffer says. "These early findings support future investment and investigation into optimizing cell therapies for diabetes."

However, two patients experienced serious adverse events associated with the immunosuppression protocol. Moreover, there was no control group and the interventions were not blinded, limiting causal conclusions, and outcomes were highly variable among the small number of participants. In addition, further studies need to determine the dose of pancreatic endoderm cells necessary to achieve clinically relevant benefits for patients.

In Cell Reports Medicine, Howard Foyt of ViaCyte and his collaborators reported engraftment and insulin expression in 63% of devices explanted from trial subjects at time periods ranging from 3 to 12 months after implantation. The progressive accumulation of functional, insulin-secreting cells occurred over a period of approximately 6-9 months from the time of implant.

The majority of reported adverse events were related to surgical implant or explant procedures or to immunosuppressive side effects. Despite potent systemic immune suppression, multiple surgical implantation sites, and the presence of foreign materials, the risk of local infection was exceedingly low, suggesting that this approach is well tolerated in subjects who are at risk for a poor healing response. The researchers are currently working on ways to promote graft vascularization and survival.

"The present study demonstrates definitively for the first time to our knowledge, in a small number of human subjects with type 1 diabetes, that PSC-derived pancreatic progenitor cells have the capacity to survive, engraft, differentiate, and mature into human islet-like cells when implanted subcutaneously," Foyt says.

Both reports showed that the grafts were vascularized and that cells in the device can survive up to 59 weeks after implantation. Analyses of the grafts revealed that the main islet cell types, including β-cells, are present. Moreover, there was no formation of tumors called teratomas. However, the ratio of different endocrine cell types was atypical compared to mature pancreatic islets, and the total percentage of insulin-positive cells in the device was relatively low.

Regarding safety, most severe adverse events were associated with the use of immunosuppressive agents, emphasizing the life-long use of these drugs as a major hurdle for wider implementation of these types of cell replacement therapies. "An ideal and sunny possible future scenario would be the wide availability of a safe and efficacious stem cell-based islet replacement therapy without the need for these immunosuppressive agents or invasive, high-risk transplantation procedures," says Françoise Carlotti of Leiden University Medical Center, a co-author of the related commentary.

According to de Koning and Carlotti, many questions remain to be answered. For example, researchers need to determine the differentiation stage at which the cells are most optimal for transplantation, and the best transplantation site. It is also not clear whether the effectiveness and safety of the cells can be maintained over time, and whether it is possible to eliminate the need for immunosuppressive therapy.

Read more at Science Daily

Dec 1, 2021

Astronomers discover strangely massive black hole in Milky Way satellite galaxy

Astronomers at The University of Texas at Austin's McDonald Observatory have discovered an unusually massive black hole at the heart of one of the Milky Way's dwarf satellite galaxies, called Leo I. Almost as massive as the black hole in our own galaxy, the finding could redefine our understanding of how all galaxies -- the building blocks of the universe -- evolve. The work is published in a recent issue of The Astrophysical Journal.

The team decided to study Leo I because of its peculiarity. Unlike most dwarf galaxies orbiting the Milky Way, Leo I does not contain much dark matter. Researchers measured Leo I's dark matter profile -- that is, how the density of dark matter changes from the outer edges of the galaxy all the way into its center. They did this by measuring its gravitational pull on the stars: The faster the stars are moving, the more matter there is enclosed in their orbits. In particular, the team wanted to know whether dark matter density increases toward the galaxy's center. They also wanted to know whether their profile measurement would match previous ones made using older telescope data combined with computer models.
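
The "faster stars, more enclosed mass" statement is the standard dynamical-mass relation, roughly M(<r) ≈ v² r / G for stars moving with characteristic speed v at radius r. The sketch below evaluates it with purely illustrative numbers, not the Leo I measurements.

# Rough enclosed-mass estimate from stellar speeds: M(<r) ~ v**2 * r / G.
G = 6.674e-11                      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30                   # kg
PC = 3.086e16                      # metres per parsec

v = 10e3                           # assumed characteristic stellar speed, 10 km/s
r = 100 * PC                       # assumed radius, 100 parsecs

mass_enclosed = v**2 * r / G
print(f"{mass_enclosed / M_SUN:.2e} solar masses")  # a few times 10^6 M_sun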

Led by recent UT Austin doctoral graduate María José Bustamante, the team includes UT astronomers Eva Noyola, Karl Gebhardt and Greg Zeimann, as well as colleagues from Germany's Max Planck Institute for Extraterrestrial Physics (MPE).

For their observations, they used a unique instrument called VIRUS-W on McDonald Observatory's 2.7-meter Harlan J. Smith Telescope.

When the team fed their improved data and sophisticated models into a supercomputer at UT Austin's Texas Advanced Computing Center, they got a startling result.

"The models are screaming that you need a black hole at the center; you don't really need a lot of dark matter," Gebhardt said. "You have a very small galaxy that is falling into the Milky Way, and its black hole is about as massive as the Milky Way's. The mass ratio is absolutely huge. The Milky Way is dominant; the Leo I black hole is almost comparable." The result is unprecedented.

The researchers said the result was different from the past studies of Leo I due to a combination of better data and the supercomputer simulations. The central, dense region of the galaxy was mostly unexplored in previous studies, which concentrated on the velocities of individual stars. The current study showed that for those few velocities that were taken in the past, there was a bias toward low velocities. This, in turn, decreased the inferred amount of matter enclosed within their orbits.

The new data is concentrated in the central region and is unaffected by this bias. The amount of inferred matter enclosed within the stars' orbits skyrocketed.

The finding could shake up astronomers' understanding of galaxy evolution, as "there is no explanation for this kind of black hole in dwarf spheroidal galaxies," Bustamante said.

The result is all the more important as astronomers have used galaxies such as Leo I, called "dwarf spheroidal galaxies," for 20 years to understand how dark matter is distributed within galaxies, Gebhardt added. This new type of black hole merger also gives gravitational wave observatories a new signal to search for.

"If the mass of Leo I's black hole is high, that may explain how black holes grow in massive galaxies," Gebhardt said. That's because over time, as small galaxies like Leo I fall into larger galaxies, the smaller galaxy's black hole merges with that of the larger galaxy, increasing its mass.

Read more at Science Daily

Male animals are subject to stronger evolutionary pressures than females

Male animals are subject to stronger selection pressures than females, which may allow populations to adapt to environmental change more efficiently, according to a report published in the open-access journal eLife.

The study supports one of the long-standing assumptions underpinning the idea that sexual selection bolsters adaptation: that stronger selection on males allows them to purge the population of genetic mutations that reduce survival fitness.

Sexual selection is selection arising from competition for mating partners and/or their reproductive cells (their eggs or sperm). For almost a century, researchers have thought that sexual selection is the ultimate selective force that generates the differences we see between male and female animals in terms of reproductive fitness and life history. Yet, little is known about how sexual selection combines with other environmental pressures to impact population demography and adaptive ability.

Living organisms accumulate mutations throughout life -- some of which help them become fitter for survival, and some of which provide no benefit and may even cause a disadvantage (called deleterious mutations). Sexual selection is thought to promote evolutionary adaptation if it gives rise to stronger net selection -- that is, the total purifying selection against deleterious mutations -- in males rather than females. This is because a population's productivity relies on females' ability to reproduce, so stronger net selection on males allows a population to get rid of deleterious mutations quickly and adapt to its environment at a lower cost, which may eventually reduce the risk of extinction.

"Our knowledge on whether such stronger sexual selection on males translates into stronger net selection to females is still limited," says first author Lennart Winkler, a PhD student at TU Dresden, Germany. "Previous studies have used the phenotypic variance of fitness to measure net selection, but its relevance has been questioned. An alternative measure is the organism's genetic variance of fitness. We used both measures to show whether net selection is generally stronger on males across a broad range of species."

The team ran a systematic literature search and compiled 101 paired estimates of male and female genetic variances across 26 species for two important components of an organism's fitness: reproductive success and lifespan.

They then tested whether the phenotypic variances were aligned to the genetic variances, and whether genetic variances show consistent sex differences. They predicted that males would show larger genetic variance in reproductive success but not in lifespan.

They found that the phenotypic variance of lifespan but not of reproductive success predicted the genetic variance in either males or females. Importantly, however, the phenotypic variance of reproductive success was larger in males than females, and this translated into a male bias in genetic variance. This sex difference could be detected in polygamous but not monogamous species. By contrast, there were no consistent sex differences in phenotypic or genetic variance for lifespan.
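
As an illustration of the kind of paired comparison the study performed -- this is not the authors' analysis, and the variance values below are invented stand-ins for the 101 paired estimates -- the sketch computes a simple male-bias index, log(Vm/Vf), for paired male and female variance estimates; values above zero indicate a male-biased variance.

```python
import math
import statistics

# Hypothetical paired (male, female) variances in reproductive success --
# illustrative stand-ins for the study's paired estimates, not real data.
pairs = [
    (0.42, 0.21), (0.35, 0.30), (0.55, 0.25),
    (0.28, 0.27), (0.61, 0.33), (0.40, 0.22),
]

# A simple male-bias index per pair: log(Vm / Vf) > 0 means male-biased.
log_ratios = [math.log(vm / vf) for vm, vf in pairs]
mean_bias = statistics.mean(log_ratios)
n_male_biased = sum(lr > 0 for lr in log_ratios)

print(f"mean log(Vm/Vf) = {mean_bias:.2f} "
      f"({n_male_biased}/{len(pairs)} pairs male-biased)")
```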

Read more at Science Daily

Coffee time: Caffeine improves reaction to moving targets

In the first study of its kind to explore caffeine's effects on dynamic visual skills, researchers concluded that caffeine increases alertness and detection accuracy for moving targets. Caffeine also improved participants' reaction times.

"A lot of what happens in our environment is moving -- like trying to cross a busy intersection as a pedestrian or finding something on a shelf as you're walking through the aisles of a grocery store," said Dr. Kristine Dalton of Waterloo's School of Optometry & Vision Science. "Testing visual acuity under dynamic conditions can provide more information about our functional performance in these scenarios than traditional static visual acuity measurements alone."

Visual acuity, also known as clarity of vision or sharpness of vision, refers to a person's ability to detect and recognize small details and can be measured under static (stationary) or dynamic (moving) conditions. While both static and dynamic visual acuity provide important information about how we interact with the world around us, dynamic visual acuity skills are especially important in the many daily activities in which we, or objects around us are moving.

"While we already know that caffeine increases the velocity of rapid-eye movements, we wanted to further investigate how exactly caffeine enhances visual processing and facilitates the detection of moving visual stimuli by testing dynamic visual acuity," said co-author Beatríz Redondo of the University of Granada's Department of Optics.

On two separate days, half of the study's participants ingested a caffeine capsule (4 mg/kg) while the other half ingested a placebo capsule. Using a computer-based test designed and validated at the University of Waterloo, each participant's dynamic visual acuity skills were measured before and 60 minutes after capsule ingestion.

Researchers found that participants who had ingested the caffeine capsules showed significantly greater accuracy and faster speed when identifying smaller moving stimuli, suggesting that caffeine positively influences participants' stimulus processing and decision-making. Eye movement velocity and contrast sensitivity, which are implicated in dynamic visual acuity performance, were also sensitive to caffeine intake.

"Our findings show that caffeine consumption can actually be helpful for a person's visual function by enhancing alertness and feelings of wakefulness," Dalton said. "This is especially true for those critical, everyday tasks, like driving, riding a bike or playing sports, that require us to attend to detailed information in moving objects when making decisions."

Read more at Science Daily

The diabetes medication that could revolutionize heart failure treatment

A medication originally used for patients with diabetes is the first to help people with heart failure and could revolutionise treatment, according to new research from the University of East Anglia.

Early research had shown that sodium-glucose co-transporter-2 (SGLT2) inhibitors could help around half of heart failure patients -- those with a condition known as reduced ejection fraction.

But new findings published today show that the medication could be beneficial for all heart failure patients -- including those with a second type of heart failure called preserved ejection fraction.

It is the first drug to provide a real benefit in terms of improving outcomes for these patients. And the research team say it will revolutionise treatment options.

Lead researcher Prof Vass Vassiliou, from UEA's Norwich Medical School and an Honorary Consultant Cardiologist at the Norfolk and Norwich University Hospital, said: "Heart failure is a condition where the heart is not pumping as well as it should, and it affects about one million people in the UK.

"There are two types of heart failure. Heart Failure with a reduction in ejection fraction happens when the heart is unable to pump blood round the body due to a mechanical issue. And heart failure with preserved ejection fraction happens when, despite the heart pumping out blood well, it is not sufficient to provide oxygen to all the parts of the body.

"Patients are equally split between the two types of heart failure.

"For many years there was not a single medicine that could improve the outcome in patients with the second type of heart failure -- those patients with preserved ejection fraction.

"This type of heart failure had puzzled doctors, as every medicine tested showed no benefit.

"One class of heart medication, called SGLT2 inhibitors, was initially used for patients with diabetes. However, it was noticed that it also helped patients who had heart failure.

"Previous studies had shown that this medication would be beneficial in heart failure with reduced ejection fraction.

"But we found that it can also help heart failure patients with preserved ejection fraction."

SGLT2 inhibitors are more commonly known under their trade-names Forxiga (Dapagliflozin), Invokana (Canagliflozin), and Jardiance (Empagliflozin).

The research team undertook a meta-analysis of all studies published in the field and brought together data from almost 10,000 patients. They used statistical modelling to show the specific effect of these medicines.
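
The article does not reproduce the team's statistical model, but the pooled figures quoted below are the kind of result a standard fixed-effect (inverse-variance) meta-analysis produces. The sketch that follows is a minimal illustration of that calculation with made-up hazard ratios and standard errors, not the study's data or code.

```python
import math

# Minimal fixed-effect (inverse-variance) meta-analysis sketch.
# The hazard ratios and standard errors below are hypothetical, purely to
# illustrate how a pooled estimate such as a "22 per cent lower risk" is derived.
trials = [
    {"hr": 0.75, "se": 0.08},   # hypothetical trial 1
    {"hr": 0.82, "se": 0.10},   # hypothetical trial 2
    {"hr": 0.79, "se": 0.12},   # hypothetical trial 3
]

weights = [1 / t["se"] ** 2 for t in trials]            # inverse-variance weights
log_hrs = [math.log(t["hr"]) for t in trials]
pooled_log_hr = sum(w * lh for w, lh in zip(weights, log_hrs)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

pooled_hr = math.exp(pooled_log_hr)
ci_low = math.exp(pooled_log_hr - 1.96 * pooled_se)
ci_high = math.exp(pooled_log_hr + 1.96 * pooled_se)

print(f"Pooled HR = {pooled_hr:.2f} "
      f"(95% CI {ci_low:.2f}-{ci_high:.2f}); "
      f"risk reduction ~ {(1 - pooled_hr) * 100:.0f}%")
```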

Prof Vassiliou said: "We found that patients taking SGLT2 inhibitors were 22 per cent less likely to die from heart-related causes or be hospitalised for heart failure exacerbation than those taking placebo.

"This is very important because this is the first medication that can provide a benefit to this previously untreatable group of patients -- in terms of heart-related deaths or hospitalisation.

"This is the first medication that can really improve the outcomes for this patient group and it will revolutionise the treatment offered to heart failure patients," he added.

Read more at Science Daily

Nov 30, 2021

Closest pair of supermassive black holes yet

Using the European Southern Observatory's Very Large Telescope (ESO's VLT), astronomers have revealed the closest pair of supermassive black holes to Earth ever observed. The two objects also have a much smaller separation than any other previously spotted pair of supermassive black holes and will eventually merge into one giant black hole.

Lead author Karina Voggel and her team were able to determine the masses of the two objects by looking at how the gravitational pull of the black holes influences the motion of the stars around them. The bigger black hole, located right at the core of NGC 7727, was found to have a mass almost 154 million times that of the Sun, while its companion is 6.3 million solar masses.

It is the first time the masses have been measured in this way for a supermassive black hole pair. This feat was made possible thanks to the close proximity of the system to Earth and the detailed observations the team obtained at the Paranal Observatory in Chile using the Multi-Unit Spectroscopic Explorer (MUSE) on ESO's VLT, an instrument Voggel learnt to work with during her time as a student at ESO. Measuring the masses with MUSE, and using additional data from the NASA/ESA Hubble Space Telescope, allowed the team to confirm that the objects in NGC 7727 were indeed supermassive black holes.

Astronomers suspected that the galaxy hosted the two black holes, but they had not been able to confirm their presence until now since we do not see large amounts of high-energy radiation coming from their immediate surroundings, which would otherwise give them away. "Our finding implies that there might be many more of these relics of galaxy mergers out there and they may contain many hidden massive black holes that still wait to be found," says Voggel. "It could increase the total number of supermassive black holes known in the local Universe by 30 percent."

Read more at Science Daily

Sun is likely an unaccounted-for source of the Earth's water

Curtin University researchers have helped unravel the enduring mystery of the origins of the Earth's water, finding the Sun to be a surprising likely source.

A University of Glasgow-led international team of researchers, including those from Curtin's Space Science and Technology Centre (SSTC), found that the solar wind -- made up of charged particles from the Sun, largely hydrogen ions -- created water on the surface of dust grains carried on asteroids that smashed into the Earth during the early days of the Solar System.

SSTC Director, John Curtin Distinguished Professor Phil Bland said the Earth was very water-rich compared to other rocky planets in the Solar System, with oceans covering more than 70 percent of its surface, and scientists had long puzzled over the exact source of it all.

"An existing theory is that water was carried to Earth in the final stages of its formation on C-type asteroids, however previous testing of the isotopic 'fingerprint' of these asteroids found they, on average, didn't match with the water found on Earth meaning there was at least one other unaccounted for source," Professor Bland said.

"Our research suggests the solar wind created water on the surface of tiny dust grains and this isotopically lighter water likely provided the remainder of the Earth's water.

"This new solar wind theory is based on meticulous atom-by-atom analysis of minuscule fragments of an S-type near-Earth asteroid known as Itokawa, samples of which were collected by the Japanese space probe Hayabusa and returned to Earth in 2010.

"Our world-class atom probe tomography system here at Curtin University allowed us to take an incredibly detailed look inside the first 50 nanometres or so of the surface of Itokawa dust grains, which we found contained enough water that, if scaled up, would amount to about 20 litres for every cubic metre of rock."

Curtin graduate Dr Luke Daly, now of the University of Glasgow, said the research not only gives scientists a remarkable insight into the past source of Earth's water, but could also help future space missions.

"How astronauts would get sufficient water, without carrying supplies, is one of the barriers of future space exploration," Dr Daly said.

"Our research shows that the same space weathering process which created water on Itokawa likely occurred on other airless planets, meaning astronauts may be able to process fresh supplies of water straight from the dust on a planet's surface, such as the Moon."

Read more at Science Daily

Extinct swordfish-shaped marine reptile discovered

A team of international researchers from Canada, Colombia, and Germany has discovered a new marine reptile. The specimen, a stunningly preserved metre-long skull, is one of the last surviving ichthyosaurs -- ancient animals that look eerily like living swordfish.

"This animal evolved a unique dentition that allowed it to eat large prey," says Hans Larsson, Director of the Redpath Museum at McGill University. "Whereas other ichthyosaurs had small, equally sized teeth for feeding on small prey, this new species modified its tooth sizes and spacing to build an arsenal of teeth for dispatching large prey, like big fishes and other marine reptiles."

"We decided to name it Kyhytysuka which translates to 'the one that cuts with something sharp' in an indigenous language from the region in central Colombia where the fossil was found, to honour the ancient Muisca culture that existed there for millennia," says Dirley Cortes, a graduate student under the supervision of Hans Larsson and Carlos Jaramillo of the Smithsonian Tropical Research Institute.

The big picture of ichthyosaur evolution is clarified with this new species, the researchers say. "We compared this animal to other Jurassic and Cretaceous ichthyosaurs and were able to define a new type of ichthyosaurs," says Erin Maxwell of the State Natural History Museum of Stuttgart (a former graduate student of Hans Larsson's lab at McGill). "This shakes up the evolutionary tree of ichthyosaurs and lets us test new ideas of how they evolved."

According to the researchers, this species comes from an important transitional time during the Early Cretaceous period. At this time, the Earth was coming out of a relatively cool period, had rising sea levels, and the supercontinent Pangea was splitting into northern and southern landmasses. There was also a global extinction event at the end of the Jurassic that changed marine and terrestrial ecosystems. "Many classic Jurassic marine ecosystems of deep-water feeding ichthyosaurs, short-necked plesiosaurs, and marine-adapted crocodiles were succeeded by new lineages of long-necked plesiosaurs, sea turtles, large marine lizards called mosasaurs, and now this monster ichthyosaur," says Dirley Cortes.

"We are discovering many new species in the rocks this new ichthyosaur comes from. We are testing the idea that this region and time in Colombia was an ancient biodiversity hotspot and are using the fossils to better understand the evolution of marine ecosystems during this transitional time," she adds. As next steps, the researchers are continuing to explore the wealth of new fossils housed in the Centro de Investigaciones Paleontológicas of Villa de Leyva in Colombia. "This is where I grew up," says Cortes, "and it is so rewarding to get to do research here too."

Read more at Science Daily

Extraordinary Roman mosaic and villa discovered beneath farmer's field in Rutland, UK

Archaeologists have unearthed the first Roman mosaic of its kind in the UK. Today (Thursday 25th November 2021), a rare Roman mosaic and surrounding villa complex have been protected as a Scheduled Monument by DCMS on the advice of Historic England. The decision follows archaeological work undertaken by a team from University of Leicester Archaeological Services (ULAS), working in partnership with Historic England and in liaison with Rutland County Council.

The initial discovery of the mosaic was made during the 2020 lockdown by Jim Irvine, son of landowner Brian Naylor, who contacted the archaeological team at Leicestershire County Council, heritage advisors to the local authority. Given the exceptional nature of this discovery, Historic England was able to secure funding for urgent archaeological investigations of the site by ULAS in August 2020. Further excavation involving staff and students from the University of Leicester's School of Archaeology and Ancient History examined more of the site in September 2021. The remains of the mosaic measure 11m by almost 7m and depict part of the story of the Greek hero Achilles.

The artwork forms the floor of what's thought to be a large dining or entertaining area. Mosaics were used in a variety of private and public buildings across the Roman Empire, and often featured famous figures from history and mythology. However, the Rutland mosaic is unique in the UK in that it features Achilles and his battle with Hector at the conclusion of the Trojan War and is one of only a handful of examples from across Europe.

The room is part of a large villa building occupied in the late Roman period, between the 3rd and 4th century AD. The villa is also surrounded by a range of other buildings and features revealed by a geophysical survey and archaeological evaluation, including what appear to be aisled barns, circular structures and a possible bath house, all within a series of boundary ditches. The complex is likely to have been occupied by a wealthy individual, with a knowledge of classical literature.

Fire damage and breaks in the mosaic suggest that the site was later re-used and re-purposed. Other evidence uncovered includes the discovery of human remains within the rubble covering the mosaic. These burials are thought to have been interred after the building was no longer occupied, and while their precise age is currently unknown, they are later than the mosaic but placed in a relationship to the villa building, suggesting a very late Roman or Early-Medieval date for the repurposing of this structure. Their discovery gives an insight into how the site may have been used during this relatively poorly understood early post-Roman period of history.

Evidence recovered from the site will be analysed by ULAS at their University of Leicester base, and by specialists from Historic England and across the UK, including David Neal, the foremost expert on mosaic research in the country.

The protection as a scheduled monument recognises the exceptional national importance of this site. It ensures these remains are legally protected and helps combat unauthorised works or unlawful activities such as illegal metal detecting. The site has been thoroughly examined and recorded as part of the recent investigations and has now been backfilled to protect it for future generations.

The villa complex was found within an arable field where the shallow archaeological remains had been disturbed by ploughing and other activities. Historic England is working with the landowner to support the reversion of these fields to a sustainable grassland and pasture use. These types of agri-environment schemes are an essential part of how we can protect both the historic and natural environments and have contributed around £13 million per year towards the conservation and maintenance of our rural heritage. They help to preserve sites like the Rutland mosaic so that people can continue to enjoy and learn about our fascinating history.

In collaboration with the University of Leicester and other stakeholders, Historic England is planning further excavations on the site for 2022.

Discussions are on-going with Rutland County Council to explore the opportunity for an off-site display and interpretation of the villa complex and its finds. The form and scope of this work will be informed by the proposed future excavations and will be the subject of a future National Lottery Heritage Fund bid.

Read more at Science Daily

Scientists can control brain circuits, behavior, and emotion using light

Controlling signal transmission and reception within brain circuits is necessary for neuroscientists to achieve a better understanding of the brain's functions. Communication among neurons and glial cells is mediated by various neurotransmitters released from vesicles through exocytosis. Thus, regulating vesicular exocytosis is a possible strategy for controlling and understanding brain circuits.

However, it has been difficult to freely control the activity of brain cells in a spatiotemporal manner using pre-existing techniques. One indirect approach artificially controls the membrane potential of cells, but it can change the acidity of the surrounding environment or cause unwanted misfiring of neurons. Moreover, it cannot be used in cells that do not respond to membrane potential changes, such as glial cells.

To address this problem, South Korean researchers led by Director C. Justin LEE at the Center for Cognition and Sociality within the Institute for Basic Science (IBS) and professor HEO Won Do at Korea Advanced Institute of Science and Technology (KAIST) developed Opto-vTrap, a light-inducible and reversible inhibition system that can temporarily trap vesicles from being released from brain cells. Opto-vTrap directly targets transmitters containing vesicles, and it can be used in various types of brain cells, even the ones that do not respond to membrane potential changes.

In order to directly control the exocytotic vesicles, the research team applied a technology they previously developed in 2014, called light-activated reversible inhibition by assembled trap (LARIAT). This platform can inactivate various types of proteins when illuminated with blue light by instantly trapping the target proteins, like a lariat. Opto-vTrap was developed by applying this LARIAT platform to vesicle exocytosis. When Opto-vTrap-expressing cells or tissues are illuminated with blue light, the vesicles form clusters and become trapped within the cells, inhibiting the release of transmitters.

Most importantly, the inhibition triggered using this new technique is temporary, which is very important for neuroscience research. Other previous techniques that target vesicle fusion proteins damage them permanently and disable the target neuron for up to 24 hours, which is not appropriate for many behavioral experiments with short time constraints. By comparison, vesicles that were inactivated using Opto-vTrap decluster in about 15 minutes, and the neurons regain their full functions within an hour.

Opto-vTrap directly controls the signal transmitters' release, enabling the researchers to freely control brain activity. The research team verified the usability of Opto-vTrap in cultured cells and brain tissue slices. Furthermore, they tested the technique in live mice, which enabled them to temporarily remove fear memory from fear-conditioned animals.

In the future, Opto-vTrap will be used to uncover complex interactions between multiple parts of the brain. It will be a highly useful tool for studying how certain brain cell types affect brain function in different circumstances.

Professor Heo stated, "Since Opto-vTrap can be used in various cell types, it is expected to be helpful in various fields of brain science research." He explained, "We plan to conduct a study to figure out the spatiotemporal brain functions of various brain cell types in a specific environment using Opto-vTrap technology."

Read more at Science Daily

Nov 29, 2021

Orbital harmony limits late arrival of water on TRAPPIST-1 planets

Seven Earth-sized planets orbit the star TRAPPIST-1 in near-perfect harmony, and U.S. and European researchers have used that harmony to determine how much physical abuse the planets could have withstood in their infancy.

"After rocky planets form, things bash into them," said astrophysicist Sean Raymond of the University of Bordeaux in France. "It's called bombardment, or late accretion, and we care about it, in part, because these impacts can be an important source of water and volatile elements that foster life."

In a study available online today in Nature Astronomy, Raymond and colleagues from Rice University's NASA-funded CLEVER Planets project and seven other institutions used a computer model of the bombardment phase of planetary formation in TRAPPIST-1 to explore the impacts its planets could have withstood without getting knocked out of harmony.

Deciphering the impact history of planets is difficult in our solar system and might seem like a hopeless task in systems light-years away, Raymond said.

"On Earth, we can measure certain types of elements and compare them with meteorites," Raymond said. "That's what we do to try to figure out how much stuff bashed into the Earth after it was mostly formed."

But those tools don't exist for studying bombardment on exoplanets.

"We'll never get rocks from them," he said. "We're never going to see craters on them. So what can we do? This is where the special orbital configuration of TRAPPIST-1 comes in. It's a kind of a lever we can pull on to put limits on this."

TRAPPIST-1, about 40 light-years away, is far smaller and cooler than our sun. Its planets are named alphabetically from b to h in order of their distance from the star. The time needed to complete one orbit around the star -- equivalent to one year on Earth -- is 1.5 days on planet b and 19 days on planet h. Remarkably, their orbital periods form near-perfect ratios, a resonant arrangement reminiscent of harmonious musical notes. For example, for every eight "years" on planet b, five pass on planet c, three on planet d, two on planet e and so on.
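
The near-perfect ratios can be checked directly from the planets' orbital periods. The sketch below uses approximate published period values -- only the roughly 1.5-day and 19-day figures for planets b and h appear in this article, so the remaining values should be read as assumed, illustrative inputs -- to show that adjacent planets' period ratios sit close to simple fractions.

```python
# Approximate orbital periods (days) for TRAPPIST-1 b through h, rounded
# from published values; only planet b (~1.5 d) and planet h (~19 d) are
# quoted in the article itself, so treat the rest as assumed inputs.
periods = {
    "b": 1.51, "c": 2.42, "d": 4.05, "e": 6.10,
    "f": 9.21, "g": 12.35, "h": 18.77,
}

names = list(periods)
for inner, outer in zip(names, names[1:]):
    ratio = periods[outer] / periods[inner]
    print(f"{outer}/{inner}: period ratio ~ {ratio:.2f}")

# Adjacent ratios cluster near simple fractions (8/5, 5/3, 3/2, 4/3, ...),
# the "harmonious" resonant chain described in the article.
```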

"We can't say exactly how much stuff bashed into any of these planets, but because of this special resonant configuration, we can put an upper limit on it," Raymond said. "We can say, 'It can't have been more than this.' And it turns out that that upper limit is actually fairly small.

"We figured out that after these planets formed, they weren't bombarded by more than a very small amount of stuff," he said. "That's kind of cool. It's interesting information when we're thinking about other aspects of the planets in the system."

Planets grow within protoplanetary disks of gas and dust around newly formed stars. These disks only last a few million years, and Raymond said previous research has shown that resonant chains of planets like TRAPPIST-1's form when young planets migrate closer to their star before the disk disappears. Computer models have shown disks can shepherd planets into resonance. Raymond said it's believed that resonant chains like TRAPPIST-1's must be set before their disks disappear.

The upshot is TRAPPIST-1's planets formed fast, in about one-tenth the time it took Earth to form, said Rice study co-author Andre Izidoro, an astrophysicist and CLEVER Planets postdoctoral fellow.

CLEVER Planets, led by study co-author Rajdeep Dasgupta, the Maurice Ewing Professor of Earth Systems Science at Rice, is exploring the ways planets might acquire the necessary elements to support life. In previous studies, Dasgupta and colleagues at CLEVER Planets have shown a significant portion of Earth's volatile elements came from the impact that formed the moon.

"If a planet forms early and it is too small, like the mass of the moon or Mars, it cannot accrete a lot of gas from the disk," Dasgupta said. "Such a planet also has much less opportunity to gain life-essential volatile elements through late bombardments."

Izidoro said that would have been the case for Earth, which gained most of its mass relatively late, including about 1% from impacts after the moon-forming collision.

"We know Earth had at least one giant impact after the gas (in the protoplanetary disk) was gone," he said. "That was the moon-forming event.

"For the TRAPPIST-1 system, we have these Earth-mass planets that formed early," he said. "So one potential difference, compared to the Earth's formation, is that they could have, from the beginning, some hydrogen atmosphere and have never experienced a late giant impact. And this might change a lot of the evolution in terms of the interior of the planet, outgassing, volatile loss and other things that have implications for habitability."

Raymond said this week's study has implications not only for the study of other resonant planetary systems, but for far more common exoplanet systems that were believed to have begun as resonant systems.

"Super-Earths and sub-Neptunes are very abundant around other stars, and the predominant idea is that they migrated inward during that gas-disk phase and then possibly had a late phase of collisions," Raymond said. "But during that early phase, where they were migrating inward, we think that they pretty much -- universally maybe -- had a phase where they were resonant chain structures like TRAPPIST-1. They just didn't survive. They ended up going unstable later on."

Izidoro said one of the study's major contributions could come years from now, after NASA's James Webb Space Telescope, the European Southern Observatory's Extremely Large Telescope and other instruments allow astronomers to directly observe exoplanet atmospheres.

"We have some constraints today on the composition of these planets, like how much water they can have," Izidoro said of planets that form in a resonant, migration phase. "But we have very big error bars."

In the future, observations will better constrain the interior composition of exoplanets, and knowing the late bombardment history of resonant planets could be extremely useful.

"For instance, if one of these planets has a lot of water, let's say 20% mass fraction, the water must have been incorporated into the planets early, during the gaseous phase," he said. "So you will have to understand what kind of process could bring this water to this planet."

Read more at Science Daily

Researchers identify behavioral adaptations that may help Antarctic fishes adapt to warming Southern Ocean

At first glance, Antarctica seems inhospitable. Known for howling gales and extremely cold temperatures, the continent is blanketed with a mile-thick ice sheet. Occasional elephant seals and seabirds fleck the glacial shorelines.

Yet dipping below the waves, the Southern Ocean teems with biodiversity: vibrant swaths of sea ice algae and cyanobacteria, swarming krill and crustaceans, bristling kelp forests, gigantic polar sea spiders and sponges, whale pods, and abundant Antarctic fish fauna.

These fishes play a vital role in the Southern Ocean's food web of 9,000 known marine species, yet their subzero haven may be at risk. A 2021 climate analysis posited that by 2050 some areas of the Antarctic continental shelf will be at least 1 degree Celsius warmer.

Researchers from Virginia Tech's Fralin Biomedical Research Institute at VTC have published a new study in PLOS ONE describing how two species of Antarctic fish -- one with hemoglobin in its blood cells and one without -- respond to acute thermal stress.

The research team, directed by Virginia Tech Vice President for Health Sciences and Technology Michael Friedlander, observed that both species responded to progressive warming with an elaborate array of behavioral maneuvers, including fanning and splaying their fins, breathing at the surface, startle-like behavior, and transient bouts of alternating movement and rest.

"Remarkably, our team found that Antarctic fishes compensate for increasing metabolic demands by enhancing respiration through species-specific locomotor and respiratory responses, demonstrating resilience to environmental change and possibly to global warming," said Friedlander, who is also the Fralin Biomedical Research Institute's executive director, senior dean for research at the Virginia Tech Carilion School of Medicine, and a professor in the College of Science's Department of Biological Sciences. "Ambient warming presents a multi-faceted challenge to the fish, including increased temperature of the central nervous system and target tissues such as skeletal and cardiac muscles, but also reduced availability of dissolved oxygen in the water that passes through the gills during respiration. While these findings suggest that Antarctic fishes may be able to behaviorally adapt somewhat under extreme conditions, little is known about the effects of environmental warming on their predation habits, food availability, and fecundity."

Iskander Ismailov, the study's first author and a research assistant professor in Friedlander's laboratory during the study, said, "The behavioral manifestations that we've described show that these fishes have powerful physiological capacities to survive environmental changes."

Through millions of years of isolation from the rest of the world -- corralled by the Antarctic Circumpolar Current -- Southern Ocean fish species have become well adapted to their frosty ecosystem.

Blackfin icefish, Chaenocephalus aceratus, one of the two species studied by the team, have unique opalescent blood. These fish are among the few known vertebrates lacking hemoglobin, a molecule in red blood cells that efficiently carries oxygen from the lungs of land-dwelling vertebrates, or from the gills of aquatic vertebrates, throughout tissues in the body. Instead, blackfin icefish transport oxygen dissolved in blood plasma, harboring roughly 10% of the oxygen carrying capacity of hemoglobin.

Oxygen is more soluble in cold water, allowing white-blooded icefish to thrive in the Southern Ocean. As water temperature rises, however, these species experience increased metabolic demand, potentially making white-blooded fish more vulnerable to global warming. To test this hypothesis, the team examined five specimens of white-blooded blackfin icefish and five red-blooded black rockcod, Notothenia coriiceps, in a climate-controlled shoreline laboratory that circulated, and progressively warmed, saltwater straight from the Southern Ocean.

The fishes acclimated to the lab conditions before being transferred to the experimental tank, where the water temperature rose from -1.8 degrees Celsius to 13 degrees at a rate of 3 degrees per hour. The researchers captured extensive video recordings, allowing them to examine and quantify the fishes' motility, breathing rate, maneuvers in the tank, and fin movements.

As the water temperature rose, the white-blooded icefish displayed intensive pectoral fin fanning -- a behavior previously observed in icefish during egg guarding -- that the researchers suggest may help facilitate respiration. By contrast, the red-blooded fish employed complex maneuvers, including pectoral fin fanning and splaying, followed by startle-like C-turns, which may augment gill ventilation, according to Ismailov.

"The findings provide a new perspective on the effects of rising temperature on these highly cold-adapted species," said George Somero, professor emeritus of marine biology at Stanford University and a leader in studying how marine life adapts to thermal stress, who was not involved in the research.

Preparation for the expedition began in early 2014. The research team designed, custom-built, and shipped laboratory equipment to Palmer Station in Antarctica before living there for three months in 2015. The journey included a flight to Punta Arenas, Chile, then crossing the Drake Passage by boat during the austral fall.

Ismailov was the first to arrive, setting up experimental rigs. Six weeks later, he was joined by Jordan Scharping, then a second-year Virginia Tech Carilion School of Medicine student conducting research in Friedlander's lab. The pair worked in overlapping 12-hour shifts running experiments in the laboratory at near-freezing temperatures.

"Dr. Friedlander drew me to this project. I remember him presenting the Antarctic project proposal to us medical students and everyone just lighting up about it. It was an incredible opportunity and I appreciate him giving it to me," said Scharping, who is now a physician at Northwestern Memorial Hospital.

Researchers were responsible for collecting their own fish specimens during a series of four week-long fishing trips. At sea, with the help of the research vessel crew, the researchers worked around the clock -- sometimes in harsh conditions.

"One stormy night while we were fishing, a two-story wave overtook the stern, drenching me from head to toe in ice-cold seawater -- the captain of the boat stopped the fishing after that," Ismailov recalled. "As a graduate of medical school, I never could have imagined that my career would lead me to Antarctica to study fish, but this research project has become one of the most extraordinary and memorable in my life."

The field work was funded by a National Science Foundation Grant awarded to Elizabeth Crockett, professor emerita at Ohio University, and Kristin O'Brien, professor at the University of Alaska Fairbanks. Crockett and O'Brien -- both former graduate students of Bruce Sidell, who was trained by C. Ladd Prosser -- invited Friedlander to join the expedition along with collaborators from the University of British Columbia, the University of Leeds, and Valdosta State University.

But the underpinnings of this recent study started 45 years ago. Friedlander, then a graduate student at the University of Illinois at Urbana-Champaign under the mentorship of Prosser -- a pioneer in the field of comparative animal physiology and thermal biology -- conducted research to advance experimental approaches for evaluating how temperature change affects molecular, cellular, and behavioral processes in an entire organism. Their landmark 1977 study of the common goldfish, published in the Journal of Comparative Physiology, was lauded by Somero in a 2015 review in the Journal of Experimental Biology.

"I find it gratifying that the pathbreaking studies of temperature effects on goldfish behavior carried out by Dr. Friedlander several decades ago have evolved into this fascinating new work on fishes of the Southern Ocean," Somero said.

While the research team observed that stenothermal Antarctic fishes show remarkable capacity to withstand acute thermal stress, Ismailov warns that these vulnerable species still need protection.

"There's a history of severe overexploitation in the Southern Ocean in the '70s and '80s due to unregulated commercial fishing. These activities had depleted the populations of some fish species so badly that the prospects of their recovery are still unclear," Ismailov said.

Friedlander expounds on this, noting that all species play important roles in a fragile ecosystem.

"If left unregulated, anthropogenic activities could produce irreversible damage, impacting not just icefish, but many other species in the Antarctic food webs as well," Friedlander said. "By doing these types of proof-of-principle experiments now to begin to understand the physiological repertoire available to species at risk, we can begin to make more informed predictions about what sort of perturbations climate change may trigger within complex ecosystems, and what type of reserve and adaptive capacity individual species may deploy."

Read more at Science Daily

How can our brain still perceive familiar objects even when they become indistinct?

Researchers have explored the neuronal mechanism that allows the brain to perceive familiar images even when they are indistinct. They found that in rats performing a visual orientation discrimination task, the number of neurons responding to low-contrast rather than high-contrast visual stimuli increased after repeated experience. These neurons showed stronger activity in correct-choice than in incorrect-choice trials and represented low-contrast stimuli efficiently. Thus, the low-contrast preference in V1 activity may contribute to improved low-contrast visual perception.

The appearance of objects can often change. For example, in dim evenings or fog, the contrast of the objects decreases, making it difficult to distinguish them. However, after repeatedly encountering specific objects, the brain can identify them even if they become indistinct. The exact mechanism contributing to the perception of low-contrast familiar objects remains unknown.

In the primary visual cortex (V1), the area of the cerebral cortex dedicated to processing basic visual information, the visual responses have been considered to reflect directly the strength of external inputs. Thus, high-contrast visual stimuli elicit strong responses and vice versa.

In this study, Rie Kimura and Yumiko Yoshimura found that in rats, the number of V1 neurons preferentially responding to low-contrast stimuli increases after repeated experiences. In these neurons, low-contrast visual stimuli elicit stronger responses, and high-contrast stimuli elicit weaker responses. These low contrast-preferring neurons show more pronounced activity when rats correctly perceive a low-contrast familiar object. The study, published in Science Advances, is the first to report that low-contrast preference in V1 is strengthened in an experience-dependent manner to better represent low-contrast visual information. This mechanism may contribute to the perception of familiar objects, even when they are indistinct.

"This flexible information representation may enable a consistent perception of familiar objects with any contrast," Kimura says. "The flexibility of our brain makes our sensation effective, although you may not be aware of it. An artificial neural network model may reproduce the human sensation by incorporating not only high contrast-preferring neurons, generally considered until now, but also low contrast-preferring neurons, the main focus of this research."

From Science Daily