Jul 1, 2023

Earliest strands of the cosmic web

Galaxies are not scattered randomly across the universe. They gather together not only into clusters, but into vast interconnected filamentary structures with gigantic barren voids in between. This "cosmic web" started out tenuous and became more distinct over time as gravity drew matter together.

Astronomers using NASA's James Webb Space Telescope have discovered a thread-like arrangement of 10 galaxies that existed just 830 million years after the big bang. The 3 million light-year-long structure is anchored by a luminous quasar -- a galaxy with an active, supermassive black hole at its core. The team believes the filament will eventually evolve into a massive cluster of galaxies, much like the well-known Coma Cluster in the nearby universe.

"I was surprised by how long and how narrow this filament is," said team member Xiaohui Fan of the University of Arizona in Tucson. "I expected to find something, but I didn't expect such a long, distinctly thin structure."

"This is one of the earliest filamentary structures that people have ever found associated with a distant quasar," added Feige Wang of the University of Arizona in Tucson, the principal investigator of this program.

This discovery is from the ASPIRE project (A SPectroscopic survey of biased halos In the Reionization Era), whose main goal is to study the cosmic environments of the earliest black holes. In total, the program will observe 25 quasars that existed within the first billion years after the big bang, a time known as the Epoch of Reionization.

"The last two decades of cosmology research have given us a robust understanding of how the cosmic web forms and evolves. ASPIRE aims to understand how to incorporate the emergence of the earliest massive black holes into our current story of the formation of cosmic structure," explained team member Joseph Hennawi of the University of California, Santa Barbara.

Growing Monsters


Another part of the study investigates the properties of eight quasars in the young universe. The team confirmed that their central black holes, which existed less than a billion years after the big bang, range in mass from 600 million to 2 billion times the mass of our Sun. Astronomers continue seeking evidence to explain how these black holes could grow so large so fast.

"To form these supermassive black holes in such a short time, two criteria must be satisfied. First, you need to start growing from a massive 'seed' black hole. Second, even if this seed starts with a mass equivalent to a thousand Suns, it still needs to accrete a million times more matter at the maximum possible rate for its entire lifetime," explained Wang.

"These unprecedented observations are providing important clues about how black holes are assembled. We have learned that these black holes are situated in massive young galaxies that provide the reservoir of fuel for their growth," said Jinyi Yang of the University of Arizona, who is leading the study of black holes with ASPIRE.

Webb also provided the best evidence yet of how early supermassive black holes potentially regulate the formation of stars in their galaxies. While supermassive black holes accrete matter, they also can power tremendous outflows of material. These winds can extend far beyond the black hole itself, on a galactic scale, and can have a significant impact on the formation of stars.

"Strong winds from black holes can suppress the formation of stars in the host galaxy. Such winds have been observed in the nearby universe but have never been directly observed in the Epoch of Reionization," said Yang. "The scale of the wind is related to the structure of the quasar. In the Webb observations, we are seeing that such winds existed in the early universe."

Read more at Science Daily

Light or moderate alcohol consumption does not guard against diabetes, obesity

People who have just one or two drinks per day are not protected against endocrine conditions such as obesity and type 2 diabetes, according to a new study published in the Endocrine Society's Journal of Clinical Endocrinology & Metabolism.

Alcohol consumption is a significant public health concern because it is related to many medical conditions such as diabetes, obesity, liver conditions and heart disease. While it is widely accepted that excessive alcohol consumption causes a wide range of health issues, whether modest alcohol consumption has beneficial health effects remains controversial.

"Some research has indicated that moderate drinkers may be less likely to develop obesity or diabetes compared to non-drinkers and heavy drinkers. However, our study shows that even light-to-moderate alcohol consumption (no more than one standard drink per day) does not protect against obesity and type 2 diabetes in the general population," said Tianyuan Lu, Ph.D., from McGill University in Québec, Canada. "We confirmed that heavy drinking could lead to increased measures of obesity (body mass index, waist-to-hip ratio, fat mass, etc.) as well as increased risk of type 2 diabetes."

The researchers assessed self-reported alcohol intake data from 408,540 participants in the U.K. Biobank and found people who had more than 14 drinks per week had higher fat mass and a higher risk of obesity and type 2 diabetes.

These associations were stronger in women than in men. The data did not support an association between moderate drinking (seven or fewer alcoholic beverages per week) and improved health outcomes.

"We hope our research helps people understand the risks associated with drinking alcohol and that it informs future public health guidelines and recommendations related to alcohol use," Lu said. "We want our work to encourage the general population to choose alternative healthier behaviors over drinking."

Read more at Science Daily

Lessons learned from first genetically-modified pig heart transplant into a human patient

A new study published today in The Lancet has revealed the most extensive analysis to date on what led to the eventual heart failure in the world's first successful transplant of a genetically-modified pig heart into a human patient. This groundbreaking procedure was conducted by University of Maryland School of Medicine (UMSOM) physician-scientists back in January 2022 and marked an important milestone for medical science.

The patient, 57-year-old David Bennett, Sr., was treated at the University of Maryland Medical Center. He experienced strong cardiac function with no obvious signs of acute rejection for nearly seven weeks after the surgery. A sudden onset of heart failure led to his death two months after the transplant. Since then, the transplant team has been conducting extensive studies into the physiologic processes that led to the heart failure to identify factors that can be prevented in future transplants to improve the odds of longer-term success.

"Our paper provides crucial insight into how a multitude of factors likely played a role in the functional decline of the transplanted heart," said study lead author Muhammad M. Mohiuddin, MD, Professor of Surgery and Scientific/Program Director of the Cardiac Xenotransplantation Program at UMSOM. "Our goal is to continue moving this field forward as we prepare for clinical trials of xenotransplants involving pig organs."

Mr. Bennett, who was in end-stage heart failure and nearing the end of his life, did not qualify for a traditional heart transplant. The procedure was authorized by the U.S. Food and Drug Administration under its expanded access (compassionate use) provision.

"We were determined to shed light on what led to the heart transplant dysfunction in Mr. Bennett, who performed a heroic act by volunteering to be the first in the world," said study co-author Bartley Griffith, MD, Professor of Surgery and The Thomas E. and Alice Marie Hales Distinguished Professor in Transplantation at UMSOM. "We want our next patient to not only survive longer with a xenotransplant but to return to normal life and thrive for months or even years."

To better understand the processes that led to dysfunction of the pig heart transplant, the research team performed extensive testing on the limited tissues available from the patient. They carefully mapped out the sequence of events that led to the heart failure, demonstrating that the heart functioned well on imaging tests like echocardiography until day 47 after surgery.

The new study confirms that no signs of acute rejection occurred during the first several weeks after the transplant. Likely, several overlapping factors led to heart failure in Mr. Bennett, including his poor state of health prior to the transplant that led him to become severely immunocompromised. This limited the use of an effective anti-rejection regimen used in preclinical studies for xenotransplantation. As a result, the researchers found, the patient was likely more vulnerable to rejection of the organ from antibodies made by the immune system. The researchers found indirect evidence of antibody-mediated rejection based on histology, immunohistochemical staining and single cell RNA analysis.

The use of an intravenous immunoglobulin, IVIG, a drug that contains antibodies, may also have contributed to damage to the heart muscle cells. It was given to the patient twice during the second month after the transplant to help prevent infection, likely also triggering an anti-pig immune response. The research team found evidence of immunoglobulin antibodies targeting the pig vascular endothelium layer of the heart.

Lastly, the new study investigated the presence of a latent virus, called porcine cytomegalovirus (PCMV), in the pig heart, which may have contributed to the dysfunction of the transplant. Activation of the virus may have occurred after the patient's anti-viral treatment regimen was reduced to address other health issues. This may have initiated an inflammatory response causing cell damage. However, there is no evidence that the virus infected the patient or spread to organs beyond the heart. Improved PCMV testing protocols have been developed for sensitive detection and exclusion of latent viruses for future xenotransplants.

Read more at Science Daily

Jun 30, 2023

Starlight and the first black holes: researchers detect the host galaxies of quasars in the early universe

New images from the James Webb Space Telescope have revealed, for the first time, starlight from two massive galaxies hosting actively growing black holes -- quasars -- seen less than a billion years after the Big Bang. A new study in Nature this week finds the black holes have masses close to a billion times that of the Sun, and the host galaxy masses are almost one hundred times larger, a ratio similar to what is found in the more recent universe. A powerful combination of the Subaru Telescope and the JWST has paved a new path to study the distant universe.

The existence of such massive black holes in the distant universe has created more questions than answers for astrophysicists. How could these black holes grow to be so large when the universe was so young? Even more puzzling, observations in the local universe show a clear relation between the mass of supermassive black holes and the much larger galaxies in which they reside. The galaxies and the black holes have completely different sizes, so which came first: the black holes or the galaxies? This is a "chicken-or-egg" problem on a cosmic scale.

An international team of researchers, led by Kavli Institute for the Physics and Mathematics of the Universe (Kavli IPMU) Project Researcher Xuheng Ding and Professor John Silverman, and Peking University Kavli Institute for Astronomy and Astrophysics (PKU-KIAA) Kavli Astrophysics Fellow Masafusa Onoue have started to answer this question with the James Webb Space Telescope (JWST), launched in December 2021. Studying the relation between host galaxies and black holes in the early universe allows scientists to watch their formation, and see how they are related to one another.

Quasars are luminous, while their host galaxies are faint, which has made it challenging for researchers to detect the dim light of the galaxy in the glare of the quasar, especially at great distances. Before the JWST, the Hubble Space Telescope was able to detect host galaxies of luminous quasars when the universe was just under 3 billion years old, but no younger.

The superb sensitivity and the ultra-sharp images of the JWST at infrared wavelengths finally allowed researchers to push these studies to the time when the quasars and galaxies first formed. Just a few months after JWST started regular operations, the team observed two quasars, HSC J2236+0032 and HSC J2255+0251, at redshifts 6.40 and 6.34, when the universe was approximately 860 million years old. These two quasars were discovered in a deep survey program of the 8.2-meter Subaru Telescope on the summit of Maunakea in Hawai'i. The relatively low luminosities of these quasars made them prime targets for measurement of the host galaxy properties, and the successful detection of the hosts represents the earliest epoch to date at which starlight from a quasar host galaxy has been detected.

The images of the two quasars were taken at infrared wavelengths of 3.56 and 1.50 microns with JWST's NIRCam instrument, and the host galaxies became apparent after carefully modeling and subtracting glare from the accreting black holes. The stellar signature of the host galaxy was also seen in a spectrum taken by JWST's NIRSpec for J2236+0032, further supporting the detection of the host galaxy.

Analyses of the host galaxy photometry found that these two quasar host galaxies are massive, measuring 130 and 34 billion times the mass of the Sun, respectively. Measuring the speed of the turbulent gas in the vicinity of the quasars from the NIRSpec spectra suggests that the black holes that power them are also massive, measuring 1.4 and 0.2 billion times the mass of the Sun. The ratio of the black hole mass to host galaxy mass is similar to that of galaxies in the more recent past, suggesting that the relationship between black holes and their hosts was already in place 860 million years after the Big Bang.
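
Taking the quoted masses at face value (and pairing them in the order given, which is an assumption here), the implied black-hole-to-host mass ratios are roughly:

    1.4 x 10^9 M_sun / 130 x 10^9 M_sun ≈ 0.011 (about 1.1%)
    0.2 x 10^9 M_sun /  34 x 10^9 M_sun ≈ 0.006 (about 0.6%)

Both are of order one percent; the article's point is that ratios of this order are comparable to those measured for galaxies in the more recent universe.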

Read more at Science Daily

Significant decline of snow cover in the Northern hemisphere over the last half century

In the face of the ongoing climate crisis, scientists from many fields are directing their expertise at understanding how different climate systems have changed and will continue to do so as climate change progresses. Robert Lund, professor and department chair of statistics at the UC Santa Cruz Baskin School of Engineering, collaborated on a new study that uses rigorous mathematical models and statistical methods and finds declining snow cover in many parts of the northern hemisphere over the last half century.

Understanding snow cover trends is important because of the role that snow plays in the global energy balance. Snow's high albedo -- the ability to reflect light -- and insulating characteristics affect surface temperatures on a regional scale and thermal stability on a continent-wide scale.

In the new study published in the Journal of Hydrometeorology, researchers analyzed snow cover data gathered from weekly satellite flyovers between 1967 (when satellites became more common) and 2021; the data were divided into grid sections for analysis. Of the grids that the researchers determined had reliable data, they found that snow cover is declining in nearly twice as many grids as it is advancing.

"In the Arctic regions, snow is going away more often than not -- I think climatologists sort of suspected this," Lund said. "But it's also going away at the southern boundaries of the continents."

In a study that took about four years to complete, the researchers show that snow presence in the Arctic and southern latitudes of the Northern hemisphere is generally decreasing, while some areas such as Eastern Canada are seeing an increase in snow cover. This could be due to increasing temperatures in areas that are typically very cold but still below freezing, allowing the atmosphere to hold more water, which then falls as snow.

Lund believes this is the first truly dependable analysis of snow cover trends in the Northern hemisphere due to the rigor of the researchers' statistical methods. It is often challenging for non-statisticians to extract trends from this type of satellite data, which comes as a sequence of 0s and 1s indicating whether snow was present during a given week. The researchers also had to take correlation into account when looking at trends, as the presence of snow cover one week greatly affects the likelihood of snow cover the following week. These two factors were taken into account with a Markov chain-based model, from which accurate uncertainty estimates of the trends could be computed. The researchers found hundreds of grids where snow cover was declining with at least 97.5% certainty.
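
The article does not give the model details, but a minimal illustrative sketch of the idea might look like this in Python: a two-state Markov chain for weekly snow presence whose transition probabilities drift over the years, fitted by maximum likelihood so the trend comes with an uncertainty estimate. All numbers and variable names below are hypothetical, simulated data rather than the authors' code.

    # Illustrative sketch (not the authors' code): a two-state Markov chain for weekly
    # snow presence with transition probabilities that drift linearly over the years.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)

    # --- Simulate 55 years x 52 weeks of 0/1 snow-cover data with a declining trend ---
    n_years, n_weeks = 55, 52
    years = np.repeat(np.arange(n_years), n_weeks)
    snow = np.zeros(n_years * n_weeks, dtype=int)
    for t in range(1, snow.size):
        # Persistence: snow this week depends on last week (Markov property),
        # with the "stay snowy" probability slowly declining over the decades.
        p_stay = 0.85 - 0.003 * years[t]
        p_start = 0.30 - 0.002 * years[t]
        p = p_stay if snow[t - 1] == 1 else p_start
        snow[t] = rng.random() < p

    # --- Fit P(snow_t = 1 | snow_{t-1}, year) with a logistic model by max likelihood ---
    y, y_prev, yr = snow[1:], snow[:-1], years[1:]
    X = np.column_stack([np.ones_like(yr), y_prev, yr])  # intercept, persistence, trend

    def neg_log_lik(beta):
        logits = X @ beta
        return np.sum(np.logaddexp(0.0, logits) - y * logits)

    res = minimize(neg_log_lik, x0=np.zeros(3), method="BFGS")
    beta = res.x
    # Rough standard errors from the inverse Hessian approximation returned by BFGS.
    se = np.sqrt(np.diag(res.hess_inv))
    print(f"trend coefficient (per year): {beta[2]:.4f} +/- {1.96 * se[2]:.4f}")
    # A clearly negative confidence interval would flag this grid cell as "snow cover
    # declining with high certainty," loosely analogous to the study's 97.5% criterion.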

However, they also found that some of the satellite data gathered in mountainous regions was unreliable, showing no snow in the winter and several weeks of snow in the summer. This was likely due to a flaw in the algorithm that processed the satellite data to determine if snow was present or not.

"The reason this study took a lot of work is because the satellite data is so doggone poor," Lund said. "Whatever the meteorologists did to estimate snow from the pictures in some of the mountainous regions just didn't work, so we had to take all the grids in the Northern hemisphere, and figure out whether the data was even trustworthy or not."

By determining which satellite data is unreliable, this study can serve as a resource for others in the scientific community who may want to use this snow cover data in their own research.

Read more at Science Daily

New study sheds light on the evolution of animals

A study led by the University of Oxford has brought us one step closer to solving a mystery that has puzzled naturalists since Charles Darwin: when did animals first appear in the history of Earth? The results have been published today in the journal Trends in Ecology & Evolution.

Animals first occur in the fossil record around 574 million years ago. Their arrival appears as a sudden 'explosion' in rocks from the Cambrian period (539 million years ago to 485 million years ago) and seems to counter the typically gradual pace of evolutionary change. Many scientists (including Darwin himself) believe that the first animals actually evolved long before the Cambrian period, but they cannot explain why they are missing from the fossil record.

The 'molecular clock' method, for instance, suggests that animals first evolved 800 million years ago, during the early part of the Neoproterozoic era (1,000 million years ago to 539 million years ago). This approach uses the rates at which genes accumulate mutations to determine the point in time when two or more living species last shared a common ancestor. But although rocks from the early Neoproterozoic contain fossil microorganisms, such as bacteria and protists, no animal fossils have been found.
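
As a schematic illustration of the molecular clock idea (a textbook simplification with invented numbers, not the calculation used in the study):

    T ≈ d / (2r)

    where d is the genetic distance between two living species (substitutions per site)
    and r is the substitution rate per site per year; the factor of 2 accounts for
    mutations accumulating independently along both lineages. For example, d = 0.16 and
    r = 1 x 10^-10 substitutions per site per year give T ≈ 0.16 / (2 x 10^-10) = 8 x 10^8
    years, i.e. a last common ancestor roughly 800 million years ago.

Real analyses calibrate the rate against dated fossils and allow it to vary among lineages.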

This posed a dilemma for palaeontologists: does the molecular clock method overestimate the point at which animals first evolved? Or were animals present during the early Neoproterozoic, but too soft and fragile to be preserved?

To investigate this, a team of researchers led by Dr Ross Anderson from the University of Oxford's Department of Earth Sciences has carried out the most thorough assessment to date of the preservation conditions that would be expected to capture the earliest animal fossils.

Lead author Dr Ross Anderson said: 'The first animals presumably lacked mineral-based shells or skeletons, and would have required exceptional conditions to be fossilised. But certain Cambrian mudstone deposits demonstrate exceptional preservation, even of soft and fragile animal tissues. We reasoned that if these conditions, known as Burgess Shale-Type (BST) preservation, also occurred in Neoproterozoic rocks, then a lack of fossils would suggest a real absence of animals at that time.'

To investigate this, the research team used a range of analytical techniques on samples of Cambrian mudstone deposits from almost 20 sites, to compare those hosting BST fossils with those preserving only mineral-based remains (such as trilobites). These methods included energy dispersive X-ray spectroscopy and X-ray diffraction carried out at the University of Oxford's Departments of Earth Sciences and Materials, as well as infrared spectroscopy carried out at Diamond Light Source, the UK's national synchrotron.

The analysis found that fossils with exceptional BST-type preservation were particularly enriched in an antibacterial clay called berthierine. Samples with a composition of at least 20% berthierine yielded BST fossils in around 90% of cases.

Microscale mineral mapping of BST fossils revealed that another antibacterial clay, called kaolinite, appeared to directly bind to decaying tissues at an early stage, forming a protective halo during fossilisation.

'The presence of these clays was the main predictor of whether rocks would harbour BST fossils' added Dr Anderson. 'This suggests that the clay particles act as an antibacterial barrier that prevents bacteria and other microorganisms from breaking down organic materials.'

The researchers then applied these techniques to analyse samples from numerous fossil-rich Neoproterozoic mudstone deposits. The analysis revealed that most did not have the compositions necessary for BST preservation. However, three deposits in Nunavut (Canada), Siberia (Russia), and Svalbard (Norway) had almost identical compositions to BST-rocks from the Cambrian period. Nevertheless, none of the samples from these three deposits contained animal fossils, even though conditions were likely favourable for their preservation.

Dr Anderson added: 'Similarities in the distribution of clays with fossils in these rare early Neoproterozoic samples and with exceptional Cambrian deposits suggest that, in both cases, clays were attached to decaying tissues, and that conditions conducive to BST preservation were available in both time periods. This provides the first "evidence for absence" and supports the view that animals had not evolved by the early Neoproterozoic era, contrary to some molecular clock estimates.'

According to the researchers, the study suggests a possible maximum age to the origin of animals of around 789 million years: the youngest estimated age of the Svalbard formation. The group now intend to search for progressively younger Neoproterozoic deposits with conditions for BST preservation. This will confirm the age of rocks in which animals are missing from the fossil record because they really were absent, rather than because conditions did not enable them to be fossilised. They also intend to perform laboratory experiments to investigate the mechanisms that underpin clay-organic interactions in BST preservation.

Dr Anderson added: 'Mapping the compositions of these rocks at the microscale is allowing us to understand the nature of the exceptional fossil record in a way that we have never been able to do before. Ultimately, this could help determine how the fossil record may be biased towards preserving certain species and tissues, altering our perception of biodiversity across different geological eras.'

Read more at Science Daily

How the motion of DNA controls gene activity

Performing cutting-edge science requires thinking outside the box and bringing together different scientific disciplines. Sometimes it also means being in the right place at the right time. For David Brückner, postdoctoral researcher and NOMIS fellow at ISTA, all of these came together when he attended an on-campus lecture by Professor Thomas Gregor of Princeton University. Inspired by the talk, Brückner reached out with an idea: to physically interpret the data sets Gregor had presented. Now, the results of their collaboration are published in Science. They highlight the stochastic (random) motion of two specific gene elements on a chromosome that must come into contact in 3D space for the gene to become active.

How DNA fits into a cell nucleus

Living organisms like humans are built on genes that are stored in the DNA -- our molecular blueprint. DNA is a polymer, a huge molecule made up of smaller individual parts (monomers). It is located in every cell's nucleus. "Depending on the organism, the DNA polymer can be up to meters long, yet the size of the nucleus is on the order of microns," Brückner explains. To fit into the tiny nucleus, DNA gets compacted by being coiled as if on a spool and further compressed into the well-known shape of chromosomes, which we all encountered in a biology textbook.

"Despite being heavily condensed, chromosomes are not static; they are jiggling around all the time," the physicist continues. These dynamics are very important. Whenever a specific gene has to be activated, two regions on the polymer called "enhancer" and "promoter" need to come into close contact and bind to each other. Only when this happens, a cellular machinery reads off the gene's information and forms the RNA molecule, which eventually gives rise to proteins that are essential for all the processes a living organism requires.

Depending on the organism, the enhancer and promoter can be quite far from each other on the chromosome. "With previously used methods, you could get a static view of the distance between these elements, but not how the system evolves over time," Brückner explains. Intrigued by this missing information, the scientists set out to get a dynamic look at how these elements are organized and how they move in 3D space in real time.

Visualizing gene regions

To achieve this goal, the experimental scientists from Princeton established a method to track these two DNA elements over a certain time period in a fly embryo. Through genetic manipulation, the DNA elements were fluorescently labeled, with the enhancer region fluorescing green and the promoter blue. Using live imaging (time-lapse microscopy of living cells), the scientists were able to visualize the fluorescent spots in fly embryos and see how they moved around to find each other.

Once the two spots came into proximity, the gene was activated and an additional red light turned on as the RNA was also tagged with red fluorophores. Brückner excitedly adds, "We got a visual readout of when the enhancer and promoter got in contact. That gave us a lot of information about their trajectories."

DNA is densely packed and exhibits fast motion

The challenge then was how to analyze this huge data set of stochastic motion. Brückner's background in theoretical physics allowed him to extract statistics that capture the typical behavior of the system. He applied two different simplified physical models to cut through the data.

One was the Rouse model. It assumes that every monomer of the polymer is an elastic spring. It predicts a loose structure and fast diffusion -- a random movement, where occasionally the gene regions encounter each other. The other model is called the "fractal globule." It predicts a very compact structure and therefore slow diffusion. "Surprisingly, we found in the data that the system is described by a combination of these two models -- a highly dense structure you would expect based on the fractal globule model, and diffusion which is described by the statistics from the Rouse model," Brückner explains.
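
For orientation, the two models make quantitatively different predictions; the scaling relations below are standard polymer-physics results quoted for context rather than numbers taken from the paper:

    Rouse model:      loosely packed chain; a monomer's mean-squared displacement grows
                      subdiffusively, MSD(t) ∝ t^(1/2).
    Fractal globule:  compact, space-filling chain; a subchain of s monomers occupies a
                      region of size R(s) ∝ s^(1/3), implying slow motion of loci.

The observation reported above combines the two: fractal-globule-like packing (R ∝ s^(1/3)) together with Rouse-like dynamics (MSD ∝ t^(1/2)).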

Due to the combination of dense packing and fast motion, the binding of these two gene regions depends much less on their distance along the chromosome than previously anticipated. "If such a system is in a fluid and dynamic state all the time, long-distance communication is much better than we might have thought," Brückner adds.

Read more at Science Daily

Jun 29, 2023

ALMA digs deeper into the mystery of planet formation

An international research team used the Atacama Large Millimeter/submillimeter Array (ALMA) to observe disks around 19 protostars at very high resolution to search for the earliest signs of planet formation. This survey was motivated by recent findings that planet formation may be well underway in more-evolved proto-planetary disks, but until now there had been no systematic study searching for signs of planet formation in younger protostellar systems.

Planets form in a disk around a newborn star. These 'proto-planetary' disks only last a few million years, meaning that a forming planetary system only has this amount of time to finish its formation. However, it is still not clear just how rapidly planet formation begins within these disks. Recent ALMA observations have revealed that many proto-planetary disks have substructures such as gaps and rings, indicating that planets are already forming from the disk. "These previous results motivated us to examine even younger disks around protostars to answer the question, at what stage of star formation do planets form," says Nagayoshi Ohashi at Academia Sinica Institute of Astronomy and Astrophysics (ASIAA, Taiwan), who led the team.

The team observed disks around 19 protostars located within about 650 light-years from the Earth. This is the first systematic study to investigate the detailed structure of disks around a large sample of protostars with high angular resolution. The observations clearly show that the disks around protostars are different from more-evolved proto-planetary disks. Among the 19 protostars, rings and gaps, which are signs of planet formation, were observed in only a few disks. Moreover, the ring structures are less distinct than those seen in the proto-planetary disks.

Read more at Science Daily

Mountains vulnerable to extreme rain from climate change

As the world warms, extreme weather events grow -- and they also change. Researchers at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) found that climate change is shifting snowfall to rainfall on mountains across the Northern Hemisphere. Those surges of liquid water bring a distinct set of dangers, including floods, landslides, and soil erosion.

"One quarter of the global population lives in or downstream from mountainous regions," said Mohammed Ombadi, first author of the paper published today in Nature. "They are going to be directly affected by this risk."

Scientists already expect climate change to increase the volume of water falling during extreme events (which typically take place over a few hours to a day), but this study is the first time researchers have looked at whether that extreme precipitation comes as rain or snow. They found that the fraction of water falling as snow decreased in mountainous regions, falling instead as rain -- making mountains particularly susceptible to extreme rain hazards. They even put a number to it: For every 1 degree Celsius increase in the global temperature, researchers expect an average of 15% more rain at high elevations.

"This increase in rainfall extremes is not only something that is going to happen from now until the end of the 21st century -- we're already seeing it," Ombadi said. "That same rate was also evident in the data from 1950 to 2019. Rainfall extremes in mountains have already been increasing, and will continue to change with that 15% rate."

While all the mountain ranges in the Northern Hemisphere are seeing the shift from snow to rain, those at greatest risk of extreme rainfall events are the North American Pacific mountain ranges (the Cascades, Sierra Nevada, and coastal ranges from Canada to Southern California), the Himalayas, and high-latitude regions. Researchers are still working to understand why those areas are at higher risk than other mountain ranges such as the Rockies or the Alps.

"We think that North American Pacific mountain ranges are more susceptible to the risk of rainfall extremes than other mountain ranges because a significant portion of snowfall in this region typically occurs at temperatures just below zero degrees Celsius," Ombadi said. "The slightest change in air temperature will shift this snowfall to rainfall. This is unlike other mountain ranges where snowfall may occur at very low temperatures below zero degrees."

Ombadi hopes that fellow climate scientists will incorporate the distinction between snowfall and rainfall to improve global climate models, and that civil engineers and planners will use the data to better prepare for intense rain events.

"We need to factor these results into how we design and build the infrastructure in these mountainous regions, so that they can withstand the negative consequences of increases in rainfall extremes," Ombadi said.

Meanwhile, countries continue efforts to meet targets established by the Paris Agreement that would limit global warming to less than 2 degrees Celsius above pre-industrial levels.

"Our findings revealed a linear relationship between the level of warming and the increase in extreme rainfall: For instance, 1 degree of warming causes 15% more rain, while 3 degrees leads to a 45% increase in rainfall," Ombadi said. "There are many technologies in progress that could help us reduce greenhouse gas emissions and how much the planet warms. To me, this study shows the need to invest in those clean solutions, and also start preparing for the consequences of warming now."

Read more at Science Daily

Turning old maps into 3D digital models of lost neighborhoods

Imagine strapping on a virtual reality headset and "walking" through a long-gone neighborhood in your city -- seeing the streets and buildings as they appeared decades ago.

That's a very real possibility now that researchers have developed a method to create 3D digital models of historic neighborhoods using machine learning and historic Sanborn Fire Insurance maps.

But the digital models will be more than just a novelty -- they will give researchers a resource to conduct studies that would have been nearly impossible before, such as estimating the economic loss caused by the demolition of historic neighborhoods.

"The story here is we now have the ability to unlock the wealth of data that is embedded in these Sanborn fire atlases," said Harvey Miller, co-author of the study and professor of geography at The Ohio State University.

"It enables a whole new approach to urban historical research that we could never have imagined before machine learning. It is a game changer."

The study was published today (June 28, 2023) in the journal PLOS ONE.

This research begins with the Sanborn maps, which were created to allow fire insurance companies to assess their liability in about 12,000 cities and towns in the United States during the 19th and 20th centuries. In larger cities, they were often updated regularly, said Miller, who is director of Ohio State's Center for Urban and Regional Analysis (CURA).

The problem for researchers was that trying to manually collect usable data from these maps was tedious and time-consuming -- at least until the maps were digitized. Digital versions are now available from the Library of Congress.

Study co-author Yue Lin, a doctoral student in geography at Ohio State, developed machine learning tools that can extract details about individual buildings from the maps, including their locations and footprints, the number of floors, their construction materials and their primary use, such as dwelling or business.

"We are able to get a very good idea of what the buildings look like from data we get from the Sanborn maps," Lin said.

The researchers tested their machine learning technique on two adjacent neighborhoods on the near east side of Columbus, Ohio, that were largely destroyed in the 1960s to make way for the construction of I-70.

One of the neighborhoods, Hanford Village, was developed in 1946 to house returning Black veterans of World War II.

"The GI bill gave returning veterans funds to purchase homes, but they could only be used on new builds," said study co-author Gerika Logan, outreach coordinator of CURA. "So most of the homes were lost to the highway not long after they were built."

The other neighborhood in the study was Driving Park, which also housed a thriving Black community until I-70 split it in two.

The researchers used 13 Sanborn maps for the two neighborhoods produced in 1961, just before I-70 was built. Machine learning techniques were able to extract the data from the maps and create digital models.

Comparing data from the Sanborn maps to today showed that a total of 380 buildings were demolished in the two neighborhoods for the highway, including 286 houses, 86 garages, five apartments and three stores.

Analysis of the results showed that the machine learning model was very accurate in recreating the information contained in the maps -- about 90% accurate for building footprints and construction materials.
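
The article does not say which accuracy metric was used; one common way to quantify footprint agreement is intersection-over-union (IoU) between extracted and reference building masks. A self-contained sketch with toy data (not the authors' pipeline):

    import numpy as np

    def footprint_iou(pred: np.ndarray, truth: np.ndarray) -> float:
        """Intersection-over-union between two binary building-footprint masks."""
        intersection = np.logical_and(pred, truth).sum()
        union = np.logical_or(pred, truth).sum()
        return float(intersection / union) if union else 1.0

    # Toy rasterized footprints: True = building pixel, False = background.
    truth = np.zeros((100, 100), dtype=bool)
    truth[20:60, 30:70] = True                   # reference footprint traced from the map
    pred = np.zeros((100, 100), dtype=bool)
    pred[22:60, 30:68] = True                    # footprint extracted by the model
    print(f"footprint IoU: {footprint_iou(pred, truth):.2f}")   # prints ~0.90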

"The accuracy was impressive. We can actually get a visual sense of what these neighborhoods looked like that wouldn't be possible in any other way," Miller said.

"We want to get to the point in this project where we can give people virtual reality headsets and let them walk down the street as it was in 1960 or 1940 or perhaps even 1881."

Using the machine learning techniques developed for this study, researchers could develop similar 3D models for nearly any of the 12,000 cities and towns that have Sanborn maps, Miller said.

This will allow researchers to re-create neighborhoods lost to natural disasters like floods, as well as urban renewal, depopulation and other types of change.

Because the Sanborn maps include information on businesses that occupied specific buildings, researchers could re-create digital neighborhoods to determine the economic impact of losing them to urban renewal or other factors. Another possibility would be to study how replacing homes with highways that absorbed the sun's heat affected the urban heat island effect.

"There's a lot of different types of research that can be done. This will be a tremendous resource for urban historians and a variety of other researchers," Miller said.

Read more at Science Daily

An unexpected doorway into the ear opens new possibilities for hearing restoration

An international team of researchers has developed a new method to deliver drugs into the inner ear. The discovery was made possible by harnessing the natural flow of fluids in the brain and employing a little-understood backdoor into the cochlea. Using this route to deliver a gene therapy that repairs inner ear hair cells, the researchers were able to restore hearing in deaf mice.

"These findings demonstrate that cerebrospinal fluid transport comprises an accessible route for gene delivery to the adult inner ear and may represent an important step towards using gene therapy to restore hearing in humans," said Maiken Nedergaard, MD, DMSc, senior author of the new study, which appears in the journal Science Translational Medicine.

Nedergaard is co-director of the Center for Translational Neuromedicine at University of Rochester and the University of Copenhagen. The study was the product of a collaboration between researchers at the two universities and a group led by Barbara Canlon, Ph.D. in the Laboratory of Experimental Audiology at the Karolinska Institute in Stockholm, Sweden.

The number of people worldwide predicted to have mild to complete hearing loss is expected to grow to around 2.5 billion by mid-century. The primary cause is the death or loss of function of hair cells found in the cochlea -- which are responsible for relaying sounds to the brain -- due to mutations of critical genes, aging, noise exposure, and other factors.

While hair cells do not naturally regenerate in humans and other mammals, gene therapies have shown promise and, in separate studies, have successfully repaired the function of hair cells in neonatal and very young mice. However, as both mice and humans age, the cochlea, already a delicate structure, becomes enclosed in the temporal bone. At this point, any effort to reach the cochlea and deliver a gene therapy via surgery risks damaging this sensitive area and altering hearing.

In the new study, the researchers describe a little-understood passage into the cochlea called the cochlear aqueduct. While the name conjures images of monumental stone architecture, the cochlear aqueduct is a thin bony channel no larger than a single strand of hair. It has long been suspected of playing a role in balancing pressure in the ear; the new study shows that it also acts as a conduit between the cerebrospinal fluid found in the inner ear and the rest of the brain.

Scientists are developing a clearer picture of the mechanics of the glymphatic system, the brain's unique process of removing waste, which was first described by the Nedergaard lab in 2012. Because the glymphatic system pumps cerebrospinal fluid deep into brain tissue to wash away toxic proteins, researchers have been eyeing it as a potential new way to deliver drugs into the brain, a major challenge in developing drugs for neurological disorders.

Researchers have also discovered that the complex movement of fluids driven by the glymphatic system extends to the eyes and the peripheral nervous system, including the ear. The new study represented an opportunity to put the drug delivery potential of the glymphatic system to the test, while at the same time targeting a previously unreachable part of the auditory system.

Employing a number of imaging and modeling technologies, the researchers were able to develop a detailed portrait of how fluid from other parts of the brain flows through the cochlear aqueduct and into the inner ear. The team then injected an adeno-associated virus into the cisterna magna, a large reservoir of cerebrospinal fluid found at the base of the skull. The virus found its way into the inner ear via the cochlear aqueduct and delivered a gene therapy that expresses a protein called vesicular glutamate transporter-3, which enabled the hair cells to transmit signals and rescued hearing in adult deaf mice.

"This new delivery route into the ear may not only serve the advancement of auditory research, but also prove useful when translated to humans with progressive genetic-mediated hearing loss," said Nedergaard.

Read more at Science Daily