Thirteen young adults with tetraplegia are able to feed themselves, hold a drink, brush their teeth, and write as a result of a novel surgical technique that connects functioning nerves with injured nerves to restore power in paralysed muscles. Nerve transfer surgery has enabled 13 young adults with complete paralysis to regain movement and function in their elbows and hands, according to the largest case series of this technique in people with tetraplegia (paralysis of both the upper and lower limbs), published in The Lancet.
During the surgery, Australian surgeons attached functioning nerves above the spinal injury to paralysed nerves below the injury. Two years after surgery, and following intensive physical therapy, participants were able to reach their arm out in front of them and open their hand to pick up and manipulate objects. Restoring elbow extension improved their ability to propel their wheelchair and to transfer into bed or a car.
They can now perform everyday tasks independently such as feeding themselves, brushing teeth and hair, putting on make-up, writing, handling money and credit cards, and using tools and electronic devices.
The findings suggest that nerve transfers can achieve similar functional improvements to traditional tendon transfers, with the benefit of smaller incisions and shorter immobilisation times after surgery.
In 10 participants, nerve transfers were uniquely combined with tendon transfers allowing different styles of reconstruction to be performed in each hand, and enabling participants to benefit from the innate strengths of both tendon and nerve transfers. Nerve transfers restored more natural movement and finer motor control in one hand, and tendon transfers restored more power and heavy lifting ability in the other hand.
While only a small study, researchers say that nerve transfers are a major advance in the restoration of hand and arm function, and offer another safe, reliable surgical option for people living with tetraplegia.
Nevertheless, four nerve transfers failed in three participants, and the authors conclude that more research will be needed to determine which people are the best candidates for nerve transfer surgery, so that the incidence of failure can be minimised.
"For people with tetraplegia, improvement in hand function is the single most important goal. We believe that nerve transfer surgery offers an exciting new option, offering individuals with paralysis the possibility of regaining arm and hand functions to perform everyday tasks, and giving them greater independence and the ability to participate more easily in family and work life," says Dr Natasha van Zyl from Austin Health in Melbourne, Australia who led the research.
"What's more, we have shown that nerve transfers can be successfully combined with traditional tendon transfer techniques to maximise benefits. When grasp and pinch was restored using nerve transfers in one hand and tendon transfers in the other, participants consistently reporting that they liked both hands for different reasons and would not choose to have two hands reconstructed in the same way."
Traditionally, upper limb function has been reconstructed using tendon transfer surgery, during which muscles that still work, but are designed for another function, are surgically re-sited to do the work of muscles that are paralysed. In contrast, nerve transfers allow the direct reanimation of the paralysed muscle itself. Additionally, nerve transfers can re-animate more than one muscle at a time, require a shorter period of immobilisation after surgery (10 days in a sling for a nerve transfer vs 6-12 weeks in a brace for a tendon transfer for elbow extension), and avoid the technical problems associated with tendon transfer surgery, including tendon tensioning during surgery and mechanical failure (stretch or rupture) after surgery.
Previous single case reports and small retrospective studies have shown nerve transfer surgery to be feasible and safe in people with tetraplegia. But this is the first prospective study to use standardised functional outcome measures and combinations of multiple nerve and tendon transfer surgeries.
In total, the study recruited 16 young adults (average age 27 years) with traumatic, early (less than 18 months post injury) spinal cord injury to the neck (C5-C7), who were referred to Austin Health in Melbourne for restoration of function in the upper limb. Most injuries were the result of motor vehicle accidents or sports injuries.
Participants underwent single or multiple nerve transfers in one or both upper limbs to restore elbow extension, grasp, pinch, and hand opening. This involved taking working nerves to expendable muscles innervated above the spinal injury and attaching them to the nerves of paralysed muscles innervated below the injury to restore voluntary control and reanimate the paralysed muscle.
For example, the surgeons selected the nerve supplying the teres minor muscle in the shoulder as a donor nerve and attached it to the nerve supplying the triceps, the muscle that extends (straightens) the elbow. To restore grasp and pinch, the nerve to a spare wrist extensor muscle was transferred to the anterior interosseous nerve.
In total, 59 nerve transfers were completed in 16 participants (13 men and three women; 27 limbs). In 10 participants (12 limbs), nerve transfers were combined with tendon transfers to improve hand function.
Participants completed assessments of their level of independence in activities of daily living (e.g., self-care and toileting), upper limb function, muscle power, grasp and pinch strength, and hand opening ability before surgery, one year after surgery, and again two years after surgery. Two participants were lost to follow-up, and there was one death (unrelated to the surgery).
At 24 months, significant improvements were noted in the hand's ability to pick up and release several objects within a specified time frame, and in independence. Prior to surgery, none of the participants were able to score on the grasp or pinch strength tests, but two years later pinch and grasp strength were high enough to perform most activities of daily living.
Three participants had four failed nerve transfers -- two had a permanent decrease in sensation, and two had a temporary decrease in wrist strength that resolved by 1 year after surgery. Overall, surgery was well tolerated. Five serious adverse events were recorded (including a fall from a wheelchair with femur fracture), but none were related to the surgery.
Despite these achievements, nerve transfer surgery still has some limitations. For the best results nerve transfers should ideally be performed within 6-12 months of injury. Additionally, it can take months after nerve transfer for nerve regrowth into the paralysed muscle to occur and for new movement to be seen, and years until full strength is achieved. However, the authors note that one of the benefits of nerve transfers is that most movements not successfully restored by nerve transfers can still be restored using tendon transfers.
Read more at Science Daily
Jul 6, 2019
First observation of native ferroelectric metal
In a paper released today in Science Advances, UNSW researchers describe the first observation of a native ferroelectric metal.
The study represents the first example of a native metal with bistable and electrically switchable spontaneous polarization states -- the hallmark of ferroelectricity.
"We found coexistence of native metallicity and ferroelectricity in bulk crystalline tungsten ditelluride (WTe2) at room temperature," explains study author Dr Pankaj Sharma.
"We demonstrated that the ferroelectric state is switchable under an external electrical bias and explain the mechanism for 'metallic ferroelectricity' in WTe2 through a systematic study of the crystal structure, electronic transport measurements and theoretical considerations."
"A van der Waals material that is both metallic and ferroelectric in its bulk crystalline form at room temperature has potential for new nano-electronics applications," says author Dr Feixiang Xiang.
FERROELECTRIC BACKGROUNDER
Ferroelectricity can be considered an analogue of ferromagnetism. A ferromagnetic material displays permanent magnetism and, in layperson's terms, is simply a 'magnet' with a north and a south pole. A ferroelectric material likewise displays an analogous electrical property, a permanent electric polarisation, which originates from electric dipoles consisting of equal but oppositely charged ends, or poles. In ferroelectric materials, these electric dipoles exist at the unit-cell level and give rise to a non-vanishing permanent electric dipole moment.
This spontaneous electric dipole moment can be repeatedly transitioned between two or more equivalent states or directions upon application of an external electric field -- a property utilised in numerous ferroelectric technologies, for example nano-electronic computer memory, RFID cards, medical ultrasound transducers, infrared cameras, submarine sonar, vibration and pressure sensors, and precision actuators.
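To make the idea of bistable, electrically switchable polarization states concrete, the short sketch below evaluates the simplest Landau-type double-well free energy, F(P) = -aP^2 + bP^4 - EP. It is a generic textbook illustration with arbitrary coefficients, not the model used in the WTe2 study.

import numpy as np

# Toy Landau free energy for a ferroelectric: with no applied field there are
# two degenerate minima at P = +/- sqrt(a / (2b)), i.e. two stable polarization
# states; an external field E tilts the double well and favours one of them.
a, b = 1.0, 1.0

def free_energy(P, E=0.0):
    return -a * P**2 + b * P**4 - E * P

P = np.linspace(-1.5, 1.5, 601)
for E in (0.0, 0.3):
    F = free_energy(P, E)
    print(f"E = {E:+.1f}: global minimum near P = {P[np.argmin(F)]:+.3f}")

# At E = 0 the two wells (near P = +/- 0.707) are degenerate, so argmin simply
# reports one of them; a positive bias shifts the global minimum to the +P well.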
Conventionally, ferroelectricity has been observed in materials that are insulating or semiconducting rather than metallic, because conduction electrons in metals screen-out the static internal fields arising from the dipole moment.
THE STUDY
The study, 'A room-temperature ferroelectric semimetal,' was published in Science Advances in July 2019.
Bulk single-crystalline tungsten ditelluride (WTe2), which belongs to a class of materials known as transition metal dichalcogenides (TMDCs), was probed by spectroscopic electrical transport measurements, conductive-atomic force microscopy (c-AFM) to confirm its metallic behaviour, and by piezo-response force microscopy (PFM) to map the polarisation, detecting lattice deformation due to an applied electric field.
Ferroelectric domains -- i.e., the regions with oppositely oriented directions of polarization -- were directly visualised in freshly cleaved WTe2 single crystals.
Spectroscopic PFM measurements with a top electrode in a capacitor geometry were used to demonstrate switching of the ferroelectric polarization.
The study was supported by funding from the Australian Research Council through the ARC Centre of Excellence in Future Low-Energy Electronics Technologies (FLEET), and the work was performed in part using facilities of the NSW Nodes of the Australian National Fabrication Facility, with the assistance of the Australian Government Research Training Program Scholarship scheme.
First-principles density functional theory (DFT) calculations (University of Nebraska) confirmed the experimental findings of the electronic and structural origins of the ferroelectric instability of WTe2, supported by the National Science Foundation.
FERROELECTRIC STUDIES AT FLEET
Ferroelectric materials are keenly studied at FLEET (the ARC Centre of Excellence in Future Low-Energy Electronics Technologies) for their potential use in low-energy electronics, 'beyond CMOS' technology.
The switchable electric dipole moment of ferroelectric materials could for example be used as a gate for the underlying 2D electron system in an artificial topological insulator.
In comparison with conventional semiconductors, the very close (sub-nanometre) proximity of a ferroelectric's electric dipole moment to the electron gas in the atomic crystal ensures more effective switching, overcoming the limitation of conventional semiconductors, where the conducting channel is buried tens of nanometres below the surface.
Read more at Science Daily
Jul 5, 2019
More 'reactive' land surfaces cooled the Earth down
From time to time, there have been long periods of cooling in Earth's history. Temperatures had already fallen for more than ten million years before the last ice age began about 2.5 million years ago. At that time, massive ice sheets and glaciers covered the northern hemisphere. A geoscientific paradigm, widespread for over twenty years, explains this cooling with the formation of the large mountain ranges such as the Andes, the Himalayas and the Alps. As a result, more rock weathering took place, the paradigm suggests. This in turn removed more carbon dioxide (CO2) from the atmosphere, so that the 'greenhouse effect' decreased and the atmosphere cooled. This and other processes eventually led to the ice age.
In a new study, Jeremy Caves-Rugenstein from ETH Zurich, Dan Ibarra from Stanford University and Friedhelm von Blanckenburg from the GFZ German Research Centre for Geosciences in Potsdam were able to show that this paradigm cannot be upheld. According to the paper, weathering was constant over the period under consideration. Instead, increased 'reactivity' of the land surface has led to a decrease in CO2 in the atmosphere, thus cooling the Earth. The researchers published the results in the journal Nature.
A second look after isotope analysis
The process of rock weathering, and especially the chemical weathering of rocks with carbonic acid, has controlled the Earth's climate for billions of years. Carbonic acid is produced from CO2 when it dissolves in rainwater. Weathering thus removes CO2 from the Earth's atmosphere, precisely to the extent that volcanic gases supply the atmosphere with it. The paradigm that has been widespread so far states that with the formation of the large mountain ranges in the last 15 million years, erosion processes have increased -- and with them also the CO2-binding rock weathering. Indeed, geochemical measurements in ocean sediments show that the proportion of CO2 in the atmosphere strongly decreased during this phase.
"The hypothesis, however, has a big catch," explains Friedhelm von Blanckenburg of GFZ. "If the atmosphere had actually lost as much CO2 as the weathering created by erosion would have caused, it would hardly have had any CO2 left after less than a million years. All water would have had frozen to ice and life would have had a hard time to survive. But that was not the case."
Von Blanckenburg and his colleague Jane Willenbring had already shown that these doubts are justified in a 2010 study, which likewise appeared in Nature. "We used measurements of the rare isotope beryllium-10, produced by cosmic radiation in the Earth's atmosphere, and its ratio to the stable isotope beryllium-9 in ocean sediment to show that the weathering of the land surface had not increased at all," says Friedhelm von Blanckenburg.
The land's surface has become more 'reactive'
In the new study, Caves-Rugenstein, Ibarra and von Blanckenburg additionally used data on stable isotopes of the element lithium in ocean sediments as an indicator of weathering processes. They wanted to find out how, despite constant rock weathering, the amount of CO2 in the atmosphere could have decreased. They entered their data into a computer model of the global carbon cycle.
Indeed, the model results showed that the potential of the land surface to weather increased, but not the speed at which it actually weathered. The researchers call this potential the 'reactivity' of the land surface. "Reactivity describes how easily chemical compounds or elements take part in a reaction," explains Friedhelm von Blanckenburg. If more unweathered, and therefore more reactive, rock is exposed at the surface, it can in total react as extensively with a little CO2 in the atmosphere as heavily weathered rock would with a lot of CO2. The decrease in atmospheric CO2 that is responsible for the cooling can thus be explained without an increased speed of weathering.
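To see why higher reactivity can lower equilibrium CO2 even while the weathering flux stays constant, consider a minimal steady-state balance between volcanic CO2 input and weathering. This is an illustrative sketch only; the power-law form and all constants are assumptions, not the authors' carbon-cycle model.

# Toy silicate-weathering feedback: the weathering flux W grows with both the
# land-surface reactivity R and atmospheric CO2 (power-law dependence assumed).
def weathering_flux(co2, reactivity, k=1.0, n=0.5):
    return k * reactivity * co2**n

# At steady state, weathering must balance a roughly constant volcanic CO2
# source V, so the equilibrium CO2 is (V / (k*R))**(1/n).
def equilibrium_co2(volcanic_flux, reactivity, k=1.0, n=0.5):
    return (volcanic_flux / (k * reactivity))**(1.0 / n)

V = 20.0                      # arbitrary volcanic degassing rate
for R in (1.0, 2.0):          # doubling the reactivity of the exposed rock
    co2 = equilibrium_co2(V, R)
    print(f"R = {R}: equilibrium CO2 = {co2:6.1f} (arb. units), "
          f"weathering flux = {weathering_flux(co2, R):.1f} = V")

# The weathering flux equals V in both cases (constant weathering, as the
# isotope data require), yet the equilibrium CO2 drops when reactivity rises.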
Read more at Science Daily
How trees could save the climate
The Crowther Lab at ETH Zurich investigates nature-based solutions to climate change. In their latest study the researchers showed for the first time where in the world new trees could grow and how much carbon they would store. Study lead author and postdoc at the Crowther Lab Jean-François Bastin explains: "One aspect was of particular importance to us as we did the calculations: we excluded cities or agricultural areas from the total restoration potential as these areas are needed for human life."
Reforest an area the size of the USA
The researchers calculated that under the current climate conditions, Earth's land could support 4.4 billion hectares of continuous tree cover. That is 1.6 billion more than the currently existing 2.8 billion hectares. Of these 1.6 billion hectares, 0.9 billion hectares fulfill the criterion of not being used by humans. This means that there is currently an area the size of the US available for tree restoration. Once mature, these new forests could store 205 billion tonnes of carbon: about two thirds of the 300 billion tonnes of carbon that has been released into the atmosphere as a result of human activity since the Industrial Revolution.
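As a quick back-of-the-envelope check of these figures (a sketch only: the hectare and carbon totals are the ones quoted above, and the per-hectare number is simply derived from them rather than taken from the study):

# Reforestation arithmetic based on the figures quoted in the article.
total_supportable_ha = 4.4e9    # hectares that could carry continuous tree cover
existing_cover_ha    = 2.8e9    # hectares of existing tree cover
restorable_ha        = 0.9e9    # hectares not occupied by cities or agriculture
carbon_if_mature_gt  = 205.0    # billion tonnes of carbon stored once mature
emitted_by_humans_gt = 300.0    # billion tonnes emitted since the Industrial Revolution

print(total_supportable_ha - existing_cover_ha)       # 1.6e9 ha of additional potential cover
print(carbon_if_mature_gt / emitted_by_humans_gt)     # ~0.68, i.e. roughly two thirds
print(carbon_if_mature_gt * 1e9 / restorable_ha)      # ~228 t of carbon per restorable hectare (derived)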
According to Prof. Thomas Crowther, co-author of the study and founder of the Crowther Lab at ETH Zurich: "We all knew that restoring forests could play a part in tackling climate change, but we didn't really know how big the impact would be. Our study shows clearly that forest restoration is the best climate change solution available today. But we must act quickly, as new forests will take decades to mature and achieve their full potential as a source of natural carbon storage."
Russia best suited for reforestation
The study also shows which parts of the world are most suited to forest restoration. The greatest potential can be found in just six countries: Russia (151 million hectares); the US (103 million hectares); Canada (78.4 million hectares); Australia (58 million hectares); Brazil (49.7 million hectares); and China (40.2 million hectares).
Many current climate models are wrong in expecting climate change to increase global tree cover, the study warns. It finds that there is likely to be an increase in the area of northern boreal forests in regions such as Siberia, but tree cover there averages only 30 to 40 percent. These gains would be outweighed by the losses suffered in dense tropical forests, which typically have 90 to 100 percent tree cover.
Look at Trees!
A tool on the Crowther Lab website enables users to look at any point on the globe, and find out how many trees could grow there and how much carbon they would store. It also offers lists of forest restoration organisations. The Crowther Lab will also be present at this year's Scientifica (website available in German only) to show the new tool to visitors.
Read more at Science Daily
Camera brings unseen world to light
When the first full-length movie made with the advanced, three-color process of Technicolor premiered in 1935, The New York Times declared "it produced in the spectator all the excitement of standing upon a peak ... and glimpsing a strange, beautiful and unexpected new world."
Technicolor forever changed how cameras -- and people -- saw and experienced the world around them. Today, there is a new precipice -- this one, offering views of a polarized world.
Polarization, the direction in which light vibrates, is invisible to the human eye (but visible to some species of shrimp and insects). But it provides a great deal of information about the objects with which it interacts. Cameras that see polarized light are currently used to detect material stress, enhance contrast for object detection, and analyze surface quality for dents or scratches.
However, like the early color cameras, current-generation polarization-sensitive cameras are bulky. Moreover, they often rely on moving parts and are costly, severely limiting the scope of their potential application.
Now, researchers at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have developed a highly compact, portable camera that can image polarization in a single shot. The miniature camera -- about the size of a thumb -- could find a place in the vision systems of autonomous vehicles, onboard planes or satellites to study atmospheric chemistry, or be used to detect camouflaged objects.
The research is published in Science.
"This research is game-changing for imaging," said Federico Capasso, the Robert L. Wallace Professor of Applied Physics and Vinton Hayes Senior Research Fellow in Electrical Engineering at SEAS and senior author of the paper. "Most cameras can typically only detect the intensity and color of light but can't see polarization. This camera is a new eye on reality, allowing us to reveal how light is reflected and transmitted by the world around us."
"Polarization is a feature of light that is changed upon reflection off a surface," said Paul Chevalier, a postdoctoral fellow at SEAS and co-author of the study. "Based on that change, polarization can help us in the 3D reconstruction of an object, to estimate its depth, texture and shape, and to distinguish man-made objects from natural ones, even if they're the same shape and color."
To unlock that powerful world of polarization, Capasso and his team harnessed the potential of metasurfaces, nanoscale structures that interact with light at wavelength size-scales.
"If we want to measure the light's full polarization state, we need to take several pictures along different polarization directions," said Noah Rubin, first author of the paper and graduate student in the Capasso Lab. "Previous devices either used moving parts or sent light along multiple paths to acquire the multiple images, resulting in bulky optics. A newer strategy uses specially patterned camera pixels, but this approach does not measure the full polarization state and requires a non-standard imaging sensor. In this work, we were able to take all of the optics needed and integrate them in a single, simple device with a metasurface."
Using a new understanding of how polarized light interacts with objects, the researchers designed a metasurface that uses an array of subwavelength-spaced nanopillars to direct light based on its polarization. The light then forms four images, each one showing a different aspect of the polarization. Taken together, these give a full snapshot of polarization at every pixel.
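For readers unfamiliar with how four intensity images can pin down the full polarization state, here is a minimal sketch of the textbook reconstruction from four ideal analyzer projections (0, 45 and 90 degree linear plus right-circular). It illustrates the general principle only; the actual metasurface in the Harvard device uses its own set of polarization projections and a calibrated instrument matrix.

import numpy as np

def stokes_from_four(i0, i45, i90, i_rcp):
    """Recover the Stokes parameters (S0..S3) at every pixel from four
    intensity images taken behind ideal analyzers: linear 0, 45, 90 degrees
    and a right-circular analyzer. Inputs are 2D arrays of equal shape."""
    s0 = i0 + i90              # total intensity
    s1 = i0 - i90              # horizontal vs vertical linear component
    s2 = 2.0 * i45 - s0        # +45 vs -45 degree linear component
    s3 = 2.0 * i_rcp - s0      # right vs left circular component
    return np.stack([s0, s1, s2, s3])

# Example: a uniform scene that is fully +45-degree linearly polarized.
shape = (4, 4)
i0, i45, i90, i_rcp = (np.full(shape, 0.5), np.ones(shape),
                       np.full(shape, 0.5), np.full(shape, 0.5))
S = stokes_from_four(i0, i45, i90, i_rcp)
dolp = np.sqrt(S[1]**2 + S[2]**2) / S[0]   # degree of linear polarization -> 1.0
print(S[:, 0, 0], dolp[0, 0])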
The device is about two centimeters in length and no more complicated than a camera on a smartphone. With an attached lens and protective case, the device is about the size of a small lunch box. The researchers tested the camera to show defects in injection-molded plastic objects, took it outside to film the polarization off car windshields and even took selfies to demonstrate how a polarization camera can visualize the 3D contours of a face.
"This technology could be integrated into existing imaging systems, such as the one in your cell phone or car, enabling the widespread adoption of polarization imaging and new applications previously unforeseen," said Rubin.
"This research opens an exciting new direction for camera technology with unprecedented compactness, allowing us to envision applications in atmospheric science, remote sensing, facial recognition, machine vision and more," said Capasso.
The Harvard Office of Technology Development has protected the intellectual property relating to this project and is exploring commercialization opportunities.
Read more at Science Daily
Pain signaling in humans more rapid than previously known
Pain signals can travel as fast as touch signals, according to a new study from researchers at Linköping University in Sweden, Liverpool John Moores University in the UK, and the National Institutes of Health (NIH) in the US. The discovery of a rapid pain-signalling system challenges our current understanding of pain. The study is published in the scientific journal Science Advances.
It has until now been believed that nerve signals for pain are always conducted more slowly than those for touch. The latter signals, which allow us to determine where we are being touched, are conducted by nerves that have a fatty sheath of myelin that insulates the nerve. Nerves with a thick layer of myelin conduct signals more rapidly than unmyelinated nerves. In contrast, the signalling of pain in humans has been thought to be considerably slower and carried out by nerves that have only a thin layer of myelin, or none at all.
In monkeys and many other mammals, on the other hand, part of the pain-signalling system can conduct nerve signals just as fast as the system that signals touch. The scientists speculated whether such a system is also present in humans.
"The ability to feel pain is vital to our survival, so why should our pain-signalling system be so much slower than the system used for touch, and so much slower than it could be?" asks Saad Nagi, principal research engineer of the Department of Clinical and Experimental Medicine and the Center for Social and Affective Neuroscience (CSAN) at Linköping University.
To answer this, the scientists used a technique that allowed them to detect the signals in the nerve fibres from a single nerve cell. They examined 100 healthy volunteers and looked for nerve cells that conducted signals as rapidly as the nerve cells that detect touch, but that had the properties of pain receptors, otherwise known as nociceptors. Pain receptors are characterised by the ability to detect noxious stimuli, such as pinching and abrasion of the skin, while not reacting to light touch. The researchers found that 12% of thickly myelinated nerve cells had the same properties as pain receptors, and in these nerve cells the conduction speed was as high as in touch-sensitive nerve cells.
The next step of the scientists' research was to determine the function of these ultrafast pain receptors. By applying short electrical pulses through the measurement electrodes, they could stimulate individual nerve cells. The volunteers described experiencing sharp or pinprick pain.
"When we activated an individual nerve cell, it caused a perception of pain, so we conclude that these nerve cells are connected to pain centres in the brain," says Saad Nagi.
The research team also investigated patients with various rare neurological conditions. One group of people had, as adults, acquired nerve damage that led to the thickly myelinated nerve fibres being destroyed, while the small fibres were spared. These patients cannot detect light touch. The scientists predicted that the loss of myelinated nerve fibres should also affect the rapidly conducting pain system they had identified. It turned out that these people had an impaired ability to experience mechanical pain. Examination of patients with two other rare neurological conditions gave similar results. These results may be highly significant for pain research, and for the diagnosis and care of patients with pain.
Read more at Science Daily
Jul 4, 2019
Murder in the Paleolithic? Evidence of violence behind human skull remains
New analysis of the fossilized skull of an Upper Paleolithic man suggests that he died a violent death, according to a study published July 3, 2019 in the open-access journal PLOS ONE by an international team from Greece, Romania and Germany led by the Eberhard Karls Universität Tübingen, Germany.
The fossilized skull of a Paleolithic adult man, known as the Cioclovina calvaria, was originally uncovered in a cave in South Transylvania and is thought to be around 33,000 years old. Since its discovery, this fossil has been extensively studied. Here, the authors reassessed trauma on the skull -- specifically a large fracture on the right aspect of the cranium which has been disputed in the past -- in order to evaluate whether this specific fracture occurred at the time of death or as a postmortem event.
The authors conducted experimental trauma simulations using twelve synthetic bone spheres, testing scenarios such as falls from various heights as well as single or double blows from rocks or bats. Along with these simulations, the authors inspected the fossil both visually and virtually using computed tomography technology.
The authors found there were actually two injuries at or near the time of death: a linear fracture at the base of the skull, followed by a depressed fracture on the right side of the cranial vault. The simulations showed that these fractures strongly resemble the pattern of injury resulting from consecutive blows with a bat-like object; the positioning suggests the blow resulting in the depressed fracture came from a face-to-face confrontation, possibly with the bat in the perpetrator's left hand. The researchers' analysis indicates that the two injuries were not the result of accidental injury, post-mortem damage, or a fall alone.
While the fractures would have been fatal, only the fossilized skull has been found, so it is possible that bodily injuries leading to death might also have been sustained. Regardless, the authors state that the forensic evidence described in this study points to an intentionally caused violent death, suggesting that homicide was practiced by early humans during the Upper Paleolithic.
The authors add: "The Upper Paleolithic was a time of increasing cultural complexity and technological sophistication. Our work shows that violent interpersonal behaviour and murder was also part of the behavioural repertoire of these early modern Europeans."
From Science Daily
Winter monsoons became stronger during geomagnetic reversal
New evidence suggests that high-energy particles from space known as galactic cosmic rays affect the Earth's climate by increasing cloud cover, causing an "umbrella effect."
When galactic cosmic rays increased during the Earth's last geomagnetic reversal transition 780,000 years ago, the umbrella effect of low-cloud cover led to high atmospheric pressure in Siberia, causing the East Asian winter monsoon to become stronger. This is evidence that galactic cosmic rays influence changes in the Earth's climate. The findings were made by a research team led by Professor Masayuki Hyodo (Research Center for Inland Seas, Kobe University) and published on June 28 in the online edition of Scientific Reports.
The Svensmark Effect is a hypothesis that galactic cosmic rays induce low cloud formation and influence the Earth's climate. Tests based on recent meteorological observation data only show minute changes in the amounts of galactic cosmic rays and cloud cover, making it hard to prove this theory. However, during the last geomagnetic reversal transition, when the amount of galactic cosmic rays increased dramatically, there was also a large increase in cloud cover, so it should be possible to detect the impact of cosmic rays on climate at a higher sensitivity.
In the Chinese Loess Plateau, just south of the Gobi Desert near the border of Mongolia, dust has been transported for 2.6 million years to form loess layers -- sediment created by the accumulation of wind-blown silt -- that can reach up to 200 meters in thickness. If the wind gets stronger, the coarse particles are carried further, and larger amounts are transported. Focusing on this phenomenon, the research team proposed that winter monsoons became stronger under the umbrella effect of increased cloud cover during the geomagnetic reversal. They investigated changes in particle size and accumulation speed of loess layer dust in two Loess Plateau locations.
In both locations, for about 5,000 years during the geomagnetic reversal 780,000 years ago, they discovered evidence of stronger winter monsoons: particles became coarser, and accumulation speeds were up to more than three times faster. These strong winter monsoons coincide with the period during the geomagnetic reversal when the Earth's magnetic strength fell to less than a quarter of its normal value and galactic cosmic rays increased by over 50%. This suggests that the increase in cosmic rays was accompanied by an increase in low-cloud cover, the umbrella effect of the clouds cooled the continent, and the Siberian high-pressure system became stronger. Added to other phenomena during the geomagnetic reversal -- evidence from the sediment in Osaka Bay of an annual average temperature drop of 2-3 degrees Celsius and an increase in annual temperature ranges -- this new discovery about winter monsoons provides further proof that the climate changes are caused by the cloud umbrella effect.
Read more at Science Daily
Measuring the laws of nature
There are some numerical values that define the basic properties of our universe. They are just as they are, and no one can tell why. These include, for example, the value of the speed of light, the mass of the electron, or the coupling constants that define the strength of the forces of nature.
One of these coupling constants, the "weak axial vector coupling constant" (abbreviated to gA), has now been measured with very high precision. This constant is needed to explain nuclear fusion in the sun, to understand the formation of elements shortly after the Big Bang, and to understand important experiments in particle physics. With the help of sophisticated neutron experiments, the value of the coupling constant gA has now been determined with an accuracy of 0.04%. The result has now been published in the journal "Physical Review Letters."
When particles change
There are four fundamental forces in our universe: electromagnetism, strong and weak nuclear force, and gravity. "To calculate these forces, we have to know certain parameters that determine their strength -- and especially in the case of weak interaction, this is a complicated matter," says Prof. Hartmut Abele from the Institute of Atomic and Subatomic Physics at TU Wien (Vienna). Weak interaction plays a crucial role when certain particles are transformed into others -- for example, when two protons merge into a nucleus in the sun and one of them becomes a neutron. To analyze such processes, the "weak axial vector coupling constant" gA has to be known.
There have been different attempts to measure gA. "For some of them, however, systematic corrections were required. Major disturbing factors can change the result by up to 30%," says Hartmut Abele.
A different measuring principle called "PERKEO" was developed in the 1980s in Heidelberg by Prof. Dirk Dubbers. Hartmut Abele has been involved in the work on the PERKEO detectors for many years; he developed "PERKEO 2" himself as part of his dissertation. He worked together with his former student Prof. Bastian Märkisch from TU Munich and Torsten Soldner from the Institut Laue-Langevin in Grenoble to significantly improve the measurement. With "PERKEO 3," new measurements have now been carried out in Grenoble, far exceeding all previous experiments in terms of accuracy.
The PERKEO detector analyzes neutrons, which decay into protons while emitting a neutrino and an electron. "This electron emission is not perfectly symmetric," explains Hartmut Abele. "On one side, a few more electrons are emitted than on the other -- that depends on the spin direction of the neutron." The PERKEO detector uses strong magnetic fields to collect the electrons in both directions and then counts them. From the strength of the asymmetry, i.e. the difference in the number of electrons in the two directions, one can then directly deduce the value of the coupling constant gA.
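As a rough illustration of that last step -- a minimal sketch, not the PERKEO analysis itself, which applies further experimental corrections -- the measured electron-emission asymmetry A is related to the ratio lambda = gA/gV (the axial-to-vector coupling ratio) by the standard neutron-decay expression A = -2(lambda^2 + lambda)/(1 + 3 lambda^2), which can be inverted numerically. The asymmetry value below is only a placeholder of roughly the right size, and the sketch requires Python with SciPy:

    # Illustrative only: invert the standard neutron beta-decay asymmetry
    # relation A = -2*(lam**2 + lam) / (1 + 3*lam**2), with lam = gA/gV,
    # to recover lam from a measured asymmetry value.
    from scipy.optimize import brentq

    def asymmetry(lam):
        return -2.0 * (lam**2 + lam) / (1.0 + 3.0 * lam**2)

    A_measured = -0.119  # placeholder asymmetry, roughly the accepted scale
    # lambda for the neutron is known to lie near -1.27, so bracket the root there
    lam = brentq(lambda x: asymmetry(x) - A_measured, -1.5, -1.1)
    print(f"lambda = gA/gV is approximately {lam:.4f}")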
Read more at Science Daily
Scientists weigh the balance of matter in galaxy clusters
A method of weighing the quantities of matter in galaxy clusters -- the largest objects in our universe -- has shown a balance between the amounts of hot gas, stars and other materials.
The results are the first to use observational data to measure this balance, which was theorized 20 years ago, and will yield fresh insight into the relationship between the ordinary matter that emits light and dark matter, and into how our universe is expanding.
Galaxy clusters are the largest objects in the universe, each composed of around 1,000 massive galaxies. They contain vast amounts of dark matter, along with hot gas and cooler "ordinary matter," such as stars and cooler gas.
In a new study, published in Nature Communications, an international team led by astrophysicists from the University of Michigan in the US and the University of Birmingham in the UK used data from the Local Cluster Substructure Survey (LoCuSS) to measure the connections between the three main mass components that comprise galaxy clusters -- dark matter, hot gas, and stars.
Members of the research team spent 12 years gathering data, spanning a factor of 10 million in wavelength, using the Chandra and XMM-Newton satellites, the ROSAT All-Sky Survey, the Subaru telescope, the United Kingdom Infrared Telescope (UKIRT), the Mayall telescope, the Sunyaev-Zeldovich Array, and the Planck satellite. Using sophisticated statistical models and algorithms built by Dr Arya Farahi during his doctoral studies at the University of Michigan, the team concluded that the sum of gas and stars across the clusters they studied is a nearly fixed fraction of the dark matter mass. This means that as stars form, the amount of hot gas available decreases proportionally.
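A toy numerical sketch of that fixed-fraction idea -- an illustration under simple assumptions, not the team's statistical model -- shows why it implies anti-correlated star and gas masses: if gas plus stars is a nearly constant share of each cluster's dark matter mass, any cluster that converts more of its gas into stars must hold correspondingly less hot gas. All of the numbers below are hypothetical:

    # Toy model: baryons (gas + stars) are a fixed fraction of halo mass,
    # so scatter in the stellar mass is mirrored by the hot gas mass.
    import numpy as np

    rng = np.random.default_rng(0)
    n_clusters = 41                                 # same count as the study's sample
    m_halo = rng.uniform(2e14, 2e15, n_clusters)    # hypothetical halo masses (solar masses)
    baryon_fraction = 0.12                          # assumed, nearly constant per cluster
    m_baryon = baryon_fraction * m_halo
    m_star = m_baryon * rng.uniform(0.1, 0.3, n_clusters)  # stars take a varying share
    m_gas = m_baryon - m_star                       # the remainder stays as hot gas

    # At fixed baryon fraction the star and gas fractions are anti-correlated
    corr = np.corrcoef(m_star / m_halo, m_gas / m_halo)[0, 1]
    print(f"correlation of star and gas fractions: {corr:.2f}")  # close to -1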
"This validates the predictions of the prevailing cold dark matter theory. Everything is consistent with our current understanding of the universe," said Dr Farahi, currently a McWilliams Postdoctoral Fellow in the Department of Physics at Carnegie Mellon University.
Dr Graham Smith of the School of Physics and Astronomy at the University of Birmingham and Principal Investigator of LoCuSS, says: "A certain amount of material within the universe collapses to form galaxy clusters.
"But once they are formed, these clusters are 'closed boxes'. The hot gas has either formed stars, or still remains as gas, but the overall quantity remains constant."
"This research is powered by more than a decade of telescope investments," adds Professor August E. Evrard, of the University of Michigan. "Using this high quality data, we were able to characterise 41 nearby galaxy clusters and find a special relationship, specifically anti-correlated behaviour between the mass in stars and the mass in hot gas. This is significant because these two measurements together give us the best indication of the total system mass."
The findings will be crucial to astronomers' efforts to measure the properties of the universe as a whole. By gaining a better understanding of the internal physics of galaxy clusters, researchers will be able to better understand the behaviour of dark energy and the processes behind the expansion of the universe.
"Galaxy clusters are intrinsically fascinating, but in many ways still mysterious objects," adds Dr Smith. "Unpicking the complex astrophysics governing these objects will open many doors onto a broader understanding of the universe. Essentially, if we want to be able to claim that we understand how the universe works, we need to understand galaxy clusters."
Data of the kind studied by the team will grow by several orders of magnitude over the coming decades thanks to next-generation telescopes such as the Large Synoptic Survey Telescope (LSST) which is currently under construction in Chile, and e-ROSITA, a new x-ray satellite. Both will begin observations in the early 2020s.
Read more at Science Daily
Jul 3, 2019
Why do mosquitoes choose humans?
Carolyn "Lindy" McBride is studying a question that haunts every summer gathering: How and why are mosquitoes attracted to humans?
Few animals specialize as thoroughly as the mosquitoes that carry diseases like Zika, malaria and dengue fever.
In fact, of the more than 3,000 mosquito species in the world, most are opportunistic, said McBride, an assistant professor of ecology and evolutionary biology and the Princeton Neuroscience Institute. They may be mammal biters, or bird biters, with a mild preference for various species within those categories, but most mosquitoes are neither totally indiscriminate nor species-specific. But McBride is most interested in the mosquitoes that scientists call "disease vectors" -- carriers of diseases that plague humans -- some of which have evolved to bite humans almost exclusively.
She studies several mosquitoes that carry diseases, including Aedes aegypti, which is the primary vector for dengue fever, Zika and yellow fever, and Culex pipiens, which carries West Nile virus. A. aegypti specializes in humans, while C. pipiens is less specialized, allowing it to transmit West Nile from birds to humans.
"It's the specialists that tend to be the best disease vectors, for obvious reasons: They bite a lot of humans," said McBride. She's trying to understand how the brain and genome of these mosquitoes have evolved to make them specialize in humans -- including how they can distinguish us from other mammals so effectively.
To help her understand what draws human-specialized mosquitoes to us, McBride compares the behavior, genetics and brains of the Zika mosquito to an African strain of the same species that does not specialize in humans.
In one line of research, she investigates how animal brains interpret complex aromas. That's a more complicated proposition than it first appears, since human odor is composed of more than 100 different compounds -- and those same compounds, in slightly different ratios, are present in most mammals.
"Not any one of those chemicals is attractive to mosquitoes by itself, so mosquitoes must recognize the ratio, the exact blend of components that defines human odor," said McBride. "So how does their brain figure it out?"
She is also studying what combination of compounds attracts mosquitoes. That could lead to baits that attract mosquitoes to lethal traps, or repellants that interrupt the signal.
Most mosquito studies in recent decades have been behavioral experiments, which are very labor intensive, said McBride. "You give them an odor and say, 'Do you like this?' and even with five compounds, the number of permutations you have to go through to figure out exactly what the right ratio is -- it's overwhelming." With 15 or 20 compounds, the number of permutations skyrockets, and with the full complement of 100, it's astronomical.
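A back-of-envelope count makes the point -- an illustration of the scale of the problem, not the lab's experimental design: even if each compound's relative concentration is discretized into just a few levels, the number of candidate blends grows exponentially with the number of compounds.

    # Illustrative only: count candidate blends when each compound's relative
    # concentration is allowed k discrete levels. Real odors vary continuously,
    # so this understates the true search space.
    k = 4  # hypothetical number of concentration levels per compound
    for n_compounds in (5, 20, 100):
        n_blends = k ** n_compounds
        print(f"{n_compounds:3d} compounds -> {n_blends:.3e} candidate blends")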
To test the odor preference of mosquitoes, McBride's lab has primarily used guinea pigs, small mammals whose odor is a different blend of many of the same roughly 100 compounds found in human odor. Researchers gather the animals' odor by blowing air over their bodies and then present mosquitoes with a choice between eau de guinea pig and a human arm. Human-specialized "domestic" A. aegypti mosquitoes will fly toward the arm 90 to 95 percent of the time, said McBride, but the African "forest" A. aegypti mosquitoes are more likely to fly toward the guinea pig aroma.
In another recent experiment, then-senior Meredith Mihalopoulos of the Class of 2018 recruited seven volunteers and ran "preference tests" with both forest and domestic A. aegypti mosquitoes, letting the mosquitoes choose between herself and each of the volunteers and finding that some people are more attractive to the insects than others. Then Alexis Kriete, a research specialist in the McBride lab, analyzed the odor of all the participants. The analysis showed that, although the same compounds were present in everyone, the humans were more similar to one another than to the guinea pigs.
"There's nothing really unique about any animal odor," said McBride. "There's no one compound that characterizes a guinea pig species. To recognize a species, you have to recognize blends."
The McBride lab will be expanding to include other mammals and birds in their research. Graduate student Jessica Zung is working with farms and zoos to collect hair, fur, feather and wool samples from 50 animal species. She hopes to extract odor from them and analyze the odors at a Rutgers University facility that fractionates odors and identifies the ratio of the compounds. By inputting their odor profiles into a computational model, she and McBride hope to understand how exactly mosquitoes may have evolved to distinguish humans from non-human animals.
McBride's graduate student Zhilei Zhao is developing an entirely novel approach: imaging mosquito brains at very high resolutions to figure out how a mosquito identifies its next victim. "What combination of neural signals in the brain cause the mosquito to be attracted or repelled?" McBride asked. "If we can figure that out, then it's trivial to screen for blends that can be attractive or repellant. You put the mosquito up there, open up its head, image the brain, pop one aroma after another and watch: Does it hit the right combination of neurons?"
Key to that study will be the imaging equipment provided by Princeton's Bezos Center for Neural Circuit Dynamics, said McBride. "We can walk over there and say we want to image this, at this resolution, with this orientation, and a few months later, the microscope is built," she said. "We could have bought an off-the-shelf microscope, but it would have been so much slower and so much less powerful. Help from Stephan Thiberge, the director of the Bezos Center, has been critical for us."
McBride began her biology career studying evolution in butterflies, but she was lured to disease vector mosquitoes by how easy they are to rear in the lab. While the butterflies McBride studied need a year to develop, A. aegypti mosquitoes can go through an entire life cycle in three weeks, allowing for rapid-turnaround genetic experiments.
"That's what first drew me to mosquitoes," said McBride. "One of the surprises for me has been how satisfying it is that they have an impact on human health. That's certainly not why I got into biology -- I was studying birds and butterflies in the mountains, as far away from humans as I could get -- but I really appreciate that element of mosquito work now.
Read more at Science Daily
Obese people outnumber smokers two to one
New figures from Cancer Research UK show that people who are obese now outnumber people who smoke two to one in the UK, and excess weight causes more cases of certain cancers than smoking, as the charity urges Government action to tackle obesity.
Almost a third of UK adults are obese and, while smoking is still the nation's biggest preventable cause of cancer and carries a much higher risk of the disease than obesity, Cancer Research UK's analysis revealed that being overweight or obese trumps smoking as the leading cause of four different types of cancer.
Excess weight causes around 1,900 more cases of bowel cancer than smoking in the UK each year. The same worrying pattern is true of cancer in the kidneys (1,400 more cases caused by excess weight than by smoking each year in the UK), ovaries (460) and liver (180).
Cancer Research UK launched a nationwide campaign this week to increase awareness of the link between obesity and cancer. Extra body fat sends out signals that can tell cells to divide more often and can cause damage that builds up over time and raises the risk of cancer.
The campaign compares smoking and obesity to show how policy change can help people form healthier habits, not to compare tobacco with food.
Michelle Mitchell, Cancer Research UK's chief executive, said: "As smoking rates fall and obesity rates rise, we can clearly see the impact on a national health crisis when the Government puts policies in place -- and when it puts its head in the sand.
"Our children could be a smoke-free generation, but we've hit a devastating record high for childhood obesity, and now we need urgent Government intervention to end the epidemic. They still have a chance to save lives.
"Scientists have so far identified that obesity causes 13 types of cancer but the mechanisms aren't fully understood. So further research is needed to find out more about the ways extra body fat can lead to cancer."
The charity wants the Government to act on its ambition to halve childhood obesity rates by 2030 and introduce a 9pm watershed for junk food adverts on TV and online, alongside other measures such as restricting promotional offers on unhealthy food and drinks.
Professor Linda Bauld, Cancer Research UK's prevention expert, commented: "There isn't a silver bullet to reduce obesity, but the huge fall in smoking over the years -- partly thanks to advertising and environmental bans -- shows that Government-led change works. It was needed to tackle sky-high smoking rates, and now the same is true for obesity."
Read more at Science Daily
HIV eliminated from the genomes of living animals
"Our study shows that treatment to suppress HIV replication and gene editing therapy, when given sequentially, can eliminate HIV from cells and organs of infected animals," said Kamel Khalili, PhD, Laura H. Carnell Professor and Chair of the Department of Neuroscience, Director of the Center for Neurovirology, and Director of the Comprehensive NeuroAIDS Center at the Lewis Katz School of Medicine at Temple University (LKSOM). Dr. Khalili and Howard Gendelman, MD, Margaret R. Larson Professor of Infectious Diseases and Internal Medicine, Chair of the Department of Pharmacology and Experimental Neuroscience and Director of the Center for Neurodegenerative Diseases at UNMC, were senior investigators on the new study.
"This achievement could not have been possible without an extraordinary team effort that included virologists, immunologists, molecular biologists, pharmacologists, and pharmaceutical experts," Dr. Gendelman said. "Only by pooling our resources together were we able to make this groundbreaking discovery."
Current HIV treatment focuses on the use of antiretroviral therapy (ART). ART suppresses HIV replication but does not eliminate the virus from the body. Therefore, ART is not a cure for HIV, and it requires life-long use. If it is stopped, HIV rebounds, renewing replication and fueling the development of AIDS. HIV rebound is directly attributed to the ability of the virus to integrate its DNA sequence into the genomes of cells of the immune system, where it lies dormant and beyond the reach of antiretroviral drugs.
In previous work, Dr. Khalili's team used CRISPR-Cas9 technology to develop a novel gene editing and gene therapy delivery system aimed at removing HIV DNA from genomes harboring the virus. In rats and mice, they showed that the gene editing system could effectively excise large fragments of HIV DNA from infected cells, significantly impacting viral gene expression. Similar to ART, however, gene editing cannot completely eliminate HIV on its own.
For the new study, Dr. Khalili and colleagues combined their gene editing system with a recently developed therapeutic strategy known as long-acting slow-effective release (LASER) ART. LASER ART was co-developed by Dr. Gendelman and Benson Edagwa, PhD, Assistant Professor of Pharmacology at UNMC.
LASER ART targets viral sanctuaries and maintains HIV replication at low levels for extended periods of time, reducing the frequency of ART administration. The long-lasting medications were made possible by pharmacological changes in the chemical structure of the antiretroviral drugs. The modified drug was packaged into nanocrystals, which readily distribute to tissues where HIV is likely to be lying dormant. From there, the nanocrystals, stored within cells for weeks, slowly release the drug.
According to Dr. Khalili, "We wanted to see whether LASER ART could suppress HIV replication long enough for CRISPR-Cas9 to completely rid cells of viral DNA."
To test their idea, the researchers used mice engineered to produce human T cells susceptible to HIV infection, permitting long-term viral infection and ART-induced latency. Once infection was established, mice were treated with LASER ART and subsequently with CRISPR-Cas9. At the end of the treatment period, mice were examined for viral load. Analyses revealed complete elimination of HIV DNA in about one-third of HIV-infected mice.
Read more at Science Daily
Atmosphere of midsize planet revealed by Hubble, Spitzer
The planet, Gliese 3470 b (also known as GJ 3470 b), may be a cross between Earth and Neptune, with a large rocky core buried under a deep, crushing hydrogen-and-helium atmosphere. Weighing in at 12.6 Earth masses, the planet is more massive than Earth but less massive than Neptune (which is more than 17 Earth masses).
Many similar worlds have been discovered by NASA's Kepler space observatory, whose mission ended in 2018. In fact, 80% of the planets in our galaxy may fall into this mass range. Until now, however, astronomers had never been able to determine the chemical nature of such a planet, researchers say.
By inventorying the contents of GJ 3470 b's atmosphere, astronomers are able to uncover clues about the planet's nature and origin.
"This is a big discovery from the planet-formation perspective. The planet orbits very close to the star and is far less massive than Jupiter -- 318 times Earth's mass -- but has managed to accrete the primordial hydrogen/helium atmosphere that is largely 'unpolluted' by heavier elements," said Björn Benneke of the University of Montreal in Canada. "We don't have anything like this in the solar system, and that's what makes it striking."
Astronomers enlisted the combined multi-wavelength capabilities of NASA's Hubble and Spitzer space telescopes to carry out a first-of-its-kind study of GJ 3470 b's atmosphere.
This was accomplished by measuring the absorption of starlight as the planet passed in front of its star (transit) and the loss of reflected light from the planet as it passed behind the star (eclipse). All told, the space telescopes observed 12 transits and 20 eclipses. The science of analyzing chemical fingerprints based on light is called "spectroscopy."
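To make the transit measurement concrete -- a minimal sketch with rough, assumed values rather than the study's fitted numbers -- the fraction of starlight blocked during a transit is approximately the squared ratio of planet radius to star radius, and an atmosphere reveals itself because that effective radius, and hence the transit depth, changes slightly with wavelength:

    # Illustrative transit-depth estimate: depth ~ (R_planet / R_star)**2.
    # The radii below are rough, assumed values for GJ 3470 b and its red dwarf
    # host, not the values fitted in the study.
    R_EARTH_KM = 6371.0
    R_SUN_KM = 695700.0

    r_planet = 4.6 * R_EARTH_KM   # assumed planet radius
    r_star = 0.55 * R_SUN_KM      # assumed stellar radius

    depth = (r_planet / r_star) ** 2
    print(f"baseline transit depth: about {depth * 100:.2f}% of the star's light")

    # A clear hydrogen/helium atmosphere adds a wavelength-dependent sliver of
    # extra blocked area; even ~2% more effective radius is measurable.
    depth_with_atmosphere = ((1.02 * r_planet) / r_star) ** 2
    print(f"extra depth from the atmosphere: about "
          f"{(depth_with_atmosphere - depth) * 1e6:.0f} parts per million")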
"For the first time we have a spectroscopic signature of such a world," said Benneke. But he is at a loss for classification: Should it be called a "super-Earth" or "sub-Neptune?" Or perhaps something else?
Fortuitously, the atmosphere of GJ 3470 b turned out to be mostly clear, with only thin hazes, enabling the scientists to probe deep into the atmosphere.
"We expected an atmosphere strongly enriched in heavier elements like oxygen and carbon which are forming abundant water vapor and methane gas, similar to what we see on Neptune," said Benneke. "Instead, we found an atmosphere that is so poor in heavy elements that its composition resembles the hydrogen/helium-rich composition of the Sun."
Other exoplanets, called "hot Jupiters," are thought to form far from their stars and over time migrate much closer. But this planet seems to have formed just where it is today, said Benneke.
The most plausible explanation, according to Benneke, is that GJ 3470 b was born precariously close to its red dwarf star, which is about half the mass of our Sun. He hypothesizes that essentially it started out as a dry rock and rapidly accreted hydrogen from a primordial disk of gas when its star was very young. The disk is called a "protoplanetary disk."
"We're seeing an object that was able to accrete hydrogen from the protoplanetary disk but didn't run away to become a hot Jupiter," said Benneke. "This is an intriguing regime."
One explanation is that the disk dissipated before the planet could bulk up further. "The planet got stuck being a sub-Neptune," said Benneke.
NASA's upcoming James Webb Space Telescope will be able to probe even deeper into GJ 3470 b's atmosphere, thanks to Webb's unprecedented sensitivity in the infrared. The new results have already spawned great interest from American and Canadian teams developing the instruments on Webb. They will observe the transits and eclipses of GJ 3470 b at light wavelengths where the atmospheric hazes become increasingly transparent.
The Hubble Space Telescope is a project of international cooperation between NASA and ESA (European Space Agency). NASA's Goddard Space Flight Center in Greenbelt, Maryland, manages the telescope. The Space Telescope Science Institute (STScI) in Baltimore, Maryland, conducts Hubble science operations. STScI is operated for NASA by the Association of Universities for Research in Astronomy in Washington, D.C.
Read more at Science Daily
Jul 2, 2019
Earliest example of merging galaxies
Researchers using the radio telescope ALMA (Atacama Large Millimeter/submillimeter Array) observed signals of oxygen, carbon, and dust from a galaxy in the early Universe 13 billion years ago. This is the earliest galaxy where this useful combination of three signals has been detected. By comparing the different signals, the team determined that the galaxy is actually two galaxies merging together, making it the earliest example of merging galaxies yet discovered.
Takuya Hashimoto, a postdoctoral researcher at the Japan Society for the Promotion of Science and Waseda University, Japan, and his team used ALMA to observe B14-65666, an object located 13 billion light-years away in the constellation Sextans. Because of the finite speed of light, the signals we receive from B14-65666 today had to travel for 13 billion years to reach us. In other words, they show us what the galaxy looked like 13 billion years ago, less than 1 billion years after the Big Bang.
ALMA detected radio emissions from oxygen, carbon, and dust in B14-65666. This is the earliest galaxy where all three of these signals have been detected. The detection of multiple signals is important because they carry complementary information.
Data analysis showed that the emissions are divided into two blobs. Previous observations with the Hubble Space Telescope (HST) had revealed two star clusters in B14-65666. Now, with the three emission signals detected by ALMA, the team was able to show that the two blobs do in fact form a single system, but that they are moving at different speeds. This indicates that the blobs are two galaxies in the process of merging, making this the earliest known example of merging galaxies. The research team estimated that the total stellar mass of B14-65666 is less than 10% that of the Milky Way, which means that B14-65666 is in the earliest phases of its evolution. Despite its youth, B14-65666 is producing stars 100 times more actively than the Milky Way. Such active star formation is another important signature of galactic mergers, because the gas compression in colliding galaxies naturally leads to bursty star formation.
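A quick arithmetic check shows why those two figures point to an unusually vigorous system -- a simple inference from the numbers quoted above, not a calculation from the paper: forming stars about 100 times faster with less than a tenth of the Milky Way's stellar mass implies a star-formation rate per unit stellar mass at least roughly 1,000 times higher than our galaxy's.

    # Specific star-formation rate (SFR per unit stellar mass) relative to the
    # Milky Way, using only the ratios quoted in the article.
    sfr_ratio = 100.0          # B14-65666 forms stars ~100x faster than the Milky Way
    stellar_mass_ratio = 0.10  # its stellar mass is < 10% of the Milky Way's

    ssfr_ratio = sfr_ratio / stellar_mass_ratio
    print(f"specific star-formation rate: at least ~{ssfr_ratio:.0f}x the Milky Way's")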
"With rich data from ALMA and HST, combined with advanced data analysis, we could put the pieces together to show that B14-65666 is a pair of merging galaxies in the earliest era of the Universe," explains Hashimoto. "Detection of radio waves from three components in such a distant object clearly demonstrates ALMA's high capability to investigate the distant Universe."
Modern galaxies like our Milky Way have experienced countless, often violent, mergers. Sometimes a larger galaxy swallowed a smaller one. In rare cases, galaxies with similar sizes merged to form a new, larger galaxy. Mergers are essential for galaxy evolution, so many astronomers are eager to trace back the history of mergers.
Read more at Science Daily
Wood products mitigate less than one percent of global carbon emissions
An analysis across 180 countries found that global wood products offset 335 million tons of carbon dioxide in 2015, 71 million tons of which were unaccounted for under current United Nations standards. Wood product carbon sequestration could rise more than 100 million tons by 2030, depending on the level of global economic growth.
The results provide countries with the first consistent look at how their timber industries could offset their carbon emissions as nations search for ways to keep climate change manageable by severely curbing emissions.
Yet the new research also highlights how wood products account for just a small fraction of the needed offsets for all but a select few timber-heavy countries.
Craig Johnston, a professor of forest economics at the University of Wisconsin-Madison, and Volker Radeloff, a UW-Madison professor of forest and wildlife ecology, published their findings July 1 in the Proceedings of the National Academy of Sciences.
"Countries are looking for net-negative emissions strategies. So it's not just about lowering our emissions but pursuing strategies that might have storage potential, and harvested wood products are one of those options," says Johnston. "It's nice because you can pursue options that don't hinder growth. The question is, can we continue to consume wood products and have climate change benefits associated with that consumption?"
To address that question, Johnston worked with Radeloff to develop a consistent, international analysis of the carbon storage potential of these products, which countries must now account for under the global Paris Agreement to reduce carbon emissions.
They used data on lumber harvests and wood product production from 1961 to 2015, the most recent year available, from the U.N. Food and Agriculture Organization. The researchers modeled future carbon sequestration in wood products using five broad models of possible economic and population growth, the two factors that most affect demand for these products.
Although the production of wood products in 2015 offset less than 1 percent of global carbon emissions, the proportion was much higher for a handful of countries with large timber industries. Sweden's pool of wood products, for example, offset 9 percent of the country's carbon emissions in 2015, which accounted for 72 percent of emissions from industrial sources that year.
But for most countries, including the U.S., wood products mitigated a much smaller fraction of overall emissions in 2015, and this proportion is not expected to increase significantly through 2065, the researchers found.
Current U.N. guidelines only allow countries to count the carbon stored in wood products created from domestic timber harvests, not the timber grown locally and shipped internationally, nor products produced from imported lumber. These regulations create a gap between the actual amount of carbon stored in the world's wood products and what is officially counted.
In 2015, that gap amounted to 71 million tons of carbon dioxide, equivalent to the emissions from 15 million cars. If those guidelines remain unchanged, by 2065 another 50 million tons of carbon dioxide may go unaccounted for due to this gap. But this additional, uncounted carbon does not significantly increase the proportion of global emissions offset by wood products.
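As a rough sanity check on that comparison -- simple arithmetic from the figures above, assuming "cars" means typical passenger vehicles -- 71 million tons of CO2 spread across 15 million cars works out to a little under 5 tons per car per year, which is in line with commonly cited per-vehicle annual emissions.

    # Back-of-envelope check: implied annual CO2 emissions per car.
    unaccounted_co2_tons = 71e6   # tons of CO2 uncounted under current UN rules (2015)
    cars_equivalent = 15e6        # number of cars quoted in the article

    tons_per_car = unaccounted_co2_tons / cars_equivalent
    print(f"implied emissions: about {tons_per_car:.1f} tons of CO2 per car per year")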
Johnston and Radeloff also found that the level of carbon stored in wood products is extremely sensitive to economic conditions. Slow or negative growth could significantly reduce the amount of carbon offset by these industries.
"As wood products are produced, you're adding to this carbon pool in the country, but these products do eventually decay. There's carbon emissions today from furniture or lumber that was produced 50 or 75 years ago," says Johnston. "So if we're not producing at a rate that at least offsets those emissions, then we'll actually see that carbon pool become a net source of emissions."
For example, the Great Recession in 2008 and 2009 turned America's wood products from a net sink of carbon into a net emitter. A similar effect released millions of tons of carbon dioxide from wood products for years after the Soviet Union collapsed, Johnston and Radeloff found.
All five of the study's projections for future economic growth predict that more carbon will be captured in wood products, but unforeseen economic shocks could temporarily reverse that trend for particular countries.
The current study offers a chance to assess current obligations and help countries predict future emissions. The results may also inform the next round of emissions targets and negotiations, the researchers say.
Read more at Science Daily
Physicists use light waves to accelerate supercurrents, enable ultrafast quantum computing
Jigang Wang first described his latest discovery in the dense technical language of quantum physics. Then he backed up and clarified all that. After all, the quantum world of matter and energy at terahertz and nanometer scales -- trillions of cycles per second and billionths of meters -- is still a mystery to most of us.
"I like to study quantum control of superconductivity exceeding the gigahertz, or billions of cycles per second, bottleneck in current state-of-the-art quantum computation applications," said Wang, a professor of physics and astronomy at Iowa State University whose research has been supported by the Army Research Office. "We're using terahertz light as a control knob to accelerate supercurrents."
Superconductivity is the movement of electricity through certain materials without resistance. It typically occurs at very, very cold temperatures. Think -400 Fahrenheit for "high-temperature" superconductors.
Terahertz light is light at very, very high frequencies. Think trillions of cycles per second. It's essentially extremely strong and powerful microwave bursts firing at very short time frames.
Wang and a team of researchers demonstrated that such light can be used to control some of the essential quantum properties of superconducting states, including macroscopic supercurrent flow, broken symmetry, and access to certain very high frequency quantum oscillations thought to be forbidden by symmetry.
It all sounds esoteric and strange. But it could have very practical applications.
"Light-induced supercurrents chart a path forward for electromagnetic design of emergent materials properties and collective coherent oscillations for quantum engineering applications," Wang and several co-authors wrote in a research paper just published online by the journal Nature Photonics.
In other words, the discovery could help physicists "create crazy-fast quantum computers by nudging supercurrents," Wang wrote in a summary of the research team's findings.
Finding ways to control, access and manipulate the special characteristics of the quantum world and connect them to real-world problems is a major scientific push these days. The National Science Foundation has included the "Quantum Leap" in its "10 big ideas" for future research and development.
"By exploiting interactions of these quantum systems, next-generation technologies for sensing, computing, modeling and communicating will be more accurate and efficient," says a summary of the science foundation's support of quantum studies. "To reach these capabilities, researchers need understanding of quantum mechanics to observe, manipulate and control the behavior of particles and energy at dimensions at least a million times smaller than the width of a human hair."
Wang and his collaborators -- Xu Yang, Chirag Vaswani and Liang Luo from Iowa State, responsible for terahertz instrumentation and experiments; Chris Sundahl, Jong-Hoon Kang and Chang-Beom Eom from the University of Wisconsin-Madison, responsible for high-quality superconducting materials and their characterizations; Martin Mootz and Ilias E. Perakis from the University of Alabama at Birmingham, responsible for model building and theoretical simulations -- are advancing the quantum frontier by finding new macroscopic supercurrent flowing states and developing quantum controls for switching and modulating them.
A summary of the research team's study says experimental data obtained from a terahertz spectroscopy instrument indicates terahertz light-wave tuning of supercurrents is a universal tool "and is key for pushing quantum functionalities to reach their ultimate limits in many cross-cutting disciplines" such as those mentioned by the science foundation.
Read more at Science Daily
Evolution of life in the ocean changed 170 million years ago
Until about 170 million years ago, the success of organisms living within the marine environment had been strongly controlled by non-biological factors, including ocean chemistry and climate.
However, from the middle of the Jurassic period onwards (some 170 million years ago), biological factors such as predator-prey relationships became increasingly important.
Writing in Nature Geoscience, scientists say this change coincided with the proliferation of calcium carbonate-secreting plankton and their subsequent deposition on the ocean floor.
They believe the rise of this plankton stabilised the chemical composition of the ocean and provided the conditions for one of the most prominent diversifications of marine life in Earth's history.
The research was led by academics from the University of Plymouth's School of Geography, Earth and Environmental Sciences and School of Computing, Electronics and Mathematics, in cooperation with scientists from the University of Bergen in Norway, and the University of Erlangen-Nuremberg in Germany.
PhD candidate Kilian Eichenseer, the study's lead author, explained the impact of calcifying plankton: "Today, huge areas of the ocean floor are covered with the equivalent of chalk, made up of microscopic organisms that rose to dominance in the middle of the Jurassic period. The chalky mass helps to balance out the acidity of the ocean and, with that balance in place, organisms are less at the mercy of short-term perturbations of ocean chemistry than they might have been previously. It is easier to secrete a shell, regardless of its mineralogy, if the ocean chemistry is stable."
The aim of the research was to test the hypothesis that the evolutionary importance of the non-biological environment had declined through geological time.
Since its emergence more than 540 million years ago, multicellular life evolved under the influence of both the non-biological and the biological environment, but how the balance between these factors changed remained largely unknown.
Calcified seashells provide an ideal test to answer this question, as aragonite and calcite -- the minerals making up seashells -- also form non-biologically in the ocean.
In their study, the authors used the vast global fossil record of marine organisms that secreted calcium carbonate, which encompasses more than 400,000 samples dating from around 500 million years ago to 10,000 years BC.
Using reconstructions of the temperature and the ocean water composition of the past, the authors estimated the proportion of aragonite and calcite that formed inorganically in the ocean in 85 geological stages across 500 million years.
Through a series of specially developed statistical analyses, this inorganic pattern of aragonite-calcite seas was then compared with seashell mineral composition over the same time.
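A heavily simplified sketch of one way such a comparison could look -- purely illustrative, and not the authors' specially developed analyses -- is to compare, stage by stage, the inorganically favoured aragonite fraction with the fraction of shell-builders secreting aragonite, and ask how strongly the two track each other before and after 170 million years ago. The per-stage values below are random placeholders. Requires NumPy and SciPy:

    # Illustrative only: correlate environmental and biological aragonite
    # fractions per geological stage, split at 170 million years ago.
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(1)
    stage_age_ma = np.linspace(500, 0, 85)       # 85 stages, ages in millions of years
    inorganic_aragonite = rng.uniform(0, 1, 85)  # placeholder environmental estimates
    # placeholder shell data: tracks the environment closely only in older stages
    biotic_aragonite = np.where(
        stage_age_ma > 170,
        inorganic_aragonite + rng.normal(0, 0.05, 85),
        rng.uniform(0, 1, 85),
    )

    old = stage_age_ma > 170
    rho_old, _ = spearmanr(inorganic_aragonite[old], biotic_aragonite[old])
    rho_young, _ = spearmanr(inorganic_aragonite[~old], biotic_aragonite[~old])
    print(f"rank correlation before 170 Ma: {rho_old:.2f}, after: {rho_young:.2f}")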
The results show that up until the middle of the Jurassic period, around 170 million years ago, the ecological success of shell-secreting marine organisms was tightly coupled to their shell composition: organisms that secreted the mineral that was environmentally favoured had an evolutionary advantage.
However, the Earth-Life system was revolutionised forever by the rise of calcifying plankton, which expanded the production of calcium carbonate from continental shelves to the open ocean.
This ensured that the evolutionary impact of episodes of severe climate changes, and resulting ocean acidification, was less severe than comparable events earlier in Earth history.
Read more at Science Daily
Jul 1, 2019
Blue color tones in fossilized prehistoric feathers
Examining fossilised pigments, scientists from the University of Bristol have uncovered new insights into blue colour tones in prehistoric birds.
For some time, paleontologists have known that melanin pigment can preserve in fossils and have been able to reconstruct fossil colour patterns.
Melanin pigment gives black, reddish brown and grey colours to birds and is involved in creating bright iridescent sheens in bird feathers.
This can be observed by studying the melanin packages called melanosomes -- little cylindrical objects less than one-thousandth of a millimetre long that vary in shape from sausages to little meatballs.
However, besides iridescent colours, which are structural, birds also make non-iridescent structural colours.
Those are, for example, blue colour tones in parrots and kingfishers. Until now, it was not known if such colours could be discovered in fossils.
This blue structural colour is created by the dense arrangement of cavities inside feathers, which scatters the blue light. Underneath is a layer of melanin that absorbs unscattered light.
Paleontologists have shown that the feather itself, which is made of keratin, does not fossilise while the melanin does. Therefore, if a blue feather fossilised, the dark pigment may be the only surviving feature and the feather may be interpreted as black or brown.
Now researchers from the University of Bristol, led by Frane Barbarovic, who is currently at the University of Sheffield, have shown that blue feather melanosomes are highly distinct from melanosomes from feathers expressing black, reddish-brown, brown and iridescent colours, but overlap significantly with some grey feather melanosomes.
By looking at the plumage colours of modern relatives of the fossil specimens and reconstructing which colour was most likely present in each fossil, they were able to discriminate between melanosomes associated with grey and with blue colour, leading to the reconstruction of the prehistoric bird Eocoracias brachyptera as predominantly blue.
Frane Barbarovic said: "We have discovered that melanosomes in blue feathers have a distinct range in size from most of colour categories and we can, therefore, constrain which fossils may have been blue originally.
"The overlap with grey colour may suggest some common mechanism in how melanosomes are involved in making grey colouration and how these structural blue colours are formed.
"Based on these results in our publication we have also hypothesized potential evolutionary transition between blue and grey colour."
The research team now need to understand which birds are more likely to be blue based on their ecologies and modes of life. The blue colour is common in nature, but the ecology of this colour and its function in the life of birds are still elusive.
Read more at Science Daily
Neanderthals made repeated use of the ancient settlement of 'Ein Qashish, Israel
The archaeological site of 'Ein Qashish in northern Israel was a place of repeated Neanderthal occupation and use during the Middle Paleolithic, according to a study released June 26, 2019 in the open-access journal PLOS ONE by Ravid Ekshtain of the Hebrew University of Jerusalem and colleagues.
In the Levant region of the Middle East, the main source of information on Middle Paleolithic human occupation comes from cave sites. Compared to open-air settlements, sheltered sites like caves were easily recognized and often visited, and are therefore more likely to record long periods of occupation. The open-air site of 'Ein Qashish in northern Israel, however, is unusual in having been inhabited over an extended prehistoric time period. This site provides a unique opportunity to explore an open-air locality across a large landscape and over a long period, ranging between 71,000 and 54,000 years ago.
In collaboration with the Israel Antiquities Authority, Ekshtain and colleagues identified human skeletal remains at 'Ein Qashish as Neanderthal and observed more than 12,000 artifacts from four different depositional units in the same location on the landscape. These units represent different instances of occupation during changing environmental conditions.
From modification of artifacts and animal bones at the site, the authors infer that the occupants were knapping tools, provisioning resources, and consuming animals on-site.
Whereas many open-air settlements are thought to be short-lived and chosen for specialized tasks, 'Ein Qashish appears to be the site of repeated occupations each of which hosted a range of general activities, indicating a stable and consistent settlement system. The authors suggest that within a complex settlement system, open-air sites may have been more important for prehistoric humans than previously thought.
Ekshtain adds: "Ein Qashish is a 70-60 thousand years open-air site, with a series of stratified human occupations in a dynamic flood plain environment. The site stands out in the extensive excavated area and some unique finds for an open-air context, from which we deduce the diversity of human activities on the landscape. In contrast to other known open-air sites, the locality was not used for task-specific activities but rather served time and again as a habitation location. The stratigraphy, dates and finds from the site allow a reconstruction of a robust settlement system of the late Neanderthals in northern Israel slightly before their disappearance from the regional record, raising questions about the reasons for their disappearance and about their interactions with contemporaneous modern humans."
From Science Daily
Bird three times larger than ostrich discovered in Crimean cave
A surprise discovery in a Crimean cave suggests that early Europeans lived alongside some of the largest ever known birds, according to new research published in the Journal of Vertebrate Paleontology.
It was previously thought that such gigantism in birds existed only on the islands of Madagascar and New Zealand, as well as Australia. The newly described specimen, discovered in the Taurida Cave on the northern coast of the Black Sea, suggests a bird as large as the Madagascan elephant bird or the New Zealand moa. It may have been a source of meat, bones, feathers and eggshell for early humans.
"When I first felt the weight of the bird whose thigh bone I was holding in my hand, I thought it must be a Malagasy elephant bird fossil because no birds of this size have ever been reported from Europe. However, the structure of the bone unexpectedly told a different story," says lead author Dr Nikita Zelenkov from the Russian Academy of Sciences.
"We don't have enough data yet to say whether it was most closely related to ostriches or to other birds, but we estimate it weighed about 450kg. This formidable weight is nearly double the largest moa, three times the largest living bird, the common ostrich, and nearly as much as an adult polar bear."
It is the first time a bird of such size has been reported from anywhere in the northern hemisphere. Although the species was previously known, no one had tried to calculate the size of this animal. The flightless bird, attributed to the species Pachystruthio dmanisensis, was probably at least 3.5 metres tall and would have towered above early humans. It may have been flightless, but it was also fast.
While elephant birds were hampered by their great size when it came to speed, the femur of this bird was relatively long and slim, suggesting it was a better runner. The femur is comparable to those of modern ostriches as well as smaller species of moa and terror birds. Speed may have been essential to the bird's survival. Alongside its bones, palaeontologists found fossils of highly specialised, massive carnivores from the Ice Age, including giant cheetahs, giant hyenas and sabre-toothed cats, which were able to prey on mammoths.
Other fossils discovered alongside the specimen, such as bison, help date it to 1.5 to 2 million years ago. A similar range of fossils was discovered at an archaeological site in the town of Dmanisi in Georgia, the oldest hominin site outside Africa. Although the species had previously been neglected by science, this suggests the giant bird may have been typical of the animals present when the first hominins arrived in Europe. The authors suggest it reached the Black Sea region via the Southern Caucasus and Turkey.
The bird's body mass was reconstructed by applying several formulae to measurements of the femur, giving an estimate of around 450 kg. Such gigantism may have originally evolved in response to the environment, which was increasingly arid as the Pleistocene epoch approached. Animals with a larger body mass have lower metabolic demands and can therefore make use of less nutritious food growing in open steppes.
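As a rough illustration of that calculation, the sketch below applies a generic power-law (allometric) relationship, mass = a × circumference^b, to a placeholder femur measurement; the coefficients and the measurement are invented stand-ins, not the formulae or values used in the paper.

```python
# Illustrative sketch only: a generic allometric body-mass estimate from a femur measurement.
# The coefficients and the femur circumference below are invented placeholders.

def estimate_mass_kg(femur_circumference_mm, a, b):
    """Power-law allometric estimate: mass (kg) = a * circumference(mm) ** b."""
    return a * femur_circumference_mm ** b

# Hypothetical regression fits (a, b) standing in for the paper's 'several formulae'
hypothetical_formulae = {
    "formula_A": (0.00027, 2.40),
    "formula_B": (0.00042, 2.33),
    "formula_C": (0.00018, 2.47),
}

femur_circumference_mm = 390.0  # placeholder measurement, not the fossil's actual value

estimates = {
    name: estimate_mass_kg(femur_circumference_mm, a, b)
    for name, (a, b) in hypothetical_formulae.items()
}
for name, mass in estimates.items():
    print(f"{name}: {mass:.0f} kg")

# Averaging several independent estimates damps the uncertainty in any single regression.
print(f"mean estimate: {sum(estimates.values()) / len(estimates):.0f} kg")
```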
Read more at Science Daily
Researchers decipher the history of supermassive black holes in the early universe
Astrophysicists at Western University have found evidence for the direct formation of black holes that do not need to emerge from a star remnant. The production of black holes in the early universe, formed in this manner, may provide scientists with an explanation for the presence of extremely massive black holes at a very early stage in the history of our universe.
Shantanu Basu and Arpan Das from Western's Department of Physics & Astronomy have developed an explanation for the observed distribution of supermassive black hole masses and luminosities, for which there was previously no scientific explanation. The findings were published today in Astrophysical Journal Letters.
The model is based on a very simple assumption: supermassive black holes form very, very quickly over very, very short periods of time and then, suddenly, they stop forming. This explanation contrasts with the current understanding of how stellar-mass black holes form, which is that they emerge when the centre of a very massive star collapses in upon itself.
"This is indirect observational evidence that black holes originate from direct-collapses and not from stellar remnants," says Basu, an astronomy professor at Western who is internationally recognized as an expert in the early stages of star formation and protoplanetary disk evolution.
Basu and Das developed the new mathematical model by calculating the mass function of supermassive black holes that form over a limited time period and undergo rapid exponential growth in mass. The growth rate can be regulated by the Eddington limit, which is set by a balance between radiation and gravitational forces, or can even exceed it by a modest factor.
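To make the exponential-growth scaling concrete, here is a minimal sketch of Eddington-limited accretion; the seed mass, radiative efficiency and growth window are assumed illustrative values, and this is not the authors' model code.

```python
# Minimal sketch of Eddington-limited exponential black-hole growth (not the authors' code).
# The seed mass, radiative efficiency and growth window are assumed illustrative values.
import math

G       = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c       = 2.998e8     # speed of light, m/s
m_p     = 1.673e-27   # proton mass, kg
sigma_T = 6.652e-29   # Thomson cross-section, m^2
YEAR    = 3.156e7     # seconds per year

def mass_after(m_seed_msun, t_yr, efficiency=0.1, eddington_factor=1.0):
    """Mass (solar masses) after t_yr years of accretion at a fixed multiple of the Eddington rate.

    dM/dt = eddington_factor * (1 - eps)/eps * L_Edd / c^2, and since L_Edd is proportional
    to M this integrates to exponential growth with e-folding (Salpeter) time
    t_fold = eps/(1 - eps) * sigma_T * c / (4 pi G m_p) / eddington_factor (~50 Myr for eps = 0.1).
    """
    t_edd = sigma_T * c / (4.0 * math.pi * G * m_p)   # ~450 Myr, in seconds
    t_fold = (efficiency / (1.0 - efficiency)) * t_edd / eddington_factor
    return m_seed_msun * math.exp(t_yr * YEAR / t_fold)

# An assumed 1e5 solar-mass direct-collapse seed growing for 500 million years
print(f"{mass_after(1e5, 5e8):.1e} solar masses at the Eddington limit")
print(f"{mass_after(1e5, 5e8, eddington_factor=1.5):.1e} solar masses at modestly super-Eddington rates")
```

Under these assumed numbers the seed reaches roughly a billion solar masses within the growth window, which is the kind of scaling that makes the direct-collapse scenario attractive for the very early, very massive black holes discussed below.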
"Supermassive black holes only had a short time period where they were able to grow fast and then at some point, because of all the radiation in the universe created by other black holes and stars, their production came to a halt," explains Basu. "That's the direct-collapse scenario."
During the last decade, many supermassive black holes a billion times more massive than the Sun have been discovered at high 'redshifts,' meaning they were already in place within 800 million years after the Big Bang. The presence of these young and very massive black holes calls our understanding of black hole formation and growth into question. The direct-collapse scenario allows for initial masses that are much greater than implied by the standard stellar-remnant scenario, and can go a long way towards explaining the observations. This new result provides evidence that such direct-collapse black holes were indeed produced in the early universe.
Read more at Science Daily
Jun 30, 2019
When the dinosaurs died, lichens thrived
Lichens on rock.
"We thought that lichens would be affected negatively, but in the three groups we looked at, they seized the chance and diversified rapidly," says Jen-Pang Huang, the paper's first author, a former postdoctoral researcher at the Field Museum now at Academia Sinica in Taipei. "Some lichens grow sophisticated 3D structures like plant leaves, and these ones filled the niches of plants that died out."
The researchers got interested in studying the effects of the mass extinction on lichens after reading a paper about how the asteroid strike also caused many species of early birds to go extinct. "I read it on the train, and I thought, 'My god, the poor lichens, they must have suffered too, how can we trace what happened to them?'" says Thorsten Lumbsch, senior author on the study and the Field Museum's curator of lichenized fungi.
You've seen lichens a million times, even if you didn't realize it. "Lichens are everywhere," says Huang. "If you go on a walk in the city, the rough spots or gray spots you see on rocks or walls or trees, those are common crust lichens. On the ground, they sometimes look like chewing gum. And if you go into a more pristine forest, you can find orange, yellow, and vivid violet colors -- lichens are really pretty." They're what scientists call "symbiotic organisms" -- they're made up of two different life forms sharing one body and working together. They're a partnership between a fungus and an organism that can perform photosynthesis, making energy from sunlight -- either a tiny alga or a special kind of blue-green bacterium. Fungi, which include mushrooms and molds, are on their own branch of the tree of life, separate from plants and animals (and actually more closely related to us than to plants). The main role of fungi is to break down dead material.
During the mass extinction 66 million years ago, plants suffered since ash from the asteroid blocked out sunlight and lowered temperatures. But the mass extinction seemed to be a good thing for fungi -- they don't rely on sunlight for food and just need lots of dead stuff, and the fossil record shows an increase in fungal spores at this time. Since lichens contain a plant and a fungus, scientists wondered whether they were affected negatively like a plant or positively like a fungus.
"We originally expected lichens to be affected in a negative way, since they contain green things that need light," says Huang.
To see how lichens were affected by the mass extinction, the scientists had to get creative -- there aren't many fossil lichens from that time frame. But while the researchers didn't have lichen fossils, they did have lots of modern lichen DNA.
From observing fungi growing in lab settings, scientists know roughly how often genetic mutations show up in fungal DNA -- how frequently a letter in the DNA sequence accidentally gets switched during the copying process. That's called the mutation rate. And if you know the mutation rate, then by comparing the DNA sequences of two different species you can extrapolate how long ago they last shared a common ancestor.
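As a toy illustration of that reasoning, the sketch below compares two invented DNA fragments under an assumed substitution rate; neither the sequences nor the rate come from the lichen dataset.

```python
# Toy molecular-clock calculation: the sequences and rate below are invented for
# illustration, not data from the lichen study.

def divergence_time_myr(seq_a, seq_b, subs_per_site_per_myr):
    """Estimate how many million years ago two sequences shared a common ancestor.

    Differences accumulate along *both* lineages since the split, hence the factor of 2.
    """
    assert len(seq_a) == len(seq_b)
    diffs = sum(1 for a, b in zip(seq_a, seq_b) if a != b)
    diffs_per_site = diffs / len(seq_a)
    return diffs_per_site / (2.0 * subs_per_site_per_myr)

# Two short made-up DNA fragments and an assumed rate of 1e-3 substitutions per site per Myr
seq_a = "ATGGCTAACGTTACGGATTCAGGC"
seq_b = "ATGGCTGACGTTACAGATTCAGGT"

print(f"~{divergence_time_myr(seq_a, seq_b, 1e-3):.0f} million years since the common ancestor")
```

Real analyses, like the one described next, feed many sequences into phylogenetic dating software and calibrate the resulting tree with fossils, but the underlying arithmetic is the same.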
The researchers fed DNA sequences of three families of lichens into a software program that compared their DNA and figured out what their family tree must look like, including estimates of how long ago it branched into the groups we see today. They bolstered this information with the few lichen fossils they did have, from 100 and 400 million years ago. And the results pointed to a lichen boom after 66 million years ago, at least for some of the leafier lichen families.
"Some groups don't show a change, so they didn't suffer or benefit from the changes to the environment," says Lumbsch, who in addition to his work on lichens is the Vice President of Science and Education at the Field. "Some lichens went extinct, and the leafy macrolichens filled those niches. I was really happy when I saw that not all the lichens suffered."
The results underline how profoundly the natural world we know today was shaped by this mass extinction. "If you could go back 40 million years, the most prominent groups in vegetation, birds, fungi -- they'd be more similar to what you see now than what you'd see 70 million years ago," says Lumbsch. "Most of what we see around us nowadays in nature originated after the dinosaurs."
And since this study shows how lichens responded to mass extinction 66 million years ago, it could shed light on how species will respond to the mass extinction the planet is currently undergoing. "Before we lose the world's biodiversity, we should document it, because we don't know when we'll need it," says Huang. "Lichens are environmental indicators -- by simply doing a biodiversity study, we can infer air quality and pollution levels."
Beyond the potential implications in understanding environmental impacts and mass extinctions, the researchers point to the ways the study deepens our understanding of the world around us.
"For me, it's fascinating because you would not be able to do this without large molecular datasets. This would have been impossible ten years ago," says Lumbsch. "It's another piece to the puzzle to understanding what's around us in nature."
"We expect a lot of patterns from studying other organisms, but fungi don't follow the pattern. Fungi are weird," says Huang. "They're really unpredictable, really diverse, really fun."
Read more at Science Daily
Scientists discover how plants breathe -- and how humans shaped their 'lungs'
Stomata in leaf.
Botanists have known since the 19th century that leaves have pores -- called stomata -- and contain an intricate internal network of air channels. But until now it wasn't understood how those channels form in the right places in order to provide a steady flow of CO2 to every plant cell.
The new study, led by scientists at the University of Sheffield's Institute for Sustainable Food and published in Nature Communications, used genetic manipulation techniques to reveal that the more stomata a leaf has, the more airspace it forms. The channels act like bronchioles -- the tiny passages that carry air to the exchange surfaces of human and animal lungs.
In collaboration with colleagues at the University of Nottingham and Lancaster University, they showed that the movement of CO2 through the pores most likely determines the shape and scale of the air channel network.
The discovery marks a major step forward in our understanding of the internal structure of a leaf, and how the function of tissues can influence how they develop -- which could have ramifications beyond plant biology, in fields such as evolutionary biology.
The study also shows that wheat plants have been bred by generations of people to have fewer pores on their leaves and fewer air channels, which makes their leaves more dense and allows them to be grown with less water.
This new insight highlights the potential for scientists to make staple crops like wheat even more water-efficient by altering the internal structure of their leaves. This approach is being pioneered by other scientists at the Institute for Sustainable Food, who have developed climate-ready rice and wheat which can survive extreme drought conditions.
Professor Andrew Fleming from the Institute for Sustainable Food at the University of Sheffield said: "Until now, the way plants form their intricate patterns of air channels has remained surprisingly mysterious to plant scientists.
"This major discovery shows that the movement of air through leaves shapes their internal workings -- which has implications for the way we think about evolution in plants.
"The fact that humans have already inadvertently influenced the way plants breathe by breeding wheat that uses less water suggests we could target these air channel networks to develop crops that can survive the more extreme droughts we expect to see with climate breakdown."
Dr Marjorie Lundgren, Leverhulme Early Career Fellow at Lancaster University, said: "Scientists have suspected for a long time that the development of stomata and the development of air spaces within a leaf are coordinated. However, we weren't really sure which drove the other. So this started as a 'what came first, the chicken or the egg?' question.
"Using a clever set of experiments involving X-ray CT image analyses, our collaborative team answered these questions using species with very different leaf structures. While we show that the development of stomata initiates the expansion of air spaces, we took it one step further to show that the stomata actually need to be exchanging gases in order for the air spaces to expand. This paints a much more interesting story, linked to physiology."
The X-ray imaging work was undertaken at the Hounsfield Facility at the University of Nottingham. The Director of the Facility, Professor Sacha Mooney, said: "Until recently the application of X-ray CT, or CAT scanning, in plant sciences has mainly been focused on visualising the hidden half of the plant -- the roots -- as they grow in soil.
"Working with our partners in Sheffield we have now developed the technique to visualise the cellular structure of a plant leaf in 3D -- allowing us to see how the complex network of air spaces inside the leaf controls its behaviour. It's very exciting."
Read more at Science Daily