The concept of implicit bias has made its way into the general consciousness, most often in the context of racial bias. More broadly, however, implicit biases can affect how people think of anything -- from their thoughts about cookies to those about white men.
"All the little ways in which our everyday thinking about social stuff is unconscious or uncontrollable," wrote Calvin Lai, assistant professor of psychology in Arts & Sciences at Washington University in St. Louis, in an article in DCist. "The stuff that we don't realize is influencing us when we make decisions."
Along with a broader cultural awareness of implicit bias has come the idea that the actions it influences can be changed by eliminating the bias itself.
Change the bias, and changes in behavior will follow. It seems logical enough.
If true, reducing implicit bias could be put to practical use for anything from ending discrimination (removing a bias in favor of white males) to losing weight (dialing down a cookie bias).
In a meta-analysis of research papers published on the subject of implicit bias, however, Lai found that the evidence does not show this kind of causal relationship.
The research is published in the Journal of Personality and Social Psychology.
Lai worked with Patrick Forscher, of the University of Arkansas, to systematically review 492 studies that dealt with changing people's "automatic mental processes," the uncontrollable, unconscious mental processes that have come to be known in particular contexts as "implicit bias."
The studies contained more than 87,000 participants. After crunching the numbers, Lai and Forscher saw that studies suggest biases can, in fact, be changed -- although not dramatically.
When they homed in on the 63 studies that explicitly considered a link between changes in bias and changes in actions, however, they found no evidence of a causal relationship.
"We definitely didn't expect this," Lai said. "And it challenges assumptions about the relationship between implicit bias and behavior."
Lai suggested four possible reasons that a link was not established in the meta-analysis:
Measurement errors: The way outcomes were measured may have picked up on changes unrelated to the underlying bias. For example, Lai said, such a measurement would be analogous to "moving the mercury around within a thermometer rather than changing the heat in the room."
Confounds: After the tests measuring an implicit bias, something else may have happened that changed the subjects' behavior, independent of any change in the bias itself.
A mismatch between bias and behavior measures: The two appeared to assess the same associations, but the bias measured may have been too general to capture the change relevant to the behavior. For example, the implicit bias measured concerned broad attitudes toward White versus Black people, while the behavior measured involved a specific person of a particular race. In that case, the attitude measured may have been too general.
No causal relationship: Implicit bias doesn't affect behavior at all.
This last option doesn't sit well with Lai. "It would open a theoretical can of worms because there are decades of experiments in other lines of research showing evaluation without conscious intention or control," he said.
However, Lai said there is a more effective way to change these behaviors, one that doesn't rely on changing people's implicit biases: ridding society of the features that cause people to act in a biased way.
For example, reducing subjectivity makes it more difficult for a person's biases to affect decision-making. Instead of relying on a "gut feeling" for a hiring decision, for example, lay out the requirements first, and stick to them.
Or, in the cookie realm, don't have any on hand -- not at home or at the office -- and don't drive past the bakery on the way home.
On an individual level, Lai said, "Equip people with strategies to resist the environment's biasing influence."
"The power of counterstereotypes is not to be underestimated," Lai wrote in a paper describing possible ways to counteract implicit biases. "And if counterstereotypical encounters become typical, shifts in attitudes and beliefs will follow."
Lai pointed out that this study was heavily constrained by the available literature. The studies included focused on brief interventions and assessments, and their samples were heavily skewed toward a certain demographic: university students.
Read more at Science Daily
Aug 3, 2019
3D printing the human heart
A team of researchers from Carnegie Mellon University has published a paper in Science that details a new technique allowing anyone to 3D bioprint tissue scaffolds out of collagen, the major structural protein in the human body. This first-of-its-kind method brings the field of tissue engineering one step closer to being able to 3D print a full-sized, adult human heart.
The technique, known as Freeform Reversible Embedding of Suspended Hydrogels (FRESH), has allowed the researchers to overcome many challenges associated with existing 3D bioprinting methods, and to achieve unprecedented resolution and fidelity using soft and living materials.
Each of the organs in the human body, such as the heart, is built from specialized cells that are held together by a biological scaffold called the extracellular matrix (ECM). This network of ECM proteins provides the structure and biochemical signals that cells need to carry out their normal function. However, until now it has not been possible to rebuild this complex ECM architecture using traditional biofabrication methods.
"What we've shown is that we can print pieces of the heart out of cells and collagen into parts that truly function, like a heart valve or a small beating ventricle," says Adam Feinberg, a professor of biomedical engineering (BME) and materials science & engineering at Carnegie Mellon, whose lab performed this work. "By using MRI data of a human heart, we were able to accurately reproduce patient-specific anatomical structure and 3D bioprint collagen and human heart cells."
Over 4000 patients in the United States are waiting for a heart transplant, while millions of others worldwide need hearts but are ineligible for the waitlist. The need for replacement organs is immense, and new approaches are needed to engineer artificial organs that are capable of repairing, supplementing, or replacing long-term organ function. Feinberg, who is a member of Carnegie Mellon's Bioengineered Organs Initiative, is working to solve these challenges with a new generation of bioengineered organs that more closely replicate natural organ structures.
"Collagen is an extremely desirable biomaterial to 3D print with because it makes up literally every single tissue in your body," explains Andrew Hudson, a BME Ph.D. student in Feinberg's lab and co-first author on the paper. "What makes it so hard to 3D print, however, is that it starts out as a fluid -- so if you try to print this in air it just forms a puddle on your build platform. So we've developed a technique that prevents it from deforming."
The FRESH 3D bioprinting method developed in Feinberg's lab allows collagen to be deposited layer-by-layer within a support bath of gel, giving the collagen a chance to solidify in place before it is removed from the support bath. With FRESH, the support gel can be easily melted away by heating the gel from room temperature to body temperature after the print is complete. This way, the researchers can remove the support gel without damaging the printed structure made of collagen or cells.
This method is truly exciting for the field of 3D bioprinting because it allows collagen scaffolds to be printed at the large scale of human organs. And it is not limited to collagen, as a wide range of other soft gels including fibrin, alginate, and hyaluronic acid can be 3D bioprinted using the FRESH technique, providing a robust and adaptable tissue engineering platform. Importantly, the researchers also developed open-source designs so that nearly anyone, from medical labs to high school science classes, can build and have access to low-cost, high-performance 3D bioprinters.
Looking forward, FRESH has applications in many aspects of regenerative medicine, from wound repair to organ bioengineering, but it is just one piece of a growing biofabrication field. "Really what we're talking about is the convergence of technologies," says Feinberg. "Not just what my lab does in bioprinting, but also from other labs and small companies in the areas of stem cell science, machine learning, and computer simulation, as well as new 3D bioprinting hardware and software."
Read more at Science Daily
Aug 2, 2019
Unexpected nut eating by gorillas
Scientists have observed a population of western lowland gorillas in Loango National Park, Gabon using their teeth to crack open the woody shells of Coula edulis nuts. The researchers combined direct feeding observations and mechanical tests of seed casings to show that gorillas may be taxing their teeth to their upper limits, year after year, to access this energy rich food source.
Despite their large body size, gorillas are known to have a vegetarian diet consisting almost exclusively of leafy vegetation and fruit. Their teeth are large and high-crested compared with those of other great apes, which is traditionally seen as an adaptation to spending a large amount of time chewing tough, fibrous plant material. In contrast, their teeth are not well adapted to eating hard objects, such as nuts encased in a woody shell, because the high crests on their molar teeth would be at risk of damage. "I was amazed when we first observed the nut eating by the gorillas," states Martha Robbins, senior author on the paper. "We can not only see it, but also hear it, as the shell gives way to the incredible strength of their bite. Gorillas obviously have large, powerful jaws, but we did not expect to see this because their teeth are not well-suited to such behavior."
The nuts of Coula edulis are encased in a hard, woody shell that takes around 271 kg of force to crack. Yet for the three months the nuts are available, the gorillas of Loango National Park concentrate their feeding on the energy-rich kernels, spending up to three hours a day chomping through nuts. This is surprising, as animals that eat very hard food items tend to have strong, rounded molars that act like a pestle and mortar and are very efficient at cracking brittle foods. Like other foliage eaters, gorillas have teeth with higher crests, providing extra cutting edges for slicing tough material. Under the monumental bite force required to crack nuts, teeth with sharp edges are prone to break, meaning they may be worn away quickly. The researchers were surprised to learn that the gorillas at Loango are regularly gambling with their teeth, taxing them close to their predicted mechanical limits. While some primates, like chimps, protect their teeth by using tools to crack open nuts, it appears that the gorillas at Loango National Park rely on brute strength to break through the woody shells of Coula edulis nuts. The fact they do this year after year indicates that gorilla teeth may be stronger than previously thought.
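A note on units: the "271 kg" figure is a mass expressed in kilograms-force, the convention many bite-force studies use. Converted to SI units under standard gravity (our arithmetic, not the paper's):

    F = m × g = 271 kg × 9.81 m/s² ≈ 2,660 N ≈ 2.7 kN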
The research also implies that western lowland gorillas have much greater dietary breadth than previously believed. The absence of nut-cracking behavior in other populations of western gorillas where the nuts are also present suggests the behavior may be cultural, acquired as gorillas observe and learn it from other group members. "The fact that this nut eating is observed in Loango but not in other forests in central Africa where the nut occurs stresses the importance of studying and conserving gorillas throughout the habitat where they are found," says Robbins.
Read more at Science Daily
Shining (star)light on the search for life
In the hunt for life on other worlds, astronomers scrutinize planets that are light-years away. They need ways to identify life from afar -- but what counts as good evidence?
Our own planet provides some inspiration. Microbes fill the air with methane; photosynthesizing plants expel oxygen. Perhaps these gases might be found wherever life has taken hold.
But on worlds very different from our own, putative signs of life can be stirred up by non-biological processes. To know a true sign when you see it, astronomer Kevin France at the University of Colorado, Boulder, says, you must look beyond the planet itself, all the way to the gleaming star it orbits.
To this end, France and his team designed the SISTINE mission. Flying on a sounding rocket for a 15-minute flight, it will observe far-off stars to help interpret signs of life on the planets that orbit them. The mission will launch from the White Sands Missile Range in New Mexico in the early morning hours of Aug. 5, 2019.
When Earth Is a Bad Example
Shortly after Earth formed 4.6 billion years ago, it was enveloped by a noxious atmosphere. Volcanoes spewed methane and sulfur. The air held up to 200 times more carbon dioxide than today's levels.
It took another billion and a half years for molecular oxygen, which contains two oxygen atoms, to enter the scene. It was a waste product, discarded by ancient bacteria through photosynthesis. But it kick-started what became known as the Great Oxidation Event, permanently changing Earth's atmosphere and paving the way for more complex lifeforms.
"We would not have large amounts of oxygen in our atmosphere if we didn't have that surface life," France said.
Oxygen is known as a biomarker: a chemical compound associated with life. Its presence in Earth's atmosphere hints at the lifeforms lurking below. But as sophisticated computer models have now shown, biomarkers on Earth aren't always so trustworthy for exoplanets, or planets orbiting stars elsewhere in the universe.
France points to M-dwarf stars to make this case. Smaller and cooler than our Sun, M-dwarfs account for nearly three-quarters of the Milky Way's stellar population. To understand the exoplanets that orbit them, scientists simulated Earth-sized planets circling M-dwarfs. Differences from Earth quickly emerged.
M-dwarfs generate intense ultraviolet light. When that light struck the simulated Earth-like planet, it ripped the carbon from carbon dioxide, leaving behind free molecular oxygen. UV light also broke up molecules of water vapor, releasing single oxygen atoms. The atmospheres created oxygen -- but without life.
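Written out schematically, the abiotic pathway follows textbook far-UV photochemistry (standard reactions, not equations quoted from the simulations themselves):

    CO2 + hv (wavelength < 200 nm) → CO + O    (photolysis of carbon dioxide)
    H2O + hv → H + OH                          (photolysis of water vapor)
    O + O + M → O2 + M                         (recombination; M is any third molecule carrying off excess energy)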
"We call these false-positive biomarkers," France said. "You can produce oxygen on an Earth-like planet through photochemistry alone."
Earth's lack of abiotically produced oxygen was a kind of fluke -- thanks, in part, to our interaction with our Sun. Exoplanet systems with different stars might be different. "If we think we understand a planet's atmosphere but don't understand the star it orbits, we're probably going to get things wrong," France said.
To Know a Planet, Study its Star
France and his team designed SISTINE to better understand host stars and their effects on exoplanet atmospheres. Short for Suborbital Imaging Spectrograph for Transition region Irradiance from Nearby Exoplanet host stars, SISTINE measures the high-energy radiation from these stars. With knowledge about host stars' spectra, scientists can better distinguish true biomarkers from false-positives on their orbiting planets.
To make these measurements, SISTINE uses a spectrograph, an instrument that separates light into its component parts.
"Spectra are like fingerprints," said Jane Rigby, an astrophysicist at NASA's Goddard Space Flight Center in Greenbelt, Maryland, who uses the methodology. "It's how we find out what things are made of, both on our planet and as we look out into the universe."
SISTINE measures spectra in wavelengths from 100 to 160 nanometers, a range of far-UV light that, among other things, can create oxygen, possibly generating a false-positive. Light output in this range varies with the mass of the star -- meaning stars of different masses will almost surely differ from our Sun.
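To see why this band is so photochemically potent, convert wavelength to photon energy with the standard relation E = hc/λ (a textbook calculation, not a mission figure):

    E(100 nm) = 1240 eV·nm / 100 nm ≈ 12.4 eV
    E(160 nm) = 1240 eV·nm / 160 nm ≈ 7.7 eV

Both comfortably exceed the roughly 5 eV needed to break an O-H bond in water, which is why far-UV starlight can liberate oxygen with no biology involved.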
SISTINE can also measure flares, or bright stellar explosions, which release intense doses of far-UV light all at once. Frequent flares could turn a habitable environment into a lethal one.
The SISTINE mission will fly on a Black Brant IX sounding rocket. Sounding rockets make short, targeted flights into space before falling back to Earth; SISTINE's flight gives it about five minutes of observing time. Though brief, SISTINE can see stars in wavelengths inaccessible to observatories like the Hubble Space Telescope.
Two launches are scheduled. The first, from White Sands in August, will calibrate the instrument. SISTINE will fly 174 miles above Earth's surface to observe NGC 6826, a cloud of gas surrounding a white dwarf star located about 2,000 light-years away in the constellation Cygnus. NGC 6826 is bright in UV light and shows sharp spectral lines -- a clear target for checking their equipment.
After calibration, the second launch will follow in 2020 from the Arnhem Space Centre in Nhulunbuy, Australia. There, the team will observe the UV spectra of Alpha Centauri A and B, the two largest stars in the three-star Alpha Centauri system. At 4.37 light-years away, these stars are our closest stellar neighbors and prime targets for exoplanet observations. (The system is home to Proxima Centauri b, the closest known exoplanet to Earth.)
Testing New Tech
Both SISTINE's observations and the technology used to acquire them are designed with future missions in mind.
One is NASA's James Webb Space Telescope, currently set to launch in 2021. The deep space observatory will see visible to mid-infrared light -- useful for detecting exoplanets orbiting M-dwarfs. SISTINE observations can help scientists understand the light from these stars in wavelengths that Webb can't see.
SISTINE also carries novel UV detector plates and new optical coatings on its mirrors, designed to better reflect, rather than absorb, extreme-UV light. Flying this hardware on SISTINE helps test it for NASA's future large UV/optical space telescopes.
Read more at Science Daily
Two fraudsters, one passport
Computers are more accurate than humans at detecting digitally manipulated ID photos, which merge the images of two people, new research has found.
Face morphing is a method used by fraudsters in which two separate identity photographs are digitally merged to create a single image that sufficiently resembles both people. This image is then submitted as part of the application for a genuine passport or driving licence, and if accepted, potentially allows both people to use the same genuine identification document without arousing suspicion.
A new study by psychologists at the University of Lincoln asked participants in one experiment to decide whether an image showed the person standing in front of them. In this task, participants accepted the digitally created morphs around half of the time, while a basic computer model could correctly identify morphs over two-thirds of the time.
The research used high quality 'face morphs' over a series of four experiments which included screen-based image comparison tasks alongside a live task, designed to mimic a real-life border-control situation in which an agent would have to accept or reject a passport image based on its resemblance to the person in front of them.
Results showed that participants failed to spot 51 percent of these fraudulent images, and even once they were provided with more information on face-morphing attacks, detection rates rose only to 64 percent. In another experiment, the researchers showed that training did not help participants detect morphs presented onscreen, with detection rates remaining around chance level. The results suggest that the morphs were accepted as legitimate ID photos often enough that they may be feasible as tools for committing fraud, especially in border-control situations where the final acceptance decision is often made by a human operator.
When similar images were put through a simple computer algorithm trained to differentiate between morphs and normal photos, 68 percent of the images were correctly identified as morph images, showing the programme to be significantly more accurate than human participants. The algorithm used was relatively basic as a demonstration, and recent software being developed by computer scientists is far more sophisticated and shows even greater levels of success.
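The article does not specify the model, but the kind of simple baseline it describes can be sketched in a few lines -- for instance, logistic regression on raw pixel intensities. In the sketch below the images are random stand-in arrays, so the printed accuracy is meaningless; on real sets of genuine and morphed photos, baselines of this sort are what reach figures like the reported 68 percent:

    # A minimal sketch of a baseline morph detector: logistic regression
    # on flattened grayscale pixels. Synthetic stand-in data only; the
    # study's actual model, features and images are not specified here.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)
    X = rng.random((400, 64 * 64))      # 400 "images", 64x64 pixels each, flattened
    y = rng.integers(0, 2, size=400)    # hypothetical labels: 1 = morph, 0 = genuine

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0, stratify=y)

    clf = LogisticRegression(max_iter=1000)
    clf.fit(X_train, y_train)
    print(f"accuracy: {accuracy_score(y_test, clf.predict(X_test)):.2f}")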
Lead researcher Dr Robin Kramer from the University of Lincoln's School of Psychology said: "The advancements and availability of high quality image editing software has made these kinds of 'face morphing attacks' more sophisticated and the images harder to detect.
"Our results show that morph detection is highly error-prone and the level at which these images were accepted represents a significant concern for security agencies. Training did not provide a useful solution to this problem.
Read more at Science Daily
Flu vaccine reduces risk of early death for elderly intensive care patients
It appears that an influenza vaccine does not just work when it comes to influenza. A new study shows that elderly people who have been admitted to an intensive care unit have less risk of dying and of suffering a blood clot or bleeding in the brain if they have been vaccinated. And this is despite the fact that they are typically older, have more chronic diseases and take more medicine than those who have not been vaccinated.
The study covers almost 90,000 surviving intensive care patients above the age of 65 during an eleven-year period in Denmark. Only a few of them were admitted directly due to influenza. However, regardless of the cause of the admission, for those who were vaccinated the risk of suffering a stroke -- the collective name for bleeding and blood clots in the brain -- was 16 per cent lower. This group also had an eight per cent lower risk of dying during the first year following their hospitalisation.
"Every year, 30,000 people are admitted to the intensive care units in Danish hospitals and we know that the first year is critical. Approximately three out of four survive the hospitalisation and are discharged from hospital. But even among the patients who are discharged, almost one in five die within the first year while many others suffer complications. Our study shows that there are fewer deaths and serious complications among the patients who have been vaccinated against influenza. So this supports the current recommendation that elderly people should be vaccinated," says Christian Fynbo Christiansen, clinical associate professor at Aarhus University Hospital and consultant at Aarhus University Hospital, Denmark.
Today, less than forty per cent of elderly Europeans say yes to the vaccination.
"We can't say with one hundred per cent certainty that the risk of a stroke and dying is lower solely because of the vaccine. But we can see that the elderly people who have been vaccinated do better in the event of critical illness. This suggests that it would be good if more elderly people received the vaccine. Not least because the vaccine is both safe and inexpensive," says Christian Fynbo Christiansen.
This is the first time that researchers have looked into the effect of the vaccine specifically on elderly critically ill patients. Other researchers have previously shown that the influenza vaccine lessens the risk of bacterial infections and heart attacks. However, the study shows that this is not the case for the elderly intensive care patients.
"Surprisingly, the vaccine didn't reduce the number of pneumonia cases in our study. We had otherwise expect that it would, as some previous studies have shown that the vaccine has this effect on younger and healthy individuals. Neither was there any clear difference in the number of blood clots in the heart. This raises new research questions about what effect of the vaccine on the immune system and whether there were other differences between the patients," says Christian Fynbo Christiansen.
Read more at Science Daily
Aug 1, 2019
Drop of ancient seawater rewrites Earth's history
Plate tectonics is Earth's vital -- and unique -- continuous recycling process that directly or indirectly controls almost every function of the planet, including atmospheric conditions, mountain building (the forming of continents), natural hazards such as volcanoes and earthquakes, the formation of mineral deposits and the maintenance of our oceans. It is the process whereby the planet's large continental plates continuously move, and the top layers of the Earth (the crust) are recycled into the mantle and replaced by new layers through processes such as volcanic activity.
Whereas it was previously thought that plate tectonics started about 2.7 billion years ago, a team of international scientists used the microscopic leftovers of a drop of water that was transported into the Earth's deep mantle -- through plate tectonics -- to show that the process started 600 million years before that. An article on their research, showing that plate tectonics started on Earth 3.3 billion years ago, was published in the high-impact journal Nature on 16 July.
"Plate tectonics constantly recycles the planet's matter, and without it the planet would look like Mars," says Professor Allan Wilson from the Wits School of Geosciences, who was part of the research team.
"Our research showing that plate tectonics started 3.3 billion years ago now coincides with the period that life started on Earth. It tells us where the planet came from and how it evolved."
Earth is the only planet in our solar system that is shaped by plate tectonics and without it the planet would be uninhabitable.
For their research, the team analysed a piece of rock melt called komatiite, named after its type occurrence in the Komati River near Barberton in Mpumalanga. Komatiites are the leftovers of the hottest magma ever produced in the first quarter of Earth's existence (the Archaean). While most komatiites have been obscured by later alteration and exposure to the atmosphere, small droplets of the molten rock were preserved in a mineral called olivine. This allowed the team to study a perfectly preserved piece of ancient lava.
"We examined a piece of melt that was 10 microns (0.01mm) in diameter, and analysed its chemical indicators such as H2O content, chlorine and deuterium/hydrogen ratio, and found that Earth's recycling process started about 600 million years earlier than originally thought," says Wilson. "We found that seawater was transported deep into the mantle and then re-emerged through volcanic plumes from the core-mantle boundary."
The research allows insight into the first stages of plate tectonics and the start of stable continental crust.
"What is exciting is that this discovery comes at the 50th anniversary of the discovery of komatiites in the Barberton Mountain Land by Wits Professors, the brothers Morris and Richard Viljoen," says Wilson.
Read more at Science Daily
Hubble uncovers a 'heavy metal' exoplanet shaped like a football
Observations by NASA's Hubble Space Telescope reveal magnesium and iron gas streaming from the strange world outside our solar system known as WASP-121b. The observations represent the first time that so-called "heavy metals" -- elements heavier than hydrogen and helium -- have been spotted escaping from a hot Jupiter, a large, gaseous exoplanet very close to its star.
Normally, hot Jupiter-sized planets are still cool enough inside to condense heavier elements such as magnesium and iron into clouds.
But that's not the case with WASP-121b, which is orbiting so dangerously close to its star that its upper atmosphere reaches a blazing 4,600 degrees Fahrenheit. The temperature in WASP-121b's upper atmosphere is about 10 times greater than that of any known planetary atmosphere. The WASP-121 system resides about 900 light-years from Earth.
"Heavy metals have been seen in other hot Jupiters before, but only in the lower atmosphere," explained lead researcher David Sing of the Johns Hopkins University in Baltimore, Maryland. "So you don't know if they are escaping or not. With WASP-121b, we see magnesium and iron gas so far away from the planet that they're not gravitationally bound."
Ultraviolet light from the host star, which is brighter and hotter than the Sun, heats the upper atmosphere and helps lead to its escape. In addition, the escaping magnesium and iron gas may contribute to the temperature spike, Sing said. "These metals will make the atmosphere more opaque in the ultraviolet, which could be contributing to the heating of the upper atmosphere," he explained.
The sizzling planet is so close to its star that it is on the cusp of being ripped apart by the star's gravity. This hugging distance means that the planet is football-shaped due to gravitational tidal forces.
"We picked this planet because it is so extreme," Sing said. "We thought we had a chance of seeing heavier elements escaping. It's so hot and so favorable to observe, it's the best shot at finding the presence of heavy metals. We were mainly looking for magnesium, but there have been hints of iron in the atmospheres of other exoplanets. It was a surprise, though, to see it so clearly in the data and at such great altitudes so far away from the planet. The heavy metals are escaping partly because the planet is so big and puffy that its gravity is relatively weak. This is a planet being actively stripped of its atmosphere."
The researchers used the observatory's Space Telescope Imaging Spectrograph to search in ultraviolet light for the spectral signatures of magnesium and iron imprinted on starlight filtering through WASP-121b's atmosphere as the planet passed in front of, or transited, the face of its home star.
This exoplanet is also a perfect target for NASA's upcoming James Webb Space Telescope to search in infrared light for water and carbon dioxide, which can be detected at longer, redder wavelengths. The combination of Hubble and Webb observations would give astronomers a more complete inventory of the chemical elements that make up the planet's atmosphere.
The WASP-121b study is part of the Panchromatic Comparative Exoplanet Treasury (PanCET) survey, a Hubble program to look at 20 exoplanets, ranging in size from super-Earths (several times Earth's mass) to Jupiters (which are over 100 times Earth's mass), in the first large-scale ultraviolet, visible, and infrared comparative study of distant worlds.
The observations of WASP-121b add to the developing story of how planets lose their primordial atmospheres. When planets form, they gather an atmosphere containing gas from the disk in which the planet and star formed. These atmospheres consist mostly of the primordial, lighter-weight gases hydrogen and helium, the most plentiful elements in the universe. This atmosphere dissipates as a planet moves closer to its star.
"The hot Jupiters are mostly made of hydrogen, and Hubble is very sensitive to hydrogen, so we know these planets can lose the gas relatively easily," Sing said. "But in the case of WASP-121b, the hydrogen and helium gas is outflowing, almost like a river, and is dragging these metals with them. It's a very efficient mechanism for mass loss."
Read more at Science Daily
TESS satellite uncovers 'first nearby super-Earth'
An international team of astronomers led by Cornell's Lisa Kaltenegger has characterized the first potentially habitable world outside of our own solar system.
Located about 31 light-years away, the super-Earth planet -- named GJ 357 d -- was discovered in early 2019 thanks to NASA's Transiting Exoplanet Survey Satellite (TESS), a mission designed to comb the heavens for exoplanets. The team's new modeling research appears in the Astrophysical Journal Letters.
"This is exciting, as this is humanity's first nearby super-Earth that could harbor life -- uncovered with help from TESS, our small, mighty mission with a huge reach," said Kaltenegger, associate professor of astronomy, director of Cornell's Carl Sagan Institute and a member of the TESS science team.
The exoplanet is more massive than our own blue planet, and Kaltenegger said the discovery will provide insight into Earth's heavyweight planetary cousins. "With a thick atmosphere, the planet GJ 357 d could maintain liquid water on its surface like Earth, and we could pick out signs of life with telescopes that will soon be online," she said.
Astronomers from the Institute of Astrophysics of the Canary Islands and the University of La Laguna, both in Spain, announced the discovery of the GJ 357 system July 31 in the journal Astronomy & Astrophysics. They showed that the distant solar system -- with a diminutive M-type dwarf sun, about one-third the size of our own sun -- harbors three planets, with one of those in that system's habitable zone: GJ 357 d.
Last February, the TESS satellite observed that the dwarf sun GJ 357 dimmed very slightly every 3.9 days, evidence of a transiting planet moving across the star's face. That planet was GJ 357 b, a so-called "hot Earth" about 22% larger than Earth, according to the NASA Goddard Space Flight Center, which guides TESS.
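Periodic dimmings like this are typically recovered with a box least squares periodogram. A minimal sketch on a simulated light curve (synthetic data only, not the actual TESS photometry of GJ 357):

    # Recover a transit period from a synthetic light curve with a
    # box least squares periodogram. Simulated data, not TESS's.
    import numpy as np
    from astropy.timeseries import BoxLeastSquares

    rng = np.random.default_rng(1)
    t = np.arange(0.0, 27.0, 2.0 / (60 * 24))    # ~27 days at 2-minute cadence
    flux = 1.0 + 1e-4 * rng.standard_normal(t.size)

    period, duration, depth = 3.9, 0.08, 1e-3    # days, days, fractional dip
    flux[(t % period) < duration] -= depth       # inject box-shaped transits

    bls = BoxLeastSquares(t, flux)
    results = bls.autopower(duration)
    best = results.period[np.argmax(results.power)]
    print(f"recovered period: {best:.2f} days")  # ~3.9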
Follow-up observations from the ground led to the discovery of two more exoplanetary siblings: GJ 357 c and GJ 357 d. The international team of scientists collected Earth-based telescopic data going back two decades -- to reveal the newly found exoplanets' tiny gravitational tugs on its host star, according to NASA.
Exoplanet GJ 357 c sizzles at 260 degrees Fahrenheit and has at least 3.4 times Earth's mass. However, the system's outermost known sibling -- GJ 357 d, a super-Earth -- could provide Earth-like conditions; it orbits the dwarf star every 55.7 days at a distance about one-fifth of Earth's distance from the sun. It is not yet known whether this planet transits its star.
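As a quick consistency check on the quoted numbers (our arithmetic, taking the star's mass, like its size, to be roughly a third of the Sun's), Kepler's third law in solar units, P² = a³/M, ties the period and distance together:

    M ≈ a³ / P² ≈ (0.2)³ / (55.7/365.25)² ≈ 0.008 / 0.023 ≈ 0.34 solar masses

exactly the regime of a small M-dwarf, so the period, distance and star described are mutually consistent.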
Kaltenegger, doctoral candidate Jack Madden and undergraduate student Zifan Lin '20 simulated light fingerprints, climates and remotely detectable spectra for a planet that could range from a rocky composition to a water world.
Madden explained that investigating new discoveries provides an opportunity to test theories and models. "We built the first models of what this new world could be like," he said. "Just knowing that liquid water can exist on the surface of this planet motivates scientists to find ways of detecting signs of life."
Lin described the work from an undergraduate perspective: "Working on a newly discovered planet is something of a dream come true. I was among the first group of people to model its spectra, and thinking about this still overwhelms me."
In a nod to her institute's namesake, the late Cornell professor Carl Sagan, Kaltenegger said: "If GJ 357 d were to show signs of life, it would be at the top of everyone's travel list -- and we could answer a 1,000-year-old question on whether we are alone in the cosmos."
Read more at Science Daily
Mysterious release of radioactive material uncovered
Among the 70 experts from all over Europe who contributed data and expertise to the current study are Dieter Hainz and Dr. Paul Saey from the Institute of Atomic and Subatomic Physics at TU Wien (Vienna). The data was evaluated by Prof. Georg Steinhauser from the University of Hanover (who is closely associated with the Atomic Institute) together with Dr. Olivier Masson from the Institut de Radioprotection et de Sûreté Nucléaire (IRSN) in France. The team has now published the results of the study in the journal Proceedings of the National Academy of Sciences (PNAS).
Unusual Ruthenium Release
"We measured radioactive ruthenium-106," says Georg Steinhauser. "The measurements indicate the largest singular release of radioactivity from a civilian reprocessing plant." In autumn of 2017, a cloud of ruthenium-106 was measured in many European countries, with maximum values of 176 millibecquerels per cubic meter of air. The values were up to 100 times higher than the total concentrations measured in Europe after the Fukushima incident. The half-life of the radioactive isotope is 374 days.
This type of release is very unusual. The fact that no radioactive substances other than ruthenium were measured is a clear indication that the source must have been a nuclear reprocessing plant.
The geographic extent of the ruthenium-106 cloud was also remarkable -- it was measured in large parts of Central and Eastern Europe, Asia and the Arabian Peninsula. Ruthenium-106 was even found in the Caribbean. The data was compiled by an informal, international network of almost all European measuring stations. In total, 176 measuring stations from 29 countries were involved. In Austria, in addition to TU Wien, the AGES (Austrian Agency for Health and Food Safety) also operates such stations, including the alpine observatory at Sonnblick at 3,106 m above sea level.
No Health Hazard
As unusual as the release may have been, the concentration of radioactive material has not reached levels that are harmful to human health anywhere in Europe. From the analysis of the data, a total release of about 250 to 400 terabecquerel of ruthenium-106 can be derived. To date, no state has assumed responsibility for this considerable release in the fall of 2017.
The evaluation of the concentration distribution pattern and atmospheric modelling suggests a release site in the southern Urals. This is where the Russian nuclear facility Majak is located. The Russian reprocessing plant had already been the scene of the second-largest nuclear release in history in September 1957 -- after Chernobyl and even larger than Fukushima. At that time, a tank containing liquid waste from plutonium production had exploded, causing massive contamination of the area.
Read more at Science Daily
Jul 31, 2019
Climate change alters tree demography in northern forests
The rise in temperature and precipitation levels in summer in northern Japan has negatively affected the growth of conifers and resulted in their gradual decline, according to a 38-year-long study in which mixed forests of conifers and broad-leaved trees were monitored by a team of researchers from Hokkaido University.
The findings demonstrate how climate change has altered the forests' demography and driven a directional shift in the region, from sub-boreal, conifer-dominated forest toward cool-temperate forest dominated by broad-leaved trees.
Climate change -- evidenced by, for example, an increase in the number of downpours and super typhoons -- is impacting our daily lives. Forest ecosystems around the world are not exempt, but many issues remain to be clarified, such as species-specific responses to climate change and the mechanisms behind them.
The present study, published in Forest Ecology and Management, is one of several based on long-term monitoring data. The researchers investigated more than eight thousand individual trees in 17.5-hectare primeval reserve areas (Osashima and Panke) inside Hokkaido University's Nakagawa Experimental Forest in Hokkaido, northern Japan, from 1979 to 2016. The team monitored the trees' growth, mortality and recruitment rates (recruitment being the process by which seeds establish themselves in an area and grow into mature individuals) and then analyzed the influence of climate change.
According to the study, the rise in temperature and precipitation levels has negatively affected the growth of three coniferous species -- Abies sachalinensis, Picea glehnii, and Picea jezoensis -- over the years, while positively affecting broad-leaved trees such as Magnolia obovata. Furthermore, the mortality of large conifers was strongly affected by a powerful typhoon in 2004. Due to these factors, the ratio of conifers in the Osashima primeval reserve area has decreased by nearly 20 percentage points from 55 percent. Tsutomu Hiura of the research team warns, "If climate change accelerates, these primeval mixed forests are likely to become broad-leaved tree forests in the future."
"Shifts in species composition could bring about major changes in the ecological systems of mixed forests, such as their roles in carbon storage, quality water provision, biodiversity maintenance and wood supply. In order to devise measures that help us adapt to climate change, it is essential to analyze the state of forests from the perspective of their ecological functions and services (benefits to humans)," Tsutomu Hiura commented.
From Science Daily
Virtual reality to solve minor personal problems
A new study shows that a conversation with oneself, embodied as Dr. Sigmund Freud, improves people's mood more than simply talking about one's problems in a virtual conversation with pre-scripted comments. The researchers suggest the method could be used by clinicians to help people deal with minor personal problems.
People are often much better at giving useful advice to a friend in trouble than they are in dealing with their own problems. Although we typically have continuous internal dialogue, we are trapped inside our own way of thinking with our own history and point of view, and find it difficult to take an external perspective regarding our own problems. However, with friends, especially someone we know well, it is much easier to understand the bigger picture, and help them find a way through their problems.
A research team from the University of Barcelona (UB), IDIBAPS and Virtual BodyWorks, a spin-off of both institutions and ICREA, used immersive virtual reality to observe the effects of talking to oneself as if one were another person. The results, published in the Nature group journal Scientific Reports, show that a conversation with oneself embodied as Dr Sigmund Freud improves people's mood more than talking about one's problems in a virtual conversation with pre-scripted comments. The researchers suggest the method could be used by clinicians to help people deal with minor personal problems.
The study was led by Mel Slater and Solène Neyret, researchers at the Experimental Virtual Environments Lab for Neuroscience and Technology (Event Lab), a research group of the Faculty of Psychology of the UB. Clinical psychologist Guillem Feixas, of the UB Department of Clinical Psychology and Psychobiology and the Institute of Neurosciences of the University of Barcelona (UBNeuro) also guided the study.
Changing perception and attitude thanks to Virtual Reality
Previous studies developed by this research team have shown that when we adopt a different body using virtual reality, we change our behaviour, attitude and perception of things. "We showed earlier that it is possible for people to talk to themselves as if they were another person, body swapping to two different avatars, and that participants' mood and happiness improved. However, we didn't know whether this was due to simply the participant talking about their problem or whether the virtual body swapping really made a difference," said Mel Slater, also a member of the UBNeuro.
To test the body-swapping idea, the researchers compared one group, who talked to themselves by alternating between a virtual body resembling themselves and a virtual Sigmund Freud, with a control group, who spoke to the virtual Freud but received only pre-scripted questions and comments (no body swapping).
Embodied as Sigmund Freud
For this technique to work, researchers scanned each person to obtain an 'avatar', a 3D likeness of that person. In virtual reality, when participants look at themselves, at their body parts, or in a mirror, they see a representation of themselves. When they move their real body, the virtual body moves in the same way and at the same time. Seated across the table is another virtual human, in this experiment a representation of Dr Sigmund Freud.
The participant can explain their personal problem to Dr Freud, and then switch to being embodied as Freud. Now, embodied as Freud, when they look down towards themselves, or in a mirror, they will see Freud's body rather than their own, and also this body will move in synchrony with their own movements. "They will see and hear their own likeness explaining the problem, and they see their virtual self as if this were another person. Now they themselves have become the 'friend' who is listening and trying to help," said Mel Slater.
While embodied as Freud, and after perceiving a strong likeness of themselves describing a problem, they can respond, as Freud, back to themselves and ask a question or help the person in front (themselves) to find a solution. After this, they are embodied once again in their own body and they can see and hear Freud's answer. Although it was really themselves who had spoken through Freud, they will hear their voice as disguised. They can keep switching back and forth between the two bodies, so having a conversation: in reality it is with themselves, but it appears as if it is between two different people.
Better results in dealing with personal problems
One week after the completion of the experiment more than 80% of participants in the body swapping group reported a change with respect to their problem, compared to less than 50% in the control group. "We found that those in the body swapping group got better knowledge, understanding, control, and new ideas about their problem compared to the control group (no body swapping)," said Mel Slater.
Participants were guided by clinical psychologist Tania Johnston about how to formulate their problem, so researchers do not know whether this method could be used without this prior clinical advice, and the extent to which the clinician could be incorporated into the virtual reality as part of the procedure.
Read more at Science Daily
What the brains of people with excellent general knowledge look like
The brains of people with excellent general knowledge are particularly efficiently wired. This was shown by neuroscientists at Ruhr-Universität Bochum and Humboldt-Universität zu Berlin using magnetic resonance imaging. "Although we can precisely measure the general knowledge of people and this wealth of knowledge is very important for an individual's journey through life, we currently know little about the links between general knowledge and the characteristics of the brain," says Dr. Erhan Genç from the Department of Biopsychology in Bochum. The team describes the results in the European Journal of Personality on 28 July 2019.
Brain images and knowledge test
The researchers examined the brains of 324 men and women with a special form of magnetic resonance imaging called diffusion tensor imaging. This makes it possible to reconstruct the pathways of nerve fibres and thus gain an insight into the structural network properties of the brain. By means of mathematical algorithms, the researchers assigned an individual value to the brain of each participant, which reflected the efficiency of his or her structural fibre network.
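The article does not name the specific algorithm, so as a hedged illustration: one standard way to reduce a network to a single efficiency value is global efficiency, the average inverse shortest-path length over all node pairs. A minimal sketch on a toy graph, assuming the networkx library:
```python
# Global efficiency (Latora & Marchiori) is one standard single-number
# summary of a network; whether the study used exactly this measure is
# an assumption. Toy "connectome" for illustration only.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("frontal", "parietal"),
    ("parietal", "occipital"),
    ("frontal", "temporal"),
    ("temporal", "occipital"),
    ("parietal", "temporal"),
])

# Average inverse shortest-path length over all node pairs; higher values
# mean information can travel between regions in fewer hops.
print("global efficiency:", nx.global_efficiency(G))
```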
The participants also completed a general knowledge test called the Bochum Knowledge Test, developed in Bochum by Dr. Rüdiger Hossiep. It comprises over 300 questions from various fields of knowledge, such as art and architecture or biology and chemistry. The team led by Erhan Genç then investigated whether the efficiency of structural networking is associated with the amount of general knowledge stored.
The result: People with a very efficient fibre network had more general knowledge than those with less efficient structural networking.
Linking pieces of information
"We assume that individual units of knowledge are dispersed throughout the entire brain in the form of pieces of information," explains Erhan Genç. "Efficient networking of the brain is essential in order to put together the information stored in various areas of the brain and successfully recall knowledge content."
An example: To answer the question of which constants occur in Einstein's theory of relativity, you have to connect the meaning of the term "constant" with knowledge of the theory of relativity. "We assume that more efficient networking of the brain contributes to better integration of pieces of information and thus leads to better results in a general knowledge test," says the Bochum-based researcher.
From Science Daily
Call it Mighty Mouse: Breakthrough leaps Alzheimer's research hurdle
University of California, Irvine researchers have made it possible to learn how key human brain cells respond to Alzheimer's, vaulting a major obstacle in the quest to understand and one day vanquish it. By developing a way for human brain immune cells known as microglia to grow and function in mice, scientists now have an unprecedented view of crucial mechanisms contributing to the disease.
The team, led by Mathew Blurton-Jones, associate professor of neurobiology & behavior, said the breakthrough also holds promise for investigating many other neurological conditions such as Parkinson's, traumatic brain injury, and stroke. The details of their study have just been published in the journal Neuron.
The scientists dedicated four years to devising the new rodent model, which is considered "chimeric." The word, stemming from the mythical Greek monster Chimera that was part goat, lion and serpent, describes an organism containing at least two different sets of DNA.
To create the specialized mouse, the team generated induced pluripotent stem cells, or iPSCs, using cells donated by adult patients. Once created, iPSCs can be turned into any other type of cell. In this case, the researchers coaxed the iPSCs into becoming young microglia and implanted them into genetically modified mice. Examining the rodents several months later, the scientists found that about 80 percent of the microglia in their brains were human, opening the door to an array of new research.
"Microglia are now seen as having a crucial role in the development and progression of Alzheimer's," said Blurton-Jones. "The functions of our cells are influenced by which genes are turned on or off. Recent research has identified over 40 different genes with links to Alzheimer's and the majority of these are switched on in microglia. However, so far we've only been able to study human microglia at the end stage of Alzheimer's in post-mortem tissues or in petri dishes."
In verifying the chimeric model's effectiveness for these investigations, the team checked how its human microglia reacted to amyloid plaques, the protein fragments that accumulate in the brains of people with Alzheimer's. The cells indeed showed the expected response, migrating toward the amyloid plaques and surrounding them.
"The human microglia also showed significant genetic differences from the rodent version in their response to the plaques, demonstrating how important it is to study the human form of these cells," Blurton-Jones said.
"This specialized mouse will allow researchers to better mimic the human condition during different phases of Alzheimer's while performing properly-controlled experiments," said Jonathan Hasselmann, one of the two neurobiology & behavior graduate students involved in the study. Understanding the stages of the disease, which according to the Alzheimer's Association can last from two to 20 years, has been among the challenges facing researchers.
Read more at Science Daily
Jul 30, 2019
The moon is older than previously believed
A new study spearheaded by Earth scientists at the University of Cologne's Institute of Geology and Mineralogy has constrained the age of the Moon to approximately 50 million years after the formation of the solar system. The solar system formed 4.56 billion years ago, which puts the Moon's formation at approximately 4.51 billion years ago. The study has thus determined that the Moon is significantly older than previously believed -- earlier research had estimated that it formed approximately 150 million years after the solar system's formation. To achieve these results, the scientists analysed the chemical composition of a diverse range of samples collected during the Apollo missions. The study, 'Early Moon formation inferred from hafnium-tungsten systematics', was published in Nature Geoscience.
On 21 July 1969, humankind took its first steps on another celestial body. In their few hours on the lunar surface, the crew of Apollo 11 collected and brought back to Earth 21.55 kg of samples. Almost exactly 50 years later, these samples are still teaching us about key events of the early solar system and the history of the Earth-Moon system. Determining the age of the Moon is also important to understand how and at which time the Earth formed, and how it evolved at the very beginning of the solar system.
This study focuses on the chemical signatures of different types of lunar samples collected by the different Apollo missions. 'By comparing the relative amounts of different elements in rocks that formed at different times, it is possible to learn how each sample is related to the lunar interior and the solidification of the magma ocean,' says Dr Raúl Fonseca from the University of Cologne, who studies processes that occurred in the Moon's interior in laboratory experiments together with his colleague Dr Felipe Leitzke.
The Moon likely formed in the aftermath of a giant collision between a Mars-sized planetary body and the early Earth. Over time, the Moon accreted from the cloud of material blasted into Earth's orbit. The newborn Moon was covered in a magma ocean, which formed different types of rocks as it cooled. 'These rocks recorded information about the formation of the Moon, and can still be found today on the lunar surface,' says Dr Maxwell Thiemens, former University of Cologne researcher and lead author of the study. Dr Peter Sprung, co-author of the study, adds: 'Such observations are not possible on Earth anymore, as our planet has been geologically active over time. The Moon thus provides a unique opportunity to study planetary evolution.'
The Cologne scientists used the relationship between the rare elements hafnium, uranium and tungsten as a probe to understand the amount of melting that occurred to generate the mare basalts, i.e., the black regions on the lunar surface. Owing to an unprecedented measurement precision, the study could identify distinct trends amongst the different suites of rocks, which now allows for a better understanding of the behaviour of these key rare elements.
Studying hafnium and tungsten on the Moon is particularly important because the two constitute a natural radioactive clock, with the isotope hafnium-182 decaying into tungsten-182. This radioactive decay only lasted for the first 70 million years of the solar system. By combining the hafnium and tungsten information measured in the Apollo samples with information from laboratory experiments, the study finds that the Moon already started solidifying as early as 50 million years after the solar system formed. 'This age information means that any giant impact had to occur before that time, which answers a fiercely debated question amongst the scientific community regarding when the Moon formed,' adds Professor Dr Carsten Münker from the UoC's Institute of Geology and Mineralogy, senior author of the study.
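To see why the clock "runs out" after about 70 million years: hafnium-182 has a half-life of roughly 8.9 million years (a literature value, not quoted in the article), so 70 million years is about eight half-lives, after which well under one percent of the original isotope remains. A quick sketch:
```python
# Fraction of 182Hf remaining over time, assuming a half-life of ~8.9 Myr
# (a literature value; the article itself does not quote it).
half_life_myr = 8.9

def hf182_fraction_remaining(t_myr, t_half=half_life_myr):
    """Fraction of the initial 182Hf left after t_myr million years."""
    return 2.0 ** (-t_myr / t_half)

for t in (10, 50, 70):
    print(f"after {t} Myr: {hf182_fraction_remaining(t):.4%} of 182Hf left")
# After ~70 Myr well under 1% remains, so the tungsten-182 excess stops
# growing and the clock effectively stops ticking.
```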
Read more at Science Daily
At the edge of chaos: New method for exoplanet stability analysis
Exoplanets revolving around distant stars are coming quickly into focus with advanced technology like the Kepler space telescope. Gaining a full understanding of those systems is difficult, because the initial positions and velocities of the exoplanets are unknown. Determining whether the system dynamics are quasi-periodic or chaotic is cumbersome, expensive and computationally demanding.
In this week's Chaos, from AIP Publishing, Tamás Kovács delivers an alternative method for stability analysis of exoplanetary bodies using only the observed time series data to deduce dynamical measurements and quantify the unpredictability of exoplanet systems.
"If we don't know the governing equations of the motion of a system, and we only have the time series -- what we measure with the telescope -- then we want to transform that time series into a complex network. In this case, it is called a recurrence network," Kovács said. "This network holds all of the dynamical features of the underlying system we want to analyze."
The paper draws on the work of physicist Floris Takens, who proposed in 1981 that the dynamics of a system could be reconstructed using a series of observations about the state of the system. With Takens' embedding theorem as a starting point, Kovács uses time delay embedding to reconstruct a high-dimensional trajectory and then identify recurrence points, where bodies in the phase space are close to each other.
"Those special points will be the vertices and the edges of the complex network," Kovács said. "Once you have the network, you can reprogram this network to be able to apply measures like transitivity, average path length or others unique to that network."
Kovács tests the reliability of the method using a known system as a model, the three-body system of Saturn, Jupiter and the sun, and then applies it to the Kepler 36b and 36c system. His Kepler system results agree with what is known.
"Earlier studies pointed out that Kepler 36b and 36c is a very special system, because from the direct simulation and the numerical integrations, we see the system is at the edge of the chaos," Kovács said. "Sometimes, it shows regular dynamics, and at other times, it seems to be chaotic."
Read more at Science Daily
Individuals with obesity get more satisfaction from their food
The propensity to overeat may, in part, be a function of the satisfaction derived from eating. A new study in the Journal of the Academy of Nutrition and Dietetics, published by Elsevier, found no significant difference in taste perceptions between participants of normal weight and those who were overweight. Participants with obesity, however, reported greater initial taste perceptions than participants without obesity, and their ratings declined at a more gradual rate. This quantification of satisfaction from food may help explain why some people eat more than others.
"Obesity is a major public-health problem. Thirty percent of the US population is obese, and obesity-related health problems (diabetes, hypertension, etc.) are increasing. Causes of obesity are varied, but food consumption decisions play an important role, especially decisions about what foods to eat and how much to consume. Taste perceptions may lead to overeating. If people with obesity have different taste perceptions than nonobese people, it could lead to better understanding of obesity and possibly designing new approaches to prevent obesity," explained lead investigator Linnea A. Polgreen, PhD, Department of Pharmacy Practice and Science, University of Iowa, Iowa City, IA, USA.
As individuals consume more of a food item, they experience diminishing marginal taste perception, which means their level of perceived taste from additional consumption may tend to decline (i.e., additional consumption may become less pleasurable). The relationship between perceived taste and quantity consumed has traditionally been referred to as sensory-specific satiety.
To determine whether marginal taste perceptions differ among participants of normal weight, those who are overweight and those with obesity, and whether knowledge of nutritional information affects marginal taste perception, researchers at the University of Iowa conducted a non-clinical, randomized controlled trial of 290 adults (161 with normal BMI, 78 considered overweight, and 51 considered obese) to measure instantaneous taste perceptions. Eighty percent of the participants were female, and ages ranged from 18 to 75 years. Participants were offered and rated one piece of chocolate at a time in a controlled environment and could eat as much as they wanted without feeling uncomfortable. They consumed between two and 51 pieces. Half of the study participants received nutritional information about the chocolate before the tasting began.
The study identified a consistent association between taste from food, specifically chocolate, and BMI by directly observing instantaneous taste changes over a period of time, rather than just at the beginning and end of a period of consumption, as in prior studies.
Typically, the appeal of a specific food declines as more of that food is eaten: the first bite of chocolate is better than the 10th, a phenomenon consistent with the concept of sensory-specific satiety. As anticipated, the researchers found that ratings generally went down after each piece of chocolate consumed, with no significant difference in taste perceptions between normal-weight and overweight participants. However, participants with obesity had higher levels of initial taste perception, rated subsequent pieces higher than their counterparts without obesity, and their ratings declined at a more gradual rate than those of normal-weight and overweight participants. People who were hungrier prior to the study had greater taste perception; women's taste perceptions declined faster than men's; and providing nutritional information prior to chocolate consumption did not affect taste perception.
"In our study population, people with obesity reported a higher level of satisfaction for each additional piece of chocolate compared to nonobese people. Thus, their taste preferences appear markedly different," noted co-investigator Aaron C. Miller, PhD, Department of Epidemiology, University of Iowa, Iowa City, IA, USA. "Our findings further indicate that obese participants needed to consume a greater quantity of chocolate than nonobese participants to experience a similar decline in taste perceptions. Specifically, obese women needed to eat 12.5 pieces of chocolate to fall to the same level of taste perception as nonobese women who ate only 10 pieces, which corresponds to a difference of 67.5 calories. This may, in part, explain why obese people consume more than nonobese people."
Read more at Science Daily
"Obesity is a major public-health problem. Thirty percent of the US population is obese, and obesity-related health problems (diabetes, hypertension, etc.) are increasing. Causes of obesity are varied, but food consumption decisions play an important role, especially decisions about what foods to eat and how much to consume. Taste perceptions may lead to overeating. If people with obesity have different taste perceptions than nonobese people, it could lead to better understanding of obesity and possibly designing new approaches to prevent obesity," explained lead investigator Linnea A. Polgreen, PhD, Department of Pharmacy Practice and Science, University of Iowa, Iowa City, IA, USA.
As individuals consume more of a food item, they experience diminishing marginal taste perception, which means their level of perceived taste from additional consumption may tend to decline (ie, additional consumption may become less pleasurable). The relationship between perceived taste and quantity consumed has traditionally been referred to as sensory-specific satiety.
In order to determine if marginal taste perceptions differ among participants of normal-weight, those who are overweight and those with obesity, and whether knowledge of nutritional information affects marginal taste perception, researchers at the University of Iowa conducted a non-clinical, randomized controlled trial of 290 adults (161 with normal BMI, 78 considered overweight, and 51 considered obese) to measure instantaneous taste perceptions. Eighty percent of the participants were female, and ages ranged from 18 to 75 years. Participants were offered and rated one piece of chocolate at a time in a controlled environment and could eat as much as they wanted without feeling uncomfortable. They consumed between two and 51 pieces. Half of the study participants received nutritional information about the chocolate before the chocolate tasting began.
The study identified a consistent association between taste from food, specifically chocolate, and BMI by directly observing instantaneous taste changes over a period of time, rather than just at the beginning and end of a period of consumption, as in prior studies.
Typically, the appeal of a specific food may decline as more of that food is eaten: the first bite of chocolate is better than the 10th, a phenomenon consistent with the concept of sensory-specific satiety. As anticipated, researchers found that ratings generally went down after each piece of chocolate consumed with no significant difference in taste perceptions between normal and overweight participants reported. However, participants with obesity had higher levels of initial taste perception, rated subsequent pieces higher than their counterparts without obesity, and their ratings declined at a more gradual rate compared to participants with normal weight and those with obesity. People hungrier prior to the study had greater taste perception; women's taste perceptions declined faster than men's; and providing nutritional information prior to chocolate consumption did not affect taste perception.
"In our study population, people with obesity reported a higher level of satisfaction for each additional piece of chocolate compared to nonobese people. Thus, their taste preferences appear markedly different," noted co-investigator Aaron C. Miller, PhD, Department of Epidemiology, University of Iowa, Iowa City, IA, USA. "Our findings further indicate that obese participants needed to consume a greater quantity of chocolate than nonobese participants to experience a similar decline in taste perceptions. Specifically, obese women needed to eat 12.5 pieces of chocolate to fall to the same level of taste perception as nonobese women who ate only 10 pieces, which corresponds to a difference of 67.5 calories. This may, in part, explain why obese people consume more than nonobese people."
Read more at Science Daily
Tech companies not doing enough to protect users from phishing scams
Technology companies could be doing much more to protect individuals and organisations from the threats posed by phishing, according to research by the University of Plymouth.
However, users also need to make themselves more aware of the dangers to ensure potential scammers do not obtain access to personal or sensitive information.
Academics from Plymouth's Centre for Security, Communications and Network (CSCAN) Research assessed the effectiveness of phishing filters employed by various email service providers.
They sent two sets of messages to victim accounts, using email content obtained from archives of reported phishing attacks, with the first as plain text with links removed and the second having links retained and pointing to their original destination.
They then examined which mailbox each message reached within the email accounts, as well as whether the messages were explicitly labelled in any way to denote them as suspicious or malicious.
In the significant majority of cases (75% without links and 64% with links) the potential phishing messages made it into inboxes and were not in any way labelled to highlight them as spam or suspicious. Moreover, only 6% of messages were explicitly labelled as malicious.
Professor Steven Furnell, leader of CSCAN, worked on the study with MSc student Kieran Millet and Associate Professor of Cyber Security Dr Maria Papadaki.
He said: "The poor performance of most providers implies they either do not employ filtering based on language content, or that it is inadequate to protect users. Given users' tendency to perform poorly at identifying malicious messages this is a worrying outcome. The results suggest an opportunity to improve phishing detection in general, but the technology as it stands cannot be relied upon to provide anything other than a small contribution in this context."
The number of phishing incidents has risen dramatically since they were first recorded in 2003. In fact, global software giant Kaspersky Lab reported that its anti-phishing system was triggered 482,465,211 times in 2018, almost double the number for 2017.
It is also a significant problem for businesses, with 80% telling the Cyber Security Breaches Survey 2019 that they have encountered 'Fraudulent emails or being directed to fraudulent websites' -- placing this category well ahead of malware and ransomware.
Phishing is designed to trick victims into divulging sensitive information, such as identity and financial-related data, and the threat can actually take several forms:
- Bulk-phishing -- where the approach is not specially targeted or tailored toward the recipient;
- Spear-phishing -- where the message is targeted at specific individuals or companies and tailored accordingly;
- Clone-phishing -- where the scammers take a legitimate email containing an attachment or link, and replace it with a malicious version;
- Whaling -- in these cases the phishing is specifically targeted towards high value or senior individuals.
Jul 29, 2019
Origin of life: The importance of interfaces
Tiny gas-filled bubbles in the porous rock found around hot springs are thought to have played an important role in the origin of life. Temperature differences at the interface between the gas and liquid phases of such bubbles could therefore have initiated prebiotic chemical evolution.
A plethora of physicochemical processes must have created the conditions that enabled living systems to emerge on the early Earth. In other words, the era of biological evolution must have been preceded by a -- presumably protracted -- phase of 'prebiotic' chemical evolution, during which the first informational molecules capable of replicating themselves were assembled and selected. This scenario immediately raises another question: Under what environmental conditions could prebiotic evolution have taken place? One possible setting has long been discussed and explored -- tiny pores in volcanic rocks. An international team of researchers led by Dieter Braun (Professor of Systems Biophysics at Ludwig-Maximilians-Universitaet (LMU) in Munich) has now taken a closer look at the water-air interfaces in these pores. They form spontaneously at gas-filled bubbles and show an interesting combination of effects.
The researchers found that these interfaces could have played an important part in facilitating the physicochemical interactions that contributed to the origin of life. Specifically, Braun and his colleagues asked whether such interfaces could have stimulated the kinds of chemical reactions that triggered the initial stages of prebiotic chemical evolution. Their findings appear in the leading journal Nature Chemistry.
The study strongly supports the notion that tiny gas-filled bubbles that were trapped in, and reacted with, the surfaces of pores in volcanic rocks could indeed have accelerated the formation of the chemical networks that ultimately gave rise to the first cells. Thus, the authors were able to experimentally verify and characterize the facilitating effects of air-water interfaces on the relevant chemical reactions. If there is a difference in temperature along the surface of such a bubble, water will tend to evaporate on the warmer side and condense on the cooler side, just as a raindrop that lands on a window runs down the flat surface of the glass and eventually evaporates. "In principle, this process can be repeated ad infinitum, since the water continuously cycles between the gaseous and the liquid phase," says Braun, who has characterized the mechanism and the underlying physical processes in detail, together with his doctoral student Matthias Morasch and other members of his research group. The upshot of this cyclical phenomenon is that molecules accumulate to very high concentrations on the warmer side of the bubble.
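The concentration effect described here can be illustrated with a toy model: if each evaporation-condensation cycle removes a fixed fraction of the water on the warm side while the dissolved molecules stay behind, the concentration grows geometrically. The sketch below is a simplified illustration with an assumed evaporation fraction, not the paper's model.
```python
# Toy model (not the paper's): solute concentration at the warm side of a
# bubble, assuming 5% of the water evaporates per cycle while all of the
# dissolved molecules stay behind.
conc = 1.0                  # arbitrary starting concentration
evaporated_fraction = 0.05  # water lost per cycle (assumed)
for cycle in range(1, 101):
    conc /= (1.0 - evaporated_fraction)   # same solute, less water
    if cycle % 25 == 0:
        print(f"cycle {cycle:3d}: {conc:6.1f}x initial concentration")
# After 100 cycles the solute is ~170x more concentrated.
```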
"We began by making a series of measurements of reaction rates under various conditions, in order to characterize the nature of the underlying mechanism," says Morasch. The phenomenon turned out to be surprisingly effect and robust. Even small molecules could be concentrated to high levels. "We then tested a whole range of physical and chemical processes, which must have played a central role in the origin of life -- and all of them were markedly accelerated or made possible at all under the conditions prevailing at the air-water interface." The study benefitted from interactions between Braun's group of biophysicists and the specialists in disciplines such as chemistry and geology who work together with him in the Collaborative Research Centre (SFB/TRR) on the Origin of Life (which is funded by the DFG), and from cooperations with members of international teams.
For example, the LMU researchers show that physicochemical processes which promote the formation of polymers are either stimulated -- or made possible in the first place -- by the availability of an interface between the aqueous environment and the gas phase, which markedly enhances rates of chemical reactions and catalytic mechanisms. In fact, in such experiments, molecules could be accumulated to high concentrations within lipid membranes when the researchers added the appropriate chemical constituents. "The vesicles produced in this way are not perfect. But the finding nevertheless suggests how the first rudimentary protocells and their outer membranes might have been formed," says Morasch.
Read more at Science Daily
Sun's solar wind and plasma 'burps' created on Earth
The sun's solar wind affects nearly everything in the solar system. It can disrupt the function of Earth's satellites, and it creates the lights of the auroras.
A new study by University of Wisconsin-Madison physicists mimicked solar winds in the lab, confirming how they develop and providing an Earth-bound model for the future study of solar physics.
Our sun is essentially a big ball of hot plasma -- an energetic state of matter made up of ionized gas. As the sun spins, the plasma spins along, too. This plasma movement in the core of the sun produces a magnetic field that fills the solar atmosphere. At some distance from the sun's surface, known as the Alfvén surface, this magnetic field weakens and plasma breaks away from the sun, creating the solar wind.
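That breakaway condition can be sketched with a rough back-of-the-envelope calculation: the Alfvén surface sits approximately where the outflowing plasma becomes faster than the local Alfvén speed, below which the magnetic field can still hold on to it. The field strength and density below are order-of-magnitude illustrations, not values from this study.

```python
import math

# Order-of-magnitude sketch of the Alfven surface condition. The surface
# sits roughly where the wind speed matches the local Alfven speed
# v_A = B / sqrt(mu0 * rho); inside it, the field dominates the plasma.
# B and n are illustrative coronal values, not measurements.

mu0 = 4 * math.pi * 1e-7    # vacuum permeability (T m / A)
m_proton = 1.67e-27         # kg

def alfven_speed(B, n):
    """Alfven speed (m/s) for field B (tesla) and proton density n (1/m^3)."""
    return B / math.sqrt(mu0 * n * m_proton)

B = 5e-7   # ~5 milligauss, plausible some tens of solar radii out
n = 1e9    # protons per cubic meter at a similar distance

v_a = alfven_speed(B, n)
print(f"v_A ~ {v_a / 1e3:.0f} km/s")  # ~350 km/s, comparable to slow-wind speeds
```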
"The solar wind is highly variable, but there are essentially two types: fast and slow," explains Ethan Peterson, a graduate student in the department of physics at UW-Madison and lead author of the study published online July 29 in Nature Physics. "Satellite missions have documented pretty well where the fast wind comes from, so we were trying to study specifically how the slow solar wind is generated and how it evolves as it travels toward Earth."
Peterson and his colleagues, including physics professor Cary Forest, may not have direct access to the big plasma ball of the sun, but they do have access to the next best thing: the Big Red Ball.
The Big Red Ball is a three-meter-wide hollow sphere, with a strong magnet at its center and various probes inside. The researchers pump helium gas in, ionize it to create a plasma, and then apply an electric current that, along with the magnetic field, stirs the plasma, creating a near-perfect mimic of the spinning plasma and electromagnetic fields of the sun.
With their mini-sun in place, the researchers can take measurements at many points inside the ball, allowing them to study solar phenomena in three dimensions.
First, they were able to recreate the Parker Spiral, the spiral-shaped magnetic field that fills the entire solar system, named for the scientist who first described the solar wind. Below the Alfvén surface, the magnetic field radiates straight out from the Sun. But at that surface, solar wind dynamics take over, dragging the magnetic field into a spiral.
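The geometry behind the spiral is compact enough to sketch: in Parker's model, the angle between the field and the radial direction grows with distance as the ratio of the Sun's rotation speed at that radius to the wind speed. A quick calculation, assuming a typical 400 km/s wind rather than any value measured in this study, recovers the textbook roughly 45-degree spiral at Earth's orbit.

```python
import math

# Parker spiral winding angle: field lines anchored to the rotating Sun and
# dragged radially outward by the wind make an angle psi with the radial
# direction, with tan(psi) = Omega * r / v_wind far from the Sun.
# The 400 km/s wind speed is a typical assumed value, not from this study.

OMEGA_SUN = 2 * math.pi / (25.4 * 86400)   # solar rotation rate (rad/s)
AU = 1.496e11                              # meters

def spiral_angle_deg(r, v_wind=400e3):
    """Angle between the magnetic field and the radial direction at radius r."""
    return math.degrees(math.atan(OMEGA_SUN * r / v_wind))

print(f"spiral angle at 1 AU: {spiral_angle_deg(AU):.0f} degrees")  # ~45-47
```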
"Satellite measurements are pretty consistent with the Parker Spiral model, but only at one point at a time, so you'd never be able to make a simultaneous, large-scale map of it like we can in the lab." Peterson says. "Our experimental measurements confirm Parker's theory of how it is created by these plasma flows."
The researchers were also able to identify the source of the Sun's plasma "burps," small, periodic ejections of plasma that fuel the slow solar wind. With the plasma spinning, they probed the magnetic field and the speed of the plasma. Their data mapped a region where the plasma was moving fast enough and the magnetic field was weak enough that the plasma could break off and eject radially.
"These ejections are observed by satellites, but no one knows what drives them," Peterson says. "We ended up seeing very similar burps in our experiment, and identified how they develop."
The researchers stress that their Earth-bound experiments complement, but don't replace, satellite missions. For example, the Parker Solar Probe, launched in August 2018, is expected to reach and even dip below the Alfvén surface. It will provide direct measurements of solar wind never obtained before.
"Our work shows that laboratory experiments can also get at the fundamental physics of these processes," Peterson says. "And because the Big Red Ball is now funded as a National User Facility, it says to the science community: If you want to study the physics of solar wind, you can do that here."
Read more at Science Daily
Camera can watch moving objects around corners
David Lindell, a graduate student in electrical engineering at Stanford University, donned a high visibility tracksuit and got to work, stretching, pacing and hopping across an empty room. Through a camera aimed away from Lindell -- at what appeared to be a blank wall -- his colleagues could watch his every move.
That's because, hidden to the naked eye, he was being scanned by a high-powered laser, and the single particles of light he reflected onto the walls around him were captured and reconstructed by the camera's advanced sensors and processing algorithm.
"People talk about building a camera that can see as well as humans for applications such as autonomous cars and robots, but we want to build systems that go well beyond that," said Gordon Wetzstein, an assistant professor of electrical engineering at Stanford. "We want to see things in 3D, around corners and beyond the visible light spectrum."
The camera system Lindell tested, which the researchers are presenting at the SIGGRAPH 2019 conference Aug. 1, builds upon previous around-the-corner cameras this team developed. It can capture more light from a greater variety of surfaces, see wider and farther away, and is fast enough to monitor out-of-sight movement -- such as Lindell's calisthenics -- for the first time. Someday, the researchers hope, superhuman vision systems could help autonomous cars and robots operate even more safely than they would with human guidance.
Practicality and seismology
Keeping their system practical is a high priority for these researchers. The hardware they chose, the scanning and image processing speeds, and the style of imaging are already common in autonomous car vision systems. Previous systems for viewing scenes outside a camera's line of sight relied on objects that reflect light either evenly or strongly. But real-world objects, including shiny cars, fall outside these categories, so the new system was built to handle light bouncing off a range of surfaces, including disco balls, books and intricately textured statues.
Central to their advance was a laser 10,000 times more powerful than what they were using a year ago. The laser scans a wall opposite the scene of interest and that light bounces off the wall, hits the objects in the scene, bounces back to the wall and to the camera sensors. By the time the laser light reaches the camera only specks remain, but the sensor captures every one, sending it along to a highly efficient algorithm, also developed by this team, that untangles these echoes of light to decipher the hidden tableau.
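The team's actual reconstruction is a wave-based method adapted from seismic imaging (discussed below); the sketch here shows only the simpler back-projection intuition behind such algorithms, with an invented toy geometry. Each photon's round-trip time confines the hidden scatterer to a shell around the scanned wall point, and summing shells over many scan points makes the object's location stand out.

```python
import numpy as np

# Toy back-projection for around-the-corner imaging (invented geometry; the
# Stanford system uses a faster wave-based method adapted from seismology).
# In a confocal setup, a photon returning to wall point p after round-trip
# time t constrains the hidden scatterer to lie on a shell of radius c*t/2
# around p; summing shells over many scan points localizes the object.

C = 3e8                             # speed of light (m/s)
wall_x = np.linspace(-1, 1, 64)     # laser scan positions along the wall (m)
hidden = np.array([0.3, 0.8])       # hypothetical hidden point: (x, depth)

# Simulated measurement: round-trip time from each wall point to the object.
dists = np.hypot(wall_x - hidden[0], hidden[1])
times = 2 * dists / C

# Back-project each timing shell onto a 2D grid behind the wall.
xs = np.linspace(-1, 1, 200)
zs = np.linspace(0.1, 1.5, 200)
X, Z = np.meshgrid(xs, zs)
image = np.zeros_like(X)
for px, t in zip(wall_x, times):
    radius = C * t / 2
    image += np.abs(np.hypot(X - px, Z) - radius) < 0.02  # vote on the shell

iz, ix = np.unravel_index(image.argmax(), image.shape)
print(f"recovered point: x={xs[ix]:.2f} m, depth={zs[iz]:.2f} m")  # ~(0.30, 0.80)
```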
"When you're watching the laser scanning it out, you don't see anything," described Lindell. "With this hardware, we can basically slow down time and reveal these tracks of light. It almost looks like magic."
The system can scan at four frames per second. It can reconstruct a scene at speeds of 60 frames per second on a computer equipped with a graphics processing unit (GPU), which accelerates the heavy image computations.
To advance their algorithm, the team looked to other fields for inspiration. The researchers were particularly drawn to seismic imaging systems -- which bounce sound waves off underground layers of Earth to learn what's beneath the surface -- and reconfigured their algorithm to likewise interpret bouncing light as waves emanating from the hidden objects. The result retained the same high speed and low memory usage while improving their ability to see large scenes containing various materials.
"There are many ideas being used in other spaces -- seismology, imaging with satellites, synthetic aperture radar -- that are applicable to looking around corners," said Matthew O'Toole, an assistant professor at Carnegie Mellon University who was previously a postdoctoral fellow in Wetzstein's lab. "We're trying to take a little bit from these fields and we'll hopefully be able to give something back to them at some point."
Humble steps
Being able to see real-time movement from otherwise invisible light bounced around a corner was a thrilling moment for this team, but a practical system for autonomous cars or robots will require further enhancements.
"It's very humble steps. The movement still looks low-resolution and it's not super-fast but compared to the state-of-the-art last year it is a significant improvement," said Wetzstein. "We were blown away the first time we saw these results because we've captured data that nobody's seen before."
Read more at Science Daily
Elephant extinction will raise carbon dioxide levels in atmosphere
[Image: Forest elephants, Central African Republic]
In a paper recently published in Nature Geoscience, a Saint Louis University biologist and his colleagues found that elephant populations in central African forests encourage the growth of slow-growing trees with high wood density, which sequester more carbon from the atmosphere than the fast-growing species that are elephants' preferred foods.
Because forest elephants preferentially browse on the fast-growing species, they cause high levels of damage and mortality to these species compared to the slow-growing, high-wood-density species. The collapse of forest elephant populations will therefore likely cause an increase in the abundance of fast-growing tree species at the expense of slow-growing ones, and reduce the ability of the forest to capture carbon.
Stephen Blake, Ph.D., assistant professor of biology at Saint Louis University, spent 17 years in central Africa doing, among other things, applied research and conservation work with elephants. While there, he collected a data set on forest structure and species composition in the Nouabalé-Ndoki Forest of northern Congo.
In the current study, Blake's collaborators developed a mathematical computer model to answer the question 'What would happen to the composition of the forest over time with and without elephant browsing?'
To find out, they simulated elephant damage in the forest, assuming that elephants browse certain plant species at different rates. Elephants prefer fast-growing species in more open spaces. As they feed and browse, they cause damage, knocking off a limb or breaking a shrub. The model calculated feeding and breakage rates along with elephant-induced mortality rates to see their effect on certain woody plants.
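As a rough illustration of that logic -- a deliberately simple toy competition model with invented parameters, not the authors' simulation -- consider two tree guilds competing for limited space, where elephant browsing adds mortality to the fast-growing guild only and stored carbon is proportional to occupancy weighted by wood density:

```python
# Toy competition model (all parameters invented, not the authors' model):
# two tree guilds occupy a fixed pool of sites. The fast guild colonizes
# quickly but has low wood density; the slow guild is the reverse. Elephant
# browsing adds extra mortality to the fast guild only.

def forest(browse, steps=50000, dt=0.01):
    fast, slow = 0.3, 0.3        # fraction of sites occupied by each guild
    g_fast, g_slow = 1.0, 0.4    # colonization rates
    m_fast, m_slow = 0.2, 0.1    # baseline mortality rates
    for _ in range(steps):
        empty = max(0.0, 1.0 - fast - slow)
        fast += dt * (g_fast * fast * empty - (m_fast + browse) * fast)
        slow += dt * (g_slow * slow * empty - m_slow * slow)
    # Carbon proxy: occupancy weighted by wood density (fast 0.4, slow 0.8).
    return fast, slow, 0.4 * fast + 0.8 * slow

for label, b in [("no elephants  ", 0.0), ("with elephants", 0.3)]:
    fast, slow, carbon = forest(browse=b)
    print(f"{label}: fast={fast:.2f}  slow={slow:.2f}  carbon index={carbon:.2f}")
```

With browsing switched on, the toy forest flips from domination by fast-growing, low-density trees to slow-growing, high-density ones, and the carbon proxy nearly doubles -- the same qualitative pattern the study reports.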
"Lo and behold, as we look at numbers of elephants in a forest and we look at the composition of forest over time, we find that the proportion of trees with high density wood is higher in forests with elephants," Blake said.
"The simulation found that the slow-growing plant species survive better when elephants are present. These species aren't eaten by elephants and, over time, the forest becomes dominated by these slow-growing species. Wood (lignin) has a carbon backbone, meaning it has a large number of carbon molecules in it. Slow growing high wood density species contain more carbon molecules per unit volume than fast growing low wood density species. As the elephants "thin" the forest, they increase the number of slow-growing trees and the forest is capable of storing more carbon."
These findings suggest far-ranging ecological consequences of past and present extinctions. The loss of elephants will seriously reduce the ability of the remaining forest to sequester carbon. Trees and plants use carbon dioxide during photosynthesis, removing it from the atmosphere. For this reason, plants are helpful in combating global warming and serve to store carbon emissions.
Without the forest elephants, less carbon dioxide will be taken out of the atmosphere. In monetary terms, forest elephants represent a carbon storage service valued at $43 billion.
"The sad reality is that humanity is doing its best to rid the planet of elephants as quickly as it can," Blake said. "Forest elephants are rapidly declining and facing extinction. From a climate perspective, all of their positive effect on carbon and their myriad other ecological roles as forest gardeners and engineers will be lost."
The study authors note that forest elephant conservation could reverse this loss.
"Elephants are a flagship species. People love elephants -- we spend millions every year on cuddly toys, they are zoo favourites and who didn't cry during Dumbo? and yet we're pushing them closer to extinction every day. On one hand we admire them and feel empathy and are horrified when they are murdered and on the other hand we're not prepared to do anything serious about it. The consequences may be severe for us all. We need to change our ways.
Read more at Science Daily