Some of you may have made a New Year's resolution to hit the gym to tackle that annoying belly fat. But have you ever wondered how physical activity produces this desired effect? A signaling molecule called interleukin-6 plays a critical role in this process, researchers report December 27 in the journal Cell Metabolism.
As expected, a 12-week intervention consisting of bicycle exercise decreased visceral abdominal fat in obese adults. But remarkably, this effect was abolished in participants who were also treated with tocilizumab, a drug that blocks interleukin-6 signaling and is currently approved for the treatment of rheumatoid arthritis. Moreover, tocilizumab treatment increased cholesterol levels regardless of physical activity.
"The take home for the general audience is 'do exercise,'" says first author Anne-Sophie Wedell-Neergaard of the University of Copenhagen. "We all know that exercise promotes better health, and now we also know that regular exercise training reduces abdominal fat mass and thereby potentially also the risk of developing cardio-metabolic diseases."
Abdominal fat is associated with an increased risk of not only cardio-metabolic disease, but also cancer, dementia, and all-cause mortality. Physical activity reduces visceral fat tissue, which surrounds internal organs in the abdominal cavity, but the underlying mechanisms have not been clear. Some researchers have proposed that a "fight-or-flight" hormone called epinephrine mediates this effect. But Wedell-Neergaard and co-senior study author Helga Ellingsgaard of the University of Copenhagen suspected that interleukin-6 could also play an important role because it regulates energy metabolism, stimulates the breakdown of fats in healthy people, and is released from skeletal muscle during exercise.
To test this idea, the researchers carried out a 12-week, single-center trial in which they randomly assigned abdominally obese adults to four groups. A total of 53 participants received intravenous infusions of either tocilizumab or saline as a placebo every four weeks, combined with no exercise or a bicycle routine consisting of several 45-minute sessions each week. The researchers used magnetic resonance imaging to assess visceral fat tissue mass at the beginning and end of the study.
In the placebo groups, exercise reduced visceral fat tissue mass by an average of 225 grams, or 8 percent, compared with no exercise. But tocilizumab treatment eliminated this effect. In the exercise groups, tocilizumab also increased visceral fat tissue mass by approximately 278 grams compared with placebo. In addition, tocilizumab increased total cholesterol and "bad" low-density-lipoprotein (LDL) cholesterol compared with placebo, in both the exercise and no-exercise groups. "To our knowledge, this is the first study to show that interleukin-6 has a physiological role in regulating visceral fat mass in humans," Wedell-Neergaard says.
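As a back-of-envelope check, the reported absolute and relative reductions together imply a baseline visceral fat mass of roughly 2.8 kg. A minimal sketch; the 225 g and 8 percent figures come from the article, while the derived baseline is an inference, not a number reported by the study:

```python
# Back-of-envelope arithmetic implied by the reported figures.
absolute_reduction_g = 225   # exercise-vs-no-exercise difference (placebo groups)
relative_reduction = 0.08    # the same difference expressed as a fraction

# Inferred baseline, not a value the study reports directly
implied_baseline_g = absolute_reduction_g / relative_reduction
print(round(implied_baseline_g))  # 2812, i.e. roughly 2.8 kg of visceral fat
```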
The authors note that the study was exploratory and not intended to evaluate a given treatment in a clinical setting. To complicate matters, interleukin-6 can have seemingly opposite effects on inflammation, depending on the context. For example, chronic low-grade elevations of interleukin-6 are seen in patients with severe obesity, type 2 diabetes, and cardiovascular disease. "The signaling pathways in immune cells versus muscle cells differ substantially, resulting in pro-inflammatory and anti-inflammatory actions, so interleukin-6 may act differently in healthy and diseased people," Wedell-Neergaard explains.
In future studies, the researchers will test the possibility that interleukin-6 affects whether fats or carbohydrates are used to generate energy under various conditions. They will also investigate whether more interleukin-6, potentially given as an injection, reduces visceral fat mass on its own. "We need a more in-depth understanding of this role of interleukin-6 in order to discuss its implications," Wedell-Neergaard says.
Read more at Science Daily
Dec 29, 2018
Rerouting nerves during amputation reduces phantom limb pain before it starts
Losing a limb due to trauma, cancer, or poor circulation can result in phantom limb and stump pain in upwards of 75 percent of amputees in the United States. Primary targeted muscle reinnervation (TMR), the rerouting of nerves cut during amputation into surrounding muscle, greatly reduces phantom limb and residual limb pain, as reported in recent publications by Dr. Ian Valerio, division chief of Burn, Wound and Trauma in Ohio State's Department of Plastic and Reconstructive Surgery, and Dr. J. Byers Bowen, a former resident who is now in private practice. Their latest work, featured in the January 2019 issue of Plastic and Reconstructive Surgery, describes how to perform this technique in below-the-knee amputations.
TMR was first developed to give amputees better control of upper limb prosthetics. Traditionally, doctors perform the surgery months or years after the initial amputation. When surgeons discovered that the procedure also relieves certain causes of pain, they began using it to treat disorganized nerve endings, called symptomatic neuromas, and phantom limb pain.
In this paper, Valerio and Bowen provide a detailed description of TMR in below-the-knee amputees and document the benefits of primary TMR for preventing pain.
"This paper provides a blueprint for improving patient outcomes and quality of life following amputation," said Dr. K. Craig Kent, dean of The Ohio State University College of Medicine.
Over the course of three years, the surgeons performed 22 TMR surgeries on below-the-knee amputees: 18 primary and four secondary. None of the patients developed symptomatic neuromas, and only 13 percent of those who received primary TMR reported pain six months later.
"A significant amount of pain in amputees is caused by disorganized nerve endings, i.e. symptomatic neuromas, in the residual limb. They form when nerves are severed and not addressed, thus they have nowhere to go," Valerio said. "Attaching those cut nerve endings to motor nerves in a nearby muscle allows the body to re-establish its neural circuitry. This alleviates phantom and residual limb pain by giving those severed nerves somewhere to go and something to do."
Valerio said patients who've had TMR significantly reduce or sometimes stop using narcotics and other nerve pain related medications, which can greatly improve their quality of life.
"TMR has been shown to reduce pain scores and multiple types of pain via a variety of validated pain surveys. These findings are the first to show that surgery can greatly reduce phantom and other types of limb pain directly," Valerio said.
Bowen added that upper extremity amputees are better able to use and control their prosthetics in addition to their improved pain outcomes. He said, "TMR allows for more individual muscle unit firings through the patient's thoughts. It provides for better intuitive control resulting in more refined functional movements and more degrees of motion by an advanced prosthetic."
The researchers believe primary TMR is a reliable technique to prevent the development of disorganized nerve endings and to reduce phantom and other limb pain in all types of amputations. When done at the time of initial amputation, there is minimal health risk and recovery is similar to that of traditional amputation surgery.
Read more at Science Daily
Sugar-sweetened beverage pattern linked to higher kidney disease risk
Higher collective consumption of sweetened fruit drinks, soda, and water was associated with a higher likelihood of developing chronic kidney disease (CKD) in a community-based study of African-American adults in Mississippi. The findings, which appear in an upcoming issue of the Clinical Journal of the American Society of Nephrology (CJASN), contribute to the growing body of evidence pointing to the negative health consequences of consuming sugar-sweetened beverages.
Certain beverages may affect kidney health, but study results have been inconsistent. To provide more clarity, Casey Rebholz PhD, MS, MNSP, MPH (Johns Hopkins Bloomberg School of Public Health) and her colleagues prospectively studied 3003 African-American men and women with normal kidney function who were enrolled in the Jackson Heart Study.
"There is a lack of comprehensive information on the health implications of the wide range of beverage options that are available in the food supply," said Dr. Rebholz. "In particular, there is limited information on which types of beverages and patterns of beverages are associated with kidney disease risk in particular."
For their study, the investigators assessed beverage intake through a food frequency questionnaire administered at the start of the study in 2000-04, and they followed participants until 2009-13.
Among the 3003 participants, 185 (6%) developed CKD over a median follow-up of 8 years. After adjustment for confounding factors, consuming a beverage pattern consisting of soda, sweetened fruit drinks, and water was associated with a higher risk of developing CKD. Participants in the top tertile for consumption of this beverage pattern were 61% more likely to develop CKD than those in the bottom tertile.
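As a hedged sketch of the arithmetic behind that tertile comparison; the bottom-tertile incidence below is invented for illustration, and only the 61% figure comes from the study:

```python
# Illustrative arithmetic (baseline incidence invented): what a tertile
# comparison like "61% more likely to develop CKD" means in absolute terms.
incidence_bottom = 0.060     # hypothetical CKD incidence in the bottom tertile
risk_ratio = 1.61            # top vs. bottom tertile, as reported

incidence_top = incidence_bottom * risk_ratio
print(f"implied top-tertile incidence: {incidence_top:.1%}")  # 9.7%
```

A relative risk of 1.61 can correspond to a small absolute difference when the baseline incidence is low, which is worth keeping in mind when reading headline percentages.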
The researchers were surprised to see that water was a component of this beverage pattern that was linked with a higher risk of CKD. They noted that study participants may have reported their consumption of a wide variety of types of water, including flavored and sweetened water. Unfortunately, the investigators did not collect information about specific brands or types of bottled water in the Jackson Heart Study.
In an accompanying editorial, Holly Kramer, MD, MPH and David Shoham, PhD (Loyola University Chicago) noted that the findings hold strong public health implications. "While a few select U.S. cities have successfully reduced SSB [sugar sweetened beverage] consumption via taxation, all other municipalities have resisted public health efforts to lower SSB consumption," they wrote. "This cultural resistance to reducing SSB consumption can be compared to the cultural resistance to smoking cessation during the 1960s after the Surgeon General report was released. During the 1960s, tobacco use was viewed as a social choice and not a medical or social public health problem."
Read more at Science Daily
Dec 28, 2018
Bacteria found in ancient Irish soil halts growth of superbugs: New hope for tackling antibiotic resistance
Antibiotic resistant superbugs could kill up to 1.3 million people in Europe by 2050, according to recent research.
The World Health Organisation (WHO) describes the problem as "one of the biggest threats to global health, food security, and development today."
The new strain of bacteria was discovered by a team based in Swansea University Medical School, made up of researchers from Wales, Brazil, Iraq and Northern Ireland.
They have named the new strain Streptomyces sp. myrophorea.
The soil they analysed originated from an area of Fermanagh, Northern Ireland, which is known as the Boho Highlands. It is an area of alkaline grassland and the soil is reputed to have healing properties.
The search for replacement antibiotics to combat multi-resistance has prompted researchers to explore new sources, including folk medicines: a field of study known as ethnopharmacology. They are also focusing on environments where well-known antibiotic producers like Streptomyces can be found.
One of the research team, Dr Gerry Quinn, a previous resident of Boho, County Fermanagh, had been aware of the healing traditions of the area for many years.
Traditionally a small amount of soil was wrapped up in cotton cloth and used to heal many ailments including toothache, throat and neck infections. Interestingly, this area was previously occupied by the Druids, around 1500 years ago, and Neolithic people 4000 years ago.
Read more at Science Daily
Your brain rewards you twice per meal: When you eat and when food reaches your stomach
We know a good meal can stimulate the release of the feel-good hormone dopamine, and now a study in humans from the Max Planck Institute for Metabolism Research in Germany suggests that dopamine is released in the brain at two different times: first when the food is ingested, and again once the food reaches the stomach. The work appears December 27 in the journal Cell Metabolism.
"With the help of a new positron emission tomography (PET) technique we developed, we were not only able to find the two peaks of dopamine release, but we could also identify the specific brain regions that were associated with these releases," says senior author Marc Tittgemeyer, head of the Institute's Translational Neurocircuitry Group. "While the first release occurred in brain regions associated with reward and sensory perception, the post-ingestive release involved additional regions related to higher cognitive functions."
In the study, 12 healthy volunteers received either a palatable milkshake or a tasteless solution while PET data were recorded. Interestingly, the subjects' craving or desire for the milkshake was proportionally linked to the amount of dopamine released in particular brain areas at the first tasting. But the higher the craving, the less dopamine was released in the delayed, post-ingestive phase.
"On one hand, dopamine release mirrors our subjective desire to consume a food item. On the other hand, our desire seems to suppress gut-induced dopamine release," says Heiko Backes, group leader for Multimodal Imaging of Brain Metabolism at the Institute, who is co-first author on the study with Sharmili Edwin Thanarajah.
Suppression of gut-induced release could potentially cause overeating of highly desired food items. "We continue to eat until sufficient dopamine was released," Backes says, though he adds that this hypothesis remains to be tested in further studies.
Earlier experiments have demonstrated gut-induced dopamine release in mice, but this is the first time it has been shown in humans.
Read more at Science Daily
European wheat lacks climate resilience
The climate is not only warming, it is also becoming more variable and extreme. Such unpredictable weather can weaken global food security if major crops such as wheat are not sufficiently resilient -- and if we are not properly prepared.
A group of European researchers, including Professor Jørgen E. Olesen from the Department of Agroecology at Aarhus University, has found that current breeding programmes and cultivar selection practices do not provide the needed resilience to climate change.
- The current breeding programmes and cultivar selection practices do not sufficiently prepare for climatic uncertainty and variability, the authors state in a paper recently published in PNAS (Proceedings of the National Academy of Sciences). Not only that -- the response diversity of wheat on farmers' fields in most European countries has worsened in the past five to fifteen years, depending on country.
Researchers predict that greater variability and extremeness of local weather conditions will lead to reduced yields in wheat and increased yield variability.
- Needless to say, decreased yields are not conducive to food security, but higher yield variability also poses problems. It can lead to a market with greater speculation and price volatility. This may threaten stable access to food by the poor, which in turn can enhance political instability and migration, Jørgen E. Olesen points out.
Decreasing variation in response diversity
The researchers base their assessments on thousands of yield observations of wheat cultivars in nine European countries, used to quantify how different cultivars respond to weather. They identified the variation in wheat response diversity on farmers' fields and demonstrated its relation to climate resilience.
The yield responses of all cultivars to different weather events were relatively similar within northern and central Europe, and within southern European countries -- the latter particularly with regard to durum wheat. There were serious gaps in wheat resilience across all of Europe, especially with regard to yield performance under abundant rain.
- The lack of response diversity can pose serious problems with regard to food security. Therefore, farmers, breeders, and dealers in seeds and grain need to pay more attention to the diversity of cultivars grown, warns Professor Jørgen E. Olesen.
Climate resilience is imperative
Wheat is an important staple food crop in Europe and is the leading source of plant protein in our diet globally, so it is important to ensure that we have climate-resilient wheat cultivars on hand.
Rain, drought, heat or cold at vulnerable times during the growing season can seriously damage yields. Wheat yield is generally sensitive to even a few days of exposure to waterlogging and to wet weather that favours disease. In addition, heat stress rather than drought sensitivity appears to be a limiting factor for adaptation of wheat to climate change in Europe.
The dominant approach of adapting crops to climate change by tailoring genotypes to the most likely long-term change is likely insufficient. The capacity of a single crop variety to maintain good yield performance under climatic variability and extremes is limited, but diversity in responses to critical weather events can effectively enhance climate resilience. Therefore, a set of cultivars with diverse responses to critical weather conditions is a prerequisite for promoting crop climate resilience.
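The portfolio logic behind that recommendation can be sketched with a toy simulation; all yields, responses, and noise levels here are invented for illustration, and nothing is drawn from the paper's data:

```python
import numpy as np

# Toy illustration: a mix of cultivars with opposite responses to a weather
# anomaly gives more stable total yield than either cultivar alone.
rng = np.random.default_rng(42)
weather = rng.normal(size=10_000)  # yearly weather anomaly (e.g. excess rain)

# Two hypothetical cultivars responding in opposite directions to the anomaly
yield_a = 8.0 - 0.8 * weather + rng.normal(scale=0.3, size=weather.size)
yield_b = 8.0 + 0.8 * weather + rng.normal(scale=0.3, size=weather.size)

mixture = 0.5 * (yield_a + yield_b)  # field planted half-and-half

print(f"single cultivar sd: {yield_a.std():.2f}")  # large year-to-year swings
print(f"mixture sd:         {mixture.std():.2f}")  # weather terms largely cancel
```

The mixture's weather sensitivity cancels out, leaving only the residual noise, which is the sense in which response diversity buffers yield against variable weather.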
Read more at Science Daily
Unravelling mystery of how, when DNA replicates
A team of Florida State University researchers has unlocked a decades-old mystery about how a critical cellular process is regulated and what that could mean for the future study of genetics.
In cells, DNA and its associated material replicate at regular intervals, a process essential to all living organisms. This contributes to everything from how the body responds to disease to hair color. DNA replication was identified in the late 1950s, but since then researchers across the globe have come up short trying to understand exactly how this process was regulated. Now they know.
David Gilbert, the J. Herbert Taylor Distinguished Professor of Molecular Biology, and doctoral student Jiao Sima published a paper today in the journal Cell that showed there are specific points along the DNA molecule that control replication.
"It's been quite a mystery," Gilbert said. "Replication seemed resilient to everything we tried to do to perturb it. We've described it in detail, shown it changes in different cell types and that it is disrupted in disease. But until now, we couldn't find that final piece, the control elements or the DNA sequences that control it."
Notably, Gilbert's professorship is in honor of a former Florida State professor named J. Herbert Taylor. Taylor demonstrated how different segments of chromosomes duplicate in the late 1950s and published more than 100 papers on chromosome structure and replication. Roughly 60 years later, Gilbert determined how replication was regulated.
Sima had been working with Gilbert in the lab and had tested close to a hundred genetic mutations to DNA molecules, hoping to see some sort of result that would better explain how the replication process worked. At a point of frustration, Gilbert said, they came up with a "Hail Mary" attempt.
Gilbert and Sima examined a single segment of the DNA in the highest possible 3D resolution and saw three sequences along the DNA molecule touching each other frequently. The researchers then used CRISPR, a sophisticated gene editing technology, to remove these three areas simultaneously.
And with that, they found that these three elements together were the key to DNA replication.
"Removing these elements shifted the segment's replication time from the very beginning to the very end of the process," Gilbert said. "This was one of those moments where just one result knocks your socks off."
In addition to the effect on replication timing, the removal of the three elements caused the 3D structure of the DNA molecule to change dramatically.
"We have for the first time pinpointed specific DNA sequences in the genome that regulate chromatin structure and replication timing," Sima said. "These results reflect one possible model of how DNA folds inside cells and how these folding patterns could impact the hereditary materials' function."
Greater understanding of how DNA replication is regulated opens new paths of research in genetics. When replication timing is altered -- as it was in Gilbert and Sima's experiment -- it can completely change how the genetic information of a cell is interpreted.
This could become crucial information as scientists tackle complicated diseases where the replication timing is disrupted.
Read more at Science Daily
Newborn insects trapped in amber show first evidence of how to crack an egg
Four complete Tragichrysa ovoruptora newborns preserved together with egg shell remains and one visible egg burster (right inset).
One of the earliest and toughest trials that all organisms face is birth. The new findings give scientists evidence of how tiny insects broke through the egg barrier and took their first steps into an ancient forest.
Trapped together inside 130 million-year-old Lebanese amber, or fossilised resin, researchers found several green lacewing newborn larvae, the split egg shells from which they hatched, and the minute structures the hatchlings used to crack the egg, known as egg bursters. The discovery is remarkable because no definitive evidence of these specialised structures had been reported from the fossil record of egg-laying animals, until now.
The fossil newborns have been described as the new species Tragichrysa ovoruptora, meaning 'egg breaking' and 'tragic green lacewing', after the fact that multiple specimens were ensnared and entombed in the resin simultaneously.
"Egg-laying animals such as many arthropods and vertebrates use egg bursters to break the egg surface during hatching; a famous example is the 'egg tooth' on the beak of newborn chicks," explains Dr Ricardo Pérez-de la Fuente, a researcher at Oxford University Museum of Natural History and lead author of the work. "Egg bursters are diverse in shape and location. Modern green lacewing hatchlings split the egg with a 'mask' bearing a jagged blade. Once used, this 'mask' is shed and left attached to the empty egg shell, which is exactly what we found in the amber together with the newborns."
Green lacewing larvae are small hunters which often carry debris as camouflage, and use sickle-shaped jaws to pierce and suck the fluids of their prey. Although the larvae trapped in amber differ significantly from modern-day relatives, in that they possess long tubes instead of clubs or bumps for holding debris, the studied egg shells and egg bursters are remarkably similar to those of today's green lacewings. Altogether, they provide the full picture of how these fossil insects hatched like their extant counterparts, about 130 million years ago during the Early Cretaceous.
"The process of hatching is ephemeral and the structures that make it possible tend to disappear quickly once egg-laying animals hatch, so obtaining fossil evidence of them is truly exceptional," remarks Dr Michael S. Engel, a co-author of the study from the University of Kansas.
The Tragichrysa ovoruptora larvae were almost certainly trapped by resin while clutching the eggs from which they had freshly emerged. Such behaviour is common among modern relatives while their body hardens and their predatory jaws become functional. The two mouthparts forming the jaws are not interlocked in most of the fossil larvae, which further suggests that they were recently born.
All the preparations studied were obtained from the same piece of amber and are as thin as a pinhead, which allowed a detailed examination of the fossils and the discovery of the tiny egg bursters, according to Dr Dany Azar, another co-author of the work, from the Lebanese University, who discovered and prepared the studied amber samples.
It would seem reasonable to assume that traits controlling a life event as crucial as hatching would have remained quite stable during evolution. However, as Dr Enrique Peñalver of the Spanish Geological Survey (IGME; Geomining Museum) and co-author of the work explains: "There are known instances in modern insects where closely related groups, even down to the species level, show different means of hatching that can entail the loss of egg bursters. So, the long-term stability of a hatching mechanism in a given animal lineage cannot be taken for granted."
Read more at Science Daily
Dec 27, 2018
Howler monkey study examines mechanisms of new species formation
A new University of Michigan study of interbreeding between two species of howler monkeys in Mexico is yielding insights into the forces that drive the evolution of new species.
How do new species emerge in nature? One common but overly simplified version of the story goes like this: A population of animals or plants becomes geographically isolated -- by a river that changes course or a mountain range that rises up, for example -- and the two separated groups accumulate genetic differences over time as they adapt to their environments in isolation.
Eventually, the DNA of the two groups is so different that the two populations are considered distinct species. Voilà, speciation has occurred.
In reality, the process is much more complex than that. While geographic isolation can start the speciation process, evolutionary biologists believe that other forces -- including various forms of natural selection -- can help to complete it.
The new U-M study provides rare empirical evidence that multiple forms of natural selection, including a contentious one called reinforcement, are helping to complete the speciation process in a natural howler monkey "hybrid zone," a place where the two species coexist and occasionally interbreed in a process called hybridization.
The study is scheduled for online publication Dec. 22 in the journal Molecular Ecology. In the paper, the researchers use the primate hybrid zone to identify parts of the genome that are likely to contain genes underlying speciation and to test for signals of the selection forces that shaped them.
"We observed patterns in the genetic data suggesting that hybridization is playing a direct role in completing the speciation process by enhancing genetic differences between species," said U-M doctoral candidate Marcella Baiz, the study's first author. The other authors are Liliana Cortés-Ortiz and Priscilla Tucker of the U-M Department of Ecology and Evolutionary Biology.
"We found a signal for multiple forms of natural selection driving species differences, including reinforcement, a process that has been highly debated," Baiz said. "This result is particularly notable because empirical evidence for reinforcement is extremely rare, especially genetic evidence."
The two species at the center of the study, mantled howler monkeys and black howler monkeys, diverged about 3 million years ago and lived apart until relatively recently when they came into contact again -- perhaps within the last 10,000 years -- in a roughly 12-mile-wide hybrid zone in the southeastern Mexican state of Tabasco.
A species was once defined as a group of actually or potentially interbreeding individuals that are reproductively isolated from other such groups. The concept of reproductive isolation is key to that definition and means that despite any hybridization, true species maintain their uniqueness.
However, the modern view of what a species is does not require full reproductive isolation, and hybridization has been discovered to be quite common in nature.
At the howler monkey hybrid zone in Mexico where U-M's Cortés-Ortiz and her colleagues have worked for about two decades, analysis of DNA samples has confirmed that black and mantled howler monkeys interbreed and produce hybrid offspring. The fact that hybridization is occurring between the two groups means that reproductive isolation is incomplete.
Evolutionary biologists believe that various natural selection pressures can help complete the process by strengthening barriers to gene flow between two groups, pushing them toward full reproductive isolation.
And because natural selection favors organisms that successfully reproduce over those that don't, it is biased against hybrids, which sometimes die before reproducing or are simply incapable of reproducing.
Natural selection tries to block the formation of these "unfit" hybrids. One way to do that is to gradually increase the genetic differences between two groups of organisms -- in this case black and mantled howler monkeys -- so that it's more difficult for them to mate and to produce hybrid offspring.
While working to thwart the formation of hybrids in this way, natural selection strengthens reproductive isolation by increasing genetic differences. This process is called reinforcement; while the idea has been around for more than a century, empirical evidence to support it is scarce.
To test for the presence of reinforcement, Baiz and her colleagues compared the DNA of black and mantled howler monkeys living in the Tabasco hybrid zone to the DNA of black and mantled howler monkeys living far from the hybrid zone.
If reinforcement is working to thwart hybridization and to strengthen reproductive isolation, then the genetic differences between the two species in the hybrid zone should be greater than the genetic differences between monkeys of these two species living on either side of the hybrid zone.
And that's exactly what Baiz and her colleagues found when they compared genetic markers that are at or near genes likely associated with reproductive isolation.
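The comparison described above can be sketched numerically. The minimal illustration below uses Wright's Fst as the measure of genetic differentiation; the allele frequencies are invented for the example and are not values from the study.

```python
def wright_fst(p1, p2):
    """Wright's Fst for one biallelic marker in two populations.

    Fst = (Ht - Hs) / Ht, where Ht is the expected heterozygosity of the
    pooled population and Hs is the mean within-population heterozygosity.
    """
    p_bar = (p1 + p2) / 2
    h_t = 2 * p_bar * (1 - p_bar)
    h_s = (2 * p1 * (1 - p1) + 2 * p2 * (1 - p2)) / 2
    return (h_t - h_s) / h_t

# Hypothetical allele frequencies at a marker near a reproductive-isolation gene:
inside_zone = wright_fst(0.95, 0.10)   # black vs. mantled, inside the hybrid zone
outside_zone = wright_fst(0.80, 0.25)  # same species pair, far from the zone

# The reinforcement signature: greater differentiation in sympatry than allopatry.
print(inside_zone > outside_zone)
```

Under these illustrative frequencies the marker is more strongly differentiated inside the hybrid zone, which is the qualitative pattern the researchers report.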
Read more at Science Daily
NASA telescopes take a close look at the brightest comet of 2018
NASA's Hubble Space Telescope photographed comet 46P/Wirtanen on Dec. 13, when the comet was 7.4 million miles (12 million kilometers) from Earth. In this visible light image, the comet's nucleus is hidden in the center of a fuzzy glow from the comet's coma. The coma is a cloud of gas and dust that the comet has ejected during its pass through the inner solar system due to heating from the Sun. To make this composite image, the color blue was applied to high-resolution grayscale exposures acquired from the spacecraft's WFC3 instrument.
The inner part of a comet's coma is normally not accessible from Earth. The close fly-by of comet 46P/Wirtanen allowed astronomers to study it in detail. They combined the unique capabilities of Hubble, NASA's Chandra X-ray Observatory, and the Neil Gehrels Swift Observatory to study how gases are released from the nucleus, what the comet's ices are composed of, and how gas in the coma is chemically altered by sunlight and solar radiation.
NASA's Stratospheric Observatory for Infrared Astronomy, SOFIA, took this image of the comet on Dec. 16 and 17 when the aircraft was flying at 40,000 feet.
Comets and asteroids may be the source of Earth's water. SOFIA is studying the chemical fingerprints of different types of hydrogen in the comet's water, which will help us learn about the origins and history of water in the solar system -- including Earth's oceans.
The SOFIA image was taken with the telescope's visible light guide camera, using an orange filter to indicate the intensity of light relative to other objects. SOFIA's observations using infrared light to study the comet's water are now under analysis.
Comet 46P/Wirtanen made its closest approach to Earth on Dec. 16, when it passed just over 7 million miles (11 million kilometers) from our planet, about 30 times farther away than the Moon. Although its close approach is valuable for making science observations from Earth, and it is the brightest comet of 2018, 46P/Wirtanen is only barely visible to the unaided eye even where the sky is very dark. It is best viewed through binoculars or a telescope.
Backyard observers can currently find the comet near the constellation Taurus, though moonlight adds a challenge, and it will continue to be viewable in the weeks to come. Finder charts and other information are available at the Comet Wirtanen Observing Campaign website.
Comet 46P/Wirtanen orbits the Sun once every 5.4 years, much quicker than the 75-year orbit of the more famous Comet Halley. Most of its passes through the inner solar system are much farther from Earth, making this year's display particularly notable.
The Hubble Space Telescope is a project of international cooperation between NASA and ESA (European Space Agency). NASA's Goddard Space Flight Center in Greenbelt, Maryland, manages the telescope. The Space Telescope Science Institute (STScI) in Baltimore, Maryland, conducts Hubble science operations. STScI is operated for NASA by the Association of Universities for Research in Astronomy, in Washington, D.C.
Read more at Science Daily
Holiday asteroid imaged with NASA radar
Asteroid 2003 SD220 will fly safely past Earth on Saturday, Dec. 22, at a distance of about 1.8 million miles (2.9 million kilometers). This will be its closest approach in more than 400 years and the closest until 2070, when it will pass Earth slightly closer.
The radar images reveal an asteroid with a length of at least one mile (1.6 kilometers) and a shape similar to that of the exposed portion of a hippopotamus wading in a river. They were obtained Dec. 15-17 by coordinating the observations with NASA's 230-foot (70-meter) antenna at the Goldstone Deep Space Communications Complex in California, the National Science Foundation's 330-foot (100-meter) Green Bank Telescope in West Virginia and the Arecibo Observatory's 1,000-foot (305-meter) antenna in Puerto Rico.
The Green Bank Telescope was the receiver for the powerful microwave signals transmitted by either Goldstone or the NASA-funded Arecibo planetary radar in what is known as a "bistatic radar configuration." Using one telescope to transmit and another to receive can yield considerably more detail than would one telescope, and it is an invaluable technique to obtain radar images of closely approaching, slowly rotating asteroids like this one.
"The radar images achieve an unprecedented level of detail and are comparable to those obtained from a spacecraft flyby," said Lance Benner of the Jet Propulsion Laboratory in Pasadena, California, and the scientist leading the observations from Goldstone. "The most conspicuous surface feature is a prominent ridge that appears to wrap partway around the asteroid near one end. The ridge extends about 330 feet [100 meters] above the surrounding terrain. Numerous small bright spots are visible in the data and may be reflections from boulders. The images also show a cluster of dark, circular features near the right edge that may be craters."
The images confirm what was seen in earlier "light curve" measurements of sunlight reflected from the asteroid and from earlier radar images by Arecibo: 2003 SD220 has an extremely slow rotation period of roughly 12 days. It also has what seems to be a complex rotation somewhat analogous to a poorly thrown football. Known as "non-principal axis" rotation, it is uncommon among near-Earth asteroids, most of which spin about their shortest axis.
With resolutions as fine as 12 feet (3.7 meters) per pixel, the detail of these images is 20 times finer than that obtained during the asteroid's previous close approach to Earth three years ago, which was at a greater distance. The new radar data will provide important constraints on the density distribution of the asteroid's interior -- information that is available on very few near-Earth asteroids.
"This year, with our knowledge about 2003 SD220's slow rotation, we were able to plan out a great sequence of radar images using the largest single-dish radio telescopes in the nation," said Patrick Taylor, senior scientist with Universities Space Research Association (USRA) at the Lunar and Planetary Institute (LPI) in Houston.
"The new details we've uncovered, all the way down to 2003 SD220's geology, will let us reconstruct its shape and rotation state, as was done with Bennu, target of the OSIRIS-REx mission," said Edgard Rivera-Valentín, USRA scientist at LPI. "Detailed shape reconstruction lets us better understand how these small bodies formed and evolved over time."
Patrick Taylor led the bistatic radar observations with Green Bank Observatory, home of the Green Bank Telescope, the world's largest fully steerable radio telescope. Rivera-Valentín will be leading the shape reconstruction of 2003 SD220 and led the Arecibo Observatory observations.
Asteroid 2003 SD220 was discovered on Sept. 29, 2003, by astronomers at the Lowell Observatory Near-Earth-Object Search (LONEOS) in Flagstaff, Arizona -- an early Near-Earth Object (NEO) survey project supported by NASA that is no longer in operation. It is classified as being a "potentially hazardous asteroid" because of its size and close approaches to Earth's orbit. However, these radar measurements further refine the understanding of 2003 SD220's orbit, confirming that it does not pose a future impact threat to Earth.
Read more at Science Daily
The coolest experiment in the universe
The International Space Station, shown here in 2018, is home to many scientific experiments, including NASA's Cold Atom Laboratory.
The Cold Atom Lab (CAL) is the first facility in orbit to produce clouds of "ultracold" atoms, which can reach a fraction of a degree above absolute zero: -459ºF (-273ºC), the absolute coldest temperature that matter can reach. Nothing in nature is known to hit the temperatures achieved in laboratories like CAL, which means the orbiting facility is regularly the coldest known spot in the universe.
NASA's Cold Atom Laboratory on the International Space Station is regularly the coldest known spot in the universe. But why are scientists producing clouds of atoms a fraction of a degree above absolute zero? And why do they need to do it in space? Quantum physics, of course.
Seven months after its May 21, 2018, launch to the space station from NASA's Wallops Flight Facility in Virginia, CAL is producing ultracold atoms daily. Five teams of scientists will carry out experiments on CAL during its first year, and three experiments are already underway.
Why cool atoms to such an extreme low? Room-temperature atoms typically zip around like hyperactive hummingbirds, but ultracold atoms move much slower than even a snail. Specifics vary, but ultracold atoms can be more than 200,000 times slower than room-temperature atoms. This opens up new ways to study atoms as well as new ways to use them for investigations of other physical phenomena. CAL's primary science objective is to conduct fundamental physics research -- to try to understand the workings of nature at the most fundamental levels.
"With CAL we're starting to get a really thorough understanding of how the atoms behave in microgravity, how to manipulate them, how the system is different than the ones we use on Earth," said Rob Thompson, a cold atom physicist at NASA's Jet Propulsion Laboratory in Pasadena, California, and the mission scientist for CAL. "This is all knowledge that is going to build a foundation for what I hope is a long future of cold atom science in space."
Laboratories on Earth can produce ultracold atoms, but on the ground, gravity pulls on the chilled atom clouds and they fall quickly, giving scientists only fractions of a second to observe them. Magnetic fields can be used to "trap" the atoms and hold them still, but that restricts their natural movement. In microgravity, the cold atom clouds float for much longer, giving scientists an extended view of their behavior.
The process to create the cold atom clouds starts with lasers that begin to lower the temperature by slowing the atoms down. Radio waves cut away the warmest members of the group, further lowering the average temperature. Finally, the atoms are released from a magnetic trap and allowed to expand. This causes a drop in pressure that, in turn, naturally causes another drop in the cloud's temperature (the same phenomenon that causes a can of compressed air to feel cold after use). In space, the cloud has longer to expand and thus reach even lower temperatures than what can be achieved on Earth -- down to about one ten billionth of a degree above absolute zero, perhaps even lower.
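The effect of that cooling on atomic motion can be put in rough numbers. The back-of-the-envelope sketch below (not CAL's actual figures) uses the ideal-gas relation v_rms = sqrt(3kT/m) for rubidium-87, a species commonly used in cold-atom experiments:

```python
import math

K_B = 1.380649e-23             # Boltzmann constant, J/K
M_RB87 = 86.909 * 1.66054e-27  # mass of a rubidium-87 atom, kg

def v_rms(temp_kelvin):
    """Root-mean-square thermal speed of an ideal-gas atom: sqrt(3kT/m)."""
    return math.sqrt(3 * K_B * temp_kelvin / M_RB87)

room = v_rms(300.0)   # roughly 290 m/s at room temperature
cold = v_rms(1e-10)   # at about one ten-billionth of a kelvin
print(f"{room:.0f} m/s vs {cold * 1e3:.2f} mm/s")
```

Because speed scales as the square root of temperature, dropping ten billion-fold in temperature slows the atoms by a factor of roughly a hundred thousand, from hundreds of meters per second to a fraction of a millimeter per second.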
Ultracold atom facilities on Earth typically occupy an entire room, and in most, the hardware is left exposed so that scientists can adjust the apparatus if need be. Building a cold atom laboratory for space posed several design challenges, some of which change the fundamental nature of these facilities. First, there was the matter of size: CAL flew to the station in two pieces -- a metal box a little larger than a minifridge and a second one about the size of a carry-on suitcase. Second, CAL was designed to be operated remotely from Earth, so it was built as a fully enclosed facility.
CAL also features a number of technologies that have never been flown in space before, such as specialized vacuum cells that contain the atoms, which have to be sealed so tightly that almost no stray atoms can leak in. The lab needed to be able to withstand the shaking of launch and extreme forces experienced during the flight to the space station. It took the teams several years to develop unique hardware that could meet the precise needs for cooling atoms in space.
"Several parts of the system required redesigning, and some parts broke in ways we'd never seen before," said Robert Shotwell, chief engineer for JPL's Astronomy, Physics and Space Technology Directorate and CAL project manager. "The facility had to be completely torn apart and reassembled three times."
All the hard work and problem solving since the mission's inception in 2012 turned the CAL team's vision into reality this past May. CAL team members talked via live video with astronauts Ricky Arnold and Drew Feustel aboard the International Space Station for the installation of the Cold Atom Laboratory, the second ultracold atom facility ever operated in space, the first to reach Earth orbit and the first to remain in space for more than a few minutes. Along the way, CAL has also met the minimum requirements NASA set to deem the mission a success and is providing a unique tool for probing nature's mysteries.
Read more at Science Daily
Dec 24, 2018
Merry Christmas and happy holidays
From A Magical Journey, I wish you all a merry Christmas and happy holidays.
Spend some time with your loved ones and think about others who are less fortunate. Help a homeless person or donate money to charity or science; there are so many things you can do to spread some human love.
Danny from A Magical Journey
Dec 23, 2018
Human mortality 'plateau' may be statistical error, not hint of immortality
Human error, not human biology, largely accounts for the apparent decline of mortality among the very old, according to a new report publishing on December 20 in the open-access journal PLOS Biology by Saul Newman of the Australian National University in Canberra. The result casts doubt on the hypothesis that human longevity can be greatly extended beyond current limits.
As we age through adulthood, the probability of dying increases year after year. But studies in multiple species, including humans, have suggested that, at the far end of the lifespan, the rate of increase slows, or even plateaus. Biological explanations for such late-life mortality deceleration have been developed, but are controversial, and a role for statistical error has also been proposed.
In the new report, Newman shows that a variety of errors, individually and combined, have the effect of producing a slowing of apparent mortality at the end of the lifespan, and can largely explain away the observed trends. Categories of error include those in demographic sampling, birth and death records, age reporting, and others.
For instance, random errors in reporting of age within a population will result in some younger individuals being mistakenly recorded as older, and vice versa. As this population ages, older individuals mistakenly recorded as younger will die earlier than expected, but those mistakenly recorded as older will die later, enriching the pool of very old individuals and flattening the mortality curve.
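The mechanism described above can be demonstrated with a small numerical sketch. Assuming Gompertz mortality (a hazard that grows exponentially with age) and a symmetric age-recording error, the hazard inferred from recorded ages falls increasingly below the true hazard at extreme ages. The parameters below are invented, and the error rate is exaggerated well beyond Newman's one-in-ten-thousand figure so the effect is visible at a glance.

```python
import math

B, C = 5e-5, 0.10  # hypothetical Gompertz parameters: hazard = B * exp(C * age)
EPS = 0.01         # fraction of records with the age off by +/- SHIFT years
SHIFT = 10

def hazard(a):
    return B * math.exp(C * a)

def survival(a):
    # Gompertz survival function: exp(-(B/C) * (exp(C*a) - 1))
    return math.exp(-(B / C) * (math.exp(C * a) - 1.0))

def density(a):
    return hazard(a) * survival(a)

def apparent_hazard(a):
    """Hazard inferred from *recorded* ages: a mix of correct and shifted records."""
    f = (1 - EPS) * density(a) + (EPS / 2) * (density(a - SHIFT) + density(a + SHIFT))
    s = (1 - EPS) * survival(a) + (EPS / 2) * (survival(a - SHIFT) + survival(a + SHIFT))
    return f / s

for age in (80, 90, 100, 110):
    # The ratio shrinks with age: the recorded-age hazard flattens relative to truth.
    print(age, apparent_hazard(age) / hazard(age))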
Newman found that an error rate of as low as one in ten thousand would be sufficient to produce the observed declines in apparent age-related mortality. Furthermore, he was able to show that an improvement in data quality in large population studies corresponded with a reduction in late-life mortality deceleration.
"These findings suggest that human late-life mortality plateaus are largely, if not entirely, artefacts of error processes," Newman concludes. The finding has important consequences for understanding human longevity, since predictions that lifespan can be greatly increased have depended in part on the apparent decelerations and plateaus previously reported in the biological and demographic literature.
In a separate short paper, Newman asked whether such errors might even explain away the late-life mortality plateau reported in a recent high-profile paper published earlier this year in the journal Science by Elisabetta Barbi, Kenneth Wachter, and colleagues. That paper used a high-quality dataset of nearly 4,000 death records from Italy to show that death rates decelerate after the age of 80 and plateau after 105. Newman calculates that this apparent effect could still be explained by plausible error rates in record-keeping. In a response, Wachter defends the quality of the dataset and describes Newman's proposed error rate as "wildly implausibly high."
Read more at Science Daily
Bees can count with small number of nerve cells in their brains, research suggests
A bumblebee choosing between two patterns containing different numbers of yellow circles.
In order to understand how bees count, the researchers simulated a very simple miniature 'brain' on a computer with just four nerve cells -- far fewer than a real bee has.
The 'brain' could easily count small quantities of items when inspecting one item closely, then the next, and so on, which is the same way bees count. This differs from humans, who glance at all the items and count them together.
In this study, published in the journal iScience, the researchers propose that this clever behaviour makes the complex task of counting much easier, allowing bees to display impressive cognitive abilities with minimal brainpower.
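The key idea, that sequential close-up inspection turns counting into simple accumulation, can be sketched in a few lines. This is a conceptual toy and not the authors' four-neuron model; the function name and the 0/1 pattern encoding are invented here:

```python
# A toy sketch of counting-by-scanning, NOT the published four-neuron
# model: the unit names and dynamics below are invented for illustration.

def count_by_scanning(pattern):
    """Inspect one location at a time; a detector unit fires whenever the
    current fixation contains an item, and an accumulator unit integrates
    those spikes. Assumes the scan visits each item exactly once.
    `pattern` is a list of 0/1 fixations: 1 = an item in view."""
    accumulator = 0
    for signal in pattern:            # sequential, close-up inspection
        spike = 1 if signal else 0    # detector response at this fixation
        accumulator += spike          # running count
    return accumulator

# Choosing between two patterns, as in the bumblebee task:
a = [0, 1, 0, 1, 1, 0]   # three items among empty locations
b = [1, 0, 0, 0, 1, 0]   # two items
print(count_by_scanning(a), count_by_scanning(b))  # prints: 3 2
print("more items:", "a" if count_by_scanning(a) > count_by_scanning(b) else "b")
```

Because each fixation contributes at most one spike, the circuit never has to represent all the items at once, which is why so little neural hardware suffices.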
Previous studies have shown bees can count up to four or five items, can choose the smaller or the larger number from a group and even choose 'zero' against other numbers when trained to choose 'less'.
They might have achieved this not by understanding numerical concepts, but by using specific flight movements to closely inspect items, which then shapes their visual input and simplifies the task to the point where it requires minimal brainpower.
This finding demonstrates that the intelligence of bees, and potentially other animals, can be mediated by very small numbers of nerve cells, as long as these are wired together in the right way.
The study could also have implications for artificial intelligence because efficient autonomous robots will need to rely on robust, computationally inexpensive algorithms, and could benefit from employing insect-inspired scanning behaviours.
Lead author Dr Vera Vasas, from Queen Mary University of London, said: "Our model shows that even though counting is generally thought to require high intelligence and large brains, it can be easily done with the smallest of nerve cell circuits connected in the right manner. We suggest that using specific flight movements to scan targets, rather than numerical concepts, explains the bees' ability to count. This scanning streamlines the visual input and means a task like counting requires little brainpower.
"Careful examination of the actual inspection strategies used by animals might reveal that they often employ active scanning behaviours as shortcuts to simplify complex visual pattern discrimination tasks. Hopefully, our work will inspire others to look more closely not just at what cognitive tasks animals can solve, but also at how they are solving them."
Brain size matters a lot when it comes to bees. They have only one million nerve cells in total, so they have precious little brainpower, and must implement very efficient computational algorithms to solve tasks. In comparison, humans have 86 billion nerve cells which are responsible for receiving information and sending commands.
To model the input to the brain, the authors analysed the point of view of a bee as it flies close to the countable objects and inspects them one-by-one.
The results showed the simulated brain was able to make reliable estimates of the number of items on display when provided with the actual visual input that a bee receives while carrying out the task.
Read more at Science Daily