Jan 5, 2019

Stopping cancer from recruiting immune system double agents

Cancerous tumors trick myeloid cells, an important part of the immune system, into perceiving them as a damaged part of the body; the tumors actually put myeloid cells to work helping them grow and metastasize (spread). A research team co-led by scientists at Rush University Medical Center has discovered a potential therapy that can disrupt this recruitment and abnormal function of myeloid cells in laboratory mice. The findings of their latest study were published on Dec. 19 in Nature Communications.

Myeloid cells are a type of white blood cell that kills invaders like bacteria and cancer. "In the cancer context, myeloid cells promote tumor growth and suppress the activity of T cells," says Vineet Gupta, PhD, professor and vice chairperson for research and innovation in the Department of Internal Medicine at Rush Medical College.

Gupta and Judith Varner, PhD, of the Moores Cancer Center at the University of California at San Diego (UCSD) in California, were co-senior corresponding authors on the study. Samia Khan, PhD, a post-doctoral fellow in Gupta's lab; and Michael Schmid, PhD, and Megan Kaneda, PhD, with UCSD, were co-first authors.

Cancer recruits harmful myeloid cells while suppressing helpful ones

Gupta's research focuses in general on integrins, a family of proteins that act as cellular receptors and regulate a number of biological processes. In this study, investigators looked at the integrin CD11b, which is present on myeloid cells and normally supports myeloid cell migration and the cells' ability to fight disease.

Here, the researchers found that CD11b promotes the development of myeloid cells into one sub-type, the M1 macrophage, which functions appropriately to suppress tumor growth. However, tumors often suppress CD11b activity, which results in the myeloid cells developing into a different cell type, the M2 macrophage. These cells actually ward off T cells, which are vital to fighting disease, and M2s also secrete growth factors and promote the development of new blood vessels that allow cancer to grow and metastasize.

In previous research, agents developed to activate T cells have been "extremely effective in controlling tumor growth," Gupta says, but that approach, known as immunotherapy, has not resulted in a universal treatment for cancer. The hunt goes on.

Study found that CD11b is critical in regulating myeloid cells

In this study, the team explored how modifying the activity of CD11b affects myeloid cell behavior in the presence of cancer, and whether that could be used as a novel strategy to treat cancers. Using a small molecule discovered in the Gupta laboratory, Leukadherin-1 (LA-1), which activates CD11b in the body, the researchers developed a therapy that boosts the function of CD11b to promote the disease-fighting M1 type of myeloid cells, helping create a microenvironment at the tumor site where T cells can enter and attack the cancer.

The study used two types of genetically altered mice. One set of experiments was done with otherwise normal mice that lacked CD11b. Transplanted tumors grew much larger in those mice as compared to the tumors in wild-type (normal) mice, suggesting that CD11b restrains tumor growth.

Exploring further the reason for this difference, the team found that CD11b plays a critical role in regulating the polarization of myeloid cells into M1 or M2 macrophages. In the absence of CD11b, most of the myeloid cells in tumors were of the M2 sub-type, which helps the tumor grow and spread.

Boosting CD11b activity helped reduce tumor growth

In a different experiment, the team used LA-1 to boost CD11b activity beyond its normal levels in wild-type (normal) mice, and discovered that this increase caused a significant reduction in tumor growth in treated animals. Then, to be sure that the effect of their pharmacological intervention was directly due to LA-1 acting on CD11b, they created a genetically modified mouse with a "point mutation" (a change at a single residue in the CD11b protein sequence) that keeps CD11b active all the time, which it usually isn't.

"The boost in CD11b activity in the mouse with the point mutation mimics the one imparted on CD11b in normal mice with administration of LA-1," Gupta says. "The results were the same," Gupta says. In both cases, the tumors shrank dramatically, suggesting CD11b activation as an novel target for cancer immunotherapy.

In this study, LA-1 showed a great deal of promise, although Gupta says it will be years before a treatment based on this molecule becomes available to patients. The results are very encouraging and will continue to motivate the team to move this novel approach toward treatments for patients.

Read more at Science Daily

One in 10 adults in US has food allergy, but nearly 1 in 5 think they do

Over 10 percent of adults in the U.S. -- over 26 million -- are estimated to have food allergy, according to a study published in JAMA Network Open that was led by Ruchi Gupta, MD, MPH, from Ann & Robert H. Lurie Children's Hospital of Chicago and Northwestern University. However, researchers found that 19 percent of adults think they are currently food allergic, although their reported symptoms are inconsistent with a true food allergy, which can trigger a life-threatening reaction. Results are based on a nationally representative survey of over 40,000 adults.

"While we found that one in 10 adults have food allergy, nearly twice as many adults think that they are allergic to foods, while their symptoms may suggest food intolerance or other food related conditions," says lead author Ruchi Gupta, MD, MPH, from Lurie Children's, who also is a Professor of Pediatrics at Northwestern University Feinberg School of Medicine. "It is important to see a physician for appropriate testing and diagnosis before completely eliminating foods from the diet. If food allergy is confirmed, understanding the management is also critical, including recognizing symptoms of anaphylaxis and how and when to use epinephrine."

Researchers discovered that only half of adults with convincing food allergy had a physician-confirmed diagnosis, and less than 25 percent reported a current epinephrine prescription.

Researchers also found that nearly half of food-allergic adults developed at least one of their food allergies as an adult.

"We were surprised to find that adult-onset food allergies were so common," says Dr. Gupta. "More research is needed to understand why this is occurring and how we might prevent it."

The study data indicate that the most prevalent food allergens among U.S. adults are shellfish (affecting 7.2 million adults), milk (4.7 million), peanut (4.5 million), tree nut (3 million), fin fish (2.2 million), egg (2 million), wheat (2 million), soy (1.5 million), and sesame (0.5 million).

Read more at Science Daily

Jan 4, 2019

Dark matter on the move

Scientists have found evidence that dark matter can be heated up and moved around, as a result of star formation in galaxies. The findings provide the first observational evidence for the effect known as 'dark matter heating', and give new clues as to what makes up dark matter. The research is published today in the journal Monthly Notices of the Royal Astronomical Society.

In the new work, scientists from the University of Surrey, Carnegie Mellon University and ETH Zürich set out to hunt for evidence for dark matter at the centres of nearby dwarf galaxies. Dwarf galaxies are small, faint galaxies that are typically found orbiting larger galaxies like our own Milky Way. They may hold clues that could help us to better understand the nature of dark matter.

Dark matter is thought to make up most of the mass of the universe. However, since it doesn't interact with light in the same way as normal matter, it can only be observed through its gravitational effects. The key to studying it may, however, lie in how stars are formed in these galaxies.

When stars form, strong winds can push gas and dust away from the heart of the galaxy. As a result, the galaxy's centre has less mass, which affects how much gravity is felt by the remaining dark matter. With less gravitational attraction, the dark matter gains energy and migrates away from the centre, an effect called 'dark matter heating'.
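A simplified, textbook-style way to see why removing central mass pushes dark matter outward (this is an illustration, not the calculation in the paper): a dark matter particle on a near-circular orbit of radius r satisfies

```latex
v_c^2(r) = \frac{G\,M(<r)}{r}
```

where M(<r) is the total mass enclosed within that radius. If star-formation-driven winds rapidly remove gas and reduce M(<r), the particle is suddenly moving faster than the new, smaller circular speed, so it drifts outward and the dark matter density at the centre falls.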

The team of astrophysicists measured the amount of dark matter at the centres of 16 dwarf galaxies with very different star formation histories. They found that galaxies that stopped forming stars long ago had higher dark matter densities at their centres than those that are still forming stars today. This supports the theory that the older galaxies had less dark matter heating.

Professor Justin Read, lead author of the study and Head of the Department of Physics at the University of Surrey, said: "We found a truly remarkable relationship between the amount of dark matter at the centres of these tiny dwarfs, and the amount of star formation they have experienced over their lives. The dark matter at the centres of the star-forming dwarfs appears to have been 'heated up' and pushed out."

The findings provide a new constraint on dark matter models: dark matter must be able to form dwarf galaxies that exhibit a range of central densities, and those densities must relate to the amount of star formation.

Professor Matthew Walker, a co-author from Carnegie Mellon University, added: "This study may be the 'smoking gun' evidence that takes us a step closer to understanding what dark matter is. Our finding that it can be heated up and moved around helps to motivate searches for a dark matter particle."

Read more at Science Daily

The long memory of the Pacific Ocean

Cold waters that sank in polar regions hundreds of years ago during the Little Ice Age are still impacting deep Pacific Ocean temperature trends. While the deep Pacific temperature trends are small, they represent a large amount of energy in the Earth system.
The ocean has a long memory. When the water in today's deep Pacific Ocean last saw sunlight, Charlemagne was the Holy Roman Emperor, the Song Dynasty ruled China and Oxford University had just held its very first class. During that time, between the 9th and 12th centuries, the earth's climate was generally warmer before the cold of the Little Ice Age settled in around the 16th century. Now ocean surface temperatures are back on the rise but the question is, do the deepest parts of the ocean know that?

Researchers from the Woods Hole Oceanographic Institution (WHOI) and Harvard University have found that the deep Pacific Ocean lags a few centuries behind in terms of temperature and is still adjusting to the entry into the Little Ice Age. Whereas most of the ocean is responding to modern warming, the deep Pacific may be cooling.

"These waters are so old and haven't been near the surface in so long, they still 'remember' what was going on hundreds of years ago when Europe experienced some of its coldest winters in history," said Jake Gebbie, a physical oceanographer at WHOI and lead author of the study published Jan. 4, 2019, in the journal Science.

"Climate varies across all timescales," adds Peter Huybers, Professor of Earth and Planetary Sciences at Harvard University and co-author of the paper. "Some regional warming and cooling patterns, like the Little Ice Age and the Medieval Warm Period, are well known. Our goal was to develop a model of how the interior properties of the ocean respond to changes in surface climate."

What that model showed was surprising.

"If the surface ocean was generally cooling for the better part of the last millennium, those parts of the ocean most isolated from modern warming may still be cooling," said Gebbie.

The model is, of course, a simplification of the actual ocean. To test the prediction, Gebbie and Huybers compared the cooling trend found in the model to ocean temperature measurements taken by scientists aboard the HMS Challenger in the 1870s and modern observations from the World Ocean Circulation Experiment of the 1990s.

The HMS Challenger, a three-masted wooden sailing ship originally designed as a British warship, was used for the first modern scientific expedition to explore the world's ocean and seafloor. During the expedition from 1872 to 1876, thermometers were lowered into the ocean depths and more than 5,000 temperature measurements were logged.

"We screened this historical data for outliers and considered a variety of corrections associated with pressure effects on the thermometer and stretching of the hemp rope used for lowering thermometers," said Huybers.

The researchers then compared the HMS Challenger data to the modern observations and found warming in most parts of the global ocean, as would be expected due to the warming planet over the 20th Century, but cooling in the deep Pacific at a depth of around two kilometers.

"The close correspondence between the predictions and observed trends gave us confidence that this is a real phenomenon," said Gebbie.

These findings imply that variations in surface climate that predate the onset of modern warming still influence how much the climate is heating up today. Previous estimates of how much heat the Earth had absorbed during the last century assumed an ocean that started out in equilibrium at the beginning of the Industrial Revolution. But Gebbie and Huybers estimate that the deep Pacific cooling trend leads to a downward revision of heat absorbed over the 20th century by about 30 percent.

Read more at Science Daily

Next up: Ultracold simulators of super-dense stars

Rice University graduate student Tom Langin makes an adjustment to an experiment that uses 10 lasers of varying wavelengths to laser-cool ions in a neutral plasma.
Rice University physicists have created the world's first laser-cooled neutral plasma, completing a 20-year quest that sets the stage for simulators that re-create exotic states of matter found inside Jupiter and white dwarf stars.

The findings are detailed this week in the journal Science and involve new techniques for laser cooling clouds of rapidly expanding plasma to temperatures about 50 times colder than deep space.

"We don't know the practical payoff yet, but every time physicists have laser cooled a new kind of thing, it has opened a whole world of possibilities," said lead scientist Tom Killian, professor of physics and astronomy at Rice. "Nobody predicted that laser cooling atoms and ions would lead to the world's most accurate clocks or breakthroughs in quantum computing. We do this because it's a frontier."

Killian and graduate students Tom Langin and Grant Gorman used 10 lasers of varying wavelengths to create and cool the neutral plasma. They started by vaporizing strontium metal and using one set of intersecting laser beams to trap and cool a puff of strontium atoms about the size of a child's fingertip. Next, they ionized the ultracold gas with a 10-nanosecond blast from a pulsed laser. By stripping one electron from each atom, the pulse converted the gas to a plasma of ions and electrons.

Energy from the ionizing blast causes the newly formed plasma to expand rapidly and dissipate in less than one thousandth of a second. This week's key finding is that the expanding ions can be cooled with another set of lasers after the plasma is created. Killian, Langin and Gorman describe their techniques in the new paper, clearing the way for their lab and others to make even colder plasmas that behave in strange, unexplained ways.

Plasma is an electrically conductive mix of electrons and ions. It is one of four fundamental states of matter; but unlike solids, liquids and gases, which are familiar in daily life, plasmas tend to occur in very hot places like the surface of the sun or a lightning bolt. By studying ultracold plasmas, Killian's team hopes to answer fundamental questions about how matter behaves under extreme conditions of high density and low temperature.

To make its plasmas, the group starts with laser cooling, a method for trapping and slowing particles with intersecting laser beams. The less energy an atom or ion has, the colder it is, and the slower it moves about randomly. Laser cooling was developed in the 1990s to slow atoms until they are almost motionless, or just a few millionths of a degree above absolute zero.

"If an atom or ion is moving, and I have a laser beam opposing its motion, as it scatters photons from the beam it gets momentum kicks that slow it," Killian said. "The trick is to make sure that light is always scattered from a laser that opposes the particle's motion. If you do that, the particle slows and slows and slows."

During a postdoctoral fellowship at the National Institute of Standards and Technology in Bethesda, Md., in 1999, Killian pioneered the ionization method for creating neutral plasma from a laser-cooled gas. When he joined Rice's faculty the following year, he started a quest for a way to make the plasmas even colder. One motivation was to achieve "strong coupling," a phenomenon that happens naturally in plasmas only in exotic places like white dwarf stars and the center of Jupiter.

"We can't study strongly coupled plasmas in places where they naturally occur," Killian said. "Laser cooling neutral plasmas allows us to make strongly coupled plasmas in a lab, so that we can study their properties"

"In strongly coupled plasmas, there is more energy in the electrical interactions between particles than in the kinetic energy of their random motion," Killian said. "We mostly focus on the ions, which feel each other, and rearrange themselves in response to their neighbors' positions. That's what strong coupling means."

Because the ions have positive electric charges, they repel one another through the same force that makes your hair stand up straight if it gets charged with static electricity.

"Strongly coupled ions can't be near one another, so they try to find equilibrium, an arrangement where the repulsion from all of their neighbors is balanced," he said. "This can lead to strange phenomena like liquid or even solid plasmas, which are far outside our normal experience."

In normal, weakly coupled plasmas, these repulsive forces only have a small influence on ion motion because they're far outweighed by the effects of kinetic energy, or heat.

"Repulsive forces are normally like a whisper at a rock concert," Killian said. "They're drowned out by all the kinetic noise in the system."

In the center of Jupiter or a white dwarf star, however, intense gravity squeezes ions together so closely that repulsive forces, which grow much stronger at shorter distances, win out. Even though the temperature is quite high, ions become strongly coupled.

Killian's team creates plasmas that are orders of magnitude lower in density than those inside planets or dead stars, but by lowering the temperature they raise the ratio of electric-to-kinetic energies. At temperatures as low as one-tenth of a Kelvin above absolute zero, Killian's team has seen repulsive forces take over.

"Laser cooling is well developed in gases of neutral atoms, for example, but the challenges are very different in plasmas," he said.

Read more at Science Daily

Engineers create an inhalable form of messenger RNA

MIT researchers have designed inhalable particles that can deliver messenger RNA. These lung epithelial cells have taken up particles (yellow) that carry mRNA encoding green fluorescent protein.
Messenger RNA, which can induce cells to produce therapeutic proteins, holds great promise for treating a variety of diseases. The biggest obstacle to this approach so far has been finding safe and efficient ways to deliver mRNA molecules to the target cells.

In an advance that could lead to new treatments for lung disease, MIT researchers have now designed an inhalable form of mRNA. This aerosol could be administered directly to the lungs to help treat diseases such as cystic fibrosis, the researchers say.

"We think the ability to deliver mRNA via inhalation could allow us to treat a range of different disease of the lung," says Daniel Anderson, an associate professor in MIT's Department of Chemical Engineering, a member of MIT's Koch Institute for Integrative Cancer Research and Institute for Medical Engineering and Science (IMES), and the senior author of the study.

The researchers showed that they could induce lung cells in mice to produce a target protein -- in this case, a bioluminescent protein. If the same efficiency can be achieved with therapeutic proteins, it could be high enough to treat many lung diseases, the researchers say.

Asha Patel, a former MIT postdoc who is now an assistant professor at Imperial College London, is the lead author of the paper, which appears in the Jan. 4 issue of the journal Advanced Materials. Other authors of the paper include James Kaczmarek and Kevin Kauffman, both recent MIT PhD recipients; Suman Bose, a research scientist at the Koch Institute; Faryal Mir, a former MIT technical assistant; Michael Heartlein, the chief technical officer at Translate Bio; Frank DeRosa, senior vice president of research and development at Translate Bio; and Robert Langer, the David H. Koch Institute Professor at MIT and a member of the Koch Institute.

Treatment by inhalation

Messenger RNA encodes genetic instructions that stimulate cells to produce specific proteins. Many researchers have been working on developing mRNA to treat genetic disorders or cancer, by essentially turning the patients' own cells into drug factories.

Because mRNA can be easily broken down in the body, it needs to be transported within some kind of protective carrier. Anderson's lab has previously designed materials that can deliver mRNA and another type of RNA therapy called RNA interference (RNAi) to the liver and other organs, and some of these are being further developed for possible testing in patients.

In this study, the researchers wanted to create an inhalable form of mRNA, which would allow the molecules to be delivered directly to the lungs. Many existing drugs for asthma and other lung diseases are specially formulated so they can be inhaled via either an inhaler, which sprays powdered particles of medication, or a nebulizer, which releases an aerosol containing the medication.

The MIT team set out to develop a material that could stabilize RNA during the process of aerosol delivery. Some previous studies have explored a material called polyethylenimine (PEI) for delivering inhalable DNA to the lungs. However, PEI doesn't break down easily, so with the repeated dosing that would likely be required for mRNA therapies, the polymer could accumulate and cause side effects.

To avoid those potential side effects, the researchers turned to a class of positively charged polymers called hyperbranched poly(beta amino esters), which, unlike PEI, are biodegradable.

The particles the team created consist of spheres, approximately 150 nanometers in diameter, with a tangled mixture of the polymer and mRNA molecules that encode luciferase, a bioluminescent protein. The researchers suspended these particles in droplets and delivered them to mice as an inhalable mist, using a nebulizer.

"Breathing is used as a simple but effective delivery route to the lungs. Once the aerosol droplets are inhaled, the nanoparticles contained within each droplet enter the cells and instruct it to make a particular protein from mRNA," Patel says.

The researchers found that 24 hours after the mice inhaled the mRNA, lung cells were producing the bioluminescent protein. The amount of protein gradually fell over time as the mRNA was cleared. The researchers were able to maintain steady levels of the protein by giving the mice repeated doses, which may be necessary if adapted to treat chronic lung disease.

Broad distribution

Further analysis of the lungs revealed that mRNA was evenly distributed throughout the five lobes of the lungs and was taken up mainly by epithelial lung cells, which line the lung surfaces. These cells are implicated in cystic fibrosis, as well as other lung diseases such as respiratory distress syndrome, which is caused by a deficiency in surfactant protein. In her new lab at Imperial College London, Patel plans to further investigate mRNA-based therapeutics.

In this study, the researchers also demonstrated that the nanoparticles could be freeze-dried into a powder, suggesting that it may be possible to deliver them via an inhaler instead of a nebulizer, which could make the medication more convenient for patients.

Read more at Science Daily

Jan 3, 2019

Botulinum toxin reduces chronic migraine attacks, compared to placebo

A growing body of evidence supports the effectiveness of botulinum toxin injections in reducing the frequency of chronic migraine headaches, concludes an updated review and analysis in the January issue of Plastic and Reconstructive Surgery®.

Based on meta-analysis of pooled clinical trial data, botulinum toxin is superior to inactive placebo for preventive treatment of migraine, report Prof. Benoit Chaput, MD, PhD, of University Hospital Rangueil, Toulouse, France, and colleagues. "Botulinum toxin is a safe and well-tolerated treatment that should be proposed to patients with migraine," the researchers write.

Assembled Evidence Supports Effectiveness of Botox for Chronic Migraine

Prof. Chaput and colleagues identified and analyzed data from 17 previous randomized trials comparing botulinum toxin with placebo for preventive treatment of migraine headaches. Botulinum toxin -- best known by the brand name Botox -- was approved by the US Food and Drug Administration (FDA) for treatment of chronic migraine in 2010. Since then, a growing number of patients have reported successful results with botulinum toxin injections to alleviate chronic migraine headaches.

The 17 studies included nearly 3,650 patients, about 1,550 of whom had chronic migraine, defined as at least 15 headache attacks per month for more than three months, with migraine symptoms on at least eight days per month. The remaining patients had less-frequent episodic migraine headaches.

On pooled data analysis, botulinum toxin injections significantly reduced the frequency of chronic migraine attacks. Three months after injection, patients treated with botulinum toxin had an average of 1.6 fewer migraine attacks per month, compared to those treated with inactive placebo.

The improvement was apparent within two months of botulinum toxin treatment. To sustain the effects of treatment, botulinum toxin injections are typically repeated every three months.

There was also a "statistical tendency" toward less-frequent attacks with botulinum toxin in patients with episodic migraine. Again, improvement occurred within two months. Although botulinum toxin had a higher rate of adverse effects compared to placebo, none of these were serious.

The pooled data also showed significant improvement in quality of life in patients treated with botulinum toxin. This improvement was directly linked to a reduction in depressive symptoms. "It can be explained by the reduced impact of headaches and migraine-related disability, thus reducing symptoms of depression and anxiety," Prof. Chaput and coauthors write.

Migraine headaches are an increasingly common condition, leading to significant disability and increased use of healthcare resources. Although botulinum toxin injection for chronic migraine is FDA-approved, there are still conflicting data regarding its effectiveness. The new report provides a comprehensive analysis of the highest-quality evidence to date, including three randomized trials not included in previous reports.

Read more at Science Daily

Scientists engineer shortcut for photosynthetic glitch, boost crop growth 40%

Four unmodified plants (left) grow beside four plants (right) engineered with alternate routes to bypass photorespiration -- an energy-expensive process that costs yield potential. The modified plants are able to reinvest their energy and resources to boost productivity by 40 percent.
Plants convert sunlight into energy through photosynthesis; however, most crops on the planet are plagued by a photosynthetic glitch, and to deal with it, they evolved an energy-expensive process called photorespiration that drastically suppresses their yield potential. Researchers from the University of Illinois and U.S. Department of Agriculture Agricultural Research Service report in the journal Science that crops engineered with a photorespiratory shortcut are 40 percent more productive in real-world agronomic conditions.

"We could feed up to 200 million additional people with the calories lost to photorespiration in the Midwestern U.S. each year," said principal investigator Donald Ort, the Robert Emerson Professor of Plant Science and Crop Sciences at Illinois' Carl R. Woese Institute for Genomic Biology. "Reclaiming even a portion of these calories across the world would go a long way to meeting the 21st Century's rapidly expanding food demands -- driven by population growth and more affluent high-calorie diets."

This landmark study is part of Realizing Increased Photosynthetic Efficiency (RIPE), an international research project that is engineering crops to photosynthesize more efficiently to sustainably increase worldwide food productivity with support from the Bill & Melinda Gates Foundation, the Foundation for Food and Agriculture Research (FFAR), and the U.K. Government's Department for International Development (DFID).

Photosynthesis uses the enzyme Rubisco -- the planet's most abundant protein -- and sunlight energy to turn carbon dioxide and water into sugars that fuel plant growth and yield. Over millennia, Rubisco has become a victim of its own success, creating an oxygen-rich atmosphere. Unable to reliably distinguish between the two molecules, Rubisco grabs oxygen instead of carbon dioxide about 20 percent of the time, resulting in a plant-toxic compound that must be recycled through the process of photorespiration.

"Photorespiration is anti-photosynthesis," said lead author Paul South, a research molecular biologist with the Agricultural Research Service, who works on the RIPE project at Illinois. "It costs the plant precious energy and resources that it could have invested in photosynthesis to produce more growth and yield."

Photorespiration normally takes a complicated route through three compartments in the plant cell. Scientists engineered alternate pathways to reroute the process, drastically shortening the trip and saving enough resources to boost plant growth by 40 percent. This is the first time that an engineered photorespiration fix has been tested in real-world agronomic conditions.

"Much like the Panama Canal was a feat of engineering that increased the efficiency of trade, these photorespiratory shortcuts are a feat of plant engineering that prove a unique means to greatly increase the efficiency of photosynthesis," said RIPE Director Stephen Long, the Ikenberry Endowed University Chair of Crop Sciences and Plant Biology at Illinois.

The team engineered three alternate routes to replace the circuitous native pathway. To optimize the new routes, they designed genetic constructs using different sets of promoters and genes, essentially creating a suite of unique roadmaps. They stress tested these roadmaps in 1,700 plants to winnow down the top performers.

Over two years of replicated field studies, they found that these engineered plants developed faster, grew taller, and produced about 40 percent more biomass, most of which was found in 50-percent-larger stems.

The team tested their hypotheses in tobacco: an ideal model plant for crop research because it is easier to modify and test than food crops, yet unlike alternative plant models, it develops a leaf canopy and can be tested in the field. Now, the team is translating these findings to boost the yield of soybean, cowpea, rice, potato, tomato, and eggplant.

"Rubisco has even more trouble picking out carbon dioxide from oxygen as it gets hotter, causing more photorespiration," said co-author Amanda Cavanagh, an Illinois postdoctoral researcher working on the RIPE project. "Our goal is to build better plants that can take the heat today and in the future, to help equip farmers with the technology they need to feed the world."

Read more at Science Daily

Melting ice sheets release tons of methane into the atmosphere

This photo shows a rhodamine dye injection into the proglacial river, just before a waterfall. The pink dye (the rhodamine) is used to calculate the water discharge of the proglacial river (i.e. how much water/melt is flowing in the river at that time).
The Greenland Ice Sheet emits tons of methane according to a new study, showing that subglacial biological activity impacts the atmosphere far more than previously thought.

An international team of researchers led by the University of Bristol camped for three months next to the Greenland Ice Sheet, sampling the meltwater that runs off a large catchment (>600 km²) of the Ice Sheet during the summer months.

As reported in Nature, using novel sensors to measure methane in meltwater runoff in real time, they observed that methane was continuously exported from beneath the ice.

They calculated that at least six tons of methane was transported to their measuring site from this portion of the Ice Sheet alone, roughly the equivalent of the methane released by up to 100 cows.

Professor Jemma Wadham, Director of Bristol's Cabot Institute for the Environment, who led the investigation, said: "A key finding is that much of the methane produced beneath the ice likely escapes the Greenland Ice Sheet in large, fast flowing rivers before it can be oxidized to CO2, a typical fate for methane gas which normally reduces its greenhouse warming potency."

Methane gas (CH4) is the third most important greenhouse gas in the atmosphere after water vapour and carbon dioxide (CO2). Although present in lower concentrations than CO2, methane is approximately 20-28 times more potent. Therefore, smaller quantities have the potential to cause disproportionate impacts on atmospheric temperatures. Most of the Earth's methane is produced by microorganisms that convert organic matter to CH4 in the absence of oxygen, mostly in wetlands and on agricultural land, for instance in the stomachs of cows and in rice paddies. The remainder comes from fossil fuels like natural gas.
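Putting the article's figures together, here is a rough back-of-the-envelope conversion of the six tons of methane measured at the site into a CO2-equivalent range, using the 20-28x potency factor quoted above (the factor is a range because it depends on the time horizon used for the comparison):

```python
# Back-of-the-envelope CO2-equivalent of the methane export reported above.
# Uses only the figures quoted in the article.
methane_tons = 6                      # minimum methane exported from the catchment
potency_low, potency_high = 20, 28    # methane vs. CO2 warming potency, per the article

co2e_low = methane_tons * potency_low
co2e_high = methane_tons * potency_high
print(f"~{co2e_low}-{co2e_high} tons of CO2-equivalent")   # roughly 120-168 tons
```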

While some methane had been detected previously in Greenland ice cores and in an Antarctic Subglacial Lake, this is the first time that meltwaters produced in spring and summer in large ice sheet catchments have been reported to continuously flush out methane from the ice sheet bed to the atmosphere.

Lead author, Guillaume Lamarche-Gagnon, from Bristol's School of Geographical Sciences, said: "What is also striking is the fact that we've found unequivocal evidence of a widespread subglacial microbial system. Whilst we knew that methane-producing microbes likely were important in subglacial environments, how important and widespread they truly were was debatable. Now we clearly see that active microorganisms, living under kilometres of ice, are not only surviving, but likely impacting other parts of the Earth system. This subglacial methane is essentially a biomarker for life in these isolated habitats."

Most studies on Arctic methane sources focus on permafrost, because these frozen soils tend to hold large reserves of organic carbon that could be converted to methane when they thaw due to climate warming. This latest study shows that ice sheet beds, which hold large reserves of carbon, liquid water, microorganisms and very little oxygen -- the ideal conditions for creating methane gas -- are also atmospheric methane sources.

Co-researcher Dr Elizabeth Bagshaw from Cardiff University added: "The new sensor technologies that we used give us a window into this previously unseen part of the glacial environment. Continuous measurement of meltwater enables us to improve our understanding of how these fascinating systems work and how they impact the rest of the planet."

Read more at Science Daily

Archeological discovery yields clues to how our ancestors may have adapted to their environment

During the Stone Age, ancestral humans lived alongside a variety of animal species in what was then an area of wetlands in the middle of the Jordanian desert. The site, in the Azraq Basin, has been excavated and has revealed an abundance of tools and animal bones from up to 250,000 years ago, leading to a better understanding of how ancestral humans adapted to this changing environment.

James Pokines, PhD, associate professor of forensic anthropology at Boston University School of Medicine, was a leader of the excavation with a team from the Azraq Marshes Archaeological and Paleoecological Project.

The team discovered bone and tooth specimens belonging to wild ancestors of modern-day camels and elephants, as well as horse, rhinoceros, antelope and wild cattle species, among others. Poor preservation of small and less dense bones has resulted in limited conclusions about smaller species of animals that may have inhabited the area during this time.

Prior research at the site revealed evidence of butchery, with blood proteins from multiple species appearing on Stone Age tools. "The periphery of the wetlands where large animals drank and grazed would have presented excellent hunting opportunities for ancestral humans. Humans may have also faced their own challenges from other predatory competitors such as lions and hyenas roaming the area," said Pokines, corresponding author of the study.

The team's discovery adds important background to a growing picture of land use over time in Azraq Basin. "There are many portions of the globe that we still know so little about in terms of how ancestral humans lived and evolved there and how they adapted to that environment ... we hope to understand how different populations of ancestral humans adapted to this changing, arid environment throughout the Stone Age."

The excavation efforts were the outcome of a successful collaboration with Jordanian authorities and, according to the researchers, have paved the way for future excavations in the region.

From Science Daily

What makes two species different?

The researchers use genetic markers to track segments of the X chromosome that they move from one species of Drosophila (fruit fly) into a different species in order to find X-linked genes that cause male sterility. Genetic markers that affect eye color are located on the X chromosome, so the researchers start with Drosophila mauritiana that have two genetic markers -- giving them dark red eyes, left -- and cross them to white-eyed Drosophila simulans.
Most evolutionary biologists distinguish one species from another based on reproductivity: members of different species either won't or can't mate with one another, or, if they do, the resulting offspring are often sterile, unviable, or suffer some other sort of reduced fitness.

For most of the 20th century, scientists believed that this reproductive incompatibility evolved gradually between species as a by-product of adapting to different ecological circumstances: if two species were geographically isolated, they would accumulate differences as they adapted to their environments. New research conducted at the University of Rochester, in collaboration with the University of Nebraska, shows, however, that there are more factors at play -- specifically the presence of selfish genes called meiotic drive elements, whose flow among species may dictate whether two species converge or diverge. In a new paper published in the journal eLife, the researchers show that sex chromosomes evolve to be genetically incompatible between species faster than the rest of the chromosomes, and they reveal the factors at play in this incompatibility.

When two members of a species mate and exchange genetic material, this is known as gene flow. When two members of different species mate, however, gene flow is reduced. "Genes from one species simply can't talk to genes from the other species," says Daven Presgraves, a dean's professor of biology at Rochester. Though the genes may work fine on their own genetic background, when they are moved into the genetic background of another species, they have negative effects. "All of the gene copies in you and me work in the human genome. But if we were to take a gene out of you and stick it in a macaw parrot, they haven't seen this sequence before and it might not work together with the other genes. That would compromise some sort of function like fertility."

This is what happened when Presgraves and members of his lab crossed two different species of fruit flies, one from Madagascar and the other from the island of Mauritius. When the two species were crossed, their female hybrid offspring were fertile, but the hybrid male offspring were completely sterile. "One of the steps on the way to complete reproductive isolation is that the XY sex becomes sterile first in that gradual build-up of incompatibility," Presgraves says. In the case of fruit flies, as in human beings, the XY sex is male.

Chromosomes are divided into two types: allosomes, or sex chromosomes, and autosomes, or body chromosomes. Genetic traits linked to an organism's sex are passed on through the sex chromosomes. The rest of the genetic hereditary information is passed on through the autosomes. When the researchers mapped the factors that cause hybrid males to become sterile, they found that there were many more incompatibility factors on the X allosome compared to the autosomes. This means that sex chromosomes become functionally different between species much faster than non-sex chromosomes, Presgraves says. "There's a lot more exchange going on between the autosomes than on the X."

But what is it that makes sex chromosomes accumulate genetic incompatibility faster than the rest of the genome?

The researchers found that a class of "selfish genes" called meiotic drive elements are responsible for making sex chromosomes genetically incompatible at a faster rate. In general, selfish genes are parasites of the genome -- they propagate themselves at the expense of other genes. Meiotic drive elements in particular sabotage the rules of typical inheritance: in normal Mendelian inheritance, a gene is transmitted to half the offspring. Meiotic drive elements, however, manipulate reproduction so they can transmit themselves to more than their fair share of the genome. In hybrid male fruit flies, meiotic drive elements usually kill any sperm that don't carry them, leaving only (or mostly) sperm that do carry the meiotic drive elements.
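To make the "more than their fair share" idea concrete, here is a hypothetical Python sketch (not the model used in the eLife study): a deterministic recursion for an allele whose heterozygous carriers transmit it at a biased rate k instead of the Mendelian one half. The starting frequency, the drive strength, and the absence of any fitness cost are all illustrative assumptions.

```python
# Toy deterministic model of a meiotic drive element spreading (illustrative,
# not the study's model). `k` is the fraction of gametes from a heterozygote
# that carry the allele; Mendelian inheritance corresponds to k = 0.5.
def allele_frequency_trajectory(p0, k, generations):
    freqs = [p0]
    p = p0
    for _ in range(generations):
        # Random mating: heterozygotes (frequency 2p(1-p)) transmit the allele
        # at rate k instead of 0.5; homozygotes (frequency p^2) always transmit it.
        p = p**2 + 2 * p * (1 - p) * k
        freqs.append(p)
    return freqs

mendelian = allele_frequency_trajectory(p0=0.01, k=0.5, generations=25)
driver    = allele_frequency_trajectory(p0=0.01, k=0.9, generations=25)

print(f"after 25 generations, fairly transmitted allele: {mendelian[-1]:.3f}")  # stays ~0.01
print(f"after 25 generations, drive element:             {driver[-1]:.3f}")     # approaches 1
```

With k = 0.5 the frequency stays put, recovering ordinary Mendelian inheritance; with even modest drive the element sweeps toward fixation within a few dozen generations, which is part of why such elements can spread so effectively when they leak across a porous species boundary.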

"This could be because multiple meiotic drive elements from both parental species are unsuppressed in hybrids, and their combined action causes sterility," says Colin Meiklejohn, a former postdoctoral student in Presgraves's lab.

In a twist, however, the researchers also found that if meiotic drive elements are able to experience gene flow, they can also help bring species together. During early speciation, when two different species are just beginning to break away from one another, reproductive incompatibility can be incomplete and "leaky" -- some part of the genome may still be compatible and exchangeable.

"If two populations are leaky and there is opportunity for gene flow, a selfish gene can leak over into the other population and spread there," Presgraves says. If the species interbreeds and this selfish gene is able to be passed down, instead of becoming incompatible, "that part of the genome will become perfectly exchangeable. In some cases a selfish gene will basically erase the build-up of incompatibilities for a part of the genome."

Read more at Science Daily

Jan 2, 2019

Long-term agricultural change impacts stream water quality

Graduate students Heather Luken and Tanner Williamson collect water samples from an automated sampler on Four Mile Creek. The autosampler collects a water sample from the stream every seven hours, and stores 24 samples in the carousel of bottles shown here.
In the early 1990s, Acton Lake in southwestern Ohio had a muddy problem. Large amounts of sediment from nearby farms were entering the lake's watershed. These sediments traveled through streams draining the landscape and were filling up the lake.

So, the USDA gave local farmers incentives to change some of their farming practices. One of these practices was conservation tillage, in which the soil is plowed less often. That can reduce sediment runoff.

A new study examines how the switch to conservation tillage has impacted Acton Lake over the past decades. From 1994 to 2014, the researchers measured concentrations of suspended sediment, nitrogen, and phosphorus in streams draining into Acton Lake.

"We find that short-term trends in water quality may not reflect long-term changes," says study co-author Michael Vanni.

Tracking changes in water quality over the long term is vital, says Vanni, a biologist at Miami University, Ohio. "We don't have a lot of long-term information on how water quality in a stream or lake responds to agricultural change," he says.

That might be surprising since many ecologists study agricultural watersheds. But according to Vanni, studies on a given ecosystem are usually short term. "Long-term studies, like ours, can reveal important shifts in water quality," says Vanni. "Many of the changes we observed can only be seen after studying the streams for 20 plus years."

Vanni and his colleagues found that water quality responses were different during the first decade of the study (1994-2003) compared to the next (2004-2014). They also discovered that concentrations of suspended sediment, nitrogen, and phosphorus each reacted differently.

Levels of suspended sediment declined throughout the entire study period. However, the decline was much sharper in the first ten years.

Phosphorus and nitrogen levels had contrasting outcomes. "The concentration of dissolved phosphorus in the streams declined sharply during the first ten years," says Vanni. "But then, phosphorus levels increased over the next ten years."

In contrast, nitrogen levels didn't change much in the first ten years. After that, they fell sharply.
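The decade-by-decade contrasts above are essentially a statement about fitting trends over different time windows. As a hedged illustration (with invented numbers, not the study's data), the short Python sketch below fits separate linear trends to a phosphorus-like series for 1994-2003, 2004-2014, and the full record, showing how a single-decade trend can point the opposite way from the long-term pattern:

```python
import numpy as np

# Illustrative sketch (synthetic numbers, not the study's data): why a trend fit
# over one decade can point in a different direction than the full record.
rng = np.random.default_rng(0)
years = np.arange(1994, 2015)
# Hypothetical dissolved-phosphorus-like series: declines in the first decade,
# rises in the second (the pattern described in the article; values invented).
signal = np.where(years <= 2003, 60 - 3 * (years - 1994), 30 + 2 * (years - 2004))
concentration = signal + rng.normal(0, 2, size=years.size)

def slope(x, y):
    """Least-squares linear trend (units per year)."""
    return np.polyfit(x, y, 1)[0]

print(f"1994-2003 trend: {slope(years[years <= 2003], concentration[years <= 2003]):+.1f} per year")
print(f"2004-2014 trend: {slope(years[years >= 2004], concentration[years >= 2004]):+.1f} per year")
print(f"1994-2014 trend: {slope(years, concentration):+.1f} per year")
```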

The study focused on the watershed of the Upper Four Mile Creek, which drains into Acton Lake. Most of the surrounding area is made up of corn and soybean farms. The researchers have monitored farming practices in the area since 1989 and water quality since 1994.

The long-term changes seen in this study indicate that there might be tradeoffs in managing different aspects of water quality. "The main reason to encourage conservation tillage was to reduce soil erosion and sedimentation in Acton Lake," says Vanni. "That has clearly been successful. Sediment inputs to the lake have declined."

Nitrogen levels are also declining. "That's great for local freshwater ecosystems," says Vanni. "It's also beneficial to the Gulf of Mexico, where some of our runoff eventually travels."

On the other hand, rising phosphorus levels are a cause for concern. "They could promote algal blooms downstream," says Vanni. "We might need to consider the tradeoffs involved in managing for sediments, nitrogen, or phosphorus."

It's not completely clear how the study findings would apply to other areas. However, the changes in water quality observed in this study are similar to those seen in some of the rivers that drain into Lake Erie.

High phosphorus levels are a problem in those watersheds as well. In fact, "high levels of phosphorus are implicated in causing increased blooms of harmful algae in Lake Erie," says Vanni.

Vanni and colleagues hope to continue measuring changes in suspended sediments, nitrogen, and phosphorus in the Acton Lake watershed.

Read more at Science Daily

Early protostar already has a warped disk

Using observations from the ALMA radio observatory in Chile, researchers have observed, for the first time, a warped disk around an infant protostar that formed just several tens of thousands of years ago. This implies that the misalignment of planetary orbits in many planetary systems -- including our own -- may be caused by distortions in the planet-forming disk early in their existence.

The planets in our solar system orbit the sun in planes that are at most about seven degrees offset from the equator of the sun itself. It has been known for some time that many extrasolar systems have planets that are not lined up in a single plane or with the equator of the star. One explanation for this is that some of the planets might have been affected by collisions with other objects in the system or by stars passing by the system, ejecting them from their initial orbital plane.

However, the possibility remained that the formation of planets out of the normal plane was actually caused by a warping of the star-forming cloud out of which the planets were born. Recently, images of protoplanetary disks -- rotating disks where planets form around a star -- have in fact showed such warping. But it was still unclear how early this happened.

In the latest findings, published in Nature, the group from the RIKEN Cluster for Pioneering Research (CPR) and Chiba University in Japan has discovered that L1527, an infant protostar still embedded within a cloud, has a disk with two parts -- an inner one rotating in one plane, and an outer one in a different plane. The disk is very young and still growing. L1527, which is about 450 light years away in the Taurus Molecular Cloud, is a good object for study as it has a disk that is nearly edge-on to our view.

According to Nami Sakai, who led the research group, "This observation shows that it is conceivable that the misalignment of planetary orbits can be caused by a warp structure formed in the earliest stages of planetary formation. We will have to investigate more systems to find out if this is a common phenomenon or not."

The remaining question is what caused the warping of the disk. Sakai suggests two reasonable explanations. "One possibility," she says, "is that irregularities in the flow of gas and dust in the protostellar cloud are still preserved and manifest themselves as the warped disk. A second possibility is that the magnetic field of the protostar is in a different plane from the rotational plane of the disk, and that the inner disk is being pulled into a different plane from the rest of the disk by the magnetic field." She says they plan further work to determine which is responsible for the warping of the disk.

The ALMA observatory in Chile is managed by an international consortium including the National Astronomical Observatory of Japan (NAOJ).

From Science Daily

Thriving on teamwork: New research shows how brain cells filter information in groups

When we perceive the world around us, certain objects appear to be more noticeable than others, depending on what we do. For example, when we view a forest-covered mountain from a distance, the forest looks like a large green carpet. But as we get closer, we start noticing the individual trees, and the forest fades to the background. What happens in the brain as our experience changes so drastically?

For decades, scientists studying the visual system thought that individual brain cells, called neurons, operate as filters. Some neurons would prefer coarse details of the visual scene and ignore fine details, while others would do the opposite. Every neuron was thought to do its own filtering.

A new study led by Salk Institute researchers challenges this view. The study revealed that the same neurons that prefer coarse details could change to prefer finer details under different conditions. The work, which appeared in the journal Neuron on December 31, 2018, could help to better understand neural mechanisms that shape our perceptions of the world.

"We were trying to look beneath the hood and figure out how these filters work," says Professor Thomas Albright, director of Salk's Center for Neurobiology of Vision and a senior author of the study.

"The selectivity of neurons was thought to be stable, but our work has shown that the filtering properties of neurons are much more flexible than was previously thought," adds study first author Ambarish Pawar, a postdoctoral researcher at Salk.

The team focused on neurons in the visual cortex in an animal model. Animals were shown optical patterns in which the researchers varied the contrast between dark and light areas, and the team measured the neurons' preferences for coarse and fine details. The goal was to see how neurons process these patterns, specifically in the brain's middle temporal area within the visual cortex. Scientists expected to find that the neurons were strictly "tuned" to perceive either coarse or fine details, but not both. What they found instead was that an individual neuron could filter both fine and coarse detail, depending on the contrast of the pattern.

By measuring the firing rates of multiple neurons activated by the optical stimuli, the researchers showed that such flexibility was more likely if entire networks of neurons acted as filters rather than individual neurons.

"Our results suggest that the previously common description of individual neurons as filters was incorrect," says Sergei Gepshtein, a scientist with the Center for Neurobiology of Vision at Salk and co-author of the new study.

"The preference of neurons may shift due to a change in the balance of positive (excitatory) signals and negative (inhibitory) signals by which neurons communicate in the network," adds Pawar.

The researchers showed that teaming up endows networks of neurons with a high degree of flexibility in their preferences, which could allow them to adapt and tune the brain to changing conditions, just as you might tune a radio to get good reception as you drive.

"We've uncovered a new dimension of adaptability of cortical networks," says Gepshtein. "Our results made it clear that to understand that adaptability we have to rethink what the computing units of the brain are. It is the team of connected neurons -- the malleable neural network -- that is more suited as such a unit rather than an individual neuron."

"This unexpected finding could help us shed light on the neural mechanisms that underlie the brains' enormous adaptability to a continuously changing environment," says Pawar.

Albright adds that, "even though the study centered on the visual system, this same flexible quality of neural networks is likely to hold true for other parts of the brain."

Read more at Science Daily

Juno mission captures images of volcanic plumes on Jupiter's moon Io

Juno's Radiation Monitoring Investigation collected this image of Jupiter's moon Io with Juno's Stellar Reference Unit (SRU) star camera shortly after Io was eclipsed by Jupiter at 12:40:29 (UTC) Dec. 21, 2018. Io is softly illuminated by moonlight from another of Jupiter's moons, Europa. The brightest feature on Io is suspected to be a penetrating radiation signature. The glow of activity from several of Io's volcanoes is seen, including a plume circled in the image.
A team of space scientists has captured new images of a volcanic plume on Jupiter's moon Io during the Juno mission's 17th flyby of the gas giant. On Dec. 21, during winter solstice, four of Juno's cameras captured images of the Jovian moon Io, the most volcanic body in our solar system. JunoCam, the Stellar Reference Unit (SRU), the Jovian Infrared Auroral Mapper (JIRAM) and the Ultraviolet Imaging Spectrograph (UVS) observed Io for over an hour, providing a glimpse of the moon's polar regions as well as evidence of an active eruption.

"We knew we were breaking new ground with a multi-spectral campaign to view Io's polar region, but no one expected we would get so lucky as to see an active volcanic plume shooting material off the moon's surface," said Scott Bolton, principal investigator of the Juno mission and an associate vice president of Southwest Research Institute's Space Science and Engineering Division. "This is quite a New Year's present showing us that Juno has the ability to clearly see plumes."

JunoCam acquired the first images on Dec. 21 at 12:00, 12:15 and 12:20 coordinated universal time (UTC) before Io entered Jupiter's shadow. The images show the moon half-illuminated with a bright spot seen just beyond the terminator, the day-night boundary.

"The ground is already in shadow, but the height of the plume allows it to reflect sunlight, much like the way mountaintops or clouds on the Earth continue to be lit after the sun has set," explained Candice Hansen-Koharcheck, the JunoCam lead from the Planetary Science Institute.

At 12:40 UTC, after Io had passed into the darkness of total eclipse behind Jupiter, sunlight reflecting off nearby moon Europa helped to illuminate Io and its plume. SRU images released by SwRI depict Io softly illuminated by moonlight from Europa. The brightest feature on Io in the image is thought to be a penetrating radiation signature, a reminder of this satellite's role in feeding Jupiter's radiation belts, while other features show the glow of activity from several volcanoes. "As a low-light camera designed to track the stars, the SRU can only observe Io under very dimly lit conditions. Dec. 21 gave us a unique opportunity to observe Io's volcanic activity with the SRU using only Europa's moonlight as our lightbulb," said Heidi Becker, lead of Juno's Radiation Monitoring Investigation, at NASA's Jet Propulsion Laboratory.

Sensing heat at long wavelengths, the JIRAM instrument detects hotspots in the daylight and at night.

"Though Jupiter's moons are not JIRAM's primary objectives, every time we pass close enough to one of them, we take advantage of the opportunity for an observation," said Alberto Adriani, a researcher at Italy's National Institute of Astrophysics. "The instrument is sensitive to infrared wavelengths, which are perfect to study the volcanism of Io. This is one of the best images of Io that JIRAM has been able to collect so far."

The latest images can lead to new insights into the gas giant's interactions with its moons, causing phenomena such as Io's volcanic activity or freezing of the moon's atmosphere during eclipse, added Bolton. JIRAM recently documented Io's volcanic activity before and after eclipse. Io's volcanoes were discovered by NASA's Voyager spacecraft in 1979. Io's gravitational interaction with Jupiter drives the moon's volcanoes, which emit umbrella-like plumes of SO2 gas and produce extensive basaltic lava fields.

The recent Io images were captured at the halfway point of the mission, which is scheduled to complete a map of Jupiter in July 2021. Launched in 2011, Juno arrived at Jupiter in 2016. The spacecraft orbits Jupiter every 53 days, studying its auroras, atmosphere and magnetosphere.

Read more at Science Daily

New Horizons successfully explores Ultima Thule

At left is a composite of two images taken by New Horizons' high-resolution Long-Range Reconnaissance Imager (LORRI), which provides the best indication of Ultima Thule's size and shape so far. Preliminary measurements of this Kuiper Belt object suggest it is approximately 20 miles long by 10 miles wide (32 kilometers by 16 kilometers). An artist's impression at right illustrates one possible appearance of Ultima Thule, based on the actual image at left. The direction of Ultima's spin axis is indicated by the arrows.
NASA's New Horizons spacecraft flew past Ultima Thule in the early hours of New Year's Day, ushering in the era of exploration of the enigmatic Kuiper Belt, a region of primordial objects that holds keys to understanding the origins of the solar system.

"Congratulations to NASA's New Horizons team, Johns Hopkins Applied Physics Laboratory and the Southwest Research Institute for making history yet again. In addition to being the first to explore Pluto, today New Horizons flew by the most distant object ever visited by a spacecraft and became the first to directly explore an object that holds remnants from the birth of our solar system," said NASA Administrator Jim Bridenstine. "This is what leadership in space exploration is all about."

Signals confirming the spacecraft is healthy and has filled its digital recorders with science data on Ultima Thule reached the mission operations center at the Johns Hopkins Applied Physics Laboratory (APL) today at 10:29 a.m. EST, almost exactly 10 hours after New Horizons' closest approach to the object.

"New Horizons performed as planned today, conducting the farthest exploration of any world in history -- 4 billion miles from the Sun," said Principal Investigator Alan Stern, of the Southwest Research Institute in Boulder, Colorado. "The data we have look fantastic and we're already learning about Ultima from up close. From here out the data will just get better and better!"

Images taken during the spacecraft's approach -- which brought New Horizons to within just 2,200 miles (3,500 kilometers) of Ultima at 12:33 a.m. EST -- revealed that the Kuiper Belt object may have a shape similar to a bowling pin, spinning end over end, with dimensions of approximately 20 by 10 miles (32 by 16 kilometers). Another possibility is Ultima could be two objects orbiting each other. Flyby data have already solved one of Ultima's mysteries, showing that the Kuiper Belt object is spinning like a propeller with the axis pointing approximately toward New Horizons. This explains why, in earlier images taken before Ultima was resolved, its brightness didn't appear to vary as it rotated. The team has still not determined the rotation period.

As the science data began its initial return to Earth, mission team members and leadership reveled in the excitement of the first exploration of this distant region of space.

"New Horizons holds a dear place in our hearts as an intrepid and persistent little explorer, as well as a great photographer," said Johns Hopkins Applied Physics Laboratory Director Ralph Semmel. "This flyby marks a first for all of us -- APL, NASA, the nation and the world -- and it is a great credit to the bold team of scientists and engineers who brought us to this point."

"Reaching Ultima Thule from 4 billion miles away is an incredible achievement. This is exploration at its finest," said Adam L. Hamilton, president and CEO of the Southwest Research Institute in San Antonio. "Kudos to the science team and mission partners for starting the textbooks on Pluto and the Kuiper Belt. We're looking forward to seeing the next chapter."

The New Horizons spacecraft will continue downloading images and other data in the days and months ahead, completing the return of all science data over the next 20 months. When New Horizons launched in January 2006, George W. Bush was in the White House, Twitter had yet to launch and Time Magazine's Person of the Year was "you -- all the worldwide web users." Nine years into its journey, the spacecraft began its exploration of the Kuiper Belt with a flyby of Pluto and its moons. Almost 13 years after the launch, the spacecraft will continue its exploration of the Kuiper Belt until at least 2021. Team members plan to propose more Kuiper Belt exploration.

Read more at Science Daily

Dec 31, 2018

Happy New Year

As is the tradition for A Magical Journey on New Year's Eve, it's time to say farewell to the old year and hello to the new.
As always, there will be new discoveries in many fields of science, and you can read about some of them here in the new year.

As always, I'll end the year with ABBA and their "Happy New Year":


Happy New Year to all of you

The magicman Danny Boston from A Magical Journey

Dec 30, 2018

How 'Dry January' is the secret to better sleep, saving money and losing weight

New research from the University of Sussex shows that taking part in Dry January -- abstaining from booze for a month -- helps people regain control of their drinking, gain more energy, enjoy better skin and lose weight. Participants also report drinking less months later.

The research, led by Sussex psychologist Dr Richard de Visser, was conducted with over 800 people who took part in Dry January in 2018. The results show that Dry January participants are still drinking less in August. They reported that:

  • drinking days fell on average from 4.3 to 3.3 per week;
  • units consumed per drinking day dropped on average from 8.6 to 7.1;
  • frequency of being drunk dropped from 3.4 per month to 2.1 per month on average.

Dr Richard de Visser, Reader in Psychology at the University of Sussex, said:

"The simple act of taking a month off alcohol helps people drink less in the long term: by August people are reporting one extra dry day per week. There are also considerable immediate benefits: nine in ten people save money, seven in ten sleep better and three in five lose weight.

"Interestingly, these changes in alcohol consumption have also been seen in the participants who didn't manage to stay alcohol-free for the whole month -- although they are a bit smaller. This shows that there are real benefits to just trying to complete Dry January."

The University of Sussex research showed that:

  • 93% of participants had a sense of achievement;
  • 88% saved money;
  • 82% think more deeply about their relationship with drink;
  • 80% feel more in control of their drinking;
  • 76% learned more about when and why they drink;
  • 71% realised they don't need a drink to enjoy themselves;
  • 70% had generally improved health;
  • 71% slept better;
  • 67% had more energy;
  • 58% lost weight;
  • 57% had better concentration;
  • 54% had better skin.

Dr Richard de Visser's findings come from three self-completed online surveys: 2,821 participants on registering for Dry January; 1,715 in the first week of February; and 816 in August.

A new YouGov poll undertaken for Alcohol Change UK showed that one in ten people who drink -- an estimated 4.2 million people in the UK -- are already planning to do Dry January in 2019.

Dr Richard Piper, CEO of Alcohol Change UK, said:

"Put simply, Dry January can change lives. We hear every day from people who took charge of their drinking using Dry January, and who feel healthier and happier as a result.

"The brilliant thing about Dry January is that it's not really about January. Being alcohol-free for 31 days shows us that we don't need alcohol to have fun, to relax, to socialise. That means that for the rest of the year we are better able to make decisions about our drinking, and to avoid slipping into drinking more than we really want to.

"Many of us know about the health risks of alcohol -- seven forms of cancer, liver disease, mental health problems -- but we are often unaware that drinking less has more immediate benefits too. Sleeping better, feeling more energetic, saving money, better skin, losing weight... The list goes on. Dry January helps millions to experience those benefits and to make a longer-lasting change to drink more healthily."

Read more at Science Daily

All about Ultima: New Horizons flyby target is unlike anything explored in space

The Kuiper Belt lies in the so-called "third zone" of our solar system, beyond the terrestrial planets (inner zone) and gas giants (middle zone). This vast region contains billions of objects, including comets, dwarf planets like Pluto and "planetesimals" like Ultima Thule. The objects in this region are believed to be frozen in time -- relics left over from the formation of the solar system.
NASA's New Horizons spacecraft is set to fly by a distant "worldlet" 4 billion miles from the Sun in just six days, on New Year's Day 2019. The target, officially designated 2014 MU69, was nicknamed "Ultima Thule," a Latin phrase meaning "a place beyond the known world," after a public call for name recommendations. No spacecraft has ever explored such a distant world.

Ultima, as the flyby target is affectionately called by the New Horizons team, is orbiting in the heart of our solar system's Kuiper Belt, far beyond Neptune. The Kuiper Belt -- a collection of icy bodies ranging in size from dwarf planets like Pluto to smaller planetesimals like Ultima Thule (pronounced "ultima toolee") and even smaller bodies like comets -- is believed to hold the building blocks of planets.

Ultima's nearly circular orbit indicates it originated at its current distance from the Sun. Scientists consider its birthplace important for two reasons. First, it means Ultima is an ancient sample of this distant portion of the solar system. Second, because temperatures this far from the Sun are barely above absolute zero -- mummifying conditions that preserve Kuiper Belt objects -- it is essentially a time capsule of the ancient past.
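
To put "barely above absolute zero" into rough numbers, here is a back-of-the-envelope blackbody estimate, a minimal sketch in Python rather than a calculation from the study; the distance and reflectivity are assumed round values:

    # Rough equilibrium-temperature estimate for a body about 44 AU from the Sun.
    # Illustrative only: the distance and albedo below are assumed round numbers,
    # not values quoted in the article.
    T_1AU = 279.0        # K, equilibrium temperature of a dark, fast-rotating body at 1 AU
    distance_au = 44.0   # approximate Sun-Ultima distance
    albedo = 0.10        # assumed reflectivity, consistent with the ~10 percent noted below

    t_eq = T_1AU * (1.0 - albedo) ** 0.25 / distance_au ** 0.5
    print(f"~{t_eq:.0f} K")  # roughly 40 K, i.e. around -230 degrees Celsius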

Marc Buie, New Horizons co-investigator from the Southwest Research Institute in Boulder, Colorado, and members of the New Horizons science team discovered Ultima using the Hubble Space Telescope in 2014. The object is so distant and faint in all telescopes that little is known about the world beyond its location and orbit. In 2016, researchers determined it had a red color. In 2017, a NASA campaign using ground-based telescopes traced out its size -- just about 20 miles (30 kilometers) across -- and irregular shape when it passed in front of a star, an event called a "stellar occultation."

From its brightness and size, New Horizons team members have calculated Ultima's reflectivity, which is only about 10 percent, or about as dark as garden dirt. Beyond that, nothing else is known about it -- basic facts like its rotational period and whether or not it has moons are unknown.
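
For readers curious how brightness and size translate into reflectivity, the sketch below inverts the standard size-albedo-magnitude relation for small bodies, D(km) = 1329 x 10^(-H/5) / sqrt(p). The absolute magnitude H used here is an assumed, approximate value for illustration, not a figure given in the article:

    # Inverting D = 1329 km * 10**(-H/5) / sqrt(p) to estimate the geometric albedo p.
    # H is an assumed, approximate absolute magnitude, used only for illustration.
    H = 11.0            # assumed absolute magnitude of Ultima Thule
    diameter_km = 30.0  # size from the 2017 stellar occultation campaign

    albedo = (1329.0 * 10 ** (-H / 5.0) / diameter_km) ** 2
    print(f"geometric albedo ~ {albedo:.2f}")  # about 0.08, i.e. of order 10 percent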

"All that is about to dramatically change on New Year's Eve and New Year's Day," said New Horizons Principal Investigator Alan Stern, also of SwRI. "New Horizons will map Ultima, map its surface composition, determine how many moons it has and find out if it has rings or even an atmosphere. It will make other studies, too, such as measuring Ultima's temperature and perhaps even its mass. In the space of one 72-hour period, Ultima will be transformed from a pinpoint of light -- a dot in the distance -- to a fully explored world. It should be breathtaking!"

"New Horizons is performing observations at the frontier of planetary science," said Project Scientist Hal Weaver, of the Johns Hopkins University Applied Physics Laboratory in Laurel, Maryland, "and the entire team looks forward to unveiling the most distant and pristine object ever explored during a spacecraft flyby."

"From Ultima's orbit, we know that it is the most primordial object ever explored. I'm excited to see the surface features of this small world, particularly the craters on the surface," said Deputy Project Scientist Cathy Olkin, of SwRI. "Young craters could provide a window to see the composition of the subsurface of Ultima. Also by counting the number and impactors that have hit Ultima, we can learn about the number of small objects in the outer solar system."

Read more at Science Daily

Our universe: An expanding bubble in an extra dimension

In their article, the scientists propose a new model with dark energy and our Universe riding on an expanding bubble in an extra dimension. The whole Universe is accommodated on the edge of this expanding bubble.
Uppsala University researchers have devised a new model for the Universe -- one that may solve the enigma of dark energy. Their new article, published in Physical Review Letters, proposes a new structural concept, including dark energy, for a universe that rides on an expanding bubble in an additional dimension.

We have known for the past 20 years that the Universe is expanding at an ever accelerating rate. The explanation is the "dark energy" that permeates it throughout, pushing it to expand. Understanding the nature of this dark energy is one of the paramount enigmas of fundamental physics.
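
As background (standard textbook cosmology, not a result from the Uppsala paper), the second Friedmann equation shows why a component with sufficiently negative pressure makes the expansion accelerate:

    % Standard FLRW cosmology, included only as background to the article.
    \[
      \frac{\ddot{a}}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^{2}}\right)
    \]
    % A cosmological-constant-like dark energy has p = -\rho c^2, so the bracketed
    % term becomes -2\rho < 0 and \ddot{a} > 0: the expansion speeds up.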

It has long been hoped that string theory will provide the answer. According to string theory, all matter consists of tiny, vibrating "stringlike" entities. The theory also requires there to be more spatial dimensions than the three that are already part of everyday knowledge. For 15 years, there have been models in string theory that have been thought to give rise to dark energy. However, these have come in for increasingly harsh criticism, and several researchers are now asserting that none of the models proposed to date are workable.

In their article, the scientists propose that our Universe, together with its dark energy, rides on an expanding bubble in an extra dimension, with the whole Universe accommodated on the edge of that bubble. All existing matter in the Universe corresponds to the ends of strings that extend out into the extra dimension. The researchers also show that expanding bubbles of this kind can come into existence within the framework of string theory. It is conceivable that there are more bubbles than ours, corresponding to other universes.

The Uppsala scientists' model provides a new, different picture of the creation and future fate of the Universe, while it may also pave the way for methods of testing string theory.

From Science Daily