Oct 6, 2018

Why huskies have blue eyes

Embark dog, Lakota, shows off bright blue eyes.
DNA testing of more than 6,000 dogs has revealed that a duplication on canine chromosome 18 is strongly associated with blue eyes in Siberian Huskies, according to a study published October 4, 2018, in the open-access journal PLOS Genetics by Adam Boyko and Aaron Sams of Embark Veterinary, Inc., and colleagues. Embark is a dog DNA startup company headquartered in Boston, MA, and Ithaca, NY, and a research partner of the Cornell University College of Veterinary Medicine. According to the authors, this represents the first consumer genomics study ever conducted in a non-human model and the largest canine genome-wide association study to date.

Consumer genomics enables genetic discovery on an unprecedented scale by linking very large databases of genomic data with phenotype information voluntarily submitted via web-based surveys. But the promise of consumer genomic data is not limited to human research. Genomic tools for dogs are readily available but the genetic underpinnings of many important traits remain undiscovered. Although two genetic variants are known to underlie blue eye color in some dogs, these do not explain the trait in some other dogs, like Siberian Huskies.

To address this gap in knowledge, Boyko, Sams and colleagues used a diverse panel of 6,070 genetically tested dogs whose owners contributed phenotype data via web-based surveys and photo uploads. They found that a 98.6-kilobase duplication on chromosome 18 near the ALX4 gene, which plays an important role in mammalian eye development, was strongly associated with variation in blue eye color, primarily in Siberian Huskies but also in non-merle Australian Shepherds. One copy of the variant was enough to cause blue eyes or heterochromia (blue and brown eyes), although some dogs with the variant did not have blue eyes, so other genetic or environmental factors are still involved. Future studies of the functional mechanism underlying this association may lead to the discovery of a novel pathway by which blue eyes develop in mammals. From a broader perspective, the results underscore the power of consumer data-driven discovery in non-human species, especially dogs, where there is intense owner interest in the personal genomic information of their pets, a high level of engagement with web-based surveys, and an underlying genetic architecture ideal for mapping studies.
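As a rough illustration of the statistics behind this kind of association finding (not the study's actual pipeline, and with invented counts), the core question is whether dogs carrying the duplication are blue-eyed more often than chance would allow. A 2x2 test captures the idea:

```python
# Illustrative sketch of a genotype-phenotype association test.
# All counts below are made up for demonstration; the real study used
# genome-wide association methods on 6,070 dogs.

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic (no continuity correction) for a 2x2 table:
        carriers:     a with blue eyes, b without
        non-carriers: c with blue eyes, d without
    """
    n = a + b + c + d
    observed = [a, b, c, d]
    expected = [
        (a + b) * (a + c) / n, (a + b) * (b + d) / n,
        (c + d) * (a + c) / n, (c + d) * (b + d) / n,
    ]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical counts: duplication carriers are far more often blue-eyed.
stat = chi_square_2x2(a=120, b=30, c=15, d=400)
odds_ratio = (120 * 400) / (30 * 15)
print(f"chi-square = {stat:.1f}, odds ratio = {odds_ratio:.1f}")
```

A large chi-square statistic (here in the hundreds) and a large odds ratio are what "strongly associated" means in practice; the genome-wide version simply runs such tests across millions of variants with a correspondingly stricter significance threshold.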

Aaron J. Sams adds: "Using genetic data from the pets of our customers, combined with eye colors reported by customers for those same animals, we have discovered a genetic duplication that is strongly associated with blue eye color. This study demonstrates the power of the approach that Embark is taking towards improving canine health. In a single year, we collected enough data to conduct the largest canine study of its kind. Embark is currently pursuing similar research projects in a range of morphological and health-related traits and we hope to continue to use our platform to move canine genetics and health forward in a very real way."

From Science Daily

New DNA tool predicts height, shows promise for serious illness assessment

A new tool accurately predicts height.
A new DNA tool created by Michigan State University can accurately predict people's height, and more importantly, could potentially assess their risk for serious illnesses, such as heart disease and cancer.

For the first time, the tool, or algorithm, builds predictors for human traits such as height, bone density and even the level of education a person might achieve, purely based on one's genome. But the applications may not stop there.

"While we have validated this tool for these three outcomes, we can now apply this method to predict other complex traits related to health risks such as heart disease, diabetes and breast cancer," said Stephen Hsu, lead investigator of the study and vice president for research and graduate studies at MSU. "This is only the beginning."

Further applications have the potential to dramatically advance the practice of precision health, which allows physicians to intervene as early as possible in patient care and prevent or delay illness.

The research, featured in the October issue of Genetics, analyzed the complete genetic makeup of nearly 500,000 adults in the United Kingdom using machine learning, where a computer learns from data.

In validation tests, the computer accurately predicted everyone's height within roughly an inch. While bone density and educational attainment predictors were not as precise, they were accurate enough to identify outlying individuals who were at risk of having very low bone density associated with osteoporosis or were at risk of struggling in school.

Traditional genetic testing typically looks for a specific change in a person's genes or chromosomes that can indicate a higher risk for diseases such as breast cancer. Hsu's model considers numerous genomic differences and builds a predictor based on the tens of thousands of variations.
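The scoring side of such a predictor is conceptually simple, even though learning the weights from hundreds of thousands of genomes is the hard part. A minimal sketch, with invented variant names and effect sizes (the real model uses tens of thousands of variants):

```python
# Minimal sketch of a polygenic predictor: a trait estimate is a weighted
# sum over many genetic variants, with per-allele weights learned
# beforehand from a large training cohort. Variants and weights here are
# invented for illustration.

def polygenic_score(genotypes, weights, intercept=0.0):
    """genotypes: dict variant_id -> allele count (0, 1, or 2)
       weights:   dict variant_id -> learned per-allele effect
    """
    return intercept + sum(
        weights[v] * count for v, count in genotypes.items() if v in weights
    )

# Hypothetical effects (in cm of height per allele) for three variants.
weights = {"rs_A": 0.4, "rs_B": -0.2, "rs_C": 0.1}
person = {"rs_A": 2, "rs_B": 1, "rs_C": 0}

predicted_height_cm = polygenic_score(person, weights, intercept=170.0)
print(predicted_height_cm)
```

The machine-learning step the article describes is, in effect, choosing which variants get nonzero weights and how large those weights should be so that this sum tracks the measured trait.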

Using data from the UK Biobank, an international resource for health information, Hsu and his team put the algorithm to work, evaluating each participant's DNA and teaching the computer to pull out these distinct differences.

"The algorithm looks at the genetic makeup and height of each person," Hsu said. "The computer learns from each person and ultimately produces a predictor that can determine how tall they are from their genome alone."

Hsu's team will continue to improve the algorithms, while tapping into larger, more diverse data sets. Doing this would further validate the techniques and continue to help map out the genetic architecture of these important traits and disease risks.

With greater computing power and decreasing costs around DNA sequencing, what was once thought to be five to 10 years out is now a lot closer when it comes to this type of work, Hsu added.

Read more at Science Daily

Oct 5, 2018

Gene signature predicts outcome after spinal cord injury

Scientists have determined a gene signature that is linked to the severity of spinal cord injury in animals and humans, according to a study in the open-access journal eLife.

The discovery of key genes that are switched on or off in response to spinal cord injury could inform the development of biomarkers that predict recovery and possibly pinpoint new targets for treatment.

At the moment, there are no widely available treatments capable of immediately restoring motor and sensory function after injury. A major barrier is the lack of understanding of the complex cascade of biological processes that occur when a spinal cord injury happens.

"Our understanding of the pathophysiological processes triggered by spinal cord injury is fragmentary," explains senior author Michael Skinnider, a medical and PhD student at the University of British Columbia, Canada. "We set out to integrate the data from decades of small-scale studies using a systems biology approach."

The team first reviewed past experiments to find genes associated with the response to spinal cord injury, searching through more than 500 studies. They found 695 unique human genes that had been linked with the response to spinal cord injury and, of these, 151 were linked in more than one study. Further analysis showed that the genes are biologically and functionally related, coding for groups of protein molecules that physically interact with one another.
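The aggregation step described here amounts to pooling the gene lists from individual studies and keeping genes reported more than once. A toy sketch with invented study contents (ANXA1, i.e. annexin A1, appears later in this article; the other gene symbols are stand-ins, not the study's actual lists):

```python
from collections import Counter

# Sketch of the literature-aggregation step: pool the gene lists reported
# by individual studies, then keep genes implicated in more than one.
# The study contents below are invented stand-ins.

studies = [
    {"ANXA1", "GFAP", "TNF"},
    {"ANXA1", "IL6"},
    {"GFAP", "CASP3", "ANXA1"},
]

counts = Counter(gene for genes in studies for gene in genes)
replicated = sorted(gene for gene, c in counts.items() if c > 1)
print(replicated)  # ['ANXA1', 'GFAP']
```

Scaled up to the more than 500 studies reviewed, this is how 695 unique genes reduce to the 151 replicated ones that anchor the rest of the analysis.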

To test whether these genes truly reflect functional changes after spinal cord injury, the team constructed a network of genes from healthy human spinal cords and integrated these data with those from the experimental studies. They found that two groups of genes (M3 and M7) included a high number of the genes previously pinpointed in experiments as important in the response to spinal cord injury.

They next looked at five experimental studies of gene expression in mice and rats after spinal cord injury to see whether these gene groups (and others) were significantly altered. They found that four gene groups, including M3 and M7, were switched on, and a further two gene groups were switched off. Some gene groups were not as connected in mice and rats as in humans, suggesting that they might be human-specific markers of spinal cord injury. Other gene groups were only important at a certain time-point after injury, suggesting that they are involved in the transition from acute to chronic injury.

Of all the gene groups studied, M3 genes were most strongly linked to injury severity in both mice and rats, which suggests that these genes could make an ideal biomarker for predicting injury severity. Indeed, one of the genes in this group, annexin A1, previously associated with spinal cord injury, was able to perfectly differentiate between moderately and severely injured rats when used as a biomarker.
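"Perfectly differentiate" has a precise reading in biomarker work: as a classifier, the marker achieves an ROC AUC of 1.0, meaning every severely injured animal scores higher than every moderately injured one. A sketch with invented expression values (only the separation pattern matters, not the numbers):

```python
# Rank-based ROC AUC: the probability that a randomly chosen positive
# case outranks a randomly chosen negative case (ties count half).

def auc(scores_pos, scores_neg):
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in scores_pos for n in scores_neg
    )
    return wins / (len(scores_pos) * len(scores_neg))

severe   = [8.1, 7.4, 9.0, 7.9]   # hypothetical annexin A1 expression
moderate = [3.2, 4.1, 2.8, 3.9]

print(auc(severe, moderate))  # 1.0 -> perfect separation
```

Any overlap between the two groups would pull the AUC below 1.0, which is why perfect separation in even a small animal cohort is a striking result.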

Read more at Science Daily

Teaching wild birds to sing a new tune

This image shows a Savannah sparrow.
Like toddlers learning to speak, young birds learn to sing by listening to the voices of adults. Now, researchers reporting in Current Biology on October 4 have shown for the first time that they could teach young sparrows in the wild how to sing a new tune. The wild birds then passed the new songs on to the next generation.

"I was quite shocked that our loudspeakers succeeded in teaching wild birds to sing," says Dan Mennill from the University of Windsor in Ontario, Canada. "The sparrows in our island-living population had abundant opportunities to learn songs from live tutors, and yet thirty birds learned songs from the loudspeakers, providing experimental evidence of vocal learning."

Conventional experiments of vocal learning in birds have been conducted in the laboratory. But such studies are much more difficult to do in the wild. The researchers overcame the challenges in the new study by focusing their attention on Savannah Sparrows living at Bowdoin Scientific Station on Kent Island. The sparrows on this island often return to the place of their birth to breed as adults. That made it possible for researchers to expose young birds to novel songs and then record those same animals when they returned from migration to breed the next year.

Mennill's team, including researchers from the University of Windsor, University of Guelph, and Williams College, developed a new type of loudspeaker that is programmable, solar powered, light activated, and weatherproof. The speakers allowed them to broadcast adult songs with distinctive acoustic signatures for the wild sparrows over tutoring sessions that lasted for months. Over a six-year period between 2013 and 2018, they experimentally tutored five cohorts of Savannah Sparrows, from the time they hatched to adulthood.

Across the five cohorts, thirty birds produced songs that matched the broadcasted songs. Those songs differed from anything the birds would have heard otherwise. In all thirty cases, the researchers report, the birds produced songs containing phrases that had never been recorded on the island in three decades of field study.

The findings confirm that wild Savannah Sparrows learn songs by listening to adult Savannah Sparrows. When those young sparrows become adults, they then pass these new songs on to subsequent generations. The new findings also provide the first experimental evidence that the timing of exposure to song influences vocal learning in wild birds. The Savannah Sparrows preferentially learned songs heard in the summer they hatched and then again at the outset of their first breeding season the following year.

Mennill says this study population of Savannah Sparrows, in which some males have learned typical songs and others have learned unusual songs, now presents unique opportunities for further study.

Read more at Science Daily

Living organisms find a critical balance

Scientists Sara Walker, Bradley Karas, Siyu Zhou, Bryan Daniels, Harrison Smith, Hyunju Kim with 67 sheets of paper, one for each of the biological networks studied in this research.
Biologists know a lot about how life works, but they are still figuring out the big questions of why life exists, why it takes various shapes and sizes, and how life is able to amazingly adapt to fill every nook and cranny on Earth.

An interdisciplinary team of researchers at Arizona State University has discovered that the answers to these questions may lie in the ability of life to find a middle ground, balancing between robustness and adaptability. The results of their study have been recently published in Physical Review Letters.

The importance of stability

The research team, led by Bryan Daniels of the Center for Biosocial Complex Systems with direction from faculty member Sara Walker of the School of Earth and Space Exploration, sifted through data to better understand the root connections among 67 biological networks that describe how components of these systems interact with one another. The biological networks are sets of individual components (like proteins and genes) that interact with one another to perform important tasks like transmitting signals or deciding a cell's fate. They measured a number of mathematical features, simulating the networks' behavior and looking for patterns to provide clues on what made them so special.

To perform their study, they examined data from the Cell Collective database, a rich resource encapsulating a wide range of biological processes across life, from humans and other animals to plants, bacteria and viruses. The number of components in these networks ranged from five nodes to 321 nodes, encompassing 6,500 different biological interactions.

And these nodes include many of life's key building blocks -- genes and proteins that act as master switches controlling cell division, growth and death, and communication.

Using a wealth of molecular data, scientists can now study the interactions among the building blocks, with an ultimate goal of understanding the key to how life emerges.

"We wanted to know whether the biological networks were special compared to random networks, and if so, how," says Daniels.

They focused on trying to find a threshold point at which an entire system may change in response to just a small change. Such a change could profoundly upset the balance of life, creating a teeter-totter of fate deciding whether an organism would die or thrive.

"In a stable system, organisms will always come back to their original state," explains Daniels. "In an unstable system, the effect of a small change will grow and cause the whole system to behave differently."

Through rigorous testing of the 67 networks, the team found that all of the networks shared a special property: They existed in between two extremes, neither too stable nor unstable.

As such, the team found that sensitivity, which is a measure of stability, was near a special point that biologists call "criticality," suggesting that the networks may be evolutionarily adapted to an optimal tradeoff between stability and instability.
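The sensitivity measure behind this result can be illustrated on toy Boolean networks (a common simplified model of gene regulation, not the curated Cell Collective models themselves): flip one node, advance the network one step, and count how many nodes now differ. A perturbation that on average neither shrinks nor grows, sensitivity near 1, is the hallmark of criticality.

```python
import random

# Toy estimate of network sensitivity: perturb one node of a random
# Boolean network, step once, and measure how far the disturbance spreads.
# Sensitivity < 1 is stable, > 1 is chaotic, ~1 is critical.

def random_network(n, k, seed=0):
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]           # k inputs per node
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def step(state, inputs, tables):
    out = []
    for ins, table in zip(inputs, tables):
        idx = sum(state[i] << pos for pos, i in enumerate(ins))    # index into truth table
        out.append(table[idx])
    return out

def sensitivity(n=50, k=2, trials=2000, seed=0):
    """Average number of nodes that differ one step after a single-node flip."""
    rng = random.Random(seed)
    inputs, tables = random_network(n, k, seed)
    total = 0
    for _ in range(trials):
        state = [rng.randint(0, 1) for _ in range(n)]
        flipped = list(state)
        flipped[rng.randrange(n)] ^= 1
        a, b = step(state, inputs, tables), step(flipped, inputs, tables)
        total += sum(x != y for x, y in zip(a, b))
    return total / trials

# Random networks with two inputs per node sit near criticality:
# the expected sensitivity is k/2 = 1.
print(round(sensitivity(), 2))
```

The study's finding is that the 67 real biological networks cluster near this critical value, rather than being scattered across the stable and chaotic regimes as generic networks can be.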

Life in the balance


Previous studies have shown that a handful of biological systems, from neurons to ant colonies, lie in this middle ground of criticality and this new research expands the list of living systems in this state.

This can be of particular interest to astrobiologists, like co-author Walker who is searching for life on other planets. Understanding how life can take various forms, and why it does so, may help identify life on other planets and determine how it might look different from life on Earth. It can also help inform our search for the origins of life in the lab.

"We still don't really understand what life is," says Walker, "and determining what quantitative properties, such as criticality, best distinguish life from non-life is an important step toward building that understanding at a fundamental level so that we may recognize life on other worlds or in our experiments on Earth, even if it looks very different than us."

The findings also advance the field of quantitative biology by showing that, from the basic building blocks of life, scientists can identify a critical sensitivity that is common across a large swath of biology. And it promises to advance synthetic biology by allowing scientists to use life's building blocks to more accurately construct biochemical networks that are similar to living systems.

"Each biological system has distinctive features, from its components and its size to its function and its interactions with the surrounding environment," explains co-author Hyunju Kim of the School of Earth and Space Exploration and the Beyond Center. "In this research, for the first time, we are able to make connections between the theoretical hypothesis on biological systems' universal tendency to retain the balance at the medium degree of stability and 67 biological models with various characteristics built on actual experiment data."

In addition to Daniels, Walker, and Kim, the interdisciplinary research team on this study includes co-authors Douglas Moore of the Beyond Center, Siyu Zhou of the Department of Physics, Bradley Karas and Harrison Smith of the School of Earth and Space Exploration, and Stuart Kauffman of the Institute for Systems Biology in Seattle, Washington.

This research emerged from a course led by Walker and Kim on complex systems approaches to understanding life, offered at the School of Earth and Space Exploration. Co-authors Karas, Zhou, and Smith were originally students in the class when the project began.

Read more at Science Daily

Surprising chemical complexity of Saturn's rings changing planet's upper atmosphere

During Cassini's 'Grand Finale' plunge into Saturn's innermost ring and upper atmosphere in 2017, the mass spectrometer aboard the probe sampled chemicals at altitudes between Saturn's rings and atmosphere.
Political humorist Mark Russell once joked, "The scientific theory I like best is that the rings of Saturn are composed entirely of lost airline luggage."

Well, there's no luggage, it turns out. But a new study appearing in Science based on data from the final orbits last year of NASA's Cassini spacecraft shows the rings of Saturn -- some of the most visually stupendous objects in the universe -- are far more chemically complicated than previously was understood.

Furthermore, the paper shows the innermost D ring of the gas giant is hurling dust grains coated in its chemical cocktail into the planet's upper atmosphere at an extraordinary rate as it spins. Over long timescales, the researchers say this infalling material may change the carbon and oxygen content of the atmosphere.

"This is a new element of how our solar system works," said Thomas Cravens, professor of physics & astronomy at the University of Kansas and a co-author of the new paper. "Two things surprised me. One is the chemical complexity of what was coming off the rings -- we thought it would be almost entirely water based on what we saw in the past. The second thing is the sheer quantity of it -- a lot more than we originally expected. The quality and quantity of the materials the rings are putting into the atmosphere surprised me."

Cravens is a member of Cassini's Ion and Neutral Mass Spectrometer (INMS) team. During Cassini's "Grand Finale" plunge into Saturn's innermost ring and upper atmosphere in 2017, the mass spectrometer aboard the probe sampled chemicals at altitudes between Saturn's rings and atmosphere.

More than simply water, the INMS found the rings to be composed of water, methane, ammonia, carbon monoxide, molecular nitrogen and carbon dioxide.

"What the paper is describing is the environment in the gap between the inner ring and upper atmosphere, and some of the things found were expected, such as water," Cravens said. "What was a surprise was the mass spectrometer saw methane -- no one expected that. Also, it saw some carbon dioxide, which was unexpected. The rings were thought to be entirely water. But the innermost rings are fairly contaminated, as it turns out, with organic material caught up in ice."

A further new finding from Cassini's mass spectrometer was that large amounts of the chemical brew from Saturn's D ring are flung into the planet's upper atmosphere because the ring spins faster than the planet's atmosphere itself.

"We saw it was happening even though it's not fully understood," the KU researcher said. "What we saw is this material, including some benzene, was altering the uppermost atmosphere of Saturn in the equatorial region. There were both grains and dust that were contaminated."

Cravens said the findings could cast new light on mechanisms underpinning our solar system as well as other solar systems and exoplanets -- and also prompt a host of new scientific questions.

"This could help us understand, how does a planet get rings? Some do, some don't," he said. "What's the lifetime of a ring? And what's replenishing the rings? Was there a time when Saturn didn't have rings? How did that composition get into there in the first place? Is it left over from the formation of our solar system? Does it date back to proto pre-solar nebula, the nebula that collapsed out of interstellar media that formed the sun and planets?"

According to Cravens, the higher-than-expected rate of material being expelled from Saturn's D Ring into the planet's upper atmosphere, or ionosphere, is sufficient that astronomers now think the lifespan of the ring may be briefer than previously estimated.

"Because of this data, we now have shortened the lifetime of inner rings because of the quantity of material being moved out -- it's much more than we thought before," Cravens said. "We know that it's bumping material out of the rings at least 10 times faster than we thought. If it's not being replenished, the rings aren't going to last -- you've got a hole in your bucket. Jupiter probably had a ring that evolved into the current wispy ring, and it could be for similar reasons. Rings do come and go. At some point they gradually drain away unless somehow they're getting new material."

Assisted by KU graduate and undergraduate students, a first stage of Cravens' work involved sorting and cleaning raw data from Cassini's INMS instrument.

"The raw data came through from our instrument on Cassini to deep-space antennas to NASA's Jet Propulsion Laboratory and then to computers at the Southwest Research Institute in San Antonio where Hunter Waite, the first author, is based," he said.

But Cravens' main contribution involved interpreting that data with a focus on how materials from the rings are altering Saturn's ionosphere. Cravens and his colleagues report that the influx of chemicals from the rings changes Saturn's equatorial ionospheric chemistry by converting hydrogen ions and triatomic hydrogen ions into heavier molecular ions, depleting the planet's ionospheric density.

Read more at Science Daily

Oct 4, 2018

Mountaintop observatory sees gamma rays from exotic Milky Way object

The High-Altitude Water Cherenkov Gamma-Ray Observatory (HAWC) is a detector designed to look at gamma-ray emission coming from astronomical objects such as supernova remnants, quasars and rotating dense stars called pulsars. Located roughly 13,500 feet above sea level near the Sierra Negra volcano in Mexico, the detector is composed of more than 300 tanks of water, each about 24 feet in diameter. When particles strike the water, they produce a shock wave of blue light called Cherenkov radiation. Special cameras in the tanks detect this light, allowing scientists to determine the origin of incoming gamma rays.
The night sky seems serene, but telescopes tell us that the universe is filled with collisions and explosions. Distant, violent events signal their presence by spewing light and particles in all directions. When these messengers reach Earth, scientists can use them to map out the action-packed sky, helping to better understand the volatile processes happening deep within space.

For the first time, an international collaboration of scientists has detected highly energetic light coming from the outermost regions of an unusual star system within our own galaxy. The source is a microquasar -- a black hole that gobbles up stuff from a nearby companion star and blasts out two powerful jets of material. The team's observations, described in the October 4, 2018 issue of the journal Nature, strongly suggest that electron acceleration and collisions at the ends of the microquasar's jets produced the powerful gamma rays. Scientists think that studying messengers from this microquasar may offer a glimpse into more extreme events happening at the centers of distant galaxies.

The team gathered data from the High-Altitude Water Cherenkov Gamma-Ray Observatory (HAWC), which is a detector designed to look at gamma-ray emission coming from astronomical objects such as supernova remnants, quasars and rotating dense stars called pulsars. Now, the team has studied one of the most well-known microquasars, named SS 433, which is about 15,000 light years away from Earth. Scientists have seen about a dozen microquasars in our galaxy and only a couple of them appear to emit high-energy gamma rays. With SS 433's close proximity and orientation, scientists have a rare opportunity to observe extraordinary astrophysics.

"SS 433 is right in our neighborhood and so, using HAWC's unique wide field of view, we were able to resolve both microquasar particle acceleration sites," said Jordan Goodman, a Distinguished University Professor at the University of Maryland and U.S. lead investigator and spokesperson for the HAWC collaboration. "By combining our observations with multi-wavelength and multi-messenger data from other telescopes, we can improve our understanding of particle acceleration in SS 433 and its giant, extragalactic cousins, called quasars."

Quasars are massive black holes that suck in material from the centers of galaxies, rather than feeding on a single star. They actively expel radiation, which can be seen from across the universe. But they are so far away that most known quasars have been detected because their jets are aimed at Earth -- like having a flashlight aimed directly at one's eyes. In contrast, SS 433's jets are oriented away from Earth, and HAWC has detected similarly energetic light coming from the microquasar's side.

Regardless of where they originate, gamma rays travel in a straight line to their destination. The ones that arrive at Earth collide with molecules in the atmosphere, creating new particles and lower-energy gamma rays. Each new particle then smashes into more stuff, creating a particle shower as the signal cascades toward the ground.
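The cascade just described is captured by the textbook Heitler toy model: at each step every particle splits in two, halving the energy per particle, until particles fall below a critical energy and the shower stops multiplying. A sketch, using 85 MeV as a commonly quoted critical energy for electromagnetic showers in air and an assumed 1 TeV primary:

```python
# Heitler-style toy shower: repeated doubling until the energy per
# particle would drop below the critical energy. Numbers are a standard
# classroom example, not values from the HAWC analysis.

def heitler_shower(primary_ev, critical_ev=85e6):
    particles, energy_each = 1, primary_ev
    generations = 0
    while energy_each / 2 >= critical_ev:
        particles *= 2          # every particle splits in two...
        energy_each /= 2        # ...sharing its energy equally
        generations += 1
    return generations, particles

generations, particles = heitler_shower(1e12)   # assume a 1 TeV primary
print(f"{generations} generations, {particles} particles at shower maximum")
```

Even this crude model shows why a single high-energy gamma ray arrives at the ground as thousands of particles spread over a large area, which is what lets a tank array like HAWC detect it.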

HAWC, located roughly 13,500 feet above sea level near the Sierra Negra volcano in Mexico, is perfectly situated to catch the fast-moving rain of particles. The detector is composed of more than 300 tanks of water, each of which is about 24 feet in diameter. When the particles strike the water they are moving fast enough to produce a shock wave of blue light called Cherenkov radiation. Special cameras in the tanks detect this light, allowing scientists to determine the origin story of the gamma rays.
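"Moving fast enough" has a sharp physical meaning here: a charged particle radiates Cherenkov light only when its speed exceeds the phase velocity of light in the medium, beta > 1/n. For water (n of about 1.33) that fixes a minimum kinetic energy for, say, a shower electron, which this short calculation works out:

```python
import math

# Cherenkov threshold for an electron in water: the particle must satisfy
# beta > 1/n, which translates into a minimum Lorentz factor and hence a
# minimum kinetic energy.

N_WATER = 1.33
ELECTRON_MASS_MEV = 0.511          # electron rest energy, m_e c^2

beta_threshold = 1.0 / N_WATER
gamma_threshold = 1.0 / math.sqrt(1.0 - beta_threshold ** 2)
kinetic_threshold_mev = (gamma_threshold - 1.0) * ELECTRON_MASS_MEV

print(f"beta > {beta_threshold:.3f}")
print(f"kinetic energy > {kinetic_threshold_mev:.3f} MeV")  # about 0.26 MeV
```

Shower particles arriving at the tanks carry far more energy than this roughly 0.26 MeV threshold, so nearly all of them light up the detectors.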

The HAWC collaboration examined 1,017 days' worth of data and saw evidence that gamma rays were coming from the ends of the microquasar's jets, rather than the central part of the star system. Based on their analysis, the researchers concluded that electrons in the jets attain energies that are about a thousand times higher than can be achieved using earthbound particle accelerators, such as the city-sized Large Hadron Collider, located along the border between France and Switzerland. The jets' electrons collide with the low-energy microwave background radiation that permeates space, resulting in gamma ray emission. This is a new mechanism for generating high-energy gamma rays in this type of system and is different than what scientists have observed when an object's jets are aimed at Earth.
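The emission mechanism named above, electrons upscattering cold microwave-background photons, can be estimated with the standard inverse-Compton formula. In the Thomson limit the upscattered photon carries roughly (4/3) x gamma-squared times the seed photon energy. The 100 TeV electron energy below is an assumed round number, and at such energies Klein-Nishina suppression is significant, so treat this as an upper-end, order-of-magnitude sketch rather than the paper's calculation:

```python
# Thomson-limit inverse-Compton estimate: a relativistic electron with
# Lorentz factor gamma boosts a CMB photon to ~ (4/3) * gamma^2 * E_seed.
# The electron energy is an assumed example value.

ELECTRON_REST_EV = 0.511e6     # m_e c^2 in eV
CMB_MEAN_EV = 6.3e-4           # mean CMB photon energy at ~2.7 K

def thomson_ic_energy_ev(electron_energy_ev):
    gamma = electron_energy_ev / ELECTRON_REST_EV
    return (4.0 / 3.0) * gamma ** 2 * CMB_MEAN_EV

e_gamma = thomson_ic_energy_ev(100e12)     # a hypothetical 100 TeV electron
print(f"upscattered photon energy ~ {e_gamma / 1e12:.0f} TeV")
```

The point of the estimate is the scaling: because the boost goes as gamma squared, electrons in the jets at energies of order 100 TeV can turn sub-milli-electronvolt microwave photons into the tens-of-TeV gamma rays HAWC observes.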

Ke Fang, a co-author of the study and former postdoctoral researcher at the Joint Space-Science Institute, a partnership between UMD and NASA's Goddard Space Flight Center, said that this new measurement is critical to understanding what is going on in SS 433.

"Looking at only one kind of light coming from SS 433 is like seeing only the tail of an animal," said Fang, who is currently an Einstein Fellow at Stanford University. "Thus, we combine all of its signals, from low energy radio to X-ray, with new high-energy gamma ray observations, to find out what kind of beast SS 433 really is."

Until now, instruments had not observed SS 433 emitting such highly energetic gamma rays. But HAWC is designed to be very sensitive to this extreme part of the light spectrum. The detector also has a wide field of view that looks at the entire overhead sky all of the time. The collaboration used these capabilities to resolve the microquasar's structural features.

"SS 433 is an unusual star system and each year something new has come out about it," said Segev BenZvi, another co-author of the study and an assistant professor of physics at the University of Rochester. "This new observation of high-energy gamma rays builds on almost 40 years of measurements of one of the weirdest objects in the Milky Way. Every measurement gives us a different piece of the puzzle, and we hope to use our knowledge to learn about the quasar family as a whole."

Read more at Science Daily

Neanderthal-like features in 450,000-year-old fossil teeth from the Italian Peninsula

This is a virtual rendering of the Visogliano and Fontana Ranuccio teeth.
Fossil teeth from Italy, among the oldest human remains on the Italian Peninsula, show that Neanderthal dental features had evolved by around 450,000 years ago, according to a study published October 3, 2018 in the open-access journal PLOS ONE by Clément Zanolli of the Université Toulouse III Paul Sabatier in France and colleagues. These teeth also add to a growing picture of a period of complex human evolution that we are only beginning to understand.

Zanolli and colleagues examined dental remains from the sites of Fontana Ranuccio, located 50km southeast of Rome, and Visogliano, located 18km northwest of Trieste. At around 450,000 years old, these teeth join a very short list of fossil human remains from Middle Pleistocene Europe. Using micro-CT scanning and detailed morphological analyses, the authors examined the shape and arrangement of tooth tissues and compared them with teeth of other human species. They found that the teeth from both sites share similarities with Neanderthals and are distinct from modern humans.

There has been much debate over the identities and relationships of Middle Pleistocene ancient humans in Eurasia. The discovery of Neanderthal-like teeth so early in the record adds support to the suggestion of an early divergence of the Neanderthal lineage from our own, around the Early-Middle Pleistocene transition. The teeth are also notably different from other teeth known from this time in Eurasia, suggesting that there may have been multiple human lineages populating the region at this time, adding to a growing list of evidence that the Middle Pleistocene was a time of more complex human evolution than previously recognized.

Zanolli adds: "The remains from Fontana Ranuccio and Visogliano represent among the oldest human fossil remains testifying to a peopling phase of the Italian Peninsula. Our analyses of the tooth internal structural organization reveal a Neanderthal-like signature, also resembling the condition shown by the contemporary assemblage from Atapuerca Sima de los Huesos, indicating that an overall Neanderthal morphological dental template was preconfigured in Western Europe at least 430 to 450 ka ago."

From Science Daily

Traces of opiates found in ancient Cypriot vessel

The base-ring juglet resembles the seed head of an opium poppy.
Researchers at the University of York and the British Museum have discovered traces of opiates preserved inside a distinctive vessel dating back to the Late Bronze Age.

Vessels of this type, known as 'base-ring juglets', have long been thought to have links with opium use because when inverted they resemble the seed head of the opium poppy; they are known to have been widely traded in the eastern Mediterranean ca. 1650 -- 1350 BC.

Researchers used a range of analytical techniques to study a particular juglet housed in the British Museum. Because the vessel was sealed, its contents had been preserved, giving scientists a rare opportunity to investigate what components might have survived.

Initial analysis by scientists at the British Museum showed that the juglet residue was mostly composed of a plant oil, but hinted at the presence of opium alkaloids, a group of organic compounds derived from the opium poppy that are known to have significant psychoactive effects on the human body.

To conclusively detect the alkaloids and demonstrate the presence of opiates in the oil-based residue of the vessel, however, a new analytical technique was needed.

Using instruments in the Centre of Excellence in Mass Spectrometry at the University of York, Dr Rachel Smith developed the new analytical method as part of her PhD at the University's Department of Chemistry.

Dr Smith said: "The particular opiate alkaloids we detected are ones we have shown to be the most resistant to degradation, which makes them better targets in ancient residues than more well-known opiates such as morphine.

"We found the alkaloids in degraded plant oil, so the question as to how opium would have been used in this juglet still remains. Could it have been one ingredient amongst others in an oil-based mixture, or could the juglet have been re-used for oil after the opium or something else entirely?"

In the past, it has been argued that these juglets could have been used to hold poppy seed oil, containing traces of opium, used for anointing or in a perfume. In this theory, the opium effects may have held symbolic significance.

Professor Jane Thomas-Oates, Chair of Analytical Science in the Department of Chemistry, and supervisor of the study at the University of York, said: "The juglet is significant in revealing important details about trade and the culture of the period, so it was important to us to try and progress the debate about what it might have been used for.

"We were able to establish a rigorous method for detecting opiates in this kind of residue, but the next analytical challenge is to see if we can succeed with less well-preserved residues."

This is the first time that reliable chemical evidence has been produced to link the opium poppy with a base-ring juglet, despite many previous attempts by researchers over the years.

Dr Rebecca Stacey, Senior Scientist in the Department of Scientific Research at the British Museum, said: "It is important to remember that this is just one vessel, so the result raises lots of questions about the contents of the juglet and its purpose. The presence of the alkaloids here is unequivocal and lends a new perspective to the debate about their significance."

Read more at Science Daily

Viruses influenced gene sharing between Neanderthals and humans

This graphical abstract shows how human genome evolution after Neanderthal interbreeding was shaped by viral infections and the resulting selection for ancient alleles of virus-interacting protein genes.
Human evolution used to be depicted as a straight line, gradually progressing from an ape-like ancestor to modern Homo sapiens. But thanks to next-generation sequencing -- as well as the discovery of genetic material from extinct subspecies of early humans -- findings in recent years have shown that it wasn't quite so orderly. The human family tree is full of twists and branches that helped shape what we are today. Now, a study published in the journal Cell is reporting new details about the role of viruses in shaping evolution, in particular viral interactions between modern humans and Neanderthals.

"It's not a stretch to imagine that when modern humans met up with Neanderthals, they infected each other with pathogens that came from their respective environments," says first author David Enard (@DavidEnard), an assistant professor in ecology and evolutionary biology at the University of Arizona. "By interbreeding with each other, they also passed along genetic adaptations to cope with some of those pathogens."

Current thinking is that modern humans began moving out of Africa and into Eurasia about 70,000 years ago. When they arrived, they met up with Neanderthals who, along with their own ancestors, had been adapting to that geographic area for hundreds of thousands of years. The Eurasian environment shaped Neanderthals' evolution, including the development of adaptations to viruses and other pathogens that were present there but not in Africa.

The Cell study provides new details about the role of adaptive introgression, or hybridization between species, in human evolution. "Some of the Neanderthals had adaptive mutations that gave them advantages against these pathogens, and they were able to pass some of these mutations on to modern humans," explains Enard, who completed the work while he was a postdoctoral researcher at Stanford University. "That's called positive natural selection -- it favors certain individuals that carry these advantageous mutations."

Enard and Dmitri Petrov (@PetrovADmitri), the Michelle and Kevin Douglas Professor of Biology at Stanford University and senior author of the new study, use bioinformatics tools to study global patterns of evolution across tens of thousands of years. Their earlier research focused on how viruses impacted the evolution of humans. In 2016, they reported that about one-third of protein adaptations since humans split from other great apes were driven by a response to infectious viruses. The new work, building on those findings, looked at which of those adaptations may have come from Neanderthals.

In the current study, the investigators annotated thousands of genes in the human genome that are known to interact with pathogens -- more than 4,000 of the 25,000 total genes. "We focused on these genes because the ones that interact with viruses are much more likely to have been involved in adaptation against infectious disease compared with genes that don't have anything to do with viruses," Enard says.

They then looked at whether there was an enrichment of stretches of Neanderthal DNA in those 4,000 genes. Earlier studies from other groups have shown that Neanderthal DNA is present in humans. Those sequences are publicly available to investigators in the field. Based on the analysis, Enard and Petrov found strong evidence that adaptive genes that provided resistance against viruses were shared between Neanderthals and modern humans.
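The core of such an analysis is an over-representation test: is Neanderthal DNA found in the virus-interacting genes more often than chance would predict? The sketch below uses a one-sided hypergeometric test. The 25,000-gene total and 4,000 virus-interacting genes come from the article; the introgression counts are invented placeholders, not the study's data:

```python
from math import comb

def hypergeom_enrichment_p(total, annotated, introgressed, overlap):
    """P(seeing `overlap` or more by chance) if introgressed segments
    landed on genes uniformly at random (one-sided hypergeometric test)."""
    denom = comb(total, introgressed)
    p = 0.0
    for k in range(overlap, min(annotated, introgressed) + 1):
        p += comb(annotated, k) * comb(total - annotated, introgressed - k) / denom
    return p

# 25,000 genes, 4,000 virus-interacting (from the article); the two
# introgression counts below are illustrative placeholders only.
p_enriched = hypergeom_enrichment_p(25000, 4000, 1000, 200)
p_chance = hypergeom_enrichment_p(25000, 4000, 1000, 160)  # 160 = chance expectation
```

An observed overlap well above the chance expectation (here 4,000/25,000 × 1,000 = 160) yields a small p-value; that is the sense in which an "enrichment" of Neanderthal sequence in these genes constitutes evidence of adaptive introgression.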

"Many Neanderthal sequences have been lost in modern humans, but some stayed and appear to have quickly increased to high frequencies at the time of contact, suggestive of their selective benefits at that time," Petrov says. "Our research aims to understand why that was the case. We believe that resistance to specific RNA viruses provided by these Neanderthal sequences was likely a big part of the reason for their selective benefits."

"One of the things that population geneticists have wondered about is why we have maintained these stretches of Neanderthal DNA in our own genomes," Enard adds. "This study suggests that one of the roles of those genes was to provide us with some protection against pathogens as we moved into new environments."

In addition to revealing new details about human evolution, Enard notes, another benefit of this type of research is that it will help investigators uncover new clues about ancient disease outbreaks. This could potentially inform better ways to monitor for and treat future epidemics. "RNA is very fragile and gets degraded fast, so it's hard to learn much about ancient diseases that were caused by RNA-based viruses," he says. "You can think of these genetic adaptations like footprints from long-extinct dinosaurs preserved in fossilized mud. Even without having access to the viruses themselves, scientists who study prehistoric epidemics will be able to learn about the pathogens that drove them."

Read more at Science Daily

Astronomers find first compelling evidence for a moon outside our solar system

This is an artist's impression of the exoplanet Kepler-1625b, transiting the star, with the candidate exomoon in tow.
A pair of Columbia University astronomers using NASA's Hubble Space Telescope and Kepler Space Telescope have assembled compelling evidence for the existence of a moon orbiting a gas-giant planet 8,000 light-years away.

In a paper published Oct. 3 in the journal Science Advances, Alex Teachey and David Kipping report the detection of a candidate exomoon -- that is, a moon orbiting a planet in another star system. The candidate is unusual because of its large size, comparable to the diameter of Neptune. Such gargantuan moons do not exist in our own solar system, where nearly 200 natural satellites have been cataloged.

"This would be the first case of detecting a moon outside our solar system," said Kipping, an assistant professor of astronomy at Columbia. "If confirmed by follow-up Hubble observations, the finding could provide vital clues about the development of planetary systems and may cause experts to revisit theories of how moons form around planets."

In looking for exomoons, the researchers analyzed data from 284 Kepler-discovered planets that were in comparatively wide orbits, with periods greater than 30 days, around their host star. The observations measured the momentary dimming of starlight as a planet passed in front of its star, called a transit. The researchers found one instance, in Kepler-1625b, that had intriguing anomalies.
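The depth of that dimming is, to first order, just the squared ratio of the transiting body's radius to the star's. The numbers below use generic Sun, Jupiter, and Neptune radii as stand-ins (the article gives no radii for Kepler-1625 itself), so they only illustrate the relative scale of a planetary dip and a Neptune-sized moon's smaller dip:

```python
R_SUN = 696_000.0       # km, generic solar radius
R_JUPITER = 69_911.0    # km
R_NEPTUNE = 24_622.0    # km

def transit_depth(r_body_km, r_star_km=R_SUN):
    """Fractional dimming when a body crosses the stellar disk: (R_body/R_star)**2."""
    return (r_body_km / r_star_km) ** 2

planet_dip = transit_depth(R_JUPITER)   # roughly 1 percent
moon_dip = transit_depth(R_NEPTUNE)     # roughly 0.1 percent
```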

"We saw little deviations and wobbles in the light curve that caught our attention," Kipping said.

The Kepler results were enough for the team to get 40 hours of time with Hubble to intensively study the planet, obtaining data four times more precise than that of Kepler. The researchers monitored the planet before and during its 19-hour-long transit across the face of the star. After it ended, Hubble detected a second and much smaller decrease in the star's brightness 3.5 hours later, consistent with "a moon trailing the planet like a dog following its owner on a leash," Kipping said. "Unfortunately, the scheduled Hubble observations ended before the complete transit of the moon could be measured."

In addition to this dip in light, Hubble provided supporting evidence for the moon hypothesis by measuring that the planet began its transit 1.25 hours earlier than predicted. This is consistent with the planet and moon orbiting a common center of gravity (barycenter) that would cause the planet to wobble from its predicted location.

"An extraterrestrial civilization watching the Earth and Moon transit the Sun would note similar anomalies in the timing of Earth's transit," Kipping said.

The researchers note that in principle this anomaly could be caused by the gravitational pull of a hypothetical second planet in the system, although Kepler found no evidence for additional planets around the star during its four-year mission.

"A companion moon is the simplest and most natural explanation for the second dip in the light curve and the orbit-timing deviation," said lead author Teachey, NSF Graduate Fellow in astronomy at Columbia. "It was a shocking moment to see that light curve, my heart started beating a little faster and I just kept looking at that signature. But we knew our job was to keep a level head testing every conceivable way in which the data could be tricking us until we were left with no other explanation."

The moon is estimated to be only 1.5 percent the mass of its companion planet, which is itself estimated to be several times the mass of Jupiter. This value is close to the mass ratio between the Earth and its moon. But in the case of the Earth-Moon system and the Pluto-Charon system -- the largest of the five known natural satellites of the dwarf planet Pluto -- an early collision with a larger body is hypothesized to have blasted off material that later coalesced into a moon. Kepler-1625b and its satellite, however, are gaseous, not rocky, and therefore such a collision may not lead to the condensation of a satellite.
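As a quick sanity check on that comparison, the Earth-Moon mass ratio works out to about 1.2 percent, close to the reported 1.5 percent. The exomoon figure below assumes a hypothetical Neptune-mass moon around a four-Jupiter-mass planet, values chosen only to match the article's rough descriptions, not measured properties of the system:

```python
M_MOON = 7.342e22       # kg
M_EARTH = 5.972e24      # kg
M_NEPTUNE = 1.024e26    # kg
M_JUPITER = 1.898e27    # kg

earth_moon_ratio = M_MOON / M_EARTH          # about 0.012
# Hypothetical stand-in for Kepler-1625b and its candidate moon:
exomoon_ratio = M_NEPTUNE / (4 * M_JUPITER)  # about 0.013
```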

Exomoons are difficult to find because they are smaller than their companion planet and so their transit signal is weak; they also shift position with each transit because the moon is orbiting the planet. In addition, the ideal candidate planets hosting moons are in large orbits, with long and infrequent transit times. In this search, the Neptune-sized moon would have been among the easiest to first detect because of its large size.

The host planet and its moon lie within the habitable zone of their solar-mass star, Kepler-1625, where moderate temperatures allow for the existence of liquid water on any solid planetary surface. "Both bodies, however, are considered to be gaseous and therefore unsuitable for life as we know it," Kipping said.

Read more at Science Daily

Oct 3, 2018

New simulation sheds light on spiraling supermassive black holes

Gas glows brightly in this computer simulation of supermassive black holes only 40 orbits from merging. Models like this may eventually help scientists pinpoint real examples of these powerful binary systems. (Full video available at: https://svs.gsfc.nasa.gov/13043)
A new model is bringing scientists a step closer to understanding the kinds of light signals produced when two supermassive black holes, which are millions to billions of times the mass of the Sun, spiral toward a collision. For the first time, a new computer simulation that fully incorporates the physical effects of Einstein's general theory of relativity shows that gas in such systems will glow predominantly in ultraviolet and X-ray light.

Just about every galaxy the size of our own Milky Way or larger contains a monster black hole at its center. Observations show galaxy mergers occur frequently in the universe, but so far no one has seen a merger of these giant black holes.

"We know galaxies with central supermassive black holes combine all the time in the universe, yet we only see a small fraction of galaxies with two of them near their centers," said Scott Noble, an astrophysicist at NASA's Goddard Space Flight Center in Greenbelt, Maryland. "The pairs we do see aren't emitting strong gravitational-wave signals because they're too far away from each other. Our goal is to identify -- with light alone -- even closer pairs from which gravitational-wave signals may be detected in the future."

A paper describing the team's analysis of the new simulation was published Tuesday, Oct. 2, in the Astrophysical Journal and is now available online.

Scientists have detected merging stellar-mass black holes -- which range from around three to several dozen solar masses -- using the National Science Foundation's Laser Interferometer Gravitational-Wave Observatory (LIGO). Gravitational waves are space-time ripples traveling at the speed of light. They are created when massive orbiting objects like black holes and neutron stars spiral together and merge.

Supermassive mergers will be much more difficult to find than their stellar-mass cousins. One reason ground-based observatories can't detect gravitational waves from these events is because Earth itself is too noisy, shaking from seismic vibrations and gravitational changes from atmospheric disturbances. The detectors must be in space, like the Laser Interferometer Space Antenna (LISA) led by ESA (the European Space Agency) and planned for launch in the 2030s. Observatories monitoring sets of rapidly spinning, superdense stars called pulsars may detect gravitational waves from monster mergers. Like lighthouses, pulsars emit regularly timed beams of light that flash in and out of view as they rotate. Gravitational waves could cause slight changes in the timing of those flashes, but so far studies haven't yielded any detections.

But supermassive binaries nearing collision may have one thing stellar-mass binaries lack -- a gas-rich environment. Scientists suspect the supernova explosion that creates a stellar black hole also blows away most of the surrounding gas. The black hole consumes what little remains so quickly there isn't much left to glow when the merger happens.

Supermassive binaries, on the other hand, result from galaxy mergers. Each supersized black hole brings along an entourage of gas and dust clouds, stars and planets. Scientists think a galaxy collision propels much of this material toward the central black holes, which consume it on a time scale similar to that needed for the binary to merge. As the black holes near, magnetic and gravitational forces heat the remaining gas, producing light astronomers should be able to see.

"It's very important to proceed on two tracks," said co-author Manuela Campanelli, director of the Center for Computational Relativity and Gravitation at the Rochester Institute of Technology in New York, who initiated this project nine years ago. "Modeling these events requires sophisticated computational tools that include all the physical effects produced by two supermassive black holes orbiting each other at a fraction of the speed of light. Knowing what light signals to expect from these events will help modern observations identify them. Modeling and observations will then feed into each other, helping us better understand what is happening at the hearts of most galaxies."

The new simulation shows three orbits of a pair of supermassive black holes only 40 orbits from merging. The models reveal the light emitted at this stage of the process may be dominated by UV light with some high-energy X-rays, similar to what's seen in any galaxy with a well-fed supermassive black hole.

Three regions of light-emitting gas glow as the black holes merge, all connected by streams of hot gas: a large ring encircling the entire system, called the circumbinary disk, and two smaller ones around each black hole, called mini disks. All these objects emit predominantly UV light. When gas flows into a mini disk at a high rate, the disk's UV light interacts with each black hole's corona, a region of high-energy subatomic particles above and below the disk. This interaction produces X-rays. When the accretion rate is lower, UV light dims relative to the X-rays.

Based on the simulation, the researchers expect X-rays emitted by a near-merger will be brighter and more variable than X-rays seen from single supermassive black holes. The pace of the changes is linked to both the orbital speed of gas at the inner edge of the circumbinary disk and that of the merging black holes.

"The way both black holes deflect light gives rise to complex lensing effects, as seen in the movie when one black hole passes in front of the other," said Stéphane d'Ascoli, a doctoral student at École Normale Supérieure in Paris and lead author of the paper. "Some exotic features came as a surprise, such as the eyebrow-shaped shadows one black hole occasionally creates near the horizon of the other."

The simulation ran on the National Center for Supercomputing Applications' Blue Waters supercomputer at the University of Illinois at Urbana-Champaign. Modeling three orbits of the system took 46 days on 9,600 computing cores. Campanelli said the collaboration was recently awarded additional time on Blue Waters to continue developing their models.

The original simulation estimated gas temperatures. The team plans to refine their code to model how changing parameters of the system, like temperature, distance, total mass and accretion rate, will affect the emitted light. They're interested in seeing what happens to gas traveling between the two black holes as well as modeling longer time spans.

Read more at Science Daily

Black holes ruled out as universe's missing dark matter

A supernova (bright spot at lower left) and its host galaxy (upper center), as they would appear if gravitationally lensed by an intervening black hole (center). The gravitational field of the black hole distorts and magnifies the image and makes both the galaxy and the supernova shine brighter. Gravitationally magnified supernovas would occur rather frequently if black holes were the dominant form of matter in the universe. The lack of such findings can be used to set limits on the mass and abundance of black holes.
For one brief shining moment after the 2015 detection of gravitational waves from colliding black holes, astronomers held out hope that the universe's mysterious dark matter might consist of a plenitude of black holes sprinkled throughout the universe.

University of California, Berkeley, physicists have dashed those hopes.

Based on a statistical analysis of 740 of the brightest supernovas discovered as of 2014, and the fact that none of them appear to be magnified or brightened by hidden black hole "gravitational lenses," the researchers concluded that primordial black holes can make up no more than about 40 percent of the dark matter in the universe. Primordial black holes could only have been created within the first milliseconds of the Big Bang as regions of the universe with a concentrated mass tens or hundreds of times that of the sun collapsed into objects a hundred kilometers across.
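The quoted size is consistent with the Schwarzschild radius, r_s = 2GM/c²: a black hole of a few tens of solar masses has an event-horizon radius of order a hundred kilometers. A quick order-of-magnitude check:

```python
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg

def schwarzschild_radius_km(mass_in_solar_masses):
    """Event-horizon radius r_s = 2GM/c^2, converted to kilometers."""
    return 2 * G * mass_in_solar_masses * M_SUN / C**2 / 1000.0

r_35 = schwarzschild_radius_km(35)     # ~103 km, "tens of solar masses"
r_100 = schwarzschild_radius_km(100)   # ~295 km, "hundreds of solar masses"
```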

The results suggest that none of the universe's dark matter consists of heavy black holes, or any similar object, including massive compact halo objects, so-called MACHOs.

Dark matter is one of astronomy's most embarrassing conundrums: despite comprising 84.5 percent of the matter in the universe, no one can find it. Proposed dark matter candidates span nearly 90 orders of magnitude in mass, from ultralight particles like axions to MACHOs.

Several theorists have proposed scenarios in which there are multiple types of dark matter. But if dark matter consists of several unrelated components, each would require a different explanation for its origin, which makes the models very complex.

"I can imagine it being two types of black holes, very heavy and very light ones, or black holes and new particles. But in that case one of the components is orders of magnitude heavier than the other, and they need to be produced in comparable abundance. We would be going from something astrophysical to something that is truly microscopic, perhaps even the lightest thing in the universe, and that would be very difficult to explain," said lead author Miguel Zumalacárregui, a Marie Curie Global Fellow at the Berkeley Center for Cosmological Physics.

An as-yet unpublished reanalysis by the same team using an updated list of 1,048 supernovas cuts the limit in half, to a maximum of about 23 percent, further slamming the door on the dark matter-black hole proposal.

"We are back to the standard discussions. What is dark matter? Indeed, we are running out of good options," said Uroš Seljak, a UC Berkeley professor of physics and astronomy and BCCP co-director. "This is a challenge for future generations."

The analysis is detailed in a paper published this week in the journal Physical Review Letters.

Dark matter lensing

Their conclusions are based on the fact that an unseen population of primordial black holes, or any massive compact object, would gravitationally bend and magnify light from distant objects on its way to Earth. Therefore, gravitational lensing should affect the light from distant Type Ia supernovas. These are the exploding stars that scientists have used as standard brightness sources to measure cosmic distances and document the expansion of the universe.

Zumalacárregui conducted a complex statistical analysis of data on the brightness and distance of supernovas catalogued in two compilations -- 580 in the Union and 740 in the joint light-curve analysis (JLA) catalogs -- and concluded that eight of them should be brighter by a few tenths of a percent than predicted based on observations of how these supernovas brighten and fade over time. No such brightening has been detected.

Other researchers have performed similar but simpler analyses that yielded inconclusive results. But Zumalacárregui incorporated the precise probability of seeing all magnifications, from small to huge, as well as uncertainties in brightness and distance of each supernova. Even for low-mass black holes -- those 1 percent the mass of the sun -- there should be some highly magnified distant supernovas, he said, but there are none.

"You cannot see this effect on one supernova, but when you put them all together and do a full Bayesian analysis you start putting very strong constraints on the dark matter, because each supernova counts and you have so many of them," Zumalacárregui said. The more supernovas included in the analysis, and the farther away they are, the tighter the constraints. Data on 1,048 bright supernovas from the Pantheon catalog provided an even lower upper limit -- 23 percent -- than the newly published analysis.
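The logic of a zero-detection limit can be caricatured with a simple Poisson model: if the expected number of strongly lensed supernovae scales linearly with the fraction f of dark matter in black holes, then seeing none excludes any f for which the no-event probability falls below 5 percent. The optical depth used here is an arbitrary illustrative value, not the paper's (the real analysis weighs the full magnification distribution per supernova), so only the scaling behavior is meaningful:

```python
import math

def max_pbh_fraction(n_supernovae, tau_at_f1, confidence=0.95):
    """Largest dark-matter fraction f in black holes consistent with zero
    lensed supernovae, if the expected event count is f * n * tau:
    solve exp(-f * n * tau) = 1 - confidence for f (Poisson statistics)."""
    return -math.log(1.0 - confidence) / (n_supernovae * tau_at_f1)

# tau_at_f1 is an arbitrary illustrative per-supernova optical depth.
f_740 = max_pbh_fraction(740, 0.01)
f_1048 = max_pbh_fraction(1048, 0.01)   # more supernovae -> tighter limit
```

This shows why, as the text notes, adding more (and more distant) supernovae tightens the constraint: the excluded fraction falls roughly as 1/n.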

Seljak published a paper proposing this type of analysis in the late 1990s, but when interest shifted from looking for big objects, MACHOs, to looking for fundamental particles, in particular weakly interacting massive particles, or WIMPs, follow-up plans fell by the wayside. By then, many experiments had excluded most masses and types of MACHOs, leaving little hope of discovering such objects.

At the time, too, only a small number of distant Type Ia supernovas had been discovered and their distances measured.

Only after the LIGO observations brought up the issue again did Seljak and Zumalacárregui embark on the complicated analysis to determine the limits on dark matter.

Read more at Science Daily

Nobel Prize in Chemistry 2018

Abstract molecular structure
The Royal Swedish Academy of Sciences has decided to award the Nobel Prize in Chemistry 2018 with one half to Frances H. Arnold, California Institute of Technology, Pasadena, USA "for the directed evolution of enzymes" and the other half jointly to George P. Smith, University of Missouri, Columbia, USA and Sir Gregory P. Winter, MRC Laboratory of Molecular Biology, Cambridge, UK "for the phage display of peptides and antibodies."

They harnessed the power of evolution

The power of evolution is revealed through the diversity of life. The 2018 Nobel Laureates in Chemistry have taken control of evolution and used it for purposes that bring the greatest benefit to humankind. Enzymes produced through directed evolution are used to manufacture everything from biofuels to pharmaceuticals. Antibodies evolved using a method called phage display can combat autoimmune diseases and in some cases cure metastatic cancer.

Since the first seeds of life arose around 3.7 billion years ago, almost every crevice on Earth has filled with different organisms. Life has spread to hot springs, deep oceans and dry deserts, all because evolution has solved a number of chemical problems. Life's chemical tools -- proteins -- have been optimised, changed and renewed, creating incredible diversity.

This year's Nobel Laureates in Chemistry have been inspired by the power of evolution and used the same principles -- genetic change and selection -- to develop proteins that solve humankind's chemical problems.

One half of this year's Nobel Prize in Chemistry is awarded to Frances H. Arnold. In 1993, she conducted the first directed evolution of enzymes, which are proteins that catalyse chemical reactions. Since then, she has refined the methods that are now routinely used to develop new catalysts. The uses of Frances Arnold's enzymes include more environmentally friendly manufacturing of chemical substances, such as pharmaceuticals, and the production of renewable fuels for a greener transport sector.
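Directed evolution is, at heart, a loop of mutagenesis and selection: generate a library of random variants, screen for the best performer, and repeat. The toy below evolves a character string toward a target in exactly that way; it is a cartoon of the principle, not of actual protein engineering, and all names and parameters are illustrative:

```python
import random

random.seed(1)
TARGET = "CATALYST"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def fitness(seq):
    # Toy "enzyme activity": number of positions matching the target.
    return sum(a == b for a, b in zip(seq, TARGET))

def mutate(seq, rate=0.2):
    # Random mutagenesis: each position has a chance of being replaced.
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in seq)

def directed_evolution(generations=200, library_size=50):
    parent = "".join(random.choice(ALPHABET) for _ in TARGET)
    for _ in range(generations):
        # Build a mutant library, screen it, and keep the fittest variant
        # (the parent is retained so fitness never decreases).
        library = [mutate(parent) for _ in range(library_size)] + [parent]
        parent = max(library, key=fitness)
    return parent
```

Each round corresponds to one cycle of mutagenesis and screening in the laboratory; after enough rounds the population converges on a high-activity variant no single round could have produced.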

The other half of this year's Nobel Prize in Chemistry is shared by George P. Smith and Sir Gregory P. Winter. In 1985, George Smith developed an elegant method known as phage display, where a bacteriophage -- a virus that infects bacteria -- can be used to evolve new proteins. Gregory Winter used phage display for the directed evolution of antibodies, with the aim of producing new pharmaceuticals. The first one based on this method, adalimumab, was approved in 2002 and is used for rheumatoid arthritis, psoriasis and inflammatory bowel diseases. Since then, phage display has produced antibodies that can neutralise toxins, counteract autoimmune diseases and cure metastatic cancer.

Read more at Science Daily

Stem cells organize themselves into pseudo-embryos

Seven-day old gastruloid. The cell nuclei are marked in blue. Neural progenitor cells (green) are distributed along the antero-posterior axis. Progenitor cells of the tail bud (pink) are confined to the posterior extremity of the gastruloid and indicate the direction of its elongation.
The definitive architecture of the mammalian body is established shortly after implantation of the embryo in the uterus. The antero-posterior, dorso-ventral and medio-lateral axes of the body become organized under the control of gene networks that coordinate the transcription of DNA in various regions of the embryo. Researchers from the University of Geneva (UNIGE), the University of Cambridge, UK, and the Federal Institute of Technology Lausanne (EPFL) now report the ability of mouse stem cells to produce pseudo-embryos that display similar capacities. Established from only about 300 embryonic stem cells, these structures, called gastruloids, show developmental features comparable to those of the posterior part of embryos aged from 6 to 10 days. The study, published in the journal Nature, shows that the three main embryonic axes are formed according to a gene expression program similar to that of embryos. Gastruloids thus possess a remarkable potential for the study of the early stages of normal or pathological embryonic development in mammals.

Studying the processes orchestrating the formation of early mammalian embryos is hampered by the difficulty in obtaining them. The team of Alfonso Martinez Arias, Professor at the Department of Genetics of the University of Cambridge, UK, recently discovered that, under certain conditions, murine embryonic stem cells can assemble into three-dimensional aggregates that keep elongating in culture. These entities called "gastruloids" display different characteristics of early stages of embryonic development.

Interdependent developmental processes

"To determine whether gastruloids organize themselves into bona fide embryonic structures, we characterized their level of genetic activation at different stages of development," explains Denis Duboule, Professor at the Department of Genetics and Evolution of the UNIGE Faculty of Sciences and at the Swiss Institute for Experimental Cancer Research of EPFL. The researchers identified and quantified the RNA transcribed from gastruloids and compared the expressed genes with those of mouse embryos at comparable stages of development.

"Gastruloids form structures similar to the posterior part of the embryo, whose development program is somewhat different from that of the head," says Leonardo Beccari, scientist in the Geneva group and co-first author of the study. These pseudo-embryos express genes characteristic of the various types of progenitor cells necessary for the constitution of future tissues. "The complexity of gene expression profiles increases over time, with the appearance of markers from different embryonic cell lineages, much like the profiles observed in control embryos," notes Naomi Moris, scientist from the Cambridge team and co-first author of the article.

Architect genes are activated as in the embryo

During the formation of the gastruloids' antero-posterior, dorso-ventral and medio-lateral axes, the teams observed a remarkable property: the 'architect' Hox genes, which are normally activated in a precise sequential order in the embryo, were activated in the same order in gastruloids grown in vitro. "The implementation of the Hox gene network over time, which mimics that of the tail of the embryo, confirms the remarkably high level of self-organization of gastruloids," explains Mehmet Girgin, geneticist at EPFL and co-first author of the study.

Read more at Science Daily

Oct 2, 2018

Nobel Prize in Physics 2018

Laser beam light effect
The inventions being honored this year have revolutionised laser physics. Extremely small objects and incredibly rapid processes are now being seen in a new light. Advanced precision instruments are opening up unexplored areas of research and a multitude of industrial and medical applications.

The Royal Swedish Academy of Sciences has decided to award the Nobel Prize in Physics 2018 "for groundbreaking inventions in the field of laser physics" with one half to Arthur Ashkin, Bell Laboratories, Holmdel, USA "for the optical tweezers and their application to biological systems" and the other half jointly to Gérard Mourou, École Polytechnique, Palaiseau, France and University of Michigan, Ann Arbor, USA and Donna Strickland, University of Waterloo, Canada "for their method of generating high-intensity, ultra-short optical pulses."

Arthur Ashkin invented optical tweezers that grab particles, atoms, viruses and other living cells with their laser beam fingers. This new tool allowed Ashkin to realise an old dream of science fiction -- using the radiation pressure of light to move physical objects. He succeeded in getting laser light to push small particles towards the centre of the beam and to hold them there. Optical tweezers had been invented.

A major breakthrough came in 1987, when Ashkin used the tweezers to capture living bacteria without harming them. He immediately began studying biological systems and optical tweezers are now widely used to investigate the machinery of life.

Gérard Mourou and Donna Strickland paved the way towards the shortest and most intense laser pulses ever created by humankind. Their revolutionary article was published in 1985 and was the foundation of Strickland's doctoral thesis.

Using an ingenious approach, they succeeded in creating ultrashort high-intensity laser pulses without destroying the amplifying material. First they stretched the laser pulses in time to reduce their peak power, then amplified them, and finally compressed them. If a pulse is compressed in time and becomes shorter, then more light is packed together in the same tiny space -- the intensity of the pulse increases dramatically.

Strickland and Mourou's newly invented technique, called chirped pulse amplification, CPA, soon became standard for subsequent high-intensity lasers. Its uses include the millions of corrective eye surgeries that are conducted every year using the sharpest of laser beams.

Read more at Science Daily

New tool helps scientists better target the search for alien life

Schematic view of the Milky Way showing six isotropic extraterrestrial emission processes forming spherical shells filled by radio signals. The outer radii of the spherical shells are proportional to the time at which the signals were first emitted, while the thicknesses are proportional to the duration of the emissions. In this example, the Earth is illuminated by one of these signals.
Could there be another planet out there with a society at the same stage of technological advancement as ours? To help find out, EPFL scientist Claudio Grimaldi, working in association with the University of California, Berkeley, has developed a statistical model that gives researchers a new tool in the search for the kind of signals that an extraterrestrial society might emit. His method -- described in an article appearing today in Proceedings of the National Academy of Sciences -- could also make the search cheaper and more efficient.

Astrophysics initially wasn't Grimaldi's thing; he was interested more in the physics of condensed matter. Working at EPFL's Laboratory of Physics of Complex Matter, his research involved calculating the probabilities of carbon nanotubes exchanging electrons. But then he wondered: if the nanotubes were stars and the electrons were signals generated by extraterrestrial societies, could we calculate the probability of detecting those signals more accurately?

This is not pie-in-the-sky research -- scientists have been studying this possibility for nearly 60 years. Several research projects concerning the search for extraterrestrial intelligence (SETI) have been launched since the late 1950s, mainly in the United States. The idea is that an advanced civilization on another planet could be generating electromagnetic signals, and scientists on Earth might be able to pick up those signals using the latest high-performance radio telescopes.

Renewed interest

Despite considerable advances in radio astronomy and the increase in computing power since then, none of those projects has led to anything concrete. Some signals of unidentified origin have indeed been recorded, like the Wow! signal in 1977, but none of them has been repeated or seems credible enough to be attributable to alien life.

But that doesn't mean scientists have given up. On the contrary, SETI has seen renewed interest following the discovery of the many exoplanets orbiting the billions of suns in our galaxy. Researchers have designed sophisticated new instruments -- like the Square Kilometre Array, a giant radio telescope being built in South Africa and Australia with a total collecting area of one square kilometer -- that could pave the way to promising breakthroughs. And Russian entrepreneur Yuri Milner recently announced an ambitious program called Breakthrough Listen, which aims to cover 10 times more sky than previous searches and scan a much wider band of frequencies. Milner intends to fund his initiative with 100 million dollars over 10 years.

"In reality, expanding the search to these magnitudes only increases our chances of finding something by very little. And if we still don't detect any signals, we can't necessarily conclude with much more certainty that there is no life out there," says Grimaldi.

Still a ways to go

The advantage of Grimaldi's statistical model is that it lets scientists interpret both the success and failure to detect signals at varying distances from Earth. His model employs Bayes' theorem to calculate the remaining probability of detecting a signal within a given radius around our planet.

For example, even if no signal is detected within a radius of 1,000 light years, there is still an over 10% chance that Earth is within range of hundreds of similar signals from elsewhere in the galaxy, but that our radio telescopes are currently not powerful enough to detect them. However, that probability rises to nearly 100% if even just one signal is detected within the 1,000-light-year radius. In that case, we could be almost certain that our galaxy is full of alien life.
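The shape of such an update can be illustrated with a toy Bayesian calculation. This is a sketch in the spirit of the approach, not Grimaldi's actual model: the Galactic radius, the candidate signal counts, and the uniform prior are all assumptions made here for illustration.

```python
import math

GALAXY_RADIUS_LY = 50_000  # assumed radius of the Galactic disc, in light years

def posterior_after_null_search(search_radius_ly, candidate_counts):
    """P(mean detectable-signal count | no detection within search_radius_ly).

    Assumes signals are spread uniformly across the disc, so the searched
    fraction is (r / R)**2, and uses a Poisson zero-count likelihood with
    a uniform prior over the candidate counts (which then cancels out).
    """
    frac = (search_radius_ly / GALAXY_RADIUS_LY) ** 2
    likelihoods = [math.exp(-n * frac) for n in candidate_counts]
    total = sum(likelihoods)
    return [lk / total for lk in likelihoods]

counts = [0, 1, 10, 100, 1000]
post = posterior_after_null_search(1_000, counts)
# A null search out to 1,000 ly barely moves the posterior: hypotheses
# with hundreds of signals elsewhere in the Galaxy remain quite plausible.
```

The point the toy makes is the same as Grimaldi's: a null result over a small searched volume carries very little evidence, so large signal populations stay nearly as probable as before.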

After factoring in other parameters like the size of the galaxy and how closely packed its stars are, Grimaldi estimates that the probability of detecting a signal becomes very slight only at a radius of 40,000 light years. In other words, if no signals are detected at this distance from Earth, we could reasonably conclude that no other civilization at the same level of technological development as ours is detectable in the galaxy. But so far, scientists have been able to search for signals within a radius of "just" 40 light years.

Read more at Science Daily

New extremely distant solar system object found during hunt for Planet X

An artist's conception of a distant Solar System Planet X, which could be shaping the orbits of smaller extremely distant outer Solar System objects like 2015 TG387 discovered by a team of Carnegie's Scott Sheppard, Northern Arizona University's Chad Trujillo, and the University of Hawaii's David Tholen.
Carnegie's Scott Sheppard and his colleagues -- Northern Arizona University's Chad Trujillo, and the University of Hawaii's David Tholen -- are once again redefining our Solar System's edge. They discovered a new extremely distant object far beyond Pluto with an orbit that supports the presence of an even-farther-out Super-Earth or larger Planet X.

The newly found object, called 2015 TG387, will be announced Tuesday by the International Astronomical Union's Minor Planet Center. A paper with the full details of the discovery has also been submitted to the Astronomical Journal.

2015 TG387 was discovered about 80 astronomical units (AU) from the Sun, where one AU is defined as the distance between Earth and the Sun. For context, Pluto is currently around 34 AU, so 2015 TG387 is about two and a half times farther from the Sun than Pluto is right now.

The new object is on a very elongated orbit and never comes closer to the Sun, a point called perihelion, than about 65 AU. Only 2012 VP113 and Sedna at 80 and 76 AU respectively have more-distant perihelia than 2015 TG387. Though 2015 TG387 has the third-most-distant perihelion, its orbital semi-major axis is larger than 2012 VP113 and Sedna's, meaning it travels much farther from the Sun than they do. At its furthest point, it reaches all the way out to about 2,300 AU. 2015 TG387 is one of the few known objects that never comes close enough to the Solar System's giant planets, like Neptune and Jupiter, to have significant gravitational interactions with them.

"These so-called Inner Oort Cloud objects like 2015 TG387, 2012 VP113, and Sedna are isolated from most of the Solar System's known mass, which makes them immensely interesting," Sheppard explained. "They can be used as probes to understand what is happening at the edge of our Solar System."

The object with the most-distant orbit at perihelion, 2012 VP113, was also discovered by Sheppard and Trujillo, who announced that find in 2014. The discovery of 2012 VP113 led Sheppard and Trujillo to notice similarities of the orbits of several extremely distant Solar System objects, and they proposed the presence of an unknown planet several times larger than Earth -- sometimes called Planet X or Planet 9 -- orbiting the Sun well beyond Pluto at hundreds of AUs.

"We think there could be thousands of small bodies like 2015 TG387 out on the Solar System's fringes, but their distance makes finding them very difficult," Tholen said. "Currently we would only detect 2015 TG387 when it is near its closest approach to the Sun. For some 99 percent of its 40,000-year orbit, it would be too faint to see."
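The roughly 40,000-year orbit quoted above is consistent with Kepler's third law applied to the distances reported earlier in the article. In its simple form for orbits around the Sun, the period in years equals the semi-major axis in AU raised to the 3/2 power:

```python
# Kepler's third law for solar orbits: P[years] = a[AU] ** 1.5
perihelion_au = 65      # closest approach to the Sun, from the article
aphelion_au = 2_300     # "about 2,300 AU" at its furthest point

# The semi-major axis is the average of the orbital extremes
semi_major_au = (perihelion_au + aphelion_au) / 2   # ~1,182 AU
period_years = semi_major_au ** 1.5                 # ~40,700 years
```

Plugging in the article's numbers gives a period of about 40,700 years, matching the quoted figure.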

The object was discovered as part of the team's ongoing hunt for unknown dwarf planets and Planet X. It is the largest and deepest survey ever conducted for distant Solar System objects.

"These distant objects are like breadcrumbs leading us to Planet X. The more of them we can find, the better we can understand the outer Solar System and the possible planet that we think is shaping their orbits -- a discovery that would redefine our knowledge of the Solar System's evolution," Sheppard added.

It took the team a few years of observations to obtain a good orbit for 2015 TG387 because it moves so slowly and has such a long orbital period. They first observed 2015 TG387 in October of 2015 at the Japanese Subaru 8-meter telescope located atop Mauna Kea in Hawaii. Follow-up observations at the Magellan telescope at Carnegie's Las Campanas Observatory in Chile and the Discovery Channel Telescope in Arizona were obtained in 2015, 2016, 2017 and 2018 to determine 2015 TG387's orbit.

2015 TG387 is likely on the small end of being a dwarf planet since it has a diameter near 300 kilometers. The location in the sky where 2015 TG387 reaches perihelion is similar to 2012 VP113, Sedna, and most other known extremely distant trans-Neptunian objects, suggesting that something is pushing them into similar types of orbits.

Trujillo and University of Oklahoma's Nathan Kaib ran computer simulations for how different hypothetical Planet X orbits would affect the orbit of 2015 TG387. The simulations included a Super-Earth-mass planet at several hundred AU on an elongated orbit as proposed by Caltech's Konstantin Batygin and Michael Brown in 2016. Most of the simulations showed that not only was 2015 TG387's orbit stable for the age of the Solar System, but it was actually shepherded by Planet X's gravity, which keeps the smaller 2015 TG387 away from the massive planet. This gravitational shepherding could explain why the most-distant objects in our Solar System have similar orbits. These orbits keep them from ever approaching the proposed planet too closely, which is similar to how Pluto never gets too close to Neptune even though their orbits cross.

"What makes this result really interesting is that Planet X seems to affect 2015 TG387 the same way as all the other extremely distant Solar System objects. These simulations do not prove that there's another massive planet in our Solar System, but they are further evidence that something big could be out there," Trujillo concludes.

Read more at Science Daily

Gaia spots stars flying between galaxies

The positions and reconstructed orbits of 20 high-velocity stars, represented on top of an artistic view of our Galaxy, the Milky Way. These stars were identified using data from the second release of ESA's Gaia mission. The seven stars shown in red are sprinting away from the Galaxy and could be travelling fast enough to eventually escape its gravity. Surprisingly, the study also revealed thirteen stars, shown in orange, that are racing towards the Milky Way: these could be stars from another galaxy, zooming right through our own.
A team of astronomers using the latest set of data from ESA's Gaia mission to look for high-velocity stars being kicked out of the Milky Way were surprised to find stars instead sprinting inwards -- perhaps from another galaxy. The study is published in the journal Monthly Notices of the Royal Astronomical Society.

Stars circle around our galaxy at hundreds of kilometres per second, and their motions contain a wealth of information about the past history of the Galaxy. The fastest class of these stars are called hypervelocity stars, which are thought to start their life near the Galactic centre, later to be flung towards the edge of the Milky Way via interactions with the black hole at its centre.

Only a small number of hypervelocity stars have ever been discovered, and Gaia's recently published second data release provides a unique opportunity to look for more of them.

"Of the seven million Gaia stars with full 3D velocity measurements, we found twenty that could be travelling fast enough to eventually escape from the Milky Way," explains Elena Maria Rossi, one of the authors of the new study, based at Leiden University, in the Netherlands.

However, the team were in for a surprise: "Rather than flying away from the Galactic centre, most of the high-velocity stars we spotted seem to be racing towards it," adds co-author Tommaso Marchetti. "These could be stars from another galaxy, zooming right through the Milky Way."

It is possible that these intergalactic interlopers come from the Large Magellanic Cloud, a relatively small galaxy orbiting the Milky Way, or they may originate from a galaxy even further afield. If that is the case, they carry the imprint of their site of origin, and studying them at much closer distances than their parent galaxy could provide unprecedented information on the nature of stars in another galaxy -- similar in a way to studying Martian material brought to our planet by meteorites.

"Stars can be accelerated to high velocities when they interact with a supermassive black hole," Elena explains. "So the presence of these stars might be a sign of such black holes in nearby galaxies. But the stars may also have once been part of a binary system, flung towards the Milky Way when their companion star exploded as a supernova. Either way, studying them could tell us more about these kinds of processes in nearby galaxies."

An alternative explanation is that the newly identified sprinting stars could be native to our Galaxy's halo, accelerated and pushed inwards through interactions with one of the dwarf galaxies that fell towards the Milky Way during its build-up history. Additional information about the age and composition of the stars could help the astronomers clarify their origin.

New data will help to nail down the nature and origin of these stars with more certainty, and the team will use ground-based telescopes to find out more about them. At least two more Gaia data releases are planned in the 2020s, and each will provide both more precise and new information on a larger set of stars.

"We eventually expect full 3D velocity measurements for up to 150 million stars," explains co-author Anthony Brown, chair of the Gaia Data Processing and Analysis Consortium Executive. "This will help find hundreds or thousands of hypervelocity stars, understand their origin in much more detail, and use them to investigate the Galactic centre environment as well as the history of our Galaxy," he adds.

Read more at Science Daily

Oct 1, 2018

No 'reservoir': Detectable HIV-1 in treated human liver cells found to be inert

In a proof-of-principle study, researchers at Johns Hopkins report that certain liver immune cells called macrophages contain only defective or inert HIV-1 copies and aren't likely to restart infection on their own in HIV-1-infected people on long-term antiretroviral therapy (ART).

The study, the investigators say, strongly suggests that defective or inert HIV-1 can remain in liver macrophages for up to ten years without functioning as an HIV-1 "reservoir" that can replicate the virus at high levels.

But the finding, they say, also suggests that HIV-1 treatment strategies that only or mostly focus on these macrophages in a search for a cure might need to shift more heavily to other cell types more likely to serve as active reservoirs of the virus.

A report of the findings was published online on Sept. 10 and in the Oct. issue of the Journal of Clinical Investigation.

"Our study was the first, to our knowledge, to use purified liver tissue macrophages from HIV-1-infected people taking ART to directly test the idea that tissue macrophages are a long-lived active HIV-1 reservoir," says Ashwin Balagopal, M.D., an associate professor in the Division of Infectious Diseases at the Johns Hopkins University School of Medicine.

According to the U.S. Centers for Disease Control and Prevention (CDC), 36.7 million people worldwide and 1.1 million people in the U.S. are infected with HIV-1. Commonly, ART is successfully used to suppress the replication of HIV-1 and stop or control the progression of acquired immunodeficiency syndrome (AIDS). The virus infects the body's immune system cells, commonly forming a persistent reservoir in so-called resting memory CD4+ T white blood cells. Macrophages normally work with T cells to envelop and clear tissues of microbes and debris.

The inability to completely wipe out pools of infectious HIV-1 has for decades frustrated efforts to completely cure the infection. And it means that the interruption or discontinuation of ART at any time re-activates HIV-1 replication, spreading the virus to new cells.

As a result, Balagopal says, investigators are increasingly focused more specifically on the location and biology of these HIV-1 reservoirs.

To determine if liver macrophages were a true source of infection-capable HIV-1 reservoirs after ART, liver tissue samples were taken from nine HIV-1 infected persons, seven of whom underwent liver transplantation at the Johns Hopkins Hospital and would have otherwise had their livers discarded. Eight of the nine persons were on ART for periods ranging from eight to 140 months.

The sample group included only adults whose demographic information is considered exempt from human subject research because all samples were obtained strictly for scientific reasons or postmortem, and would otherwise have been discarded. The Johns Hopkins School of Medicine institutional review board approved this study protocol.

Using lab techniques that both measure HIV-1 containing T cells and separate out the liver macrophages, the researchers found HIV-1 to be present in the macrophages even after exposure to longstanding virus-suppressing ART.

However, Balagopal said that when his team tried to simulate virus "rebound" from the liver macrophages in the laboratory, they found only "fragments of HIV-1 in small quantities, without robust growth of full-length, infectious virus."

The researchers found that HIV-1 was in the liver macrophages of one subject who took ART for 11.7 years. They concluded that while liver macrophages might harbor HIV-1 for a long time, it's unlikely these viruses could restart an infection on their own or function as a reservoir, because they were not able to replicate.

Balagopal cautioned that their study results still affirm the need to address liver macrophage infection, because even if inert, these cells may be able to produce portions of viral proteins that can misdirect the immune system.

"These results contribute an important piece in our efforts to understand the role of non-T cells, such as macrophages, as HIV-1 cellular reservoirs in individuals on long-term ART, but also may help the research community focus on additional cure strategies," says Abraham Kandathil, Ph.D., a research associate in the Division of Infectious Diseases at the Johns Hopkins University School of Medicine, who performed all of the key experiments.

In the future, Kandathil says, more research is needed to determine if the inert HIV-1 infected liver macrophages have any functional significance in people taking ART because expression of defective HIV-1 proteins can confuse the immune system and cause tissue inflammation.

Read more at Science Daily