Feb 3, 2018

Meet the 'odderon': Large Hadron Collider experiment shows potential evidence of quasiparticle sought for decades

View of the tunnel where the proton detectors are located.
In a 17-mile circular tunnel underneath the border between France and Switzerland, an international collaboration of scientists runs experiments using the world's most advanced scientific instrument, the Large Hadron Collider (LHC). Particle physicists smash together protons traveling close to light speed and analyze the collisions to learn more about the fundamental makeup of all matter in the universe. In recent years, for instance, these experiments produced the data behind the Nobel Prize-winning discovery of the Higgs boson.

Now, a team of high-energy experimental particle physicists, including several from the University of Kansas, has uncovered possible evidence of a subatomic quasiparticle dubbed an "odderon" that -- until now -- had only been theorized to exist. Their results are currently posted on the arXiv and CERN preprint servers in two papers that have been submitted to peer-reviewed journals.

"We've been looking for this since the 1970s," said Christophe Royon, Foundation Distinguished Professor in the KU Department of Physics & Astronomy.

The new findings concern hadrons (the family of particles that includes protons and neutrons), which are composed of quarks "glued" together with gluons. These particular experiments involve "collisions" in which the protons remain intact afterward. In all previous experiments, scientists detected collisions involving only even numbers of gluons exchanged between different protons.

"The protons interact like two big semi-trucks that are transporting cars, the kind you see on the highway," said Timothy Raben, a particle theorist at KU who has worked on the odderon. "If those trucks crashed together, after the crash you'd still have the trucks, but the cars would now be outside, no longer aboard the trucks -- and also new cars are produced (energy is transformed into matter)."

In the new paper, researchers using more energy and observing collisions with more precision report potential evidence of an odd number of gluons, without any quarks, exchanged in the collisions.

"Until now, most models were thinking there was a pair of gluons -- always an even number," said Royon. "Now we measure for the first time the higher number of events and properties and at a new energy. We found measurements that are incompatible with this traditional model of assuming an even number of gluons. It's a kind of discovery that we might have seen for the first time, this odd exchange of the number of gluons. There may be three, five, seven or more gluons."

The KU researchers explained that the odderon can be seen as the total contribution coming from all types of odd gluon exchange -- that is, the involvement of three, five, seven or any other odd number of gluons. By contrast, the older model assumes a contribution from all even numbers of gluons, combining exchanges of two, four, six or more gluons.

At the LHC, the work was carried out by a team of more than 100 physicists from eight countries using the TOTEM experiment, near one of the four points in the supercollider where proton beams are directed into each other, causing billions of proton pairs to collide every second.

KU researchers said the findings give fresh detail to the Standard Model of particle physics, a widely accepted physics theory that explains how the basic building blocks of matter interact.

"This doesn't break the Standard Model, but there are very opaque regions of the Standard Model, and this work shines a light on one of those opaque regions," said Raben.

Physicists have imagined the existence of the odderon for many decades, but until the LHC began operating at its highest energies in 2015, the odderon remained mere conjecture. The data presented in the new paper was collected at 13 teraelectronvolts (TeV), the highest energy at which scientists have ever collided protons.

"These ideas date back to the '70s, but even at that time it quickly became evident we weren't close technologically to being able to see the odderon, so while there are several decades of predictions, the odderon has not been seen," Raben said.

According to the KU researchers, the TOTEM experiment was designed to detect the protons that are not destroyed by the collision but are only slightly deflected. The TOTEM particle detectors are therefore placed just a few millimeters from the outgoing beams of protons that did not interact. By comparing current results with measurements made at lower energies using less powerful particle accelerators, TOTEM has been able to make the most precise measurement ever.

The co-authors compared the ratio of signatures from collisions at various energies to establish the "rho parameter," one measure that helped build evidence for the possible presence of odderons.

"If you go to really high energies, there are signatures of the behavior of beams collided at a high energy that can be measured," said Raben. "But there are different types of high-energy growth signatures. Up until now, we've only had to think about one type of high-energy growth behavior. Essentially these quantities might change as a function of the amount of energy. The rho parameter is essentially measuring the ratio of one signature to another of this high energy growth."

This measurement of the rho parameter owes to the shared work, collaboration and key contributions of several postdocs and senior physicists, both on the detectors' hardware and, in particular, on the physics analysis.

Aside from Royon, KU personnel involved in the new TOTEM findings include postdoctoral researcher Nicola Minafra, who earned a CMS achievement award this year, and graduate students Cristian Baldenegro Barrera, Justin Williams, Tommaso Isidori and Cole Lindsey. Other KU researchers participating in the work are Laurent Forthomme, a postdoctoral researcher also based at CERN and working on the CMS/TOTEM experiments, and graduate student Federico Deganutti, who works with Raben on theory.

"Our students come from many different nations," said Royon. "KU is a working at the frontier of new things, and we expect big results in the coming months or years. Other research efforts include looking for an extra dimension in the universe, but for now we're just looking at the data."

Read more at Science Daily

Natural telescope sets new magnification record

The yellow dotted line traces the boundaries of the galaxy's gravitationally lensed image. The inset on the upper left shows what eMACSJ1341-QG-1 would look like if we observed it directly, without the cluster lens. The dramatic amplification and distortion caused by the intervening, massive galaxy cluster (of which only a few galaxies are seen in this zoomed-in view) is apparent.
Extremely distant galaxies are usually too faint to be seen, even by the largest telescopes. But nature has a solution: gravitational lensing, predicted by Albert Einstein and observed many times by astronomers. Now, an international team of astronomers, led by Harald Ebeling of the Institute for Astronomy at the University of Hawaii at Manoa, has discovered one of the most extreme instances of magnification by gravitational lensing.

Using the Hubble Space Telescope to survey a sample of huge clusters of galaxies, the team found a distant galaxy, eMACSJ1341-QG-1, that is magnified 30 times thanks to the distortion of space-time created by the massive galaxy cluster dubbed eMACSJ1341.9-2441.

The underlying physical effect of gravitational lensing was first confirmed during the solar eclipse of 1919, and can dramatically magnify images of distant celestial sources if a sufficiently massive object lies between the background source and observers.

Galaxy clusters, enormous concentrations of dark matter and hot gas surrounding hundreds or thousands of individual galaxies, all bound by the force of gravity, are valued by astronomers as powerful "gravitational lenses." By magnifying the galaxies situated behind them, massive clusters act as natural telescopes that allow scientists to study faint and distant sources that would otherwise be beyond the reach of even the most powerful human-made telescopes.
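
As a back-of-the-envelope illustration (a standard conversion, not a number taken from the study), a lensing magnification of 30 corresponds to a brightness gain of

    \Delta m = 2.5 \log_{10} \mu = 2.5 \log_{10} 30 \approx 3.7 \ \text{magnitudes}

meaning eMACSJ1341-QG-1 appears roughly 3.7 magnitudes brighter than it would without the intervening cluster -- the kind of boost that brings otherwise undetectably faint sources within reach of existing telescopes.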

"The very high magnification of this image provides us with a rare opportunity to investigate the stellar populations of this distant object and, ultimately, to reconstruct its undistorted shape and properties," said team member Johan Richard of the University of Lyon, who performed the lensing calculations.

Although similarly extreme magnifications have been observed before, the discovery sets a record for the magnification of a rare "quiescent" background galaxy -- one that, unlike our Milky Way, does not form new stars in giant clouds of cool gas.

Explained UH team leader Ebeling, "We specialize in finding extremely massive clusters that act as natural telescopes and have already discovered many exciting cases of gravitational lensing. This discovery stands out, though, as the huge magnification provided by eMACSJ1341 allows us to study in detail a very rare type of galaxy."

From Science Daily

Feb 2, 2018

Toward end of Ice Age, human beings witnessed fires larger than dinosaur killers

New research shows that some 12,800 years ago, an astonishing 10 percent of the Earth's land surface, or about 10 million square kilometers, was consumed by fires.
On a ho-hum day some 12,800 years ago, the Earth had emerged from another ice age. Things were warming up, and the glaciers had retreated.

Out of nowhere, the sky was lit with fireballs. This was followed by shock waves.

Fires rushed across the landscape, and dust clogged the sky, cutting off the sunlight. As the climate rapidly cooled, plants died, food sources were snuffed out, and the glaciers advanced again. Ocean currents shifted, setting the climate into a colder, almost "ice age" state that lasted an additional thousand years.

Finally, the climate began to warm again, and people again emerged into a world with fewer large animals and a human culture in North America that left behind completely different kinds of spear points.

This is the story supported by a massive study of geochemical and isotopic markers just published in the Journal of Geology.

The results are so massive that the study had to be split into two papers.

"Extraordinary Biomass-Burning Episode and Impact Winter Triggered by the Younger Dryas Cosmic Impact ~12,800 Years Ago" is divided into "Part I: Ice Cores and Glaciers" and "Part 2: Lake, Marine, and Terrestrial Sediments."

The paper's 24 authors include KU Emeritus Professor of Physics & Astronomy Adrian Melott and Professor Brian Thomas, a 2005 doctoral graduate from KU, now at Washburn University.

"The work includes measurements made at more than 170 different sites across the world," Melott said.

The KU researcher and his colleagues believe the data suggests the disaster was touched off when Earth collided with fragments of a disintegrating comet that was roughly 62 miles in diameter -- the remnants of which persist within our solar system to this day.

"The hypothesis is that a large comet fragmented and the chunks impacted the Earth, causing this disaster," said Melott. "A number of different chemical signatures -- carbon dioxide, nitrate, ammonia and others -- all seem to indicate that an astonishing 10 percent of the Earth's land surface, or about 10 million square kilometers, was consumed by fires."

According to Melott, analysis of pollen suggests pine forests were probably burned off to be replaced by poplar, which is a species that colonizes cleared areas.

Indeed, the authors posit the cosmic impact could have touched off the Younger Dryas cool episode, biomass burning, late Pleistocene extinctions of larger species and "human cultural shifts and population declines."

Read more at Science Daily

Cheetahs' inner ear is one-of-a-kind, vital to high-speed hunting

This illustration shows the location of the inner ear in a modern cheetah skull.
The world's fastest land animal, the cheetah, is a successful hunter not only because it is quick, but also because it can hold an incredibly still gaze while pursuing prey. For the first time, researchers have investigated the cheetah's extraordinary sensory abilities by analyzing the speedy animal's inner ear, an organ that is essential for maintaining body balance and adapting head posture during movement in most vertebrates. The study, published today in the journal Scientific Reports and led by researchers at the American Museum of Natural History, finds that the inner ear of modern cheetahs is unique and likely evolved relatively recently.

"If you watch a cheetah run in slow motion, you'll see incredible feats of movement: its legs, its back, its muscles all move with such coordinated power. But its head hardly moves at all," said lead author Camille Grohé, who conducted this work during a National Science Foundation and Frick Postdoctoral Fellowship in the Museum's Division of Paleontology. "The inner ear facilitates the cheetah's remarkable ability to maintain visual and postural stability while running and capturing prey at speeds of up to 65 miles per hour. Until now, no one has investigated the inner ear's role in this incredible hunting specialization."

In the inner ear of vertebrates, the balance system consists of three semicircular canals that contain fluid and sensory hair cells that detect movement of the head. Each of the semicircular canals is positioned at a different angle and is especially sensitive to different movements: up and down, side-to-side, and tilting from one side to the other.

The researchers used high-resolution X-ray computed tomography (CT) at the Museum's Microscopy and Imaging Facility, the National Museum of Natural History in Paris, and the Biomaterials Science Center of the University of Basel in Switzerland to scan the skulls of 21 felid specimens, including seven modern cheetahs (Acinonyx jubatus) from distinct populations, a closely related extinct cheetah (Acinonyx pardinensis) that lived in the Pleistocene between about 2.6 million and 126,000 years ago, and more than a dozen other living felid species. With those data, they created detailed 3-D virtual images of each species' inner ear shape and dimensions.

They found that the inner ears of living cheetahs differ markedly from those of all other felids alive today, with a greater overall volume of the vestibular system and longer anterior and posterior semicircular canals.

"This distinctive inner ear anatomy reflects enhanced sensitivity and more rapid responses to head motions, explaining the cheetah's extraordinary ability to maintain visual stability and to keep their gaze locked in on prey even during incredibly high-speed hunting," said coauthor John Flynn, the Frick Curator of Fossil Mammals in the Museum's Division of Paleontology.

These traits were not present in Acinonyx pardinensis, the extinct species examined by the researchers, emphasizing the recent evolution of the highly specialized inner ear of modern cheetahs.

Read more at Science Daily

Astrophysicists discover planets beyond the Milky Way using microlensing

Image of the gravitational lens RX J1131-1231, with the lens galaxy at the center and four lensed images of a background quasar. It is estimated that there are trillions of planets in the central elliptical galaxy in this image.
A University of Oklahoma astrophysics team has discovered for the first time a population of planets beyond the Milky Way galaxy. Using microlensing -- an astronomical phenomenon and the only known method capable of discovering planets at truly great distances from Earth -- OU researchers were able to detect objects in a galaxy beyond our own that range from the mass of the Moon to the mass of Jupiter.

Xinyu Dai, professor in the Homer L. Dodge Department of Physics and Astronomy, OU College of Arts and Sciences, with OU postdoctoral researcher Eduardo Guerras, made the discovery with data from the National Aeronautics and Space Administration's Chandra X-ray Observatory, a telescope in space that is controlled by the Smithsonian Astrophysical Observatory.

"We are very excited about this discovery. This is the first time anyone has discovered planets outside our galaxy," said Dai. "These small planets are the best candidate for the signature we observed in this study using the microlensing technique. We analyzed the high frequency of the signature by modeling the data to determine the mass."

While microlensing has often been used to discover planets within the Milky Way, the gravitational effect of even small objects can create high magnification, producing a signature that can be modeled and explained in distant galaxies as well. Until this study, there had been no evidence of planets in other galaxies.
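
For context on why even planet-mass bodies can leave a detectable imprint, the angular scale of a microlensing event is set by the Einstein radius of the lensing mass M (a standard result, not a formula quoted from the OU paper):

    \theta_E = \sqrt{ \frac{4 G M}{c^2} \, \frac{D_{LS}}{D_L D_S} }

where D_L, D_S and D_{LS} are the distances to the lens, to the source, and between lens and source. Because this lensing scale grows only as the square root of the mass, objects from Moon-mass to Jupiter-mass can still perturb the magnification of a background source enough to produce the modelable signature described above.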

"This is an example of how powerful the techniques of analysis of extragalactic microlensing can be. This galaxy is located 3.8 billion light years away, and there is not the slightest chance of observing these planets directly, not even with the best telescope one can imagine in a science fiction scenario," said Guerras. "However, we are able to study them, unveil their presence and even have an idea of their masses. This is very cool science."

For this study, OU researchers used the NASA Chandra X-ray Observatory at the Smithsonian Astrophysical Observatory. The microlensing models were calculated at the OU Supercomputing Center for Education and Research.

From Science Daily

Computer reads brain activity to find out the music each person is listening to

A computer could identify the song someone was listening to by analyzing their brain activity.
It may sound like sci-fi, but mind-reading equipment is much closer to becoming a reality than most people imagine. A new study carried out at the D'Or Institute for Research and Education used a magnetic resonance (MR) machine to read participants' minds and find out what song they were listening to. The study, published today in Scientific Reports, contributes to the improvement of the technique and paves the way for new research on the reconstruction of auditory imagination and inner speech. In the clinical domain, it could enhance brain-computer interfaces aimed at establishing communication with patients with locked-in syndrome.

In the experiment, six volunteers heard 40 pieces of classical music, rock, pop, jazz and other genres. The neural fingerprint of each song on participants' brains was captured by the MR machine while a computer learned to identify the brain patterns elicited by each musical piece. Musical features such as tonality, dynamics, rhythm and timbre were taken into account by the computer.

After that, the researchers tested whether the computer could work the other way around: identify which song participants were listening to, based on their brain activity alone -- a technique known as brain decoding. When confronted with two options, the computer identified the correct song with up to 85% accuracy, a strong performance compared with previous studies.

The researchers then made the test harder by giving the computer not two but 10 options (one correct and nine wrong). In this scenario, the computer correctly identified the song in 74% of the decisions.
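
To make "brain decoding" concrete, here is a minimal, purely illustrative sketch of the general recipe -- train a classifier on activity patterns recorded while songs play, then test whether it can pick the right song from a held-out pattern. This is not the D'Or Institute pipeline; the toy data, dimensions and classifier choice below are assumptions for illustration only.

    # Illustrative sketch of multi-class brain decoding (not the study's actual pipeline).
    # X holds one (toy) fMRI activity pattern per listening trial; y holds the song label.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_trials, n_voxels, n_songs = 200, 500, 10
    y = rng.integers(0, n_songs, n_trials)                          # which song was playing
    X = rng.normal(size=(n_trials, n_voxels)) + 0.05 * y[:, None]   # toy "neural fingerprints"

    decoder = LogisticRegression(max_iter=1000)
    accuracy = cross_val_score(decoder, X, y, cv=5).mean()          # held-out decoding accuracy
    print(f"10-way decoding accuracy: {accuracy:.2f} (chance = {1 / n_songs:.2f})")

In the study's terms, performance is judged against chance: 50 percent when the decoder chooses between two songs, 10 percent when it chooses among ten.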

In the future, studies on brain decoding and machine learning may create possibilities for communication independent of any written or spoken language. "Machines will be able to translate our musical thoughts into songs," says Sebastian Hoefle, researcher at the D'Or Institute and PhD student at the Federal University of Rio de Janeiro, Brazil. The study is the result of a collaboration between Brazilian researchers and colleagues from Germany, Finland and India.

Read more at Science Daily

Radiocarbon dating reveals mass grave did date to the Viking age

This is one of the female skulls from the Repton charnel.
A team of archaeologists, led by Cat Jarman from the University of Bristol's Department of Anthropology and Archaeology, has discovered that a mass grave uncovered in the 1980s dates to the Viking Age and may have been a burial site of the Viking Great Army war dead.

Although the remains were initially thought to be associated with the Vikings, radiocarbon dates seemed to suggest the grave consisted of bones collected over several centuries. New scientific research now shows that this was not the case and that the bones are all consistent with a date in the late 9th century. Historical records state that the Viking Great Army wintered in Repton, Derbyshire, in 873 A.D. and drove the Mercian king into exile.

Excavations led by archaeologists Martin Biddle and Birthe Kjølbye-Biddle at St Wystan's Church in Repton in the 1970s and 1980s discovered several Viking graves and a charnel deposit of nearly 300 people underneath a shallow mound in the vicarage garden.

The mound appears to have been a burial monument linked to the Great Army.

An Anglo-Saxon building, possibly a royal mausoleum, was cut down and partially ruined, before being turned into a burial chamber.

One room was packed with the commingled remains of at least 264 people, around 20 percent of whom were women. Among the bones were Viking weapons and artefacts, including an axe, several knives, and five silver pennies dating to the period 872-875 A.D. Eighty percent of the remains were men, mostly aged 18 to 45, with several showing signs of violent injury.

During the excavations, everything pointed to the burial's association with the Viking Great Army, but confusingly, initial radiocarbon dates suggested otherwise. It seemed to contain a mix of bones of different ages, meaning that they could not all have been from the Viking Age.

Now, new dating proves that they are all consistent with a single date in the 9th century and therefore with the Viking Great Army.

Cat Jarman said: "The previous radiocarbon dates from this site were all affected by something called marine reservoir effects, which is what made them seem too old.

"When we eat fish or other marine foods, we incorporate carbon into our bones that is much older than in terrestrial foods. This confuses radiocarbon dates from archaeological bone material and we need to correct for it by estimating how much seafood each individual ate."

A double grave from the site -- one of the only Viking weapon graves found in the country -- was also dated, yielding a date range of 873-886 A.D.

The grave contained two men, the older of whom was buried with a Thor's hammer pendant, a Viking sword, and several other artefacts.

He had received numerous fatal injuries around the time of death, including a large cut to his left femur. Intriguingly, a boar's tusk had been placed between his legs, and it has been suggested that the injury may have severed his penis or testicles, and that the tusk was there to replace what he had lost in preparation for the after-world.

The new dates now show that these burials could be consistent with members of the Viking Great Army.

Outside the charnel mound, another extraordinary grave can now be shown to be likely related to the Vikings in Repton as well.

Four juveniles, aged between eight and 18, were buried together in a single grave with a sheep jaw at their feet.

Next to them large stones may have held a marker, and the grave was placed near the entrance to the mass grave. At least two of the juveniles have signs of traumatic injury. The excavators suggested this may have been a ritual grave, paralleling accounts of sacrificial killings to accompany Viking dead from historical accounts elsewhere in the Viking world. The new radiocarbon dates can now place this burial into the time period of 872-885 A.D.

Cat Jarman added: "The date of the Repton charnel bones is important because we know very little about the first Viking raiders that went on to become part of considerable Scandinavian settlement of England.

Read more at Science Daily

Feb 1, 2018

How black holes shape the cosmos

Visualization of the intensity of shock waves in the cosmic gas (blue) around collapsed dark matter structures (orange/white). Similar to a sonic boom, the gas in these shock waves is accelerated with a jolt when impacting on the cosmic filaments and galaxies.
Every galaxy harbours a supermassive black hole at its center. A new computer model now shows how these gravity monsters influence the large-scale structure of our universe. The research team includes scientists from the Heidelberg Institute for Theoretical Studies (HITS), Heidelberg University, the Max Planck Institutes for Astronomy (MPIA, Heidelberg) and for Astrophysics (MPA, Garching), the US universities Harvard and the Massachusetts Institute of Technology (MIT), as well as the Center for Computational Astrophysics in New York. The project, "Illustris -- The Next Generation" (IllustrisTNG), is the most complete simulation of its kind to date. Based on the basic laws of physics, the simulation shows how our cosmos has evolved since the Big Bang. Building on the predecessor Illustris project, IllustrisTNG for the first time includes, in a simulation of this scale, some of the physical processes that play a crucial role in this evolution. First results of the IllustrisTNG project have now been published in three articles in the journal Monthly Notices of the Royal Astronomical Society. These findings should help to answer fundamental questions in cosmology.

A realistic universe out of the computer

At its intersection points, the cosmic web of gas and dark matter predicted by IllustrisTNG hosts galaxies quite similar in shape and size to real galaxies. For the first time, hydrodynamical simulations could directly compute the detailed clustering pattern of galaxies in space. Comparison with observational data -- including the newest large surveys -- demonstrates the high degree of realism of IllustrisTNG. In addition, the simulations predict how the cosmic web changes over time, in particular in relation to the underlying "backbone" of the dark matter cosmos. "It is particularly fascinating that we can accurately predict the influence of supermassive black holes on the distribution of matter out to large scales," says principal investigator Prof. Volker Springel (HITS, MPA, Heidelberg University). "This is crucial for reliably interpreting forthcoming cosmological measurements."

The most important transformation in the life cycle of galaxies

In another study, Dr. Dylan Nelson (MPA) was able to demonstrate the important impact of black holes on galaxies. Star-forming galaxies shine brightly in the blue light of their young stars until a sudden evolutionary shift ends the star formation, such that the galaxy becomes dominated by old, red stars, and joins a graveyard full of "red and dead" galaxies. "The only physical entity capable of extinguishing the star formation in our large elliptical galaxies are the supermassive black holes at their centers," explains Nelson. "The ultrafast outflows of these gravity traps reach velocities up to 10 percent of the speed of light and affect giant stellar systems that are billions of times larger than the comparably small black hole itself."

Where the stars sparkle: New findings for the structures of galaxies

IllustrisTNG also improves researchers' understanding of the hierarchical structure formation of galaxies. Theorists argue that small galaxies should form first, and then merge into ever larger objects, driven by the relentless pull of gravity. The numerous galaxy collisions literally tear some galaxies apart and scatter their stars onto wide orbits around the newly created large galaxies, which should give them a faint background glow of stellar light. These predicted pale stellar halos are very difficult to observe due to their low surface brightness, but IllustrisTNG was able to simulate exactly what astronomers should be looking for in their data. "Our predictions can now be systematically checked by observers," points out Dr. Annalisa Pillepich (MPIA), who led a further IllustrisTNG study. "This yields a critical test for the theoretical model of hierarchical galaxy formation."

Read more at Science Daily

Cancer 'vaccine' eliminates tumors in mice

Ronald Levy (left) and Idit Sagiv-Barfi led the work on a possible cancer treatment that involves injecting two immune-stimulating agents directly into solid tumors.
Injecting minute amounts of two immune-stimulating agents directly into solid tumors in mice can eliminate all traces of cancer in the animals, including distant, untreated metastases, according to a study by researchers at the Stanford University School of Medicine.

The approach works for many different types of cancers, including those that arise spontaneously, the study found.

The researchers believe the local application of very small amounts of the agents could serve as a rapid and relatively inexpensive cancer therapy that is unlikely to cause the adverse side effects often seen with bodywide immune stimulation.

"When we use these two agents together, we see the elimination of tumors all over the body," said Ronald Levy, MD, professor of oncology. "This approach bypasses the need to identify tumor-specific immune targets and doesn't require wholesale activation of the immune system or customization of a patient's immune cells."

One agent is already approved for use in humans; the other has been tested for human use in several unrelated clinical trials. A clinical trial was launched in January to test the effect of the treatment in patients with lymphoma.

Levy, who holds the Robert K. and Helen K. Summy Professorship in the School of Medicine, is the senior author of the study, which will be published Jan. 31 in Science Translational Medicine. Instructor of medicine Idit Sagiv-Barfi, PhD, is the lead author.

'Amazing, bodywide effects'


Levy is a pioneer in the field of cancer immunotherapy, in which researchers try to harness the immune system to combat cancer. Research in his laboratory led to the development of rituximab, one of the first monoclonal antibodies approved for use as an anticancer treatment in humans.

Some immunotherapy approaches rely on stimulating the immune system throughout the body. Others target naturally occurring checkpoints that limit the anti-cancer activity of immune cells. Still others, like the CAR T-cell therapy recently approved to treat some types of leukemia and lymphomas, require a patient's immune cells to be removed from the body and genetically engineered to attack the tumor cells. Many of these approaches have been successful, but they each have downsides -- from difficult-to-handle side effects to high-cost and lengthy preparation or treatment times.

"All of these immunotherapy advances are changing medical practice," Levy said. "Our approach uses a one-time application of very small amounts of two agents to stimulate the immune cells only within the tumor itself. In the mice, we saw amazing, bodywide effects, including the elimination of tumors all over the animal."

Cancers often exist in a strange kind of limbo with regard to the immune system. Immune cells like T cells recognize the abnormal proteins often present on cancer cells and infiltrate to attack the tumor. However, as the tumor grows, it often devises ways to suppress the activity of the T cells.

Levy's method works to reactivate the cancer-specific T cells by injecting microgram amounts of two agents directly into the tumor site. (A microgram is one-millionth of a gram). One, a short stretch of DNA called a CpG oligonucleotide, works with other nearby immune cells to amplify the expression of an activating receptor called OX40 on the surface of the T cells. The other, an antibody that binds to OX40, activates the T cells to lead the charge against the cancer cells. Because the two agents are injected directly into the tumor, only T cells that have infiltrated it are activated. In effect, these T cells are "prescreened" by the body to recognize only cancer-specific proteins.

Cancer-destroying rangers

Some of these tumor-specific, activated T cells then leave the original tumor to find and destroy other identical tumors throughout the body.

The approach worked startlingly well in laboratory mice with transplanted mouse lymphoma tumors in two sites on their bodies. Injecting one tumor site with the two agents caused the regression not just of the treated tumor, but also of the second, untreated tumor. In this way, 87 of 90 mice were cured of the cancer. Although the cancer recurred in three of the mice, the tumors again regressed after a second treatment. The researchers saw similar results in mice bearing breast, colon and melanoma tumors.

Mice genetically engineered to spontaneously develop breast cancers in all 10 of their mammary pads also responded to the treatment. Treating the first tumor that arose often prevented the occurrence of future tumors and significantly increased the animals' life span, the researchers found.

Finally, Sagiv-Barfi explored the specificity of the T cells by transplanting two types of tumors into the mice. She transplanted the same lymphoma cancer cells in two locations, and she transplanted a colon cancer cell line in a third location. Treatment of one of the lymphoma sites caused the regression of both lymphoma tumors but did not affect the growth of the colon cancer cells.

"This is a very targeted approach," Levy said. "Only the tumor that shares the protein targets displayed by the treated site is affected. We're attacking specific targets without having to identify exactly what proteins the T cells are recognizing."

The current clinical trial is expected to recruit about 15 patients with low-grade lymphoma. If successful, Levy believes the treatment could be useful for many tumor types. He envisions a future in which clinicians inject the two agents into solid tumors in humans prior to surgical removal of the cancer as a way to prevent recurrence due to unidentified metastases or lingering cancer cells, or even to head off the development of future tumors that arise due to mutations in genes such as BRCA1 and BRCA2.

"I don't think there's a limit to the type of tumor we could potentially treat, as long as it has been infiltrated by the immune system," Levy said.

Read more at Science Daily

3D printing of living cells

Printing a liquid-filled foam and including living cells.
Using a new technique they call 'in-air microfluidics', University of Twente scientists have succeeded in printing 3D structures with living cells. This special technique enables the fast, 'in-flight' production of micro building blocks that are viable and can be used for repairing damaged tissue, for example.

Microfluidics is all about manipulating tiny drops of fluid with sizes between a micrometer and a millimeter. Most often, chips with tiny fluidic channels, reactors and other components are used for this: lab-on-a-chip systems. Although these chips offer a broad range of possibilities -- in producing emulsions, for example, droplets carrying another substance -- the speed at which droplets leave the chip is typically in the microliter-per-minute range. For clinical and industrial applications, this is not fast enough: filling a volume of a cubic centimeter would take about 1000 minutes, or 17 hours. The technique presented now does this in a couple of minutes.
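
The throughput comparison above is straightforward arithmetic, and a quick check using only the figures quoted in the text (about one microliter per minute for a chip, jets 100 to 1000 times faster) reproduces it:

    # Back-of-the-envelope check of the throughput figures quoted in the text.
    target_volume_ul = 1000                 # 1 cubic centimeter = 1000 microliters
    chip_rate_ul_per_min = 1.0              # typical lab-on-a-chip droplet production rate
    chip_minutes = target_volume_ul / chip_rate_ul_per_min
    print(f"chip: {chip_minutes:.0f} min (~{chip_minutes / 60:.0f} h)")     # 1000 min, ~17 h
    for speedup in (100, 1000):             # jets run 100 to 1000 times faster
        print(f"{speedup}x faster jets: {chip_minutes / speedup:.0f} min")  # 10 min and 1 min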

Impact of jets

Can we reach these higher speeds by manipulating the fluids not in microchannels, but in the air instead? This was one of the questions the researchers wanted to answer. And indeed it was possible, by using two 'jets' of fluid: from one jet, droplets are shot at the other jet. Creating the jets is relatively simple, and they move 100 to 1000 times faster than droplets from a microchip. Speed is not the only advantage. When the jets contain different types of fluids that react with each other, the collision results in new materials. Smart combinations of fluids result in solid, printable building blocks in a single step.

Printing tissue

In this way, it is possible to capture a living cell inside printable material. The resulting bio building blocks are printed in a 3D structure that looks like a sponge, filled with cells and fluid. These 3D modular biomaterials have an internal structure that is quite similar to that of natural tissue. Many 3D printing techniques are based on using heat or UV light: both would damage living cells. The new microfluidic approach is therefore a promising technique in tissue engineering, in which damaged tissue is repaired by using cultured cell material of the patient.

Read more at Science Daily

Zeroing in on dopamine

Super-resolution microscopy reveals the presence of active release sites (green) on dopamine neurons (purple).
Among the brain's many chemical messengers, few stand out as much as the neurotransmitter dopamine. Linked to love, pleasure, motivation and more, dopamine signaling plays a central role in the brain's reward system. It is also critical for processes such as motor control, learning and memory.

Numerous disorders have been linked to malfunctioning dopamine neurons, including Parkinson's, schizophrenia and addiction. Because of dopamine's importance in the brain, researchers have studied the neurotransmitter for decades, making great progress in understanding its activity and when it goes awry.

Less is known, however, about the mechanisms that healthy dopamine cells use to release the neurotransmitter, a gap that has limited scientists' ability to develop treatments for a range of dopamine-related conditions.

Now, researchers from Harvard Medical School have for the first time identified the molecular machinery responsible for the precise secretion of dopamine in the brain.

Their work, published online in Cell on Feb. 1, identifies specialized sites in dopamine-producing neurons that release the dopamine in a fast, spatially precise manner -- a finding that runs counter to current models of how the neurotransmitter transmits signals in the brain.

"The dopamine system plays an essential role in many diseases, but fewer studies have asked the fundamental question of how healthy dopamine neurons release the neurotransmitter," said senior study author Pascal Kaeser, assistant professor of neurobiology at HMS.

"If your car breaks down and you want it fixed, you want your mechanic to know how a car works," he added. "Similarly, a better understanding of dopamine in the laboratory could have a tremendous impact on the ability to treat disorders in which dopamine signaling goes awry in the long term."

Dopamine research has largely centered on its dysfunction and on the protein receptors that neurons use to receive dopamine, said Kaeser. Despite the neurotransmitter's importance, studies on how it is released in the brain under normal circumstances have been limited, he added.

Promiscuous No More

To identify the molecular machinery responsible for dopamine secretion, Kaeser and his colleagues focused on dopamine-producing neurons in the midbrain, which are involved in the neural circuitry underlying movement and reward seeking.

They first searched for active zones -- specialized neurotransmitter release sites located at synapses, the junctions that connect one neuron to another. Using super-resolution microscopy to image sections of the brain into which dopamine neurons project, the team found that dopamine neurons contained proteins that mark the presence of active zones.

These zones indicate that a neuron may engage in fast synaptic transmission, in which a neurotransmitter signal is precisely transferred from one neuron to another within milliseconds.

This was the first evidence of fast active zones in dopamine neurons, which were previously thought to engage in only so-called volume transmission -- a process in which the neurotransmitter signals slowly and nonspecifically across relatively large areas of the brain.

Active zones were found at lower densities in dopamine neurons than in other neurons, and additional experiments revealed in detail how the neurotransmitter is rapidly secreted and reabsorbed at these sites.

"I think that our findings will change how we think about dopamine," Kaeser said. "Our data suggest that dopamine is released in very specific locations, with incredible spatial precision and speed, whereas before it was thought that dopamine was slowly and promiscuously secreted."

In another set of experiments, the researchers used genetic tools to delete several active zone proteins. Deleting one specific protein, RIM, was sufficient to almost entirely abolish dopamine secretion in mice. RIM has been implicated in a range of diseases including neuropsychiatric and developmental disorders.

Deleting another active zone protein, however, had little or no effect on dopamine release, suggesting that dopamine secretion relies on unique specialized machinery, the authors said.

"Our study indicates that dopamine signaling is much more organized than previously thought," said study first author Changliang Liu, an Alice and Joseph Brooks Postdoctoral Fellow and a Gordon Fellow in the Kaeser lab.

"We showed that active zones and RIM, which is associated with diseases such as schizophrenia and autism spectrum disorders in human genetic studies, are key for dopamine signaling," Liu said. "These newly identified mechanisms may be related to these disorders and may lead to new therapeutic strategies in the future."

The team is now working to investigate these active zones in greater detail to build a deeper understanding of their role in dopamine signaling and how to manipulate them.

"We are deeply invested in learning the entire dopamine signaling machine. Right now, most treatments supply the brain with dopamine in excess, which comes with many side effects because it activates processes that shouldn't be active," Kaeser said.

"Our long-term hope is to identify proteins that only mediate dopamine secretion," he said. "One can imagine that by manipulating the release of dopamine, we may be better able to reconstruct normal signaling in the brain."

Read more at Science Daily

Jan 31, 2018

Language is learned in brain circuits that predate humans

Nerve cell
It has often been claimed that humans learn language using brain components that are specifically dedicated to this purpose. Now, new evidence strongly suggests that language is in fact learned in brain systems that are also used for many other purposes and even pre-existed humans, say researchers in PNAS (Early Edition online Jan. 29).

The research combines results from multiple studies involving a total of 665 participants. It shows that children learn their native language and adults learn foreign languages in evolutionarily ancient brain circuits that also are used for tasks as diverse as remembering a shopping list and learning to drive.

"Our conclusion that language is learned in such ancient general-purpose systems contrasts with the long-standing theory that language depends on innately-specified language modules found only in humans," says the study's senior investigator, Michael T. Ullman, PhD, professor of neuroscience at Georgetown University School of Medicine.

"These brain systems are also found in animals -- for example, rats use them when they learn to navigate a maze," says co-author Phillip Hamrick, PhD, of Kent State University. "Whatever changes these systems might have undergone to support language, the fact that they play an important role in this critical human ability is quite remarkable."

The study has important implications not only for understanding the biology and evolution of language and how it is learned, but also for how language learning can be improved, both for people learning a foreign language and for those with language disorders such as autism, dyslexia, or aphasia (language problems caused by brain damage such as stroke).

The research statistically synthesized findings from 16 studies that examined language learning in two well-studied brain systems: declarative and procedural memory.

The results showed that how good we are at remembering the words of a language correlates with how good we are at learning in declarative memory, which we use to memorize shopping lists or to remember the bus driver's face or what we ate for dinner last night.

Grammar abilities, which allow us to combine words into sentences according to the rules of a language, showed a different pattern. The grammar abilities of children acquiring their native language correlated most strongly with learning in procedural memory, which we use to learn tasks such as driving, riding a bicycle, or playing a musical instrument. In adults learning a foreign language, however, grammar correlated with declarative memory at earlier stages of language learning, but with procedural memory at later stages.

The correlations were large, and were found consistently across languages (e.g., English, French, Finnish, and Japanese) and tasks (e.g., reading, listening, and speaking tasks), suggesting that the links between language and the brain systems are robust and reliable.

The findings have broad research, educational, and clinical implications, says co-author Jarrad Lum, PhD, of Deakin University in Australia.

"Researchers still know very little about the genetic and biological bases of language learning, and the new findings may lead to advances in these areas," says Ullman. "We know much more about the genetics and biology of the brain systems than about these same aspects of language learning. Since our results suggest that language learning depends on the brain systems, the genetics, biology, and learning mechanisms of these systems may very well also hold for language."

For example, though researchers know little about which genes underlie language, numerous genes playing particular roles in the two brain systems have been identified. The findings from this new study suggest that these genes may also play similar roles in language. Along the same lines, the evolution of these brain systems, and how they came to underlie language, should shed light on the evolution of language.

Additionally, the findings may lead to approaches that could improve foreign language learning and language problems in disorders, Ullman says.

For example, various pharmacological agents (e.g., the drug memantine) and behavioral strategies (e.g., spacing out the presentation of information) have been shown to enhance learning or retention of information in the brain systems, he says. These approaches may thus also be used to facilitate language learning, including in disorders such as aphasia, dyslexia, and autism.

Read more at Science Daily

Reconstructing an ancient lethal weapon

University of Washington researchers re-created ancient projectile points to test their effectiveness. From left to right: stone, microblade and bone tips.
Archaeologists are a little like forensic investigators: They scour the remains of past societies, looking for clues in pottery, tools and bones about how people lived, and how they died.

And just as detectives might re-create the scene of a crime, University of Washington archaeologists have re-created the weapons used by hunter-gatherers in the post-Ice Age Arctic some 14,000 years ago. Looking for clues as to how those early people advanced their own technology, researchers also considered what that might tell us about human migration, ancient climates and the fate of some animal species.

In an article published Jan. 31 in the Journal of Archaeological Science, Janice Wood, recent UW anthropology graduate, and Ben Fitzhugh, a UW professor of anthropology, show how they reconstructed prehistoric projectiles and points from ancient sites in what is now Alaska and studied the qualities that would make for a lethal hunting weapon.

The UW team chose to study hunting weapons from the time of the earliest archaeological record in Alaska (around 10,000 to 14,000 years ago), a time that is less understood archaeologically, and when different kinds of projectile points were in use. Team members designed a pair of experiments to test the effectiveness of the different point types. By examining and testing different points in this way, the team has come to a new understanding about the technological choices people made in ancient times.

"The hunter-gatherers of 12,000 years ago were more sophisticated than we give them credit for," Fitzhugh said. "We haven't thought of hunter-gatherers in the Pleistocene as having that kind of sophistication, but they clearly did for the things that they had to manage in their daily lives, such as hunting game. They had a very comprehensive understanding of different tools, and the best tools for different prey and shot conditions."

Prior research has focused on the flight ballistics of the hunting weapons in general, and no prior study has looked specifically at the ballistics of tools used in Siberia and the Arctic regions of North America just after the Ice Age. In addition to foraging for plants and berries (when available), nomadic groups hunted caribou, reindeer and other animals for food, typically with spears or darts (thrown from atlatl boards). Without preservation of the wood shafts, these tools are mainly differentiated in the archaeological record by their stone and bone points. But it was not known how effective different kinds of points were in causing lethal injury to prey.

Nor is it known definitively whether different types of points were associated only with certain groups of people, or whether the same groups used certain point types to specialize in particular kinds of game or hunting practices. It is generally accepted that different point types were developed in Africa and Eurasia and brought to Alaska before the end of the Ice Age. These included rudimentary points made of sharpened bone, antler or ivory; more intricate, flaked stone tips popularly familiar as "arrowheads"; and a composite point made of bone or antler with razor blade-like stone microblades embedded around the edges.

The three likely were invented at separate times but remained in use during the same period because each presumably had its own advantages, Wood said. Learning how they functioned informs what we know about prehistoric hunters and the repercussions of their practices.

So Wood traveled to the area around Fairbanks, Alaska, and crafted 30 projectile points, 10 of each kind. She tried to stay as true to the original materials and manufacturing processes as possible, using poplar projectiles, and birch tar as an adhesive to affix the points to the tips of the projectiles. While ancient Alaskans used atlatls (a kind of throwing board), Wood used a maple recurve bow to shoot the arrows for greater control and precision.

  • For the bone tip, modeled on a 12,000-year-old ivory point from an Alaskan archaeological site, Wood used a multipurpose tool to grind a commercially purchased cow bone;
  • For the stone tip, she used a hammerstone to strike obsidian into flakes, then shaped them into points modeled on those found at another site in Alaska from 13,000 years ago;
  • And for the composite microblade tip -- modeled on microblade technologies seen in Alaska since at least 13,000 years ago and on a rare, preserved grooved antler point from a more recent Alaskan site used more than 8,000 years ago -- Wood used a saw and sandpaper to grind a caribou antler to a point. She then used the multipurpose tool to gouge out a groove around its perimeter, into which she inserted obsidian microblades.

Wood then tested how well each point could penetrate and damage two different targets: blocks of ballistic gelatin (a clear synthetic gelatin meant to mimic animal muscle tissue) and a fresh reindeer carcass, purchased from a local farm. Wood conducted her trials over seven hours on a December day, with an average outdoor temperature of minus 17 degrees Fahrenheit.

In Wood's field trial, the composite microblade points were more effective than simple stone or bone on smaller prey, showing the greatest versatility and ability to cause incapacitating damage no matter where they struck the animal's body. But the stone and bone points had their own strengths: Bone points penetrated deeply but created narrower wounds, suggesting their potential for puncturing and stunning larger prey (such as bison or mammoth); the stone points could have cut wider wounds, especially on large prey (moose or bison), resulting in a quicker kill.

Wood said the findings show that hunters during this period were sophisticated enough to recognize the best point to use, and when. Hunters worked in groups; they needed to complete successful hunts, in the least amount of time, and avoid risk to themselves.

"We have shown how each point has its own performance strengths," she said. Bone points punctured effectively, flaked stone created a greater incision, and the microblade was best for lacerated wounds. "It has to do with the animal itself; animals react differently to different wounds. And it would have been important to these nomadic hunters to bring the animal down efficiently. They were hunting for food."

Weapon use can shed light on the movement of people and animals as humans spread across the globe and how ecosystems changed before, during and after the ice ages.

"The findings of our paper have relevance to the understanding of ballistic properties affecting hunting success anywhere in the world people lived during the 99 percent of human history that falls between the invention of stone tools more than 3 million years ago in Africa and the origins of agriculture," Fitzhugh said.

It could also inform debates on whether human hunting practices directly led to the extinction of some species. The team's findings and other research show that our ancestors were thinking about effectiveness and efficiency, Wood said, which may have influenced which animals they targeted. An animal that was easier to kill may have been targeted more often, which could, along with changing climates, explain why animals such as the horse disappeared from the Arctic. A shot to the lung was lethal for early equines, Wood said, but a caribou could keep going.

Read more at Science Daily

Most of last 11,000 years cooler than past decade in North America, Europe

Little Pond, located in Royalston, Mass., was among 642 ponds or lakes in North America and Europe from which fossil pollen was collected to reconstruct temperatures. The reconstructions, which looked at climate in North America and Europe over the past 11,000 years, closely matched the climate simulations run by NCAR, which were conducted independently as part of separate projects.
University of Wyoming researchers led a climate study that determined that recent temperatures across Europe and North America appear to have few, if any, precedents in the past 11,000 years.

The study also revealed that important natural fluctuations in climate have occurred over past millennia -- fluctuations that would have led to climatic cooling today in the absence of human activity.

Bryan Shuman, a UW professor in the Department of Geology and Geophysics, and Jeremiah Marsicek, a recent UW Ph.D. graduate in geology and geophysics, led the new study that is highlighted in a paper, titled "Reconciling Divergent Trends and Millennial Variations in Holocene Temperatures," published today (Jan. 31) in Nature.

Marsicek, a current postdoctoral researcher at the University of Wisconsin-Madison, was lead author of the paper. He worked on the study from 2011-16. Other contributors to the paper were from the University of Oregon, University of Utah and the U.S. Geological Survey in Corvallis, Ore. The study was largely funded by a combination of fellowships from the Environmental Protection Agency and the UW NASA Space Grant Consortium, and grant support from the National Science Foundation (NSF).

"The major significance here is temperature across two continents over the last 11,000 years. The paper provides a geologically long-term perspective on recent temperature changes in the Northern Hemisphere and the ability of climate models, such as the National Oceanic Atmospheric Administration and National Center for Atmospheric Research (NCAR) models used in the study, to predict the changes," says Shuman, senior author on the paper and Marsicek's supervisor. "Climate simulations do a strikingly good job of forecasting the changes."

"I would say it is significant that temperatures of the most recent decade exceed the warmest temperatures of our reconstruction by 0.5 degrees Fahrenheit, having few -- if any -- precedents over the last 11,000 years," Marsicek says. "Additionally, we learned that the climate fluctuates naturally over the last 11,000 years and would have led to cooling today in the absence of human activity."

The study covers a period that begins at the end of the Ice Age, when an ice sheet still covered Canada, Shuman says.

Researchers reconstructed temperatures from fossil pollen collected from 642 lake or pond sites across North America -- including water bodies in Wyoming -- and Europe. The Wyoming locations included Slough Creek Pond and Cub Creek Pond in Yellowstone National Park, Divide Lake in Bridger-Teton National Forest, Sherd Lake in the Bighorn Mountains and Fish Creek Park near Dubois.

"When we collect sediment from the bottom of the lake, we can recognize sequences of plants that grew in a given area based on the shape of the fossil pollen left behind," Shuman explains. "Because different plants grow at different temperatures, we can constrain what the temperatures were in a given place at a certain time."

The reconstructions closely matched the climate simulations run by NCAR, which were conducted independently as part of separate projects. The computer simulations later became part of the study.

"Our temperature estimates and the NCAR simulations were within one-quarter of one degree Fahrenheit, on average, for the last 11,000 years," says Shuman, as he pointed to a graph that included a black line for his group's climate research temperature and a gray line that represents the computer simulations. "I was surprised the computer models did as good of a job as they did as predicting the changes that we estimated."

Long-term warming, not cooling, defined the Holocene Epoch, which began 12,000 to 11,500 years ago at the close of the Pleistocene Ice Age. The reconstructions indicate that evidence of periods significantly warmer than the last decade was limited to a few areas of the North Atlantic that were probably unusual globally. Shuman says the results show that the last decade was roughly 6.5 degrees Fahrenheit warmer than conditions 11,000 years ago. Additionally, the decade was at least one-half degree Fahrenheit warmer than the warmest periods of that 11,000-year time frame, even accounting for uncertainties, Shuman says.
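
For readers working in metric units, the Fahrenheit differences quoted above convert as follows; a temperature difference scales by 5/9, and the figures come from the article itself.

```python
# Converting the reported Fahrenheit differences to Celsius.
# For temperature *differences*, dC = dF * 5/9.

def delta_f_to_c(delta_f):
    return delta_f * 5.0 / 9.0

print(round(delta_f_to_c(6.5), 1))   # ~3.6 C warming since 11,000 years ago
print(round(delta_f_to_c(0.5), 2))   # ~0.28 C above the warmest reconstructed period
```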

"In the absence of people, the trend would have been cooling," Shuman says. "It does show that what has happened in the last 30 years -- a warming trend -- puts us outside of all but the most extreme single years every 500 years since the Ice Age. The last 10 years have, on average, been as warm as a normal one year in 500 warm spell."

In prior climate change studies, long-term cooling has been difficult to reconcile with known global controls that would have forced warming and climate models that consistently simulate long-term warming. In those studies, marine and coastal temperature records were used. However, certain areas in the oceans could be unusually warm and skew the overall long-term average temperature results of some of those prior studies, Shuman says.

Read more at Science Daily

Engineers 3-D print shape-shifting smart gel

A tiny chess king, 3D-printed with a temperature-responsive hydrogel, in cold water. It contains 73 percent water but remains solid.
Rutgers engineers have invented a "4D printing" method for a smart gel that could lead to the development of "living" structures in human organs and tissues, soft robots and targeted drug delivery.

The 4D printing approach here involves printing a 3D object with a hydrogel (water-containing gel) that changes shape over time when temperatures change, said Howon Lee, senior author of a new study and assistant professor in the Department of Mechanical and Aerospace Engineering at Rutgers University-New Brunswick.

The study, published online today in Scientific Reports, demonstrates fast, scalable, high-resolution 3D printing of hydrogels, which remain solid and retain their shape despite containing water. Hydrogels are everywhere in our lives, including in Jell-O, contact lenses, diapers and the human body.

The smart gel could provide structural rigidity in organs such as the lungs, and can contain small molecules like water or drugs to be transported in the body and released. It could also create a new area of soft robotics, and enable new applications in flexible sensors and actuators, biomedical devices and platforms or scaffolds for cells to grow, Lee said.

"The full potential of this smart hydrogel has not been unleashed until now," said Lee, who works in the School of Engineering. "We added another dimension to it, and this is the first time anybody has done it on this scale. They're flexible, shape-morphing materials. I like to call them smart materials."

Engineers at Rutgers-New Brunswick and the New Jersey Institute of Technology worked with a hydrogel that has been used for decades in devices that generate motion and biomedical applications such as scaffolds for cells to grow on. But hydrogel manufacturing has relied heavily on conventional, two-dimensional methods such as molding and lithography.

In their study, the engineers used a lithography-based technique that's fast, inexpensive and can print a wide range of materials into a 3D shape. It involves printing layers of a special resin to build a 3D object. The resin consists of the hydrogel, a chemical that acts as a binder, another chemical that facilitates bonding when light hits it and a dye that controls light penetration.

The engineers learned how to precisely control hydrogel growth and shrinkage. In temperatures below 32 degrees Celsius (about 90 degrees Fahrenheit), the hydrogel absorbs more water and swells in size. When temperatures exceed 32 degrees Celsius, the hydrogel begins to expel water and shrinks. The objects they can create with the hydrogel range from the width of a human hair to several millimeters long. The engineers also found that they can grow one area of a 3D-printed object -- creating and programming motion -- by changing temperatures.
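
As a rough illustration of that threshold behavior, the toy model below maps temperature to a relative gel volume that switches from swollen to shrunken around 32 degrees Celsius. The collapsed volume and the smoothness of the transition are assumptions for illustration, not values from the study.

```python
import math

# Toy model of a temperature-responsive hydrogel: below ~32 C the gel is
# swollen with water, above it the gel expels water and shrinks.
# The collapsed volume and logistic form are illustrative assumptions.

TRANSITION_C = 32.0   # transition temperature reported in the article
SWOLLEN = 1.0         # relative volume when fully swollen (arbitrary units)
SHRUNKEN = 0.4        # relative volume when collapsed (assumed)

def relative_volume(temp_c, sharpness=1.5):
    """Smoothly interpolate between swollen and shrunken states."""
    collapsed_fraction = 1.0 / (1.0 + math.exp(-sharpness * (temp_c - TRANSITION_C)))
    return SWOLLEN - (SWOLLEN - SHRUNKEN) * collapsed_fraction

for t in (20, 30, 32, 34, 40):
    print(t, round(relative_volume(t), 2))
```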

Read more at Science Daily

Jan 30, 2018

Research finds early childhood program linked to degree completion at age 35

Participating in an intensive early childhood education program from preschool to third grade is linked to higher educational attainment in mid-life, according to a new study by University of Minnesota researchers.

The study, published in JAMA Pediatrics, tracked the progress of more than 1,500 children from low-income neighborhoods in Chicago, from the time they entered preschool in 1983 and 1984 in Child-Parent Centers (CPC) until roughly 30 years later. The children were part of the Chicago Longitudinal Study, one of the longest-running follow-ups of early childhood intervention.

"Children from low-income families are less likely to attend college than their higher-income peers," said lead author Arthur J. Reynolds, a professor at the University of Minnesota Institute of Child Development and director of the Chicago Longitudinal Study. "A strong system of educational and family supports in a child's first decade is an innovative way to improve educational outcomes leading to greater economic well-being. The CPC program provides this."

The JAMA Pediatrics study is the first evaluation of a large-scale public program to assess impacts on mid-life educational attainment and the contributions of continuing services in elementary school. The study's co-authors include Suh-Ruu Ou and Judy A. Temple of the University of Minnesota's Human Capital Research Collaborative.

For the study, which was funded by the National Institutes of Health, the researchers followed the progress of 989 graduates of the Chicago Public School District's CPC program, which provided intensive instruction in reading and math from preschool through third grade as part of a school reform model.

The program provides small classes, intensive learning experiences, menu-based parent involvement, and professional development. The children's parents received job skills training, parenting skills training, educational classes and social services. They also volunteered in their children's classrooms, assisted with field trips, and attended parenting support groups.

The authors compared the educational outcomes of those children to the outcomes of 550 children from low-income families who attended other early childhood intervention programs in the Chicago area. The researchers collected information on the children from administrative records, schools and families, from birth through 35 years of age. More than 90 percent of the original sample had available data on educational attainment.

On average, CPC graduates -- whether they participated in preschool only, or through second or third grade -- completed more years of education than those who participated in other programs.

For children who received an intervention in preschool, those in the CPC group were more likely to achieve an associate's degree or higher (15.7 percent vs. 10.7 percent), a bachelor's degree (11.0 percent vs. 7.8 percent), or a master's degree (4.2 percent vs. 1.5 percent). These differences translate to a 47 percent increase in an earned associate's degree and a 41 percent increase in an earned bachelor's degree.
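
Those relative increases follow directly from the absolute rates reported above; a quick check:

```python
# Checking the reported relative increases from the absolute attainment rates.
# Relative increase = (CPC rate - comparison rate) / comparison rate.

def relative_increase(cpc_pct, comparison_pct):
    return (cpc_pct - comparison_pct) / comparison_pct * 100.0

print(round(relative_increase(15.7, 10.7)))  # associate's degree or higher -> ~47
print(round(relative_increase(11.0, 7.8)))   # bachelor's degree -> ~41
```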

CPC graduates through second or third grade showed even greater gains: a 48 percent increase in associate's degree or higher and a 74 percent increase for bachelor's degree or higher.

"Every child deserves a strong foundation for a successful future, and this report provides more concrete, compelling evidence that investments in early childhood education pay dividends for decades," said Chicago Mayor Rahm Emanuel. "Chicago is expanding access to early childhood education so every child, regardless of their zip code or parents' income, can have the building blocks for a lifetime of success."

According to the study's authors, successful early childhood programs not only may lead to higher adult educational achievement, but also to improved health. The authors note that adults with less education are more likely to adopt unhealthy habits like smoking and to experience high blood pressure, obesity, and mental health problems than those who complete more schooling.

"This study shows that a well run early childhood intervention program can have benefits well into adult life," said James Griffin, Ph.D., Deputy Chief of the Child Development Branch at the Eunice Kennedy Shriver National Institute of Child Health and Human Development at the National Institutes of Health.

Read more at Science Daily

This is your brain: This is your brain outdoors

Neuroscientists at the University of Alberta are measuring auditory P3 during outdoor cycling using an active wet EEG system.
The brain acts much differently when we're outdoors compared to when we're inside the lab, a new study has found.

"It happens when we're doing normal, everyday activities, like riding a bike," explained Kyle Mathewson, a neuroscientist in UAlberta's Department of Psychology.

Mathewson and his research team put EEG equipment into backpacks and had subjects perform a standard neuroscience task while riding a bike outside. The task involved identifying changes in an otherwise consistent set of stimuli, such as a higher pitch in a series of beep sounds. They had previously performed the same experiment on stationary bikes inside their lab, but in the new study the scientists were able to record laboratory-quality measurements of brain activity outdoors using portable equipment.
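
For readers unfamiliar with this kind of "oddball" task, the sketch below generates a comparable stimulus stream: mostly standard beeps with occasional higher-pitched targets for the cyclist to detect. The tone frequencies and target probability are assumptions for illustration, not the values used in the study.

```python
import random

# Minimal sketch of an auditory "oddball" sequence: a stream of standard
# beeps with occasional higher-pitched targets. Frequencies and the target
# probability are illustrative assumptions.

STANDARD_HZ = 1000
TARGET_HZ = 1500
TARGET_PROBABILITY = 0.2  # e.g. 20 percent of beeps are targets

def make_oddball_sequence(n_trials, seed=0):
    rng = random.Random(seed)
    return [TARGET_HZ if rng.random() < TARGET_PROBABILITY else STANDARD_HZ
            for _ in range(n_trials)]

sequence = make_oddball_sequence(20)
print(sequence)
print("targets:", sequence.count(TARGET_HZ))
```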

"Something about being outdoors changes brain activity," said Joanna Scanlon, graduate student and lead author on the study. "In addition to dividing attention between the task and riding a bike, we noticed that brain activity associated with sensing and perceiving information was different when outdoors, which may indicate that the brain is compensating for environmental distractions."

The great outdoors

The study showed that our brains process stimuli, like sounds and sights, differently when we perform the same task outdoors compared to inside a lab.

"If we can understand how and what humans are paying attention to in the real world, we can learn more about how our minds work," said Scanlon. "We can use that information to make places more safe, like roadways."

"If we want to apply these findings to solve issues in our society, we need to ensure that we understand how the brain works out in the world where humans actually live, work, and play," said Mathewson, who added that almost everything we know about the human brain is learned from studies in very tightly controlled environments.

Next, the researchers will explore how this effect differs in outdoor environments with varying degrees of distraction, such as a quiet path or a busy roadway.

From Science Daily

Mammals and birds could have best shot at surviving climate change

Bird and squirrel on tree trunk
New research that analyzed more than 270 million years of data on animals shows that mammals and birds -- both warm-blooded animals -- may have a better chance of evolving and adapting to the Earth's rapidly changing climate than their cold-blooded peers, reptiles and amphibians.

"We see that mammals and birds are better able to stretch out and extend their habitats, meaning they adapt and shift much easier," said Jonathan Rolland, a Banting postdoctoral fellow at the biodiversity research centre at UBC and lead author of the study. "This could have a deep impact on extinction rates and what our world looks like in the future."

By combining data from the current distribution of animals, fossil records and phylogenetic information for 11,465 species, the researchers were able to reconstruct where animals have lived over the past 270 million years and what temperatures they needed to survive in these regions.

The planet's climate has changed significantly throughout history and the researchers found that these changes have shaped where animals live. For example, the planet was fairly warm and tropical until 40 million years ago, making it an ideal place for many species to live. As the planet cooled, birds and mammals were able to adapt to the colder temperatures so they were able to move into habitats in more northern and southern regions.

"It might explain why we see so few reptiles and amphibians in the Antarctic or even temperate habitats," said Rolland. "It's possible that they will eventually adapt and could move into these regions but it takes longer for them to change."

Rolland explained that animals that can regulate their body temperatures, known as endotherms, might be better able to survive in these places because they can keep their embryos warm, take care of their offspring and they can migrate or hibernate.

"These strategies help them adapt to cold weather but we rarely see them in the ectotherms or cold-blooded animals," he said.

Rolland and colleagues argue that studying the past evolution and adaptations of species might provide important clues to understand how current, rapid changes in temperature impact biodiversity on the planet.

From Science Daily

Northern European population history revealed by ancient human genomes

Skull included in this study from Ölsund, Hälsingland, Sweden, dating to around 2,300 BCE, in the ancient DNA laboratory at the Max Planck Institute for the Science of Human History.
An international team of scientists, led by researchers from the Max Planck Institute for the Science of Human History, analyzed ancient human genomes from 38 northern Europeans dating from approximately 7,500 to 500 BCE. The study, published today in Nature Communications, found that Scandinavia was initially settled via a southern and a northern route and that the arrival of agriculture in northern Europe was facilitated by movements of farmers and pastoralists into the region.

Northern Europe could be considered a late bloomer in some aspects of human history: initial settlement by hunter-gatherers occurred only about 11,000 years ago, after the retreat of the lingering ice sheets from the Pleistocene, and while agriculture was already widespread in Central Europe 7,000 years ago, this development reached Southern Scandinavia and the Eastern Baltic only millennia later.

Several recent studies of ancient human genomes have dealt with the prehistoric population movements that brought new technology and subsistence strategies into Europe, but how they impacted the very north of the continent has still been poorly understood.

For this study, the research team, which included scientists from Lithuania, Latvia, Estonia, Russia and Sweden, assembled genomic data from 38 ancient northern Europeans, from mobile hunter-gatherers of the Mesolithic (approximately 12,000 to 7,000 years ago) and the first Neolithic farmers in southern Sweden (approximately 6,000 to 5,300 years ago) to the metallurgists of the Late Bronze Age in the Eastern Baltic (approximately 1300 to 500 BCE). This allowed the researchers to uncover surprising aspects of the population dynamics of prehistoric northern Europe.

Two routes of settlement for Scandinavia

Previous analysis of ancient human genomes has revealed that two genetically differentiated groups of hunter-gatherers lived in Europe during the Mesolithic: the so-called Western Hunter-Gatherers excavated in locations from Iberia to Hungary, and the so-called Eastern Hunter-Gatherers excavated in Karelia in north-western Russia. Surprisingly, the results of the current study show that Mesolithic hunter-gatherers from Lithuania appear very similar to their Western neighbors, despite their geographic proximity to Russia. The ancestry of contemporary Scandinavian hunter-gatherers, on the other hand, was drawn from both Western and Eastern Hunter-Gatherers.

"Eastern Hunter-Gatherers were not present on the eastern Baltic coast, but a genetic component from them is present in Scandinavia. This suggests that the people carrying this genetic component took a northern route through Fennoscandia into the southern part of the Scandinavian peninsula. There they genetically mixed with Western Hunter-Gatherers who came from the South, and together they formed the Scandinavian Hunter-Gatherers," explains Johannes Krause, Director of the Department of Archaeogenetics at the Max Planck Institute for the Science of Human History, and senior author of the study.

Agriculture and animal herding -- cultural imports by incoming people

Large-scale farming first started in southern Scandinavia around 6,000 years ago, about one millennium after it was already common in Central Europe. In the Eastern Baltic, the inhabitants relied solely on hunting, gathering and fishing for another 1,000 years. Although some have argued that the use of the new subsistence strategy was a local development by foragers, possibly adopting the practices of their farming neighbors, the genetic evidence uncovered in the present study tells a different story.

The earliest farmers in Sweden are not descended from Mesolithic Scandinavians, but show a genetic profile similar to that of Central European agriculturalists. Thus it appears that Central Europeans migrated to Scandinavia and brought farming technology with them. These early Scandinavian farmers, like the Central European agriculturalists, inherited a substantial portion of their genes from Anatolian farmers, who first spread into Europe around 8,200 years ago and set in motion the cultural transition to agriculture known as the Neolithic Revolution.

Similarly, a near-total genetic turnover is seen in the Eastern Baltic with the advent of large-scale agro-pastoralism. While they did not mix genetically with Central European or Scandinavian farmers, beginning around 2,900 BCE the individuals in the Eastern Baltic derived large parts of their ancestry from nomadic pastoralists of the Pontic-Caspian steppe.

"Interestingly, we find an increase of local Eastern Baltic hunter-gatherer ancestry in this population at the onset of the Bronze Age," states Alissa Mittnik of the Max Planck Institute for the Science of Human History, lead author of the study. "The local population was not completely replaced but coexisted and eventually mixed with the newcomers."

Read more at Science Daily

Newborns or survivors? The unexpected matter found in hostile black hole winds

Galaxy-scale outflow driven by the central black hole.
The existence of large numbers of molecules in winds powered by supermassive black holes at the centers of galaxies has puzzled astronomers since they were discovered more than a decade ago. Molecules trace the coldest parts of space, and black holes are the most energetic phenomena in the universe, so finding molecules in black hole winds was like discovering ice in a furnace.

Astronomers questioned how anything could survive the heat of these energetic outflows. But a new theory from researchers in Northwestern University's Center for Interdisciplinary Research and Exploration in Astrophysics (CIERA) predicts that these molecules are not survivors at all, but brand-new molecules born in the winds, with unique properties that enable them to adapt to and thrive in the hostile environment.

The theory, published in the Monthly Notices of the Royal Astronomical Society, is the work of Lindheimer post-doctoral fellow Alexander Richings, who developed the computer code that, for the first time, modeled the detailed chemical processes that occur in interstellar gas accelerated by radiation emitted during the growth of supermassive black holes. Claude-André Faucher-Giguère, who studies galaxy formation and evolution as an assistant professor in Northwestern's Weinberg College of Arts and Sciences, is a co-author.

"When a black hole wind sweeps up gas from its host galaxy, the gas is heated to high temperatures, which destroy any existing molecules," Richings said. "By modeling the molecular chemistry in computer simulations of black hole winds, we found that this swept-up gas can subsequently cool and form new molecules."

This theory answers questions raised by previous observations made with several cutting-edge astronomical observatories including the Herschel Space Observatory and the Atacama Large Millimeter Array, a powerful radio telescope located in Chile.

In 2015, astronomers confirmed the existence of energetic outflows from supermassive black holes found at the center of most galaxies. These outflows kill everything in their path, expelling the food -- or molecules -- that fuel star formation. These winds are also presumed to be responsible for the existence of "red and dead" elliptical galaxies, in which no new stars can form.

Then, in 2017, astronomers observed rapidly moving new stars forming in the winds -- a phenomenon they thought would be impossible given the extreme conditions in black hole-powered outflows.

New stars form from molecular gas, so Richings and Faucher-Giguère's new theory of molecule formation helps explain the formation of new stars in winds. It upholds previous predictions that black hole winds destroy molecules upon first collision but also predicts that new molecules -- including hydrogen, carbon monoxide and water -- can form in the winds themselves.

"This is the first time that the molecule formation process has been simulated in full detail, and in our view, it is a very compelling explanation for the observation that molecules are ubiquitous in supermassive black hole winds, which has been one of the major outstanding problems in the field," Faucher-Giguère said.

Read more at Science Daily

Jan 29, 2018

NASA poised to topple a planet-finding barrier

Goddard optics experts Babak Saif (left) and Lee Feinberg (right), with help from engineer Eli Griff-McMahon, an employee of Genesis, have created an Ultra-Stable Thermal Vacuum system that they will use to make picometer-level measurements.
NASA optics experts are well on the way to toppling a barrier that has thwarted scientists from achieving a long-held ambition: building an ultra-stable telescope that locates and images dozens of Earth-like planets beyond the solar system and then scrutinizes their atmospheres for signs of life.

Babak Saif and Lee Feinberg at NASA's Goddard Space Flight Center in Greenbelt, Maryland, have shown for the first time that they can dynamically detect subatomic- or picometer-sized distortions -- changes that are far smaller than an atom -- across a five-foot segmented telescope mirror and its support structure. Collaborating with Perry Greenfield at the Space Telescope Science Institute in Baltimore, the team now plans to use a next-generation tool and thermal test chamber to further refine their measurements.

The measurement feat is good news to scientists studying future missions for finding and characterizing extrasolar Earth-like planets that potentially could support life.

To find life, these observatories would have to gather and focus enough light to distinguish the planet's light from that of its much brighter parent star and then be able to dissect that light to discern different atmospheric chemical signatures, such as oxygen and methane. This would require a super-stable observatory whose optical components move or distort no more than 12 picometers, a measurement that is about one-tenth the size of a hydrogen atom.
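
That scale comparison is easy to check against the size of a hydrogen atom, which is roughly 100 picometers across (about twice the Bohr radius); the short calculation below confirms the "about one-tenth" figure.

```python
# Quick check of the scale comparison: a hydrogen atom is roughly 100
# picometers across (twice the ~53 pm Bohr radius), so a 12 pm tolerance
# is on the order of one-tenth of an atom.

BOHR_RADIUS_PM = 52.9
hydrogen_diameter_pm = 2 * BOHR_RADIUS_PM

print(round(12 / hydrogen_diameter_pm, 2))  # ~0.11
```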

To date, NASA has not built an observatory with such demanding stability requirements.

How Displacements Occur

Displacements and movement occur when materials used to build telescopes shrink or expand due to wildly fluctuating temperatures, such as those experienced when traveling from Earth to the frigidity of space, or when exposed to fierce launch forces more than six-and-a-half times the force of gravity.

Scientists say that even nearly imperceptible, atomic-sized movements would affect a future observatory's ability to gather and focus enough light to image and analyze the planet's light. Consequently, mission planners must design telescopes to picometer accuracies and then test them at the same level across the entire structure, not just between the telescope's reflective mirrors. Movement occurring at any particular position might not accurately reflect what's actually happening in other locations.

"These future missions will require an incredibly stable observatory," said Azita Valinia, deputy Astrophysics Projects Division program manager. "This is one of the highest technology tall poles that future observatories of this caliber must overcome. The team's success has shown that we are steadily whittling away at that particular obstacle."

The Initial Test

To carry out the test, Saif and Feinberg used the High-Speed Interferometer, or HSI -- an instrument that the Arizona-based 4D Technology developed to measure nanometer-sized dynamic changes in the James Webb Space Telescope's optical components -- including its 18 mirror segments, mounts, and other supporting structures -- during thermal, vibration and other types of environmental testing.

Like all interferometers, the instrument splits light and then recombines it to measure tiny changes, including motion. The HSI can quickly measure dynamic changes across the mirror and other structural components, giving scientists insights into what is happening all across the telescope, not just in one particular spot.
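
As a rough illustration of how an interferometer turns a fringe phase shift into a displacement (this is the textbook relation for a reflective setup, not the HSI's actual processing), a surface that moves by d changes the round-trip optical path by 2d, so the measured phase shifts by 4*pi*d divided by the wavelength. The HeNe-like wavelength below is an assumption.

```python
import math

# Illustrative interferometry relation (not the HSI's actual algorithm):
# a surface displacement d changes the round-trip path by 2d, giving a
# fringe phase shift delta_phi = 4 * pi * d / wavelength.

def displacement_from_phase(delta_phi_rad, wavelength_nm=632.8):  # assumed HeNe-like wavelength
    """Return surface displacement in picometers for a measured phase shift."""
    d_nm = delta_phi_rad * wavelength_nm / (4 * math.pi)
    return d_nm * 1000.0  # nm -> pm

# A phase shift of half a milliradian at this wavelength corresponds to a
# displacement of roughly 25 pm, the level reported in the test below.
print(round(displacement_from_phase(5e-4)))  # ~25
```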

Even though the HSI was designed to measure nanometer or molecule-sized distortions -- which was the design standard for Webb -- the team wanted to see if it could use the same instrument, coupled with specially developed algorithms, to detect even smaller changes over the surface of a spare five-foot Webb mirror segment and its support hardware.

The test proved it could, measuring dynamic movement as small as 25 picometers -- about twice the desired target, Saif said.

Next Steps

However, Goddard and 4D Technology have designed a new high-speed instrument, called a speckle interferometer, that allows measurements of both reflective and diffuse surfaces at picometer accuracies. 4D Technology has built the instrument, and the Goddard team has begun initial characterization of its performance in a new thermal-vacuum test chamber that controls internal temperatures to within 1 millikelvin.

Read more at Science Daily