Nov 4, 2017

Cells driving gecko's ability to re-grow its tail identified

This is a Leopard gecko.
A U of G researcher is the first to discover the type of stem cell that is behind the gecko's ability to re-grow its tail, a finding that has implications for spinal cord treatment in humans.

Many lizards can detach a portion of their tail to avoid a predator and then regenerate a new one. Unlike mammals, the lizard tail includes a spinal cord.

Prof. Matthew Vickaryous found that the spinal cord of the tail contained a large number of stem cells and proteins known to support stem cell growth.

"We knew the gecko's spinal cord could regenerate, but we didn't know which cells were playing a key role," said Vickaryous, lead author of the study recently published in the Journal of Comparative Neurology. "Humans are notoriously bad at dealing with spinal cord injuries so I'm hoping we can use what we learn from geckos to coax human spinal cord injuries into repairing themselves."

Geckos can grow a new tail within 30 days -- faster than any other type of lizard.

In the wild, they detach their tails when grabbed by a predator. The severed tail continues to wiggle, distracting the predator long enough for the reptile to escape.

In the lab, Vickaryous simulates this by pinching the gecko's tail causing the tail to drop. Once detached, the site of the tail loss begins to repair itself, eventually leading to new tissue formation and a new spinal cord. For this study, the biomedical sciences professor, along with PhD student Emily Gilbert, investigated what happens at the cellular level before and after detachment.

They discovered that the spinal cord houses a special type of stem cell known as the radial glia. These stem cells are normally fairly quiet.

"But when the tail comes off everything temporarily changes," he said. "The cells make different proteins and begin proliferating more in response to the injury. Ultimately, they make a brand new spinal cord. Once the injury is healed and the spinal cord is restored, the cells return to a resting state."

Humans, on the other hand, respond to a spinal cord injury by making scar tissue rather than new tissue, he added. The scar tissue seals the wound quickly, but sealing the injury prevents regeneration.

"It's a quick fix but in the long term it's a problem."

"This may play a role in why we have a limited ability to repair our spinal cords. We are missing the key cells types required."

This study is part of a series of investigations into the regenerative abilities of the gecko's central nervous system. The next step is to examine how the gecko is able to make new brain cells, said Vickaryous.

Read more at Science Daily

The First Americans May Have Migrated Along a Coastal ‘Kelp Highway’

Kwakiutl tribe of the Pacific Northwest rowing in canoes.
A conventional belief about the first settlement of the Americas holds that people with ancestry from Siberia in northeastern Asia traveled into North America across the Bering Strait when it was exposed as a land bridge during the last ice age roughly 13,500 years ago.

These first settlers were thought to be the likely creators of the prehistoric Clovis culture. Remnants of this culture primarily consist of stone tools that were first excavated near Clovis, New Mexico, in 1932. Little is known about the Paleo-Indians who made the tools, but the remains of an infant boy named Anzick-1 have been associated with the Clovis. DNA analysis of Anzick-1 in 2014 revealed a genetic connection to modern Native American populations.

Although the “Clovis-first” theory concerning the earliest peopling of the Americas remained the accepted view throughout much of the 20th century, confidence in it started to crumble in the late 1980s. Archaeologists then began to find evidence for extensive seafaring and maritime colonization of places like the Ryukyu Islands off eastern Asia. Evidence for American settlements earlier than 13,500 years ago has also been mounting.

Clovis culture tools from the Rummells-Maske Site of Cedar County, Iowa.
“Understanding when, how, and who colonized the Americas remains one of the most challenging and enduring questions in archaeology,” Torben Rick of the National Museum of Natural History’s Department of Anthropology told Seeker. “One of the most important questions in American archaeology is: Who were the first Americans and how and when did they arrive?”

Todd Braje of the San Diego State University Department of Anthropology agrees.

“Thirty years ago, we thought we had all the answers,” he said. “Now, there are more questions than answers.”

Nevertheless, a somewhat clearer view of early American settlement is coming to light, one that Rick, Braje, and their colleagues address in a new paper in the journal Science.

“There is a coalescence of data — genetic, archaeological, and geologic — that support a colonization around 20,000–15,000 years ago,” Rick said. “This doesn’t preclude earlier migrations, or suggest that we should not investigate earlier migrations, but a growing body of evidence is building on intensive research that supports the 20,000–15,000 years ago timeframe, and evidence for earlier migrations is problematic and speculative.”

In the paper, he and his colleagues mention — and question — the recent discovery of the Cerutti Mastodon site in San Diego County, California. The site contains the remains of a 130,700-year-old juvenile male mastodon, which some scientists believe was butchered by a hominid that was not necessarily Homo sapiens.

One such researcher is Daniel Fisher, director of the University of Michigan Museum of Paleontology. He told Seeker that, “based on decades of experience seeing sites with evidence of human activity, and also a great deal of work on modern material trying to replicate the patterns of fractures that we see, I really know of no other way that the material of the Cerutti Mastodon site could have been produced than through human activity.”

If validated, the Cerutti find would blow a mammoth-sized hole in virtually all theories concerning when and how the Americas were first settled, given its extremely early age. For now, though, Braje, Rick, and many other archaeologists are supporting the emerging “kelp highway hypothesis.”

Fish swim through a kelp forest at the Channel Islands National Park.
Braje said that “a coastal migration into the Americas was the most likely initial colonization event,” and that “the first Americans likely arrived along the Pacific coast, not crossing the open Pacific, but migrating along the Pacific Rim in boats in a step-wise fashion” starting around 20,000 years ago.

The region, then as now, was rich in aquatic and terrestrial natural resources, with a productive kelp forest and associated marine ecosystems at sea level. The researchers explained that the kelp forest extended as far south as Baja California. From there, productive mangrove and other aquatic habitats would have been available along the Central American coast. The kelp forest then started up again in northern Peru, reaching as far south as Tierra del Fuego.

It is possible that the first people migrating to the Americas along the kelp highway and perhaps other entrance points came from different locations and cultures.

“However, there were likely limited migration corridors during the 25,000–15,000-year-ago interval,” Braje noted. “The ice-free corridor route was closed, so a terrestrial migration during this interval is unlikely. But there may have been multiple migrations during and after this interval.”

Read more at Seeker

Nov 1, 2017

Humans don't use as much brainpower as we like to think

A tiny treeshrew brain saps just as much of the body's energy as a human brain, researchers report. Shown in red are the blood vessels that deliver glucose to fuel cellular activities.
For years, scientists assumed that humans devote a larger share of their daily calories to their brains than other animals. Although the human brain makes up only 2 percent of body weight, it consumes more than 25 percent of our baseline energy budget.

But a study published Oct. 31 in the Journal of Human Evolution comparing the relative brain costs of 22 species found that, when it comes to brainpower, humans aren't as exceptional as we like to think.

"We don't have a uniquely expensive brain," said study author Doug Boyer, assistant professor of evolutionary anthropology at Duke University. "This challenges a major dogma in human evolution studies."

Boyer and his graduate student Arianna Harrington decided to see how humans stack up in terms of brain energy uptake.

Because energy travels to the brain via blood vessels, which deliver a form of sugar called glucose, the researchers measured the cross-sectional area of the bony canals that enclose the cranial arteries.

By coupling these measurements with previously published estimates of brain glucose uptake and internal skull volume as an indicator of brain size, they examined seven species, including mice, rats, squirrels, rabbits, monkeys and humans. The researchers were able to show that larger canals enclose arteries that deliver more blood, and thus glucose, to the brain.

Then, using a statistical technique called multiple regression, they calculated brain glucose uptake for an additional 15 species for which brain costs were unknown, including lemurs, monkeys and treeshrews, primate relatives from Southeast Asia.
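As a rough illustration of that calibrate-then-predict workflow, here is a minimal Python sketch: fit glucose uptake against canal size and skull volume for species where uptake is known, then predict it for a species where it isn't. All numbers and variable names below are invented for illustration; they are not the study's data or model.

    import numpy as np

    # Hypothetical illustration of the multiple-regression step.
    # Log-transformed values for seven "calibration" species (made up):
    canal_area = np.log([0.8, 1.1, 1.6, 2.3, 5.0, 21.0, 48.0])      # mm^2
    skull_vol = np.log([0.4, 1.9, 6.5, 10.4, 90.0, 400.0, 1350.0])  # cm^3
    glucose = np.log([0.03, 0.09, 0.25, 0.4, 2.7, 12.0, 77.0])      # mg/min

    # Fit glucose ~ intercept + canal area + skull volume by least squares.
    X = np.column_stack([np.ones_like(canal_area), canal_area, skull_vol])
    coef, *_ = np.linalg.lstsq(X, glucose, rcond=None)

    # Predict uptake for a species with known anatomy but unknown brain
    # cost, e.g. a treeshrew-sized skull (illustrative values only).
    new_x = np.array([1.0, np.log(1.3), np.log(3.0)])
    print("predicted glucose uptake:", np.exp(new_x @ coef), "mg/min")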

As expected, the researchers found that humans allot proportionally more energy to their brains than rodents, Old World monkeys, and great apes such as orangutans and chimpanzees.

Relative to resting metabolic rate -- the total amount of calories an animal burns each day just to keep breathing, digesting and staying warm -- the human brain demands more than twice as many calories as the chimpanzee brain, and at least three to five times more calories than the brains of squirrels, mice and rabbits.

But other animals have hungry brains too.

In terms of relative brain cost, there appears to be little difference between a human and a pen-tailed treeshrew, for example.

Even the ring-tailed lemur and the tiny quarter-pound pygmy marmoset, the world's smallest monkey, devote as much of their body energy to their brains as we do.

"This shouldn't come as too much of a surprise," Boyer said. "The metabolic cost of a structure like the brain is mainly dependent on how big it is, and many animals have bigger brain-to-body mass ratios than humans."

The results suggest that the ability to grow a relatively more expensive brain evolved not at the dawn of humans, but millions of years before, when our primate ancestors and their close relatives split from the branch of the mammal family tree that includes rodents and rabbits, Harrington said.

Previous studies calculated the amount of energy needed to fuel a brain based on neuron counts. But because the current study's method for estimating energy use relies on measurements of bone rather than soft tissue such as neurons, it is now possible to estimate brain energy demand from the fossilized remains of extinct animals as well, including early human ancestors.

"All you would need to take the measurements is an intact skull and some of the neck vertebrae," Harrington said.

Read more at Science Daily

How life arose from primordial muck: Experimental evidence overturns accepted theory

In the beginning, there were chemicals.
Life on Earth originated in an intimate partnership between the nucleic acids (genetic instructions for all organisms) and small proteins called peptides, according to two new papers from biochemists and biologists at the University of North Carolina at Chapel Hill and the University of Auckland. Their "peptide-RNA" hypothesis contradicts the widely held "RNA-world" hypothesis, which states that life originated from nucleic acids and only later evolved to include proteins.

The new papers -- one in Molecular Biology and Evolution, the other in Biosystems -- show how recent experimental studies of two enzyme superfamilies help answer tough theoretical questions about how complex life emerged on Earth more than four billion years ago.

"Until now, it has been thought to be impossible to conduct experiments to penetrate the origins of genetics," said co-author Charles Carter, PhD, professor of biochemistry and biophysics at the UNC School of Medicine. "But we have now shown that experimental results mesh beautifully with the 'peptide-RNA' theory, and so these experiments provide quite compelling answers to what happened at the beginning of life on Earth."

The special attributes of the ancestral versions of these enzyme superfamilies, and the self-reinforcing feedback system they would have formed with the first genes and proteins, would have kick-started early biology and driven the first life forms toward greater diversity and complexity, the researchers said.

Co-author Peter Wills, PhD, professor of physics at the University of Auckland, said, "Compared to the RNA-world hypothesis, what we've outlined is simply a much more probable scenario for the origin of life. We hope our data and the theory we've outlined in these papers will stimulate discussion and further research on questions relevant to the origins of life."

The two scientists are fully aware that the RNA-world hypothesis still dominates the origin-of-life research field. "That theory is so alluring and expedient that most people just don't think there's any alternative," Carter said. "But we are very confident there is."

Before there was life on Earth, there were simple chemicals. Somehow, they produced both amino acids and nucleotides that eventually became the proteins and nucleic acids necessary to create single cells. And the single cells became plants and animals. Research this century has revealed how the primordial chemical soup created the building blocks of life. There is also widespread scientific consensus on the historical path by which cells evolved into plants and animals.

But it's still a mystery how the amino acid building blocks were first assembled according to coded nucleic acid templates into the proteins that formed the machinery of all cells.

The widely accepted RNA-world theory posits that RNA -- the molecule that today plays roles in coding, regulating, and expressing genes -- elevated itself from the primordial soup of amino acids and cosmic chemicals, eventually to give rise first to short proteins called peptides and then to single-celled organisms.

Carter and Wills argue that RNA could not kick-start this process alone because it lacks a property they call "reflexivity." It cannot enforce the rules by which it is made. RNA needed peptides to form the reflexive feedback loop necessary to lead eventually to life forms.

At the heart of the peptide-RNA theory are enzymes so ancient and important that their remnants are still found in all living cells and even in some sub-cellular structures, including mitochondria and viruses. There are 20 of these ancient enzymes called aminoacyl-tRNA synthetases (aaRSs).

Each of them recognizes one of the 20 amino acids that serve as the building blocks of proteins. (Proteins, considered the machines of life, catalyze and synchronize the chemical reactions inside cells.) In modern organisms, an aaRS effectively links its assigned amino acid to an RNA string containing three nucleotides complementary to a similar string in the transcribed gene. The aaRSs thus play a central role in converting genes into proteins, a process called translation that is essential for all life forms.

The 20 aaRS enzymes belong to two structurally distinct families, each with 10 aaRSs. Carter's recent experimental studies showed that the two small enzyme ancestors of these two families were encoded by opposite, complementary strands of the same small gene. The simplicity of this arrangement, with its initial binary code of just two kinds of amino acids, suggests it occurred at the very dawn of biology. Moreover, the tight, yin-yang interdependence of these two related but highly distinct enzymes would have stabilized early biology in a way that made inevitable the orderly diversification of life that followed.

"These interdependent peptides and the nucleic acids encoding them would have been able to assist each other's molecular self-organization despite the continual random disruptions that beset all molecular processes," Carter said. "We believe that this is what gave rise to a peptide-RNA world early in Earth's history," Carter said.

Related research by Carter and UNC colleague Richard Wolfenden, PhD, previously revealed how the intimate chemistries of amino acids enabled the first aaRS enzymes to fold properly into functional enzymes, while simultaneously determining the assignments in the universal genetic coding table.

"The enforcement of the relationship between genes and amino acids depends on aaRSs, which are themselves encoded by genes and made of amino acids," Wills said. "The aaRSs, in turn, depend on that same relationship. There is a basic reflexivity at work here. Theorist Douglas Hofstadter called it a 'strange loop.' We propose that this, too, played a crucial role in the self-organization of biology when life began on Earth. Hofstadter argued that reflexivity furnishes the force driving the growth of complexity."

Carter and Wills offer two additional reasons why a pure RNA biology of any significance was unlikely to have predated a peptide-RNA biology. The first is catalysis -- the acceleration of chemical reactions involving other molecules.

Catalysis is a key feature of biology that RNA cannot perform with much versatility. In particular, RNA enzymes cannot readily adjust their activities to temperature changes likely to have happened as the earth cooled, and so cannot perform the very broad range of catalytic accelerations that would have been necessary to synchronize the biochemistry of early cell-based life forms. Only peptide or protein enzymes have that kind of catalytic versatility, Carter said.

Second, Wills has shown that seemingly insurmountable obstacles would have blocked any transition from a pure-RNA world to a protein-RNA world and onward toward life.

"Such a rise from RNA to cell-based life would have required an out-of-the-blue appearance of an aaRS-like protein that worked even better than its adapted RNA counterpart," Carter said. "That extremely unlikely event would have needed to happen not just once but multiple times -- once for every amino acid in the existing gene-protein code. It just doesn't make sense."

Read more at Science Daily

Stem Cells From Young Donors Could Combat Frailty in Old Age

Picture two 80-year-old women. One is confined to a wheelchair, physically weak, and wholly dependent on her family for buying food, getting around, and managing her finances. The second 80-year-old plays tennis every day and drives her own car. She’s physically active, mentally sharp, and 100-percent independent.

So what’s going on here? Both women have the same number of candles on their birthday cake, but they seem to be aging at different rates. The culprit, physicians say, is a condition called frailty that affects roughly 10 percent of all people over 60. Frailty is characterized by decreased physical function and immune response with age, as well as cognitive decline and increasing dependence on others for daily tasks.

The good news is that while aging is inevitable, frailty can be staved off. The standard treatment regimens for frailty are exercise, better nutrition, and prescription drug management. But researchers from the University of Miami believe they may have found another solution: stem cells. In two separate trials, groups of people in their 70s suffering from moderate to severe frailty were injected with hundreds of millions of stem cells from young healthy donors. And the initial results were promising.

Dr. Joshua Hare is a professor of medicine and director of the Interdisciplinary Stem Cell Institute at the University of Miami, where he conducted both of the stem cell-frailty trials. Participants were run through a battery of physical and cognitive tests six months after receiving stem cell treatments, including a six-minute walking test on a treadmill.

“One of the most striking findings was the walk distance,” Hare told Seeker. “It was up about 60 to 70 meters, which is close to the length of a football field. That’s a big increase in walk distance in six minutes for a person who’s slowing down. That kind of difference can make a meaningful change in a person’s day-to-day experience.”

It’s well established that humans lose stem cells as we age. These adult stem cells reside in “niches” in our organs and bone marrow, and are responsible for replenishing dead cells and growing new ones. We need stem cells to grow hair, maintain organs, and heal wounds. All of this slows down as we age and our stem cell reserves are depleted.

But there’s one type of adult stem cell that serves as a “master regulator” of all other stem cells, Hare explained. Mesenchymal stem cells, which are found all over the body, including the lining of blood vessels, orchestrate the replenishing activity in tissue, organs, and bone marrow. Hare’s hypothesis was that by boosting the supply of these master stem cells in older people, it could replenish stem cells across the body and combat the symptoms of frailty.

Hare and his team started with bone marrow donations from healthy young individuals, extracted the mesenchymal stem cells, and then grew them out in a specialized on-campus lab. In the Phase I trial, the main goal was to prove that injecting subjects with mesenchymal stem cells was safe at various doses, from 50 million to 200 million individual cells.

For the 15 individuals who participated in Phase I, there were no negative immune responses or serious health effects. (Okay, one subject passed away, but it was from unrelated causes.)

As for positive outcomes, participants who received doses of 100 million and 200 million stem cells experienced a 50 percent reduction in levels of an inflammatory biomarker called TNF-α, a signaling protein that’s been linked to mitochondrial dysfunction and disruption of organ systems. Lower levels of TNF-α mean less inflammation, which is believed to be a driver of frailty.

The Phase II stem cell trial recruited more participants (30 people with frailty) and was a double-blind placebo-controlled study. Again, there were no negative health effects of the treatment, and participants receiving the 100 million stem cell dose showed the most physical and cognitive improvements. Inflammatory markers were down, quality-of-life scores were up, and there was even evidence that stem cell therapy strengthened immune response to infections, which usually weakens with aging.

Hare was quick to point out that while mesenchymal stem cells may be a promising treatment for frailty, the injections don’t represent a cure for aging.

“Anti-aging is a science fiction concept,” said Hare. “I don’t think we’re ever going to be able to reverse aging. Frailty is aging at a faster rate. What I like about the frailty paradigm, it’s very biological, very definable, it’s very agreed upon. We’re all aging, and some people are aging in a negative way. This treatment could put them on a more positive trajectory.”

Read more at Seeker

New Discoveries Bring Hope to the Fight Against Antibiotic Resistance

Colonies of pathogenic MRSA bacteria on a blood agar plate.
Antibiotic resistance is a worldwide threat to human health and survival. Bacteria are changing in response to the extensive overuse of antibiotics in medicine and the livestock industry, rendering these medicines ineffective while endangering the people who need them most.

Antibiotics known as β-lactams — which include penicillins, cephalosporins, and carbapenems — are the most widely prescribed around the world, and are increasingly resisted by bacteria.

The World Health Organization recently placed bacteria that are resistant to cephalosporins and carbapenems on its global priority list of antibiotic-resistant pathogens, emphasizing the critical need for a way to combat this problem. After completing two recent studies, a team of UK scientists may be close to a solution.

The papers, published in the Journal of Antimicrobial Chemotherapy and in the journal Molecular Microbiology, detail the team’s discovery of two mechanisms associated with β-lactam antibiotic resistance. This allowed the researchers — a team of chemists from the University of Bristol, the University of Oxford, and the University of Leeds — to develop a promising method for reversing resistance in the most commonly prescribed antibiotics.

“Antibiotic resistance is a growing problem, directly causing the deaths of around 750,000 people globally each year,” Dr. Matthew Avison, reader in molecular bacteriology at the University of Bristol and senior author of both studies, told Seeker. “β-lactam antibiotics are the most commonly used class, about 60 percent of all prescriptions, and they are particularly important for serious infections of patients in hospitals.”

Producing β-lactamase enzymes is one important mechanism that bacteria use to resist antibiotics: the enzymes destroy the medication once it enters the cell. Avison’s team was searching for a β-lactamase inhibitor that would protect antibiotics in the body, allowing them to effectively fight infection.

They examined two inhibitors, one known as avibactam that is already being used in clinical trials, and another that is a bicyclic boronate inhibitor. On their own, both inhibitors failed to protect the antibiotic, known as ceftazidime. When paired with another antibiotic called aztreonam, however, they both worked extremely well, killing some of the most resistant bacteria the lab has ever observed.

“This discovery adds significantly to what we already know about β-lactam inhibitors,” Avison said. “We’ve shown that different combinations of β-lactamase inhibitors and β-lactam antibiotics work better than others, and most importantly, we have identified the biological explanation of this difference. We hope it will give clinicians the confidence to use these combinations in the treatment of serious infections where all other antimicrobials have failed.”

Earlier this year, the combination of avibactam and aztreonam helped save the lives of two patients in the US. Each case involved antibiotic-resistant illnesses, and the novel combination of the β-lactamase inhibitor with the β-lactam antibiotic cleared the infections.

“This would only be used on very seriously ill patients in hospitals,” Avison noted, “and so would only be given intravenously, usually via an infusion pump.”

Studies on β-lactamase enzymes continue to aid in the discovery of many more β-lactamase inhibitors, like the bicyclic boronate inhibitor that this study found to be effective. According to Avison, this is the first time in 10 years that scientists are hopeful about reversing β-lactam antibiotic resistance.

Read more at Seeker

Oct 30, 2017

Oldest recorded solar eclipse helps date the Egyptian pharaohs

Total solar eclipse corona.
Researchers have pinpointed the date of what could be the oldest solar eclipse yet recorded. The event, which occurred on 30 October 1207 BC, is mentioned in the Bible, and could have consequences for the chronology of the ancient world.

Using a combination of the biblical text and an ancient Egyptian text, the researchers were then able to refine the dates of the Egyptian pharaohs, in particular the dates of the reign of Ramesses the Great. The results are published in the Royal Astronomical Society journal Astronomy & Geophysics.

The biblical text in question comes from the Old Testament book of Joshua and has puzzled biblical scholars for centuries. It records that after Joshua led the people of Israel into Canaan -- a region of the ancient Near East that covered modern-day Israel and Palestine -- he prayed: "Sun, stand still at Gibeon, and Moon, in the Valley of Aijalon. And the Sun stood still, and the Moon stopped, until the nation took vengeance on their enemies."

"If these words are describing a real observation, then a major astronomical event was taking place -- the question for us to figure out is what the text actually means," said paper co-author Professor Sir Colin Humphreys from the University of Cambridge's Department of Materials Science & Metallurgy, who is also interested in relating scientific knowledge to the Bible.

"Modern English translations, which follow the King James translation of 1611, usually interpret this text to mean that the sun and moon stopped moving," said Humphreys, who is also a Fellow of Selwyn College. "But going back to the original Hebrew text, we determined that an alternative meaning could be that the sun and moon just stopped doing what they normally do: they stopped shining. In this context, the Hebrew words could be referring to a solar eclipse, when the moon passes between the earth and the sun, and the sun appears to stop shining. This interpretation is supported by the fact that the Hebrew word translated 'stand still' has the same root as a Babylonian word used in ancient astronomical texts to describe eclipses."

Humphreys and his co-author, Graeme Waddington, are not the first to suggest that the biblical text may refer to an eclipse. However, earlier historians claimed that it was not possible to investigate this possibility further due to the laborious calculations that would have been required.

Independent evidence that the Israelites were in Canaan between 1500 and 1050 BC can be found in the Merneptah Stele, an Egyptian text dating from the reign of the Pharaoh Merneptah, son of the well-known Ramesses the Great. The large granite block, held in the Egyptian Museum in Cairo, says that it was carved in the fifth year of Merneptah's reign and mentions a campaign in Canaan in which he defeated the people of Israel.

Earlier historians have used these two texts to try to date the possible eclipse, but without success, as they were only looking for total eclipses, in which the disc of the sun appears to be completely covered as the moon passes directly between the earth and the sun. What they failed to consider was the possibility of an annular eclipse, in which the moon passes directly in front of the sun but is too far away to cover the disc completely, leading to the characteristic 'ring of fire' appearance. In the ancient world the same word was used for both total and annular eclipses.

The researchers developed a new eclipse code, which takes into account variations in the Earth's rotation over time. From their calculations, they determined that the only annular eclipse visible from Canaan between 1500 and 1050 BC was on 30 October 1207 BC, in the afternoon. If their arguments are accepted, it would not only be the oldest solar eclipse yet recorded, it would also enable researchers to date the reigns of Ramesses the Great and his son Merneptah to within a year.
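To see why those rotation variations matter, consider the clock correction known as Delta-T, the difference between uniform time and time kept by the slowly decelerating Earth. A minimal sketch using the long-term parabolic fit of Morrison and Stephenson (an approximation; the authors' eclipse code is more sophisticated) shows the size of the effect in 1207 BC:

    # Delta-T from the Morrison & Stephenson long-term fit:
    # Delta-T ~ -20 + 32*t^2 seconds, with t in centuries from 1820.
    def delta_t_seconds(year):
        t = (year - 1820) / 100.0
        return -20.0 + 32.0 * t * t

    dt = delta_t_seconds(-1206)  # astronomical year number for 1207 BC
    hours = dt / 3600.0
    print(f"Delta-T in 1207 BC: ~{dt:.0f} s (~{hours:.1f} hours)")

    # The Earth turns 15 degrees of longitude per hour, so ignoring
    # Delta-T would misplace the eclipse track by roughly:
    print(f"track shift if ignored: ~{hours * 15:.0f} degrees of longitude")

An offset of roughly eight hours shifts where an eclipse is visible by more than 120 degrees of longitude, which is why uncorrected calculations could never reliably match an ancient eclipse to Canaan.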

Read more at Science Daily

Mystery of raging black hole beams penetrated

Artist's impression of the V404 Cygni black hole jet.
They are nature's very own Death Star beams -- ultra-powerful jets of energy that shoot out from the vicinity of black holes like deadly rays from the Star Wars super-weapon.

Now a team of scientists led by the University of Southampton has moved a step closer to understanding these mysterious cosmic phenomena -- known as relativistic jets -- by measuring how quickly they 'switch on' and start shining brightly once they are launched.

How these jets form is still a puzzle. One theory suggests that they develop within the 'accretion disc' -- the matter sucked into the orbit of a growing black hole. Extreme gravity within the disc twists and stretches magnetic fields, squeezing hot, magnetised disc material called plasma until it erupts in the form of oppositely directed magnetic pillars along the black hole's rotational axis.

Plasma travels along these focused jets and gains tremendous speed, shooting across vast stretches of space. At some point, the plasma begins to shine brightly, but how and where this occurs in the jet has been debated by scientists.

In a new study published today [Monday, October 30] in Nature Astronomy, an international team of scientists led by Dr Poshak Gandhi show how they used precise multi-wavelength observations of a binary system called V404 Cygni -- consisting of a star and a black hole closely orbiting each other, with the black hole feeding off matter from the star that falls through the disc -- to throw light on this hotly debated phenomenon.

V404 Cygni is located about 7,800 light years away in the constellation of Cygnus, and weighs as much as about nine of our Suns put together. Dr Gandhi and his collaborators captured the data in June 2015, when V404 Cygni was observed radiating one of the brightest 'outbursts' of light from a black hole ever seen -- bright enough to be visible to small telescopes used by amateur astronomers, and energetic enough to tear apart an Earth-like planet if properly focused.

Using telescopes on Earth and in space observing at exactly the same time, they captured a 0.1-second delay between X-ray flares emitted from near the black hole, where the jet forms, and the appearance of visible light flashes, marking the moment when accelerated jet plasma begins to shine.

This 'blink of an eye' delay was calculated to represent a maximum distance of 19,000 miles (30,000 km), impossible to resolve at the distance of V404 with any current telescope.
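That maximum distance is simple light-travel arithmetic: no signal can cross the gap between the X-ray and optical emission sites faster than light, so the separation can be at most the speed of light times the delay. A quick check:

    # Maximum separation implied by a 0.1-second lag between the X-ray
    # flares and the optical flashes: distance <= c * delay.
    c = 299_792_458.0  # speed of light, m/s
    delay = 0.1        # seconds
    d_km = c * delay / 1000.0
    print(f"max distance: {d_km:,.0f} km (~{d_km / 1.609:,.0f} miles)")
    # ~30,000 km, matching the figure quoted in the study.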

Dr Gandhi, of the University of Southampton, said: "Scientists have been observing jets for decades, but are far from understanding how nature creates these mind-bogglingly vast and energetic structures.

"Now, for the first time, we have captured the time delay between the appearance of X-rays and the appearance of optical light in a stellar-mass black hole at the moment jet plasma is activated. This lays to rest the controversy regarding the origin of the optical flashes, and also gives us a critical distance over which jet plasma must have been strongly accelerated to speeds approaching that of light."

In Star Wars terms, the key measurement of this study can roughly be likened to measuring the distance between the surface of the Death Star, where multiple rays of light shoot out, and the point where they converge into a single bright beam.

"But the physics of black hole jets has nothing to do with lasers or the fictional Kyber crystals that power the Death Star. Nature has found other ways to power jets," said Dr Gandhi. "Gravity and magnetic fields play the key roles here, and this is the mechanism we are trying to unravel."

The study also creates a link between V404 Cygni and supermassive black holes, which lie at the centre of massive galaxies and which weigh billions of times more than stellar-mass black holes. Similar jet physics may apply to all black holes.

Dr Gandhi said: "This is an exciting and important discovery which can be fed back into theory about relativistic jets, and contributes to our ever-growing understanding of black holes."

The X-ray emission, representing the accretion disc 'feeding' the jet at its base, was captured from Earth orbit by NASA's NuSTAR telescope, while the moment the jet became visible as optical light was caught by the ULTRACAM high-speed camera, mounted on the William Herschel Telescope on La Palma, in the Canary Islands.

Professor Vik Dhillon, of the University of Sheffield, the principal investigator behind ULTRACAM, commented: "This discovery was made possible thanks to our camera gathering 28 frames per second. It demonstrates the untapped potential of studying astrophysical phenomena at high speeds."

Read more at Science Daily

Small asteroid or comet 'visits' from beyond the solar system

A/2017 U1 is most likely of interstellar origin. Approaching from above, it was closest to the Sun on Sept. 9. Traveling at 27 miles per second (44 kilometers per second), the comet is headed away from the Earth and Sun on its way out of the solar system.
A small, recently discovered asteroid -- or perhaps a comet -- appears to have originated from outside the solar system, coming from somewhere else in our galaxy. If so, it would be the first "interstellar object" to be observed and confirmed by astronomers.

This unusual object -- for now designated A/2017 U1 -- is less than a quarter-mile (400 meters) in diameter and is moving remarkably fast. Astronomers are urgently working to point telescopes around the world and in space at this notable object. Once these data are obtained and analyzed, astronomers may know more about the origin and possibly composition of the object.

A/2017 U1 was discovered Oct. 19 by the University of Hawaii's Pan-STARRS 1 telescope on Haleakala, Hawaii, during the course of its nightly search for near-Earth objects for NASA. Rob Weryk, a postdoctoral researcher at the University of Hawaii Institute for Astronomy (IfA), was first to identify the moving object and submit it to the Minor Planet Center. Weryk subsequently searched the Pan-STARRS image archive and found it also was in images taken the previous night, but was not initially identified by the moving object processing.

Weryk immediately realized this was an unusual object. "Its motion could not be explained using either a normal solar system asteroid or comet orbit," he said. Weryk contacted IfA graduate Marco Micheli, who had the same realization using his own follow-up images taken at the European Space Agency's telescope on Tenerife in the Canary Islands. But with the combined data, everything made sense. Said Weryk, "This object came from outside our solar system."

"This is the most extreme orbit I have ever seen," said Davide Farnocchia, a scientist at NASA's Center for Near-Earth Object Studies (CNEOS) at the agency's Jet Propulsion Laboratory in Pasadena, California. "It is going extremely fast and on such a trajectory that we can say with confidence that this object is on its way out of the solar system and not coming back."

The CNEOS team plotted the object's current trajectory and even looked into its future. A/2017 U1 came from the direction of the constellation Lyra, cruising through interstellar space at a brisk clip of 15.8 miles (25.5 kilometers) per second.

The object approached our solar system from almost directly "above" the ecliptic, the approximate plane in space where the planets and most asteroids orbit the Sun, so it did not have any close encounters with the eight major planets during its plunge toward the Sun. On Sept. 2, the small body crossed under the ecliptic plane just inside of Mercury's orbit and then made its closest approach to the Sun on Sept. 9. Pulled by the Sun's gravity, the object made a hairpin turn under our solar system, passing under Earth's orbit on Oct. 14 at a distance of about 15 million miles (24 million kilometers) -- about 60 times the distance to the Moon. It has now shot back up above the plane of the planets and, travelling at 27 miles per second (44 kilometers per second) with respect to the Sun, the object is speeding toward the constellation Pegasus.
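A back-of-the-envelope way to see why astronomers are confident the object is unbound (a sketch, not the CNEOS calculation) is to compare its speed with the Sun's escape velocity at the same distance:

    import math

    G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
    M_SUN = 1.989e30  # solar mass, kg
    AU = 1.496e11     # astronomical unit, m

    def v_escape(r_m):
        """Escape velocity from the Sun at heliocentric distance r_m."""
        return math.sqrt(2 * G * M_SUN / r_m)

    # Near Earth's orbit (~1 AU), escape velocity is about 42 km/s.
    print(f"escape velocity at 1 AU: {v_escape(AU) / 1000:.1f} km/s")
    # A/2017 U1 is moving at ~44 km/s relative to the Sun, faster than
    # escape velocity, so its orbit is hyperbolic: it will leave the
    # solar system and never return.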

"We have long suspected that these objects should exist, because during the process of planet formation a lot of material should be ejected from planetary systems. What's most surprising is that we've never seen interstellar objects pass through before," said Karen Meech, an astronomer at the IfA specializing in small bodies and their connection to solar system formation.

The small body has been assigned the temporary designation A/2017 U1 by the Minor Planet Center (MPC) in Cambridge, Massachusetts, where all observations on small bodies in our solar system -- and now those just passing through -- are collected. Said MPC Director Matt Holman, "This kind of discovery demonstrates the great scientific value of continual wide-field surveys of the sky, coupled with intensive follow-up observations, to find things we wouldn't otherwise know are there."

Since this is the first object of its type ever discovered, rules for naming this type of object will need to be established by the International Astronomical Union.

Read more at Science Daily

Humans Started Domesticating Crops Thousands of Years Earlier Than We Thought

An Iraqi farmer harvests barley in the village of Muaibira on May 2, 2006 in Iraq.
Humankind was domesticating grains thousands of years earlier than previously thought, a discovery that calls into question whether people are unique compared to other animals that tame plants for their benefit, according to researchers.

Writing in the journal Philosophical Transactions of the Royal Society B, researchers in Britain and Japan used genetic evidence to conclude that people in the Fertile Crescent — a band of land running from what today is the Nile River in Egypt to the Tigris and Euphrates Rivers in Iraq — influenced the evolution of einkorn, an ancient grain, as long as 30,000 years ago.

Other cereals, emmer and barley, likely mutated due to human actions around 25,000 and 21,000 years ago, respectively, in the same region. In Asia, rice was undergoing the same process around 13,000 years ago.

Until now, the scientific consensus was that people domesticated plants around 12,000 years ago, paving the way for the dawn of civilization and the establishment of cities.

“This study changes the nature of the debate about the origins of agriculture,” said study co-author Robin Allaby, a life sciences professor at the University of Warwick.

Robin Allaby
With the help of funding from the European Research Council, Allaby and his colleagues examined cereal grains from archeological sites throughout the world and tested them for a genetic trait that makes grain seeds fall off the stem easily.

Wild cereal strains bear the trait, which makes it easier for them to spread their seeds and propagate. Domesticated cereal strains lack it, with good reason: keeping the seeds on the stem makes the crop easier to cultivate. Otherwise, farmers would lose their crops before harvest.

Scientists have long believed that people unwittingly domesticated grains by harvesting wild cereals.

The theory goes that people would have gathered the grain seeds that fell readily off the plant, leaving behind those with the mutation that kept seeds attached for longer. Those remaining seeds would eventually fall off, spreading strains containing the mutation. Natural selection would then result in the proliferation of the strains we now know as domesticated grains.

The study didn’t challenge or confirm that theory. But the researchers found traces of the hardy seed mutation dating back far earlier than experts had discovered in the past. That led them to conclude that domesticating wild grains took tens of thousands of years, rather than a few millennia during the so-called “agricultural revolution.”

“It really looks natural,” said Allaby. “It really looks like a natural consequence of dense human populations in a certain ecology. There is no revolution.”
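Allaby's point about slow, natural selection can be made concrete with a toy model (a sketch of my own, not the paper's analysis): a non-shattering mutation that gains only a tiny reproductive edge from human harvesting takes thousands of generations, that is, thousands of years for an annual grain, to take over.

    # Toy haploid selection model: each generation, the favored variant's
    # frequency p changes as p' = p(1+s) / (p(1+s) + (1-p)).
    def generations_to_spread(p0=0.001, s=0.001, target=0.99):
        p, gens = p0, 0
        while p < target:
            p = p * (1 + s) / (p * (1 + s) + (1 - p))
            gens += 1
        return gens

    # With a 0.1% per-generation advantage, spreading from 0.1% to 99%
    # frequency takes on the order of 10,000 generations.
    print(generations_to_spread())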

Read more at Seeker

Alcohol Consumption Appears to Improve Foreign Language Skills

Do you ever feel like your high school Spanish comes back to you more easily after a few drinks? A new study published in the Journal of Psychopharmacology shows there might be some validity to that feeling. Researchers at Maastricht University in the Netherlands found that people who recently learned Dutch spoke the language better after they consumed a small amount of alcohol.

“On the one hand, we know that alcohol consumption has negative effects on executive functioning and cognitive functioning which is necessary for language production,” Fritz Renner, lead author and a post-doctoral student in emotional disorders at the University of Cambridge, told Seeker. “[But] there is a popular belief among foreign language learners that alcohol improves their foreign language. There are two seemingly opposing positions that makes it interesting to test.”

Study participants included 50 native German speakers, students at Maastricht who had recently learned to speak, read, and write in Dutch.

For the trials, each participant drank a beverage that was either alcoholic or non-alcoholic without knowing which one they were consuming. The amount of alcohol was adjusted to each person’s height and body weight, but was equivalent to a 154-pound man drinking a little less than one pint of beer.

Afterwards, two native Dutch speakers, who did not know if the subject had consumed an alcoholic drink, observed the students speaking Dutch. Each subject also rated their own language skills.

Participants who consumed alcohol before speaking Dutch almost uniformly received higher ratings than those who did not. The native speakers rated them particularly high on their pronunciation.

“The [speakers] received higher ratings on an overall rating scale,” Renner said. “But when looking at specific ratings, pronunciation was higher in the alcohol group.”

The alcohol did not affect self-ratings. People rated themselves on their language skills the same with or without alcohol.

It’s a long-held assumption that we can speak a foreign language better after a few drinks, the study authors noted, and this research partially supports that idea. However, it’s important to keep in mind that the dosage of alcohol was very low. Drinking more may not have the same benefits. Researchers have studied the effects of alcohol on speech for years, with most studies showing that intoxication causes a speaker to slur their words and make errors even in their native language.

Read more at Seeker