Jun 2, 2018

Weight changes associated with reduced bone strength

Researchers from Hebrew SeniorLife's Institute for Aging Research, Boston University, Beth Israel Deaconess Medical Center, and University of Calgary have found evidence that weight loss can result in worsening bone density, bone architecture and bone strength. The results were published in the Journal of Bone and Mineral Research.

Douglas P. Kiel, MD, MPH, principal investigator for the study, said, "The study is significant because it used data on weight changes over 40 years in participants in the Framingham Study. We showed that men and women with both shorter term weight loss over 4-6 years and longer term weight loss over 40 years had more micro-architectural deterioration of their bones than persons who did not lose weight."

The magnitude of the changes to the skeleton was clinically significant, translating into an almost three-fold increase in fracture risk for those who lost 5% or more of their weight over 40 years.

Elizabeth (Lisa) Samelson, PhD, senior author of the paper, cautioned that "Older adults who are losing weight should be aware of the potential negative effects on the skeleton and may want to consider counteracting these effects through interventions such as weight-bearing exercise and eating a balanced diet. Given that weight loss is highly common in older adults, further work is needed to evaluate if these bone deficits can be prevented through interventions or therapy."

From Science Daily

Synthetic 'tissues' build themselves

Researchers programmed cells to self-assemble into complex structures such as this one with three differently colored layers.
How do complex biological structures -- an eye, a hand, a brain -- emerge from a single fertilized egg? This is the fundamental question of developmental biology, and a mystery still being grappled with by scientists who hope to one day apply the same principles to heal damaged tissues or regrow ailing organs.

Now, in a study published May 31, 2018 in Science, researchers have demonstrated the ability to program groups of individual cells to self-organize into multi-layered structures reminiscent of simple organisms or the first stages of embryonic development.

"What is amazing about biology is that DNA allows all the instructions required to build an elephant to be packed within a tiny embryo," said study senior author Wendell Lim, PhD, chair and Byers Distinguished Professor in the Department of Cellular and Molecular Pharmacology at UCSF, director of the NIH-funded Center for Systems and Synthetic Biology, and co-director of the National Science Foundation-funded Center for Cellular Construction. "DNA encodes an algorithm for growing the organism -- a series of instructions that unfolds in time in a way we still don't really understand. It's easy to get overwhelmed by the complexity of natural systems, so here we set out to understand the minimal set of rules for programming cells to self-assemble into multicellular structures."

A critical part of development is that, as biological structures form, cells communicate with one another and make coordinated, collective decisions about how to structurally organize themselves. To mimic this process, the new research -- led by UCSF postdoctoral researcher Satoshi Toda, PhD, in Lim's lab -- relied on a powerfully customizable synthetic signaling molecule called synNotch (for "synthetic Notch receptor") recently developed in Lim's lab, which allowed the researchers to program cells to respond to specific cell-cell communication signals with bespoke genetic programs.

For example, using synNotch, the researchers engineered cells to respond to specific signals from neighboring cells by producing Velcro-like adhesion molecules called cadherins as well as fluorescent marker proteins. Remarkably, just a few simple forms of collective cell communication were sufficient to cause ensembles of cells to change color and self-organize into multi-layered structures akin to simple organisms or developing tissues.

In their simplest experiment of this sort, the researchers programmed two groups of cells to self-organize into a two-layered sphere. They started with one group of blue cells expressing a signaling protein on their surfaces, and a second group of colorless cells sporting a custom synNotch receptor programmed to detect this signal protein. When isolated from one another, these cell populations did nothing, but when the two groups were mixed, the blue cells activated the synNotch receptors on the clear cells and triggered them to produce sticky cadherins and a green marker protein called GFP. As a result, the colorless cells quickly began to turn green and cluster together, forming a central core surrounded by an outer layer of their partner blue cells.

The researchers went on to program groups of cells to self-assemble in increasingly complex ways, such as building three-layered spheres or starting with a single group of cells that sorted themselves into two distinct groups before forming a layered sphere. They even engineered cells that formed the beginnings of "polarity" -- the distinct front-back, left-right, head-toe axes that define the "body plans" of many multicellular organisms -- by expressing different types of cadherin adhesion molecules that instructed the cellular assemblages to divide into "head" and "tail" sections or to produce four distinct radial "arms."

These more complex cellular programming feats demonstrated that simple starter cells could be programmed to develop over time to form more complex structures, much like a single fertilized egg divides and differentiates to form different parts of the body and distinct tissues like skin, muscle, nerve, and bone. Lim's team showed that these complex spheroids were also self-repairing: when the researchers cut the multi-layered spheroids in half with a micro-guillotine developed by co-authors Lucas R. Blauch and Sindy Tang, PhD, of Stanford University, the remaining cells quickly re-formed and reorganized themselves according to their intrinsic program.

SynNotch was originally developed in the Lim lab by co-author Kole Roybal, PhD, now an assistant professor of microbiology and immunology at UCSF, and Leonardo Morsut, PhD, now an assistant professor of stem cell biology and regenerative medicine at the University of Southern California and co-corresponding author on the new paper.

In the future, Lim imagines programming ever more complex tissue-like cellular structures through multiple layers of synNotch signaling. For example, activation of one synNotch receptor by cell-cell contact or chemical signaling could trigger cells to produce additional distinct synNotch receptors, leading to a cascade of engineered signaling steps. In this way, Lim envisions programming the self-organization of the elaborate structures that would eventually be needed for growing tissues for wound repair or transplant.

"People talk about 3D-printing organs, but that is really quite different from how biology builds tissues. Imagine if you had to build a human by meticulously placing every cell just where it needs to be and gluing it in place," said Lim, who is a Howard Hughes Medical Institute investigator and member of the UCSF Helen Diller Family Comprehensive Cancer Center. "It's equally hard to imagine how you would print a complete organ, then make sure it was hooked up properly to the bloodstream and the rest of the body. The beauty of self-organizing systems is that they are autonomous and compactly encoded. You put in one or a few cells, and they grow and organize, taking care of the microscopic details themselves."

Read more at Science Daily

Where the brain processes spiritual experiences

Stained glass brain
Yale scientists have identified a possible neurobiological home for the spiritual experience -- the sense of connection to something greater than oneself.

Activity in the parietal cortex, an area of the brain involved in awareness of self and others as well as attention processing, seems to be a common element among individuals who have experienced a variety of spiritual experiences, according to a study published online May 29 in the journal Cerebral Cortex.

"Spiritual experiences are robust states that may have profound impacts on people's lives," said Marc Potenza, professor of psychiatry, of the Yale Child Study Center, and of neuroscience. "Understanding the neural bases of spiritual experiences may help us better understand their roles in resilience and recovery from mental health and addictive disorders."

Spiritual experiences can be religious in nature or not, such as a feeling of oneness in nature or the absence of self during sporting events. Researchers at Yale and the Spirituality Mind Body Institute at Columbia University interviewed 27 young adults to gather information about past stressful and relaxing experiences as well as their spiritual experiences. The subjects then underwent fMRI scans while listening for the first time to recordings based on their personalized experiences. While individual spiritual experiences differed, researchers noted similar patterns of activity in the parietal cortex as the subjects imagined experiencing the events in the recordings.

Potenza stressed that other brain areas are probably also involved in the formation of spiritual experiences. The method can help future researchers study spiritual experience and its impact on mental health, he said.

Lisa Miller of Columbia is first author of the study.

From Science Daily

Jun 1, 2018

Ancient tooth shows Mesolithic ancestors were fish and plant eaters

The Mesolithic tooth had evidence of fish scales and fish tissue.
Previous analysis of Mesolithic skeletal remains in this region has suggested a more varied Mediterranean diet consisting of terrestrial, freshwater and marine food resources, not too dissimilar to what modern humans eat today.

Although this recent find is the only example of a skeleton that provides evidence of both fish and plants in the diet of early people in this region, the researchers argue that the discovery provides a significant insight into the lifestyle of Adriatic and Mediterranean foragers.

The team found microfossils entombed in the dental calculus, commonly known as tooth plaque or tartar, of the young male skeleton, revealing fish scale fragments and fish muscle fibres.

Analysis also showed microfossils of plants in the dental calculus, which had not been identified in skeletal remains in this part of the Mediterranean before. The researchers point out that finding both ancient plant and fish deposits in the teeth further demonstrates the value of dental remains in the understanding of human evolution.

Dr Harry Robson, from the University of York's Department of Archaeology, said: "Whilst fishing during the Mesolithic period has been demonstrated by fish remains as well as fishing related technologies in the past, here, for the first time we have direct evidence that humans consumed these resources, or used their teeth for de-scaling activities, which is very unique.

"The skeleton, which has been dated to the late eighth millennium BC, is also significant in terms of its bone chemistry. Using carbon and nitrogen stable isotope analysis we were able to demonstrate that marine resources were a major component of the diet of this individual over a sustained period of time."

The team were unable to identify the fish scales, although they are thought to be very similar to those of tuna, mackerel, and gilthead sea bream.

Despite the lack of a grave, the male, who was between 30 and 40 years of age, was probably purposefully buried there. Although long-term consumption of marine resources is a rare find for this period and region, dental analysis of more skeletal finds could help reveal whether it was common in early human diets.

Lead researcher, Professor Emanuela Cristiani, from Sapienza University of Rome, said: "This is an exciting, but surprising finding. We only have three skeletal remains from this period that demonstrate the long-term consumption of marine-resources, so when you can identify microfossils of this kind, it can provide a great leap forward in our understanding.

"Our data provides a novel perspective on forager diet in the Mediterranean region by revealing the role of marine organisms during the Mesolithic.

Read more at Science Daily

Meet NOTCH2NL, the human-specific genes that may have given us our big brains

This image shows a human skull overlaid with an illustration of the human brain.
The evolution of larger brains in the last 3 million years played an important role in our ability as a species to think, problem-solve, and develop culture. But the genetic changes behind the expansion that made us human have been elusive. In a pair of papers published May 31 in Cell, two teams of researchers identify a gene family, NOTCH2NL, that appears to play an important role in human-specific cortex development and may have been a driving force in the evolution of our large brains. NOTCH2NL genes delay the differentiation of cortical stem cells into neurons, resulting in the production of more neurons across the course of development. The genes are found exclusively in humans, are heavily expressed in neural stem cells of the human cerebral cortex, and are located on a part of the genome implicated in neurodevelopmental disorders.

"Our brains got three times as big primarily through the expansion of certain functional areas of the cerebral cortex, and that has to be a fundamental substrate for us becoming human. There's really no more exciting scientific question that I can think of than discovering and decoding the mysterious genetic changes that made us who we are," says David Haussler, co-senior author of one of the papers and a bioinformatician at the University of California, Santa Cruz, and Howard Hughes Medical Institute.

The team led by Haussler and co-senior authors Frank Jacobs of the University of Amsterdam and Sofie Salama of the University of California, Santa Cruz, and Howard Hughes Medical Institute was comparing genes expressed during brain development in humans and macaque monkeys in stem cell-derived models when they realized that they could detect NOTCH2NL in human cells but not in those of the macaques. Looking at the DNA, they also didn't see it in orangutans and found only truncated, inactive versions in our closest relatives, gorillas and chimpanzees.

Reconstructing the evolutionary history of NOTCH2NL genes revealed that a process called gene conversion was likely responsible for repairing a non-functional version of NOTCH2NL, which originally emerged as a partial duplication of an essential neurodevelopmental gene known as NOTCH2. This repair happened only in humans -- and they estimate it happened 3-4 million years ago, around the same time that the fossil record suggests human brains began to expand. After it was repaired, but before we diverged from our common ancestor with Neanderthals, NOTCH2NL was duplicated two more times.

The team behind the other paper, led by developmental biologist Pierre Vanderhaeghen of the Université Libre de Bruxelles (ULB) and VIB-KU Leuven, arrived at NOTCH2NL from a related direction, searching for human-specific genes active during fetal brain development using primary tissue. "One of the holy grails of researchers like us is to find out what during human development and evolution is responsible for a bigger brain, particularly the cerebral cortex," Vanderhaeghen says. "Given the relatively fast evolution of the human brain, it is tempting to speculate that newly evolved, human-specific genes may help shape our brain in a species-specific way."

Searching for human-specific genes involved in brain development proved challenging because these genes are typically poorly annotated in genome databases and hard to distinguish from more common genes present in other species. The Vanderhaeghen team developed a tailored RNA sequencing analysis for specific and sensitive detection of human-specific genes in human fetal cerebral cortex. This allowed them to identify a repertoire of 35 genes unique to humans that are active during development of the cerebral cortex in humans, including NOTCH2NL genes.

They zeroed in on NOTCH2NL in particular because of the importance of its ancestral gene, NOTCH2, in signaling processes that control whether cortical stem cells produce neurons or regenerate more stem cells. And they found that artificially expressing NOTCH2NL in mouse embryos increased the number of progenitor stem cells in the mouse cortex. To better understand what the genes do in humans, the team turned to an in vitro model of cortical development from human pluripotent stem cells to explore NOTCH2NL function.

In this model, they found that NOTCH2NL can substantially expand the population of cortical stem cells, which in turn generate more neurons, a feature expected to distinguish between human and non-human cortical neurogenesis. "From one stem cell, you can either regenerate two progenitor cells, generate two neurons, or generate one progenitor stem cell and one neuron. And what NOTCH2NL does is bias that decision in a slight way towards regenerating progenitors, which can later go on to make more neurons. It's a small early effect with large late consequences, as often happens with evolution," Vanderhaeghen says.
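That "small early effect with large late consequences" can be made concrete with a toy branching model. The sketch below (Python; the probabilities, round counts and trial counts are illustrative assumptions, not measurements from either paper) compares neuron output when the three division outcomes are balanced against a run where self-renewal gets a slight NOTCH2NL-like boost:

```python
import random

def mean_neurons(p_two_progenitors, p_two_neurons, rounds=20, trials=2000, seed=1):
    """Toy branching model of cortical neurogenesis (illustrative only).
    Each progenitor divides into two progenitors, two neurons, or one of
    each -- the three outcomes described above. Returns the mean number
    of neurons produced after the given number of division rounds."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        progenitors, neurons = 1, 0
        for _ in range(rounds):
            next_gen = 0
            for _ in range(progenitors):
                r = rng.random()
                if r < p_two_progenitors:
                    next_gen += 2              # self-renewing division
                elif r < p_two_progenitors + p_two_neurons:
                    neurons += 2               # neurogenic division
                else:
                    next_gen += 1              # asymmetric division:
                    neurons += 1               # one progenitor, one neuron
            progenitors = next_gen
        total += neurons
    return total / trials

print(mean_neurons(0.30, 0.30))  # balanced decisions
print(mean_neurons(0.35, 0.30))  # slight bias toward self-renewal
```

In expectation the balanced run yields about 20 neurons and the biased run about 31: a five-percentage-point shift per division compounds into roughly 50 percent more neurons after 20 rounds.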

Haussler's team looked at what happened when NOTCH2NL wasn't expressed: they deleted it from human stem cells and used them to grow patches of cortex called organoids. In the organoids derived from NOTCH2NL-depleted stem cells, differentiation occurred faster, but the organoids ended up being smaller. "If you lose NOTCH2NL, it leads to premature differentiation of cortical stem cells into neurons, but at the same time the very important stem cell pool gets depleted," says Jacobs.

NOTCH2NL's location on the genome, incorrectly mapped until recently, is further support for its role in human brain size. Duplications or deletions at a genome region known as 1q21.1 are known to cause macrocephaly or microcephaly, respectively, and are associated with a range of neurodevelopmental disorders, including ADHD, autism spectrum disorder, and intellectual disability. Haussler's team looked at 11 patients with errors at this locus and found that NOTCH2NL was indeed duplicated and deleted in the rearrangement events associated with the resulting larger and smaller brain sizes. "We really wanted the gene to be in the 1q21.1 disease interval, because it made logical sense, but in the incorrect reference genome, it wasn't. And then we found new data, and we realized that it was a mistake in the reference genome! It seldom happens that when you want something that appears to be false to be true, it turns out to actually be true. I don't think something of that level will ever happen again in my career," says Haussler.

This part of the genome is simply challenging to sequence and read. "We've been looking under the lamppost in human genetics, as they say, by studying just the regions that were easy to sequence. There's a lot of information in these other regions, and there's a reasonable argument that they are the real cauldron for rapid change over the last few million years," he says.

Because NOTCH2NL is something of an evolutionary trade-off between larger brains and 1q21.1 disease susceptibility, the researchers are all quick to point out that there is plenty of healthy variation here, too. "It's a boon that may have enabled us to get a big brain. And yes, it's a bane, because we can have these recombination events that can be bad. But what we found when we developed the technology to really sequence it in individuals is that there are multiple different alleles of this gene. And it's possible that that variation creates the subtlety and plasticity that is important in enabling humans to be human," says Salama.

There are still a lot of unknowns when it comes to NOTCH2NL. Haussler's team points out that they were only able to look at the genomes of a small sample of patients and that their organoid models didn't address the later stages of cortical development, at a time when NOTCH2NL might be even more important. Another important question Vanderhaeghen's team wants to address is what other human-specific genes identified here (in particular, those also found in the 1q21.1 region or other genome areas associated with brain diseases) do during brain development. And although both teams were able to show that NOTCH2NL is involved in the well-studied Notch signaling pathway, Vanderhaeghen acknowledges that there is still uncertainty around the exact mechanism by which NOTCH2NL tips the balance between differentiation and regeneration.

"What's amazing is that there are many signaling pathways that control the development of the embryo and are completely conserved between species. The Notch signaling pathway is the oldest one. You can find it in every animal you look at. It has been used by developing embryos for as long as animals have existed. And yet, there is a very recent innovation in this pathway specifically in the human lineage, through NOTCH2NL," says Vanderhaeghen.

Read more at Science Daily

Micro-CT scans show 2,100-year-old 'hawk' mummy a stillborn baby

Micro-CT scans determined a 'hawk mummy' at Maidstone Museum UK is in fact a stillborn male human with severe congenital abnormalities that include a malformed skull and vertebrae. An international team's unprecedented analysis was led by bioarchaeologist and mummy expert Andrew Nelson of Western University, Canada.
A tiny Egyptian mummy long believed to be that of a hawk is actually a rare example of a near-to-term, severely malformed fetus, according to an examination led by mummy expert Andrew Nelson of Western University in London, Canada.

Detailed micro-CT scans have virtually unwrapped the mummy to reveal what would have been a family tragedy even two millennia ago: a male, stillborn at 23 to 28 weeks of gestation, and with a rare condition called anencephaly in which the brain and skull fail to develop properly.

Its misidentification in the Maidstone Museum in the UK, as 'EA 493 -- Mummified Hawk Ptolemaic Period', came to light in 2016 when the museum decided to CT-scan its resident female mummy and, incidentally, to scan 'EA 493' and other animal mummies at the same time. That's when the smaller mummy surprised experts, who identified it as a human fetus. But the CT scans lacked detail and Nelson worked with the Museum and Nikon Metrology (UK) to conduct a micro-CT scan: an extremely high-resolution scan that didn't entail damaging the mummy in any way.

Nelson then assembled an interdisciplinary team to examine and interpret the images in what has become the highest-resolution scan ever conducted of a fetal mummy.

The images show well-formed toes and fingers but a skull with severe malformations, says Nelson, a bioarchaeologist and professor of anthropology at Western. "The whole top part of his skull isn't formed. The arches of the vertebrae of his spine haven't closed. His ear bones are at the back of his head."

There are no bones to shape the broad roof and sides of the skull, where the brain would ordinarily grow. "In this individual, this part of the vault never formed and there probably was no real brain," Nelson says.

That makes it one of just two anencephalic mummies known to exist (the other was described in 1826), and by far the most-studied fetal mummy in history.

Nelson recently presented the team's findings at the Extraordinary World Congress on Mummy Studies in the Canary Islands.

The research provides important clues to the maternal diet -- anencephaly can result from lack of folic acid, found in green vegetables -- and raises new questions about whether mummification in this case took place because fetuses were believed to have some power as talismans, Nelson says.

"It would have been a tragic moment for the family to lose their infant and to give birth to a very strange-looking fetus, not a normal-looking fetus at all. So this was a very special individual," Nelson says.

Read more at Science Daily

Cosmic collision lights up the darkness

This image, taken with the Wide Field Camera 3 (WFC3) and the Advanced Camera for Surveys (ACS), both installed on the NASA/ESA Hubble Space Telescope, shows the peculiar galaxy NGC 3256. The galaxy is about 100 million light-years from Earth and is the result of a past galactic merger, which created its distorted appearance. As such, NGC 3256 provides an ideal target to investigate starbursts that have been triggered by galaxy mergers.
Though it resembles a peaceful rose swirling in the darkness of the cosmos, NGC 3256 is actually the site of a violent clash. This distorted galaxy is the relic of a collision between two spiral galaxies, estimated to have occurred 500 million years ago. Today it is still reeling in the aftermath of this event.

Located about 100 million light-years away in the constellation of Vela (The Sails), NGC 3256 is approximately the same size as our Milky Way and belongs to the Hydra-Centaurus Supercluster. It still bears the marks of its tumultuous past in the extended luminous tails that sprawl out around the galaxy, thought to have formed 500 million years ago during the initial encounter between the two galaxies, which today form NGC 3256. These tails are studded with young blue stars, which were born in the frantic but fertile collision of gas and dust.

When two galaxies merge, individual stars rarely collide because they are separated by such enormous distances, but the gas and dust of the galaxies do interact -- with spectacular results. The brightness blooming in the centre of NGC 3256 gives away its status as a powerful starburst galaxy, host to vast amounts of infant stars born into groups and clusters. These stars shine most brightly in the far infrared, making NGC 3256 exceedingly luminous in this wavelength domain. Because of this radiation, it is classified as a Luminous Infrared Galaxy.

NGC 3256 has been the subject of much study due to its luminosity, its proximity, and its orientation: astronomers observe it face-on, showing the disc in all its splendour. NGC 3256 provides an ideal target to investigate starbursts that have been triggered by galaxy mergers. It holds particular promise to further our understanding of the properties of young star clusters in tidal tails.

As well as being lit up by over 1000 bright star clusters, the central region of NGC 3256 is also home to crisscrossing threads of dark dust and a large disc of molecular gas spinning around two distinct nuclei -- the relics of the two original galaxies. One nucleus is largely obscured, only unveiled in infrared, radio and X-ray wavelengths.

These two initial galaxies were gas-rich and had similar masses, as they seem to be exerting roughly equal influence on each other. Their spiral discs are no longer distinct, and in a few hundred million years' time their nuclei will also merge and the two galaxies will likely become united as a large elliptical galaxy.

Read more at Science Daily

May 31, 2018

Italy's oldest olive oil discovered in peculiar pot

Chemical analysis conducted on ancient pottery proves olive oil existed in Italy 700 years earlier than previously recorded.
Olive oil is a staple of Italian cuisine. It's been that way for thousands of years. And new chemical analysis conducted on ancient pottery proves the liquid gold has existed in Italy hundreds of years earlier than anthropologists had previously recorded.

A team of researchers led by Davide Tanasi, PhD, assistant professor of history at the University of South Florida, carried out chemical analyses to identify the contents of a large jar found in the 1990s by Giuseppe Voza during the excavations at the site of Castelluccio. Conservators at the Archaeological Museum of Siracusa restored and reassembled 400 ceramic fragments, resulting in an egg-shaped, 3 ½-foot storage container adorned with rope bands and three vertical handles on each side. At the same archaeological site in Castelluccio in Sicily, researchers found two fragmented basins with an internal septum, indicating they were used to keep multiple substances together but separate, along with a large terracotta cooking plate.

"The shape of this storage container and the nearby septum was like nothing else Voza found at the site in Castelluccio," said Dr. Tanasi. "It had the signature of Sicilian tableware dated to the end of the 3rd and beginning of the 2nd millennium BCE (Early Bronze Age). We wanted to learn how it was used, so we conducted chemical analysis on organic residues found inside."

In the study published in Analytical Methods, Dr. Tanasi tested the three artifacts using techniques traditionally and successfully applied to archaeological pottery: gas chromatography, mass spectrometry and nuclear magnetic resonance. His team found that the organic residue from all three samples contained oleic and linoleic acids, signatures of olive oil. They conclude the artifacts are from the Sicilian Early Bronze Age due to their location and peculiar shapes.

"The results obtained with the three samples from Castelluccio become the first chemical evidence of the oldest olive oil in Italian prehistory, pushing back the hands of the clock for the systematic olive oil production by at least 700 years," said Tanasi.

Previously, the only known chemical signatures of olive oil in Italy came from storage jars discovered in southern Italy, in Cosenza and Lecce, believed to date to the 12th and 11th centuries BCE (Bronze Age).

From Science Daily

Details that look sharp to people may be blurry to their pets

A household scene as viewed by various pets and pests. Human eyesight is roughly seven times sharper than a cat's, 40 to 60 times sharper than a rat's or a goldfish's, and hundreds of times sharper than a fly's or a mosquito's.
Compared with many animals, human eyes aren't particularly adept at distinguishing colors or seeing in dim light. But by one measure at least -- something called visual acuity -- human eyes can see fine details that most animals can't, Duke University researchers say.

A new study of animal vision compared hundreds of species by the sharpness of their sight.

In a paper published in the May 2018 issue of the journal Trends in Ecology & Evolution, the researchers compiled previously published estimates of visual acuity for roughly 600 species of insects, birds, mammals, fish and other animals.

Across the animal kingdom, most species "see the world with much less detail than we do," said first author Eleanor Caves, a postdoctoral researcher at Duke.

The study measured acuity in terms of cycles per degree, which is how many pairs of black and white parallel lines a species can discern within one degree of the field of vision before they turn into a smear of gray.

Researchers can't ask a camel to identify letters on an eye chart. Instead, they estimate visual acuity based on an animal's eye anatomy -- such as the spacing and density of light-sensing structures -- or using behavioral tests.

The limit of detail that human eyes can resolve is about 60 cycles per degree, which helps us make out road signs and recognize faces from afar.
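To put cycles per degree in physical terms, the short sketch below (Python) converts an acuity into the width of the finest line resolvable at a given distance. The eagle and human figures appear in this article; the cat value is back-calculated from the "seven times sharper" ratio above, and the fruit-fly value is an illustrative assumption consistent with the "hundreds of times" comparison:

```python
import math

def finest_detail_mm(cycles_per_degree, distance_m):
    """Width of the finest resolvable line (half of one black/white
    line pair) at the given viewing distance, in millimetres."""
    half_cycle_deg = 0.5 / cycles_per_degree
    return math.tan(math.radians(half_cycle_deg)) * distance_m * 1000

for species, cpd in [("wedge-tailed eagle", 140), ("human", 60),
                     ("cat (approx.)", 60 / 7), ("fruit fly (approx.)", 0.6)]:
    print(f"{species:>20}: {finest_detail_mm(cpd, 6.0):6.2f} mm at 6 m")
```

At six metres, a 60-cycles-per-degree human resolves lines under a millimetre wide, while a fruit fly at these assumed values needs features roughly 100 times larger.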

Chimpanzees and other primates can pick out similarly fine patterns.

A few birds of prey do better. For instance, the wedge-tailed eagle of Australia can see 140 cycles per degree, more than twice the limit of human visual acuity. Eagles can spot something as small as a rabbit while flying thousands of feet above the ground.

But apart from some eagles, vultures and falcons, the results show that most birds see fewer than 30 cycles per degree -- less than half as much detail as humans.

The same goes for fish. "The highest acuity in a fish is still only about half as sharp as us," Caves said.

Humans can resolve four to seven times more detail than dogs and cats, and more than a hundred times more than a mouse or a fruit fly.

A person who sees less than 10 cycles per degree is considered legally blind. Most insects, it turns out, can't see more than one.

Overall, the researchers found a 10,000-fold difference between the most sharp-sighted and the most blurry-eyed species.

The researchers also created a series of images showing how different scenes might appear to animals with different acuities, using a software package they developed called AcuityView. The software takes a digital photo and strips away all the spatial detail that may be too fine for a given animal to distinguish.
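Conceptually, that conversion is a spatial-frequency low-pass filter. Here is a minimal sketch of the general idea in Python with NumPy; it uses a hard frequency cutoff for simplicity and is not AcuityView's actual code or API:

```python
import numpy as np

def blur_to_acuity(image, viewer_cpd, image_width_deg):
    """Remove spatial frequencies finer than the viewer can resolve.

    image           -- 2-D grayscale array
    viewer_cpd      -- viewer's acuity, in cycles per degree
    image_width_deg -- angle the image's width subtends at the viewer

    A hard-cutoff sketch of the idea only, not AcuityView's algorithm.
    """
    ny, nx = image.shape
    px_per_deg = nx / image_width_deg            # assumes square pixels
    fx = np.fft.fftfreq(nx) * px_per_deg         # cycles/degree along x
    fy = np.fft.fftfreq(ny) * px_per_deg         # cycles/degree along y
    radial = np.sqrt(fx[None, :] ** 2 + fy[:, None] ** 2)
    spectrum = np.fft.fft2(image)
    spectrum[radial > viewer_cpd] = 0.0          # drop unresolvable detail
    return np.fft.ifft2(spectrum).real

# Example: a scene spanning 20 degrees of view, as seen by a ~1 cpd insect
scene = np.random.rand(256, 256)
insect_view = blur_to_acuity(scene, viewer_cpd=1.0, image_width_deg=20.0)
```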

The converted images reveal animal patterns that, while easy for some species to see, may be imperceptible to others, or only recognizable from a short distance.

Take the patterns on a butterfly's wing. Scientists have debated the function of their spots, stripes and splotches.

One common assumption is that they warn birds and other predators to stay away. It has also been proposed that they help butterflies check out or seduce potential mates.

The researchers determined that the wing patterns of, say, the map butterfly may be apparent to many birds, but to other butterflies they are likely a blur, even from just a few inches away.

"I don't actually think butterflies can see them," Caves said.

Some animals may use such differences in acuity to send secret messages that sharper-sighted species can read but others can't, Caves said.

For instance, orb-weaver spiders decorate their webs with white silk zigzags, spirals and other designs whose function has been debated.

One theory is that they keep larger animals from accidentally colliding with their delicate webs, like the window stickers used to keep birds from flying into the glass. Another idea is that they lure insect prey.

But images of spider web decorations as they might appear to different species suggest that, while birds can spot them from as far away as six feet, they are virtually invisible to house flies and other small insects that might blunder into the spider's sticky traps.

It seems the decorations help spiders alert birds to webs that might be in their flight path, without blowing their cover with the creatures they might be trying to catch for lunch.

The converted images the team produced don't represent what animals actually see, the researchers caution. That's because while the eyes take in visual information, the brain must make sense of it.

Read more at Science Daily

Novel RNA-modifying tool corrects genetic diseases

Professor Matthew Disney of The Scripps Research Institute led the new study.
As scientists gain insights into which genes drive diseases, they are pursuing the next logical question: Can gene editing technologies be developed to treat or even cure those diseases? Much of that effort has focused on developing technologies such as CRISPR-Cas9, a protein-based system.

At The Scripps Research Institute campus in Florida, chemist Matthew D. Disney, PhD, has taken a different approach, developing a small-molecule-based tool that acts on RNA to selectively delete certain gene products.

Disney's deletion tool opens the possibility of creating drugs that can be taken conveniently as pills to correct genetic diseases -- by destroying toxic gene products, and by chemically controlling the body's defense mechanisms. The paper, "Small molecule targeted recruitment of a nuclease to RNA," was published online by the Journal of the American Chemical Society.

"These studies, like much science, were about a decade in the making. We are very excited to see how this initial application evolves," Disney says. "This research further shows that RNA is indeed a viable target to make medicines."

RNAs represent a diverse group of molecules within cells that act like the cells' laborers, reading, regulating and expressing DNA's genetic instructions. Within our cells, RNAs are constantly in motion. They assemble, they carry out their duties, and then they are broken up for recycling by RNA-degrading enzymes, which are chemical scissors that cut apart other molecules.

While about 2 percent of our genome encodes proteins, 70 to 80 percent of the genome is transcribed into RNA, potentially offering significantly more druggable targets, Disney says. Until recently, however, most researchers considered RNAs undruggable, because of their small size and relative lack of stability.

Disney's innovation tethers a drug-like molecule -- one engineered to bind precisely and selectively to a specific RNA -- to a common RNA-degrading enzyme. The small-molecule/enzyme complex is designed to latch onto the undesirable gene product and destroy it. Disney named the technology RIBOTAC, short for "ribonuclease-targeting chimeras."

To test the RIBOTAC technology, Disney chose RNase L as the RNA-degrading enzyme; it is a critical part of the human antiviral immune response. RNase L is present in small amounts in every cell, and its production typically surges on viral infection to destroy the viral RNA and overcome the illness.

For the other piece of the RIBOTAC complex, its drug-like molecule, Disney chose Targaprimir-96, a molecule engineered by his lab in 2016 to bind miRNA-96, a microRNA oncogene known to boost cancer cell proliferation, especially in difficult-to-treat triple-negative breast cancer.

Destroying the oncogene led to a reawakening of the cancer cell's innate self-destruct program, via an increase in the FOXO1 gene, which ultimately spurred the death of the malignant cells, says Matthew G. Costales, first author of the paper and a graduate student in the Disney lab.

"Anchoring our previous work with Targaprimir-96 to the targeted recruitment of RNase L, we were able to program the RIBOTACs approach to only degrade cells that highly express the miRNA-96 oncogene, thus allowing FOXO1 to signal the selective destruction of triple negative breast cancer cells," says Costales.

Awakening the body's ability to kill its own cancer by exploiting cells' RNA degradation system offers a novel approach to attacking cancer, Disney says. The RIBOTAC technology has potentially broad applications for cancer and other gene-driven diseases as well, he says.

"I believe this is just the tip of the iceberg of how this approach will ultimately be applied," says Disney.

Disney's lab has spent many years developing a computational method called Inforna™ to match RNAs with adequate stability and structure to small, drug-like molecules capable of binding to them. His technique led to the development of Targaprimir-96 and multiple other disease-modifying compounds, some of which are now moving toward clinical development.

"Since it is now known that RNA is a key driver in nearly every disease, optimization of this approach that turns a cell's natural defenses toward destroying disease-causing RNAs is likely broadly applicable. We will be laser-focused on diseases for which there are no known cure and have a poor prognosis, such as hard-to-treat cancers and incurable human genetic disease," Disney says. "I am excited to see where we and others ultimately take this."

Read more at Science Daily

In ancient boulders, new clues about the story of human migration to the Americas

University at Buffalo Ph.D. candidate Alia Lesnek works at Suemez Island.
When and how did the first people come to the Americas?

The conventional story says that the earliest settlers came via Siberia, crossing the now-defunct Bering land bridge on foot and trekking through Canada when an ice-free corridor opened up between massive ice sheets toward the end of the last ice age.

But with recent archaeological evidence casting doubt on this thinking, scientists are seeking new explanations. One dominant new theory: The first Americans took a coastal route along Alaska's Pacific coast to enter the continent.

A new geological study provides compelling evidence to support this hypothesis.

By analyzing boulders and bedrock, a research team led by the University at Buffalo shows that part of a coastal migration route became accessible to humans 17,000 years ago. During this period, ancient glaciers receded, exposing islands of southern Alaska's Alexander Archipelago to air and sun -- and, possibly, to human migration.

The timing of these events is key: Recent genetic and archaeological estimates suggest that settlers may have begun traveling deeper into the Americas some 16,000 years ago, soon after the coastal gateway opened up.

The research will be published online on May 30 in the journal Science Advances.

"People are fascinated by these questions of where they come from and how they got there," says lead scientist Jason Briner, PhD, professor of geology in UB's College of Arts and Sciences. "Our research contributes to the debate about how humans came to the Americas. It's potentially adding to what we know about our ancestry and how we colonized our planet."

"Our study provides some of the first geologic evidence that a coastal migration route was available for early humans as they colonized the New World," says UB geology PhD candidate Alia Lesnek, the study's first author. "There was a coastal route available, and the appearance of this newly ice-free terrain may have spurred early humans to migrate southward."

The findings do not mean that early settlers definitely traversed Alaska's southern coast to spread into the Americas: The project examined just one section of the coast, and scientists would need to study multiple locations up and down the coastline to draw firmer conclusions.

Still, the work is exciting because it hints that the seafaring theory of migration is viable.

The bones of an ancient ringed seal -- previously discovered in a nearby cave by other researchers -- provide further, tantalizing clues. They hint that the area was capable of supporting human life at the time that early settlers may have been passing through, Briner says. The new study calculates that the seal bones are about 17,000 years old. This indicates that the region was ecologically vibrant soon after the ice retreated, with resources including food becoming available.

Co-authors on the research included Briner; Lesnek; Charlotte Lindqvist, PhD, an associate professor of biological sciences at UB and a visiting associate professor at Nanyang Technological University; James Baichtal of Tongass National Forest; and Timothy Heaton, PhD, of the University of South Dakota.

A landscape, touched by ice, that tells a story

To conduct their study, the scientists journeyed to four islands within the Alexander Archipelago that lie about 200 miles south-southeast of Juneau.

The team traveled by helicopter to reach these remote destinations. As soon as the researchers arrived, Briner knew that the islands had once been covered by ice.

"The landscape is glacial," he says. "The rock surfaces are smooth and scratched from when the ice moved over it, and there are erratic boulders everywhere. When you are a geologist, it hits you in the face. You know it immediately: The glacier was here."

To pinpoint when the ice receded from the region, the team collected bits of rock from the surfaces of boulders and bedrock. Later, the scientists ran tests to figure out how long the samples -- and thus the islands as a whole -- had been free of ice.

The researchers used a method called surface exposure dating. As Lesnek explains, "When land is covered by a glacier, the bedrock in the area is hidden under ice. As soon as the ice disappears, however, the bedrock is exposed to cosmic radiation from space, which causes it to accumulate certain chemicals on its surface. The longer the surface has been exposed, the more of these chemicals you get. By testing for these chemicals, we were able to determine when our rock surfaces were exposed, which tells us when the ice retreated.

"We use the same dating method for huge boulders called erratics. These are big rocks that are plucked from the Earth and carried to new locations by glaciers, which actually consist of moving ice. When glaciers melt and disappear from a specific region, they leave these erratics behind, and surface exposure dating can tell us when the ice retreated."

For the region that was studied, this happened roughly 17,000 years ago.
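The underlying math is simple if erosion and inherited nuclides are ignored: a cosmogenic nuclide such as beryllium-10 accumulates as N(t) = (P/λ)(1 − e^(−λt)) for production rate P and decay constant λ, which can be inverted for the age. A minimal sketch with illustrative numbers, not the study's measurements:

```python
import math

def exposure_age_years(N, P, half_life_years=1.39e6):
    """Invert N(t) = (P / lam) * (1 - exp(-lam * t)) for the age t.

    N -- measured nuclide concentration (atoms per gram of rock)
    P -- local production rate (atoms per gram per year)
    Default half-life is beryllium-10's (~1.39 Myr). Assumes no
    erosion and no nuclides inherited from earlier exposure.
    """
    lam = math.log(2) / half_life_years
    return -math.log(1 - N * lam / P) / lam

# Illustrative numbers only (not data from this study):
print(round(exposure_age_years(N=8.5e4, P=5.0)))  # ~17,000 years
```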

The case for a coastal migration route

In recent years, evidence has mounted against the conventional thinking that humans populated North America by taking an inland route through Canada. To do so, they would have needed to walk through a narrow, ice-free ribbon of terrain that appeared when two major ice sheets started to separate. But recent research suggests that while this path may have opened up more than 14,000 years ago, it did not develop enough biological diversity to support human life until about 13,000 years ago, Briner says.

That clashes with archaeological findings that suggest humans were already living in Chile about 15,000 years ago or more and in Florida 14,500 years ago.

The coastal migration theory provides an alternative narrative, and the new study may mark a step toward solving the mystery of how humans came to the Americas.

"Where we looked at it, the coastal route was not only open -- it opened at just the right time," Lindqvist says. "The timing coincides almost exactly with the time in human history that the migration into the Americas is thought to have occurred."

Read more at Science Daily

May 30, 2018

New model explains what we see when a massive black hole devours a star

Illustration of emissions from a tidal disruption event shows in cross section what happens when the material from a disrupted star is devoured by a black hole. The material forms an accretion disk, which heats up and emits vast amounts of light and radiation. The emissions we are able to see from Earth depend on our viewing angle with respect to the orientation of the black hole.
A star that wanders too close to the supermassive black hole in the center of its galaxy will be torn apart by the black hole's gravity in a violent cataclysm called a tidal disruption event (TDE), producing a bright flare of radiation. A new study led by theoretical astrophysicists at the University of Copenhagen's Niels Bohr Institute and UC Santa Cruz provides a unified model that explains recent observations of these extreme events.

The breakthrough study, published in Astrophysical Journal Letters, provides a new theoretical perspective for a fast-growing research field.

"Only in the last decade or so have we been able to distinguish TDEs from other galactic phenomena, and the new model will provide us with the basic framework for understanding these rare events," said coauthor Enrico Ramirez-Ruiz, professor and chair of astronomy and astrophysics at UC Santa Cruz and Niels Bohr Professor at the University of Copenhagen.

In most galaxies, the central black hole is quiescent, not actively consuming any material and therefore not emitting any light. Tidal disruption events are rare, only happening about once every 10,000 years in a typical galaxy. When an unlucky star gets torn apart, however, the black hole is "overfed" with stellar debris for a while and emits intense radiation.

"It is interesting to see how materials get their way into the black hole under such extreme conditions," said first author Jane Lixin Dai, assistant professor at the University of Copenhagen, who led the study. "As the black hole is eating the stellar gas, a vast amount of radiation is emitted. The radiation is what we can observe, and using it we can understand the physics and calculate the black hole properties. This makes it extremely interesting to go hunting for tidal disruption events."

While the same physics is expected to happen in all tidal disruption events, about two dozen of which have been observed so far, the observed properties of these events have shown great variation. Some emit mostly x-rays, while others emit mostly visible and ultraviolet light. Theorists have been struggling to understand this diversity and assemble different pieces of the puzzle into a coherent model.

In the new model, it is the viewing angle of the observer that accounts for differences in the observations. Galaxies are oriented randomly with respect to the line of sight of observers on Earth, who see different aspects of a tidal disruption event depending on its orientation.

"It is like there is a veil that covers part of a beast," Ramirez-Ruiz explained. "From some angles we see an exposed beast, but from other angles we see a covered beast. The beast is the same, but our perceptions are different."

The model developed by Dai and her collaborators combines elements from general relativity, magnetic fields, radiation, and gas hydrodynamics. It shows what astronomers can expect to see when viewing tidal disruption events from different angles, allowing researchers to fit different events into a coherent framework.

Survey projects planned for the next few years are expected to provide much more data on tidal disruption events and will help greatly expand this field of research, according to Dai. These include the Young Supernova Experiment (YSE) transient survey, led by the DARK Cosmology Centre at the Niels Bohr Institute and UC Santa Cruz, and the Large Synoptic Survey Telescope being built in Chile.

"We will observe hundreds to thousands of tidal disruption events in a few years. This will give us a lot of 'laboratories' to test our model and use it to understand more about black holes," Dai said.

Read more at Science Daily

World’s oldest lizard fossil discovered

The restudy of the Megachirella wachtleri fossil allowed the authors to rewrite the history of all fossil and living lizards and snakes.
An international team of paleontologists, which includes the University of Bristol, has identified the world's oldest lizard, providing key insight into the evolution of modern lizards and snakes.

The 240-million-year-old fossil, Megachirella wachtleri, is the most ancient ancestor of all modern lizards and snakes, known as squamates, according to the new study published today in the journal Nature.

The fossil, along with data from both living and extinct reptiles -- which involved anatomical data drawn from CT scans and DNA -- suggests the origin of squamates is even older, taking place in the late Permian period, more than 250 million years ago.

Tiago Simões, lead author and PhD student from the University of Alberta in Canada, said: "The specimen is 75 million years older than what we thought were the oldest fossil lizards in the entire world and provides valuable information for understanding the evolution of both living and extinct squamates."

Currently, there are 10,000 species of lizards and snakes around the world -- twice as many different species as mammals. Despite this modern diversity, scientists did not know much about the early stages of their evolution.

Tiago Simões added: "It is extraordinary when you realize you are answering long-standing questions about the origin of one of the largest groups of vertebrates on Earth."

Co-author, Dr Michael Caldwell, also from the University of Alberta, added: "Fossils are our only accurate window into the ancient past. Our new understanding of Megachirella is but a point in ancient time, but it tells us things about the evolution of lizards that we simply cannot learn from any of the 9000 or so species of lizards and snakes alive today."

Originally found in the early 2000s in the Dolomites of Northern Italy, the fossil was considered an enigmatic lizard-like reptile whose placement researchers could not conclusively determine, and it remained nearly unnoticed by the international community.

In order to better understand both the anatomy of Megachirella and the earliest evolution of lizards and snakes, the authors assembled the largest reptile dataset ever created.

The authors combined it with new anatomical information on Megachirella obtained from high-resolution CT scans.

All this new information was analysed using state-of-the-art methods to assess relationships across species, revealing that the once enigmatic reptile was actually the oldest known squamate.

Co-author Dr Randall Nydam of the Midwestern University in Arizona, said: "At first I did not think Megachirella was a true lizard, but the empirical evidence uncovered in this study is substantial and can lead to no other conclusion."

Co-author Dr Massimo Bernardi from MUSE -- Science Museum, Italy and University of Bristol's School of Earth Sciences, added: "This is the story of the re-discovery of a specimen and highlights the importance of preserving naturalistic specimens in well maintained, publicly accessible collections."

Read more at Science Daily

No more sweet tooth? Scientists switch off pleasure from food in brains of mice

Brain illustration
New research in mice has revealed that the brain's underlying desire for sweet, and its distaste for bitter, can be erased by manipulating neurons in the amygdala, the emotion center of the brain.

The study showed that removing an animal's capacity to crave or despise a taste had no impact on its ability to identify it. The findings suggest that the brain's complex taste system -- which produces an array of thoughts, memories and emotions when tasting food -- is actually composed of discrete units that can be individually isolated, modified or removed altogether. The research points to new strategies for understanding and treating eating disorders including obesity and anorexia nervosa.

The research was published today in Nature.

"When our brain senses a taste it not only identifies its quality, it choreographs a wonderful symphony of neuronal signals that link that experience to its context, hedonic value, memories, emotions and the other senses, to produce a coherent response," said Charles S. Zuker, PhD, a principal investigator at Columbia's Mortimer B. Zuckerman Mind Brain Behavior Institute and the paper's senior author.

Today's study builds upon earlier work by Dr. Zuker and his team to map the brain's taste system. Previously, the researchers revealed that when the tongue encounters one of the five tastes -- sweet, bitter, salty, sour or umami -- specialized cells on the tongue send signals to specialized regions of the brain so as to identify the taste, and trigger the appropriate actions and behaviors.

To shed light on that experience, the scientists focused on sweet and bitter taste and the amygdala, a brain region known to be important for making value judgments about sensory information. Previous research by Dr. Zuker, a professor of biochemistry and molecular biophysics and of neuroscience and a Howard Hughes Medical Institute Investigator at Columbia University Irving Medical Center, and others showed that the amygdala connects directly to the taste cortex.

"Our earlier work revealed a clear divide between the sweet and bitter regions of the taste cortex," said Li Wang, PhD, a postdoctoral research scientist in the Zuker lab and the paper's first author. "This new study showed that same division continued all the way into the amygdala. This segregation between sweet and bitter regions in both the taste cortex and amygdala meant we could independently manipulate these brain regions and monitor any resulting changes in behavior."

The scientists performed several experiments in which the sweet or bitter connections to the amygdala were artificially switched on, like flicking a series of light switches. When the sweet connections were turned on, the animals responded to water just as if it were sugar. And by manipulating the same types of connections, the researchers could even change the perceived quality of a taste, turning sweet into an aversive taste, or bitter into an attractive one.

In contrast, when the researchers instead turned off the amygdala connections but left the taste cortex untouched, the mice could still recognize and distinguish sweet from bitter, but now lacked the basic emotional reactions, like preference for sugar or aversion to bitter.

"It would be like taking a bite of your favorite chocolate cake but not deriving any enjoyment from doing so," said Dr. Wang. "After a few bites, you may stop eating, whereas otherwise you would have scarfed it down."

Usually, the identity of a food and the pleasure one feels when eating it are intertwined. But the researchers showed that these components can be isolated from each other, and then manipulated separately. This suggests that the amygdala could be a promising area of focus when looking for strategies to treat eating disorders.

In the immediate future, Drs. Zuker and Wang are investigating additional brain regions that serve critical roles in the taste system. For example, the taste cortex also links directly to regions involved in motor actions, learning and memory.

Read more at Science Daily

Life recovered rapidly at impact site of dino-killing asteroid

An asteroid impact 66 million years ago wiped out life across the planet, but microorganisms quickly rebounded. New research has found evidence for a diverse array of plankton and other organisms inhabiting the crater only a few years after the extinction-causing impact. The three hair-covered forms (left) represent species of plankton found inside the crater. The geometric form (bottom left) is a species of algae. Small organisms like these moved into the crater so quickly that bones from animals that were killed by the impact, such as the mosasaur pictured here, may have still been visible.
About 66 million years ago, an asteroid smashed into Earth, triggering a mass extinction that ended the reign of the dinosaurs and snuffed out 75 percent of life.

Although the asteroid killed off species, new research led by The University of Texas at Austin has found that the crater it left behind was home to sea life less than a decade after impact, and it contained a thriving ecosystem within 30,000 years -- a much quicker recovery than other sites around the globe.

Scientists were surprised by the findings, which undermine a theory that recovery at sites closest to the crater is the slowest due to environmental contaminants -- such as toxic metals -- released by the impact. Instead, the evidence suggests that recovery around the world was influenced primarily by local factors, a finding that could have implications for environments rocked by climate change today.

"We found life in the crater within a few years of impact, which is really fast, surprisingly fast," said Chris Lowery, a postdoctoral researcher at the University of Texas Institute for Geophysics (UTIG) who led the research. "It shows that there's not a lot of predictability of recovery in general."

The research was published May 30 in the journal Nature. UTIG research scientists Gail Christeson and Sean Gulick and postdoctoral researcher Cornelia Rasmussen are co-authors on the paper, along with a team of international scientists. UTIG is a research unit of the Jackson School of Geosciences.

The evidence for life comes primarily in the form of microfossils -- the remains of unicellular organisms such as algae and plankton -- as well as the burrows of larger organisms discovered in a rock extracted from the crater during recent scientific drilling conducted jointly by the International Ocean Discovery Program and International Continental Drilling Program.

The tiny fossils are hard evidence that organisms inhabited the crater, and also a more general indicator of the habitability of the environment in the years after impact. The swift recovery suggests that life forms beyond the microscopic were living in the crater shortly after impact.

"Microfossils let you get at this complete community picture of what's going on," Lowery said. "You get a chunk of rock and there's thousands of microfossils there, so we can look at changes in the population with a really high degree of confidence ... and we can use that as kind of a proxy for the larger scale organisms."

The scientists found the first evidence for the appearance of life two to three years after impact. The evidence included burrows made by small shrimp or worms. By 30,000 years after impact, a thriving ecosystem was present in the crater, with blooming phytoplankton (microscopic plants) supporting a diverse community of organisms in the surface waters and on the seafloor. In contrast, other areas around the world, including the North Atlantic and other areas of the Gulf of Mexico, took up to 300,000 years to recover in a similar manner.

The core containing the fossil evidence was extracted from the crater during a 2016 expedition co-led by the Jackson School. In this study, scientists zeroed in on a unique core section that captures the post-impact seafloor in unprecedented detail. Whereas core samples from other parts of the ocean hold only millimeters of material deposited in the moments after impact, the section from the crater used in this study contains more than 130 meters of such material, the upper 30 inches (about 76 centimeters) of which settled out slowly from the turbid water. This material provides a record that captures the seafloor environment days to years after the impact.

"You can see layering in this core, while in others, they're generally mixed, meaning that the record of fossils and materials is all churned up, and you can't resolve tiny time intervals," said co-author Timothy Bralower, a micropaleontology professor at Pennsylvania State University. "We have a fossil record here where we're able to resolve daily, weekly, monthly, yearly changes."

Ellen Thomas, a senior research scientist in geology and geophysics at Yale University who was not part of the study, said that although she thinks the paper makes a strong case for a speedy recovery, she expects that the larger scientific community will be interested in digging into the data for themselves.

"In my opinion, we will see considerable debate on the character, age, sedimentation rate and microfossil content ... especially of the speculation that burrowing animals may have returned within years of the impact," Thomas said.

The relatively rapid rebound of life in the crater suggests that although the asteroid caused the extinction, it didn't hamper recovery. The scientists point to local factors, from water circulation to interactions between organisms and the availability of ecological niches, as having the most influence on a particular ecosystem's recovery rate.

Read more at Science Daily

May 29, 2018

Prehistoric teeth dating back two million years reveal details on Africa's paleoclimate

View into excavation area from Wonderwerk Cave entrance.
New research out of South Africa's Wonderwerk Cave led by anthropologists at the University of Toronto (U of T) shows that the climate of the interior of southern Africa almost two million years ago was like no modern African environment -- it was much wetter.

In a paper published in Nature Ecology & Evolution, lead author Michaela Ecker, a postdoctoral fellow in the Department of Anthropology at U of T, alongside an international team of scientists that included Michael Chazan, director of U of T's Archaeology Centre, recreated the environmental change in the interior of southern Africa over a span of almost two million years.

"The influence of climatic and environmental change on human evolution is largely understood from East African research," said Ecker. "Our research constructed the first extensive paleoenvironmental sequence for the interior of southern Africa using a combination of methods for environmental reconstruction at Wonderwerk Cave."

While East African research shows increasing aridity and the spread of grasslands, the study showed that during the same time period, southern Africa was significantly wetter and housed a plant community unlike any other in the modern African savanna -- which means human ancestors were living in environments other than open, arid grasslands.

Using carbon and oxygen stable isotope analysis on the teeth of herbivores excavated from the cave, Ecker and her team were able to reconstruct the vegetation from the time the animal was alive and gain valuable insight into the environmental conditions our human ancestors were living in.
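For readers unfamiliar with the technique, stable-isotope results are conventionally reported in delta notation: the sample's isotope ratio relative to a reference standard, in parts per thousand. The sketch below shows only this standard arithmetic; the ratios are hypothetical, not the study's data:

```python
# Standard stable-isotope delta notation, illustrated with made-up numbers.
R_VPDB = 0.0112372  # commonly cited 13C/12C ratio of the VPDB standard

def delta13c(r_sample, r_standard=R_VPDB):
    """Return the delta-13C value in per mil (parts per thousand)."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Tooth enamel from grazers of C4 grasses sits near 0 per mil, while
# C3-dominated (wetter, woodier) vegetation pushes values toward -12.
print(round(delta13c(0.0112372), 1))  # 0.0
print(round(delta13c(0.0111023), 1))  # about -12.0
```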

"Understanding the environment humans evolved in is key to improving our knowledge of our species and its development," said Ecker. "Our work at Wonderwerk Cave demonstrates how humankind existed in multiple environmental contexts in the past -- contexts which are substantially different from the environments of today."

This is the latest U of T research out of Wonderwerk Cave, a massive excavation site in the Kuruman Hills of the Northern Cape Province of South Africa. Chazan has previously discovered early evidence of fire use by human ancestors, as well as the earliest evidence of cave-dwelling human ancestors, based on excavations carried out by South African archaeologist Peter Beaumont. Research to date has established a chronology for human occupation of the front of the cave stretching back two million years.

The findings are described in the study "The palaeoecological context of the Oldowan-Acheulean in southern Africa," published this month in Nature Ecology & Evolution. Research funding was provided by the Social Sciences and Humanities Research Council of Canada, the German Academic Exchange Service, the University of Oxford's Boise Fund Trust and the Quaternary Research Association. Other team members include James Brink and Lloyd Rossouw of the National Museum, Bloemfontein, Liora Horwitz of the Hebrew University of Jerusalem and Julia Lee-Thorp of the University of Oxford.

Read more at Science Daily

Most popular vitamin and mineral supplements provide no health benefit, study finds

A variety of supplements.
The most commonly consumed vitamin and mineral supplements provide no consistent health benefit or harm, suggests a new study led by researchers at St. Michael's Hospital and the University of Toronto.

Published today in the Journal of the American College of Cardiology, the systematic review of existing data and single randomized controlled trials published in English from January 2012 to October 2017 found that multivitamins, vitamin D, calcium and vitamin C -- the most common supplements -- showed no advantage or added risk in the prevention of cardiovascular disease, heart attack, stroke or premature death. Generally, vitamin and mineral supplements are taken to supplement nutrients found in food.

"We were surprised to find so few positive effects of the most common supplements that people consume," said Dr. David Jenkins*, the study's lead author. "Our review found that if you want to use multivitamins, vitamin D, calcium or vitamin C, it does no harm -- but there is no apparent advantage either."

The study found that folic acid alone, and B-vitamins with folic acid, may reduce the risk of cardiovascular disease and stroke. Meanwhile, niacin and antioxidants showed a very small effect that might signify an increased risk of death from any cause.

"These findings suggest that people should be conscious of the supplements they're taking and ensure they're applicable to the specific vitamin or mineral deficiencies they have been advised of by their healthcare provider," Dr. Jenkins said.

His team reviewed supplement data that included vitamins A, B1, B2, B3 (niacin), B6, B9 (folic acid), C, D and E, as well as beta-carotene, calcium, iron, zinc, magnesium and selenium. The term 'multivitamin' in this review was used to describe supplements that include most vitamins and minerals, rather than a select few.
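As a hedged illustration of the statistical machinery behind reviews of this kind -- not the authors' actual code or data -- trial results are typically pooled by inverse-variance weighting of log relative risks:

```python
# Minimal fixed-effect meta-analysis sketch with invented trial results.
import math

# (relative risk, 95% CI lower bound, 95% CI upper bound) per trial
trials = [(0.98, 0.90, 1.07), (1.02, 0.93, 1.12), (0.99, 0.88, 1.11)]

weights, weighted_logs = [], []
for rr, lo, hi in trials:
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from CI width
    w = 1.0 / se ** 2                                # inverse-variance weight
    weights.append(w)
    weighted_logs.append(w * math.log(rr))

pooled = sum(weighted_logs) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))
print(f"pooled RR = {math.exp(pooled):.2f}, "
      f"95% CI {math.exp(pooled - 1.96 * pooled_se):.2f}"
      f"-{math.exp(pooled + 1.96 * pooled_se):.2f}")
# A pooled RR whose CI straddles 1.0 is exactly the "no advantage or
# added risk" pattern the review describes for the common supplements.
```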

Read more at Science Daily

Archaeologists discover a 1,000-year-old mummy in Peru

This is the Pachacamac mummy 2018.
A team from the Université libre de Bruxelles's centre for archaeological research (CReA-Patrimoine) has completed a significant excavation in Pachacamac, Peru, where they have discovered an intact mummy in especially good condition. The find provides further evidence of Pachacamac's status as a pre-Columbian pilgrimage site under the Inca empire.

Peter Eeckhout and his team's latest campaign of archaeological excavations has concluded with an exciting surprise: after nine weeks spent exploring the pre-Columbian site of Pachacamac, in Peru, the researchers from CReA-Patrimoine (ULB Faculty of Philosophy and Social Sciences) have unearthed a mummy in especially good condition. 'The deceased is still wrapped in the enormous funeral bundle that served as a coffin,' points out professor Peter Eeckhout. 'Discoveries like this one are exceptionally scarce, and this mummy is incredibly well preserved. Samples were collected for carbon-14 dating, but the area in which it was discovered and the type of tomb suggest this individual was buried between 1000 and 1200 AD.'
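For illustration, the first-order arithmetic behind the carbon-14 dating mentioned above is a simple decay calculation. A minimal sketch with a made-up measurement (real laboratories calibrate raw ages against curves such as IntCal):

```python
# Conventional radiocarbon age from the measured fraction of modern 14C.
import math

LIBBY_MEAN_LIFE = 8033.0  # years; conventional ages use the Libby half-life

def radiocarbon_age(fraction_modern):
    """Conventional 14C age in years BP (before present)."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

# A sample retaining ~88.3% of modern 14C dates to roughly 1,000 years BP,
# in line with the AD 1000-1200 range suggested by the tomb type.
print(round(radiocarbon_age(0.883)))  # ~1000
```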

The excavation was carried out as a part of the 'Ychsma' project, named after the region's native people, under the supervision of professor Eeckhout. Three monumental structures were explored during the campaign, including a sanctuary dedicated to the local ancestors. Under Inca rule, in the late 15th century, it appears to have been transformed into a water and healing temple. The archaeologists have discovered many offerings left by worshippers, such as Spondylus shells imported from Ecuador; these are associated with the influx of water during El Niño, and they symbolise fertility and abundance.

Before the Inca settled in the area, the sanctuary included large funerary chambers and numerous mummies, most of which were looted during the Spanish conquest. Miraculously, though, one of the chambers was found intact during the latest round of excavations: this is the funeral chamber that held the mummy. Because it is so well preserved, the researchers will be able to study the mummy without unwrapping the bundle. Together with Christophe Moulherat (Musée du Quai Branly, Paris), they will soon examine it using the latest techniques in medical imaging (X-ray scans, axial tomography, 3D reconstruction, etc.). This will enable them to determine the individual's position and any pathologies they might have suffered from, as well as what offerings might be inside the bundle.

The other structures that were excavated are also related to worship: the first, an Inca monument intended to host pilgrims and rituals, was built in several phases, each identified with a series of offerings such as seashells and precious objects. The last structure explored was probably one of the 'chapels' for foreign pilgrims, referred to by the Spanish monk Antonio de la Calancha in his 17th-century description of the site. There, the excavations also uncovered many 'foundation' offerings, including vases, dogs and other animals, as well as a platform with a hole in the centre, where an idol was likely placed. The complex appears to have been designed around this idol, which was involved in religious activities with pilgrims.

Read more at Science Daily

Dino-bird dandruff research head and shoulders above rest

Dr Maria McNamara, UCC: "What's remarkable is that the fossil dandruff is almost identical to that in modern birds."
Palaeontologists from University College Cork (UCC) in Ireland have discovered 125 million-year-old dandruff preserved amongst the plumage of feathered dinosaurs and early birds, revealing the first evidence of how dinosaurs shed their skin.

UCC's Dr Maria McNamara and her team studied the fossil cells, and dandruff from modern birds, with powerful electron microscopes for the study, published today in the journal Nature Communications.

"The fossil cells are preserved with incredible detail -- right down to the level of nanoscale keratin fibrils. What's remarkable is that the fossil dandruff is almost identical to that in modern birds -- even the spiral twisting of individual fibres is still visible," said Dr Maria McNamara.

Just like human dandruff, the fossil dandruff is made of tough cells called corneocytes, which in life are dry and full of the protein keratin.

The study suggests that this modern skin feature evolved sometime in the late Middle Jurassic, around the same time as a host of other skin features evolved. "There was a burst of evolution of feathered dinosaurs and birds at this time, and it's exciting to see evidence that the skin of early birds and dinosaurs was evolving rapidly in response to bearing feathers," Dr McNamara added.

Dr McNamara led the study, in collaboration with her postdoctoral researcher Dr Chris Rogers; Dr Andre Toulouse and Tara Foley, also from UCC; Dr Paddy Orr from UCD, Ireland; and an international team of palaeontologists from the UK and China.

The dandruff is the first evidence of how dinosaurs shed their skin. The feathered dinosaurs studied -- Microraptor, Beipiaosaurus and Sinornithosaurus -- clearly shed their skin in flakes, like the early bird Confuciusornis studied by the team and like modern birds and mammals, rather than as a single piece or several large pieces, as in many modern reptiles.

Co-author Professor Mike Benton, from the University of Bristol's School of Earth Sciences, said: "It's unusual to be able to study the skin of a dinosaur, and the fact this is dandruff proves the dinosaur was not shedding its whole skin like a modern lizard or snake but losing skin fragments from between its feathers."

Read more at Science Daily

May 28, 2018

Rise and fall of the Great Barrier Reef

Aerial view of the Great Barrier Reef.
A landmark international study of the Great Barrier Reef has shown that in the past 30,000 years the world's largest reef system has suffered five death events, largely driven by changes in sea level and associated environmental change.

Over millennia, the reef has adapted to sudden changes in environment by migrating across the sea floor as the oceans rose and fell.

The study published today in Nature Geoscience, led by University of Sydney's Associate Professor Jody Webster, is the first of its kind to reconstruct the evolution of the reef over the past 30 millennia in response to major, abrupt environmental change.

The 10-year, multinational effort has shown that the reef is more resilient to major environmental changes such as sea-level rise and sea-temperature change than previously thought, but also that it is highly sensitive to increased sediment input and poor water quality.

Associate Professor Webster from the University's School of Geosciences and Geocoastal Research Group said it remains an open question as to whether its resilience will be enough for it to survive the current worldwide decline of coral reefs.

"Our study shows the reef has been able to bounce back from past death events during the last glaciation and deglaciation," he said. "However, we found it is also highly sensitive to increased sediment input, which is of concern given current land-use practices."

The study drew on geomorphic, sedimentological, biological and dating information from fossil reef cores at 16 sites off Cairns and Mackay.

The study covers the period from before the "Last Glacial Maximum" about 20,000 years ago, when sea levels were 118 metres below current levels.

History of death events

As sea levels dropped in the millennia before that time, there were two widespread death events (at about 30,000 years and 22,000 years ago) caused by exposure of the reef to air, known as subaerial exposure. During this period, the reef moved seaward to try to keep pace with the falling sea levels.

During the deglaciation period after the Last Glacial Maximum, there were a further two reef-death events at about 17,000 and 13,000 years ago caused by rapid sea level rise. These were accompanied by the reef moving landward, trying to keep pace with rising seas.

Analysis of the core samples and data on sediment flux show these reef-death events from sea-level rise were likely associated with large increases in sediment input.

The final reef-death event, about 10,000 years ago and predating the emergence of the modern reef about 9,000 years ago, was not associated with any known abrupt sea-level rise or "meltwater pulse" during the deglaciation. Rather, it appears to be associated with a massive sediment increase and reduced water quality, alongside a general rise in sea level.

The authors propose that the reef has been able to re-establish itself over time due to the continuity of reef habitats with corals and coralline algae, and the reef's ability to migrate laterally at between 0.2 and 1.5 metres a year.
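A back-of-the-envelope check shows why lateral migration rates of this order matter. In the sketch below, the shelf slope and sea-level rates are illustrative assumptions, not figures from the study:

```python
# How fast must a reef migrate horizontally to track vertical sea-level change?
import math

def lateral_rate(sea_level_rise_m_per_yr, slope_degrees):
    """Horizontal migration (m/yr) needed to track a vertical change."""
    return sea_level_rise_m_per_yr / math.tan(math.radians(slope_degrees))

# On an assumed gentle 0.5-degree shelf, a steady deglacial rise of 10 mm/yr
# demands ~1.1 m/yr of landward migration -- near the top of the observed
# 0.2-1.5 m/yr range -- while a 40 mm/yr meltwater pulse demands ~4.6 m/yr,
# far beyond it, consistent with the death events described above.
print(round(lateral_rate(0.010, 0.5), 2))  # ~1.15
print(round(lateral_rate(0.040, 0.5), 2))  # ~4.58
```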

Future survival

However, Associate Professor Webster said it was unlikely that this rate of migration would be enough for the reef to survive current rates of sea surface temperature rise, sharp declines in coral coverage, year-on-year coral bleaching, or the decreases in water quality and increased sediment flux seen since European settlement.

"I have grave concerns about the ability of the reef in its current form to survive the pace of change caused by the many current stresses and those projected into the near future," he said.

Associate Professor Webster said previous studies have established a past sea surface temperature rise of a couple of degrees over a timescale of 10,000 years. However, current forecasts of sea surface temperature change are around 0.7 degrees in a century.
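The arithmetic behind that comparison is straightforward (taking "a couple of degrees" as 2 degrees C for the purpose of the sketch):

```python
# Past versus forecast rates of sea surface temperature change.
past_rate = 2.0 / 10_000    # ~2 degrees C over 10,000 years
forecast_rate = 0.7 / 100   # ~0.7 degrees C over a century

print(f"past:     {past_rate:.5f} C/yr")      # 0.00020
print(f"forecast: {forecast_rate:.5f} C/yr")  # 0.00700
print(f"ratio:    {forecast_rate / past_rate:.0f}x faster")  # 35x
```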

Read more at Science Daily

400-million-year-old evolutionary arms race helps researchers understand HIV

Understanding the evolution of a 400-million-year-old antiviral protein that first emerged in marine life is helping researchers get the upper hand on human immunodeficiency virus (HIV).

Researchers at Western University were interested in the origin of a gene that encodes the protein HERC5, which has been shown to potently inhibit HIV. In a new study published in the Journal of Virology, Stephen Barr, PhD, assistant professor at Western's Schulich School of Medicine & Dentistry, shows that the gene first emerged in fish over 400 million years ago and has been involved in an evolutionary arms race with viruses ever since.

The study shows that over hundreds of millions of years, this battle for survival caused the genes to develop sophisticated shields to block viruses, which in turn forced the viruses to continually evolve and change to circumvent these defences. This provides insight into how both viruses and the immune system have evolved.

Using sequencing technology, Barr and his research team found that the HERC5 gene from the coelacanth, a fish lineage that arose over 400 million years ago, encodes a protein that can potently block the primate version of HIV, known as simian immunodeficiency virus (SIV), but fails to block HIV.

"Of course HIV and these modern day viruses that we study aren't present in fish, but ancient versions of them are. So what we assume is that as these ancient retroviruses wreaked havoc on marine life, their immune systems had to develop a defense," Barr explained. "We think that one of those defenses is the HERC family. As retroviruses evolved, eventually giving rise to HIV, different variants of HERC genes emerged to combat these infections."

Since these viruses have been in this battle for so long, they have had time to evolve ways around these shields, becoming more sophisticated in the process. That new level of sophistication likely helped the viruses to jump the species barrier and establish new infections in humans.

Read more at Science Daily

Bumblebees confused by iridescent colors

This is a bumblebee landing on an iridescent target.
Iridescence is a form of structural colour which uses regular repeating nanostructures to reflect light at slightly different angles, causing a colour-change effect.
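The angle dependence falls out of basic interference physics. As a hedged sketch, an idealised two-material multilayer reflects most strongly at a Bragg-like peak wavelength; the layer thicknesses and refractive indices below are hypothetical, not measurements from any real insect or test target:

```python
# Peak reflected wavelength of an ideal multilayer versus viewing angle.
import math

def peak_wavelength_nm(n1, d1_nm, n2, d2_nm, angle_deg):
    """First-order reflectance peak; angle measured in air, thicknesses in nm."""
    t1 = math.asin(math.sin(math.radians(angle_deg)) / n1)  # Snell's law
    t2 = math.asin(math.sin(math.radians(angle_deg)) / n2)
    return 2 * (n1 * d1_nm * math.cos(t1) + n2 * d2_nm * math.cos(t2))

# A chitin/air-like stack looks green (~540 nm) head-on and shifts toward
# blue, then ultraviolet, as the viewing angle increases -- the colour-change
# effect that makes a fixed outline hard to learn.
for angle in (0, 30, 60):
    print(angle, round(peak_wavelength_nm(1.56, 100, 1.0, 115, angle)))
# 0 542 / 30 495 / 60 374
```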

It is common in nature, from the dazzling blues of peacock feathers to the gem-like appearance of insects.

Although using bright, flashy colours as camouflage may seem counterintuitive, researchers at the Bristol Camo Lab found that intense iridescence obstructs bumblebees' ability to identify shape. The eyes of bumblebees have been extensively studied by scientists and are very similar to those of other insects.

They can be used as a visual model for predatory insects such as wasps and hornets. When presented with different types of artificial flower targets rewarded with sugar water, the bees learned to recognise which shapes contained the sweet reward.

However, the bees found it much more difficult to discriminate between flower shapes when the targets were iridescent.

The current study, which uses bumblebees as a model for predatory insect vision and cognition, is the first to show that iridescence has the potential to deceive predators and make them overlook prey, much as disruptive camouflage breaks up an otherwise recognisable outline.

The changing colours make the outline of the prey look completely different to the shape the predators are searching for.

The researchers concluded that iridescence produces visual signals which can confuse potential predators, and this may explain its widespread occurrence in nature.

Lead author Dr Karin Kjernsmo of the University of Bristol's School of Biological Sciences, said: "It's the first solid evidence we have that this type of colouration can be used in this way.

"Thus, if you are a visual predator searching for the specific shape of a beetle (or other prey animal), iridescence makes it difficult for predators to identify them as something edible. We are currently studying this effect using other visual predators, such as birds as well. This because birds are likely to be the most important predators of iridescent insects."

The link between iridescence and camouflage was first made over one hundred years ago by an American naturalist named Abbott Thayer, who is often referred to as "the father of camouflage".

He published a famous book on different types of camouflage such as mimicry, shape disruption and dazzle, which is thought to have inspired the "Razzle Dazzle" painting of battleships during the First World War.

However, iridescence has been rather overlooked for the past century, as it is often assumed to be purely for attracting mates and displaying to other individuals.

The UK has several species of iridescent beetle, the largest of which is the Rose Chafer, whose superb green and gold colour-changing wing cases can commonly be spotted on flowers in grasslands during the summer.

Read more at Science Daily

Processes in the gut that drive fat build-up around the waist

Research by scientists at King's College London into the role the gut plays in processing and distributing fat could pave the way for the development of personalised treatments for obesity and other chronic diseases within the next decade. The research is published in Nature Genetics.

In the largest study of its kind, scientists analysed the faecal metabolome (the community of chemicals produced by gut microbes in the faeces) of 500 pairs of twins to build up a picture of how the gut governs these processes and distributes fat. The King's team also assessed how much of that activity is genetic and how much is determined by environmental factors.

The analysis of stool samples identified biomarkers for the build-up of internal fat around the waist. It's well known that this visceral fat is strongly associated with the development of conditions including type 2 diabetes, heart disease and obesity.

By understanding how microbial chemicals lead to the development of fat around the waist in some, but not all the twins, the King's team hopes to also advance the understanding of the very similar mechanisms that drive the development of obesity.

An analysis of faecal metabolites (chemical molecules in stool produced by microbes) found that less than a fifth (17.9 per cent) of gut processes could be attributed to hereditary factors, while 67.7 per cent of gut activity was influenced by environmental factors, mainly a person's regular diet.

This means that important changes can be made to the way an individual's gut processes and distributes fat by altering both their diet and microbial interactions in their gut.
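For readers curious how twin designs yield such splits, the classical approach (Falconer's method) estimates heritability from the gap between identical- and fraternal-twin correlations. A minimal sketch, with hypothetical correlations chosen to echo the figures above:

```python
# Classical twin-study variance decomposition (Falconer's formulas).
def ace_decomposition(r_mz, r_dz):
    """Return (additive genetic, shared environment, unique environment)."""
    a = 2 * (r_mz - r_dz)  # A: heritability
    c = r_mz - a           # C: shared environment
    e = 1 - r_mz           # E: unique (non-shared) environment
    return a, c, e

# Hypothetical correlations: identical twins 0.32, fraternal twins 0.23.
a, c, e = ace_decomposition(r_mz=0.32, r_dz=0.23)
print(f"A = {a:.2f}, C = {c:.2f}, E = {e:.2f}")  # A = 0.18, C = 0.14, E = 0.68
```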

On the back of the study, the researchers have built a gut metabolome bank that can help other scientists engineer bespoke gut environments that efficiently process and distribute fat. The study has also generated the first comprehensive database of which microbes are associated with which chemical metabolites in the gut. This can help other scientists to understand how bacteria in the gut affect human health.

Lead investigator Dr Cristina Menni from King's College London said: 'This study has really accelerated our understanding of the interplay between what we eat, the way it is processed in the gut and the development of fat in the body, but also immunity and inflammation. By analysing the faecal metabolome, we have been able to get a snapshot of both the health of the body and the complex processes taking place in the gut.'

Head of the King's College London's Twin Research Group Professor Tim Spector said: 'This exciting work in our twins shows the importance to our health and weight of the thousands of chemicals that gut microbes produce in response to food. Knowing that they are largely controlled by what we eat rather than our genes is great news, and opens up many ways to use food as medicine. In the future these chemicals could even be used in smart toilets or as smart toilet paper.'

Read more at Science Daily