Dec 21, 2013

Ancient Cranial Surgery: Practice of Drilling Holes in the Cranium That Dates Back Thousands of Years

Cranial surgery is tricky business, even under 21st-century conditions (think aseptic environment, specialized surgical instruments and copious amounts of pain medication both during and afterward).

However, evidence shows that healers in Peru practiced trepanation -- a surgical procedure that involves removing a section of the cranial vault using a hand drill or a scraping tool -- more than 1,000 years ago to treat a variety of ailments, from head injuries to heartsickness. And they did so without the benefit of the aforementioned medical advances.

Excavating burial caves in the south-central Andean province of Andahuaylas in Peru, UC Santa Barbara bioarchaeologist Danielle Kurin and her research team unearthed the remains of 32 individuals that date back to the Late Intermediate Period (ca. AD 1000-1250). Among them, 45 separate trepanation procedures were in evidence. Kurin's findings appear in the current issue of the American Journal of Physical Anthropology.

"When you get a knock on the head that causes your brain to swell dangerously, or you have some kind of neurological, spiritual or psychosomatic illness, drilling a hole in the head becomes a reasonable thing to do," said Kurin, a visiting assistant professor in the Department of Anthropology at UCSB and a specialist in forensic anthropology.

According to Kurin, trepanations first appeared in the south-central Andean highlands during the Early Intermediate Period (ca. AD 200-600), although the technique was not universally practiced. Still, it was considered a viable medical procedure until the Spanish put the kibosh on the practice in the early 16th century.

But Kurin wanted to know how trepanation came to exist in the first place. And she looked to a failed empire to find some answers.

"For about 400 years, from 600 to 1000 AD, the area where I work -- the Andahuaylas -- was living as a prosperous province within an enigmatic empire known as the Wari," she said. "For reasons still unknown, the empire suddenly collapsed." And the collapse of civilization, she noted, brings a lot of problems.

"But it is precisely during times of collapse that we see people's resilience and moxie coming to the fore," Kurin continued. "In the same way that new types of bullet wounds from the Civil War resulted in the development of better glass eyes, the same way IED's are propelling research in prosthetics in the military today, so, too, did these people in Peru employ trepanation to cope with new challenges like violence, disease and deprivation 1,000 years ago."

Kurin's research shows various cutting practices and techniques being employed by practitioners around the same time. Some used scraping, others used cutting and still others made use of a hand drill. "It looks like they were trying different techniques, the same way we might try new medical procedures today," she said. "They're experimenting with different ways of cutting into the skull."

Sometimes they were successful and the patient recovered, and sometimes things didn't go so well. "We can tell a trepanation is healed because we see these finger-like projections of bone that are growing," Kurin explained. "We have several cases where someone suffered a head fracture and was treated with the surgery; in many cases, both the original wound and the trepanation healed." It could take several years for the bone to regrow, and in some cases the trepanation hole remained in the patient's head for the rest of his life, thereby conferring upon him a new "survivor" identity.

When a patient didn't survive, his skull (almost never hers, as the practice of trepanation on women and children was forbidden in this region) might have been donated to science, so to speak, and used for education purposes. "The idea with this surgery is to go all the way through the bone, but not touch the brain," said Kurin. "That takes incredible skill and practice.

"As bioarchaeologists, we can tell that they're experimenting on recently dead bodies because we can measure the location and depths of the holes they're drilling," she continued. "In one example, each hole is drilled a little deeper than the last. So you can imagine a guy in his prehistoric Peruvian medical school practicing with his hand drill to know how many times he needs to turn it to nimbly and accurately penetrate the thickness of a skull."

Some might consider drilling a hole in someone's head a form of torture, but Kurin doesn't perceive it as such. "We can see where the trepanations are. We can see that they're shaving the hair. We see the black smudge of an herbal remedy they put over the wound," she noted. "To me, those are signs that the intention was to save the life of the sick or injured individual."

The remains Kurin excavated from the caves in Andahuaylas comprise perhaps the largest well-contextualized collection in the world. Most of the trepanned crania already studied reside in museums such as the Smithsonian Institution, the Field Museum of Natural History or the Hearst Museum of Anthropology. "Most were collected by archaeologists a century ago and so we don't have good contextual information," she said.

But thanks to Kurin's careful archaeological excavation of intact tombs and methodical analysis of the human skeletons and mummies buried therein, she knows exactly where, when and how the remains she found were buried, as well as who and what was buried with them. She used radiocarbon dating and insect casings to determine how long the bodies were left out before they skeletonized or were mummified, and multi-isotopic testing to reconstruct what they ate and where they were born. "That gives us a lot more information," she said.

Read more at Science Daily

Ancient Tributes to the Winter Solstice

Here Comes the Sun

This weekend marks the beginning of the end. Of winter's darkness, that is. Today (Dec. 21), the Northern Hemisphere marks the winter solstice and the turn toward increasingly longer days, while the Southern Hemisphere transitions to shorter days, and those at the equator won't notice much of a difference at all.

The global discrepancy in seasonal sunlight results from the 23.5-degree tilt of Earth's axis: during the Northern Hemisphere's winter, that hemisphere is tilted away from the sun, and during its summer, toward it. The equator does not experience much of a change over the year because the tilt barely alters the angle at which sunlight strikes it.

For many ancient civilizations that struggled to subsist through harsh winter months, the winter solstice marked a time of spiritual rejoicing and celebration. Modern heating technology and the globalization of food markets make the seasonal transition remarkably easier for modern humans to survive, but people still celebrate the day with festivities and rituals, including a tradition of reading poetry and eating pomegranates in Iran, and the Guatemalan ritual known as the palo volador — or "flying pole dance" — in which three men climb to the top of a 50-foot-tall (15 meters) pole and perform a risky dance to flutes and drums.

Still other people celebrate the day by tuning into the spiritual rituals of ancient civilizations and visiting the sites of winter solstice tributes. Here are six archaeological sites that researchers believe pay tribute to the winter solstice:

1. Stonehenge, England

Stonehenge — one of the most famous archaeological sites in the world — is an arrangement of rocks carefully positioned on barren ground in southern England. The megalith, which may have been a burial site, was built between 3000 B.C. and 2000 B.C., over the course of roughly 1,500 years, in a series of several major phases.

When the sun sets on the winter solstice, its rays align with what are known as the central Altar stone and the Slaughter stone — an alignment that hundreds of families, tourists, Wiccans and others gather to witness each year, and one that researchers believe marked an important spiritual event for those responsible for creating the monument.

2. Newgrange, Ireland

The Newgrange monument is located in northeastern Ireland and is thought to date back to about 3200 B.C. The grass-roofed mound rises from a green field and contains a series of tunnels and channels inside. At sunrise on the winter solstice, sunlight pours into the main chamber, which researchers interpret to mean the monument was built to celebrate this special day of the year.

3. Maeshowe, Scotland

Built in Orkney, Scotland, around 2800 B.C., Maeshowe is another burial ground, appearing as a grassy mound that rises about 24 feet (7.3 m) above the surrounding field. Similar to Ireland's Newgrange, the inside of the mound contains a maze of chambers and passageways that become illuminated by sunlight during the winter solstice.

Read more at Discovery News

Dec 20, 2013

Enormous Hermit Crab Tears Through Coconuts, Eats Kittens

It’s hard to go wrong with a hermit crab as your child’s first pet. They’re low maintenance and kinda cute in their own way, plus they’re hypoallergenic, as PetSmart feels the need to point out. But use care when choosing your crab. Whatever you do, don’t pick up Birgus latro, which can grow to a leg span of 3 feet, climb out of your terrarium, and assault the family cat.

Birgus latro is more commonly known as the coconut crab, and it’s the largest terrestrial arthropod in the world (the largest overall being the Japanese spider crab — but that’s a story for another week). Also known as the robber crab due to its curious propensity for stealing silverware and pots and pans, it’s the 9-pound hermit crab PetSmart wouldn’t dare carry, no matter how conveniently hypoallergenic it is.

The coconut crab is endemic to a variety of islands in the Pacific and Indian oceans, though its populations are extremely threatened on some of these thanks to, you guessed it, human tomfoolery. It grows remarkably slowly, taking perhaps 120 years to reach full size, said ecologist Michelle Drew of the Max Planck Institute.

As an arthropod, the coconut crab wears its skeleton on the outside and must shed it as it grows, so once a year it crawls into the safety of a burrow and molts. It’s highly vulnerable once it steps out of this rigid shell, so to hasten the development of new armor it … consumes its old exoskeleton.

It is, in effect, recycling the materials, which are in short supply in its terrestrial environment. Coconut crabs that “are disturbed before they have consumed the entire shell often have soft exoskeletons until they have time to reaccumulate the necessary calcium and other minerals,” Drew said in an email to WIRED.

The crab can grow and molt every year like this for more than a century, expanding and expanding like a dying star with claws until it threatens to infringe on the very laws of physics.

“In a water environment you get support from the water that allows you to move with a much heavier shell,” said Drew. “But on land, gravity will play a huge role on how you can move and how heavy you can get. [Coconut crabs] are probably at the limits of what is sustainable given gravity, the weight of the shell, and resources available to them in terms of food and water.”

Feeding this incredible growth is no small task, so the coconut crab eats anything it can get its claws on. It’ll go after fruit, vegetation, and carrion: dead birds and other coconut crabs and such. It has been observed hunting other crabs, and Drew has records of them ambushing young chickens as well as — don’t hate me for this — kittens, like a far less cuddly Alf of the tropics.

But what it really loves are, of course, coconuts. Now, contrary to what Harry Nilsson sang in his 1971 hit “Coconut,” one does not simply put the lime in the coconut and drink ‘em both up. Coconuts are extremely difficult to open. But as you may have noticed, the coconut crab is equipped with massive pincers. (One of Drew’s friends had one clamp down on his thumb, which lost feeling for three months. She stresses, though, that the coconut crab is in fact quite gentle unless threatened.)

“They use their claws to pull away the outer fibers,” said Drew. “This can sometimes take many days and it often involves a number of crabs. They then use their longest walking leg to puncture a hole through the eyes of the coconut and then they can use their claws to pry open the shell further.”

That may sound like more trouble than it’s worth, but the average mass of crabs living in coconut-rich habitats is double that of their counterparts living in coconut-free habitats, suggesting they extract a whole lot of calories from the things. It also suggests coconut crabs are among the few creatures on this planet besides my father that would actually enjoy a Mounds bar.

The coconut crab finds food with its extremely well-developed sense of smell. Like an insect, it uses antennae to zero in on its vittles, but takes this to an extreme by devoting considerable brainpower to the sense.

“The neural (brain) development associated with this is massive compared to other crab species,” said Drew, “and has similarities with insect olfactory development, and is a very good example of convergent evolution associated with a land-based adaptation” — convergent evolution occurring when unrelated species arrive independently at the same adaptation.

Despite their rightful place as the world's largest terrestrial arthropods, coconut crabs begin their lives in the sea. After mating on terra firma, mom releases her fertilized eggs in the ocean, where the larvae swim about for a month. They then enter what's known as the glaucothoe stage and find a snail shell to occupy.

At this point the coconut crab is in essence much like the hermit crab you’d buy at the pet store. But whereas commercialized crabs live out their days in a shell, forever battling for the choicest homes, the coconut crab eventually leaves that whole keeping-up-with-the-Joneses silliness behind, developing a hard belly and making its way inland. Once it’s gone fully terrestrial, a coconut crab never returns to the sea except to release its eggs. They’ll drown if fully submerged.

Despite its freakish size, massive pincers and formidable armor, the coconut crab increasingly finds itself in peril. The species has lived for millions of years on islands with no large mammalian predators, which allowed it to reach such incredible proportions. That is changing as human encroachment throws its food chains into chaos.

Read more at Wired Science

Winter Solstice: The Sun Pauses on Saturday

This coming Saturday (Dec. 21) marks one of the four major way stations on the Earth’s annual journey around the sun.

Because of the tilt in the Earth’s axis of rotation, the sun appears to rise and fall in our sky over the course of a year. It’s not the sun itself moving, but the Earth moving relative to the sun.

The Earth’s axis currently points in a northerly direction close to the second-magnitude star Polaris, also known as the Pole Star. Everything in the sky, including the sun, appears to revolve around this almost fixed point in the sky.

Because the Earth’s axis points to Polaris no matter where Earth happens to be in its orbit, the sun appears to move over the year from 23.5 degrees north of the celestial equator on June 21 to 23.5 degrees south of the celestial equator on Dec. 21.

The sun crosses the equator travelling northward around March 21 and southward around Sept. 21, in celestial events known as "equinoxes" (from the Latin for "equal night," as day and night are of roughly equivalent length on these dates). The exact dates vary a little from year to year because of leap years.

On Dec. 21, the sun stops moving southward, pauses, and then starts moving northward. This pause is called the "solstice," from the Latin words "sol" for "sun" and "sisto" for "stop." Similarly, on June 21 the sun stops moving northward and starts moving southward.
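
The sun's yearly drift between those two limits is well approximated by a simple cosine formula for the solar declination. Here is a minimal sketch (the 23.44-degree amplitude is Earth's axial tilt, and the 10-day offset is a common convention that pins the minimum to the Dec. 21 solstice):

```python
import math

def solar_declination(day_of_year):
    """Approximate solar declination in degrees (1 = Jan. 1).

    The declination bottoms out near -23.44 degrees at the December
    solstice, peaks near +23.44 degrees at the June solstice, and
    crosses zero near the equinoxes."""
    return -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))

print(round(solar_declination(355), 2))  # Dec. 21: -23.44 (winter solstice)
print(round(solar_declination(172), 2))  # Jun. 21: 23.44 (summer solstice)
print(round(solar_declination(80), 2))   # Mar. 21: near zero (spring equinox)
```

This approximation is accurate to within a degree or two; precise ephemeris calculations also account for the ellipticity of Earth's orbit.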

These four dates have been extremely important to humanity since we first started to grow crops 10,000 years ago. Our ancestors have built amazing structures over the millennia to track these important landmarks. For example, Stonehenge in England was built as an astronomical observatory, its stones precisely oriented to detect the extremes of the sun’s movement.

Our calendar is based on the dates of the equinoxes and solstices, though errors over the years have caused the calendar to shift by 10 days from the celestial dates. Many cultures in the world use the winter solstice to mark the beginning of the year. The other three dates neatly divide the year into quarters, or seasons.
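
The roughly 10-day shift is simple arithmetic: the old Julian calendar's average year of 365.25 days overshoots the tropical (equinox-to-equinox) year of about 365.2422 days, so the calendar drifts against the seasons by about 0.0078 days per year. A minimal sketch, assuming the conventional A.D. 325 reference point (when the equinox fell on March 21) and the Gregorian reform of 1582:

```python
JULIAN_YEAR = 365.25       # average Julian calendar year, in days
TROPICAL_YEAR = 365.2422   # equinox-to-equinox year, in days

def julian_drift(start_year, end_year):
    """Days the Julian calendar drifts against the seasons between two years."""
    return (end_year - start_year) * (JULIAN_YEAR - TROPICAL_YEAR)

# A.D. 325 to 1582: close to the 10 days dropped from October 1582.
print(round(julian_drift(325, 1582), 1))  # 9.8
```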

The chart above shows what the sky would look like this coming Saturday at precisely 12:11 p.m. EST (1711 GMT), if somehow the sun’s light could be dimmed so that you could see the background stars. The sun is traveling from right to left along the green line, called the "ecliptic" because eclipses happen along it. The sun is as far south as it can get at that instant, and begins moving northward immediately.

The celestial equator is marked by the red line, far to the north of the sun's position. You can see the inner planets gathered around the sun. Venus, off to the left, is moving toward the right, and will pass between us and the sun on Jan. 11. Mercury, to the right, is moving to the left and will pass behind the sun on Dec. 29. Pluto is on the far side of the sun and will pass behind it on Jan. 1.

Notice the Milky Way crossing diagonally through the chart. That’s because our solar system is not oriented in any particular way relative to the plane of the Milky Way. The center of the Milky Way is almost directly below the sun’s position on Dec. 21, something that was made much of last year. As astronomers pointed out repeatedly then, the sun passes in front of the Milky Way’s center every year, not just in 2012. Because the Milky Way’s center is so far away, 27,000 light-years distant, it has no measurable effect on the Earth.

Read more at Discovery News

Mediterranean Sea Was Once a Mile-High Salt Field

About 6 million years ago, a mile-high field of salt formed across the entire Mediterranean seafloor, sucking up 6 percent of the oceans' salt.

Now, new research has pinpointed when key events during the formation of that "salt giant" occurred. The new research, presented here Dec. 11 at the annual meeting of the American Geophysical Union, could help unravel the mystery behind the great salt crisis.

Every so often, huge accumulations of the world's salt form in one place. The most recent salt crisis happened during the Miocene Epoch, which lasted from about 23 million to 5 million years ago.

About 6 million years ago, the Strait of Gibraltar linking the Mediterranean with the Atlantic Ocean was closed and instead, two channels — one in Northern Morocco and another in Southern Spain — fed the sea with salty water and let it flow out, said study co-author Rachel Flecker, a geologist at the University of Bristol in England.

But during the Messinian Salinity Crisis, as this particular event is known, Eurasia was colliding with Africa, pinching off the Mediterranean's outflow. Tectonic shifts left the floor of the channel between the two water bodies low enough that dense salty water from the Atlantic could still rush in, but it couldn't leave the sea. Water evaporated; salt piled high; and sea life collapsed.

"It wasn't a nice place," Flecker said.

In a series of pulses over about 600,000 years, the sea dried out, and a 1-mile-high (1.5 kilometers) salt wall grew across the Mediterranean seafloor, a "bit like the Dead Sea, a huge brine field," Flecker told LiveScience. (In places, it might have been even higher.)

Then, in a geologic flash of time just 200 years long, waters from the Atlantic cut through the Strait of Gibraltar and flooded the Mediterranean, refilling the sea.

Precise dates

Though scientists understood some of what triggered the great salinity crisis, they still don't fully understand the climatic changes that may have been involved.

The Earth wobbles like a top around its axis as it spins, in a roughly 20,000-year cycle. That shift affects how much sunlight certain parts of the Earth receive at different points in the cycle, thereby changing the climate. In the Mediterranean Sea area, sediments are striped with dark and light bands that correspond to surges and die-offs of sea life as a result of those climatic shifts.

Flecker and her colleagues with the Medgate project, a European Union project that is studying the salinity crisis, looked at those sediments to understand how the salt crisis began.

Unfortunately, they didn't know what part of each band corresponded to a particular position of Earth's axis, making it difficult to sequence events in the crisis.

The team used climate simulations to understand rainfall, evaporation and water flow into and out of the Mediterranean for a period spanning 22,000 years around the crisis onset, and tied that to sediment data. Ancient rivers in North Africa dumped huge pulses of freshwater into the sea in late summer, leaving traces of surging biological activity in the fossil record, the models show.

Based on their simulations, the researchers found the freshwater pulses happened at a point in Earth's wobble cycle when the Northern Hemisphere would experience colder winters and hotter summers. That, in turn, meant that the evaporation must have started much later in the cycle.

Read more at Discovery News

Unexplained Mysteries of 2013

Science is all about the pursuit of truth. New discoveries can provide answers, but they can also open the door to new questions.

Explore some of the mysteries left unsolved at the end of the year.

The Impossible Planet

Kepler-78b shouldn't exist. An Earth-sized planet with a rocky surface and an iron core, Kepler-78b is so close to its parent star that it completes an orbital revolution every 8.5 hours.

Kepler-78b is a curiosity because scientists have no way of explaining how the lava world could have formed given its proximity to its parent star.

The (Other) Impossible Planet

Exoplanet hunters had another mystery on their hands in 2013: a gas giant, HD 106906b, 11 times the mass of Jupiter with a peculiar orbit. Unlike Kepler-78b, which is very close to its parent star, HD 106906b lies 650 astronomical units (AU) -- the AU being the distance between the Earth and the sun -- from its parent star.

In fact, that's such a great distance that the orbit is larger than what astronomers once thought possible. As with Kepler-78b, scientists do not yet know how this exoplanet formed.

Mystery Date

Human ancestors were a promiscuous bunch. Ancient Homo sapiens mated not only amongst themselves, but also interbred with Neanderthals, a line of humans known as Denisovans, and a mystery lineage of humans. That unknown, extinct population has yet to be identified in the fossil record, as reported by LiveScience's Stephanie Pappas.

Given the different hominid species around at the time, Mark Thomas, an evolutionary geneticist at University College London, described it as a "Lord of the Rings-type world."

Hidden Hum

Do you hear that? That steady, droning, persistent sound that creeps in at night.

If you are hearing things, you're not alone. Since the 1950s, reports have been coming in from around the world of people hearing what is known as "the Hum." As LiveScience's Marc Lallanilla reports, only about 2 percent of the people living in any given hum-prone area can hear the sound. It's louder at night than during the day, and it's typically heard in rural and suburban areas.

The Settlers

Christopher Columbus wasn't the first European to set foot in the New World. The Vikings preceded him. And in 2013, we learned of the arrival of a mysterious group of European settlers to the "stepping stones to the Americas" 300 to 500 years before the Vikings arrived in the New World.

Scientists had previously thought the Vikings were the first arrivals to the Faroes, in the ninth century. But at the archaeological site of Á Sondum on the island of Sandoy, researchers found evidence of earlier human settlement in patches of burnt peat ash.

Although investigators have yet to discover clear evidence of the group's identity, possibilities include religious hermits from Ireland, late-Iron Age colonists from Scotland or pre-Viking explorers from Scandinavia.

Mona Lisa Mystery

Five hundred years after she had her portrait painted, we're still waiting to find out the identity of the woman in Leonardo da Vinci's famous masterpiece, "Mona Lisa."

Lisa Gherardini del Giocondo, the wife of a rich silk merchant, has long been suspected of being the model behind the Mona Lisa, and DNA tests of several skeletons found under a Florence convent could confirm whether one of them is Gherardini.

If one of the skeletons is Gherardini, who died in 1542, researchers plan on commissioning a facial reconstruction to determine any similarities between the skeleton and the portrait.

Read more at Discovery News

Dec 19, 2013

Powerful Ancient Explosions Explain New Class of Supernovae

Astronomers affiliated with the Supernova Legacy Survey (SNLS) have discovered two of the brightest and most distant supernovae ever recorded, 10 billion light-years away and a hundred times more luminous than a normal supernova. Their findings appear in the Dec. 20 issue of the Astrophysical Journal.

These newly discovered supernovae are especially puzzling because the mechanism that powers most of them -- the collapse of a giant star to a black hole or normal neutron star -- cannot explain their extreme luminosity. Discovered in 2006 and 2007, the supernovae were so unusual that astronomers initially could not figure out what they were or even determine their distances from Earth.

"At first, we had no idea what these things were, even whether they were supernovae or whether they were in our galaxy or a distant one," said lead author D. Andrew Howell, a staff scientist at Las Cumbres Observatory Global Telescope Network (LCOGT) and adjunct faculty at UC Santa Barbara. "I showed the observations at a conference, and everyone was baffled. Nobody guessed they were distant supernovae because it would have made the energies mind-bogglingly large. We thought it was impossible."

One of the newly discovered supernovae, named SNLS-06D4eu, is the most distant and possibly the most luminous member of an emerging class of explosions called superluminous supernovae. These new discoveries belong to a special subclass of superluminous supernovae that have no hydrogen.

The new study finds that the supernovae are likely powered by the creation of a magnetar, an extraordinarily magnetized neutron star spinning hundreds of times per second. Magnetars have the mass of the sun packed into a star the size of a city and have magnetic fields a hundred trillion times that of Earth. While a handful of these superluminous supernovae have been seen since they were first announced in 2009, and the creation of a magnetar had been postulated as a possible energy source, the work of Howell and his colleagues is the first to match detailed observations to models of what such an explosion might look like.

Co-author Daniel Kasen from UC Berkeley and Lawrence Berkeley National Lab created models of the supernova that explained the data as the explosion of a star only a few times the size of the sun and rich in carbon and oxygen. The star likely was initially much bigger but apparently shed its outer layers long before exploding, leaving only a smallish, naked core.

"What may have made this star special was an extremely rapid rotation," Kasen said. "When it ultimately died, the collapsing core could have spun up a magnetar like a giant top. That enormous spin energy would then be unleashed in a magnetic fury."

Discovered as part of the SNLS -- a five-year program based on observations at the Canada-France-Hawaii Telescope, the Very Large Telescope (VLT) and the Gemini and Keck telescopes to study thousands of supernovae -- the two supernovae could not initially be properly identified nor could their exact locations be determined. It took subsequent observations of the faint host galaxy with the VLT in Chile for astronomers to determine the distance and energy of the explosions. Years of subsequent theoretical work were required to figure out how such an astounding energy could be produced.

The supernovae are so far away that the ultraviolet (UV) light emitted in the explosion was stretched out by the expansion of the universe until it was redshifted (increased in wavelength) into the part of the spectrum our eyes and telescopes on Earth can see. This explains why the astronomers were initially baffled by the observations; they had never seen a supernova so far into the UV before. This gave them a rare glimpse into the inner workings of these supernovae. Superluminous supernovae are so hot that the peak of their light output is in the UV part of the spectrum. But because UV light is blocked by Earth's atmosphere, it had never been fully observed before.
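
The stretching itself is just the redshift relation λ_obs = λ_emit × (1 + z). A minimal sketch with illustrative numbers (the redshift of roughly 1.6 is an assumption chosen to illustrate a lookback of this scale, not a figure from the study):

```python
def observed_wavelength(emitted_nm, z):
    """Wavelength seen on Earth, given the emitted wavelength (nm)
    and the cosmological redshift z."""
    return emitted_nm * (1.0 + z)

# Ultraviolet light emitted at 250 nm, redshifted by z ~ 1.6, arrives
# at about 650 nm -- squarely in the visible (red) part of the spectrum.
print(round(observed_wavelength(250, 1.6)))  # 650
```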

The supernovae exploded when the universe was only 4 billion years old. "This happened before the sun even existed," Howell explained. "There was another star here that died and whose gas cloud formed the sun and Earth. Life evolved, the dinosaurs evolved and humans evolved and invented telescopes, which we were lucky to be pointing in the right place when the photons hit Earth after their 10-billion-year journey."

Read more at Science Daily

Neanderthal Genome Shows Early Human Interbreeding, Inbreeding

The most complete sequence to date of the Neanderthal genome, using DNA extracted from a woman's toe bone that dates back 50,000 years, reveals a long history of interbreeding among at least four different types of early humans living in Europe and Asia at that time, according to University of California, Berkeley, scientists.

Population geneticist Montgomery Slatkin, graduate student Fernando Racimo and post-doctoral student Flora Jay were part of an international team of anthropologists and geneticists who generated a high-quality sequence of the Neanderthal genome and compared it with the genomes of modern humans and a recently recognized group of early humans called Denisovans.

The comparison shows that Neanderthals and Denisovans are very closely related, and that their common ancestor split off from the ancestors of modern humans about 400,000 years ago. Neanderthals and Denisovans split about 300,000 years ago.

Though Denisovans and Neanderthals eventually died out, they left behind bits of their genetic heritage because they occasionally interbred with modern humans. The research team estimates that between 1.5 and 2.1 percent of the genomes of modern non-Africans can be traced to Neanderthals.

Denisovans also left genetic traces in modern humans, though only in some Oceanic and Asian populations. The genomes of Australian aborigines, New Guineans and some Pacific Islanders are about 6 percent Denisovan genes, according to earlier studies. The new analysis finds that the genomes of Han Chinese and other mainland Asian populations, as well as of native Americans, contain about 0.2 percent Denisovan genes.

The genome comparisons also show that Denisovans interbred with a mysterious fourth group of early humans also living in Eurasia at the time. That group had split from the others more than a million years ago, and may have been the group of human ancestors known as Homo erectus, which fossils show was living in Europe and Asia a million or more years ago.

"The paper really shows that the history of humans and hominins during this period was very complicated," said Slatkin, a UC Berkeley professor of integrative biology. "There was lot of interbreeding that we know about and probably other interbreeding we haven't yet discovered."

The genome analysis will be published in the Dec. 19 issue of the journal Nature. Slatkin, Racimo and Jay are members of a large team led by former UC Berkeley post-doc Svante Pääbo, who is now at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany.

In another analysis, Jay discovered that the Neanderthal woman whose toe bone provided the DNA was highly inbred. Her genome indicates that she was the daughter of very closely related parents: either half-siblings who shared the same mother, an uncle and niece (or aunt and nephew), a grandparent and grandchild, or double first cousins (the offspring of two siblings who married siblings).

Further analyses suggest that the population sizes of Neanderthals and Denisovans were small and that inbreeding may have been more common in Neanderthal groups than in modern populations.

As part of the new study, Racimo was able to identify at least 87 specific genes in modern humans that are significantly different from related genes in Neanderthals and Denisovans, and that may hold clues to the behavioral differences distinguishing us from early human populations that died out.

"There is no gene we can point to and say, 'This accounts for language or some other unique feature of modern humans,'" Slatkin said. "But from this list of genes, we will learn something about the changes that occurred on the human lineage, though those changes will probably be very subtle."

According to Pääbo, the list of genes "is a catalog of genetic features that sets all modern humans apart from all other organisms, living or extinct. I believe that in it hide some of the things that made the enormous expansion of human populations and human culture and technology in the last 100,000 years possible."

The Pääbo group last year produced a high-quality Denisovan genome based on DNA from a pinky finger bone discovered in 2008 in Denisova Cave in the Altai Mountains of Southern Siberia. That bone is from a young woman who lived about 40,000 years ago. The Neanderthal toe bone was found in the same cave in 2010, though in a deeper layer of sediment that is thought to be about 10,000-20,000 years older. The cave also contains modern human artifacts, meaning that at least three groups of early humans occupied the cave at different times. The Pääbo group developed new techniques to extract DNA from these old bones.

Read more at Science Daily

The Strange History of Santa's Little Helpers

The children of North America have a new Christmas tradition: The elf on the shelf.

Alternatively panned as creepy and adored as a fun holiday ritual, the trademarked Elf on the Shelf dates back to 2005, when author Carol Aebersold self-published a tale of a little elf sent by Santa to report on children's behavior leading up to Christmas. A toy elf sold with Aebersold's book plays that role in thousands of homes around the country.

It's a strange place to end up for these Christmas-y little creatures, who once stood side by side with Norse gods and took the blame for inexplicable illnesses in medieval Europe. But elves stand the test of time, playing modern-day roles in J.R.R. Tolkien's "Lord of the Rings" series as well as acting as Santa's spy agents.

Here's how these little people have evolved.

The origin of elves

Ancient Norse mythology refers to the álfar, also known as huldufólk, or "hidden folk." However, it's risky to translate álfar directly to the English word "elf," said Terry Gunnell, a folklorist at the University of Iceland. Elves are thought of as little people, perhaps wearing stocking caps and cavorting with fairies, but the original conception of álfar was far less whimsical.

Some ancient poems place them side by side with the Norse gods, perhaps as another word for the Vanir, a group of gods associated with fertility, or perhaps as their own godly race. It's likely, Gunnell said, that elves' inventors had no single, unified theory on elvish identity; rather, there were a variety of related folk beliefs regarding this unseen race.

"They look like us, they live like us -- at least in the older materials -- and probably, nowadays, if they're living anywhere, they're living between floors in flats," Gunnell told LiveScience, referring to the notion of an invisible, parallel world inhabited by álfar -- the friendly neighbors who live between the seventh and eighth floors.

Iceland was settled in the 800s by Scandinavians, along with Celts brought from Ireland as slaves. Both Scandinavian and Celtic cultures had myths of fairies, elves and nature spirits, which began to meld into the concept of álfar as representatives of the landscape, Gunnell said. Iceland's eerie, volcanic setting probably played into these myths, especially in the dark of winter, when the Northern Lights are the only thing illuminating the long nights.

"The land is alive, and really, the hidden people are a personification of a very living landscape that you have to show respect for, that you can't really defeat," Gunnell said. "You have to work with it."

Elves evolve

Scandinavians and Celts weren't the only Europeans who used unseen, supernatural species as symbols of the wilds surrounding them. Farther south, Germans believed in dwarves and little sprites called kobolds. Scots had house spirits called brownies.

Elves became part of this mythological mix throughout the first millennium A.D., according to Alaric Hall, a lecturer at the University of Leeds who penned an entry on elves for the upcoming "Ashgate Encyclopedia of Literary and Cinematic Monsters" (Ashgate, 2014). The word "elf" derives from the ancestor language of German, English and today's Scandinavian languages, Hall wrote, and the first written references to them come from church texts starting around A.D. 500.

Medieval Europeans saw elves as dark and dangerous, and linked them to demons. In the Old English "Beowulf," which dates to sometime between A.D. 700 and 1000, elves get a mention as an evil race that descended from Cain, the biblical son of Adam and Eve who murdered his brother:

"Of Cain awoke all that woful breed,

Etins and elves and evil-spirits,

as well as the giants that warred with God."

These religious references reveal the clash and melding of folk beliefs and new religion as Christianity crept into Europe. In different tales at different times, elves alternated between good and bad, Hall wrote.

They could deliver babies safely through a difficult labor -- or steal away a human baby and replace it with a sickly and deformed changeling. Elves, known as alp in German, could cause nightmares (Alpdrück), perhaps similar to other mythology surrounding the scary experience of sleep paralysis. Nevertheless, elves were probably still considered human-size, rather than diminutive, Hall wrote.

By William Shakespeare's day, elves had lost many of their malevolent undertones. Shakespeare's "A Midsummer Night's Dream," written in the 1590s, included an elflike figure, Puck, who acted as a jokester or trickster.

From myth to Christmas

Much as the modern Thanksgiving menu dates back to the 1800s, so too do modern U.S. Christmas traditions. Elves became linked with Santa Claus in the 1823 poem "A Visit from St. Nicholas," better known today as "The Night Before Christmas." That poem refers to Santa Claus as a "jolly old elf."

With the elf-Christmas link established, other writers began to get creative with the idea. In 1857, Harper's Weekly published a poem called "The Wonders of Santa Claus," which tells how Santa "keeps a great many elves at work/ All working with all their might/ To make a million of pretty things/ Cakes, sugar-plums, and toys/ To fill the stockings, hung up you know/ By the little girls and boys."

The idea caught on. In 1922, famed artist Norman Rockwell released a painting of an exhausted Santa surrounded by tiny, industrious elves, trying to get a dollhouse finished in time for Christmas. A 1932 short movie by Disney called "Santa's Workshop" showed bearded, blue-clad elves singing, prepping Santa's sleigh, brushing reindeer teeth and helping Santa with the naughty/nice list. "Molly seems to be OK; she eats her spinach every day," an elf rhymes, before nixing another child's ambitious list because he doesn't wash behind his ears.

The modern era has brought nonconformist elves to the forefront, first in the form of Hermey the Misfit Elf in 1964's now-classic TV special, "Rudolph the Red-Nosed Reindeer." (Hermey preferred dentistry to servitude in Santa's workshop.) And in 2003, the comedy "Elf" starred Will Ferrell as a human brought up by Santa's elves who must travel to New York City to find his biological family.

The latest in elf innovation, the Elf on the Shelf, gives elves a duty they've never had before: not just making toys, but also serving as Santa's informants. In some ways, however, the Elf on the Shelf's arguable creepiness gets back to Christmas' roots. In Iceland, Gunnell said, children don't await Santa Claus; they wait for 13 "Yule Lads," who leave gifts in their shoes. Nor do they traditionally fear a lump of coal as a consequence of bad behavior. In Icelandic lore, the horrifying ogress Grýla eats up naughty children.

Read more at Discovery News

'Perfect' Electron Roundness Bruises Supersymmetry

New measurements of the electron have confirmed, to the smallest precision attainable, that it has a perfect roundness. This may sound nice for the little electron, but to one of the big physics theories beyond the standard model, it’s very bad news.

There are currently many efforts under way to search for physics “beyond” the standard model. The standard model predicts all known quantum interactions to a very high degree of accuracy. But although it has been the vanguard of physics for many decades, the standard model does not account for mysterious dark matter, nor does it encompass gravity.

One idea that theoretical physicists have pinned their hopes on is supersymmetry -- the possible existence of “shadow particle” partners to regular subatomic particles. So, for example, every proton would have a more massive “shadow proton” (or “sproton”). Should these “sparticles” exist, perhaps they could explain dark matter, which we know pervades the entire Universe but whose nature remains a mystery.

Alas, despite physicists’ best efforts in particle accelerators like the Large Hadron Collider (LHC), there is zero evidence that these sparticles exist. As if to underline the problem, it turns out that the recently discovered Higgs boson is a “standard model Higgs” -- the long-sought particle behaves just as standard model physics predicts. Also, the LHC’s measurements of a rare Bs meson decay only confirmed standard model calculations and did not reveal anything exotic of a supersymmetric nature.

Today, in research published in the journal Science Express, physicists have once again chipped away at supersymmetry not by smashing particles together at high speed, but by making the most precise measurement of the electron to date.

“We know the Standard Model does not encompass everything,” said physicist David DeMille, of Yale University, in a press release. “Like our LHC colleagues, we’re trying to see something in the lab that’s different from what the Standard Model predicts.”

DeMille works with John Doyle and Gerald Gabrielse of Harvard University on the ACME collaboration. ACME hunts for exotic physics by measuring the electron’s vital statistics -- in particular, its electric dipole moment. The standard model predicts that the electron has an essentially zero dipole moment, meaning it is perfectly symmetrical. Should supersymmetry exist, however, the electron’s dipole moment should be measurably greater than zero, pushing the negatively charged particle into a more and more elongated shape. The presence of sparticles would squeeze the electron’s form away from being round.

As announced today, however, in measurements of the electron’s dipole moment 10 times more precise than any that came before, the electron appears to be perfectly symmetrical, just as the standard model predicts.

When it comes to quantum physics, analogies are king. As described by DeMille: “You can picture the dipole moment as what would happen if you took a perfect sphere, shaved a thin layer off one hemisphere and laid it on top of the other side. The thicker the layer, the larger the dipole moment. Now imagine an electron blown up to the size of the earth. Our experiment would have been able to see a layer 10,000 times thinner than a human hair, moved from the southern to the northern hemisphere. But we didn’t see it, and that rules out some theories.”
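DeMille's analogy can be checked with simple arithmetic. The Earth radius and hair width below are assumed round numbers for illustration, not values from the paper:

```python
# Rough check of the scale in DeMille's analogy: blow an electron up to
# the size of the Earth, then ask how thick the "shaved layer" would be.
earth_radius_m = 6.37e6   # mean radius of the Earth (assumed round number)
hair_width_m = 1e-4       # a typical human hair, ~100 micrometers (assumed)

layer_m = hair_width_m / 10_000      # "10,000 times thinner than a human hair"
fraction = layer_m / earth_radius_m  # asymmetry as a fraction of the radius

print(f"layer thickness: {layer_m * 1e9:.0f} nanometers")
print(f"fractional asymmetry detectable: about {fraction:.1e}")
```

That works out to a layer about 10 nanometers thick on an Earth-sized sphere, a fractional distortion of roughly one part in a thousand trillion, which conveys just how stringent the new limit is.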

Read more at Discovery News

Dec 18, 2013

Magical Medieval Crypt Holds 7 Male Mummies

A 900-year-old medieval crypt, containing seven naturally mummified bodies and walls covered with inscriptions, has been excavated in a monastery at Old Dongola, the capital of a lost medieval kingdom that flourished in the Nile Valley.

Old Dongola is located in modern-day Sudan, and 900 years ago, it was the capital of Makuria, a Christian kingdom that lived in peace with its Islamic neighbor to the north.

One of the mummies in the crypt (scientists aren't certain which one) is believed to be that of Archbishop Georgios, probably the most powerful religious leader in the kingdom. His epitaph was found nearby and says that he died in A.D. 1113 at the age of 82.

Magical inscriptions

The inscriptions on the walls of the crypt, written in black ink on a thin layer of whitewash, were composed in Greek and Sahidic Coptic. They include excerpts from the gospels of Luke, John, Mark and Matthew; magical names and signs; and a prayer given by the Virgin Mary, at the end of which death appears to her "in the form of a rooster." After Mary dies, according to the text, she ascends to heaven with Jesus.

The inscriptions, written by "Ioannes," who left a signature on three and possibly four of the walls, likely served as protection for the deceased against evil powers, the researchers said.

They were "intended to safeguard not only the tomb, but primarily those who were buried inside of it during the dangerous liminal period between the moment of dying and their appearance before the throne of God," write Adam Lajtar, of the University of Warsaw, and Jacques van der Vliet, of Leiden University, in the most recent edition of the journal Polish Archaeology in the Mediterranean.

The crypt contained the bodies of seven older males, no younger than 40, said anthropologist Robert Mahler, a researcher with the University of Warsaw who examined the remains.

The crypt was likely sealed after the last of the burials took place. "The entrance to the chamber was closed with red bricks bonded in mud mortar," writes Włodzimierz Godlewski, the current director of the Polish Mission to Dongola, in an article in the same journal.

While the mummies' clothing is very poorly preserved, textile specialist Barbara Czaja-Szewczak, with the Wilanów Palace Museum, determined the men were dressed very simply, mainly in linen garments. The garments "consisted of robes characterized by a fairly simple design. Linen predominated," she wrote in an article in the same journal. At least some of the individuals wore crosses somewhere on their body.

The crypt was first found in 1993 by the Polish Mission to Dongola, which at the time was led by director Stefan Jakobielski. However, it wasn't excavated until 2009. During excavations, the bodies were removed and studied, the crypt walls cleaned and its inscriptions recorded and studied in greater detail. Research efforts are ongoing and a complete record of the texts is expected to be detailed in a book in the future.

A lost kingdom

At the time the crypt was created, Makuria was at its height. Its kings, ruling from Old Dongola, controlled territory throughout much of modern-day Sudan and parts of southern Egypt.

"The period between the late eighth and 12th centuries is claimed to have been the golden age of Makuria," said Artur Obluski, a research associate with the University of Chicago's Oriental Institute and the University of Warsaw's Polish Centre of Mediterranean Archaeology, at a recent lecture at Toronto's Royal Ontario Museum.

Makuria's ability to maintain good relations with its Islamic neighbor to the north, the Fatimid Caliphate, which controlled Egypt, was important to the kingdom's success, said Obluski. The two had an extensive trade relationship, and many people from Makuria served in the Fatimid army.

Read more at Discovery News

Expedition Explores Underwater 'Grand Canyon'

A five-week expedition to map and sample a giant underwater canyon off the northwest coast of Morocco has completed its mission, yielding the best look yet at the deep-sea wonder.

More than half a mile (about 1 kilometer) deep, 280 miles (450 km) long and up to 20 miles (30 km) wide, Agadir Canyon is approximately the size of the Grand Canyon. A joint team of British and German scientists aboard the German research vessel Maria S Merian took images and samples of the seafloor to create a high-resolution 3-D map of the canyon and sample its marine life.

Up until now, Agadir Canyon, considered by some measures the world's largest undersea canyon, has rarely been explored, said British expedition leader Russell Wynn of the National Oceanography Centre in England. "There are a lot of interesting features that no one has ever gone and looked at," Wynn told LiveScience's OurAmazingPlanet.

Long flows of sediment carved out the canyon over millions of years, much like a river carves out a canyon, including the Grand Canyon, on land. Researchers had mapped Agadir Canyon previously at a very crude scale that revealed features a few hundred meters big. Now, using a technology called multibeam sonar, Wynn and his colleagues have mapped the region on the scale of a few meters to tens of meters.

The team discovered that Agadir Canyon produced the world's largest sediment flow about 60,000 years ago, depositing up to 38 cubic miles (160 cubic km) of sludge during a single catastrophic landslide.

Powerful flows from the Atlas Mountains of Northwest Africa carried sand and gravel nearly 3,000 miles (5,000 km) to deep offshore basins, depositing the sediment over 135,000 square miles (350,000 square km) -- an area roughly the size of Germany.
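As a back-of-envelope check on those figures (using the metric values quoted above, and assuming the 160 cubic km flow spread roughly evenly over the full 350,000 square km deposit area):

```python
# Average deposit thickness if the single largest flow were spread
# evenly over the reported deposit area.
volume_km3 = 160.0    # sediment volume from the largest flow
area_km2 = 350_000.0  # reported deposit area (roughly the size of Germany)

thickness_m = volume_km3 / area_km2 * 1000  # convert km to meters
print(f"average deposit thickness: {thickness_m:.2f} m")
```

An even spread works out to a blanket of sludge roughly half a meter thick across an area the size of Germany, from a single event.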

The team also found a gigantic new landslide south of the canyon that covers more than 2,000 square miles (5,000 square km) of seafloor, about the size of the state of Delaware. This flow didn't mix with the other flows, but blocked up the end of the canyon, "like toothpaste," Wynn said. Preliminary data suggest the flow is quite ancient, at least 130,000 years old, he added.

Read more at Discovery News

Top Myths Busted in 2013

The Year That Wasn't

This past year was a strange one, with a variety of popular beliefs being busted. Some were welcome news; others left a funk, like a fart-filled balloon, when they burst. Here, in no particular order, are seven popular stories and myths busted in 2013, ranging from the scientific to the sublimely pseudoscientific ...

Childhood Obesity Drops

The news from the American Medical Association and the Centers for Disease Control and Prevention (CDC) has been a steady stream of gloom and doom for years when it comes to obesity in America, and childhood obesity in particular. The obesity rate for children and adolescents has tripled in the past three decades and now stands at just under one in five. Many experts believed that the rate would continue apace, but a new CDC study found instead that obesity levels actually dropped slightly in 18 states, remained the same in another 19, and increased in only three.

It's not clear what caused the unexpected decline, but some experts credit public awareness initiatives such as Michelle Obama's "Let's Move" campaign, aimed at school-age kids and their parents. The surprising success, however, may be temporary. The CDC emphasizes that unless both kids and adults eat healthier and get more exercise, these gains will be lost.

The Case of the Discriminating Diner

Dayna Morales, a New Jersey waitress, made national news in November when she claimed she was left a hate-filled, anti-gay note instead of a tip. She produced a receipt on which a message was written, "I'm sorry but I cannot tip because I do not agree with your lifestyle & how you live your life." This created an outpouring of sympathetic support and donations from around the world.

Several weeks later, however, the family that she'd served came forward with the same receipt -- except with an $18 tip and no homophobic message. Morales was accused of faking the incident, as well as making other false claims, including that she had brain cancer. Morales was fired from her job at the bistro after the restaurant completed its investigation and determined that her story was a hoax.

DNA Study Confirms Bigfoot! Or Not

A team of researchers led by Melba Ketchum, a Texas veterinarian, claimed to conclusively prove the existence of Bigfoot through genetic testing. Ketchum says the mysterious monsters are half-human hybrids. The claims had circulated for several years, but Ketchum did not publish her study until February of this year -- though "publish" is not quite the word for it. Instead, because reputable scientific journals rejected her research, Ketchum decided to create her own online publication, the DeNovo Scientific Journal, and publish her findings there.

Unfortunately, the study was badly flawed, even including a fake April Fool's citation in its references. Scientists and geneticists who examined her claims found the study riddled with errors, tainted evidence and incorrect conclusions. Undeterred by the scientific rejection of her work, Ketchum continues her fight to obtain legal status for Bigfoot. She says the elusive creatures are an undiscovered Native American population.

The Homo Erectus Family Expands

A 1.8-million-year-old skull -- the most complete early Homo genus skull ever discovered -- found in the Republic of Georgia earlier this year has scientists reconsidering what our family tree really looks like. It was one of several skulls found together at the same place, suggesting that the characteristics that anthropologists have used for decades to distinguish one species from another may be all wrong. The different skulls, and parts of the same skull, may not have been from distinctly different species of humans as previously thought but instead simply a result of normal variation within a single species, Homo erectus.

Scientists are elated that this myth may have been busted, because doing so provides a clearer picture of our human lineage and, if true, consolidates three different species branches -- Homo ergaster, Homo rudolfensis and Homo habilis -- into one. An article on this find, "Complete Skull from Dmanisi, Georgia, and the Evolutionary Biology of Early Homo," was published in the Oct. 18 edition of Science.

Eyeball Licking Not Quite Trendy

A weird news story appeared in June warning parents about a bizarre fad sweeping through Japanese schools: children licking each other's eyeballs. This unexplainable behavior threatened the kids' health by spreading highly contagious pink eye, and could even cause blindness. Many news organizations ran with the story, despite the fact that the only source for it was an anonymous teacher at a primary school in Tokyo. Finally Mark Schreiber, an American journalist living in Japan, researched the story and discovered what some had suspected: The whole story was an urban legend. There was no eyeball licking fad, no trend, no scores of children arriving in hospitals with pink eye or blindness.

This particular myth gained traction because it was also part of a moral panic, which tapped into universal social concerns about kids' behavior. Every year or two a story circulates widely in the news about some dangerous new trend that parents and teachers need to be aware of, some hidden threat to the safety of children based on some strange behavior. Teens face many dangers, but eyeball licking is not among them.

Read more at Discovery News

Fomalhaut's Stellar Sister's Comets: Exoplanet Goldmine?

Astronomers scoping out the vicinity of the famous star Fomalhaut have discovered that its mysterious stellar sister is also sporting a rather attractive ring of comets.

This news is brought to you by the recently defunct Herschel space observatory, a European-led infrared mission that was especially fond of probing the infrared signals emitted by cool nebulae, the gas in distant galaxies and, in this case, vast icy debris fields surrounding young stars. Sadly, the orbiting telescope was lost earlier this year when it ran out of cryogenic coolant, but its legacy lives on.

Located 25 light-years away in the constellation Piscis Austrinus, Fomalhaut A is one of the brightest stars in Southern Hemisphere skies. The hot, bright star is notable in that it hosts a gigantic ring of cometary debris and dust. Embedded within this ring is the infamous Fomalhaut b, a massive exoplanet that has been the cause of much debate.

But this new research, published in the journal Monthly Notices of the Royal Astronomical Society, doesn’t focus on the stellar “Eye of Sauron”; it is actually centered on Fomalhaut’s less famous sibling, Fomalhaut C.

Fomalhaut C is a red dwarf star and was only confirmed to be gravitationally bound to Fomalhaut A and Fomalhaut B in October. Fomalhaut is therefore a triple, or trinary, star system. The small red dwarf may be the proverbial runt of the Fomalhaut stellar litter, but it appears to share some common ground with its larger sibling.

“It’s very rare to find two comet belts in one system, and with the two stars 2.5 light years apart this is one of the most widely separated star systems we know of,” said astronomer Grant Kennedy, of the University of Cambridge and lead researcher of this work. “It made us wonder why both Fomalhaut A and C have comet belts, and whether the belts are related in some way.” Interestingly, Fomalhaut B doesn't appear to have such a belt.

Astronomers have noted that both cometary belts around Fomalhaut A and C are bright and elliptical, indicating that gravitational perturbations may be destabilizing comets’ orbits, causing collisions and close encounters with the central stars. As Comet ISON dramatically demonstrated last month, stellar near-misses can result in disintegration, flinging huge quantities of ice, dust and gas around star systems. Scale that activity up in a younger star system and you have huge debris belts like the ones found around Fomalhaut A and C.

So what’s causing this activity? The researchers point out that Fomalhaut A already has a known exoplanet mixing things up from deep inside the dusty ring. Also, gravitational interactions between Fomalhaut A and C could be a huge factor.

“We thought that the Fomalhaut A system was disturbed by a planet on the inside — but now it looks like a small star from the outside could also influence the system. A good test of this hypothesis is to measure (Fomalhaut C’s) exact orbit over the next few years,” said Paul Kalas, of the University of California.

Read more at Discovery News

Organic Matter Found in Ancient Meteorite Glass

Scientists have found organics from an ancient Earth swamp trapped inside glass created by a meteor impact almost a million years ago. The tiny pockets, only micrometers across, contain material such as cellulose and proteins. Though the impact glass was found on Earth, scientists say that similar samples could have been thrown into space by this or other blasts, allowing organics to be transported from one planet to another.

Approximately 800,000 years ago, a rock 100 to 160 feet (30 to 50 meters) across crashed down in western Tasmania, Australia. As it slammed into the Earth, temperatures exceeded 1,700 degrees Celsius (3,100 degrees Fahrenheit), melting rock and creating glass spherules, as well as a quarter-mile-wide hole known as the Darwin Crater.

"The reason the glass is so abundant seems likely to relate to the presence of volatiles like water at the surface when the impact occurred," lead author Kieren Howard of the City University of New York told Astrobiology by email.

"A bit like when water from your spatula drips into a frying pan, having the right amount of water at the surface during impact may have increased the magnitude of the explosion, and the production and dispersal of the melt."

In Tasmania, the land was covered by swamps and rainforest, offering sufficient water to create the glass. According to the authors, glass from the Darwin Crater is the most abundant and widely dispersed impact glass on Earth, relative to the crater's size, with glass scattered across 150 square miles (400 square kilometers). In fact, the widespread glass led to the discovery of the crater, which is now filled with younger sediments, in 1972.

Varying types of glass form from impacts, depending on the rocks lying at the surface. Darwin contains quartz-rich rocks that create white colors, though other rock mixes in to create different shades.

"The greater the proportion of shale molten to make the glass, the darker in color it gets — from white through light green, dark green, to black," Howard said.

The white glass blends in with quartzite samples in the area, making them a challenge to pick out. White Darwin glass fragments make up less than 3 percent of all finds. The authors suggest that, in less well-preserved fields, the tiny fraction of white glass could easily remain undiscovered.

Howard and his team weren't originally looking for organics. But when they examined the glass, they found surprising evidence of crystalline quartz.

"I went looking for crystals in the glass, only to discover the spherical inclusions," Howard said.

Inside the tiny crystal pockets, spheres up to 200 micrometers in diameter contained organics including cellulose, lignin, aliphatic biopolymer, and protein. The signature from the biomarkers suggested that fragments of peat were trapped in the molten glass, rapidly heating and degassing to create a frothy, bubble-like texture.

Trapped inside the glass, the organics would have been prevented from breaking down via oxidation. Howard's samples showed no signs of fossilization, indicating that such trapped organics could last as long as the glass around them.

"Providing the glass seal isn't broken, it's a good preservation method," said organic geochemist Stephen Bowden at the University of Aberdeen in Scotland. Bowden, who was not involved in the research, has previously studied fossil organic matter within impact rocks and its survival during atmospheric re-entry.

"[It's] like setting something in resin or amber — like a scorpion novelty in a glass paperweight."

Fossil organic matter -- fossil fuels -- has been found in rocks formed by meteorite impacts and in glasses from experimental collisions, but never before have organics been found with such a well-preserved biological character.

The research was published in the journal Nature Geoscience on November 10.

Finding organics inside of glass could have extensive ramifications. Though Howard found glass spheres that crashed back to Earth, other spheres could have been hurled into space, if their velocities were high enough.

"Survival in or transfer to ejecta is a 'big deal' because of the conditions of its formation — it's surprising — and the fact that it can leave Earth's atmosphere," Bowden said.

Organics launched off the planet could have traveled through space to seed other bodies, a possible mechanism for panspermia -- the hypothesis that life did not originate on Earth but traveled here from elsewhere in the universe.

Rocky bodies such as Mars, the Moon, and Titan could potentially have organics trapped after impacts hit their surfaces, though Bowden says that without further evidence, "this is still speculation." Even if ejecta from icy moons such as Jupiter's Europa formed around organics, it would be composed of ice, which would easily be breached once it melted.

Bowden went on to point out the difficulty of locating such finds on Earth.

"The volume of organic matter is small in comparison to the volumes of rock and sediment that are thrown up," he said. "After a few seconds in the hot ejecta, it's an even smaller volume of organic matter being diluted by a lot of rock and sediment."

But although the white glass only makes up a small percentage of the Darwin ejecta, Howard remained positive about the potential for similar discoveries of preserved organic biomarkers in other impact glasses and tektites here, and perhaps elsewhere.

"Impacts are the most common geologic process in the solar system. Mars is littered with craters and known to have impact glasses across its surface," Howard said.

"We've shown these glasses are potentially some of the most stable organic repositories imaginable, so yes, if looking for biomarker evidence of life on Mars — or any other planet — impact glasses are prospective targets."

Read more at Discovery News

Dec 17, 2013

Massive Stars Mark out Milky Way's 'Missing Arms'

A 12-year study of massive stars has reaffirmed that our Galaxy has four spiral arms, following years of debate sparked by images taken by NASA's Spitzer Space Telescope that only showed two arms.

The new research, which is published online in the Monthly Notices of the Royal Astronomical Society, is part of the RMS Survey, which was launched by academics at the University of Leeds.

Astronomers cannot see what our Galaxy, which is called the Milky Way, looks like because we are on the inside looking out. But they can deduce its shape by careful observation of its stars and their distances from us.

"The Milky Way is our galactic home and studying its structure gives us a unique opportunity to understand how a very typical spiral galaxy works in terms of where stars are born and why," said Professor Melvin Hoare, a member of the RMS Survey Team in the School of Physics & Astronomy at the University of Leeds and a co-author of the research paper.

In the 1950s astronomers used radio telescopes to map our Galaxy. Their observations focussed on clouds of gas in the Milky Way in which new stars are born, revealing four major arms. NASA's Spitzer Space Telescope, on the other hand, scoured the Galaxy for infrared light emitted by stars. It was announced in 2008 that Spitzer had detected about 110 million stars, but only found evidence of two spiral arms.

The astronomers behind the new study used several radio telescopes in Australia, the USA and China to individually observe about 1650 massive stars that had been identified by the RMS Survey. From their observations, the distances and luminosities of the massive stars were calculated, revealing a distribution across four spiral arms.

"It isn't a case of our results being right and those from Spitzer's data being wrong -- both surveys were looking for different things," said Professor Hoare. "Spitzer only sees much cooler, lower mass stars -- stars like our Sun -- which are much more numerous than the massive stars that we were targeting."

Massive stars are much less common than their lower mass counterparts because they only live for a short time -- about 10 million years. The shorter lifetimes of massive stars mean that they are only found in the arms in which they formed, which could explain the discrepancy in the number of galactic arms that different research teams have claimed.

"Lower mass stars live much longer than massive stars and rotate around our Galaxy many times, spreading out in the disc. The gravitational pull in the two stellar arms that Spitzer revealed is enough to pile up the majority of stars in those arms, but not in the other two," explains Professor Hoare. "However, the gas is compressed enough in all four arms to lead to massive star formation."

Read more at Science Daily

Biggest Spider Fossil Now Has Mate — But It's Complicated

A few years ago, scientists uncovered the largest-ever fossil of a spider: a female representative of a never-before-seen species that was buried in volcanic ash during the age of the dinosaurs.

Now the researchers say they have found an adult male spider to match, but the discovery complicates the original interpretation of the species. The scientists have proposed a new genus — Mongolarachne — to describe the extinct creature.

When researchers first found the female spider in northern China, they named it Nephila jurassica, putting it in the Nephila genus of golden silk orb-weavers, which still exist today and have been known to ensnare birds and bats in their huge wheel-shaped webs.

"It was so much like the modern golden orb weaver," said Paul Selden, a paleontologist with the University of Kansas. "We couldn't find any reason not to put it in the same genus of the modern ones."

With soft, squishy bodies, spiders don't typically turn up in the fossil record, but several hundred have been found in the volcanic deposits at the Daohugou fossil beds in Inner Mongolia, Selden said.

Volcanic ash is famous for preserving more ephemeral pieces of the past, from bodies buried in their death poses at Pompeii to 2.7-billion-year-old raindrop impressions found in South Africa. Researchers think these spiders were likely swept to the bottom of a sub-tropical lake and covered in fine ash after a volcano blew its lid.

Unlike insects, spiders are typically pretty good at staying away from water, Selden explained.

"It would take something like a volcanic eruption to blow them into the bottom of the lake and bury them," Selden told LiveScience. "That's the sort of scenario we imagine."

And in that volcanic rock layer at Daohugou, the researchers found another spider that looked remarkably similar to Nephila jurassica, except it was male. There were several clues in the newfound fossil, however, that suggest this ancient arachnid just doesn't fit the bill for Nephila.

First of all, the male was remarkably similar in size to the female, with a body measuring 0.65 inches (1.65 centimeters) long and a first leg stretching 2.29 inches (5.82 cm).

"This is rather strange," Selden said. "In the modern orb weavers, there is quite a lot of sexual dimorphism," with a huge female and a tiny male.

Compared with Nephila male spiders, this newfound fossilized male had more primitive-looking pedipalps — the sex appendages between a spider's jaws and first legs that it uses to transfer sperm to the female. And it had a more feathery hairstyle: The fossil was preserved so well that Selden could look at imprints of the spider's hair under an electron microscope. Instead of one or two scales along each bristle, Selden said he saw evidence that this spider had "spirals of hairlets" along the strands covering its body.

Read more at Discovery News

Did Rock Weathering Trigger 'Snowball Earth'?

A global ice age that lasted more than 50 million years may have been triggered by volcanic rocks trapping carbon dioxide that would otherwise warm the planet, researchers say in a new study detailed Dec. 16 in the journal Proceedings of the National Academy of Sciences.

Although ice is now found mostly in Earth's polar regions, analysis of ancient rocks suggests it could at times cover the entire globe. The causes of these "snowball Earth" periods remain mysterious; one episode 2.3 billion years ago may have been triggered by the widespread emergence of oxygen in the atmosphere, which destroyed the greenhouse gases keeping Earth warm.

For the new study, scientists focused on a snowball Earth period that began about 717 million years ago known as the Sturtian glaciation. This global ice age was preceded by more than 1 billion years without glaciers, making the Sturtian a transition from a longtime ice-free world to a snowball Earth, the most dramatic episode of climate change in the geological record.

The researchers noted the Sturtian broadly coincided with rifts tearing apart the ancient supercontinent Rodinia, as well as major volcanic activity in equatorial regions. This suggested the Sturtian might have its roots in tectonic activity.

The scientists investigated ancient rocks in the Mackenzie Mountains of northwest Canada known as glaciogenic diamictites, which are sedimentary rocks that are deposited by glaciers as they move over the Earth. They analyzed rocks both above and below these glaciogenic deposits to find out the deposits' ages.

"For me, this type of work combines the best parts of geology — fieldwork in remote and beautiful places, such as the Mackenzie Mountains of northern Canada, and working in a geochemistry lab," said study lead author Alan Rooney, a geologist at Harvard University. "The fieldwork is crucial to provide context for any data you may generate in the lab."

Specifically, the researchers analyzed levels of the elements rhenium and osmium within the sedimentary rocks bracketing the glaciogenic deposits. Rhenium breaks down via radioactive decay, generating osmium over time. By analyzing the ratios of rhenium and osmium isotopes within the rocks, the investigators could determine their age. (Isotopes are different forms of elements, where the atoms have different numbers of neutrons in their nuclei.)
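The rhenium-osmium clock described above can be sketched in a few lines. This is a minimal illustration of the principle, not the study's actual calculation: the decay constant is the commonly cited value for rhenium-187 (about 1.666e-11 per year), while the function name and the sample isotope ratios are made up for the example.

```python
import math

# Commonly cited decay constant of 187Re, per year (illustrative precision).
LAMBDA_RE187 = 1.666e-11

def re_os_age(os_measured, os_initial, re_os):
    """Invert the isochron equation
        (187Os/188Os)_measured =
            (187Os/188Os)_initial + (187Re/188Os) * (exp(lambda * t) - 1)
    to recover the age t in years."""
    ingrowth = (os_measured - os_initial) / re_os
    return math.log(1.0 + ingrowth) / LAMBDA_RE187

# Round trip with made-up ratios and a Sturtian-scale age of 717 million years:
t = 717e6
os_measured = 0.5 + 100.0 * (math.exp(LAMBDA_RE187 * t) - 1.0)
print(round(re_os_age(os_measured, 0.5, 100.0) / 1e6))  # prints 717
```

In practice, geochemists measure many samples and fit an isochron line through them, which yields both the age and the initial osmium ratio rather than assuming one.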

The scientists found the Sturtian lasted about 55 million years.

"The most surprising aspect of these results is the duration of this glacial epoch," Rooney told LiveScience's OurAmazingPlanet.

The researchers also investigated osmium and strontium isotopes within rocks before, during and after the Sturtian. The levels of various isotopes in rocks depend on whether or not they came from radioactive sources such as eroded volcanic rock.

Read more at Discovery News

Pacific Coral Changed Rapidly After 1850

A mysterious change in the food web of the Pacific Ocean started in the mid-19th century, and the skeletons of deep-sea coral tell the tale.

Hundreds, even thousands, of feet beneath the ocean surface, deep-sea corals live for centuries. As they grow, the tiny creatures collect a chemical record of what they eat. Marine scientists recently constructed a 1,000-year-long history of North Pacific corals’ cuisine by analyzing the nitrogen trapped in the coral skeletons.

The changing levels of different types of nitrogen, called isotopes, revealed information about the conditions in the ecosystem of the North Pacific subtropical gyre, a 20 million square kilometer counterclockwise circulation of the ocean’s waters.

For most of the past millennium, the nitrogen in the Pacific Ocean food chain came from dissolved nitrate rising from deeper in the sea. However, abruptly 150 years ago, the coral recorded a dramatic change in the source of nitrogen entering the marine food chain. Since approximately 1850, more of the chemical has been coming from microorganisms that transform nitrogen, similarly to how beans and other legumes fix nitrogen on land. Over that period, the proportion of nitrogen from these microorganisms increased by 17 to 27 percent.

“In comparison to other transitions in the paleoceanographic record, it’s gigantic,” said lead author Owen Sherwood of the University of Colorado, Boulder, in a press release. The study was published Dec. 15 in the journal Nature.

The cause of the food chain change may have to do with an expansion and warming of the North Pacific subtropical gyre itself. Marine scientists have also observed the gyre changing again over the past few decades.

Read more at Discovery News

Dec 16, 2013

Hand Fossil Turns Back Clock on Complex Tool Use

The discovery of a 1.4-million-year-old hand-bone fossil reveals that the modern human ability to make and use complex tools may have originated far earlier than scientists previously thought, researchers say.

A critical trait that distinguishes modern humans from all other species alive today is the ability to make complex tools. It's not just the extraordinarily powerful human brain, but also the human hand, that gives humans this unique ability. In contrast, apes — humans' closest living relatives — lack a powerful and precise enough grip to create and use complex tools effectively.

A key anatomical feature of the modern human hand is the third metacarpal, a bone in the palm that connects the middle finger to the wrist.

"There's a little projection of bone in the third metacarpal known as a "styloid process" that we need for tools," said study lead author Carol Ward, an anatomist and paleoanthropologist at the University of Missouri."This tiny bit of bone in the palm of the hand helps the metacarpal lock into the wrist, helping the thumb and fingers apply greater amounts of pressure to the wrist and palm. It's part of a whole complex of features that allows us the dexterity and strength to make and use complex tools."

Until now, this styloid process was found only in modern humans, Neanderthals and other archaic humans. Scientists were unsure when this bone first appeared during the course of human evolution. (The human lineage, the genus Homo, first evolved about 2.5 million years ago in Africa.)

"We had thought the modern human hand was something relatively recent, maybe something that appeared as a recent addition near the origin of our species," Ward told LiveScience.

Now, researchers have discovered a fossil almost 1.5 million years old that possesses this vital anatomical feature, meaning it existed more than 500,000 years earlier than it was previously known to have existed.

"This suggests this feature might be fundamental to the origin of the genus Homo," Ward said.

The scientists discovered a third metacarpal bone in northern Kenya, west of Lake Turkana. The fossil was found near the sites where the earliest Acheulean tools — named for St. Acheul in France where tools from this culture were first discovered in 1847 — were unearthed. The Acheulean artifacts were the first known complex stone tools, rough hand axes and cleavers that first appeared some 1.8 million years ago.

"It's an arid badlands desert area now," Ward said. "There's not much vegetation to cover up fossils — there's cobble and rock everywhere, and we try and find fossils by going out and looking under all that cobble and rock on the surface."

The hand-bone fossil is about 1.42 million years old. The researchers suspect it belonged to the extinct human species Homo erectus, the earliest undisputed predecessor of modern humans.

"Back then, this area was an open woodland area much more lush than today, probably with some trees and some areas of grassland," Ward said. "The fossil was found near a winding river, which often deposits things like fossils."

By revealing that the early human lineage had a modern handlike anatomy, the fossil "suggests this feature may have been a pre-adaptation that helped set the stage for all the technology that came later," Ward said.

Intriguingly, "at this time, in addition to early members of Homo, there were some late-surviving members of Australopithecus still around — close relatives of humans that don't seem to have this adaptation," Ward said. "This raises the question of how important our hands were in the success of our lineage and the extinction of their lineage (Australopithecus)."

Read more at Discovery News

Neanderthals Buried Their Dead

Neanderthals buried their dead, concludes a 13-year study of a former Neanderthal stomping ground in southwestern France.

It’s possible, then, that our species wasn’t the first on the human family tree to bury its own. At the very least, the discovery adds to the growing evidence that Neanderthals weren’t stupid. They might have been brawny, but they had big brains to match.

“This discovery not only confirms the existence of Neanderthal burials in Western Europe, but also reveals a relatively sophisticated cognitive capacity to produce them,” William Rendu, the study’s lead author and a researcher at the Center for International Research in the Humanities and Social Sciences, said in a press release.

The findings are published in the latest issue of the Proceedings of the National Academy of Sciences.

Rendu and his team excavated caves at the French site, called La Chapelle-aux-Saints. Neanderthal remains, belonging to both adults and children, were found. Bones of bison and reindeer were also unearthed.

In contrast to the reindeer and bison remains at the site, the Neanderthal remains contained few cracks, no weathering-related smoothing, and no signs of disturbance by animals.

“The relatively pristine nature of these 50,000-year-old remains implies that they were covered soon after death, strongly supporting our conclusion that Neanderthals in this part of Europe took steps to bury their dead,” observed Rendu. “While we cannot know if this practice was part of a ritual or merely pragmatic, the discovery reduces the behavioral distance between them and us.”

Further supporting Rendu and colleagues’ theory, geological analysis of the depression in which the Neanderthal remains were found suggests that it was not a natural feature of the cave floor.

The study opens up a lot of questions:

Who buried their dead first, Neanderthals or Homo sapiens? (Or maybe even some other ancient human ancestor was the first to bury its dead.)

Why did any human first decide to bury the body of another? There are practical reasons for doing so, but the question still remains unanswered.

Read more at Discovery News