Feb 27, 2015

The Legless Amphibian That Eats Its Mother’s Skin

That’s no earthworm—it’s a vertebrate, specifically a kind of amphibian called a caecilian. But you can call it an earthworm if you want. I’m not gonna stop you.
Perhaps the most contentious title out there is “World’s Greatest Mom.” My mother probably thinks she is, but so too does her mother, and I think they could both make pretty strong cases. But the sacrifices that human moms make pale in comparison to what’s going on in nature. There’s a bug, for instance, whose young devour their mother from the inside out. And one species of deep-sea octopus looks after her eggs for an incredible four and a half years, then perishes.

One amphibian, though, goes about things a little more creepily: A caecilian (pronounced suh-SILL-ee-in) momma lets her kids eat her skin. Like, a lot of it. They’ve even got specialized baby teeth to more efficiently strip her skin away.

That alone is enough to win the caecilian a spot in this column, but pretty much everything about the caecilian is goofy. First of all, they have no legs, not even vestigial traces of limbs (they look just like that giant space slug that almost ate the Millennium Falcon in Star Wars—in fact I’d be surprised if the monster wasn’t inspired by caecilians). They reproduce in pretty much every way imaginable. Et cetera, et cetera.

The 200 or so species of caecilians range from just a few inches long to over 3 feet, and they live in tropical habitats all over the world, rummaging around in the leaf litter or burrowing or even taking to the water. Many species have tiny eyes, or their eye sockets are covered with bone, since peepers don’t do you no good nohow when you’re underground. It makes more sense for the structure to atrophy away over evolutionary time—an eye that isn’t there can’t get infected. Same goes for the legs. It takes a whole lot of valuable time and energy and resources to build those things.

Instead of eyes, caecilians sense their world with “a so-called tentacle, which is usually in front of the eye, but behind the nostrils,” says zoologist Thomas Kleinteich of Germany’s Kiel University. “So this is chemosensory reception. They can actually smell their environment and use this when they are burrowing in the ground.”

When it comes to sex, caecilians have opted for some serious diversification. Some species lay eggs, while others give birth to live young, while still other aquatic varieties go through a larval stage. That’s pretty impressive when you consider that there’s only 200 species of caecilians. Compare this to the thousands upon thousands of species of frog, which typically go with the lay-eggs-add-sperm-develop-into-a-tadpole strategy, though there are a few exceptions.

As far as looking after the young goes—and caecilians all seem to do this to some degree—they have a couple of options for feeding. If you’re laying big eggs, you can pack a lot of yolk in there for them to feed on, then stand guard as they develop. Or…well, this brings us back to the skin thing. “What some species do is actually they have less yolk in the eggs,” says Kleinteich, “and then the juveniles hatch at a premature stage. They don’t actually catch prey, so the first thing they eat is, they scrape off the skin of the females.”

And here you were thinking I was a dum-dum for suggesting the space worm from Star Wars had to have been inspired by the caecilians. You can keep thinking I’m a dum-dum for plenty of other reasons, though.
Other scientists described this approach rather dramatically in a 2008 paper. “Feeding behaviour is quite frenetic with the young frequently tearing pieces of skin by spinning along their long axes and sometimes struggling over the same piece of skin. The mother remains calm during this activity. When the mother has been peeled, the young continue to search for and eat fragments of skin on the substrate.”

The female caecilian’s skin is uniquely adapted to handle being a smorgasbord. Breeding females’ skin is packed with energy-rich lipids (fatty acids), and is constantly replaced so the juveniles have a steady source of food. The frantic feeding event takes only 7 minutes, but just a day and a half later, the skin has regrown and the young can chow down once again. Adorably, juvenile caecilians have baby teeth just like us, only theirs look a bit like shovels—shovels that dig out their mother’s skin.

Not to be outdone, in species where the young develop within their mother, they’ll…gnaw on her uterus. This “stimulates the aggregation of what’s called uterine milk,” says Kleinteich. “Basically, the female keeps producing uterus epithelial cells, which are then scraped off by the developing fetuses in the uterus.” Charming.

This Ichthyophis sp. is one weird-looking bumblebee.
As caecilians mature, though, they lose those baby teeth and grow extremely sharp, conical chompers and become opportunistic hunters. Small varieties probably snap up insects and earthworms (scientists don’t know much about their diets), while bigger species can tackle fish and other amphibians. They’re not the speediest of predators, but they’re voracious.

Caecilians themselves look quite tasty, on account of essentially being squirming tubes of flesh. So some of them have gone and evolved toxic mucus on their skin. And in case you were considering getting one for a roommate, some are apparently so toxic that they’ve reportedly killed other creatures they’ve shared tanks with. “There’s glands in the skin of course for mucus production to keep the skin wet,” says Kleinteich, “but there’s also glands in the skin for toxins and antibiotics to keep bacteria and fungi away.”

Read more at Wired Science

Scientists Are Wrong All the Time, and That’s Fantastic

On February 28, 1998, the eminent medical journal The Lancet published an observational study of 12 children: Ileal-lymphoid-nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children. It might not sound sexy, but once the media read beyond the title, into the study’s descriptions of how those nasty-sounding symptoms appeared just after the kids got vaccinated, the impact was clear: The measles-mumps-rubella vaccine can cause autism.

This was the famous study by Andrew Wakefield, the one that many credit with launching the current hyper-virulent form of anti-vaccination sentiment. Wakefield is maybe the most prominent modern scientist who got it wrong—majorly wrong, dangerously wrong, barred-from-medical-practice wrong.

But scientists are wrong all the time, in far more innocuous ways. And that’s OK. In fact, it’s great.

When a researcher gets proved wrong, that means the scientific method is working. Scientists make progress by re-doing each other’s experiments—replicating them to see if they can get the same result. More often than not, they can’t. “Failure to reproduce is a good thing,” says Ivan Oransky, co-founder of Retraction Watch. “It happens a lot more than we know about.” That could be because the research was outright fraudulent, like Wakefield’s. But there are plenty of other ways to get a bum result—as the Public Library of Science’s new collection of negative results, launched this week, will highlight in excruciating detail.

You might have a particularly loosey-goosey postdoc doing your pipetting. You might have picked a weird patient population that shows a one-time spike in drug efficacy. Or you might have just gotten a weird statistical fluke. No matter how an experiment got screwed up, “negative results can be extremely exciting and useful—sometimes even more useful than positive results,” says John Ioannidis, a biologist at Stanford who published a now-famous paper suggesting that most scientific studies are wrong.

The problem with science isn’t that scientists can be wrong: It’s that when they’re proven wrong, it’s way too hard for people to find out.

Negative results, like the one that definitively refuted Wakefield’s paper, don’t make the news. Fun game: Bet you can’t name the lead author of that paper. (It’s okay, neither could we. But keep reading to find out!) It’s way easier for journalists to write a splashy headline about a provocative new discovery (guilty) than a glum dismissal of yet another hypothesis, and scientific journals play into that bias all the time as they pick studies to publish.

“All of the incentives in science are aligned against publishing negative results or failures to replicate,” says Oransky. Scientists feel pressure to produce exciting results because that’s what big-name journals want—it doesn’t look great for the covers of Science and Nature to scream “Whoops, we were wrong!”—and scientists desperately need those high-profile publications to secure funding and tenure. “People are forced to claim significance, or something new, extravagant, unusual, and positive,” says Ioannidis.

Plus, scientists don’t like to step on each other’s toes. “They feel a lot of pressure not to contradict each other,” says Elizabeth Iorns, the CEO of Science Exchange. “There’s a lot of evidence that if you do that, it’ll be negative for your career.”

When the politics of scientific publishing prevent negative results from getting out there, science can’t advance, and potentially dangerous errors—whether due to fraud or an honest mistake—go unchecked. Which is why lots of scientific publications, including PLOS, have recently begun to emphasize reproducibility and negative results.

Big-name journals have said they want to make data more transparent and accessible, so scientists can easily repeat analyses. Others, like the Journal of Negative Results in BioMedicine, are devoted to publishing only negative results. PLOS ONE’s collection of negative, null, and inconclusive papers, called The Missing Pieces, is now putting the spotlight on papers that contradict previous findings. PLOS thought—and we agree—it’s time to give them the attention they deserve. Negative results, step up:

Vaccines and Autism. Wakefield’s 1998 study reported a possible link between the measles-mumps-rubella vaccine and the onset of autism in children with gastrointestinal problems. More than 20 studies have since ruled out any connection, but they didn’t focus on children with gastrointestinal problems. So in 2008, researchers led by Mady Hornig conducted a case-control study that did. Again, they found no evidence linking the vaccine with autism.

Psychic Ability. In 2011, Daryl Bem, a psychologist at Cornell, conducted nine experiments that seemed to suggest people could be psychic. Extraordinary claims require extraordinary evidence, so researchers replicated one of the experiments three times in 2012. As the newer paper states, “all three replication attempts failed to produce significant effects and thus do not support the existence of psychic ability.” Bummer.

Priming and Performance. In a highly cited study from 2001, John Bargh, a psychologist at Yale, found that people who were exposed to words like “strive” or “attain” did better on a cognitive task. Researchers did two experiments in 2013 to reproduce the original findings. They could not.

Read more at Wired Science

The Science of Why No One Agrees on the Color of This Dress

The original image is in the middle. At left, white-balanced as if the dress is white-gold. At right, white-balanced to blue-black.
Not since Monica Lewinsky was a White House intern has one blue dress been the source of so much consternation.

(And yes, it’s blue.)

The fact that a single image could polarize the entire Internet into two aggressive camps is, let’s face it, just another Thursday. But for the past half-day, people across social media have been arguing about whether a picture depicts a perfectly nice bodycon dress as blue with black lace fringe or white with gold lace fringe. And neither side will budge. This fight is about more than just social media—it’s about primal biology and the way human eyes and brains have evolved to see color in a sunlit world.

Light enters the eye through the lens—different wavelengths corresponding to different colors. The light hits the retina in the back of the eye where pigments fire up neural connections to the visual cortex, the part of the brain that processes those signals into an image. Critically, though, that first burst of light is made of whatever wavelengths are illuminating the world, reflecting off whatever you’re looking at. Without you having to worry about it, your brain figures out what color light is bouncing off the thing your eyes are looking at, and essentially subtracts that color from the “real” color of the object. “Our visual system is supposed to throw away information about the illuminant and extract information about the actual reflectance,” says Jay Neitz, a neuroscientist at the University of Washington. “But I’ve studied individual differences in color vision for 30 years, and this is one of the biggest individual differences I’ve ever seen.” (Neitz sees white-and-gold.)

Usually that system works just fine. This image, though, hits some kind of perceptual boundary. That might be because of how people are wired. Human beings evolved to see in daylight, but daylight changes color. That chromatic axis varies from the pinkish red of dawn, up through the blue-white of noontime, and then back down to reddish twilight. “What’s happening here is your visual system is looking at this thing, and you’re trying to discount the chromatic bias of the daylight axis,” says Bevil Conway, a neuroscientist who studies color and vision at Wellesley College. “So people either discount the blue side, in which case they end up seeing white and gold, or discount the gold side, in which case they end up with blue and black.” (Conway sees blue and orange, somehow.)

We asked our ace photo and design team to do a little work with the image in Photoshop, to uncover the actual red-green-blue composition of a few pixels. That, we figured, would answer the question definitively. And it came close.

In the image as presented on, say, BuzzFeed, Photoshop tells us that the places some people see as blue do indeed track as blue. But…that probably has more to do with the background than the actual color. “Look at your RGB values. R 93, G 76, B 50. If you just looked at those numbers and tried to predict what color that was, what would you say?” Conway asks.

So…kind of orange-y?

“Right,” says Conway. “But you’re doing this very bad trick, which is projecting those patches on a white background. Show that same patch on a neutral black background and I bet it would appear orange.” He ran it through Photoshop, too, and now figures that the dress is actually blue and orange.

The point is, your brain tries to interpolate a kind of color context for the image, and then spits out an answer for the color of the dress. Even Neitz, with his weird white-and-gold thing, admits that the dress is probably blue. “I actually printed the picture out,” he says. “Then I cut a little piece out and looked at it, and completely out of context it’s about halfway in between, not this dark blue color. My brain attributes the blue to the illuminant. Other people attribute it to the dress.”

Even WIRED’s own photo team—driven briefly into existential spasms of despair by how many of them saw a white-and-gold dress—eventually came around to the contextual, color-constancy explanation. “I initially thought it was white and gold,” says Neil Harris, our senior photo editor. “When I attempted to white-balance the image based on that idea, though, it didn’t make any sense.” He saw blue in the highlights, telling him that the white he was seeing was blue, and the gold was black. And when Harris reversed the process, balancing to the darkest pixel in the image, the dress popped blue and black. “It became clear that the appropriate point in the image to balance from is the black point,” Harris says.
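Harris’s trick is easy to approximate at home: pick a pixel you believe should be neutral, then scale each color channel so that pixel actually comes out gray. Here’s a minimal Python sketch of that idea, using NumPy and Pillow; the file name and pixel coordinates are stand-ins for illustration, not details from WIRED’s actual workflow.

```python
import numpy as np
from PIL import Image

def balance_to_pixel(img, x, y):
    """Scale the R, G, B channels so the pixel at (x, y) comes out neutral gray.

    This is a crude per-channel white balance: whatever color cast the
    reference pixel has gets divided out of the whole image, which is why
    anchoring on a "white" highlight versus the darkest black point makes
    the dress swing between white-gold and blue-black.
    """
    arr = np.asarray(img, dtype=np.float64)
    ref = arr[y, x]                            # the pixel assumed to be neutral
    gains = ref.mean() / np.maximum(ref, 1.0)  # per-channel correction factors
    return Image.fromarray(np.clip(arr * gains, 0, 255).astype(np.uint8))

# Hypothetical usage -- coordinates depend on where the highlight or black point sits:
# img = Image.open("dress.jpg").convert("RGB")
# balance_to_pixel(img, 120, 40).save("balanced.jpg")
```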

Read more at Wired Science

Leonard Nimoy, Spock of 'Star Trek,' Dies at 83

Leonard Nimoy has died at the age of 83 at his Los Angeles home. In a statement, his wife, Susan Bay Nimoy, said that he had sadly succumbed to end-stage chronic obstructive pulmonary disease, an illness he had been battling for the past year.

Famed for his portrayal of the half-human, half-Vulcan Spock in the original “Star Trek” series and the Star Trek movies, Nimoy was an inspiration to millions, building a huge fan base over the years. He was also a world-renowned director, photographer, poet and singer.

But of all his artistic roles, I will forever remember him as the logical and often confounding Spock. His starship adventures with Captain Kirk (William Shatner) and the crew of the Starship Enterprise underscored my childhood, and with his appearances in eight of the Star Trek movies — from 1979’s “Star Trek: The Motion Picture” to 2013’s “Star Trek Into Darkness” — I’ve realized that Nimoy’s science fiction work has been with me all my life.

For me, I will always remember his stellar turn in “The Search for Spock,” the second of a three-movie arc of Star Trek films, and one that Nimoy also directed. In the movie, the crew of the Starship Enterprise, having mourned the loss of Spock and other crewmembers at the hands of Kirk’s nemesis, Khan, embarks on a voyage to the experiment-gone-wrong planet Genesis to retrieve Spock’s body. The Genesis device had resurrected Spock.

According to the New York Times, Nimoy admitted to developing a “mystical identification” with his character Spock, being the lone alien on the starship’s bridge. But he also expressed ambivalence about forever being attached to his most famous creation, writing, in his 1977 autobiography, “In Spock, I finally found the best of both worlds: to be widely accepted in public approval and yet be able to continue to play the insulated alien through the Vulcan character.”

Due to Star Trek’s immense popularity in the late 1970s, science fiction received a huge dose of science fact when NASA named the first orbiter of its brand-new Space Shuttle fleet “Enterprise.” During a famous photo shoot in front of the atmospheric test vehicle, the cast of Star Trek, including Nimoy, DeForest Kelley (Dr. “Bones” McCoy), George Takei (Mr. Sulu), James Doohan (Chief Engineer Montgomery “Scotty” Scott) and Nichelle Nichols (Lt. Uhura), joined Star Trek creator Gene Roddenberry and NASA administrator James D. Fletcher at the Shuttle’s Palmdale, Calif., manufacturing facilities.

Thirty-five years later, in April 2012, Nimoy was in New York to welcome Space Shuttle Enterprise to the city after the shuttle fleet was retired in 2011.

“This is a reunion for me,” Nimoy said during a ceremony after Enterprise’s touchdown at John F. Kennedy International Airport. “Thirty-five years ago, I met the Enterprise for the first time.

“When this ship was first built, it was named Constitution,” Nimoy said. “‘Star Trek’ fans can be very persuasive. They sent a lot of letters to President Gerald Ford, and the president logically decided that the ship should be named after our spaceship Enterprise.”

Read more at Discovery News

Feb 26, 2015

Rogue Owl Terrorizing Dutch Town

The northern Dutch town of Purmerend has advised residents to arm themselves with an umbrella when going out at night after a mysterious spate of bloody rogue owl attacks.

Over the last three weeks, the European eagle owl has silently swooped on dozens of residents of the usually peaceful town, with many victims requiring hospital treatment.

The latest aerial assault on Tuesday evening saw two members of a local athletics club attacked, with one runner requiring stitches for six head wounds caused by the nocturnal bird of prey's talons.

The club has cancelled all training until further notice.

Residents and workers at Prinsenstichting home for the handicapped have been left terrified following at least 15 attacks, spokeswoman Liselotte de Bruijn told AFP.

"During the day there's no problem, but at night we now only venture outside armed with umbrellas, helmets and hats, anything really, to protect ourselves," said De Bruijn.

"The problem is that you don't hear the owl before it strikes. Its claws are razor-sharp," she said.

"We hope the city will soon catch this rogue bird."

Purmerend city council said it was trying to find a solution.

"We want to catch the owl as our city's residents are in danger," it said on its website, noting however that the European eagle owl is a protected species that requires special permission to be trapped.

"These procedures can still take some time. Meanwhile, we are advising people to stay away from the owl," the city said, telling night strollers in the area to shield themselves with umbrellas.

Gejo Wassink of the Netherlands' OWN owl foundation said the bird's behaviour was unusual.

"Either the owl was reared in captivity and released into the wild and now associates humans with food -- meaning it's not really 'attacking' people."

"Or it may have heightened hormone levels as the breeding season starts, which influences its behaviour and makes it defend its territory," Wassink told AFP, saying the bird "appears to be a female".

Read more at Discovery News

First Film of Surgery and Use of Anesthesia Identified

An 1899 film showing a rather gory surgical procedure has been confirmed as being the oldest known surviving film of a surgery as well as the oldest known film showing the use of anesthesia.

The film was uploaded to YouTube some time ago, where it was largely forgotten, but it’s gaining attention now due to new research that supports the film’s importance. The study, which has been accepted for publication in the Journal of Anesthesia History, notes that multiple film libraries in Europe have confirmed the film’s historical significance.

The movie was made by filmmaker Eugenio Py not long after the invention of motion pictures by the French brothers Auguste and Louis Lumière, according to author Adolfo Venturini, a professor in the Faculty of Medicine at the University of Buenos Aires.

Venturini, who is also director of the Museum and Historical Library of the Association of Anesthesia, Analgesia and Reanimation of Buenos Aires, explained that the 1899 film shows a male patient having a lung cyst removed. The procedure took place at the old Hospital de Clínicas in Buenos Aires.

Argentine surgeon Alejandro Posadas (1870–1902) performed the surgery, assisted by medical students. One of these assistants, Rodolfo Santiago Roccatagliata, administered anesthesia by sporadically dropping it from a bottle into a mask placed over the patient’s face.

“It is likely that anesthesia was performed with chloroform and (a) hand mask ‘chloroform cone’,” Venturini said. “This handmade mask was widely used in the countries of the New World. At this time period in the Americas, young surgical house officers and medical students were expected to be able to fold their own cones, from a variety of textiles, around a stiff paper or cardboard conical shell.”

Read more at Discovery News

See How Sahara Dust Jets to the Amazon in 3-D

The Amazon rainforest exists in part due to an atmospheric pipeline of dust from the Sahara Desert. And if that pipeline were to dry up or be diverted, massive biological changes could occur across the jungle.

New research published on Tuesday in the journal Geophysical Research Letters uses satellite data to create the first three-dimensional look at how dust makes its way across the Atlantic. The findings provide researchers with another clue about how the fate of one of the wettest places on the planet is tied to that of one of the driest.

Winds whipping across the desert and surrounding semi-arid areas kick dust high into the atmosphere for the start of a 6,000-mile trip to the Amazon basin every year. The new research uses recent satellite data covering the period from 2007 to 2013 to show just how massive the dust plume is.

The biggest pulses of dust come in winter and spring, when an estimated 182 teragrams of dust make the Atlantic crossing. Teragrams not your thing? That’s 182 million tons, or the equivalent of 498 Empire State Buildings, in the estimation of Hongbin Yu, lead author of the new study. Of that, nearly 28 million tons (or 75 Empire State Buildings, if you will) land in the Amazon.
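The unit math is easy to check yourself: a teragram is a million metric tons, and the Empire State Building is commonly pegged at roughly 365,000 tons (that building weight is an assumption used for the comparison here, not a figure from the study). A quick Python sanity check:

```python
TONS_PER_TERAGRAM = 1_000_000   # 1 Tg = 1e12 g = 1 million metric tons
ESB_TONS = 365_000              # commonly cited Empire State Building weight (assumed)

crossing_tg = 182    # dust making the Atlantic crossing each year, per the study
amazon_tg = 27.7     # dust deposited over the Amazon basin

print(crossing_tg * TONS_PER_TERAGRAM / ESB_TONS)  # ~498.6 -- the 498 in the text
print(amazon_tg * TONS_PER_TERAGRAM / ESB_TONS)    # ~75.9 -- the 75 in the text
```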

“There have been studies linking dust with the Amazon basin, but how much dust is transported is unknown. It’s been unknown for awhile. You can use a model but models have large uncertainties,” Yu, an atmospheric scientist with NASA’s Goddard Space Flight Center in Maryland, said.

By observing multiple years, Yu’s results show the year-to-year variability in dust transport. In particular, they reveal that when the semi-arid region south of the Sahara, known as the Sahel, has an above normal rainy season during the summer, winter and spring dust transport tends to be lower.

Some research has shown that climate change could mean a wetter Sahel. That could have a downwind effect on the Amazon, which itself is projected to dry out as the world warms.

Kátia Fernandes, a research scientist at the International Research Institute for Climate and Society, said that this aspect of the study showed one of the main advantages of looking at multiple years in this context. She added that the study was “an important addition to the collection of satellite information available for climate and ecosystem studies.”

Why so much ado about dust in the wind? Because it contains phosphorus, a crucial nutrient that plants need to grow. Amazon soils run a phosphorus deficit as high as 90 percent, with rainfall and rivers washing it out to sea regularly.

Decomposing leaves and plants help recycle some of the phosphorus already in the Amazon, but dust provides a key outside source of the nutrient.

“In the long-term, if you don’t have this then it will keep losing phosphorus. Biodiversity could change or we could see other plant adaptation mechanisms,” Yu said.

Dust and other small particles known as aerosols can also influence weather around the globe, including the Atlantic hurricane season, and can speed ice melt in Greenland.

The data to create the new 3-D view of dust comes from NASA’s CALIPSO satellite. The satellite has lidar, a laser-based technology, aboard to provide a cross section view of clouds, dust and other particles that get stirred up in the atmosphere.

Circling the globe provides the third dimension to give scientists a view of Sahara dust and its transport to the Amazon basin.

Read more at Discovery News

'Big Brain' Gene Found in Humans, Not Chimps

A single gene may have paved the way for the rise of human intelligence by dramatically increasing the number of brain cells found in a key brain region.

This gene seems to be uniquely human: It is found in modern-day humans, Neanderthals and another branch of extinct humans called Denisovans, but not in chimpanzees.

By allowing the brain region called the neocortex to contain many more neurons, the tiny snippet of DNA may have laid the foundation for the human brain's massive expansion.

"It is so cool that one tiny gene alone may suffice to affect the phenotype of the stem cells, which contributed the most to the expansion of the neocortex," said study lead author Marta Florio, a doctoral candidate in molecular and cellular biology and genetics at the Max Planck Institute of Molecular Cell Biology and Genetics in Dresden, Germany. Still, it's likely this gene is just one of many genetic changes that make human cognition special, Florio said.

An expanding brain

The evolution from primitive apes to humans with complex language and culture has taken millions of years. Some 3.8 million years ago, Australopithecus afarensis, the species typified by the iconic early human ancestor fossil Lucy, had a brain that was less than 30 cubic inches (500 cubic centimeters) in volume, or about a third the size of the modern human brain. By about 1.8 million years ago, Homo erectus was equipped with a brain that was roughly twice as big as that of Australopithecus. H. erectus also showed evidence of tool and fire use and more complex social groups.

Once anatomically modern humans, and their lost cousins the Neanderthals and Denisovans, arrived on the scene, the brain had expanded to roughly 85 cubic inches (1.4 liters) in volume. Most of this growth occurred in a brain region called the neocortex.

"The neocortex is so interesting because that's the seat of cognitive abilities, which, in a way, make us human — like language and logical thinking," Florio told Live Science.

The neocortex is so large because it is jam-packed with neurons, or brain cells. But what genetic changes ushered in this explosion of neurons?

Single gene

To answer that question, Florio, along with her thesis advisor, Dr. Wieland Huttner, a neurobiologist also at the Max Planck Institute, was studying one type of neural progenitor cell, a stem cell that divides and then forms brain cells during embryonic development. In mice, these cells divide once, and then make neurons. But in humans, these same types of cells divide many times over before forming a huge number of neurons.

Florio isolated this pool of cells, and then analyzed the genes that were turned on in both mice and humans at a stage of peak brain development. (The researchers looked at this process in both 13-week gestation human fetuses whose tissue had been donated by women after abortions and in mice at 14 days gestation.)

The researchers found that a particular gene, called ARHGAP11B, was turned on and highly activated in the human neural progenitor cells, but wasn't present at all in mouse cells. This tiny snippet of DNA, just 804 letters, or bases, long, was once part of a much longer gene, but somehow this fragment was duplicated and the duplicated fragment was inserted into the human genome.

Then the team inserted and expressed (turned on) this DNA snippet in the brains of mice. Though mice normally have a tiny, smooth neocortex, the mice with the gene insertion grew what looked like larger neocortices; these amped-up brain regions contained loads of neurons and some even began forming the characteristic folds, or convolutions, found in the human brain, a geometry that packs a lot of dense brain tissue into a small amount of space. (The researchers did not check to see if the mice actually got smarter, though that is a potential avenue of future research, Florio said).

Read more at Discovery News

Heads Up! Human Head Transplant in the Works

An Italian doctor is moving forward with plans to transplant a human head to another body, reported New Scientist.

Sergio Canavero, from the Turin Advanced Neuromodulation Group in Italy, told New Scientist that he will announce his project officially in June at the American Academy of Neurological and Orthopaedic Surgeons’ annual meeting.

Canavero has been talking about moving the head of one person to the body of another since 2013 and has presumably been thinking about it, and its medical and ethical hurdles, far longer.

The first successful head transplant took place in 1970, when the head of a monkey was attached to the body of another by Robert White at Case Western Reserve University School of Medicine, New Scientist said.

The monkey lived for nine days, until the body rejected the head. Because the spinal column wasn’t attached, though, it couldn’t move. But times have changed, Canavero said.

“I think we are now at a point when the technical aspects are all feasible,” Canavero told New Scientist.

Canavero’s biggest challenge would be to fuse the two spinal cords. He would rely on a substance called polyethylene glycol to do that. The chemical is known to “encourage the fat in cell membranes to mesh,” said New Scientist. Cleanly cutting the cords of both bodies would be key to success, he said.

The new person would then be held in a coma for four months to encourage healing, Canavero said. The candidate could be up and walking after about a year of physiotherapy, he said.

Read more at Discovery News

Feb 25, 2015

Hippos Related to Whales, Fossil Reveals

An ancient relative of the hippopotamus likely swam from Asia to Africa some 35 million years ago, long before the arrival of the lion, rhino, zebra and giraffe, suggests a new study.

Analysis of the previously unknown, long-extinct animal also confirms that cetaceans -- the group to which whales, dolphins and porpoises belong -- are in fact the hippo's closest living cousins.

"The origins of the hippopotamus have been a mystery until now," says study co-author Fabrice Lihoreau, a palaeontologist at France's University of Montpellier.

"Now we can say that hippos came from anthracotheres" -- an extinct group of plant-eating, semi-aquatic mammals with even-toed hooves.

Until now, the oldest known fossil of a hippo ancestor dated from about 20 million years ago, while cetacean remains aged 53 million years have been found.

Scientists had long lumped hippos with the Suidae family of pigs based on palaeontological finds, but DNA later suggested they were the kin of whales instead.

Yet the huge age gap between hippos and cetaceans in the fossil record left experts stumped.

"It meant that either we have never found ancestors of hippos, or we didn't recognise them among the mammal fossils we already had," says Lihoreau.

Now the remains of a 28-million-year-old animal discovered in Kenya have provided an important piece of the puzzle, according to a study in the journal Nature Communications.

Named Epirigenys lokonensis (“epiri” means hippo in the Turkana language, and “lokonensis” refers to the Lokone discovery site), it was about the size of a sheep, weighing in at 100 kilograms, which is about a twentieth the size of today’s common hippopotamus, a sub-Saharan giant.

It may have spent much of its time immersed in water.

E. lokonensis was not a direct forefather of today's hippo, belonging instead to a side branch. But it lived much closer in time to the ancestor from which they both branched off, thus allowing for inferences to be drawn about the ancient animal.

Dental analysis led the team to conclude that E. lokonensis and the hippo both came from an anthracothere forefather, which migrated from Asia to Africa about 35 million years ago.

Read more at Discovery News