Sep 4, 2010
SOME 16 million years ago, north became south in a matter of years. Such fast flips are impossible according to models of the Earth's core, but this is now the second time that evidence of one has been found.
The magnetic poles swap every 300,000 years, a process that normally takes up to 5000 years. In 1995 an ancient lava flow with an unusual magnetic pattern was discovered in Oregon. It suggested that the field at the time was moving by 6 degrees a day - at least 10,000 times faster than usual. "Not many people believed it," says Scott Bogue of Occidental College in Los Angeles.
Now Bogue and his colleague Jonathan Glen of the United States Geological Survey in Menlo Park, California, say they have found a second example in Nevada. The lava rock suggests that in one year, Earth's magnetic field shifted by 53 degrees. At that rate, a full flip would take less than four years, but there could be another interpretation. "It may have been a burst of rapid acceleration that punctuated the steady movement of the field," says Bogue.
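As a quick sanity check on those figures (a back-of-the-envelope calculation, not from the study itself), the two reported rates can be compared directly:

```python
# At the Nevada rate of 53 degrees per year, how long would a full
# 180-degree pole reversal take?
nevada_deg_per_year = 53.0
full_flip_deg = 180.0
years_for_full_flip = full_flip_deg / nevada_deg_per_year
print(f"Full flip at Nevada rate: {years_for_full_flip:.1f} years")  # ~3.4 years

# The 1995 Oregon flow suggested 6 degrees per DAY -- far faster still
oregon_deg_per_year = 6.0 * 365
print(f"Oregon rate: {oregon_deg_per_year:.0f} degrees per year")
```

The 3.4-year result is where the article's "less than four years" comes from.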
Read more at New Scientist
It weighs in at more than 130 pounds, but the authoritative guide to the English language, the Oxford English Dictionary, may eventually slim down to nothing.
Oxford University Press, the publisher, said Sunday so many people prefer to look up words using its online product that it’s uncertain whether the 126-year-old dictionary’s next edition will be printed on paper at all.
The digital version of the Oxford English Dictionary now gets 2 million hits a month from subscribers, who pay $295 a year for the service in the U.S. In contrast, the current printed edition — a 20-volume, 750-pound ($1,165) set published in 1989 — has sold about 30,000 sets in total. It’s just one more sign that the speed and ease of using Internet reference sites — and their ability to be quickly updated — are phasing out printed reference books. Google and Wikipedia are much more popular research tools than the Encyclopaedia Britannica, and dozens of free online dictionaries offer word meanings at the click of a mouse. Dictionary.com even offers a free iPhone application.
By the time the lexicographers behind the century-old Oxford English Dictionary finish revising and updating its third edition — a gargantuan task that will take a decade or more — publishers doubt there will be a market for the printed form. “At present we are experiencing increasing demand for the online product,” a statement from the publisher said. “However, a print version will certainly be considered if there is sufficient demand at the time of publication.” Nigel Portwood, chief executive of Oxford University Press, told The Sunday Times in an interview that he didn’t think the newest edition would be printed. “The print dictionary market is just disappearing. It is falling away by tens of percent a year,” he said.
Read more at Google News
Sep 3, 2010
IT'S a big skull. No, wait, it's two people under an arch. Hold on, it's a skull again. Two very different images can be perceived in the trick picture Blossom and Decay. Now we are one step closer to working out how the brain spontaneously flips between such views, with the discovery of what may be the relevant brain region.
The precise neural mechanism that provokes the brain to switch its view of a scene is unknown, but it is thought to play a major role in perception by acting as a sort of reality check, says Ryota Kanai of University College London. "We need a trigger to prompt possible different interpretations so that we don't get stuck with a potentially incorrect interpretation of the world."
To find out which part of the brain might be involved, Kanai and colleagues asked 52 volunteers to watch a video of a revolving sphere and press a button whenever its rotation appeared to change direction. Crucially, the sphere was not actually changing direction; it could simply be perceived as rotating either way. The researchers recorded how long each perceived direction lasted and assigned an average "switch rate" to each volunteer.
The team then used structural magnetic resonance imaging to look for brain regions whose anatomy was linked to performance on this task. This pointed to the superior parietal lobes (SPL), two areas towards the back of the head known to control attention and process three-dimensional images. People whose cortex was thicker and better connected in this region had faster switch rates.
To test whether or not the SPL had a role in triggering the switch, the researchers stimulated each lobe with a magnetic field - effectively knocking out the function of that lobe - while the volunteers rewatched the sphere illusion. The team found that the switch rate slowed when either lobe was exposed to the magnetic field.
Read more at New Scientist
God did not create the universe, Stephen Hawking revealed yesterday. In the flurry of publicity preceding his new book, The Grand Design, to be published next week, he does some serious dissing of the Almighty, declaring him/her/it irrelevant. The point is, he says, that our universe followed inevitably from the laws of nature. But, we might ask, where did they come from?
It is perhaps a bit rich for Hawking to make God redundant after granting him/her/it a celebrity cameo at the end of his multi-million selling A Brief History of Time. In his famous conclusion to the book, Hawking wrote that if scientists could find the most fundamental laws of nature "then we should know the mind of God". To be fair, he was writing metaphorically – we all know what he meant.
He now suggests that the search for this particular Holy Grail is over, now that scientists have come up with a type of theory, known as M-theory, that may describe the behaviour of all the fundamental particles and forces, and even account for the very birth of the universe. If this theory is backed up by experiment, it might perhaps replace all religious accounts of creation – in Hawking's capacious mind, it already has.
The science-religion debate has been going on since science was born, centuries ago. Until relatively recently, it seemed to have quietened down, but now Hawking and others have brought it back into the limelight. It's striking that the scientists who contribute most vociferously to the arguments work in the fields of evolutionary biology and fundamental physics. These, at least superficially, appear to be the territories where science and religion can make conflicting claims, leading us to ask which has the better case. But are they alternatives? Is there really any serious argument between the two?
Science and religion are about fundamentally different things. No religion has ever been rendered obsolete by facts or observations, but this happens to most scientific theories, at least in the long run. Science advances over the wreckage of its theories by continually putting theoretical ideas to experimental test; no matter how beautiful a theoretical idea might be, it must be discarded if it is at odds with experiment. Like any other human activity, science has flaws and does not always flow smoothly, but no one can seriously doubt the progress it has made in helping us understand the world and in helping to underpin technology.
A useful characteristic of a scientific theory is that it must be possible, at least in principle, for experimenters to prove it wrong. Newton and Darwin, two of the greatest theoreticians, both set out ideas in this way, putting their heads on Nature's chopping block. In Newton's case, at least, his ideas have been superseded after proving inadequate in some circumstances. Unlike many religions, science has no final authority; the Royal Society, the UK academy of sciences, expresses this neatly in its motto "Take nobody's word for it".
No religion has ever been set out in terms of scientific statements. This is why scientists are able to mock the claims of religions but have never been able to deal a knock-out blow: in the end, a religious believer can always fall back on a faith that does not depend on verification.
The most famous atheist scientist of our times is the fearless Richard Dawkins, whose God Delusion set out to discredit religion once and for all. For him, it was Darwin's theory of evolution that dealt the fatal blow to religious belief. Powerful and eloquent though the book was, religion continues to flourish, and scientists (albeit a minority) continue to go to church, just as Galileo, Newton, Faraday and others did in the past. I suspect that none of them would have abandoned their respective faiths after reading Dawkins (admittedly, not a scientific statement). Religions will survive so long as they steer clear of making statements that can be shown to be factually wrong.
Read more at The Telegraph
Sep 2, 2010
People who sleep less than five hours a night are up to three times more likely to become mentally ill than those sleeping eight or nine hours, a new report has found.
People aged 17 to 24 sleep an average of eight to nine hours per night, but this figure has been falling as young people spend more time on electronic gadgets in their bedrooms.
Researchers from the George Institute for Global Health in Sydney, Australia, analysed the sleeping habits of almost 20,000 people aged between 17 and 24.
They found that over half of those who got fewer than six hours' sleep had high levels of psychological distress, compared with one quarter of those who slept eight to nine hours a night.
Professor Nick Glozier, who led the study, said: "Over the past few decades young adults have been sleeping fewer and fewer hours, whereas the rest of us have generally been sleeping more hours.
"There's a whole load of gadgets that kids and young adults now have in their bedrooms that they never used to have.
"Yet of course they have to get up and go to school or college or go to university at exactly the same time. So there's a group of them who are becoming more and more sleep-deprived."
A lack of sleep could have potentially serious effects, he said.
Read more at The Telegraph
Sep 1, 2010
Hot water discovered around a giant carbon star calls for a new theory of the chemistry around stars. The new theory could significantly alter our understanding of what materials exist in interstellar space, and of where water and life could exist in the universe.
“It makes us realize that the chemistry in all stars can be much more complex than we thought it was,” said astronomer Leen Decin of the Instituut voor Sterrenkunde in Belgium, lead author of the study published Sept. 2 in Nature. “If we don’t understand what is created from these old stars, we don’t know what the main ingredients of new stars and planets are made from.”
The star around which the water vapor was found is much like what our sun will be in 6 billion years: nearing the end of its life, expanding outward, and with more carbon than oxygen in its atmosphere. Water vapor wasn't expected around such a star, because it was thought that all the oxygen would be bound up in carbon monoxide, a stable molecule, and not available for making water molecules.
However, water vapor of unknown temperature was first discovered around this star in 2001. Astronomers proposed that the star had icy planets and comets that were vaporized as it expanded outward. If this theory were right, the water vapor would be far from the star's core, and cold.
With the launch of the Herschel satellite in 2009, it was finally possible to test the theory, because astronomers could collect information about the temperature of the water around the star. They found water vapor at a wide range of temperatures, which refutes the vaporized-comet theory: the water could only be hot if it were closer to the star than the comets and icy planets would have been.
The new explanation for the water is that high energy ultraviolet light from nearby hot stars is penetrating the atmosphere of the carbon star and breaking apart the carbon monoxide molecules. Breaking these molecules apart would release oxygen that could react with the abundant hydrogen to form water.
To explain how light could get through the dense matter the star sheds as it expands, the researchers propose that the star is less like a smooth sphere and more like a clumpy, irregular body.
Read more at Wired
One of paleontology’s most revered fossil sites now has a baby brother. Scientists have discovered a group of astonishing fossils high in the Canadian Rockies, just 40 kilometers from the famous Burgess Shale location.
A paper describing the find appears in the September issue of Geology.
Since its discovery in 1909, the Burgess Shale has yielded many thousands of fossils dating to 505 million years ago — a period often called “evolution’s big bang,” when animals were exploding into diverse body plans. These soft-bodied critters scurried around on the sea floor, then were buried in mudslides and exquisitely preserved.
Burgess fossils appear in several outcrops, all within about 60 kilometers of Field, British Columbia, and all occurring in shale deposits of the Stephen Formation that are 270 to 370 meters thick. Now, a team led by paleontologist Jean-Bernard Caron of the Royal Ontario Museum in Toronto reports finding Burgess-like fossils in the valley of the Stanley Glacier in Kootenay National Park, where a much thinner part of the Stephen Formation that ranges from 16 to 160 meters thick is exposed.
“This new locality adds to our knowledge of the environments where these organisms lived and died and thus adds important context,” says Peter Allison, a geoscientist at Imperial College London.
About half of the animal groups found at Stanley Glacier, such as trilobites, are found at other Burgess sites in different abundances. But the creatures unearthed also include eight taxa previously unknown to science. They include an unnamed worm; Stanleycaris hirpex, a segmented shrimp-like critter known as an anomalocarid; and an arthropod with big eyes dangling on stalks from its head shield.
Until now, paleontologists had thought one reason the Burgess fossils were so well preserved was because they settled in thick deposits at the bottom of an ancient ocean protected by a submarine cliff. But the Stanley Glacier fossils weren’t formed in the presence of such a cliff, suggesting that creatures can be fossilized in amazing detail in other environments.
“We consider it likely that future exploration and study will continue to yield new taxa from the ‘thin’ Stephen Formation, which is exposed over a broader area regionally than the ‘thick’ Stephen Formation,” the researchers write.
Read more at Wired
Ultra millionaire sponsorship deals such as those signed by sprinter Usain Bolt, motorcycle racer Valentino Rossi and tennis player Maria Sharapova, are just peanuts compared to the personal fortune amassed by a second century A.D. Roman racer, according to an estimate published in the historical magazine Lapham's Quarterly.
According to Peter Struck, associate professor of classical studies at the University of Pennsylvania, an illiterate charioteer named Gaius Appuleius Diocles earned “the staggering sum" of 35,863,120 sesterces (ancient Roman coins) in prize money.
Recorded in a monumental inscription erected in 146 A.D., the figure eclipses the fortunes of all modern sports stars, including golfer Tiger Woods, hailed by Forbes magazine last fall as “sports' first billion-dollar man.”
Diocles, “the most eminent of all charioteers,” according to the inscription, was born in Lusitania, in what is now Portugal and south-west Spain, and started his spectacular career in 122 A.D., when he was 18.
Life for a charioteer in Rome wasn’t easy. Often slaves who could eventually buy their freedom, these racers engaged in wild laps of competition at the Circus Maximus, running a total of about 4,000 meters (nearly 2.5 miles).
“After seven savage laps, those who managed not to be upended or killed and finish in the top three took home prizes,” wrote Struck.
Experienced charioteers drove hard-to-manage chariots drawn by four or even more horses.
Their sporting equipment included a leather helmet, shin guards, chest protector, a jersey, a whip, and a sharp knife with which to cut the reins if the chariot overturned.
Although drivers did not have their helmets or whips blessed by generous sponsorship deals, they could rely on stables, or factions, which were basically teams similar to today’s Formula One outfits: the Reds, Greens, Blues and Whites.
“The drivers were affiliated with teams supported by large businesses that invested heavily in the training and upkeep of the horses and equipment,” said Struck.
Diocles won his first race two years after his debut with the Whites. Four years later, he moved briefly to their great rivals, the Greens, but had the most success with the Reds, with whom he remained until the end of his career at the age of “42 years, 7 months, and 23 days.”
Read more at Discovery News
Aug 31, 2010
The argument over whether the universe has a creator, and who that might be, is among the oldest in human history. But amid the raging arguments between believers and sceptics, one possibility has been almost ignored – the idea that the universe around us was created by people very much like ourselves, using devices not too dissimilar to those available to scientists today.
As with much else in modern physics, the idea involves particle acceleration, the kind of thing that goes on in the Large Hadron Collider in Switzerland. Before the LHC began operating, a few alarmists worried that it might create a black hole which would destroy the world. That was never on the cards: although it is just possible that the device could generate an artificial black hole, it would be too small to swallow an atom, let alone the Earth.
However, to create a new universe would require a machine only slightly more powerful than the LHC – and there is every chance that our own universe may have been manufactured in this way.
This is possible for two reasons. First, black holes may – as science fiction aficionados will be well aware – act as gateways to other regions of space and time. Second, because of the curious fact that gravity has negative energy, it takes no energy to make a universe. Despite the colossal amount of energy contained in every atom of matter, it is precisely balanced by the negativity of gravity.
Black holes, moreover, are relatively easy to make. For any object, there is a critical radius, called the Schwarzschild radius, to which its mass must be squeezed for it to become a black hole. The Schwarzschild radius of the Sun is about two miles, roughly 1/200,000th of its current radius; for the Earth to become a black hole, it would have to be squeezed into a ball with a radius of about one centimetre.
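Those two figures follow from the standard formula for the Schwarzschild radius, r_s = 2GM/c². A minimal sketch of the calculation (the constants and masses below are textbook values, not taken from the article):

```python
# Schwarzschild radius r_s = 2*G*M / c^2: the radius to which a mass
# must be compressed for it to form a black hole.
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8     # speed of light, m/s

def schwarzschild_radius(mass_kg):
    """Return the Schwarzschild radius in metres for a given mass."""
    return 2 * G * mass_kg / c**2

M_sun = 1.989e30     # solar mass, kg
M_earth = 5.972e24   # Earth mass, kg

print(f"Sun:   {schwarzschild_radius(M_sun) / 1609:.1f} miles")  # ~1.8 miles
print(f"Earth: {schwarzschild_radius(M_earth) * 100:.1f} cm")    # ~0.9 cm
```

The results (about 1.8 miles for the Sun, just under a centimetre for the Earth) match the article's rounded figures.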
The black holes that could be created in a particle accelerator would be far smaller: tiny masses squeezed into incredibly tiny volumes. But because of gravity's negative energy, it doesn't matter how small such holes are: they still have the potential to inflate and expand in their own dimensions (rather than gobbling up our own). Such expansion was precisely what our universe did in the Big Bang, when it suddenly exploded from a tiny clump of matter into a fully-fledged cosmos.
Alan Guth of the Massachusetts Institute of Technology first proposed the now widely accepted idea of cosmic inflation – that the starting point of the Big Bang was far smaller, and its expansion far more rapid, than had been assumed. He has investigated the technicalities of "the creation of universes in the laboratory", and concluded that the laws of physics do, in principle, make it possible.
The big question is whether that has already happened – is our universe a designer universe? By this, I do not mean a God figure, an "intelligent designer" monitoring and shaping all aspects of life. Evolution by natural selection, and all the other processes that produced our planet and the life on it, are sufficient to explain how we got to be the way we are, given the laws of physics that operate in our universe.
However, there is still scope for an intelligent designer of universes as a whole. Modern physics suggests that our universe is one of many, part of a "multiverse" where different regions of space and time may have different properties (the strength of gravity may be stronger in some and weaker in others). If our universe was made by a technologically advanced civilisation in another part of the multiverse, the designer may have been responsible for the Big Bang, but nothing more.
If such designers make universes by manufacturing black holes – the only way to do it that we are aware of – there are three levels at which they might operate. The first is just to manufacture black holes, without influencing the laws of physics in the new universe. Humanity is nearly at this level, which Gregory Benford's novel Cosm puts in an entertaining context: an American researcher finds herself, after an explosion in a particle accelerator, with a new universe on her hands, the size of a baseball.
The second level, for a slightly more advanced civilisation, would involve nudging the properties of the baby universes in a certain direction. It might be possible to tweak the black holes in such a way that the force of gravity was a little stronger than in the parent universe, without the designers being able to say exactly how much stronger.
The third level, for a very advanced civilisation, would involve the ability to set precise parameters, thereby designing the new universe in detail. An analogy would be with designer babies – instead of tinkering with DNA to get a perfect child, a scientist might tinker with the laws of physics to get a perfect universe. Crucially, though, it would not be possible in any of these cases – even at the most advanced level – for the designers to interfere with the baby universes once they had formed. From the moment of its own Big Bang, each universe would be on its own.
This might sound far-fetched, but the startling thing about this theory is how likely it is to happen – and to have happened already. All that is required is that evolution occurs naturally in the multiverse until, in at least one universe, intelligence reaches roughly our level. From that seed point, intelligent designers create enough universes suitable for evolution, which bud off their own universes, that universes like our own (in other words, suitable for intelligent life) proliferate rapidly, with "unintelligent" universes coming to represent a tiny fraction of the whole multiverse. It therefore becomes overwhelmingly likely that any given universe, our own included, would be designed rather than "natural".
Read more at The Telegraph
The beer is placed inside a pocket of salty, pretzel-like dough and then dunked in oil at 375 degrees for about 20 seconds, a short enough time for the confection to remain alcoholic.
When diners take a bite the hot beer mixes with the dough in what is claimed to be a delicious taste sensation.
Inventor Mark Zable said it had taken him three years to come up with the cooking method and a patent for the process is pending. He declined to say whether any special ingredients were involved.
His deep-fried beer will be officially unveiled in a fried food competition at the Texas state fair later this month.
Five ravioli-like pieces will sell for $5 (£3) and the Texas Alcoholic Beverage Commission has already ruled that people must be aged over 21 to try it.
Mr Zable has so far been deep-frying Guinness but said he may switch to a pale ale in the future. He said: "Nobody has been able to fry a liquid before. It tastes like you took a bite of hot pretzel dough and then took a drink of beer." Mr Zable previously invented dishes including chocolate-covered strawberry waffle balls and jalapeño corndog shrimps.
Read more at The Telegraph
Aug 30, 2010
Anthropologists have unearthed the leftovers of the world's first known organized feast, which took place around 12,000 years ago at a burial site in Israel, according to a new study.
Based on the findings, approximately 35 guests ate meat from 71 tortoises and at least three wild cattle while attending this first known human-orchestrated event involving food.
The discovery additionally provides the earliest known compelling evidence for a shaman burial, the apparent reason for the feasting. A shaman is an individual who performs rituals and engages in other practices for healing or divination.
In this case, the shaman was a woman.
"I wasn't surprised that the shaman was a woman, because women have often taken on shamanistic roles as healers, magicians and spiritual leaders in societies across the globe," lead author Natalie Munro told Discovery News.
Munro, a University of Connecticut anthropologist, and colleague Leore Grosman of Hebrew University in Jerusalem excavated and studied the shaman's skeleton and associated feasting remains. These were found at the burial site, Hilazon Tachtit cave, located about nine miles west of the Sea of Galilee in Israel.
According to the study, published in the latest Proceedings of the National Academy of Sciences, the grave consisted of an oval-shaped basin that was intentionally cut into the cave's floor.
"After the oval was excavated, the sides and bottom were lined with stone slabs and plastered with clay brought into the cave from outside," said Munro.
The 71 tortoise shells, previously butchered for meat removal, were found situated under, around and on top of the remains of the woman. The woman's skeleton indicates she suffered from deformities that would have possibly made her limp and "given her an unnatural, asymmetrical appearance." A large triangular stone slab was placed over the grave to seal it.
Bones from at least three butchered aurochs -- large ancestors of today's domestic cattle -- were unearthed in a nearby hollow. An auroch's tail, a wild boar forearm, a leopard pelvis and two marten skulls were also found.
The total amount of meat could have fed 35 people, but it is possible that many more attended the event.
Read more at Discovery News
Long a favorite of lovers and honeymooners, a Japanese beach town with fading sparkle has found a new tourism niche in the wired age by drawing young men and their virtual girlfriends.
One recent sweltering summer's day, a tour bus from Tokyo pulled up at a sun-kissed beach at Atami, a Pacific coast resort southwest of the metropolis, and disgorged more than a dozen excited, iPhone-clutching young men.
The determined youngsters, paying scant attention to the bikini-clad girls frolicking on the sand, instead headed straight for a bronze statue that depicts Kanichi and Omiya, a couple from an old love story set in Atami.
The focus of the men's attention -- and of their smartphone cameras -- was a tiny black and white square, a two-dimensional barcode that, thanks to "augmented reality" (AR) software, brought to life the object of their desire.
"Look, it's like I'm in a snapshot with her," said Shu Watanabe, 23, as he showed off his iPhone display, featuring himself next to the image of a doe-eyed cartoon character named Rinko, a smiling high school girl.
Rinko may only be digital, but try telling that to Watanabe or the legions of other fans of "Love Plus," a dating sim or simulation game that is played on handheld Nintendo DS consoles and also boasts AR applications for iPhones.
Its creators, Konami Digital Entertainment, have long thrilled young men obsessed with high-tech, manga and anime, known as "otaku," by letting them chase virtual girls in the alternative universe of their digital dreams.
The hit video game made headlines when a 27-year-old Japanese man known only as "Sal 9000" staged a tuxedo wedding late last year, which was watched by thousands online, with his favorite cartoon girl, Nene Anegasaki.
But in the latest edition, game makers have gone a step further and teamed up with the very real city of Atami, an onsen or hot spring town 100 kilometers (60 miles) southwest of the Japanese capital.
They have selected 13 romantic locations which can be overlaid with images of Rinko or her teenage friends Manaka and Nene, who have all swapped their usual sailor-style school uniforms for casual summer wear.
Local souvenir shops in the resort town have caught on and capitalized on the love-struck new clientele, selling Love Plus-themed souvenirs, from good-luck charms to steamed buns and fish sausages.
Read more at Discovery News
Mothers are said to hold a special lifelong place in their children's hearts, but it appears they have a unique significance in their brains, too.
Scientists have discovered that when adults look at their mothers' faces, it triggers a stronger response in the brain than when they look at pictures of strangers - or even of their fathers.
Using Magnetic Resonance Imaging (MRI) machines, the researchers measured the brain activity of volunteers as they were shown photographs of their parents, strangers and celebrities.
When images of the participants' mothers were shown to them, the scientists found that it "lit up" key areas, associated with recognition and emotion.
The findings suggest that mothers produce a complex and lasting emotional and cognitive response in their children's brains, a result of the bonding experience that takes place between mother and child in infancy.
Scientists believe the findings shed new light on the extent to which humans experience "imprinting" - a phenomenon observed in many birds and animals in which youngsters form very strong attachments to the first creature they see after being born.
As a result, the youngsters follow their mother around and can rapidly learn from her characteristics and behaviour, which are said to be "imprinted" on them.
Human babies do not undergo such rapid imprinting, but many scientists believe the bond between mother and child can have crucial implications in later life and even into adulthood.
The new study, which is reported in the scientific journal Brain and Cognition, involved 20 volunteers with an average age of 35.
Read more at The Telegraph
Aug 29, 2010
Burger King is set to launch the Pizza Burger – a two-in-one dish that contains more than 2,500 calories and is four times the size of the chain’s Whoppers.
The meal will delight fast-food fans when it is exclusively introduced at Burger King’s Whopper Bar in Times Square, New York, next month. Besides the beef and a 9.5-inch sesame bun, the Pizza Burger is topped with pepperoni, mozzarella, Tuscan pesto and marinara sauce. It also comes in six slices, just like a pizza.
According to blogger Me So Hungry, it is the perfect mix between a pizza and a burger. “The visual highlight was the New York Pizza Burger… it’s not bad. Tastes kinda like pizza, but also like a burger,” the blogger said. It has been dubbed the “fat bomb” because, for $13 (£8.40), customers can bite into 2,520 calories – the recommended daily intake is 2,500 calories for men and 2,000 for women.
One Pizza Burger contains 144g of fat – 59g of which is saturated. It also has 3,780mg of salt, which is more than double the daily limit for adults. John Schaufelberger, Burger King’s vice president of global marketing, insisted the Pizza Burger is “intended to be shared”. But he also admitted that it “demonstrates the type of menu offerings our guests can expect”. According to Mr Schaufelberger, the Pizza Burger is a homage to New York, the home of Burger King.
Read more at Sky News