The coldest of cold cases may have emerged from the bottom of a Stone-Age well in Israel as archaeologists uncovered two 8,500-year-old human skeletons deep inside the structure.
Belonging to a woman aged about 19 and an older man between 30 and 40, the skeletal remains were found in a 26-foot-deep well in the Jezreel Valley in Israel's Galilee region.
With the upper part built of stones and the lower hewn in the bedrock, the impressive well was connected to a Neolithic farming settlement.
"It seems the inhabitants used it for their subsistence and living," Yotam Tepper, excavation director at the Israel Antiquities Authority, said in a statement.
Tepper and colleagues cannot yet explain whether the two individuals accidentally fell into the well or were murdered and then dumped inside.
"As of now the answer to this question remains a mystery," they said.
Studies on the bones and the objects found near the human remains may provide an answer.
"What is clear is that after these unknown individuals fell into the well, it was no longer used for the simple reason that the well water was contaminated and was no longer potable," Tepper said.
Indeed, the artifacts recovered from inside the well suggest the structure ended up being used as a dump after it became polluted.
Along with the skeletal remains, the archaeologists found deeply dented flint sickle blades used for harvesting, as well as arrow heads, stone implements, animal bones and charcoal.
Whether the man and woman found at the well's bottom were the victims of a crime or a simple accident, Tepper and colleagues agree the structure itself is worth studying further.
"Wells from this period are unique finds in the archaeology of Israel, and probably also in the prehistoric world in general," Tepper said.
Read more at Discovery News
Nov 10, 2012
Iceman Mummy Finds His Closest Relatives
Ötzi the Iceman, an astonishingly well-preserved Neolithic mummy found in the Italian Alps in 1991, was a native of Central Europe, not a first-generation émigré from Sardinia, new research shows. And genetically, he looked a lot like other Stone Age farmers throughout Europe.
The new findings, reported Thursday (Nov. 8) here at the American Society of Human Genetics conference, support the theory that farmers, and not just the technology of farming, spread during prehistoric times from the Middle East all the way to Finland.
"The idea is that the spread of farming and agriculture, right now we have good evidence that it was also associated with a movement of people and not only technology," said study co-author Martin Sikora, a geneticist at Stanford University.
In what may be the world's oldest cold case, Ötzi was pierced by an arrow and bled to death on a glacier in the Alps between Austria and Italy more than 5,000 years ago.
Scientists sequenced Ötzi's genome earlier this year, yielding a surprising result: The Iceman was more closely related to present-day Sardinians than he was to present-day Central Europeans.
But the researchers sequenced only part of the genome, and the results didn't resolve an underlying question: Did most of the Neolithic people in Central Europe have genetic profiles more characteristic of Sardinia, or had Ötzi's family recently emigrated from Southern Europe?
"Maybe Ötzi was just a tourist, maybe his parents were Sardinian and he decided to move to the Alps," Sikora said.
That would have required Ötzi's family to travel hundreds of miles, an unlikely prospect, Sikora said.
"Five thousand years ago, it's not really expected that our populations were so mobile," Sikora told LiveScience.
To answer that question, Sikora's team sequenced Ötzi's entire genome and compared it with those from hundreds of modern-day Europeans, as well as the genomes of a Stone Age hunter-gatherer found in Sweden, a farmer from Sweden, a 7,000-year-old hunter-gatherer iceman found in Iberia, and an Iron Age man found in Bulgaria.
The team confirmed that, of modern people, Sardinians are Ötzi's closest relatives. But among the prehistoric quartet, Ötzi most closely resembled the farmers found in Bulgaria and Sweden, while the Swedish and Iberian hunter-gatherers looked more like present-day Northern Europeans.
The findings support the notion that people migrating from the Middle East all the way to Northern Europe brought agriculture with them and mixed with the native hunter-gatherers, enabling the population to explode, Sikora said.
Read more at Discovery News
The new findings, reported Thursday (Nov. 8) here at the American Society of Human Genetics conference, support the theory that farmers, and not just the technology of farming, spread during prehistoric times from the Middle East all the way to Finland.
"The idea is that the spread of farming and agriculture, right now we have good evidence that it was also associated with a movement of people and not only technology," said study co-author Martin Sikora, a geneticist at Stanford University.
In what may be the world's oldest cold case, Ötzi was pierced by an arrow and bled to death on a glacier in the Alps between Austria and Italy more than 5,000 years ago.
Scientists sequenced Ötzi's genome earlier this year, yielding a surprising result: The Iceman was more closely related to present-day Sardinians than he was to present-day Central Europeans.
But the researchers sequenced only part of the genome, and the results didn't resolve an underlying question: Did most of the Neolithic people in Central Europe have genetic profiles more characteristic of Sardinia, or had Ötzi's family recently emigrated from Southern Europe?
"Maybe Ötzi was just a tourist, maybe his parents were Sardinian and he decided to move to the Alps," Sikora said.
That would have required Ötzi's family to travel hundreds of miles, an unlikely prospect, Sikora said.
"Five thousand years ago, it's not really expected that our populations were so mobile," Sikora told LiveScience.
To answer that question, Sikora's team sequenced Ötzi's entire genome and compared it with those from hundreds of modern-day Europeans, as well as the genomes of a Stone Age hunter-gatherer found in Sweden, a farmer from Sweden, a 7,000-year-old hunter-gatherer iceman found in Iberia, and an Iron Age man found in Bulgaria.
The team confirmed that, of modern people, Sardinians are Ötzi's closest relatives. But among the prehistoric quartet, Ötzi most closely resembled the farmers found in Bulgaria and Sweden, while the Swedish and Iberian hunter-gatherers looked more like present-day Northern Europeans.
The findings support the notion that people migrating from the Middle East all the way to Northern Europe brought agriculture with them and mixed with the native hunter-gatherers, enabling the population to explode, Sikora said.
Read more at Discovery News
Labels:
Archeology,
Biology,
History,
Human,
Science
Nov 9, 2012
Pre-Drinking Is a Risky Way to Begin an Evening out
Previous research from the U.S. and the U.K. has shown that "pre-drinking" or "frontloading" often leads to heavy drinking by young people in public settings and can lead to greater harm. Pre-drinking typically occurs in locations where low-cost alcohol that is usually bought off-premise is consumed, rapidly and in large quantities. A study using Swiss data has found that pre-drinking, when combined with on-premise drinking, leads to almost twice as much drinking and negative outcomes.
Results will be published in the February 2013 issue of Alcoholism: Clinical & Experimental Research and are currently available at Early View.
"At first glance, it might seem that pre-drinking is not so prevalent in Switzerland," said Florian Labhart, a researcher at Addiction Switzerland as well as corresponding author for the study. "However, pre-drinking has been found in about one third of all on-premise drinking, which is a very high rate. Considering that pre-drinking leads people to consume nearly twice the normal amount of alcohol on a given night, its prevalence should not be underestimated from a public-health perspective."
"Only recently has pre-drinking -- also referred to as pre-partying, pre-gaming, pre-loading, or pre-funking -- been identified and introduced into the empirical alcohol literature," said Shannon R. Kenney, visiting assistant research professor and associate director of the Heads Up Research Lab in the Psychology Department at Loyola Marymount University in Los Angeles, California. "Although pre-drinking has not received the attention it deserves thus far, it appears that researchers are beginning to recognize the importance of gaining a better understanding of this risky and prevalent drinking context."
Kenney added that existing studies of pre-drinking/pre-partying have revealed similar prevalence rates in the United States and Europe. "In fact, due to U.S. legal drinking age requirements, pre-drinking may be most prevalent among underage drinkers in the U.S.," she said. "Research shows that underage drinkers may be motivated to pre-drink to achieve a 'buzz' or become intoxicated before going to a licensed premise where they cannot legally consume alcohol, such as a bar, club, concert, or sporting event."
Researchers examined the drinking practices of 183 young adults (97 women, 86 men) with a mean age of 23 years from three higher-education institutions in Switzerland, using the recently developed Internet-based Cell phone optimized Assessment Technique (ICAT) to assess alcohol consumption and drinking location at six time points (from 5 p.m. to the next morning) on Thursdays, Fridays, and Saturdays during five consecutive weeks by means of the participants' cell phones. A total of 7,828 assessments were provided for analysis for 1,441 evenings. The study authors examined the association between pre-drinking, overnight drinking levels, and adverse outcomes.
"Pre-drinking is a pernicious drinking pattern that is likely to lead people to cumulate two normal drinking occasions -- one off-premise followed by one on-premise -- and generally results in excessive alcohol consumption," said Labhart. "Excessive consumption and adverse consequences are not simply related to the type of people who pre-drink, but rather to the practice of pre-drinking itself."
"Moreover," said Kenney, "pre-drinking tended to involve further drinking throughout the evening. That is to say that pre-drinking did not reduce or replace the amount of post pre-drinking consumption, but enhanced risk through increased consumption."
"In terms of specific adverse or risky outcomes from drinking," said Labhart, "47.5 percent of the men and women in the study reported the following outcomes: hangover (40.7% men, 36.1% women), unplanned substance use (20.9% and 12.4%), blackouts (11.6% and 7.2%), unintended or unprotected sexual intercourse (8.1% and 5.2%), injured self or someone else (5.8% and 3.1%), and property damage or vandalism (3.5% and 0.0%). Logically, given the large amounts of alcohol consumed, blackouts and hangover were especially prevalent on pre-drinking evenings."
"These findings hold important implications for prevention practices by highlighting the advantages of event-level intra-individual assessment," said Kenney. "The authors utilized a cell-phone based method that allowed for the assessment of participant's alcohol consumption and drinking locations throughout the evening. This type of novel data collection technique appears to offer much promise for research of young adult and adolescent drinking, both with respect to pre-drinking and more generally."
"Changing the location during a night increases the overall amount of alcohol consumption," added Labhart. "It's important that young people count the number of drinks they have during a night and to remember how many drinks they had already when they reach a new drinking location." The study authors also recommend prevention practices that incorporate educational interventions as well as structural measures such as reduced late-night off-sale opening hours, and more staff training regarding responsible beverage-service practices.
Read more at Science Daily
Results will be published in the February 2013 issue of Alcoholism: Clinical & Experimental Research and are currently available at Early View.
"At first glance, it might seem that pre-drinking is not so prevalent in Switzerland," said Florian Labhart, a researcher at Addiction Switzerland as well as corresponding author for the study. "However, pre-drinking has been found in about one third of all on-premise drinking, which is a very high rate. Considering that pre-drinking leads people to consume nearly twice the normal amount of alcohol on a given night, its prevalence should not be underestimated from a public-health perspective."
"Only recently has pre-drinking -- also referred to as pre-partying, pre-gaming, pre-loading, or pre-funking -- been identified and introduced into the empirical alcohol literature," said Shannon R. Kenney, visiting assistant research professor and associate director of the Heads Up Research Lab in the Psychology Department at Loyola Marymount University in Los Angeles, California. "Although pre-drinking has not received the attention it deserves thus far, it appears that researchers are beginning to recognize the importance of gaining a better understanding of this risky and prevalent drinking context."
Kenney added that existing studies of pre-drinking/pre-partying have revealed similar prevalence rates in the United States and Europe. "In fact, due to U.S. legal drinking age requirements, pre-drinking may be most prevalent among underage drinkers in the U.S.," she said. "Research shows that underage drinkers may be motivated to pre-drink to achieve a 'buzz' or become intoxicated before going to a licensed premise where they cannot legally consume alcohol, such as a bar, club, concert, or sporting event."
Researchers examined the drinking practices of 183 young adults (97 women, 86 men) with a mean age of 23 years from three higher-education institutions in Switzerland, using the recently developed Internet-based Cell phone optimized Assessment Technique (ICAT) to assess alcohol consumption and drinking location at six time points (from 5 p.m. to the next morning) on Thursdays, Fridays, and Saturdays during five consecutive weeks by means of the participants' cell phones. A total of 7,828 assessments were provided for analysis for 1,441 evenings. The study authors examined the association between pre-drinking, overnight drinking levels, and adverse outcomes.
"Pre-drinking is a pernicious drinking pattern that is likely to lead people to cumulate two normal drinking occasions -- one off-premise followed by one on-premise -- and generally results in excessive alcohol consumption," said Labhart. "Excessive consumption and adverse consequences are not simply related to the type of people who pre-drink, but rather to the practice of pre-drinking itself."
"Moreover," said Kenney, "pre-drinking tended to involve further drinking throughout the evening. That is to say that pre-drinking did not reduce or replace the amount of post pre-drinking consumption, but enhanced risk through increased consumption."
"In terms of specific adverse or risky outcomes from drinking," said Labhart, "47.5 percent of the men and women in the study reported the following outcomes: hangover (40.7% men, 36.1% women), unplanned substance use (20.9% and 12.4%), blackouts (11.6% and 7.2%), unintended or unprotected sexual intercourse (8.1% and 5.2%), injured self or someone else (5.8% and 3.1%), and property damage or vandalism (3.5% and 0.0%). Logically, given the large amounts of alcohol consumed, blackouts and hangover were especially prevalent on pre-drinking evenings."
"These findings hold important implications for prevention practices by highlighting the advantages of event-level intra-individual assessment," said Kenney. "The authors utilized a cell-phone based method that allowed for the assessment of participant's alcohol consumption and drinking locations throughout the evening. This type of novel data collection technique appears to offer much promise for research of young adult and adolescent drinking, both with respect to pre-drinking and more generally."
"Changing the location during a night increases the overall amount of alcohol consumption," added Labhart. "It's important that young people count the number of drinks they have during a night and to remember how many drinks they had already when they reach a new drinking location." The study authors also recommend prevention practices that incorporate educational interventions as well as structural measures such as reduced late-night off-sale opening hours, and more staff training regarding responsible beverage-service practices.
Read more at Science Daily
Comet Collisions Every Six Seconds Explain 17-Year-Old Stellar Mystery
Every six seconds, for millions of years, comets have been colliding with one another near a star in the constellation Cetus called 49 CETI, which is visible to the naked eye.
Over the past three decades, astronomers have discovered hundreds of dusty disks around stars, but only two -- 49 CETI is one -- have been found that also have large amounts of gas orbiting them.
Young stars, about a million years old, have a disk of both dust and gas orbiting them, but the gas tends to dissipate within a few million years and almost always within about 10 million years. Yet 49 CETI, which is thought to be considerably older, is still being orbited by a tremendous quantity of gas in the form of carbon monoxide molecules, long after that gas should have dissipated.
"We now believe that 49 CETI is 40 million years old, and the mystery is how in the world can there be this much gas around an otherwise ordinary star that is this old," said Benjamin Zuckerman, a UCLA professor of physics and astronomy and co-author of the research, which was recently published in the Astrophysical Journal. "This is the oldest star we know of with so much gas."
Zuckerman and his co-author Inseok Song, a University of Georgia assistant professor of physics and astronomy, propose that the mysterious gas comes from a very massive disk-shaped region around 49 CETI that is similar to the sun's Kuiper Belt, which lies beyond the orbit of Neptune.
The total mass of the various objects that make up the Kuiper Belt, including the dwarf planet Pluto, is about one-tenth the mass of Earth. But back when Earth was forming, astronomers say, the Kuiper Belt likely had a mass that was approximately 40 times larger than Earth's; most of that initial mass has been lost in the last 4.5 billion years.
By contrast, the Kuiper Belt analogue that orbits around 49 CETI now has a mass of about 400 Earth masses -- 4,000 times the current mass of the Kuiper Belt.
"Hundreds of trillions of comets orbit around 49 CETI and one other star whose age is about 30 million years. Imagine so many trillions of comets, each the size of the UCLA campus -- approximately 1 mile in diameter -- orbiting around 49 CETI and bashing into one another," Zuckerman said. "These young comets likely contain more carbon monoxide than typical comets in our solar system. When they collide, the carbon monoxide escapes as a gas. The gas seen around these two stars is the result of the incredible number of collisions among these comets.
"We calculate that comets collide around these two stars about every six seconds," he said. "I was absolutely amazed when we calculated this rapid rate. I would not have dreamt it in a million years. We think these collisions have been occurring for 10 million years or so."
Read more at Science Daily
Over the past three decades, astronomers have discovered hundreds of dusty disks around stars, but only two -- 49 CETI is one -- have been found that also have large amounts of gas orbiting them.
Young stars, about a million years old, have a disk of both dust and gas orbiting them, but the gas tends to dissipate within a few million years and almost always within about 10 million years. Yet 49 CETI, which is thought to be considerably older, is still being orbited by a tremendous quantity of gas in the form of carbon monoxide molecules, long after that gas should have dissipated.
"We now believe that 49 CETI is 40 million years old, and the mystery is how in the world can there be this much gas around an otherwise ordinary star that is this old," said Benjamin Zuckerman, a UCLA professor of physics and astronomy and co-author of the research, which was recently published in the Astrophysical Journal. "This is the oldest star we know of with so much gas."
Zuckerman and his co-author Inseok Song, a University of Georgia assistant professor of physics and astronomy, propose that the mysterious gas comes from a very massive disk-shaped region around 49 CETI that is similar to the sun's Kuiper Belt, which lies beyond the orbit of Neptune.
The total mass of the various objects that make up the Kuiper Belt, including the dwarf planet Pluto, is about one-tenth the mass of Earth. But back when Earth was forming, astronomers say, the Kuiper Belt likely had a mass that was approximately 40 times larger than Earth's; most of that initial mass has been lost in the last 4.5 billion years.
By contrast, the Kuiper Belt analogue that orbits around 49 CETI now has a mass of about 400 Earth masses -- 4,000 times the current mass of the Kuiper Belt.
"Hundreds of trillions of comets orbit around 49 CETI and one other star whose age is about 30 million years. Imagine so many trillions of comets, each the size of the UCLA campus -- approximately 1 mile in diameter -- orbiting around 49 CETI and bashing into one another," Zuckerman said. "These young comets likely contain more carbon monoxide than typical comets in our solar system. When they collide, the carbon monoxide escapes as a gas. The gas seen around these two stars is the result of the incredible number of collisions among these comets.
"We calculate that comets collide around these two stars about every six seconds," he said. "I was absolutely amazed when we calculated this rapid rate. I would not have dreamt it in a million years. We think these collisions have been occurring for 10 million years or so."
Read more at Science Daily
Link Found Between Child Prodigies and Autism
A new study of eight child prodigies suggests a possible link between these children's special skills and autism. Of the eight prodigies studied, three had a diagnosis of autism spectrum disorders. As a group, the prodigies also tended to have slightly elevated scores on a test of autistic traits, when compared to a control group.
In addition, half of the prodigies had a family member or a first- or second-degree relative with an autism diagnosis.
The fact that half of the families and three of the prodigies themselves were affected by autism is surprising because autism occurs in only one of 120 individuals, said Joanne Ruthsatz, lead author of the study and assistant professor of psychology at Ohio State University's Mansfield campus.
"The link between child prodigies and autism is strong in our study," Ruthsatz said. "Our findings suggest child prodigies have traits in common with autistic children, but something is preventing them from displaying the deficits we associate with the disorder."
The study also found that while child prodigies had elevated general intelligence scores, where they really excelled was in working memory -- all of them scored above the 99th percentile on this trait.
Ruthsatz conducted the study with Jourdan Urbach of Yale University. Their results were published in a recent issue of the journal Intelligence.
For the study, the researchers identified eight child prodigies through the internet and television specials and by referral. The group included one art prodigy, one math prodigy, four musical prodigies and two who switched domains (one from music to gourmet cooking, and one from music to art). The study included six males and two females.
The researchers met with each prodigy individually over the course of two or three days. During that time, the prodigies completed the Stanford-Binet intelligence test, which included sub-tests on fluid reasoning, knowledge, quantitative reasoning, visual spatial abilities and working memory.
In addition, the researchers administered the Autism-Spectrum Quotient assessment, which scores the level of autistic traits. The prodigies' scores on the test were compared to a control group of 174 adults who were contacted randomly by mail.
Ruthsatz said the most striking data was that which identified autistic traits among the prodigies.
The prodigies showed a general elevation in autistic traits compared to the control group, but this elevation was on average even smaller than that seen in high-functioning autistic people diagnosed with Asperger's syndrome.
Autism is a developmental disability characterized by problems with communicating and socializing and a strong resistance to change. People with Asperger's are more likely than those with autism to have normal intelligence, but tend to have difficulties with social interaction.
The prodigies did score higher than the control group and the Asperger's group on one subsection of the autism assessment: attention to detail.
"These prodigies had an absolutely amazing memory for detail," she said. "They don't miss anything, which certainly helps them achieve the successes they have."
Ruthsatz said it was not the three prodigies who were diagnosed with autism who were driving this particular finding. In fact, the three autistic prodigies scored an average of 8 on attention to detail, compared to 8.5 for the entire group of prodigies.
On the intelligence test, the prodigies scored in the gifted range, but were not uniformly exceptional. While five of the eight prodigies scored in the 90th percentile or above on the IQ test, one scored at the 70th percentile and another at the 79th percentile.
But just as they did in the autism assessment, the prodigies stood out in one of the sub-tests of the intelligence test. In this case, the prodigies showed an exceptional working memory, with all of them scoring above the 99th percentile.
Working memory is the system in the brain that allows people to hold multiple pieces of information in mind for a short time in order to complete a task.
The findings paint a picture of what it takes to create a prodigy, Ruthsatz said.
"Overall, what we found is that prodigies have an elevated general intelligence and exceptional working memory, along with an elevated autism score, with exceptional attention to detail," Ruthsatz said.
These results suggest that prodigies share some striking similarities with autistic savants -- people who have the developmental disabilities associated with autism combined with an extraordinary talent or knowledge that is well beyond average.
"But while autistic savants display many of the deficits commonly associated with autism, the child prodigies do not," Ruthsatz said. "The question is why."
The answer may be some genetic mutation that allows prodigies to have the extreme talent found in savants, but without the deficits seen in autism. But the answer will require more study, Ruthsatz said.
Read more at Science Daily
In addition, half of the prodigies had a family member or a first- or second-degree relative with an autism diagnosis.
The fact that half of the families and three of the prodigies themselves were affected by autism is surprising because autism occurs in only one of 120 individuals, said Joanne Ruthsatz, lead author of the study and assistant professor of psychology at Ohio State University's Mansfield campus.
"The link between child prodigies and autism is strong in our study," Ruthsatz said. "Our findings suggest child prodigies have traits in common with autistic children, but something is preventing them from displaying the deficits we associate with the disorder."
The study also found that while child prodigies had elevated general intelligence scores, where they really excelled was in working memory -- all of them scored above the 99th percentile on this trait.
Ruthsatz conducted the study with Jourdan Urbach of Yale University. Their results were published in a recent issue of the journal Intelligence.
For the study, the researchers identified eight child prodigies through the internet and television specials and by referral. The group included one art prodigy, one math prodigy, four musical prodigies and two who switched domains (one from music to gourmet cooking, and one from music to art). The study included six males and two females.
The researchers met with each prodigy individually over the course of two or three days. During that time, the prodigies completed the Stanford-Binet intelligence test, which included sub-tests on fluid reasoning, knowledge, quantitative reasoning, visual spatial abilities and working memory.
In addition, the researchers administered the Autism-Spectrum Quotient assessment, which scores the level of autistic traits. The prodigies' scores on the test were compared to a control group of 174 adults who were contacted randomly by mail.
Ruthsatz said the most striking data was that which identified autistic traits among the prodigies.
The prodigies showed a general elevation in autistic traits compared to the control group, but this elevation was on average even smaller than that seen in high-functioning autistic people diagnosed with Asperger's syndrome.
Autism is a developmental disability characterized by problems with communicating and socializing and a strong resistance to change. People with Asperger's are more likely than those with autism to have normal intelligence, but tend to have difficulties with social interaction.
The prodigies did score higher than the control group and the Asperger's group on one subsection of the autism assessment: attention to detail.
"These prodigies had an absolutely amazing memory for detail," she said. "They don't miss anything, which certainly helps them achieve the successes they have."
Ruthsatz said it was not the three prodigies who were diagnosed with autism who were driving this particular finding. In fact, the three autistic prodigies scored an average of 8 on attention to detail, compared to 8.5 for the entire group of prodigies.
On the intelligence test, the prodigies scored in the gifted range, but were not uniformly exceptional. While five of the eight prodigies scored in the 90th percentile or above on the IQ test, one scored at the 70th percentile and another at the 79th percentile.
But just as they did in the autism assessment, the prodigies stood out in one of the sub-tests of the intelligence test. In this case, the prodigies showed an exceptional working memory, with all of them scoring above the 99th percentile.
Working memory is the system in the brain that allows people to hold multiple pieces of information in mind for a short time in order to complete a task.
The findings paint a picture of what it takes to create a prodigy, Ruthsatz said.
"Overall, what we found is that prodigies have an elevated general intelligence and exceptional working memory, along with an elevated autism score, with exceptional attention to detail," Ruthsatz said.
These results suggest that prodigies share some striking similarities with autistic savants -- people who have the developmental disabilities associated with autism combined with an extraordinary talent or knowledge that is well beyond average.
"But while autistic savants display many of the deficits commonly associated with autism, the child prodigies do not," Ruthsatz said. "The question is why."
The answer may be some genetic mutation that allows prodigies to have the extreme talent found in savants, but without the deficits seen in autism. But the answer will require more study, Ruthsatz said.
Read more at Science Daily
Internet Becomes Next Nostradamus for Allergy Season
While it's believed that Nostradamus' prophecies predicted many historical events, his digital successor, the Internet, may be foreseeing the height of allergy suffering. According to allergist Leonard Bielory, M.D., American College of Allergy, Asthma and Immunology (ACAAI) board member, Google search volume is shedding light on the most common allergy symptoms, when searches peak and how they pertain to pollen types.
In his research, being presented at the ACAAI Annual Scientific Meeting, Dr. Bielory found that, due to tree pollens, nasal allergy symptoms are the most common searches from March through May. Symptoms include sneezing, a runny and itchy nose, and stuffiness due to blockage or congestion.
"Allergy sufferers experience heightened allergy symptoms in the spring season, and again during September due to weed pollen and grass season," said Dr. Bielory. "The peak week for all allergy symptom searches is the second week of May, suggesting sufferers may be experiencing both spring and summer allergy symptoms."
Nasal allergy symptoms were also commonly searched during the fall months. The second most common symptom, based on search volume, is eye allergies.
With spring allergy season being only four short months away, the ACAAI advises sufferers to schedule an appointment with their board-certified allergist during the winter months to find relief.
"Treating symptoms early, before they appear, means less suffering," said Dr. Bielory. "An allergist will develop a customized treatment plan to keep you living an active, healthy lifestyle."
According to an ACAAI patient survey, board-certified allergists are successful in treating up to 90 percent of patients with seasonal allergies and 70 to 80 percent with perennial allergies.
Read more at Science Daily
Labels:
Biology,
History,
Human,
Science,
Technology
Nov 8, 2012
Vampire Skeleton Rediscovered in Britain
Details of one of the few "vampire" burials in Britain have emerged as a new archaeological report details the long-forgotten discovery of a skeleton found buried with metal spikes driven through its shoulders, heart area and ankles.
Dating from 550-700 A.D., the skeleton was unearthed in 1959 in the minster town of Southwell, Nottinghamshire, during excavations in preparation for a new school. The dig also turned up Roman remains.
Archaeologist Charles Daniels immediately recognized the skeletal remains as being out of the ordinary, but no further investigation was carried out at that time.
"Daniels did jokingly comment he had 'checked the eye teeth,' clearly associating the skeleton with the vampire being," Matthew Beresford of Southwell Archaeology told Discovery News.
"However, the skeleton had largely been forgotten about since then," Beresford said.
The author of a detailed report on the Southwell skeleton, as well as two other books on the subject of the "vampire" being, Beresford looked at the wider context for the burial, including excavations carried out in recent decades near the "deviant" burial.
He learned 225 skeletons were discovered in the area in 1971.
"We can only ponder as to whether any of those skeletons had a similar practice bestowed upon them," he said.
Only a handful of deviant burials have been recognized in the UK. "Dangerous dead" such as vampires were interred with particular rituals to prevent them rising from their graves and attacking the living.
"Throughout the Anglo-Saxon period the 'punishment' of being buried in water-logged ground, face down, decapitated, staked or otherwise was reserved for thieves, murderers or traitors," Beresford wrote.
The treatment was later extended to all those who did not conform to society's rules.
"These were adulterers, disrupters of the peace, the unpious or oath-breaker. Which of these the Southwell skeleton was we will never know," Beresford said.
The archaeologist believes the remains of the skeleton may still be buried on the site where they originally lay, as Daniels admitted he was unable to retrieve the body completely from the ground.
"There is a final twist in the tale of the Southwell vampire. It seems he was not the last person to be buried in the town who the locals feared might return to plague the living," Beresford wrote on his website.
Historical accounts report that in 1822 one Henry Standley was found guilty of the murder of a hawker named John Dale. After his arrest, Standley was found dead in his cell.
"He had committed suicide by hanging," Beresford said.
A local newspaper report dated Feb. 15, 1822 reveals that Standley was buried near the crossroads and a stake was driven through his body, suggesting that fear of the dead rising from the grave did exist in British society in the 1820s.
Read more at Discovery News
Labels:
Archeology,
Biology,
History,
Human,
Science
'Alien' Horned Dinosaur Discovered
Paleontologists in Canada have discovered fossils of a new 2-ton, 20-foot-long horned dinosaur that roamed the Earth about 80 million years ago. And its headgear would've put on quite a show for the ladies.
The dinosaur, a distant cousin of Triceratops called Xenoceratops foremostensis, is one of the oldest specimens known to date of the ceratopsid group. The beast's name, Xenoceratops, translates to "alien horned-face," referring to its strange pattern of horns on its head and above its brow, and the rarity of such horned dinosaurs in this part of the fossil record.
"It seems to have the general types of ornamentation that we see taken to even greater extremes in later ceratopsids," said David Evans, a paleontologist at the Royal Ontario Museum. "That suggests the elaborate headgear evolved earlier."
A dinosaur in a drawer
In 1958, paleontologist Wann Langston Jr. discovered fragments of three skulls (now known to belong to Xenoceratops) in a rock formation in the badlands of Alberta, Canada. Though the area is now scrubby woodlands filled with hoodoos and sandstone hills, between 77 million and 90 million years ago, the dinosaur's stomping grounds were part of a river system filled with lush vegetation.
But Langston was busy with other discoveries, so he tossed the fossil fragments into a drawer at the Canadian Museum of Nature in Ottawa and promptly forgot about them.
In 2003, Evans and his colleagues learned of the fragments. The team was trying to fill in gaps in the fossil record for the late Cretaceous Period, when some of the most iconic dinosaurs, such as Tyrannosaurus rex and Triceratops, evolved.
As they pieced together the skull fragments and analyzed the distinctive ornamentation on the skull, they realized that Xenoceratops was a completely new species.
"The frills and hooks are the calling card of the ceratopsian species," Evans told LiveScience. "We knew instantly that it was a brand new type of horned dinosaur."
Oddities evolve
Xenoceratops was about the size of a rhinoceros -- about 20 feet (6 meters) long including the tail -- and weighed about 2 tons, Evans said. The dinosaur used its birdlike beak to graze on the cattails, ferns and flowers in primeval river deltas.
The species' most distinctive feature, however, is its spiky head: two hooks jut from its forehead, two massive spikes rise from the top of its head and a frilly shield adorns its neck.
The new species helps fill in a gap in the evolutionary record, said Andrew Farke, a paleontologist at the Raymond M. Alf Museum of Paleontology in Claremont, Calif.
"The bits of anatomy that are preserved on this species give us a lot of great information about how horned dinosaurs as a group evolved," said Farke, who was not involved in the study.
The stags of the dinosaur world, male Xenoceratops probably used their outlandish headgear to show dominance or impress the females, increasing their odds of reproducing, Evans said.
Read more at Discovery News
Flying Dino Too Weak to Lift Off?
Bad news dragon riders: Your dragon can't take off.
A new analysis of the largest of pterodactyls suggests they were too big and their muscles too weak to vault into the air and fly. Instead, they were right at the upper limit of animal flight and needed a hill or stiff breeze so they could soar like hang gliders.
The new analysis was done on the enormous pterosaur Quetzalcoatlus from Late Cretaceous rocks of Big Bend, Texas. Quetzalcoatlus had a wingspan of about 35 feet (10.6 meters), roughly that of an F-16 fighter. It was among the last pterodactyls to look down on dinosaurs 65 million years ago.
The new study, presented on Nov. 7 at the meeting of the Geological Society of America in Charlotte, N.C., puts the mass of the flying reptile at around 155 pounds (70 kilograms). That's near the upper limit of what flesh and bone can support in flight, according to paleontologist Sankar Chatterjee of Texas Tech University in Lubbock.
Chatterjee was spurred to do the research by claims from other researchers that Quetzalcoatlus weighed a great deal more -- up to 440 pounds (200 kilograms) -- and took off by jumping from all fours into the air (called "quad launch").
The claims got a lot of attention, but concerned Chatterjee. So he put Quetzalcoatlus through an aeronautical computer simulation that he has used on other pterosaurs to see what would work.
"There's no way this animal could take off from the ground," said Chatterjee of the quad launch, especially of a more massive animal. "There is no way it could fly."
At least not by jumping directly into the air and taking flight, he said. As for the greater weight suggested by others, that doesn't work in his model either. Despite the fact that Quetzalcoatlus was as large as a giraffe, it could not have weighed more than a medium-sized adult human, he said.
"This is the upper limit for any flying animal," said Chatterjee. "This is really the highest limit for there to be able to fly. Above that, they can't even flap." The only way they were able to make Quetzalcoatlus fly at all, he said, was by employing a hang glider approach to takeoffs.
Other researchers, however, are sticking to their quad launch hypothesis, partly because they can't see how Quetzalcoatlus could weigh as little as 70 kilograms.
"These animals have 2.5- to three-meter-long (8.2- to 9.8-feet-long) heads, three-meter necks, torsos as large as an adult man and walking limbs that were 2.5 meters long," said paleontologist Mark Witton of the University of Portsmouth in the United Kingdom. Quetzalcoatlus skeletons alone weigh 20 kilograms (44 pounds), leaving 50 kilograms (110 pounds) of soft tissue to cover a giraffe-sized skeleton. "(That) leads to one atrophied pterosaur!"
A number of researchers using different techniques are now arguing that 200 kilograms or more is a realistic mass for these giant animals, Witton told Discovery News. That's still very light for an animal of those proportions, he said. But it gives it enough muscle to match its skeleton.
Read more at Discovery News
Pick Your Favorite Doomsday
Volcanoes are giving asteroids a run for their money in the case of the biggest mass extinction event in Earth's history. The latest case has been made at the Geological Society of America meeting in Charlotte this week.
The researchers report they have found new evidence that the worst extinction event ever -- the end-Permian mass extinction some 250 million years ago -- was caused by a severe bout of global warming created by greenhouse gases released by a series of super-massive volcanic eruptions in Siberia. A similar case has been studied for several years now as a major contributor to the extinction of the dinosaurs 65 million years ago (the suspect gases at that time were released by the gigantic Deccan Traps eruptions in India).
I'm not going to analyze the science behind these hypotheses. But I do want to point out how curiously well mass extinction theories mesh with popular thinking at any given time.
Right now there is a great deal of interest in climate change as a driver of major extinction events. Why? One reason is that we have a lot of people studying paleoclimates trying to learn as much as possible about the past so we can better understand our planet. Another reason is that the biggest threat to our planet and survival right now is anthropogenic climate change. But the story goes back much further than that.
In the 1980s climate change wasn't in the news yet and the asteroid impact hypothesis was gaining ground as the primary cause of at least the end-Cretaceous, dino-killing event. The scenario went like this: the asteroid crashed down and stirred up so much dust that Earth experienced a few dark years of what's called a nuclear winter. That winter disrupted the food chain and dinosaurs were wiped out. Of course, the important sociopolitical context for the asteroid impact hypothesis in the 1980s was the horrific "nuclear winter" that threatened humanity due to our proliferation of nuclear arms during the Cold War. However, this isn't the end of the story.
Before the 1980s, the idea of an asteroid impact would have been considered absurd. It went against a 19th century geological concept known as uniformitarianism or gradualism, in which the majority of Earth's changes happen very, very slowly over enormous periods of time. Gradualism was a dominant tenet of geological thinking almost to the end of the 20th century. But we still have one more twist to this tale.
Before gradualism took over, there was something called catastrophism, which was originally based on the Old Testament and considered the Earth young and the geological record we see today as the product of a series of divinely-inspired catastrophes (of course many religious fundamentalists still hold this view, although today it is blatantly anti-scientific). Once catastrophism was supplanted by science and gradualism, any scientist who talked about global catastrophes raining down from the heavens came off sounding like a throwback. That's why it was an uphill battle for hypotheses like asteroid impacts and supervolcanoes to get traction in the late 20th century.
Read more at Discovery News
Super-Earth Discovered in Star's Habitable Zone
The family of planets circling a relatively close dwarf star has grown to six, including a potential rocky world at least seven times more massive than Earth that is properly located for liquid water to exist on its surface, a condition believed to be necessary for life.
Scientists added three new planets to three discovered in 2008 orbiting an orange star called HD 40307, which is roughly three-quarters as massive as the sun and located about 42 light-years away in the constellation Pictor.
Of particular interest is the outermost planet, which is believed to circle its parent star once every 200 days, an orbit that places it within HD 40307's so-called "habitable zone."
The planet's five siblings are all believed to be too close to the star and therefore too hot for water to exist in a liquid state.
"The planetary system around HD 40307 has an architecture radically different from that of the solar system," lead researchers Mikko Tuomi, with the University of Hertfordshire in the United Kingdom, and Guillem Anglada-Escude, with Germany's University of Goettingen, write in a paper to be published in the journal Astronomy & Astrophysics.
The finding suggests there may be many ways for a planet to end up in a star's habitable zone, the astronomers added.
More detailed studies of HD 40307's brood are unlikely because the planets do not appear to transit, or pass in front of, their parent star, relative to Earth's line of sight.
The new findings are based on a re-analysis and refinement of data collected by Europe's High Accuracy Radial velocity Planet Searcher (HARPS) instrument, a light-splitting spectrograph installed at Europe's La Silla Observatory in Chile. Planets beyond the solar system can be detected by the tiny gravitational tugs they exert on the light coming from their parent stars.
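To give a sense of scale for such a signal, here is a back-of-envelope sketch, not from the article, of the stellar wobble speed implied by the figures it does quote (a ~0.75-solar-mass star, a ~200-day period, and a ~7.1 Earth-mass minimum mass), using the standard circular, edge-on orbit approximation:

```python
import math

# Physical constants (SI units)
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
M_EARTH = 5.972e24   # Earth mass, kg

def rv_semi_amplitude(m_planet_kg, m_star_kg, period_s):
    """Radial-velocity semi-amplitude K (m/s) of the star's wobble,
    assuming a circular, edge-on orbit with m_planet << m_star."""
    return ((2 * math.pi * G / period_s) ** (1 / 3)
            * m_planet_kg / m_star_kg ** (2 / 3))

# Figures from the article: 7.1 Earth masses, 0.75 solar masses, 200 days
K = rv_semi_amplitude(7.1 * M_EARTH, 0.75 * M_SUN, 200 * 86400.0)
print(f"K = {K:.2f} m/s")  # of order 1 m/s -- a very subtle Doppler signal
```

A wobble of roughly a meter per second, about walking pace, illustrates why disentangling a planet's tug from stellar activity is so difficult.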
To find HD 40307's sixth planet, scientists had to make the difficult distinction between starlight impacted by a planet's gravity and the effects of stellar activity, such as flares and magnetic storms.
"All we know at this point is that it has a minimum mass of about 7.1 Earth-masses. We have no explicit follow-up planned, though the HARPS team is probably still gathering more data, and may in the future be able to confirm these results, and perhaps add even more planets to the brood," astronomer Steven Vogt, with the University of California's Lick Observatory, wrote in an email to Discovery News.
"We feel pretty comfortable that these six planets are all there," Vogt said.
Astronomers have no hard evidence that the sixth planet is a rocky world, but they point out that recent observations of hot super-Earths transiting bright nearby stars indicate a good fraction of such planets may be made of rock.
Whatever its composition, HD 40307's sixth planet would receive about 62 percent of the radiation that Earth gets from the sun.
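That 62 percent figure follows from simple inverse-square reasoning. The article gives only the percentage, so the stellar luminosity (~0.23 times the sun's) and orbital distance (~0.6 AU) below are assumed illustrative values chosen to be roughly consistent with it:

```python
# Inverse-square sketch of the insolation figure quoted in the article.
# The 0.23 L_sun luminosity and 0.6 AU distance are assumed illustrative
# values (not from the article), roughly consistent with ~62 percent.

def relative_insolation(l_star_solar, a_au):
    """Stellar flux at the planet, relative to Earth's flux from the sun:
    luminosity in solar units divided by orbital distance in AU, squared."""
    return l_star_solar / a_au ** 2

frac = relative_insolation(0.23, 0.6)
print(f"{frac:.0%} of Earth's insolation")  # roughly 60-65 percent
```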
Read more at Discovery News
Scientists added three new planets to three discovered in 2008 orbiting an orange star called HD 40307, which is roughly three-quarters as massive as the sun and located about 42 light-years away in the constellation Pictor.
Of particular interest is the outermost planet, which is believed to fly around its parent star over 200 days, a distance that places it within HD 40307's so-called "habitable zone."
The planet's five siblings are all believed to be too close to the star and therefore too hot for water to exist in a liquid state.
"The planetary system around HD 40307 has an architecture radically different from that of the solar system," lead researchers Mikko Tuomi, with the University of Hertfordshire in the United Kingdom, and Guillem Anglada-Escude, with Germany's University of Goettingen, write in a paper to be published in the journal Astronomy & Astrophysics.
The finding suggests there may be many ways for a planet to end up in a star's habitable zone, the astronomers added.
More detailed studies of HD 40307's brood are unlikely because the planets do not appear to transit, or pass in front of, their parent star, relative to Earth's line of sight.
The new findings are based on a re-analysis and refinement of data collected by Europe's High Accuracy Radial velocity Planet Searcher (HARPS) instrument, a light-splitting spectrograph installed on Europe's La Silla Observatory in Chile. Planets beyond the solar system can be detected by tiny gravitational tugs they exert of the light coming from their parent stars.
To find HD 40307's sixth planet, scientists had to make the difficult distinction between starlight impacted by a planet's gravity and the effects of stellar activity, such as flares and magnetic storms.
"All we know at this point is that it has a minimum mass of about 7.1 Earth-masses. We have no explicit follow-up planned, thought the HARPS team is probably still gathering more data, and may in the future be able to confirm these results, and perhaps add even more planets to the brood," astronomer Steven Vogt, with the University of California's Lick Observatory, wrote in an email to Discovery News.
"We feel pretty comfortable that these six panets are all there," Vogt said.
Astronomers have no hard evidence that the sixth planet is a rocky world, however, but they point out that recent observations of hot super-Earths transiting bright nearby stars indicate a good fraction of the planets can be made of rock.
Whatever its composition, HD 40307's sixth planet would receive about 62 percent of the radiation that Earth gets from the sun.
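The minimum mass quoted above translates directly into the size of the stellar wobble HARPS must tease out of the starlight. Below is a rough sketch of the standard radial-velocity semi-amplitude formula; the orbital period (~200 days) and stellar mass (0.77 solar masses) are illustrative assumptions, not figures given in the article.

```python
import math

# Hedged sketch: the radial-velocity semi-amplitude K that a planet of
# minimum mass ~7.1 Earth masses would induce in its star.  Period and
# stellar mass below are assumed for illustration only.
G       = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M_SUN   = 1.989e30    # kg
M_EARTH = 5.972e24    # kg
DAY     = 86400.0     # s

def rv_semi_amplitude(m_planet, m_star, period, ecc=0.0):
    """Stellar reflex velocity (m/s) for an edge-on (sin i = 1) orbit."""
    return ((2 * math.pi * G / period) ** (1 / 3)
            * m_planet / (m_star + m_planet) ** (2 / 3)
            / math.sqrt(1 - ecc ** 2))

K = rv_semi_amplitude(7.1 * M_EARTH, 0.77 * M_SUN, 200 * DAY)
print(f"reflex velocity ~ {K:.2f} m/s")  # on the order of 1 m/s
```

A signal below about 1 m/s is comparable in size to the jitter produced by stellar activity, which is why separating planetary tugs from flares and magnetic storms was the hard part of this analysis.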
Read more at Discovery News
Nov 7, 2012
Fossils and Genes Brought Together to Piece Together Evolutionary History
Paleontology, with its rocks and fossils, seems far removed from the world of developmental genetics, with its petri dishes and embryos. Whereas paleontology strives to determine "What happened in evolution?" developmental genetics uses gene control in embryos to try to answer "How did it happen?" Combined, the two approaches can lead to remarkable insights that benefit both fields.
In the current issue of the Journal of Vertebrate Paleontology, Hans Thewissen, Ingalls-Brown Professor at Northeast Ohio Medical University (NEOMED), and his colleagues review recent studies that have used modern genetic techniques to shed light on fossils, and vice versa. "It is a very exciting time to be an evolutionary scientist. So many researchers are investigating evolution, either by finding new fossils or by figuring out the genes that underlie changes in evolution. Now it is possible to combine those two fields and go beyond what each field could have accomplished on its own," said Dr. Thewissen.
Their review discusses the profound evolutionary changes that brought about some of the more spectacular animals of today and the past, including dolphins, whales, snakes, bats, elephants, and dinosaurs. For instance, although the transition from a four-legged ancestor to something with only two forelimbs, like a dolphin, or no limbs at all, like a snake, may seem like a big leap, transitional fossils have been discovered that bridge these gaps. Additionally, using developmental genetics, researchers have come to understand that these large changes in shape involved relatively small changes in the working of just a few genes.
Perhaps even more fascinating, recent research has discovered that similarly shaped organisms may not have experienced similar developmental changes in their past. Cetaceans (whales and dolphins) and snakes both lost limbs independently from their respective ancestors through evolution, but they did so in different ways. Snakes lost their forelimbs by basically getting rid of their neck region and leaving no room for forelimbs. During snake embryonic development, no limb buds form in that region of the body. Snakes do still develop hind limb buds as embryos, but the genes that control their growth have been knocked out through the course of evolution, so hind limbs do not develop (except for small stubs in some snakes like pythons). This demonstrates that different developmental mechanisms can be at work even in the evolutionary history of a single animal. Whales and dolphins lost their hind limbs in a process similar to that of snakes.
Dr. Thewissen says, "For me personally, as someone who has spent most of his life studying fossil whales, it is very exciting to be able to use information from the development of living mammals, and use it to teach me about how whale evolution happened, 50 million years ago."
Scientists can even modify the genetic code of living animals to replicate changes that have been observed in the fossil record. As explained in the paper, it has been shown that heightened activity of a particular gene in mouse embryos causes their teeth to grow larger. A similar change occurred during the course of elephant evolution -- early elephants had teeth less than an inch long, while modern elephants have teeth over a foot in length. The genetic changes that brought about this increase in size in elephants may have resembled the ones induced in lab mice.
Read more at Science Daily
Complex Tool Find Argues for Early Human Smarts
Rocks carved into ancient stone arrowheads or into lethal tools for hurling spears suggest humans innovated relatively advanced weapons much earlier than thought, researchers in South Africa say.
The researchers' finds, partially exposed by a coastal storm, suggest ancient peoples were capable of complex forms of thinking, scientists added.
"These people were like you and I," researcher Curtis Marean, a paleoanthropologist at Arizona State University in Tempe, told LiveScience.
The early human brain
Modern humans originated in Africa about 200,000 years ago, but when modern human ways of thinking emerged remains controversial. For instance, some researchers note that the first signs of complex thought such as art appeared relatively late in history, suggesting that genetic mutations linked with modern human behavior occurred as recently as 40,000 years ago. Other scientists argue that modern human thought originated much earlier but that the evidence was largely lost to the rigors of time.
One potential sign of complex thought would be an elaborately produced artifact that would have required capabilities such as language to pass along the technique to future generations.
Some have argued that advanced technologies in Africa frequently appeared and disappeared over time. Now, however, tiny stone blades discovered on the south coast of South Africa and dating back about 71,000 years suggest advanced stoneworking techniques persisted for vast spans of history instead of flickering in and out of use.
"Every time we excavate a new site in coastal South Africa with advanced field techniques, we discover new and surprising results that push back in time the evidence for uniquely human behaviors," Marean said.
Continuity of history
These artifacts were discovered over the course of nine years at a site known as Pinnacle Point, in a storm-prone area with a temperate climate like San Francisco's. Initially, Marean found many artifacts and fossil bones on the beach. Then, one day, a storm exposed deposits of these materials from a cave higher up. So far the researchers have found sediments about 45 feet (14 meters) deep containing artifacts and fossils dating from approximately 50,000 to 90,000 years ago.
"As an archaeologist and scientist, it is a privilege to work on a site that preserves a near-perfect layered sequence capturing almost 50,000 years of human prehistory," said researcher Kyle Brown at the University of Cape Town, South Africa. "Our team has done a remarkable job of identifying some of the subtle but important clues to just how innovative these early humans on the south coast were."
The scientists uncovered thin blades of stone, known as microliths, each about 1.2 inches (3 centimeters) long at most. These were blunted along one edge so they could be glued onto slots carved in wood or bone. The stone used to produce these blades, silcrete (quartz grains cemented by silica), was carefully treated with heat to make it easier to shape.
These microliths could have found use as the earliest known arrowheads. However, researchers suggest they were more likely incorporated into spear-hurling devices known as atlatls. These spear-throwers were essentially sticks with spurs or cups to hold the projectile. Swinging the atlatl provided leverage to increase the distance and killing power of the hurled dart or spear.
The microliths may have served as spurs in these atlatls.
"They are parts of a complex composite weapon," Marean said.
Past research suggested microlithic technology appeared briefly between 60,000 and 65,000 years ago.
"Eleven thousand years of continuity is, in reality, an almost unimaginable time span for people to consistently make tools the same way," Marean said. "This is certainly not a flickering pattern." Moreover, heat treatment of stone was seen at Pinnacle Point about 160,000 years ago, suggesting people there mastered this complex technique for nearly 100,000 years.
The researchers suggest these projectile weapons were pivotal to the success of modern humans as they left Africa and encountered Neanderthals, who apparently possessed only hand-thrown spears.
"When Africans left Africa and entered Neanderthal territory, they had projectiles with greater killing reach, and these early moderns probably also had higher levels of pro-social, hyper-cooperative behavior. These two traits were a knockout punch. Combine them, as modern humans did and still do, and no prey or competitor is safe," Marean said. "This probably laid the foundation for the expansion out of Africa of modern humans and the extinction of many prey as well as our sister species such as Neanderthals."
Read more at Discovery News
'Bear Dog,' Big Cats Among Europe's Top Carnivores
Three tough mammals -- a huge "bear dog" and two saber-toothed cats -- were among Europe's top predators 9 million years ago, according to a new study.
The unusual toothy trio managed to coexist and thrive near what is now Madrid, Spain. The two cat species, lion-sized Machairodus aphanistus and the smaller leopard-sized Promegantereon ogygia, were in the same family Felidae as living big cats and domesticated housekitties.
The prehistoric cats lived together in a woodland area and likely hunted the same species: horses and wild boar.
"The killing technique of these two saber-toothed cats is through a bite to the throat of the immobilized prey that produced a quick death due to the damage inflicted to blood vessels and trachea," study author Soledad Domingo told Discovery News. "These cats used their long, flattened upper canines to cut the throat of their prey in a head-depression movement in which the mandible served as an anchor."
The bear dog (Magericyon anceps), as its name suggests, looked half bear and half dog.
"In fact they were neither dogs nor bears, but in a group of their own," said Domingo, a postdoctoral fellow at the University of Michigan Museum of Paleontology, who added that the entire family of these carnivores went extinct. "The features of the teeth of Magericyon anceps indicate that this bear dog was able to crush bones."
It likely hunted antelopes.
Domingo and colleagues used a dentist's drill with a diamond bit to sample teeth from 69 specimens, including 27 saber-toothed cats and bear dogs. The rest were plant eaters.
They isolated carbon from the tooth enamel and measured the ratio of the heavier carbon-13 isotope to the lighter carbon-12. Both forms are present in the carbon dioxide that plants take in during photosynthesis. When an herbivore eats a plant, the plant leaves an isotopic signature in the animal's bones and teeth.
"This would be the same in your tooth enamel today," Domingo explained. "If we sampled them, we could have an idea of what you eat. It's a signature that remains through time."
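Carbon isotope ratios like these are conventionally reported in per-mil "delta" notation relative to an international reference. The sketch below uses the standard VPDB reference ratio; the sample ratio is hypothetical, chosen only to illustrate the calculation.

```python
# Hedged sketch of the delta-13C notation used in isotope diet studies.
# R_VPDB is the standard reference ratio; the sample ratio below is
# made up for illustration and not taken from the article.
R_VPDB = 0.0112372  # 13C/12C of the Vienna Pee Dee Belemnite standard

def delta13C(ratio_13c_12c):
    """Express a measured 13C/12C ratio in per-mil relative to VPDB."""
    return (ratio_13c_12c / R_VPDB - 1.0) * 1000.0

# A sample slightly depleted in carbon-13 relative to the standard
# yields a negative delta value, as is typical of animals feeding in
# forested (C3-plant-based) food webs.
print(delta13C(0.0109450))  # about -26 per mil
```

Comparing such delta values between species is what lets researchers infer whether the two cats and the bear dog were hunting in wooded cover or in more open terrain.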
Although the saber-toothed cats went after the same prey, the researchers believe that the smaller species could have used tree cover to avoid encountering the larger cat. The bear dog hunted in a more open area that overlapped the cats' territory, but was slightly separated.
"By analogy with modern carnivorans, it is likely that encounters between carnivorans were aggressive and that they could have fought for prey or carcasses, however, it is not likely that these carnivorans had each other as common prey," Domingo said.
The toothy prehistoric trio lived during the late Miocene epoch in a forested region that had patches of grassland. Today the site is known as Cerro de los Batallones.
Domingo, who has been excavating there for the past eight years, said two of its known nine fossil areas are ancient pits with an abundance of carnivore bones. Prey likely became trapped in the pits back in the day, leading to a feast for predators.
Read more at Discovery News
Humans Caused Historic Great Barrier Reef Collapse
The expansion of European settlement in Australia triggered a massive coral collapse at the Great Barrier Reef more than 50 years ago, according to a new study.
The study, published Nov. 6 in the Proceedings of the Royal Society B, found that runoff from farms clouded the pristine waters off the Queensland coast and killed the natural branching coral species, leaving a stunted, weedy type of coral in its place. The findings suggest that decades before climate change and reef tourism, humans were disrupting the ecology of the Great Barrier Reef.
"There was a very significant shift in the coral community composition that was associated with the colonization of Queensland," said study co-author John Pandolfi, a marine biologist at the University of Queensland in Australia.
Europeans began to colonize Queensland, Australia, in the 1860s, cutting down forests to make way for sheep grazing and sugar plantations. By the 1930s, large amounts of fertilizer and pesticide-laden runoff poured from rivers into the nearby ocean.
Several recent studies have shown that snorkelers and climate change kill coral, and one study found that half of the majestic Great Barrier Reef has vanished over the last 30 years.
But Pandolfi's team wondered whether humans had been altering reef ecology for much longer.
To find out, the team drilled sediment cores, 6.5 to 16.5 feet (2 to 5 meters) long, from the seafloor at Pelorus Island, an island fringed by coral reefs off the Queensland coast. When coral dies, new corals sprout on the skeletons of the old organisms, and ocean sediments gradually bury them in place, Pandolfi told LiveScience.
By dating different layers of that sediment, the team reconstructed the story of the reef.
The fast-growing Acropora coral dominated the reef for a millennium. This massive, three-dimensional coral can grow to 16 feet (5 m) high and span 65 feet (20 m) across, forming a labyrinth of nooks and crannies for marine life to hide in, Pandolfi said.
"They're like the big buildings in the city; they house a lot of the biodiversity," he said.
But somewhere between 1920 and 1955, the Acropora stopped growing altogether and a slow-growing, spindly coral called Pavona took its place.
That spelled trouble for the panoply of animal species that shelter in the reef, and for the nearby coastline, because the native Acropora species provide wave resistance to shelter harbors.
Read more at Discovery News
Nov 6, 2012
Living Abroad Can Bring Success, If You Do It Right
"Travel broadens the mind" goes the old adage, and potential employers often agree, valuing the open-mindedness and creativity fostered by such worldliness. But according to new Tel Aviv University research, not all international experiences are created equal.
"Although living abroad does help to hone creative abilities, not all individuals who have lived abroad derive an equal benefit from such experiences," explains Dr. Carmit Tadmor of TAU's Recanati School of Business, who conducted the study with Dr. Adam Galinsky of the Kellogg School of Management at Northwestern University and Dr. William Maddux of the international graduate business school and research institution INSEAD.
The researchers discovered that the simple act of living abroad was not enough to bolster creative and professional success. The potential benefits of extended international travel depend on the ability to simultaneously identify with both home and host cultures, which the researchers call "biculturalism." Identifying with two cultures simultaneously fosters a more complex thinking style that views things from multiple perspectives and forges conceptual links among them.
"Unlike patterns of cultural identification in which individuals endorse only one of the two cultures, bicultural identification requires individuals to take into account and combine the perspectives of both old and new cultures," explains Dr. Tadmor. "Over time, this information processing capability, or 'integrative complexity,' becomes a tool for making sense of the world and will help individuals perform better in both creative and professional domains."
This study was recently published in the Journal of Personality and Social Psychology.
Measuring creative and professional success
The researchers conducted three experiments to determine the impact of biculturalism when living abroad. In the first, 78 MBA students comprising 26 different nationalities at a European business school were asked to complete a series of tasks, including a standard creativity task that asked for as many uses for a brick as possible within a two-minute time limit. In the second experiment, a group of 54 MBA students comprising 18 nationalities at an American business school were asked to describe the new businesses, products, and processes they had invented during their careers. All of the study participants had lived abroad for a period of time.
The studies found that those who identified with both their host culture and their home culture consistently demonstrated more fluency, flexibility, novelty and innovation.
Finally, the third experiment extended the idea, exploring whether the biculturals' advantages also gave them an advantage in the workplace. In this study, 100 Israelis living and working mainly in California's Silicon Valley were interviewed. The researchers found that Israelis who identified with both their home and host cultures enjoyed higher promotion rates and more positive reputations among their colleagues. Across all three studies, the researchers found that bicultural individuals ranked higher on integrative complexity tests than the other participants, and this drove their success.
Taking the hard road to success
The road to biculturalism is fraught with internal conflicts, notes Dr. Tadmor, in which two cultural identities struggle to coexist. It's much easier to surround yourself with your expat community than to straddle two separate worlds. But bypassing the conflicts means giving up the best benefits. Integrative complexity, which is responsible for creative and professional success, evolves through the repetitive resolution of these internal conflicts.
Read more at Science Daily
"Although living abroad does help to hone creative abilities, not all individuals who have lived abroad derive an equal benefit from such experiences," explains Dr. Carmit Tadmor of TAU's Recanati School of Business, who conducted the study with Dr. Adam Galinsky of the Kellogg School of Management at Northwestern University and Dr. William Maddux of the international graduate business school and research institution INSEAD.
The researchers discovered that the simple act of living abroad was not enough to bolster creative and professional success. The potential benefits of extended international travel depend on the ability to simultaneously identify with both home and host cultures, which the researchers call "biculturalism." Identifying with two cultures simultaneously fosters a more complex thinking style that views things from multiple perspectives and forges conceptual links among them.
"Unlike patterns of cultural identification in which individuals endorse only one of the two cultures, bicultural identification requires individuals to take into account and combine the perspectives of both old and new cultures," explains Dr. Tadmor. "Over time, this information processing capability, or 'integrative complexity,' becomes a tool for making sense of the world and will help individuals perform better in both creative and professional domains."
This study was recently published in the Journal of Personality and Social Psychology.
Measuring creative and professional success
The researchers conducted three experiments to determine the impact of biculturalism when living abroad. In the first, 78 MBA students of 26 different nationalities at a European business school were asked to complete a series of tasks, including a standard creativity task that asked for as many uses for a brick as possible within a two-minute time limit. In the second experiment, a group of 54 MBA students of 18 nationalities at an American business school were asked to describe the new businesses, products, and processes they had invented during their careers. All of the study participants had lived abroad for a period of time.
The studies found that those who identified with both their host culture and their home culture consistently demonstrated more fluency, flexibility, novelty and innovation.
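The article names fluency and flexibility but doesn't describe the scoring scheme. In the standard "alternative uses" paradigm these are commonly counted as the number of ideas produced and the number of distinct idea categories; the sketch below illustrates that convention with invented responses and categories, not the study's actual coding.

```python
# Sketch of how an alternative-uses ("brick") task is commonly scored.
# The responses and categories here are invented for illustration.

def score_alternative_uses(responses, category_of):
    """Return (fluency, flexibility): idea count and distinct-category count."""
    fluency = len(responses)
    flexibility = len({category_of[r] for r in responses})
    return fluency, flexibility

category_of = {
    "build a wall": "construction",
    "doorstop": "weight",
    "paperweight": "weight",
    "grind into pigment": "material",
}
responses = ["build a wall", "doorstop", "paperweight", "grind into pigment"]

print(score_alternative_uses(responses, category_of))  # (4, 3)
```

Four ideas spanning three categories: high fluency does not automatically mean high flexibility, which is why the two are scored separately.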
Finally, the third experiment extended the idea, exploring whether the biculturals' advantages also gave them an advantage in the workplace. In this study, 100 Israelis living and working mainly in California's Silicon Valley were interviewed. The researchers found that Israelis who identified with both their home and host cultures enjoyed higher promotion rates and more positive reputations among their colleagues. Across all three studies, the researchers found that bicultural individuals ranked higher on integrative complexity tests than the other participants, and this drove their success.
Taking the hard road to success
The road to biculturalism is fraught with internal conflicts, notes Dr. Tadmor, in which two cultural identities struggle to coexist. It's much easier to surround yourself with your expat community than to straddle two separate worlds. But bypassing the conflicts means giving up the best benefits. Integrative complexity, the capacity behind creative and professional success, develops through the repeated resolution of these internal conflicts.
Read more at Science Daily
Dino Named After Lord of the Rings' Sauron
Earlier this year a team of palaeontologists came into the possession of what appeared to be a 95 million-year-old skull cap from a previously unknown dinosaur. Further analysis showed that the bone likely belonged to a carcharodontosaurid -- an offshoot of the familiar Allosaurus. Given its unique domed skull, the researchers concluded that it was in fact a newly discovered species, one they've decided to name after the demonic Sauron from the Lord of the Rings series.
Its full name is Sauroniops pachytholus, a massive bipedal carcharodontosaur that lived during the Cretaceous period. The paleontologists, Andrea Cau, Fabio Dalla Vecchia, and Matteo Fabbri, felt that the single fragment provided enough evidence to warrant the classification of an entirely new species, and their work describing the new dinosaur has since been published in Acta Palaeontologica Polonica.
Interestingly, the discovery now adds credence to the hypothesis that a fourth large theropod existed in the Cenomanian of Morocco together with Carcharodontosaurus, Deltadromeus, and Spinosaurus (yes, all four of them at the same time — must have been a nice place to visit).
Unfortunately, however, the limited bone fragment reveals achingly little about Sauron. That said, the researchers speculate that it was more than 30 feet in length, and that it was probably just as large as the Carcharodontosaurus. The palaeontologists are obviously hoping to find more fossils to be absolutely sure.
There's also the prominent bump on its head. Brian Switek from Smithsonian offers some theories as to its function:
Why did such a large theropod have a prominent bump on its head? In other theropod lineages, such as the abelisaurids, bumps, knobs and horns are common forms of ornamentation. Perhaps the same was true for Sauroniops -- thanks to Acrocanthosaurus and the sail-backed Concavenator, we know that carcharodontosaurs showed off with visual signals. Then again, Cau and coauthors speculate that the dome might have been a sexual signal or might have even been used in head-butting behavior. I think the last hypothesis is unlikely, especially since we don't know what the microstructure of the dome looks like and there's no evidence of pathology, but it's still a distant possibility.
Read more at Discovery News
Hunting for the Real 'Planet X'
The announcement of the discovery of exoplanet Alpha Centauri Bb on Oct. 16 is a testimony to how far planetary detection techniques have come over the last few decades.
It brings the total of confirmed exoplanets -- or "extra-solar planets" -- to a staggering 825. However, the search for planets in our own solar system has subsided since the pioneering days of the late 18th century, which saw the discovery of Uranus, and the identification of Neptune some 65 years later. The idea of another planet, dubbed 'Planet X,' inspired astronomers to keep searching for another 100 years in a hunt that was full of twists and turns.
The hunt for Planet X began in 1781 when British astronomer Sir William Herschel was studying stars in the constellation of Taurus and noticed one star seemed slightly fuzzy or nebulous in appearance. A few days later it seemed to have moved position -- he concluded it was a comet. Further study revealed it was actually a planet -- Uranus -- the seventh planet in our solar system and beyond the orbit of Saturn.
Detailed observations of Uranus' movement revealed an orbit that seemed to be influenced by another, even more distant, object. Mathematicians studying the data predicted the position of an eighth planet before it was officially discovered. Visual confirmation of Neptune's existence was announced in 1846.
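A back-of-envelope comparison shows why Uranus's orbit could betray an unseen neighbor: Neptune's pull is a tiny but systematic fraction of the Sun's, small enough to hide for years yet large enough to accumulate into a measurable drift. The orbital radii and GM values below are standard textbook figures, not taken from the article.

```python
# Compare the Sun's gravitational acceleration on Uranus with Neptune's
# pull at closest approach.  Standard textbook values, for illustration.
AU = 1.496e11                # metres
GM_SUN = 1.327e20            # m^3 s^-2
GM_NEPTUNE = 6.84e15         # m^3 s^-2

r_uranus = 19.2 * AU         # Uranus's orbital radius
r_gap = (30.1 - 19.2) * AU   # Uranus-Neptune separation at conjunction

a_sun = GM_SUN / r_uranus**2
a_neptune = GM_NEPTUNE / r_gap**2

print(a_neptune / a_sun)     # ~1.6e-4: one part in several thousand
```

A perturbation of roughly one part in 6,000 sounds negligible, but 19th-century positional astronomy was precise enough to notice its cumulative effect over decades, which is what let mathematicians predict Neptune's position.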
Using the same techniques to study the orbital characteristics of both Uranus and Neptune revealed they were both still being tugged at by the gravitational force from another unknown object. The search for the ninth planet in the solar system began and it was American astronomer Percival Lowell who identified possible candidates.
Some years after Lowell's death, in 1930, Pluto was identified by Clyde Tombaugh (an astronomer working at Lowell Observatory) and was believed to be the final member of the solar system's planetary family.
However, the 1978 discovery of Pluto's moon Charon reopened the Planet X debate. Through accurate measurements of Charon's orbit, the mass of Pluto could be deduced. Ultimately it showed that the 'ninth planet' couldn't possibly have affected the orbits of Uranus and Neptune as observations appeared to show.
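The mass deduction from Charon's orbit is a direct application of Kepler's third law: the system mass follows from the moon's semi-major axis and period alone. As a sketch, using standard catalogue values for Charon's orbit (not quoted in the article):

```python
import math

# Pluto's system mass from Charon's orbit via Kepler's third law:
#   M = 4 * pi^2 * a^3 / (G * P^2)
# Charon's semi-major axis and period are standard catalogue values.
G = 6.674e-11          # m^3 kg^-1 s^-2
a = 1.96e7             # metres (~19,600 km)
P = 6.39 * 86400       # seconds (~6.39 days)

m_pluto_system = 4 * math.pi**2 * a**3 / (G * P**2)
m_earth = 5.97e24

print(m_pluto_system)            # ~1.5e22 kg
print(m_pluto_system / m_earth)  # ~0.002 Earth masses
```

At roughly a quarter of a percent of Earth's mass, Pluto is orders of magnitude too light to perturb the orbits of Uranus or Neptune, which is exactly the conclusion that reopened the Planet X debate.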
The renewed interest in Planet X was short-lived: the Voyager 2 flyby of Neptune in 1989 revealed its mass was less than previously thought. Reapplying this knowledge showed the outermost "ice giant" planets were behaving exactly as they should, and the orbital perturbations were down to observational error. It seemed the myth of Planet X had finally died.
This could have been the end of the Planet X saga, but recent studies of the Kuiper Belt -- a region of icy minor planets located in the outermost reaches of the solar system -- suggest this may not be the case.
Read more at Discovery News
Astrophysicist 'Discovers' Superman's Krypton
A prominent astrophysicist has pinned down a real location for Superman's fictional home planet of Krypton.
Krypton is found 27.1 light-years from Earth, in the southern constellation Corvus (The Crow), says Neil deGrasse Tyson, director of the American Museum of Natural History's Hayden Planetarium in New York City. The planet orbits the red dwarf star LHS 2520, which is cooler and smaller than our sun.
Tyson performed the celestial sleuthing at the request of DC Comics, which wanted to run a story about Superman's search for his home planet.
The new book -- Action Comics Superman #14, titled "Star Light, Star Bright" -- comes out Wednesday (Nov. 7). Tyson appears within its pages, aiding the Man of Steel on his quest.
"As a native of Metropolis, I was delighted to help Superman, who has done so much for my city over all these years," Tyson said in a statement. "And it's clear that if he weren't a superhero he would have made quite an astrophysicist."
You'll have to read "Star Light, Star Bright" to find out just how Superman and Tyson pinpoint Krypton. For amateur astronomers who want to spot the real star LHS 2520 in the night sky, here are its coordinates:
Right Ascension: 12 hours 10 minutes 5.77 seconds
Declination: -15 degrees 4 minutes 17.9 seconds
Proper Motion: 0.76 arcseconds per year, along 172.94 degrees from due north
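The catalogue-style coordinates above can be turned into decimal degrees, and the proper motion split into its sky components, with a few lines of arithmetic. The only assumption beyond the listed numbers is the standard astronomical convention that position angle is measured east of north.

```python
import math

# Convert LHS 2520's listed position to decimal degrees and project its
# proper motion onto declination (north) and right-ascension (east) axes.
ra_deg = (12 + 10/60 + 5.77/3600) * 15      # RA hours -> degrees
dec_deg = -(15 + 4/60 + 17.9/3600)          # southern declination

mu = 0.76                                    # arcsec/year, total proper motion
pa = math.radians(172.94)                    # position angle, east of north
d_dec = mu * math.cos(pa)                    # northward component, arcsec/yr
d_ra = mu * math.sin(pa)                     # eastward component, arcsec/yr

print(round(ra_deg, 3), round(dec_deg, 3))   # 182.524 -15.072
print(round(d_dec, 3), round(d_ra, 3))       # -0.754 0.093 (mostly southward)
```

At 0.76 arcseconds per year the star crawls across the sky: it would take well over two millennia to drift the width of the full Moon.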
Superman was born on Krypton but was launched toward Earth as an infant by his father, Jor-El, just before the planet's destruction. After touching down in Kansas, Superman was raised as Clark Kent by a farmer and his wife.
Now Superman will apparently know exactly where he came from.
Read more at Discovery News
Nov 5, 2012
Cockatoo 'Can Make Its Own Tools'
A cockatoo from a species not known to use tools in the wild has been observed spontaneously making and using tools for reaching food and other objects.
A Goffin's cockatoo called 'Figaro', reared in captivity and living near Vienna, used his powerful beak to cut long splinters out of the wooden beams of his aviary, or twigs out of a branch, to reach and rake in objects placed beyond his reach. Researchers from the Universities of Oxford and Vienna filmed Figaro making and using these tools.
How the bird discovered how to make and use tools is unclear, but the feat shows how much we still don't understand about the evolution of innovative behaviour and intelligence.
A report of the research is published this week in Current Biology and an accompanying video showing the behaviour is available here: http://www.zoo.ox.ac.uk/group/kacelnik/movie_figaro_for_media.mov
Dr Alice Auersperg of the University of Vienna, who led the study, said: 'During our daily observation protocols, Figaro was playing with a small stone. At some point he inserted the pebble through the cage mesh, and it fell just outside his reach. After some unsuccessful attempts to reach it with his claw, he fetched a small stick and started fishing for his toy.
'To investigate this further we later placed a nut where the pebble had been and started to film. To our astonishment he did not go on searching for a stick but started biting a large splinter out of the aviary beam. He cut it when it was just the appropriate size and shape to serve as a raking tool to obtain the nut.
'It was already a surprise to see him use a tool, but we certainly did not expect him to make one by himself. From that time on, Figaro was successful in obtaining the nut every single time we placed it there, nearly each time making a new tool. On one attempt he used an alternative solution, breaking a side arm off a branch and modifying the leftover piece to the appropriate size for raking.'
Professor Alex Kacelnik of Oxford University, an author of the study, said: 'Figaro shows us that, even when they are not habitual tool-users, members of a species that are curious, good problem-solvers, and large-brained, can sculpt tools out of a shapeless source material to fulfil a novel need.
'Even though Figaro is still alone among his species, and among parrots, in showing this capacity, his feat demonstrates that tool craftsmanship can emerge from intelligence not specialized for tool use. Importantly, after making and using his first tool, Figaro seemed to know exactly what to do, and showed no hesitation in later trials.'
Professor Kacelnik previously led studies of the naturally tool-using New Caledonian crows. One of them, named Betty, surprised scientists by fashioning hooks out of wire to retrieve food that was out of reach. These crows use and make tools in the wild, and live in groups that may support culture, but there was no precedent for Betty's form of hook making. Her case is still considered a striking example of individual creativity and innovation, and Figaro seems ready to join her.
Read more at Science Daily
Rare Specimens From a World-Class Skull Collection
If you were to go clicking down Alan Dudley’s anonymous-looking English street in Google’s Street View, there’d be no reason to stop outside his anonymous-looking English home. But inside, in a space no bigger than a child’s bedroom, Dudley has amassed one of the world’s most impressive private collections of skulls -- some 2,500 of them, incredibly well-organized and impeccably preserved. A good chunk of the animal kingdom is represented -- fish, reptiles, birds, and mammals fill the space.
British journalist Simon Winchester’s first reaction to the collection was horror. “I thought, ‘This is macabre, this is horrible, this is grotesque,’ because I was, I think like most of us, brought up to associate skulls with piracy or warning or danger or death,” Winchester says. “But then you see beneath the muscle and the skin something so beautiful, so finely constructed, that you can understand the fascination that someone like Dudley has. It may sound rather corny, but it gives you a new reverence for life.”
Winchester was so moved that he, along with an ex-BBC producer, created an app that showcased the collection. Though highly regarded when it launched last year, the app didn’t sell particularly well. But publishers in New York were interested in the material. When he was asked to turn the app into a book, Winchester happily agreed. The result is Skulls: An Exploration of Alan Dudley’s Curious Collection, which was published earlier this month. The book features hundreds of Dudley’s skulls, supplemented with rarer specimens and Winchester’s writings on skull lore and history. We spoke to Winchester about what he learned and the most interesting skulls he discovered. These are some of his favorites.
Above:
Babirusa from North Sulawesi
Winchester loves “the extraordinary canine teeth that look like horns but are actually teeth which curve back into its own head and make it look utterly weird.”
Read more at Wired Science
Why Dogs Find Some Toys Boring
Ever bring a new toy home for your dog -- only for the gizmo to end up neglected and ignored on the floor?
It turns out there could be a way to avoid such flops in the future with new research detailing which toys will either interest or bore canines. The study, published in the journal Animal Cognition, sheds light on why dogs ignore some toys after just a minute of investigation, while other toys become coveted favorites.
"Because we think that dogs perceive toys in the same way that wolves perceive prey, they prefer toys that either taste like food or can be torn apart; however, the latter can cause health problems if the dog accidentally swallows some of the pieces," co-author John Bradshaw, a researcher in the University of Bristol's Veterinary School, told Discovery News.
Co-author Anne Pullen, also at the University of Bristol, added that dog toys should be "soft, easily manipulable toys that can be chewed easily and/or make a noise."
As for what toys cause many dogs to grow bored, Pullen said, "Dogs quickly lose interest in toys with hard unyielding surfaces, and those that don't make a noise when manipulated."
The team, including colleague Ralph Merrill of the Waltham Center for Pet Nutrition, has studied canine play and dog toys for some time. Their latest study involved presenting multiple kennel-housed Labrador retrievers with one toy for 30-second periods until interaction ceased.
Prior research has looked at other dogs, but Labradors were chosen for this study "because they're very popular pets," Merrill told Discovery News.
Bradshaw added that Labradors, due to their breeding, are one of the most playful breeds "and we had to be sure that the dogs we studied would play with the toys for a few minutes at least, otherwise we couldn't have measured what would get them playing again once they'd lost interest in the original toy."
They presented the dogs with toys of varying types, including different colors and odors. The researchers then gave the dogs a unique toy that contrasted with whatever one the canines were playing with first.
It was clear that all of the dogs showed intense, but transient, interest toward nearly all new toys. Dogs appear to be hard-wired to explore any novel object -- toy or not. In the case of toys, the problem is that dogs can become habituated to them quickly, which leads to boredom and neglected toys.
Changing the delay from habituation to presentation of the second toy, between 10 seconds and 15 minutes, did not affect the dogs' duration of play. No single toy characteristic altered the test results much either, suggesting that getting used to the stimulus qualities of a toy -- be they through smell, sound, color, texture -- is the clincher for canine boredom.
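The habituation-dishabituation pattern the researchers describe can be caricatured as a toy model: interest decays with each repeated presentation of the same stimulus and snaps back for a novel one. This is purely illustrative, not the study's analysis, and the decay rate is invented.

```python
# Toy model of habituation: interest in a toy halves with each repeated
# 30-second presentation and resets to full for a novel toy.
# Illustrative only; not the analysis used in the study.

def interest_over_trials(trials, decay=0.5):
    """trials is a sequence of toy ids; interest starts at 1.0 per new toy."""
    seen = {}
    out = []
    for toy in trials:
        seen[toy] = seen.get(toy, 0) + 1
        out.append(decay ** (seen[toy] - 1))   # 1.0, 0.5, 0.25, ... per toy
    return out

# Same toy four times, then a novel one: interest collapses, then rebounds.
print(interest_over_trials(["ball"] * 4 + ["rope"]))
# [1.0, 0.5, 0.25, 0.125, 1.0]
```

The study's finding that the 10-second-to-15-minute delay made no difference maps onto this sketch as a decay that depends on exposure count, not on elapsed time between exposures.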
Read more at Discovery News
Rarest Whale Seen for the First Time
The world's rarest whale, previously only known from a few bones, was seen for the first time on a New Zealand beach, according to a new Current Biology paper.
The elusive marine mammal is the spade-toothed beaked whale (Mesoplodon traversii). The good news is that it was seen at all, revealing that it still exists. The bad news is that the sighting was of a mother and her male calf, both of which became stranded and died on the beach.
"This is the first time this species -- a whale over five meters (about 16.5 feet) in length -- has ever been seen as a complete specimen, and we were lucky enough to find two of them," Rochelle Constantine of the University of Auckland said in a press release. "Up until now, all we have known about the spade-toothed beaked whale was from three partial skulls collected from New Zealand and Chile over a 140-year period. It is remarkable that we know almost nothing about such a large mammal."
The discovery actually happened two years ago, when the whales live-stranded and died on Opape Beach, New Zealand. Only after DNA analysis was the rare species identified; at first, the whales were mistaken for the much more common Gray's beaked whales.
"When these specimens came to our lab, we extracted the DNA as we usually do for samples like these, and we were very surprised to find that they were spade-toothed beaked whales," Constantine said. "We ran the samples a few times to make sure before we told everyone."
Read more at Discovery News
Nov 4, 2012
Scientists Monitor Comet Breakup
The Hergenrother comet is currently traversing the inner solar system. Amateur and professional astronomers alike have been following the icy dirt ball over the past several weeks as it has been generating a series of impressive outbursts of cometary dust. Now comes word that the comet's nucleus has taken the next step in its fragmentation.
"Comet Hergenrother is splitting apart," said Rachel Stevenson, a post-doctoral fellow working at NASA's Jet Propulsion Laboratory in Pasadena, Calif. "Using the National Optical Astronomy Observatory's Gemini North Telescope on top of Mauna Kea, Hawaii, we have resolved that the nucleus of the comet has separated into at least four distinct pieces resulting in a large increase in dust material in its coma."
With more material to reflect the sun's rays, the comet's coma has brightened considerably.
"The comet fragments are considerably fainter than the nucleus," said James Bauer, the deputy principal investigator for NASA's NEOWISE mission, from the California Institute of Technology. "This is suggestive of chunks of material being ejected from the surface."
The comet's fragmentation event was initially detected on Oct. 26 by a team of astronomers from the Remanzacco Observatory, using the Faulkes Telescope North in Haleakala, Hawaii. The initial fragment was also imaged by the WIYN telescope group at Kitt Peak National Observatory in Arizona.
For those interested in viewing Hergenrother, the comet can be seen between the constellations of Andromeda and Lacerta with a larger telescope under a dark sky.
Read more at Science Daily
Asteroid Belts Could Be Key to Finding Alien Life
If we want to find intelligent life elsewhere in the universe, it might be wise to look for stars with asteroid belts similar to the one in our own Solar System.
According to the theory of punctuated equilibrium, evolution goes faster and further when life has to make rapid changes to survive new environments — and few things have as dramatic an effect on the environment as an asteroid impact. If humans evolved thanks to asteroid impacts, intelligent life might need an asteroid belt like our own to provide just the right number of periodic hits to spur evolution on. Only a fraction of current exoplanet systems have these characteristics, meaning places like our own Solar System — and intelligent aliens — might be less common than we previously thought.
Astronomers Rebecca Martin of the University of Colorado in Boulder and Mario Livio of the Space Telescope Science Institute in Baltimore have hypothesised that the location of the Solar System’s asteroid belt — between Mars and Jupiter — is not an accident, and is actually necessary for life. As the Solar System formed, the gravitational forces between Jupiter and the Sun would have pulled and stretched clumps of dust and planetoids in the inner Solar System. The asteroid belt lies on the so-called “snow line”: volatile materials like ice stay frozen further out, but closer in they melt and break apart.
During the formation of the Solar System, cold rock and ice coalesced into the planets as we know them. As Jupiter formed, however, it shifted in its orbit slightly closer to the Sun before stopping. The tidal forces at work between Jupiter and the Sun would have torn apart the material on the snow line, preventing a planet from forming and leaving behind an asteroid belt — which today has only one percent of the mass it would originally have held.
Those asteroids would have bombarded the inner Solar System — including Earth — and, in theory, provided the raw materials needed for life (like water) while giving evolution a kickstart by drastically changing the early Earth’s climate and environment. To check that this wasn’t just something restricted to our Solar System, Martin and Livio looked at data from NASA’s Spitzer telescope, which has so far found infrared signals around 90 different stars that can indicate the presence of an asteroid belt. In every case, the belts were located exactly where Martin and Livio had predicted the snow line should be relative to each star’s mass, supporting their snow-line theory of asteroid belt formation.
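The idea that the snow line's position can be predicted from a star's properties can be illustrated with a standard blackbody estimate (a common textbook approximation, not necessarily the specific model Martin and Livio used): water ice sublimates above roughly 170 K, and a body's equilibrium temperature at distance r scales as L^(1/4)/sqrt(r), so the snow line moves outward with the square root of the star's luminosity. A minimal sketch, with the function name snow_line_au chosen here for illustration:

```python
def snow_line_au(luminosity_solar: float, t_ice: float = 170.0) -> float:
    """Estimate the snow-line radius (in AU) for a star of given luminosity.

    Uses the blackbody equilibrium temperature T ~ 278 K * L**0.25 / sqrt(r),
    with r in AU and L in solar luminosities, solved for T = t_ice.
    """
    return (278.0 / t_ice) ** 2 * luminosity_solar ** 0.5

# For a Sun-like star (L = 1) the estimate lands near 2.7 AU,
# inside the real asteroid belt's span of roughly 2.2 to 3.2 AU.
print(f"{snow_line_au(1.0):.1f} AU")
```

Brighter stars push the snow line (and any belt that forms there) farther out, which is why observed belt locations can be compared against each star's mass and luminosity.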
If these are the circumstances that allow intelligent life to evolve, then finding aliens we can chat with becomes a lot harder — few of the stars with exoplanets that we’ve found so far have the right setup of a dusty asteroid belt on the snow line with a gas giant parked just outside it.
If the gas giant forms but doesn’t shift inward slightly, as Jupiter did, the belt becomes so full of large objects that the inner planets are bombarded too frequently for life to fully take hold. If the gas giant keeps migrating inward, it won’t just stop the belt turning into a planet — it will sweep up everything of any serious size, including any planets life could evolve on, leaving behind only minor fragments of space rock and dust.
Martin and Livio then looked at 520 gas giants found orbiting other stars — in only 19 cases were they outside where that star’s snow line would be expected to lie. That means fewer than four percent of exoplanet systems would have the right setup to support the evolution of advanced, intelligent life in accordance with the punctuated equilibrium theory.
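The quoted statistic follows directly from those counts:

```python
# 19 of the 520 known gas giants lie outside their star's expected snow line.
fraction = 19 / 520
print(f"{fraction:.1%}")  # prints 3.7%, i.e. fewer than four percent
```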
Read more at Wired Science
The Oldest Trees on the Planet
Trees are some of the longest-lived organisms on the planet. At least 50 trees have been around for more than a millennium, but there may be countless other ancient trees that haven’t been discovered yet.
Trees can live such a long time for several reasons. One secret to their longevity is their compartmentalized vascular system, which allows parts of the tree to die while other portions thrive. Many create defensive compounds to fight off deadly bacteria or parasites.
And some of the oldest trees on earth, the great bristlecone pines, don’t seem to age like we do. At 3,000-plus years, these trees continue to grow just as vigorously as their 100-year-old counterparts. Unlike animals, these pines don’t rack up genetic mutations in their cells as the years go by.
Some trees defy time by sending out clones, or genetically identical shoots, so that one trunk’s demise doesn’t spell the end for the organism. The giant colonies can have thousands of individual trunks, but share the same network of roots.
This gallery contains images of some of the oldest, most venerable and impressive trees on earth.
Pando
While Pando isn’t technically the oldest individual tree, this clonal colony of Quaking Aspen in Utah is truly ancient. The 105-acre colony is made of genetically identical trees, called stems, connected by a single root system. The “trembling giant” got its start at least 80,000 years ago, when all of our human ancestors were still living in Africa. But some estimate the woodland could be as old as 1 million years, which would mean Pando predates the earliest Homo sapiens by 800,000 years. At 6,615 tons, Pando is also the heaviest living organism on earth.
The photo above of the Pando colony was taken by Rachel Sussman, as part of The Oldest Living Things In The World project.
Read more at Wired Science
New Three-Fingered Frog Discovered
On a trek across an Atlantic rainforest reserve in southern Brazil, biologist Michel Garey recalled how on his birthday in 2007 he chanced upon what turned out to be a new species of tiny, three-fingered frog.
"I was doing research with two friends on a hilltop in the reserve and I stumbled into this unusual frog with only three fingers," he told a small group of reporters this week on a tour of Salto Morato, a nature preserve owned by Brazil's leading cosmetic firm Boticario.
"It happened on February 14, 2007: My birthday. What a treat!" he said.
But it was not until June this year that the discovery of this new species -- Brachycephalus tridactylus -- was officially established. A report on his finding was published in Herpetologica, a quarterly international journal focusing on study and conservation of amphibians and reptiles.
"At the time I was doing some other work related to ecology and I figured I could wait as no one else doing frog research would have access to the area," Garey said.
"It took me 18 months from early 2011 to collect seven of the new frogs, go to museums to compare them with other species, realize that they were new, write my paper and have it published in the journal."
The tiny Brachycephalus tridactylus was found at an altitude of around 900 meters (3,000 feet). Its most striking feature is the absence of a fourth finger, which Garey attributes to an evolutionary process rather than to environmental effects.
The frog, which measures less than 1.5 centimeters in length, is mostly orange with olive-gray spots and dots on its body.
Garey said the male frog makes around 30 mating calls a day, sounds he described as "a single short note that decreases in dominant frequency from beginning to end."
Garey said he could not estimate the frog population, but plans to do so in a future research project.
The frog is part of 43 amphibian species found in this 2,253-hectare (5,567-acre) reserve, located in Guaraquecaba, the easternmost city in the southern state of Parana.
Experts estimate around 950 amphibian species live across Brazil and more than 6,700 around the world.
Amphibians -- cold-blooded animals such as frogs, toads, salamanders and newts -- are increasingly threatened by climate change, pollution, and the emergence of a deadly and infectious fungal disease, which has been linked to global warming.
One-third of the known species are threatened with extinction, according to the Global Amphibian Assessment, an extensive survey of the world's amphibian species. More than 120 species are believed to have gone extinct since 1980.
Frogs spend part of their life in water and on land, so understanding their complex life cycle is crucial because they can serve as "bioindicators of environmental quality", said Garey.
Frogs "have permeable skin which make them more susceptible to ultra-violet radiation and their body temperatures change with the environment," Garey said.
"As larvae in the water, they eat various organisms such as algae and as adults they eat insects. The larvae are also eaten by fish while the adults are eaten by cobras and mammals," he added. "So they are having a cascade effect in the food chain."
Read more at Discovery News