Dec 6, 2016
The researchers focused their study on the garden pea Pisum sativum, seedlings of which they placed at the base of a Y-shaped maze. Then, in a series of training sessions, they put a fan and a light source at the end of either the same arm or opposing arms of the Y.
According to the scientists, the seedlings became better seekers of precious light by learning to associate the breeze of the fan with the location where the light would shine, growing toward that location even when the light was removed.
"The ability of seedlings to anticipate both the imminent arrival of light ('when') and its direction ('where') based on the presence and position of the fan indicates that plants are able to encode both temporal and spatial information and modify their behavior under the control of environmental cues," the scientists wrote.
The team suggested that the type of learning demonstrated by the seedlings should no longer be considered exclusive to the animal kingdom: "Our results show that associative learning is an essential component of plant behavior. We conclude that associative learning represents a universal adaptive mechanism shared by both animals and plants."
"Whilst the possibility that plants also learn by association has been considered by earlier studies," the scientists added, "our current study provides the first unequivocal evidence."
The Western Australia-based scientists said they're aware of how their "smart plant" discoveries might be interpreted.
"Because our findings are unexpected, we anticipate that this study will stir a lively and exciting debate on the origin and properties of memory, learning and ultimately intelligent behavior in biological systems," said study lead Monica Gagliano, in a statement.
The researchers say their study was inspired by Russian scientist Ivan Pavlov's famed research on conditioned responses in dogs, which showed how behavior could be modified through conditioning.
Gagliano and her team have documented their findings in the online journal Scientific Reports.
From Discovery News
One of the first successful businesswomen of 19th century Europe and a pioneer of the cult of celebrity, Tussaud died in 1850 at the age of 89. Her death certificate only vaguely recorded "old age" as the cause of her demise.
According to her two sons, until a few days before her death, Madame Tussaud sat at the entrance of her exhibition — which now has branches in dozens of locations worldwide — to collect the public's shillings.
But this image of a strong, healthy woman in charge of her business until the very end is likely false.
"It was a family concern to depict a very efficient Madame Tussaud," first author Francesco Galassi, at the Institute of Evolutionary Medicine at the University of Zurich, Switzerland, said.
"However, a re-analysis of the correspondence of her youngest son Francis tells a different story," he added.
A letter written by Francis to his father in 1848, two years before Madame Tussaud's death, reveals that Marie was "growing very feeble."
"At times she is very ill and she suffers from asthma which allows her no rest at night ... Her legs are bad like yours, and she has bunions that hurt her when she walks," Francis wrote.
According to Galassi and his colleagues (Louise Baker, archivist at Madame Tussauds; Roberta Ballestriero, at the University of the Arts, London; and Frank Rühli, at the Institute of Evolutionary Medicine at the University of Zurich), a cardiorespiratory disease can explain fatigue and weakness, asthma and varicose veins.
"Heart failure, primary or secondary to pulmonary or systemic disease (eg. hypertension) would account for all of these symptoms," the researchers wrote.
They noted that an alternative diagnosis could be progressive lung disease, such as emphysema, chronic bronchitis or asthma, which could still have had consequences for the heart.
Further help with the diagnosis came from historical sources, which report that Tussaud's final illness lasted five days.
"This is suggestive of an infection, such as pneumonia, which is still common today with patients with chronic obstructive pulmonary disease," Galassi said.
Read more at Discovery News
The finds include a 3-foot-long (1 meter) section of mammoth tusk, as well as a skull and partial tusks from a much younger animal, which might have been either a mammoth or a mastodon, according to The Source, a transportation blog about the L.A. Metro.
Though the ice-age fossils (whose exact age has not yet been determined) are certainly rarer treasures to unearth beneath the subway than rat "fossils" and "coprolites," old chicken wings or discarded coffee cups, the finds actually aren't all that surprising. The area around the site of the fossil discovery, near the La Brea/Wilshire station, is not far from the La Brea Tar Pits, an area of central Los Angeles where natural asphalt has been seeping up from the ground for the last 40,000 years.
Over the eons, this constant ooze of asphalt has created sticky pits in valleys that would often become obscured by leaves, branches and other ground cover. As a result, unwary animals stepped into the sticky death traps. The viscous ooze trapped small animals and insects immediately, while larger beasts like mammoths sank inches into the tar, struggling to get out before becoming stuck, researchers have noted. The dead or dying animals attracted predators as well — some of which also became stuck in the asphalt. All told, more than 1 million fossils have been found in the tar pits, according to the La Brea Tar Pits & Museum.
Mammoths and mastodons are both proboscideans. Though both were majestically large, with shaggy coats and impressively curved tusks, mammoths are much more closely related to modern-day elephants, having arisen about 5 million years ago in Africa. By contrast, mastodons arose about 27 million to 30 million years ago.
During the Pleistocene epoch, between 1.8 million and 11,700 years ago, mammoths and mastodons roamed over the part of North America that was not covered by ice sheets, including coastal California.
Read more at Discovery News
Though this makes logical sense, many puzzles remain, and astronomers still aren't quite sure how the gas and dust around young stars really form planets. The question underlies a decades-long mystery in planetary formation models, but we are closing in on some answers.
Now, the largest radio observatory on the planet has studied the light coming from a young star in an effort to understand the initiation of planet building, and astronomers think they've added an important clue to the mystery surrounding the evolution of planets.
The Atacama Large Millimeter/submillimeter Array in Chile was used to study a specific type of polarization of radio waves coming from the 5-million-year-old star HD 142527. The star, which is approximately double the mass of our sun and located 500 light-years away, possesses a ring of dust and gas that astronomers believe forms the building blocks of a planetary system. But how this dust will turn from microscopic specks into huge rocky planets measuring thousands of miles across is a quandary.
|ALMA observation of the dust ring surrounding HD 142527|
Radio waves are emitted by the star and then scattered by particles in the ring. This scattered emission carries a "fingerprint" of the particles within the ring. In this case, Kataoka was able to discern the size of the dust particles from their polarization fingerprint, and he stumbled upon a surprise: The individual dust particles were a lot smaller than previous studies assumed. It turns out that the basic dusty grains from which planets will form are only 150 micrometers wide — that's roughly half the size of a grain of table salt and 10 times smaller than previous estimates.
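The size comparisons above reduce to simple arithmetic; a minimal illustrative sketch (the table-salt grain size used here is an approximate outside figure, not from the study):

```python
# Illustrative check of the HD 142527 dust-grain size comparisons.
grain_um = 150                        # measured grain size, micrometers
salt_grain_um = 300                   # typical table-salt grain (approximate)
previous_estimate_um = 10 * grain_um  # grains were "10 times smaller" than estimated

print(grain_um / salt_grain_um)   # 0.5 -> about half a salt grain
print(previous_estimate_um)       # 1500 micrometers, i.e. 1.5 millimeters
```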
This poses a problem for previous planetary formation models and also, possibly, points to a tantalizing planetary formation mechanism.
"In the previous studies, astronomers have estimated the size based on radio emissions assuming hypothetical spherical dust particles," said Kataoka in a statement.
|The dust particles that form the building blocks of planets are not simple spherical particles but complex structures composed of many smaller dust particles, according to new research|
Read more at Discovery News
Dec 5, 2016
|The figure shows the expected increase in the number of summertime storms that produce extreme precipitation at century's end compared to the period 2000 - 2013.|
The study, published in the journal Nature Climate Change, also finds that the intensity of individual extreme rainfall events could increase by as much as 70 percent in some areas. That would mean that a storm that drops about 2 inches of rainfall today would be likely to drop nearly 3.5 inches in the future.
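That conversion is straightforward to verify; a minimal illustrative sketch (the 70 percent figure is the study's upper bound for some areas):

```python
# Illustrative arithmetic for the quoted intensity increase.
current_storm_in = 2.0        # today's storm total, inches
max_increase = 0.70           # up to 70% more intense in some areas

future_storm_in = current_storm_in * (1 + max_increase)
print(f"{future_storm_in:.1f} inches")  # prints "3.4 inches", i.e. nearly 3.5
```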
"These are huge increases," said NCAR scientist Andreas Prein, lead author of the study. "Imagine the most intense thunderstorm you typically experience in a single season. Our study finds that, in the future, parts of the U.S. could expect to experience five of those storms in a season, each with an intensity as strong or stronger than current storms."
The study was funded by the National Science Foundation (NSF), NCAR's sponsor, and the Research Partnership to Secure Energy for America.
"Extreme precipitation events affect our infrastructure through flooding, landslides and debris flows," said Anjuli Bamzai, program director in NSF's Directorate for Geosciences, which funded the research. "We need to better understand how these extreme events are changing. By supporting this research, NSF is working to foster a safer environment for all of us."
A year of supercomputing time
An increase in extreme precipitation is one of the expected impacts of climate change because scientists know that as the atmosphere warms, it can hold more water, and a wetter atmosphere can produce heavier rain. In fact, an increase in precipitation intensity has already been measured across all regions of the U.S. However, climate models are generally not able to simulate these downpours because of their coarse resolution, which has made it difficult for researchers to assess future changes in storm frequency and intensity.
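The warming-to-moisture link mentioned above is commonly approximated by the Clausius-Clapeyron relation, under which the atmosphere's saturation water-vapor capacity rises by roughly 7 percent per degree Celsius; that rate is a textbook rule of thumb, not a number from this study. A rough sketch of what the study's assumed 5 degrees C of warming would imply:

```python
# Rough Clausius-Clapeyron estimate of added water-holding capacity.
# The ~7%-per-degree-C rate is a standard approximation (an assumption
# here, not a figure from the study itself).
RATE_PER_DEG_C = 0.07
warming_c = 5.0  # end-of-century warming assumed in the study

factor = (1 + RATE_PER_DEG_C) ** warming_c
print(f"~{(factor - 1) * 100:.0f}% more water vapor capacity")  # roughly 40%
```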
For the new study, the research team used a new dataset that was created when NCAR scientists and study co-authors Roy Rasmussen, Changhai Liu, and Kyoko Ikeda ran the NCAR-based Weather Research and Forecasting (WRF) model at a resolution of 4 kilometers, fine enough to simulate individual storms. The simulations, which required a year to run, were performed on the Yellowstone system at the NCAR-Wyoming Supercomputing Center.
Prein and his co-authors used the new dataset to investigate changes in downpours over North America in detail. The researchers looked at how storms that occurred between 2000 and 2013 might change if they occurred instead in a climate that was 5 degrees Celsius (9 degrees Fahrenheit) warmer -- the temperature increase expected by the end of the century if greenhouse gas emissions continue unabated.
Prein cautioned that this approach is a simplified way of comparing present and future climate. It doesn't reflect possible changes to storm tracks or weather systems associated with climate change. The advantage, however, is that scientists can more easily isolate the impact of additional heat and associated moisture on future storm formation.
"The ability to simulate realistic downpours is a quantum leap in climate modeling. This enables us to investigate changes in hourly rainfall extremes that are related to flash flooding for the very first time," Prein said. "To do this took a tremendous amount of computational resources."
Impacts vary across the U.S.
The study found that the number of summertime storms producing extreme precipitation is expected to increase across the entire country, though the amount varies by region. The Midwest, for example, sees an increase of zero to about 100 percent across swaths of Nebraska, the Dakotas, Minnesota, and Iowa. But the Gulf Coast, Alabama, Louisiana, Texas, New Mexico, Arizona, and Mexico all see increases ranging from 200 percent to more than 400 percent.
The study also found that the intensity of extreme rainfall events in the summer could increase across nearly the entire country, with some regions, including the Northeast and parts of the Southwest, seeing particularly large increases, in some cases of more than 70 percent.
A surprising result of the study is that extreme downpours will also increase in areas that are getting drier on average, especially in the Midwest. This is because moderate rainfall events that are the major source of moisture in this region during the summertime are expected to decrease significantly while extreme events increase in frequency and intensity. This shift from moderate to intense rainfall increases the potential for flash floods and mudslides, and can have negative impacts on agriculture.
The study also investigated how the environmental conditions that produce the most severe downpours might change in the future. In today's climate, the storms with the highest hourly rainfall intensities form when the daily average temperature is somewhere between 20 and 25 degrees C (68 to 77 degrees F) and with high atmospheric moisture. When the temperature gets too hot, rainstorms become weaker or don't occur at all because the increase in atmospheric moisture cannot keep pace with the increase in temperature. This relative drying of the air robs the atmosphere of one of the essential ingredients needed to form a storm.
Read more at Science Daily
The finding represents a major benchmark in our knowledge of how water conducts a positive electrical charge, which is a fundamental mechanism found in biology and chemistry. The researchers, led by Yale chemistry professor Mark Johnson, report their discovery in the Dec. 1 edition of the journal Science.
For more than 200 years, scientists have speculated about the specific forces at work when electricity passes through water -- a process known as the Grotthuss mechanism. It occurs in vision, for example, when light hits the eye's retina. It also turns up in the way fuel cells operate.
But the details have remained murky. In particular, scientists have sought an experimental way to follow the structural changes in the web of interconnected water molecules when an extra proton is transferred from one oxygen atom to another.
"The oxygen atoms don't need to move much at all," Johnson said. "It is kind of like Newton's cradle, the child's toy with a line of steel balls, each one suspended by a string. If you lift one ball so that it strikes the line, only the end ball moves away, leaving the others unperturbed."
Johnson's lab has spent years exploring the chemistry of water at the molecular level. Often, this is done with specially designed instruments built at Yale. Among the lab's many discoveries are innovative uses of electrospray ionization, which was developed by the late Yale Nobel laureate John Fenn.
Johnson and his team have developed ways to fast-freeze the chemical process so that transient structures can be isolated, revealing the contorted arrangements of atoms during a reaction. The practical uses for these methods range from the optimization of alternative energy technologies to the development of pharmaceuticals.
In the case of the proton relay race, previous attempts to capture the process hinged on using infrared color changes to see it. But the result always came out looking like a blurry photograph.
"In fact, it appeared that this blurring would be too severe to ever allow a compelling connection between color and structure," Johnson said.
The answer, he found, was to work with only a few molecules of "heavy water" -- water made of the deuterium isotope of hydrogen -- and chill them to almost absolute zero. Suddenly, the images of the proton in motion were dramatically sharper.
"In essence, we uncovered a kind of Rosetta Stone that reveals the structural information encoded in color," Johnson said. "We were able to reveal a sequence of concerted deformations, like the frames of a movie." Johnson's lab was assisted by the experimental group of Knut Asmis at the University of Leipzig and the theory groups of Ken Jordan of the University of Pittsburgh and Anne McCoy of the University of Washington.
One area where this information will be useful is in understanding chemical processes that occur at the surface of water, Johnson noted. There is active debate among scientists regarding whether the surface of water is more or less acidic than the bulk of water. At present, there is no way to measure the surface pH of water.
Read more at Science Daily
The findings, published in the journal Current Biology, not only help explain why people don't wag dog-like tails, but they also shed light on why we all have a tailbone and begin life with an actual tail that gradually disappears.
"Fleshy tails go all the way back to the earliest vertebrate ancestors and are found in very young embryos, so it would be very difficult to get rid of them entirely without causing other problems," author Lauren Sallan told Seeker. "As a result, both fishes and humans have had to stunt growth instead, leaving a buried, vestigial tail much like the legs of whales."
The origins of this mysterious vestigial tail go back to fish. For the study, Sallan, an assistant professor in the University of Pennsylvania's Department of Earth and Environmental Science, analyzed 350-million-year-old hatchlings of the fossil fish Aetheretmon. This jawed fish, a distant ancestor of today's terrestrial animals, had both a scaly, fleshy tail and a flexible tail fin, sitting one atop the other.
Sallan found these structures were entirely separate. By comparing the Aetheretmon hatchlings with those of living fish, she found that the two "tails" started out one atop the other and then grew on their own. This discovery overturns at least two centuries of scientific belief that the modern adult fish tail fin was simply added to the end of an ancestral tail shared with land animals.
|The early double-tailed fish Aetheretmon swimming alongside a single-tailed modern pufferfish and an early tetrapod in a 350-million-year-old river.|
Fish that evolved to become semi-aquatic and then land-dwelling animals lost the flexible back fin, but kept the fleshier one that over time became the familiar appendage we now see on dogs, cats, cows and many other animals. As dogs show, tails are useful for visual communication, slapping away flying insects and other functions.
Read more at Discovery News
The translucent, sea-dwelling invertebrate, called Bathochordaeus charon, was identified recently off the coast of Monterey, California, by scientists using a remotely operated vehicle (ROV). Though B. charon was first discovered a century ago, no one had managed to confirm its existence in all those years, Rob Sherlock, a scientist at the Monterey Bay Aquarium Research Institute who found the creature, told Live Science in an email.
B. charon belongs to a group of sea creatures known as larvaceans — normally teensy, millimeter-size creatures whose bodies resemble a tadpole's, with a large "head" (actually a trunk) and a tail, Sherlock said.
|A giant larvacean, Bathochordaeus charon, that has discarded its mucus feeding filters and is swimming freely in the open ocean.|
Larvaceans surround themselves with mucus "houses" that filter food particles from the water. If a passing squid or fish crashes through the house, or big particles clog the feeding tube, larvaceans simply move on and build another house. Without their houses, they cannot eat, Sherlock said.
The first report of B. charon's existence came in 1899, when professor Carl Chun of Leipzig University came across one in the south Atlantic Ocean while leading the Valdivia Expedition, a German mission aimed at exploring the deep sea. Chun believed the creature welled up from the deepest depths of the ocean, so he named the larvacean after Charon, who in Greek mythology ferries the souls of the dead across the river Styx, the researchers reported Aug. 16 in the journal Marine Biodiversity Records.
In the decades that followed, several other naturalists reported spotting giant larvaceans, though only a few were captured alive and described thoroughly. In 1936, for instance, British marine biologist Walter Garstang collected a set of giant larvaceans that differed from Chun's, and they were classified as a new species, Bathochordaeus stygius.
Because the two sets of specimens were similar and Chun's originals were lost to history, scientists eventually began to wonder whether Chun's originally described B. charon was actually the same species as B. stygius. One famous larvacean expert even suggested combining the two species names, Sherlock said. Part of the difficulty in capturing these creatures is that they don't fare well in the trawling nets typically used to collect specimens, Sherlock said.
Sherlock and his colleagues happened upon the new species when the team's ROV, called Doc Ricketts, was exploring the waters of Monterey Bay. As soon as they saw it, the crew carefully collected it in a sealed, thermally insulated container.
"Since the vehicle was recovered some tens of minutes later, the animal was alive, in fantastic shape, and we preserved it right away in order to send it to the Smithsonian," Sherlock said. "We had no idea, until we looked more closely at the specimen, that we had actually found B. charon, the species first described over a hundred years ago."
Genetics and analysis of physical features confirmed the find, Sherlock said. It was official: There really were two distinct species of giant larvacean — B. stygius and B. charon.
Read more at Discovery News
Dec 4, 2016
Flower forms in the primrose: Biologists unlock 51.7-million-year-old genetic secret to landmark Darwin theory
Darwin hypothesised that some plant species with two distinct forms of flower, where male and female reproductive organs were of differing lengths, had evolved that way to promote out-crossing by insect pollinators.
His ground-breaking insight into the significance of the two forms of flower, known as 'pins' and 'thrums', led him to coin the term 'heterostyly', and subsequent studies contributed to the foundation of modern genetic theory.
And now scientists at the University of East Anglia, working at the John Innes Centre, have identified exactly which part of these species' genetic code made them that way, through an event that occurred more than 51 million years ago.
Prof Philip Gilmartin from UEA's School of Biological Sciences said: "To identify the genes which control the biology noted by Darwin is an exciting moment. Many studies have been done over the past decades to explore the genetic basis of this phenomenon but now we have pinpointed the supergene directly responsible, the S locus."
Supergenes are clusters of closely associated genes which are always inherited together as a unit and allow complex biology to be controlled. Researchers worked with the Earlham Institute to map the plant's genes and sequence the Primula genome to find the specific gene cluster responsible for creating the differing flower morphs.
Prof Gilmartin said: "Not only did we identify the supergene but we found it is specific to just one of the flower forms, the thrum. This insight has profound implications for our understanding of a key evolutionary innovation of flowering plants.
"Understanding of the genetics which underpin flower development and reproduction of a species broadens our knowledge about the entire system of pollination, which underpins biodiversity and food security.
"With challenges such as climate change and its effects on plants, crops and their insect pollinators, it's even more important to understand pollination mechanisms and how species can and will react.
In their hunt for the genes controlling heterostyly, researchers also managed to date the original mutation to 51.7 million years ago.
Having found the S locus, they realised the gene was a close relative to another, identified six years ago as responsible for controlling the identity of petals on a Primula flower.
At some point this gene duplicated, inserted itself in the S locus, and mutated to control the position of the anther in the flower. Finding this duplicated gene allowed the team to date how long ago the mutation occurred for the first time.
Read more at Science Daily
|Supercomputer model of a low-mass supernova.|
The research is published in the most recent issue of leading scientific journal Nature Communications.
About 4.6 billion years ago, a cloud of gas and dust that eventually formed our solar system was disturbed.
The ensuing gravitational collapse formed the proto-Sun with a surrounding disc where the planets were born. A supernova -- a star exploding at the end of its life-cycle -- would have enough energy to induce the collapse of such a gas cloud.
"Before this model there was only inconclusive evidence to support this theory," said Professor Alexander Heger from the Monash School of Physics and Astronomy.
The research team, led by University of Minnesota School of Physics and Astronomy Professor Yong-Zhong Qian, decided to focus on short-lived radioactive nuclei only present in the early solar system.
Due to their short lifetimes, these nuclei could only have come from the triggering supernova. Their abundances in the early solar system have been inferred from their decay products in meteorites. As the debris from the formation of the solar system, meteorites are comparable to the leftover bricks and mortar in a construction site. They tell us what the solar system is made of and in particular, what short-lived nuclei the triggering supernova provided.
"Identifying these 'fingerprints' of the final supernova is what we needed to help us understand how the formation of the solar system was initiated," Professor Heger said.
"The fingerprints uniquely point to a low-mass supernova as the trigger.
"The findings in this paper have opened up a whole new direction of research focusing on low-mass supernovae," he said.
In addition to explaining the abundance of Beryllium-10, this low-mass supernova model would also explain the short-lived nuclei Calcium-41, Palladium-107, and a few others found in meteorites.
Professor Qian said the group would like to examine the remaining mysteries surrounding short-lived nuclei found in meteorites. The research is funded by the US Department of Energy Office of Nuclear Physics.
Professor Heger and a new Monash Future Fellow, Dr Bernhard Mueller, also study such supernovae using computational facilities at the Minnesota Supercomputing Institute.
From Science Daily