May 11, 2018

Ancient skull shows early 'baleen whale' had teeth

This image shows a life-like reconstruction of Llanocetus denticrenatus, an ancient 'baleen' whale.
Today's baleen whales (Mysticetes) support their massive bodies by filtering huge volumes of small prey from seawater, using the comb-like baleen in their mouths much like a sieve. But new evidence, reported in the journal Current Biology on May 10 and based on careful analysis of a 34-million-year-old whale skull from Antarctica -- the second-oldest "baleen" whale ever found -- suggests that early whales actually didn't have baleen at all. Their mouths were equipped instead with well-developed gums and teeth, which they apparently used to bite large prey.

"Llanocetus denticrenatus is an ancient relative of our modern gentle giants, like humpback and blue whales," says Felix Marx of the Royal Belgian Institute of Natural Sciences. "Unlike them, however, it had teeth, and probably was a formidable predator."

"Until recently, it was thought that filter feeding first emerged when whales still had teeth," adds R. Ewan Fordyce at the University of Otago in New Zealand. "Llanocetus shows that this was not the case."

Like modern whales, Llanocetus had distinctive grooves on the roof of its mouth, the researchers explain, which usually contain blood vessels that supply the baleen. In Llanocetus, however, those grooves cluster around tooth sockets, where baleen would have been useless and at risk of being crushed.

"Instead of a filter, it seems that Llanocetus simply had large gums and, judging from the way its teeth are worn, mainly fed by biting large prey," Marx says. "Even so, it was huge: at a total body length of around 8 meters, it rivals some living whales in size."

The findings suggest that large gums in whales like Llanocetus gradually became more complex over evolutionary time and, ultimately, gave rise to baleen. That transition probably happened only after the teeth had already been lost and whales had switched from biting to sucking in small prey -- as many whales and dolphins now do. Marx and Fordyce suggest that baleen most likely arose as a way to keep such small prey inside the mouth more effectively.

Soft tissues, including baleen, normally rot away, making it difficult to study their evolution. As a result, researchers must rely on indicators preserved on the bones, such as tell-tale grooves or lumps indicating the position of a muscle, or holes for the passage of particular blood vessels and nerves.

"Llanocetus presents a lucky combination, where the shape of the bones, small features suggesting the course of soft tissues, and tooth wear all combine to tell a clear story," Fordyce says. "Crucially, Llanocetus is also extremely old and lived at the very time when Mysticetes first appeared. As such, it provides a rare window into the earliest phase of their evolution."

In the new study, Fordyce and Marx found that the broad rostrum of Llanocetus had sharp, widely spaced teeth with marked tooth wear suggesting that they were used to bite and shear prey. As in living Mysticetes, the palate bears many grooves, which have commonly been interpreted as evidence for baleen. However, the researchers showed that those grooves instead converged on the bony tooth sockets, suggesting a peri-dental blood supply to well-developed gums, rather than racks of baleen.

The findings show that the evolution of filter feeding wasn't as straightforward as previously thought, the researchers say. They'd now like to sort out when filter feeding and baleen first evolved.

Read more at Science Daily

How turning down the heat makes a baby turtle male

Viewed under the microscope, this image shows the immature sex organs of a developing turtle. Cells that will eventually give rise to either eggs or sperm are stained red. The center panel shows that silencing a key gene called Kdm6b transforms a turtle gonad that otherwise would have become a future testis (left) into a future ovary instead (right).
Boy or girl? For those who want to influence their baby's sex, superstition and folk wisdom offer no shortage of advice whose effectiveness is questionable at best -- from what to eat to when to make love. But some animals have a technique backed by scientific proof: In turtles and other reptiles, whether an egg hatches male or female depends on the temperature of its nest.

The phenomenon was first discovered in reptiles more than 50 years ago, but until now the molecular details were a mystery.

In a study published May 11 in the journal Science, researchers say they have finally identified a critical part of the biological "thermometer" that turns a developing turtle male or female.

According to a team at Duke University and Zhejiang Wanli University in China, the explanation lies not in the DNA sequence itself -- the A's, T's, C's and G's -- but in a molecule that affects how genes are expressed without altering the underlying genetic code.

"Temperature-dependent sex determination has been a puzzle for a really long time," said Blanche Capel, a cell biology professor at Duke who led the research. "This is the first functional evidence of a molecular link that connects temperature with sexual development."

Unlike humans and most other mammals, the sex of many turtles, lizards and alligators isn't determined by the chromosomes they inherit, but by ambient temperatures during a sensitive stage of development.

For a common pond and pet turtle called the red-eared slider, for example, eggs incubated at 32 degrees Celsius (nearly 90 degrees Fahrenheit) produce all female hatchlings, while those kept at 26 degrees Celsius (79 degrees Fahrenheit) hatch as males.

In the study, the researchers show that cooler egg incubation temperatures turn up a key gene called Kdm6b in the turtle's immature sex organs, or gonads. This in turn acts as a biological "on" switch, activating other genes that allow testes to develop.

To home in on the critical Kdm6b gene, the researchers took a group of freshly laid turtle eggs, incubated them at either 26 or 32 degrees Celsius, and looked for differences in the way genes were turned on in the turtles' gonads early in development -- before their fate as ovaries or testes had been decided.

In a previous study, researchers examined all the gene readouts, or transcripts, produced in the turtles' gonads over this critical time window.

They found several genes that were turned up or down at one temperature but not the other. But one of the first genes to shift was one called Kdm6b, which became much more active at the cooler incubation temperatures that produce males, and was almost silent at warmer, female-producing temperatures.

In the new study, the team used a technique developed by collaborators at Zhejiang Wanli University to suppress the Kdm6b gene in turtle gonads and see how it affects their sexual development.

Silencing the Kdm6b gene, they found, causes a turtle embryo incubated at temperatures that would otherwise produce a male to develop ovaries instead of testes.

Further experiments showed that the protein encoded by the Kdm6b gene in turn interacts with a region of the genome called Dmrt1, which acts as a master switch to turn on testis development.

They found that Kdm6b activates the Dmrt1 master switch by modifying histones, the ball-like proteins that DNA is wrapped around inside the cell nucleus, like thread wound around a spool.

In many species, the tails of histone proteins are decorated with special chemical markers, or methyl tags, that keep genes along the DNA molecule inactive.

Kdm6b gene activity turns on the Dmrt1 master switch by removing these repressive tags and "loosening" the histone tails, which makes the DNA wound around the histones easier to access and read.
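
To make the chain of events concrete, the logic described above can be sketched in a few lines of Python. This is a deliberately simplified illustration, not the study's model: the 26 and 32 degree values come from the article, but the sharp threshold, the function name and the all-or-nothing switching are assumptions made here for clarity.

# Minimal sketch of the described regulatory chain in the red-eared slider.
# Simplification: real embryos respond to temperature gradually, and Kdm6b
# is one link in a larger regulatory network.
def sex_from_incubation(temp_c):
    kdm6b_active = temp_c <= 26.0  # cooler nests turn Kdm6b up
    # Active Kdm6b strips the repressive methyl tags from histones near
    # Dmrt1, "removing the brakes" from the male pathway.
    dmrt1_on = kdm6b_active
    return "male (testes)" if dmrt1_on else "female (ovaries)"

print(sex_from_incubation(26.0))  # male
print(sex_from_incubation(32.0))  # female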

"It's like removing the brakes off the male pathway," said co-author Ceri Weber, a PhD candidate in the Capel lab at Duke.

Researchers have found temperature-related shifts in Kdm6b gene activity in other species whose sex depends on incubation temperature, such as alligators and bearded dragons. "This suggests that similar molecular mechanisms may be at work in other reptiles too," Capel said.

Read more at Science Daily

Jurassic fossil tail tells of missing link in crocodile family tree

Pelvis fragments.
A 180-million-year-old fossil has shed light on how some ancient crocodiles evolved into dolphin-like animals.

The specimen -- featuring a large portion of backbone -- represents a missing link in the family tree of crocodiles, and was one of the largest coastal predators of the Jurassic Period, researchers say.

The newly discovered species was nearly five metres long and had large, pointed teeth for grasping prey. It also shared key body features seen in two distinct families of prehistoric crocodiles, the team says.

Some Jurassic-era crocodiles had bony armour on their backs and bellies, and limbs adapted for walking on land. Another group had tail fins and flippers but did not have armour.

The new species was heavily armoured but also had a tail fin, suggesting it is a missing link between the two groups, researchers say.

It has been named Magyarosuchus fitosi in honour of the amateur collector who discovered it, Attila Fitos.

The fossil -- unearthed on a mountain range in north-west Hungary in 1996 and stored in a museum in Budapest -- was examined by a team of palaeontologists, including a researcher from the University of Edinburgh.

It was identified as a new species based on the discovery of an odd-looking vertebra that formed part of its tail fin.

The study, published in the journal PeerJ, also involved researchers in Hungary and Germany. It was supported by the Leverhulme Trust and the SYNTHESYS project, part of the European Commission's Seventh Framework Programme.

Dr Mark Young, of the University of Edinburgh's School of GeoSciences, who was involved in the study, said: "This fossil provides a unique insight into how crocodiles began evolving into dolphin and killer whale-like forms more than 180 million years ago. The presence of both bony armour and a tail fin highlights the remarkable diversity of Jurassic-era crocodiles."

From Science Daily

Neuroscientists find first evidence animals can mentally replay past events

To test the rats' memory, IU researchers placed the animals inside an "arena" with different odors. The rats were rewarded when they identified the second-to-last or fourth-to-last odor from a list of unpredictable length.
Neuroscientists at Indiana University have reported the first evidence that non-human animals can mentally replay past events from memory. The discovery could help advance the development of new drugs to treat Alzheimer's disease.

The study, led by IU professor Jonathon Crystal, appears today in the journal Current Biology.

"The reason we're interested in animal memory isn't only to understand animals, but rather to develop new models of memory that match up with the types of memory impaired in human diseases such as Alzheimer's disease," said Crystal, a professor in the IU Bloomington College of Arts and Sciences' Department of Psychological and Brain Sciences and director of the IU Bloomington Program in Neuroscience.

Crystal said that under the current paradigm, most preclinical studies on potential new Alzheimer's drugs examine how these compounds affect spatial memory, one of the easiest types of memory to assess in animals. But spatial memory is not the type of memory whose loss causes the most debilitating effects of Alzheimer's disease.

"If your grandmother is suffering from Alzheimer's, one of the most heartbreaking aspects of the disease is that she can't remember what you told her about what's happening in your life the last time you saw her," said Danielle Panoz-Brown, an IU Ph.D. student who is the first author on the study. "We're interested in episodic memory -- and episodic memory replay -- because it declines in Alzheimer's disease, and in aging in general."

Episodic memory is the ability to remember specific events. For example, if a person loses their car keys, they might try to recall every single step -- or "episode" -- in their trip from the car to their current location. The ability to replay these events in order is known as "episodic memory replay." People wouldn't be able to make sense of most scenarios if they couldn't remember the order in which they occurred, Crystal said.

To assess animals' ability to replay past events from memory, Crystal's lab spent nearly a year working with 13 rats, which they trained to memorize a list of up to 12 different odors. The rats were placed inside an "arena" with different odors and rewarded when they identified the second-to-last odor or fourth-to-last odor in the list.

The team changed the number of odors in the list before each test to confirm that the rats identified each odor by its position in the list, not by scent alone -- evidence that the animals were relying on their ability to recall the whole list in order. Arenas with different patterns were used to communicate to the rats which of the two options was sought.
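
The reward rule itself can be stated precisely, which is what makes the task a clean test of ordered recall. Purely as an illustration (the function name and odors below are invented, not the lab's materials), the rule amounts to indexing a variable-length list from its end:

# Sketch of the task's reward rule: the correct answer is the odor a fixed
# distance from the end of a list whose length changes from trial to trial,
# so the answer cannot be tied to any particular scent.
def rewarded_odor(odors, position_from_end):
    return odors[-position_from_end]

trial = ["mint", "clove", "banana", "cedar", "rose", "lemon"]
print(rewarded_odor(trial, 2))  # "rose"   (second-to-last)
print(rewarded_odor(trial, 4))  # "banana" (fourth-to-last)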

After their training, Crystal said, the animals successfully completed their task about 87 percent of the time across all trials. The results are strong evidence the animals were employing episodic memory replay.

Additional experiments confirmed the rats' memories were long-lasting and resistant to "interference" from other memories, both hallmarks of episodic memory. They also ran tests that temporarily suppressed activity in the hippocampus -- the site of episodic memory -- to confirm the rats were using this part of their brain to perform their tasks.

Crystal said the need to find reliable ways to test episodic memory replay in rats is urgent since new genetic tools are enabling scientists to create rats with neurological conditions similar to Alzheimer's disease. Until recently, only mice were available with the genetic modifications needed to study the effect of new drugs on these symptoms.

Read more at Science Daily

May 10, 2018

Rapid evolution fails to save butterflies from extinction in face of human-induced change

This is an Edith's checkerspot butterfly (Euphydryas editha) on a narrow-leaved plantain (Plantago lanceolata).
When wild species evolve in ways that adapt them to human management practices, a rapid change in those practices can cause localised extinctions. In a new study published in Nature, Professors Michael C. Singer and Camille Parmesan have used more than 30 years of research to fully document an example of this process.

A large, isolated population of a North American butterfly evolved complete dependence on an introduced European weed to the point where the continued existence of the butterfly depended on the plant's availability. The insects then became locally extinct when humans effectively eliminated that availability, confirming a prediction made by the same authors in a 1993 Nature paper.

Thus the advent of cattle ranching more than 100 years ago set an eco-evolutionary trap that the insects obligingly fell into, and the trap was sprung when humans suddenly removed the cattle, withdrawing their 'gift', and driving the butterflies to extinction.

European conservation biologists have long believed this to be the process underlying many local extinctions across Europe, and this study provides the first hard evidence of the process in action in real time. It also foreshadows an increasing importance of maintaining historical land use practices, including cattle ranching, as conservation measures in North America.

The authors, affiliated with the University of Plymouth, the University of Texas at Austin and CNRS Moulis, have spent more than three decades studying changes in the diet of Edith's checkerspot (Euphydryas editha) in a spring-fed meadow surrounded by semi-desert sagebrush and pine forest on a family-run ranch in Nevada.

In particular, the authors assessed the impact of narrow-leaved plantain (Plantago lanceolata), which was introduced to the USA in hay brought from Europe and flourished under cattle-grazing, probably arriving in Nevada more than 100 years ago.

As soon as the butterflies encountered the plantain, their caterpillars survived better on it than on their traditional host, Blue-Eyed Mary (Collinsia parviflora), causing the adults to evolve a preference for laying eggs on the plantain. By the mid-2000s, they were 100% reliant on the plantain and the Collinsia had been abandoned.

However, within three years of the ranch's cattle being removed due to financial pressures, the butterflies became locally extinct as the grasses around their favoured new host were no longer grazed, and the plantains became embedded in those grasses, cooling the micro-environment. The Collinsia was unaffected by removal of cattle, so if the butterflies had not evolved so rapidly in response to the introduction of the plantain, they would most likely have survived.

Around five years after the extinction, Edith's checkerspots recolonized the meadow. Since they were all found feeding on Collinsia, the original host plant, scientists believe these colonists to be a new population, and that the lineage which had called the ranch home for several decades no longer exists.

They say the results are similar to those seen in British species such as the large blue butterfly, which went extinct across southern England following a reduction of grazing by both rabbits and sheep. Once this process was understood, the butterflies could be successfully re-introduced.

Professor Singer, who has been studying the diet of Edith's checkerspot for more than 50 years and led the current study, said: "This is a clear example of how humans are able to change habitats faster than even rapidly-evolving species can change their behaviour. This cannot be an isolated phenomenon, so unless we become aware of the potential consequences of such actions we will continue to inadvertently cause population extinctions of native species, without recognizing what we are doing. Species-level extinctions are possible when human activities are synchronized across wide areas."

Professor Parmesan, a lead contributor to the Intergovernmental Panel on Climate Change which was awarded the Nobel Peace Prize in 2007, said the study had potentially wider implications beyond the scope of changes to farming practices.

Read more at Science Daily

500-year-old Leaning Tower of Pisa mystery unveiled by engineers

Leaning tower of Pisa, Italy.
Why has the Leaning Tower of Pisa survived the strong earthquakes that have hit the region since the Middle Ages? This long-standing question has been investigated by a research group of 16 engineers, including a leading expert in earthquake engineering and soil-structure interaction from the University of Bristol.

Professor George Mylonakis, from Bristol's Department of Civil Engineering, was invited to join a 16-member research team, led by Professor Camillo Nuti at Roma Tre University, to explore this Leaning Tower of Pisa mystery that has puzzled engineers for many years.

Despite leaning precariously at a five-degree angle, which produces an offset of more than five metres at the top, the 58-metre-tall Tower has managed to survive, undamaged, at least four strong earthquakes that have hit the region since 1280.

Given the vulnerability of the structure, which barely manages to stand vertically, it would be expected to sustain serious damage or even collapse in moderate seismic activity. Surprisingly, this has not happened, a fact that has long mystified engineers. After studying the available seismological, geotechnical and structural information, the research team concluded that the survival of the Tower can be attributed to a phenomenon known as dynamic soil-structure interaction (DSSI).

The considerable height and stiffness of the Tower, combined with the softness of the foundation soil, cause the vibrational characteristics of the structure to be modified substantially, in such a way that the Tower does not resonate with earthquake ground motion. This has been the key to its survival. The unique combination of these characteristics gives the Tower of Pisa the world record in DSSI effects.
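
The mechanism can be summarised with a standard textbook expression for a simple oscillator on flexible soil; this formula comes from general soil-structure interaction theory, not from the Pisa study itself. Writing T for the fixed-base vibration period, k for the structural stiffness, h for the effective height, and k_x and k_theta for the soil's translational and rocking stiffnesses, the flexible-base period is

    \tilde{T} = T \sqrt{1 + \frac{k}{k_x} + \frac{k h^2}{k_\theta}}

For a tall, stiff structure on soft soil, the two soil terms are large, so the effective period grows well beyond its fixed-base value and moves away from the frequencies that dominate earthquake shaking. That period shift is the essence of the DSSI explanation offered for the Tower.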

Professor Mylonakis, Chair in Geotechnics and Soil-Structure Interaction, and Head of Earthquake and Geotechnical Engineering Research Group in the Department of Civil Engineering at the University of Bristol, said: "Ironically, the very same soil that caused the leaning instability and brought the Tower to the verge of collapse, can be credited for helping it survive these seismic events."

Results from the study have been presented at international workshops and will be formally announced at the 16th European Conference in Earthquake Engineering, taking place in Thessaloniki, Greece next month [18 to 21 June 2018].

From Science Daily

New research shows how Indo-European languages spread across Asia

A herd of Przewalski's horses grazing on pasture.
A new study has discovered that horses were first domesticated by descendants of hunter-gatherer groups in Kazakhstan who left little direct trace in the ancestry of modern populations. The research sheds new light on the long-standing "steppe theory" on the origin and movement of Indo-European languages made possible by the domestication of the horse.

The domestication of the horse was a milestone in human history that allowed people, their languages, and their ideas to move further and faster than before, leading both to widespread farming and to horse-powered warfare.

Scholars from around the world have collaborated on a new inter-disciplinary research project, which was published in the journal Science on 9 May 2018. The researchers analysed ancient and modern DNA samples from humans and compared the results -- the 74 ancient whole-genome sequences studied by the group were up to 11,000 years old and came from inner Asia and Turkey.

Professor Eske Willerslev, who holds positions both at St John's College, University of Cambridge, and the University of Copenhagen, jointly led the study which looked at archaeological findings, history, and linguistics.

Much of the study builds on questions raised by scholars of Indo-European studies at the Institute of Nordic Studies and Linguistics at University of Copenhagen. A number of conflicting theories have been presented about who first domesticated the horse, with previous studies pointing to people of the pastoralist Yamnaya culture, a dominant herding group who lived in Eastern Europe and Western Asia.

Dr. Guus Kroonen, historical linguist at University of Copenhagen, explains:

"The successful spread of the Indo-European languages across Eurasia has puzzled researchers for a century. It was thought that speakers of this language family played a key role in the domestication of the horse, and that this, in combination with the development of wheeled vehicles, allowed them to spread across Eurasia from the Yamnaya culture."

However, as this study shows, domesticated horses were already being used by the Botai people 5,500 years ago, much further east in Central Asia, completely independently of the Yamnaya pastoralists. A further twist to the story is that the descendants of the Botai were later pushed out of the central steppe by migrations coming from the west. Their horses were replaced too, indicating that horses were domesticated separately in other regions as well.

No link between Botai and Yamnaya cultures

The study does not find a genetic link between the people associated with the Yamnaya and Botai archaeological cultures, which is critical to understanding the eastward movement of the Yamnaya. Apparently, their eastward expansion bypassed the Botai completely, moving 3,000 kilometres across the steppe to the Altai Mountains in Central and East Asia.

Professor Alan Outram from the Department of Archaeology at the University of Exeter and one of the paper's authors, states:

"We now know that the people who first domesticated the horse in Central Asia were the descendants of ice age hunters, who went on to become the earliest pastoralists in the region. Despite their local innovations, these peoples were overrun and replaced by European steppe pastoralists in the middle and later Bronze Age, and their horses were replaced too."

Languages spread through exchanges between several cultures

The authors also demonstrate that the oldest known Indo-European language, Hittite, did not result from a massive population migration from the Eurasian Steppe, as previously claimed.

In contrast to a series of recent studies on population movement in Europe during the Bronze Age, the new results from Asia suggest that the spread of populations and languages across the region is better understood as the result of groups of people mixing together.

Gojko Barjamovic, Senior Lecturer on Assyriology at Harvard University, explains:

"In Anatolia, and parts of Central Asia, which held densely settled complex urban societies, the history of language spread and genetic ancestry is better described in terms of contact and absorption than by simply a movement of population."

He adds:

"The Indo-European languages are usually said to emerge in Anatolia in the 2nd millennium BCE. However, we use evidence from the palatial archives of the ancient city of Ebla in Syria to argue that Indo-European was already spoken in modern-day Turkey in the 25th century BCE. This means that the speakers of these language must have arrived there prior to any Yamnaya expansions."

The study also shows that the spread of the Indo-Iranian languages to South Asia, with Hindi, Urdu and Persian as major modern offshoots, could not have resulted from the Yamnaya expansions. Rather, the Indo-Iranian languages spread with a later push of pastoralist groups from the South Ural Mountains during the Middle to Late Bronze Age.

Prior to entering South Asia, these groups, thought to have spoken an Indo-Iranian language, were impacted by groups with an ancestry typical of more western Eurasian populations. This suggests that the Indo-Iranian speakers did not split off from the Yamnaya population directly, but were more closely related to the Indo-European speakers that lived in Eastern Europe.

Unique collaboration between the humanities and the natural sciences

In this study, geneticists, historians, archaeologists and linguists find common ground -- pointing to increased interaction between the steppe and the Indus Valley during the Late Bronze Age as the most plausible time of entry of Indo-European languages in South Asia. Several authors of the paper had radically conflicting views before the final interpretation was achieved.

Lead author on the article, Peter de Barros Damgaard, a geneticist working at the University of Copenhagen, comments:

"The project has been an extremely enriching and exciting process. We were able to direct many very different academic fields towards a single coherent approach. By asking the right questions, and keeping limitations of the data in mind, contextualizing, nuancing, and keeping dialogues open between scholars of radically different backgrounds and approaches, we have carved out a path for a new field of research. We have already seen too many papers come out in which models produced by geneticists working on their own have been are accepted without vital input from other fields, and, at the other extreme, seen archaeologists opposing new studies built on archaeogenetic data, due to a lack of transparency between the fields."

"Data on ancient DNA is astonishing for its ability to provide a fine-grained image of early human mobility, but it does stand on the shoulders of decades of work by scholars in other fields, from the time of excavation of human skeletons to interpreting the cultural, linguistic origins of the samples. This is how cold statistics are turned into history."

Read more at Science Daily

Stone Age hepatitis B virus decoded

Skeletal remains of HBV positive individual from the Stone Age site of Karsdorf, Germany. The individual was a male with an age at death of around 25-30 years.
An international team of scientists led by researchers at the Max Planck Institute for the Science of Human History and the University of Kiel has successfully reconstructed genomes from Stone Age and Medieval European strains of the hepatitis B virus. This unprecedented recovery of ancient virus DNA indicates that hepatitis B was circulating in Europe at least 7000 years ago. While the ancient virus is similar to its modern counterparts, the strains represent a distinct lineage that has likely gone extinct and is most closely related to chimpanzee and gorilla viruses.

The hepatitis B virus (HBV) is one of the most widespread human pathogens known today, affecting over 250 million people worldwide. However, its origin and evolutionary history remain unclear. Studying the evolution and history of the virus has to date been especially difficult, because until now viral DNA had not been successfully recovered from prehistoric samples. In the present study, which has been accepted for publication in the journal eLife and is due to be published on May 10, 2018, an international team of researchers led by the Max Planck Institute for the Science of Human History and the Institute of Clinical Molecular Biology at Kiel University, not only recovered ancient viral DNA from skeletons but also reconstructed the genomes of three strains of HBV.

The ancient history of hepatitis B

For this study, the researchers analyzed samples from the teeth of 53 skeletons excavated from Neolithic and medieval sites in Germany. The remains dated from around 5000 BC to 1200 AD. The researchers screened all samples for viral pathogens and detected ancient HBV in three of the individuals. Full HBV genomes were recovered from these samples, two of which were from the Neolithic period, dating to about 7000 and 5000 years ago, and one from the medieval period. The Neolithic genomes are by far the oldest virus genomes reconstructed to date.

Interestingly, the ancient virus genomes appear to represent distinct lineages that have no close relatives today and possibly went extinct. The two Neolithic genomes, although recovered from individuals that lived 2000 years apart, were relatively similar to each other in comparison with modern strains, and were in fact more closely related to modern strains of HBV found in chimpanzees and gorillas. In contrast, the medieval HBV genome is more similar to modern strains, but still represents a separate lineage. This is the case even when it is compared to two previously published HBV genomes recovered from mummies dating to the 16th century. The HBV strains found in these mummies are closely related to modern strains, suggesting a surprising lack of change in the virus over the last 500 years. These findings point to a complicated history for the virus, which may have involved multiple cross-species transmission events.

Long and complicated evolution of one of today's most common viruses

"Taken together, our results demonstrate that HBV already existed in Europeans 7000 years ago and that its genomic structure closely resembled that of modern hepatitis B viruses, despite the differences observed," explains first author Ben Krause-Kyora, of the Max Planck Institute for the Science of Human History and Kiel University. "More ancient precursors, intermediates and modern strains of both human and non-human primate HBV strains need to be sequenced to disentangle the complex evolution of this virus," he adds.

Read more at Science Daily

New magnetic process in turbulent space

In this visualization, as the supersonic solar wind (yellow haze) flows around the Earth's magnetic field (blue wavy lines), it forms a highly turbulent boundary layer called the 'magnetosheath' (yellow swirling area). A new research paper describes observations of small-scale magnetic reconnection within the magnetosheath, revealing important clues about heating in the sun's outer layers and elsewhere in the universe.
Though close to home, the space immediately around Earth is full of hidden secrets and invisible processes. In a new discovery reported in the journal Nature, scientists working with NASA's Magnetospheric Multiscale spacecraft -- MMS -- have uncovered a new type of magnetic event in our near-Earth environment by using an innovative technique to squeeze extra information out of the data.

Magnetic reconnection is one of the most important processes in the space around Earth, which is filled with charged particles known as plasma. This fundamental process dissipates magnetic energy and propels charged particles, both of which contribute to a dynamic space weather system that scientists want to better understand, and even someday predict, as we do terrestrial weather. Reconnection occurs when crossed magnetic field lines snap, explosively flinging away nearby particles at high speeds. The new discovery found reconnection where it has never been seen before -- in turbulent plasma.

"In the plasma universe, there are two important phenomena: magnetic reconnection and turbulence," said Tai Phan, a senior fellow at the University of California, Berkeley, and lead author on the paper. "This discovery bridges these two processes."

Magnetic reconnection has been observed innumerable times in the magnetosphere -- the magnetic environment around Earth -- but usually under calm conditions. The new event occurred in a region called the magnetosheath, just outside the outer boundary of the magnetosphere, where the solar wind is extremely turbulent. Previously, scientists didn't know if reconnection even could occur there, as the plasma is highly chaotic in that region. MMS found it does, but on scales much smaller than previous spacecraft could probe.

MMS uses four identical spacecraft flying in a pyramid formation to study magnetic reconnection around Earth in three dimensions. Because the spacecraft fly incredibly close together -- at an average separation of just four-and-a-half miles, they hold the record for closest separation of any multi-spacecraft formation -- they are able to observe phenomena no one has seen before. Furthermore, MMS's instruments are designed to capture data at speeds a hundred times faster than previous missions.

Even though the instruments aboard MMS are incredibly fast, they are still too slow to capture turbulent reconnection in action, which requires observing narrow layers of fast-moving particles hurled by the recoiling field lines. Compared to standard reconnection, in which broad jets of ions stream out from the site of reconnection, turbulent reconnection ejects narrow jets of electrons only a couple of miles wide.

"The smoking gun evidence is to measure oppositely directed electron jets at the same time, and the four MMS spacecraft were lucky to corner the reconnection site and detect both jets," said Jonathan Eastwood, a lecturer at Imperial College, London, and a co-author of the paper.

Crucially, MMS scientists were able to leverage the design of one instrument, the Fast Plasma Investigation, to create a technique to interpolate the data -- essentially allowing them to read between the lines and gather extra data points -- in order to resolve the jets.

"The key event of the paper happens in only 45 milliseconds. This would be one data point with the basic data," said Amy Rager, a graduate student at NASA's Goddard Space Flight Center in Greenbelt, Maryland, and the scientist who developed the technique. "But instead we can get six to seven data points in that region with this method, allowing us to understand what is happening."

With the new method, the MMS scientists are hopeful they can comb back through existing datasets to find more of these events, and potentially other unexpected discoveries as well.

Read more at Science Daily

May 9, 2018

'Snowball Earth' resulted from plate tectonics

During its 'snowball' phase, the Earth may have resembled Enceladus, a moon of Saturn that is covered in snow and ice.
About 700 million years ago, the Earth experienced unusual episodes of global cooling that geologists refer to as "Snowball Earth."

Several theories have been proposed to explain what triggered this dramatic cool down, which occurred during a geological era called the Neoproterozoic. Now two geologists at The University of Texas at Dallas and UT Austin suggest that those major climate changes can be linked to one thing: the advent of plate tectonics.

The research was published online in December 2017 and in the April print edition of the journal Terra Nova.

Plate tectonics is a theory, formulated in the late 1960s, which states that the Earth's crust and upper mantle -- a layer called the lithosphere -- are broken into moving pieces, or plates. These plates move very slowly -- about as fast as your fingernails and hair grow -- causing earthquakes, mountain ranges and volcanoes.

"Earth is the only body in our solar system known to currently have plate tectonics, where the lithosphere is fragmented like puzzle pieces that move independently," said Dr. Robert Stern, professor of geosciences in UT Dallas' School of Natural Sciences and Mathematics, and co-author of the study, along with Dr. Nathaniel Miller, a research scientist in UT Austin's Jackson School of Geosciences who earned his PhD in geosciences from UT Dallas in 1995.

"It is much more common for planets to have an outer solid shell that is not fragmented, which is known as 'single lid tectonics'," Stern said.

Geoscientists disagree about when the Earth changed from single lid to plate tectonics, with the lithosphere fragmenting from one plate into two, and so on, up to the present global system of seven major and many smaller plates. But Stern highlights geological and theoretical evidence that plate tectonics began between 800 million and 600 million years ago, and has published several articles arguing for this timing.

In the new study, Stern and Miller provide new insights by suggesting that the onset of plate tectonics likely initiated the changes on Earth's surface that led to Snowball Earth. They argue that plate tectonics is the event that can explain 22 theories that other scientists have advanced as triggers of the Neoproterozoic Snowball Earth.

"We went through the literature and examined all the mechanisms that have been put forward for Snowball Earth," Stern said. "The start of plate tectonics could be responsible for each of these explanations."

The onset of plate tectonics should have disturbed the oceans and the atmosphere by redistributing continents, increasing explosive arc volcanism and stimulating mantle plumes, Stern said.

"The fact that strong climate and oceanographic effects are observed in the Neoproterozoic time is a powerful supporting argument that this is indeed the time of the transition from single lid to plate tectonics," Stern said. "It's an argument that, to our knowledge, hasn't yet been considered.

Read more at Science Daily

Earth's orbital changes have influenced climate, life forms for at least 215 million years

Rutgers University-New Brunswick Professor Dennis Kent with part of a 1,700-foot-long rock core through the Chinle Formation in Petrified Forest National Park in Arizona. The background includes boxed archives of cores from the Newark basin that were compared with the Arizona core.
Every 405,000 years, gravitational tugs from Jupiter and Venus slightly elongate Earth's orbit, an amazingly consistent pattern that has influenced our planet's climate for at least 215 million years and allows scientists to more precisely date geological events like the spread of dinosaurs, according to a Rutgers-led study.

The findings are published online today in the Proceedings of the National Academy of Sciences.

"It's an astonishing result because this long cycle, which had been predicted from planetary motions through about 50 million years ago, has been confirmed through at least 215 million years ago," said lead author Dennis V. Kent, a Board of Governors professor in the Department of Earth and Planetary Sciences at Rutgers University-New Brunswick. "Scientists can now link changes in the climate, environment, dinosaurs, mammals and fossils around the world to this 405,000-year cycle in a very precise way."

The scientists linked reversals in the Earth's magnetic field -- when compasses point south instead of north and vice versa -- to sediments with and without zircons (minerals with uranium that allow radioactive dating) as well as to climate cycles.

"The climate cycles are directly related to how the Earth orbits the sun and slight variations in sunlight reaching Earth lead to climate and ecological changes," said Kent, who studies Earth's magnetic field. "The Earth's orbit changes from close to perfectly circular to about 5 percent elongated especially every 405,000 years."

The scientists studied the long-term record of reversals in the Earth's magnetic field in sediments in the Newark basin, a prehistoric lake that spanned most of New Jersey, and in sediments with volcanic detritus including zircons in the Chinle Formation in Petrified Forest National Park in Arizona. They collected a core of rock from the Triassic Period, some 202 million to 253 million years ago. The core is 2.5 inches in diameter and about 1,700 feet long, Kent said.

The results showed that the 405,000-year cycle is the most regular astronomical pattern linked to the Earth's annual turn around the sun, he said.

Prior to this study, accurate dates for the magnetic field reversals were unavailable for 30 million years of the Late Triassic. That's when dinosaurs and mammals appeared and the Pangea supercontinent broke up. The break-up led to the formation of the Atlantic Ocean, with the sea floor spreading as the continents drifted apart, and to a mass extinction event that affected dinosaurs at the end of that period, Kent said.

Read more at Science Daily

78,000-year cave record from East Africa shows early cultural innovations

(From left to right) Worked red ochre; a bead made of a sea shell; ostrich eggshell beads; a bone tool; and a close-up of the bone tool showing traces of scraping.
An international, interdisciplinary group of scholars working along the East African coast has discovered a major cave site which records substantial activities of hunter-gatherers and, later, Iron Age communities. Detailed environmental research has demonstrated that the human occupations occurred in a persistent tropical forest-grassland ecotone, adding new information about the habitats exploited by our species, and indicating that populations sought refuge in a relatively stable environment. Prior to this cave excavation, little information was available about the last 78,000 years from coastal East Africa, with the majority of archaeological research focused on the Rift Valley and in South Africa.

Humans lived in the humid coastal forest

A large-scale interdisciplinary study, including scientific analyses of archaeological plants, animals and shells from the cave indicates a broad perseverance of forest and grassland environments. As the cave environment underwent little variation over time, humans found the site attractive for occupation, even during periods of time when other parts of Africa would have been inhospitable. This suggests that humans exploited the cave environment and landscape over the long term, relying on plant and animal resources when the wider surrounding landscapes dried. The ecological setting of Panga ya Saidi is consistent with increasing evidence that Homo sapiens could adapt to a variety of environments as they moved across Africa and Eurasia, suggesting that flexibility may be the hallmark of our species. Homo sapiens developed a range of survival strategies to live in diverse habitats, including tropical forests, arid zones, coasts and the cold environments found at higher latitudes.

Technological innovations occur at 67,000 years ago

Carefully prepared stone toolkits of the Middle Stone Age occur in deposits dating back to 78,000 years ago, but a distinct shift in technology to the Later Stone Age is shown by the recovery of small artefacts beginning at 67,000 years ago. The miniaturization of stone tools may reflect changes in hunting practices and behaviors. The Panga ya Saidi sequence after 67,000 years ago, however, has a mix of technologies, and no radical break of behavior can be detected at any time, arguing against the cognitive or cultural 'revolutions' theorized by some archaeologists. Moreover, no notable break in human occupation occurs during the Toba volcanic super-eruption of 74,000 years ago, supporting views that the so-called 'volcanic winter' did not lead to the near-extinction of human populations, though hints of increased occupation intensity from 60,000 years ago suggest that populations were increasing in size.

Earliest symbolic and cultural items found at Panga ya Saidi cave

The deep archaeological sequence of Panga ya Saidi cave has produced a remarkable new cultural record indicative of cultural complexity over the long term. Among the recovered items are worked and incised bones, ostrich eggshell beads, marine shell beads, and worked ochre. Panga ya Saidi has produced the oldest bead in Kenya, dating to ~65,000 years ago. At about 33,000 years ago, beads were most commonly made of shells acquired from the coast. While this demonstrates contact with the coast, there is no evidence for the regular exploitation of marine resources for subsistence purposes. Ostrich eggshell beads become more common after 25,000 years ago, and after 10,000 years ago, there is again a shift to coastal shell use. In the layers dating to between ~48,000 to 25,000 years ago, carved bone, carved tusk, a decorated bone tube, a small bone point, and modified pieces of ochre were found. Though indicative of behavioral complexity and symbolism, their intermittent appearance in the cave sequence argues against a model for a behavioral or cognitive revolution at any specific point in time.

Project Principal Investigator and Director of the Department of Archaeology at the Max Planck Institute for the Science of Human History, Dr. Nicole Boivin, states, "The East African coastal hinterland and its forests have long been considered marginal to human evolution, so the discovery of Panga ya Saidi cave will certainly change archaeologists' views and perceptions."

Group Leader of the Stable Isotopes Lab Dr. Patrick Roberts adds, "Occupation in a tropical forest-grassland environment adds to our knowledge that our species lived in a variety of habitats in Africa."

Read more at Science Daily

Exiled asteroid discovered in outer reaches of solar system

This artist's impression shows the exiled asteroid 2004 EW95, the first carbon-rich asteroid confirmed to exist in the Kuiper Belt and a relic of the primordial solar system. This curious object likely formed in the asteroid belt between Mars and Jupiter and must have been transported billions of kilometers from its origin to its current home in the Kuiper Belt.
An international team of astronomers has used ESO telescopes to investigate a relic of the primordial Solar System. The team found that the unusual Kuiper Belt Object 2004 EW95 is a carbon-rich asteroid, the first of its kind to be confirmed in the cold outer reaches of the Solar System. This curious object likely formed in the asteroid belt between Mars and Jupiter and has been flung billions of kilometres from its origin to its current home in the Kuiper Belt.

The early days of our Solar System were a tempestuous time. Theoretical models of this period predict that after the gas giants formed they rampaged through the Solar System, ejecting small rocky bodies from the inner Solar System to far-flung orbits at great distances from the Sun. In particular, these models suggest that the Kuiper Belt -- a cold region beyond the orbit of Neptune -- should contain a small fraction of rocky bodies from the inner Solar System, such as carbon-rich asteroids, referred to as carbonaceous asteroids.

Now, a recent paper has presented evidence for the first reliably-observed carbonaceous asteroid in the Kuiper Belt, providing strong support for these theoretical models of our Solar System's troubled youth. After painstaking measurements from multiple instruments at ESO's Very Large Telescope (VLT), a small team of astronomers led by Tom Seccull of Queen's University Belfast in the UK was able to measure the composition of the anomalous Kuiper Belt Object 2004 EW95, and thus determine that it is a carbonaceous asteroid. This suggests that it originally formed in the inner Solar System and must have since migrated outwards.

The peculiar nature of 2004 EW95 first came to light during routine observations with the NASA/ESA Hubble Space Telescope by Wesley Fraser, an astronomer from Queen's University Belfast who was also a member of the team behind this discovery. The asteroid's reflectance spectrum -- the specific pattern of wavelengths of light reflected from an object -- was different to that of similar small Kuiper Belt Objects (KBOs), which typically have uninteresting, featureless spectra that reveal little information about their composition.

"The reflectance spectrum of 2004 EW95 was clearly distinct from the other observed outer Solar System objects," explains lead author Seccull. "It looked enough of a weirdo for us to take a closer look."

The team observed 2004 EW95 with the X-Shooter and FORS2 instruments on the VLT. The sensitivity of these spectrographs allowed the team to obtain more detailed measurements of the pattern of light reflected from the asteroid and thus infer its composition.

However, even with the impressive light-collecting power of the VLT, 2004 EW95 was still difficult to observe. Though the object is 300 kilometres across, it is currently a colossal four billion kilometres from Earth, making gathering data from its dark, carbon-rich surface a demanding scientific challenge.

"It's like observing a giant mountain of coal against the pitch-black canvas of the night sky," says co-author Thomas Puzia from the Pontificia Universidad Católica de Chile.

"Not only is 2004 EW95 moving, it's also very faint," adds Seccull. "We had to use a pretty advanced data processing technique to get as much out of the data as possible."

Two features of the object's spectra were particularly eye-catching and corresponded to the presence of ferric oxides and phyllosilicates. The presence of these materials had never before been confirmed in a KBO, and they strongly suggest that 2004 EW95 formed in the inner Solar System.

Seccull concludes: "Given 2004 EW95's present-day abode in the icy outer reaches of the Solar System, this implies that it has been flung out into its present orbit by a migratory planet in the early days of the Solar System."

Read more at Science Daily

May 8, 2018

Why does the Sun's Corona sizzle at one million °F?

A team of physicists, including NJIT’s Gregory Fleishman, has discovered previously undetected energy in the Sun’s coronal loops.
The Sun's corona, invisible to the human eye except when it appears briefly as a fiery halo of plasma during a solar eclipse, remains a puzzle even to scientists who study it closely. Beginning about 1,300 miles above the star's surface, it is more than a hundred times hotter than the lower layers, which lie much closer to the fusion reactor at the Sun's core.

A team of physicists, led by NJIT's Gregory Fleishman, has recently discovered a phenomenon that may begin to untangle what they call "one of the greatest challenges for solar modeling" -- determining the physical mechanisms that heat the upper atmosphere to 1 million degrees Fahrenheit and higher. Their findings, which account for previously undetected thermal energy in the corona, were recently published in the 123-year-old Astrophysical Journal, whose editors have included foundational space scientists such as Edwin Hubble.

"We knew that something really intriguing happens at the interface between the photosphere -- the Sun's surface -- and the corona, given the noticeable disparities in the chemical composition between the two layers and the sharp rise in plasma temperatures at this junction," notes Fleishman, a distinguished research professor of physics.

With a series of observations from NASA's space-based Solar Dynamics Observatory (SDO), the team has revealed regions in the corona with elevated levels of heavy metal ions contained in magnetic flux tubes -- concentrations of magnetic fields -- which carry an electrical current. Their vivid images, captured in the extreme (short-wave) ultraviolet (EUV) band, reveal concentrations of multiply charged metal ions, relative to single-electron hydrogen ions, that are disproportionately large -- by a factor of five or more -- compared with those in the photosphere.

The iron ions reside in what the team calls "ion traps" located at the base of coronal loops, arcs of electrified plasma directed by magnetic field lines. The existence of these traps, they say, implies that there are highly energetic coronal loops, depleted of iron ions, which have thus far eluded detection in the EUV range. Only metal ions, with their fluctuating electrons, produce emissions which make them visible.

"These observations suggest that the corona may contain even more thermal energy than is directly observed in the EUV range and that we have not yet accounted for," he says. "This energy is visible in other wavelengths, however, and we hope to combine our data with scientists who view it through microwaves and X-rays, such as scientists at NJIT's Expanded Owens Valley Solar Array, for example, to clarify mismatches in energy that we've been able to quantify so far."

There are various theories, none yet conclusive, that could explain the sizzling heat of the corona: magnetic field lines that reconnect in the upper atmosphere and release explosive energy, and energy waves dumped in the corona, where they are converted to thermal energy, among others.

"Before we can address how energy is generated in the corona, we must first map and quantify its thermal structure," Fleishman notes.

"What we know of the corona's temperature comes from measuring EUV emissions produced by heavy ions in various states of ionization, which depends on their concentrations, as well as plasma temperature and density," he adds. "The non-uniform distribution of these ions in space and time appears to affect the temperature of the corona."

The metal ions enter the corona when variously sized solar flares destroy the traps, evaporating the ions into flux loops in the upper atmosphere.

Energy releases in solar flares and associated forms of eruptions occur when magnetic field lines, with their powerful underlying electric currents, are twisted beyond a critical point that can be measured by the number of turns in the twist. The largest of these eruptions cause what is known as space weather -- the radiation, energetic particles and magnetic field releases from the Sun powerful enough to cause severe effects in Earth's near environment, such as the disruption of communications, power lines and navigation systems.

It is only through recent advances in imaging capabilities that solar scientists can now take routine measurements of photospheric magnetic field vectors from which to compute the vertical component of electric currents, and, simultaneously, quantify the EUV emissions produced by heavy ions.

"Prior to these observations, we have only accounted for the coronal loops filled with heavy ions, but we could not account for flux tubes depleted of them," Fleishman says. "Now all of these poorly understood phenomena have a solid physical foundation that we can observe. We are able to better quantify the corona's thermal structure and gain a clearer understanding of why ion distribution in the solar atmosphere is non-uniform in space and variable in time."

Scientists at NJIT's Big Bear Solar Observatory (BBSO) have captured the first high-resolution images of magnetic fields and plasma flows originating deep below the Sun's surface, tracing the evolution of sunspots and magnetic flux ropes through the chromosphere before their dramatic appearance in the corona as flaring loops.

Read more at Science Daily

Eggs not linked to cardiovascular risk, despite conflicting advice

Fresh chicken eggs.
University of Sydney researchers aim to help clear up conflicting dietary advice around egg consumption, as a new study finds eating up to 12 eggs per week for a year did not increase cardiovascular risk factors in people with pre-diabetes and type 2 diabetes.

Published in the American Journal of Clinical Nutrition today, the research extends a previous study that found similar results over a period of three months.

Led by Dr Nick Fuller from the University's Boden Institute of Obesity, Nutrition, Exercise and Eating Disorders at the Charles Perkins Centre, the research was conducted with the University of Sydney's Sydney Medical School and the Royal Prince Alfred Hospital.

In the initial trial, participants aimed to maintain their weight while embarking on a high-egg (12 eggs per week) or low-egg (less than two eggs per week) diet, with no difference in cardiovascular risk markers identified at the end of three months.

The same participants then embarked on a weight loss diet for an additional three months, while continuing their high or low egg consumption. For a further six months -- up to 12 months in total -- participants were followed up by researchers and continued their high or low egg intake.

At all stages, both groups showed no adverse changes in cardiovascular risk markers and achieved equivalent weight loss -- regardless of their level of egg consumption, Dr Fuller explained.

"Despite differing advice around safe levels of egg consumption for people with pre-diabetes and type 2 diabetes, our research indicates people do not need to hold back from eating eggs if this is part of a healthy diet," Dr Fuller said.

"A healthy diet as prescribed in this study emphasised replacing saturated fats (such as butter) with monounsaturated and polyunsaturated fats (such as avocado and olive oil)," he added.

The extended study tracked a broad range of cardiovascular risk factors including cholesterol, blood sugar and blood pressure, with no significant difference in results between the high egg and low egg groups.

"While eggs themselves are high in dietary cholesterol -- and people with type 2 diabetes tend to have higher levels of the 'bad' low density lipoprotein (LDL) cholesterol -- this study supports existing research that shows consumption of eggs has little effect on the levels of cholesterol in the blood of the people eating them," Dr Fuller explained.

Dr Fuller said the findings of the study were important due to the potential health benefits of eggs for people with pre-diabetes and type 2 diabetes, as well as the general population.

"Eggs are a source of protein and micronutrients that could support a range of health and dietary factors including helping to regulate the intake of fat and carbohydrate, eye and heart health, healthy blood vessels and healthy pregnancies."

The different egg diets also appeared to have no impact on weight, Dr Fuller said.

Read more at Science Daily

Scientists train spider to jump on demand to discover secrets of animal movement

Visual comparison of body attitude and leg arrangement at the start and end of the jumping tasks. The starting frame is the instant of take-off. The end frame is taken either at the point where the spider makes contact with the landing platform, or where the centre of gravity passes the longitudinal location of the task landing point, depending on which happens first.
Scientists have unlocked the secrets of how some predatory spiders catch their prey by successfully training one, for the first time, to jump different distances and heights.

The study, conducted by researchers at The University of Manchester, is the most advanced of its kind to date and the first to use 3D CT scanning and high-speed, high-resolution cameras to record, monitor and analyse a spider's movement and behaviour.

The aim of the research is, first, to answer the question of why jumping spider anatomy and behaviour evolved the way they did, and second, to use this improved understanding of spiders to imagine a new class of agile micro-robots that are currently unthinkable using today's engineering technologies.

The study is being published in the journal Scientific Reports.

Dr Mostafa Nabawy, lead author of the study, says: "The focus of the present work is on the extraordinary jumping capability of these spiders. A jumping spider can leap up to six times its body length from a standing start. The best a human can achieve is about 1.5 body lengths. The force on the legs at take-off can be up to 5 times the weight of the spider -- this is amazing and if we can understand these biomechanics we can apply them to other areas of research."

The researchers trained the spider, which they nicknamed Kim, to jump different heights and distances on a human-made platform in a laboratory environment. Kim belongs to a species of jumping arachnid known as Phidippus regius, or 'Regal Jumping Spider'. The team then recorded the jumps using ultra-high-speed cameras, and used high-resolution micro CT scans to create a 3D model of Kim's legs and body structure in unprecedented detail.

The results show that this particular species of spider uses different jumping strategies depending on the jumping challenge it is presented with.

For example, to jump shorter, close-range distances Kim favoured a faster, lower trajectory, which uses up more energy but minimises flight time. This makes the jump more accurate and more effective for capturing prey. But when jumping a longer distance, or to an elevated platform, perhaps to traverse rough terrain, she jumps in the most energy-efficient way.
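
That trade-off follows from simple ballistic mechanics. In the Python sketch below (which ignores air drag and uses an assumed spider mass and jump distance, not the study's measured values), covering a fixed distance at a launch angle below 45 degrees shortens flight time but demands a faster, more energetic take-off, while 45 degrees minimises energy:

    # Illustrative ballistic model of the jumping trade-off described
    # above. Ignores air drag; mass and distance are made-up values.
    import math

    G = 9.81        # gravity, m/s^2
    MASS = 150e-6   # assumed spider mass, kg (~150 mg)
    DIST = 0.06     # assumed jump distance, m (a few body lengths)

    def launch(angle_deg, distance=DIST):
        """Required speed, take-off energy and flight time for a ballistic jump."""
        theta = math.radians(angle_deg)
        speed = math.sqrt(G * distance / math.sin(2 * theta))  # range equation
        energy = 0.5 * MASS * speed ** 2                       # kinetic energy at take-off
        flight_time = 2 * speed * math.sin(theta) / G
        return speed, energy, flight_time

    # Low, fast trajectory vs the minimum-energy 45-degree launch
    for angle in (25, 45):
        v, e, t = launch(angle)
        print(f"{angle:2d} deg: v = {v:.2f} m/s, "
              f"KE = {e * 1e6:.0f} uJ, flight time = {t * 1e3:.0f} ms")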

Insects and spiders jump in a number of different ways, using either a spring-like mechanism, direct muscle forces or internal fluid pressure.

Scientists have known for more than 50 years that spiders use internal hydraulic pressure to extend their legs, but it is not known whether this hydraulic pressure is actively used to enhance or replace muscle force when the spiders jump.

Read more at Science Daily

25 years of fossil collecting yields clearest picture of extinct 12-foot aquatic predator

A rendition of what Hyneria lindae might have looked like.
After 25 years of collecting fossils at a Pennsylvania site, scientists at the Academy of Natural Sciences of Drexel University now have a much better picture of an ancient, extinct 12-foot fish and the world in which it lived.

Although Hyneria lindae was initially described in 1968, it was done without a lot of fossil material to go on. But since the mid-1990s, dedicated volunteers, students, and paleontologists digging at the Red Hill site in northern Pennsylvania's Clinton County have turned up more -- and better quality -- fossils of the fish's skeleton that have led to new insights.

Academy researchers Ted Daeschler, PhD, and Jason Downs, PhD, who specialize in the Devonian period (a time before dinosaurs and even land animals) when Hyneria lived, have determined that the predator had a blunt, wide snout, reached 10-12 feet in length, had small eyes, and featured a sensory system that allowed it to hunt prey by feeling pressure waves around it.

"Dr. Keith Thomson, the man who first described Hyneria in 1968, did not have enough fossil material to reconstruct the anatomy that we have now been able to document with more extensive collections," explained Daeschler, curator of Vertebrate Zoology at the Academy, as well as a professor in Drexel's College of Arts and Sciences.

Originally, pieces of the fish were collected in the 1950s. Thomson described and officially named Hyneria lindae in 1968, but he had just a few pieces of a crushed skull and some scales to work with.

The new discoveries that Daeschler and Downs (who is an assistant professor at Delaware Valley University) wrote about in the Journal of Vertebrate Paleontology were made possible by years of collecting that turned up, "well-preserved, well-prepared three-dimensional material of almost all of the [bony] parts of the skeleton," according to Downs.

No single complete skeleton of this giant exists, but enough is there to show that Hyneria would have truly been a monster to the other animals in the subtropical streams of the Devonian Period, roughly 365 million years ago. An apex predator, Hyneria had a mouth bristling with two-inch fangs. For reference, that's bigger than most modern great white sharks' teeth.

Due to its sheer size, weaponry, and sensory abilities, Hyneria may have preyed upon anything from ancient placoderms (armored fish), to acanthodians (related to sharks) and sarcopterygians (lobe-finned fish, the group Hyneria belongs to) -- including early tetrapods (limbed vertebrates) that are also found at the site.

Since the streams Hyneria lived in were likely murky and not conducive to hunting by eyesight, sensory canals allowed it to detect fish swimming near it and attack them.

"We discovered that the skull roof elements have openings on their surfaces that connect up, forming a network of tubes that would function like the sensory line system in some modern aquatic vertebrates," Daeschler said. "Similarly, we found a network of connected pores on the parts of the scales that would be exposed on the body of Hyneria."

All of the new information gleaned about Hyneria is doubly valuable because it provides more information about the ecosystem -- and time period -- it lived in. The Devonian was a pivotal time in vertebrate evolution, especially since some of Hyneria's fellow lobe-finned fish developed specialized fins that would take them onto land and eventually give rise to all limbed vertebrates, including reptiles, amphibians and mammals.

Read more at Science Daily

May 7, 2018

New evidence that bullfrogs are to blame for deadly fungus outbreaks in western US

American bullfrog.
In the 1890s, settlers crossed the Rocky Mountains seeking new opportunities -- and bearing frogs. A new study coauthored by a San Francisco State University biology professor draws a link between that introduction of American bullfrogs (Rana catesbeiana) to the western half of the United States and the spread of a fungus deadly to amphibians. The work highlights the catastrophic results of moving animals and plants to new regions.

The fungus Batrachochytrium dendrobatidis (Bd) has rapidly spread around the world since the 1970s, causing a skin disease called chytridiomycosis and wiping out more than 200 species of amphibians globally. In the United States, these declines have followed a curious pattern. "In the whole region east of the Rockies, there hasn't been a single outbreak of Bd," said study author Vance Vredenburg, a professor of biology at San Francisco State. "But in the West there's hundreds, if not thousands."

American bullfrogs, a species introduced to the West by settlers who wished to populate ponds with an abundant source of frog legs, have for over a decade been a main suspect. Bullfrogs can carry Bd without falling victim to it themselves, making them a potential vehicle for the fungus to colonize new habitats that harbor vulnerable amphibians.

Firmly placing the blame on bullfrogs, however, has been difficult. "The problem is we need a time machine to see what happened," Vredenburg explained. So he and a team of colleagues sought out historical data from around the American West to pinpoint when bullfrogs arrived in each region and how those dates line up with the first local records of Bd.

In 83 out of the 100 watersheds where the team could dig up data on both bullfrog and Bd occurrence, the frogs were spotted first or in the same year. And in 13 of the remaining 17 cases, bullfrogs had previously been found in a neighboring region. The team reported their results on April 16 in the journal PLoS ONE.

"Even when Bd got there before bullfrogs, the frogs were usually close by," Vredenburg explained. So these new findings are more evidence in the case scientists have built against American bullfrogs -- if their presence is a prerequisite for an outbreak, it appears even more likely that they've contributed to Bd's spread.

That link between frog and fungus explains patterns in the U.S., but it's also relevant far beyond the country's borders. Thanks in part to a U.S. Agency for International Development program that shipped bullfrogs to developing countries to start frog farms, the invasive amphibians have taken hold in parts of Europe, Asia and South America. "I hope researchers will take this study and try it in other parts of the world," Vredenburg said.

Read more at Science Daily

Mars growth was stunted by early giant planetary instability

The particular dynamics of the instability between the giant planets kept Mars from growing to an Earth-mass planet.
A University of Oklahoma astrophysics team explains why the growth of Mars was stunted by an orbital instability among the outer solar system's giant planets in a new study on the evolution of the young solar system. The OU study builds on the widely accepted Nice Model, which invokes a planetary instability to explain many peculiar observed aspects of the outer solar system. An OU model used computer simulations to show how planet accretion (growth) is halted by the outer solar system instability. Without it, Mars could possibly have become a larger, habitable planet like Earth.

"This study offers a simple and more elegant solution for why Mars is small, barren and uninhabitable," said Matthew S. Clement, OU graduate student in the Homer L. Dodge Department of Physics and Astronomy, OU College of Arts and Sciences. "The particular dynamics of the instability between the giant planets kept Mars from growing to an Earth-mass planet."

Clement and Nathan A. Kaib, OU astrophysics professor, worked with Sean N. Raymond, the University of Bordeaux, France, and Kevin J. Walsh, Southwest Research Institute, to investigate the effect of the Nice Model instability on the process of terrestrial planetary formation. The research team used computing resources provided by the OU Supercomputing Center for Education and Research and the Blue Waters sustained peta-scale computing project to perform 800 computer simulations of this scenario.

The goal of this study was to investigate simulated systems that produced Earth-like planets with Mars analogs as well. Recent geological data from Mars and Earth indicate that Mars' formation period was about 10 times shorter than Earth's, which has led to the idea that Mars was left behind as a 'stranded planetary embryo' during the formation of the Sun's inner planets. The early planetary instability modeled in this study provides a natural explanation for how Mars emerged from the process of planet formation as a 'stranded embryo.'

Read more at Science Daily

Astronomers find exoplanet atmosphere free of clouds

WASP-96b is a 'hot Saturn' located 980 light years from Earth.
Scientists have detected an exoplanet atmosphere that is free of clouds, marking a pivotal breakthrough in the quest for greater understanding of the planets beyond our solar system.

An international team of astronomers, led by Dr Nikolay Nikolov from the University of Exeter, have found that the atmosphere of the 'hot Saturn' WASP-96b is cloud-free.

Using Europe's 8.2m Very Large Telescope in Chile, the team studied the atmosphere of WASP-96b when the planet passed in front of its host-star. This enabled the team to measure the decrease of starlight caused by the planet and its atmosphere, and thereby determine the planet's atmospheric composition.
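
The geometry behind the measurement is simple: the fractional dip in starlight equals the square of the planet-to-star radius ratio, and at wavelengths where the atmosphere absorbs, such as the sodium lines, the planet presents a slightly larger silhouette and the dip deepens. The Python sketch below uses illustrative radii and an assumed atmospheric height, not the published WASP-96b values:

    # Sketch of transit-depth geometry: the dip is (R_planet / R_star)^2,
    # and an absorbing atmosphere makes the planet look slightly bigger
    # at the absorbed wavelength. All numbers are illustrative.
    R_SUN = 6.957e8   # metres
    R_JUP = 7.149e7   # metres

    r_star = 1.05 * R_SUN         # assumed sun-like host star
    r_planet = 1.2 * R_JUP        # ~20% larger than Jupiter, per the article
    dh_sodium = 0.01 * r_planet   # assumed extra opaque height in the sodium line

    def transit_depth(rp, rs=r_star):
        """Fraction of starlight blocked by a planet of radius rp."""
        return (rp / rs) ** 2

    continuum = transit_depth(r_planet)
    in_sodium = transit_depth(r_planet + dh_sodium)
    print(f"dip outside the line: {continuum:.4%}")
    print(f"dip in the sodium line: {in_sodium:.4%}")
    print(f"extra absorption: {(in_sodium - continuum) * 1e6:.0f} ppm")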

Just as an individual's fingerprints are unique, atoms and molecules have unique spectral characteristics that can be used to detect their presence in celestial objects. The spectrum of WASP-96b shows the complete fingerprint of sodium, which can only be observed for an atmosphere free of clouds.

The results are published in the journal Nature on May 7, 2018.

WASP-96b is a typical 1300K hot gas giant similar to Saturn in mass and exceeding the size of Jupiter by 20%. The planet periodically transits a sun-like star 980 light years away in the southern constellation Phoenix, halfway between the southern jewels Fomalhaut (α Piscis Austrini) and Achernar (α Eridani).

It has long been predicted that sodium exists in the atmospheres of hot gas-giant exoplanets, and in a cloud-free atmosphere it would produce spectra that are similar in shape to the profile of a camping tent.

Nikolay Nikolov, lead author, from the University of Exeter, said: "We've been looking at more than twenty exoplanet transit spectra. WASP-96b is the only exoplanet that appears to be entirely cloud-free and shows such a clear sodium signature, making the planet a benchmark for characterization.

"Until now, sodium was revealed either as a very narrow peak or found to be completely missing. This is because the characteristic 'tent-shaped' profile can only be produced deep in the atmosphere of the planet and for most planet clouds appear to get in the way."

Clouds and hazes are known to exist in some of the hottest and coldest solar system planets and exoplanets. The presence or absence of clouds and their ability to block light plays an important role in the overall energy budget of planetary atmospheres.

"It is difficult to predict which of these hot atmospheres will have thick clouds. By seeing the full range of possible atmospheres, from very cloudy to nearly cloud-free like WASP-96b, we'll gain a better understanding of what these clouds are made of; " explains Professor Jonathan J. Fortney, study co-author, based at the Other Worlds Laboratory (OWL) at the University of California, Santa Cruz (UCSC).

The sodium signature seen in WASP-96b suggests an atmosphere free of clouds. The observation allowed the team to measure how abundant sodium is in the atmosphere of the planet, finding levels similar to those found in our own Solar System.

"WASP-96b will also provide us with a unique opportunity to determine the abundances of other molecules, such as water, carbon monoxide and carbon dioxide with future observations ," adds co-author Ernst de Mooij from Dublin City University.

Sodium is the seventh most common element in the Universe. On Earth, sodium compounds such as salt give sea water its salty taste and the white colour of salt pans in deserts.

In animal life, sodium is known to regulate heart activity and metabolism. Sodium is also used in technology, such as in the sodium-vapour street lights, where it produces yellow-orange light.

Read more at Science Daily

What will happen when our sun dies?

Abell 39, the 39th entry in a catalog of large nebulae discovered by George Abell in 1966, is a beautiful example of a planetary nebula. It was chosen for study by George Jacoby (WIYN Observatory), Gary Ferland (University of Kentucky), and Kirk Korista (Western Michigan University) because of its beautiful and rare spherical symmetry. This picture was taken at the WIYN Observatory's 3.5-m (138-inch) telescope at Kitt Peak National Observatory, Tucson, AZ, in 1997 through a blue-green filter that isolates the light emitted by oxygen atoms in the nebula at a wavelength of 500.7 nanometers. The nebula has a diameter of about five light-years, and the thickness of the spherical shell is about a third of a light-year. The nebula itself is roughly 7,000 light-years from Earth in the constellation Hercules.
Scientists agree the sun will die in approximately 10 billion years, but they weren't sure what would happen next...until now.

A team of international astronomers, including Professor Albert Zijlstra from the University of Manchester, predict it will turn into a massive ring of luminous, interstellar gas and dust, known as a planetary nebula.

A planetary nebula marks the end of 90% of all stars' active lives and traces the star's transition from a red giant to a degenerate white dwarf. But, for years, scientists weren't sure if our own sun would follow the same fate: it was thought to have too low a mass to create a visible planetary nebula.

To find out, the team developed a new stellar data model that predicts the life cycle of stars. The model was used to predict the brightness (or luminosity) of the ejected envelope for stars of different masses and ages.

The research is being published in Nature Astronomy on Monday, May 7.

Prof Zijlstra explains: "When a star dies it ejects a mass of gas and dust -- known as its envelope -- into space. The envelope can be as much as half the star's mass. This reveals the star's core, which by this point in the star's life is running out of fuel, eventually turning off before finally dying.

"It is only then the hot core makes the ejected envelope shine brightly for around 10,000 years -- a brief period in astronomy. This is what makes the planetary nebula visible. Some are so bright that they can be seen from extremely large distances measuring tens of millions of light years, where the star itself would have been much too faint to see."

The model also solves another problem that has been perplexing astronomers for a quarter of a century.

Approximately 25 years ago, astronomers discovered that if you look at planetary nebulae in another galaxy, the brightest ones always have the same brightness. It was found that it was possible to tell how far away a galaxy was just from the appearance of its brightest planetary nebulae. In theory, it worked for any type of galaxy.

But whilst the data suggested this was correct, the scientific models claimed otherwise. Prof Zijlstra adds: "Old, low-mass stars should make much fainter planetary nebulae than young, more massive stars. This has been a source of conflict for the past 25 years.

"The data said you could get bright planetary nebulae from low mass stars like the sun, the models said that was not possible, anything less than about twice the mass of the sun would give a planetary nebula too faint to see."

The new models show that after the ejection of the envelope, the stars heat up three times faster than found in older models. This makes it much easier for a low mass star, such as the sun, to form a bright planetary nebula. The team found that in the new models, the sun is almost exactly the lowest mass star that still produces a visible, though faint, planetary nebula. Stars even a few per cent smaller do not.

Professor Zijlstra added: "We found that stars with a mass less than 1.1 times the mass of the sun produce fainter nebulae, and stars more massive than 3 solar masses brighter nebulae, but for the rest the predicted brightness is very close to what had been observed. Problem solved, after 25 years!"

Read more at Science Daily

May 6, 2018

Less is more when it comes to developing bigger brains

The superior size and complexity of the human brain compared to other mammals may actually originate from fewer starting materials, new research has suggested.

A team from the University of Oxford and Cardiff University have used mathematical models to re-enact the complex process of brain development that occurs as initialising cells, otherwise known as progenitor cells, start to grow and begin to differentiate into more specialist cells at various points in time.

By applying this experimentally realistic model to mice, monkeys and humans, all of which use roughly the same type of raw materials to develop a brain, the team identified the different brain development strategies that separate the three mammals.

In particular, the equations looked at the ability of progenitor cells to divide either into more progenitor cells or into neurons. The equations were then linked to real-life experimental data from mice, monkeys and humans and used to predict the original population of progenitor cells before the brains started to develop.
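
As a toy stand-in for that kind of model (a sketch, not the authors' actual equations), one can iterate the expected counts of a branching process in which each progenitor either self-renews into two progenitors, with probability p, or differentiates into two neurons. The illustrative parameters in the Python sketch below show how sensitively the final neuron count depends on that probability:

    # Toy progenitor-division model: each progenitor self-renews into two
    # progenitors (probability p) or differentiates into two neurons.
    # p, the division count and the starting pool are illustrative
    # stand-ins, not the study's fitted parameters.

    def grow(initial_progenitors, p_self_renew, divisions):
        """Expected progenitor and neuron counts after a number of division rounds."""
        progenitors = float(initial_progenitors)
        neurons = 0.0
        for _ in range(divisions):
            dividing = progenitors
            progenitors = 2 * p_self_renew * dividing        # self-renewing divisions
            neurons += 2 * (1 - p_self_renew) * dividing     # differentiating divisions
        return progenitors, neurons

    # Same starting pool, two hypothetical strategies: a modest rise in
    # the self-renewal probability yields orders of magnitude more neurons.
    for p in (0.55, 0.65):
        _, n = grow(initial_progenitors=1000, p_self_renew=p, divisions=30)
        print(f"p = {p:.2f}: ~{n:.2e} neurons from 1,000 progenitors")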

The results showed that the human brain may develop from fewer raw materials compared to both mice and monkeys, which is surprising given that a human brain is much more complex than that of a mouse.

Indeed, the cerebral cortex in the human brain, which is responsible for higher cognitive functions such as language, memory and movement, contains approximately 16 billion neurons -- the cerebral cortex of a mouse contains around 14 million neurons.

Similarly, the brain of a mouse weighs around 400 mg whereas a human brain weighs roughly 1,500,000 mg.

Interestingly, when comparing the brain of a monkey to that of a mouse, the results showed that the monkey brain develops from more initial cells, leading to the creation of a larger brain.

The team have proposed that as the human brain has been formed and sculpted through more than 500 million years of evolution, it has been able to develop more strategic ways of creating complex structures with fewer cells.

In further studies the team hope to use their mathematical models to shed more light on how these strategies may have advanced through evolution and, perhaps more importantly, to understand diseases in which different brain development strategies may be at play, such as schizophrenia, epilepsy and Zika-virus-induced microcephaly.

Dr Thomas E. Woolley of Cardiff University's School of Mathematics said: "This project has really brought together the complementary strengths of the mathematicians and biologists. In particular, the mathematics has highlighted the next most important experimental steps."

Dr Noemi Picco, from the University of Oxford, said: "To produce a larger brain we can either stretch development over a longer period of time or adopt an altogether different developmental program to produce neurons more efficiently within the time available.

"It seems plausible that humans adopted the first solution as our gestational period is much longer than a mouse's, rather than starting off with more raw material."

"While this argument is only speculative, this research produced an alternative testable hypothesis, setting the basis for future experimental studies."

Professor Zoltán Molnár of Oxford's Department of Physiology, Anatomy and Genetics said: "The modelling helped us to realise just how little we currently know about the comparative aspects of cerebral cortical development.

"Some of the data we have are not sufficient to start modelling more complex issues of brain development and evolution. We are planning to assemble an international collaborative team to feed in the numbers for future models."

Read more at Science Daily

Temperature swings to hit poor countries hardest

Relative changes (%) of standard deviation of monthly temperature anomalies from pre-industrial conditions to the end of the 21st century, averaged over 37 climate models.
Temperature fluctuations that are amplified by climate change will hit the world's poorest countries hardest, new research suggests.

For every degree of global warming, the study suggests temperature variability will increase by up to 15% in southern Africa and Amazonia, and up to 10% in the Sahel, India and South East Asia.

Meanwhile, countries outside the tropics -- many of which are richer countries that have contributed most to climate change -- should see a decrease in temperature variability.

The researchers, from the universities of Exeter, Wageningen and Montpellier, discovered this "unfair pattern" as they addressed the difficult problem of predicting how weather extremes such as heat waves and cold snaps might change in a future climate.

"The countries that have contributed least to climate change, and have the least economic potential to cope with the impacts are facing the largest increases in temperature variability," said lead author Dr Sebastian Bathiany, of Wageningen University.

Co-author Professor Tim Lenton, from the University of Exeter, added: "The countries affected by this dual challenge of poverty and increasing temperature variability already share half of the world's population, and population growth rates are particularly large in these countries."

"These increases are bad news for tropical societies and ecosystems that are not adapted to fluctuations outside of the typical range."

The study also reveals that most of the increased temperature fluctuations in the tropics are associated with droughts -- an extra threat to food and water supplies.

For their investigation, the team analysed 37 different climate models that have been used for the last report of the Intergovernmental Panel on Climate Change (IPCC).

Read more at Science Daily