Feb 1, 2020

Modeling study estimates spread of 2019 novel coronavirus

New modelling research, published in The Lancet, estimates that up to 75,800 individuals in the Chinese city of Wuhan may have been infected with 2019 novel coronavirus (2019-nCoV) as of January 25, 2020.

Senior author Professor Gabriel Leung from the University of Hong Kong highlights: "Not everyone who is infected with 2019-nCoV would require or seek medical attention. During the urgent demands of a rapidly expanding epidemic of a completely new virus, especially when system capacity is getting overwhelmed, some of those infected may be undercounted in the official register."

He explains: "The apparent discrepancy between our modelled estimates of 2019-nCoV infections and the actual number of confirmed cases in Wuhan could also be due to several other factors. These include that there is a time lag between infection and symptom onset, delays in infected persons coming to medical attention, and time taken to confirm cases by laboratory testing, which could all affect overall recording and reporting."

The new estimates also suggest that multiple major Chinese cities might have already imported dozens of cases of 2019-nCoV infection from Wuhan, in numbers sufficient to initiate local epidemics.

The early estimates underscore that it will likely take rapid and immediate scale-up of substantial public health control measures to prevent large epidemics in areas outside Wuhan. Further analyses suggest that if transmissibility of 2019-nCoV could be reduced, both the growth rate and size of local epidemics in all cities across China could be reduced.

"If the transmissibility of 2019-nCoV is similar nationally and over time, it is possible that epidemics could be already growing in multiple major Chinese cities, with a time lag of one to two weeks behind the Wuhan outbreak," says lead author Professor Joseph Wu from the University of Hong Kong. "Large cities overseas with close transport links to China could potentially also become outbreak epicentres because of substantial spread of pre-symptomatic cases unless substantial public health interventions at both the population and personal levels are implemented immediately."

According to Professor Gabriel Leung: "Based on our estimates, we would strongly urge authorities worldwide that preparedness plans and mitigation interventions should be readied for quick deployment, including securing supplies of test reagents, drugs, personal protective equipment, hospital supplies, and above all human resources, especially in cities with close ties with Wuhan and other major Chinese cities."

In the study, researchers used mathematical modelling to estimate the size of the epidemic based on officially reported 2019-nCoV case data and domestic and international travel (i.e., train, air, road) data. They assumed that the serial interval estimate (the time it takes for infected individuals to infect other people) for 2019-nCoV was the same as for severe acute respiratory syndrome (SARS: table 1). The researchers also modelled potential future spread of 2019-nCoV in China and internationally, accounting for the potential impact of various public health interventions that were implemented in January 2020 including use of face masks and increased personal hygiene, and the quarantine measures introduced in Wuhan on January 23.

The researchers estimate that in the early stages of the Wuhan outbreak (from December 1, 2019 to January 25, 2020) each person infected with 2019-nCoV could have infected up to 2-3 other individuals on average, and that the epidemic doubled in size every 6.4 days. During this period, up to 75,815 individuals could have been infected in Wuhan.
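The reported doubling time implies rapid compounding over the roughly eight-week outbreak window. A back-of-the-envelope sketch (my own illustration, not the authors' transmission model, which also incorporates travel data and a SARS-like serial interval) shows how fast an epidemic doubling every 6.4 days grows:

```python
# Illustrative only: exponential growth implied by a 6.4-day doubling
# time over the December 1 - January 25 window described in the study.
from datetime import date

DOUBLING_TIME_DAYS = 6.4  # estimate reported by the researchers

days = (date(2020, 1, 25) - date(2019, 12, 1)).days  # length of the window
doublings = days / DOUBLING_TIME_DAYS
growth_factor = 2 ** doublings  # how many times the initial case count multiplies

print(days, round(growth_factor))
```

Over those 55 days the case count multiplies by a factor of several hundred, which is why even a small initial cluster can reach tens of thousands of infections; the 75,815 figure itself comes from the full model, not from this simple factor.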

Additionally, estimates suggest that cases of 2019-nCoV infection may have spread from Wuhan to multiple other major Chinese cities as of January 25, including Guangzhou (111 cases), Beijing (113), Shanghai (98), and Shenzhen (80; figure 3). Together these cities account for over half of all outbound international air travel from China.

While the estimates suggest that the quarantine in Wuhan may not have the intended effect of completely halting the epidemic, further analyses suggest that if transmissibility of 2019-nCoV could be reduced by 25% in all cities nationally with expanded control efforts, both the growth rate and size of local epidemics could be substantially reduced. Moreover, a 50% reduction in transmissibility could shift the current 2019-nCoV epidemic from one that is expanding rapidly, to one that is slowly growing (figure 4).

"It might be possible to reduce local transmissibility and contain local epidemics if substantial, even draconian, measures that limit population mobility in all affected areas are immediately considered. Precisely what and how much should be done is highly contextually specific and there is no one-size-fits-all set of prescriptive interventions that would be appropriate across all settings," says co-author Dr Kathy Leung from the University of Hong Kong. "On top of that, strategies to drastically reduce within-population contact by cancelling mass gatherings, school closures, and introducing work-from-home arrangements could contain the spread of infection so that the first imported cases, or even early local transmission, does not result in large epidemics outside Wuhan."

Read more at Science Daily

Scientists boost gene-editing tools to new heights in human stem cells

During the past decade, the gene-editing tool CRISPR has transformed biology and opened up hopeful avenues to correct deadly inherited diseases. Last fall, scientists began the first human clinical trials using CRISPR to combat diseases like cancer. They remove some of a person's cells, edit the DNA with CRISPR, and then inject the cells back in, where, hopefully, they will cure the disease.

But along with this promise of regenerative, personalized medicine, CRISPR also has significant safety limitations. It may not edit in the right place (so-called off-target effects), or it may not be terribly efficient (successful editing may be achieved in only about 10% of targeted cells).

These limitations have frustrated scientists such as Arizona State University's David Brafman, a cell bioengineer. Brafman's initial hope is to use gene editing in his lab's studies of neurodegenerative diseases like Alzheimer's to get at the heart of what causes them.

"We study neurodegenerative diseases like Alzheimer's and use stem cells to study specific mutations or risk factors associated with Alzheimer's disease," said Brafman, a biomedical engineering faculty member in ASU's Ira A. Fulton Schools of Engineering. "We are not necessarily a gene-editing tool development lab, but we were running into difficulty generating stem cell lines by using a traditional CRISPR-based editing approach. For reasons that are still unknown, stem cells are really resistant to that sort of genetic modification."

Green light means go

Now Brafman, using a new update to the CRISPR base-editing technology originally developed in the lab of David Liu at Harvard, has vastly outperformed previous efforts, making highly accurate single-base DNA edits with an efficiency of up to 90% in human stem cells. The results were published in the journal Stem Cell Reports.

"Previously, with CRISPR, it's just been a random guess," said Brafman. "And so, if you are picking stem cells at random and the efficiency is low, you'll likely get only 10% or 5% because you have no idea if the edits have been made -- the cell isn't telling you."

Brafman's lab has developed a new method called TREE (short for transient reporter for editing enrichment), which allows for bulk enrichment of DNA base-edited cell populations -- and, for the first time, high editing efficiency in human stem cell lines.

"Most of the studies are done in immortalized cell lines or cancer cell lines, which are relatively easy to edit," said Brafman. "This is the first example of using base editors in pluripotent stem cells, which is a very valuable cell population to genetically modify. We envision this method will have important implications for the use of human stem cell lines in developmental biology, disease modeling, drug screening and tissue engineering applications."

Last year, they showed that their TREE approach can work in human cell lines, but they wanted to push the technology further to find a way to rapidly and efficiently edit human stem cell lines.

Unlike CRISPR, which cuts across both DNA strands, their TREE method only makes a single-strand nick in the DNA. For example, when a single DNA base is successfully edited from a C to a T, a reporter protein gives off a signal, turning from blue to green.

"Now, if a cell is telling you, 'If I'm glowing green, I have a 90% chance of being edited,' you are going to have better luck identifying edited populations," said Brafman. "Then, you can exclude all of the cells that are not edited. We isolate single cells that are glowing green, then grow those up into clonal populations that you are able to expand indefinitely."

Targeting Alzheimer's

Pluripotent stem cells are valued for regenerative medicine because they have the ability to become or differentiate into any cell type in the human body.

Brafman explains that there are two general sources, "embryonic stem cells, which are derived from the inner cell mass of a preimplantation blastocyst, and then there are induced pluripotent stem cells, which are derived from taking somatic cells like skin or blood from patients."

Brafman's lab uses the induced pluripotent stem cells for their research.

"For this study, we used pluripotent stem cells from both healthy patients and patients with Alzheimer's disease. Some of the genes that we were interested in modulating are related to Alzheimer's disease. The majority of patients with Alzheimer's disease have the late-onset, or sporadic, form."

To provide their proof of concept, they targeted the APOE gene, which comes in three flavors. One of the three gene variants, called APOE4, has been associated with a higher risk of late-onset Alzheimer's disease. For the study, they introduced single DNA base edits into the APOE gene.

"That's why we are interested in having these cells," said Brafman. "They are representative of the neurons and the various cell types in the central nervous system of patients with these various risk factors. Then, we can understand why an APOE variant can increase or decrease risk, and then we can start targeting those pathways that are affected."

Not only could TREE make single DNA edits to the APOE4 gene, but, unlike CRISPR, it could also make highly accurate corrections to both copies of the gene that humans possess.

"The traditional CRISPR approach is that you have to edit once to get a heterozygous edit, then isolate that clone, and edit again to get another heterozygous edit," said Brafman. "So, it's very inefficient in that way. We are generating homozygous edits at an efficiency approaching 90%. I haven't seen any other technologies that can do that in pluripotent stem cells."

In addition, TREE could also be used to engineer critical gene knockout mutations into stem cell lines. "The most fundamental experiment you can do if a gene has important implications in disease, development or physiology is knock it out," said Brafman. "That opens up a whole bunch of questions that we can address. Using APOE as a case study, we can now knock out APOE in these cells. If you don't have APOE at all, is it beneficial? Detrimental? Or no difference?"

Complex cases

While diseases like sickle-cell anemia or cystic fibrosis are caused by single mutations in DNA, most diseases and leading causes of death, like heart disease or high blood pressure, are complex and involve multiple genes. Brafman also wanted to address the complex root causes of Alzheimer's.

"Especially as it relates to Alzheimer's disease, there can be multiple risk factors that act in concert, so we wanted a way to introduce multiple edits simultaneously in pluripotent stem cells. Because otherwise, you would have to take this sequential, iterative approach, where you introduce one edit, isolate a clonal population, introduce another edit, and so on."

They successfully demonstrated that TREE could be used to make new stem cell lines that had been simultaneously edited at multiple gene locations. Their results showed that more than 80% of stem cell clones had been targeted at all three different gene sites, with both gene copies edited in every clone.

"We found that if you multiplex you still get the same efficiency of editing as you would if you just edited a single allele," said Brafman. "Now, we can use these cells as in vitro models to study the disease and screen drugs."

Brafman is hopeful that their new tools will generate excitement in the gene editing community, and spur others on to make new discoveries.

Read more at Science Daily

Jan 31, 2020

Red alert as Arctic lands grow greener

New research techniques are being adopted by scientists tackling the most visible impact of climate change -- the so-called greening of Arctic regions.

The latest drone and satellite technology is helping an international team of researchers to better understand how the vast, treeless regions known as tundra are becoming greener.

As Arctic summer temperatures warm, plants are responding. Snow is melting earlier and plants are coming into leaf sooner in spring. Tundra vegetation is spreading into new areas and where plants were already growing, they are now growing taller.

Understanding how data captured from the air compare with observations made on the ground will help to build the clearest picture yet of how the northern regions of Europe, Asia and North America are changing as the temperature rises.

Now a team of 40 scientists from 36 institutions, led by two National Geographic Explorers, has revealed that the causes of this greening process are more complex -- and variable -- than previously thought.

Researchers from Europe and North America are finding that the Arctic greening observed from space is caused by more than just the responses of tundra plants to warming on the ground. Satellites are also capturing other changes including differences in the timing of snowmelt and the wetness of landscapes.

Lead author Dr Isla Myers-Smith, of the University of Edinburgh's School of GeoSciences, said: "New technologies including sensors on drones, planes and satellites, are enabling scientists to track emerging patterns of greening found within satellite pixels that cover the size of football fields."

Professor Scott Goetz of the School of Informatics, Computing and Cyber Systems at Northern Arizona University, says this research is vital for our understanding of global climate change. Tundra plants act as a barrier between the warming atmosphere and huge stocks of carbon stored in frozen ground.

Changes in vegetation alter the balance between the amount of carbon captured and its release into the atmosphere. Small variations could significantly impact efforts to keep warming below 1.5 degrees centigrade -- a key target of the Paris Agreement. The study will help scientists to figure out which factors will speed up or slow down warming.

Co-lead author Dr Jeffrey Kerby, who was a Neukom Fellow at Dartmouth College while conducting the research, said: "Besides collecting new imagery, advances in how we process and analyse these data -- even imagery that is decades old -- are revolutionising how we understand the past, present, and future of the Arctic."

Read more at Science Daily

'Spring forward' to daylight saving time brings surge in fatal car crashes

Fatal car accidents in the United States spike by 6% during the workweek following the "spring forward" to daylight saving time, resulting in about 28 additional deaths each year, according to new University of Colorado Boulder research.

The study, published January 30 in the journal Current Biology, also found that the farther west a person lives in his or her time zone, the higher their risk of a deadly crash that week.

"Our study provides additional, rigorous evidence that the switch to daylight saving time in spring leads to negative health and safety impacts," said senior author Celine Vetter, assistant professor of integrative physiology. "These effects on fatal traffic accidents are real, and these deaths can be prevented."

The findings come at a time when numerous states, including Oregon, Washington, California and Florida, are considering doing away with the switch entirely, and mounting research is showing spikes in heart attacks, strokes, workplace injuries and other problems in the days following the time change.

For the study -- the largest and most detailed to date to assess the relationship between the time change and fatal motor vehicle accidents -- the researchers analyzed 732,835 accidents recorded through the U.S. Fatality Analysis Reporting System from 1996 to 2017. They excluded Arizona and Indiana, where daylight saving time was not consistently observed.

After controlling for factors like year, season and day of the week, they found a consistent rise in fatal accidents in the week following the spring time change. Notably, that spike moved in 2007, when the Energy Policy Act extended daylight saving time to begin on the second Sunday of March instead of the first Sunday in April.

"Prior to 2007, we saw the risk increase in April, and when daylight saving time moved to March, so did the risk increase," said Vetter. "That gave us even more confidence that the risk increase we observe is indeed attributable to the daylight saving time switch, and not something else."

With the arrival of daylight saving time on March 9, clocks shift forward by one hour, and many people will miss out on sleep and drive to work in darkness -- both factors that can contribute to crashes.

Those on the western edge of their time zone, in places like Amarillo, Texas, and St. George, Utah, already get less sleep on average than their counterparts in the east -- about 19 minutes less per day, research shows -- because the sun rises and sets later but they still have to be at work when everyone else does.

"They already tend to be more misaligned and sleep-deprived, and when you transition to daylight saving time it makes things worse," said first author Josef Fritz, a postdoctoral researcher in the Department of Integrative Physiology. In such western regions, the spike in fatal accidents was more than 8%, the study found.

The increase kicks in right away, on the Sunday when the clocks spring forward, and the bulk of the additional fatal accidents that week occur in the morning.

Changes in accident patterns also occur after the "fall back" time change, the study showed, with a decline in morning accidents and a spike in the evening, when darkness comes sooner.

Because they balance each other out, there is no overall change in accidents during the "fall back" week.

In all, over the course of the 22 years of data analyzed, about 627 people died in fatal car accidents associated with the spring shift to daylight saving time, the study estimated.
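The study's two headline figures are mutually consistent, as a quick back-of-the-envelope check (my own arithmetic, not the study's statistical model) shows:

```python
# Consistency check: ~627 excess deaths over the 22-year data span
# should match the "about 28 additional deaths each year" estimate.
excess_deaths_total = 627
years = 22  # 1996-2017 data span

deaths_per_year = excess_deaths_total / years
print(round(deaths_per_year, 1))  # ~28.5, in line with "about 28" per year
```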

Because the data only include the most severe of car accidents, the authors believe the results underestimate the true risk increase to drivers when time springs forward.

"Our results support the theory that abolishing time changes completely would improve public health," said Vetter. "But where do we head from here? Do we go to permanent standard time or permanent daylight saving time?"

Read more at Science Daily

Cycling to work? You may live longer

People who cycle to work have a lower risk of dying, a New Zealand study has found.

The study, by researchers from the University of Otago, Wellington, the University of Melbourne and the University of Auckland, has just been published in the International Journal of Epidemiology.

Lead researcher Dr Caroline Shaw, from the Department of Public Health at the University of Otago, Wellington, says people who cycled to work had a 13 per cent reduction in mortality during the study, likely as a result of the health benefits of physical activity. There was no reduction in mortality for those who walked or took public transport to work.

The researchers used data from the New Zealand Census-Mortality Study, which links census and mortality records, to do follow-up studies of the population for three to five years following the 1996, 2001 and 2006 censuses, when respondents were asked: 'On X date (census day), what was the one main way you travelled to work -- that is, the one you used for the greatest distance?'

Dr Shaw says the study, which analysed data from 3.5 million New Zealanders, is one of the largest ever cohort studies to examine the association between mode of travel to work and mortality outcomes.

"We studied 80 per cent of the working-age population of New Zealand over a 15-year period, so it is highly representative."

Dr Shaw says increasing 'active transport' is being promoted as a way of addressing health and environmental issues, but the association between different modes of transport, such as cycling, walking and public transport, and health outcomes has remained unclear.

The study found more than 80 per cent of people in New Zealand travelled to work by car on census day, with only five per cent walking and three per cent cycling.

"There were gender differences in mode of travel to work, with two per cent of women cycling compared with four per cent of men, but more women walking or jogging (seven per cent), compared with men (five per cent). A higher proportion of younger people cycled, walked or took public transport compared with older people."

Dr Shaw says the census data provided no details about the physical intensity of the commute, so those who lived in the inner city and walked 200 metres to work were in the same category as those who walked briskly up and down a hill for 30 minutes to get to and from work.

"We saw no increase in road traffic injury deaths associated with walking and cycling, although the New Zealand transport system at the time of these studies was heavily car-dominated and roads seldom made allowances for pedestrians and cyclists."

Dr Shaw says the findings lend support for initiatives to increase the number of people commuting to work by bike.

"Increasing cycling for commuting to work in a country with low levels of cycling like New Zealand will require policies directed at both transport and urban planning, such as increasing housing density and implementing cycling networks."

While the study found no association between walking or taking public transport to work and a reduction in mortality, Dr Shaw says there are other reasons to promote these modes of transport.

Read more at Science Daily

Anti-solar cells: A photovoltaic cell that works at night

What if solar cells worked at night? That's no joke, according to Jeremy Munday, professor in the Department of Electrical and Computer Engineering at UC Davis. In fact, a specially designed photovoltaic cell could generate up to 50 watts of power per square meter under ideal conditions at night, about a quarter of what a conventional solar panel can generate in daytime, according to a concept paper by Munday and graduate student Tristan Deppe. The article was published in, and featured on the cover of, the January 2020 issue of ACS Photonics.

Munday, who recently joined UC Davis from the University of Maryland, is developing prototypes of these nighttime solar cells that can generate small amounts of power. The researchers hope to improve the power output and efficiency of the devices.

Munday said that the process is similar to the way a normal solar cell works, but in reverse. An object that is hot compared to its surroundings will radiate heat as infrared light. A conventional solar cell is cool compared to the sun, so it absorbs light.

Space is really, really cold, so if you have a warm object and point it at the sky, it will radiate heat toward it. People have been using this phenomenon for nighttime cooling for hundreds of years. In the last five years, Munday said, there has been a lot of interest in devices that can do this during the daytime (by filtering out sunlight or pointing away from the sun).
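For rough physical context (my own sketch, not a calculation from the paper): the Stefan-Boltzmann law caps how much thermal power a warm surface can radiate in total, and the ~50 watts per square meter cited for a thermoradiative cell is a fraction of that total emission, since only part of the radiated energy can be converted to electricity.

```python
# Total blackbody emission from a surface near room temperature,
# via the Stefan-Boltzmann law. Illustrative upper bound only.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def blackbody_emission(temp_kelvin):
    """Total power radiated per unit area by an ideal blackbody."""
    return SIGMA * temp_kelvin ** 4

radiated = blackbody_emission(300.0)  # ~300 K, roughly room temperature
print(round(radiated))  # ~459 W/m^2 total thermal emission
```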

Generating power by radiating heat

There's another kind of device called a thermoradiative cell that generates power by radiating heat to its surroundings. Researchers have explored using them to capture waste heat from engines.

"We were thinking, what if we took one of these devices and put it in a warm area and pointed it at the sky," Munday said.

This thermoradiative cell pointed at the night sky would emit infrared light because it is warmer than outer space.

"A regular solar cell generates power by absorbing sunlight, which causes a voltage to appear across the device and for current to flow. In these new devices, light is instead emitted and the current and voltage go in the opposite direction, but you still generate power," Munday said. "You have to use different materials, but the physics is the same."

Read more at Science Daily

Jan 30, 2020

People may lie to appear honest

People may lie to appear honest if events that turned out in their favor seem too good to be true, according to new research published by the American Psychological Association.

"Many people care greatly about their reputation and how they will be judged by others, and a concern about appearing honest may outweigh our desire to actually be honest, even in situations where it will cost us money to lie," said lead researcher Shoham Choshen-Hillel, PhD, a senior lecturer at the School of Business Administration and Center for the Study of Rationality at The Hebrew University of Jerusalem. "Our findings suggest that when people obtain extremely favorable outcomes, they anticipate other people's suspicious reactions and prefer lying and appearing honest over telling the truth and appearing as selfish liars."

The researchers found similar patterns of lying to appear honest in a series of experiments conducted with lawyers and college students in Israel, as well as online participants in the United States and United Kingdom. The research was published online in the Journal of Experimental Psychology: General.

In one experiment with 115 lawyers in Israel, the participants were told to imagine a scenario where they told a client that a case would cost between 60 and 90 billable hours. The lawyer would be working in an office where the client wouldn't know how many hours were truly spent on the case. Half of the participants were told they had worked 60 hours on the case while the other half were told they worked 90 hours. Then they were asked how many hours they would bill the client. In the 60-hour group, the lawyers reported an average of 62.5 hours, with 17% of the group lying to inflate their hours. In the 90-hour group, the lawyers reported an average of 88 hours, with 18% of the group lying to report fewer hours than they had actually worked.

When asked for an explanation for the hours they billed, some lawyers in the 90-hour group said they worried that the client would think he had been cheated because the lawyer had lied about the number of billable hours.

In another experiment, 149 undergraduate students at an Israeli university played online dice-rolling and coin-flipping games in private and then reported their scores to a researcher. The participants received approximately 15 cents for each successful coin flip or dice roll they reported. The computer program was manipulated for half of the students so they received perfect scores in the games, while the other group had random outcomes based on chance. In the perfect-score group, 24% underreported their number of wins even though it cost them money, compared with 4% in the random-outcome group.

"Some participants overcame their aversion toward lying and the monetary costs involved just to appear honest to a single person who was conducting the experiment," Choshen-Hillel said.

In another online experiment with 201 adults from the United States, participants were told to imagine a scenario where they drove on many work trips for a company that had a maximum monthly compensation of 400 miles. They were told that most employees reported 280 to 320 miles per month.

Half of the participants were told they had driven 300 miles in a month while the other half were told they drove 400 miles. When the participants were asked how many miles they would report, the 300-mile group told the truth and reported an average of 301 miles. For the 400-mile group, the participants reported an average of 384 miles, with 12% lying and underreporting their mileage. There were similar findings in another online experiment with 544 participants in the United Kingdom.

Choshen-Hillel said she believes the study findings would apply in the real world, but there could be situations where the amount of money or other high stakes would lead people to tell the truth even if they might appear dishonest.

Read more at Science Daily

Citizen science discovers a new form of the northern lights

Working together with space researchers, Finnish amateur photographers have discovered a new auroral form. Named 'dunes' by the hobbyists, the phenomenon is believed to be caused by waves of oxygen atoms glowing due to a stream of particles released from the Sun.

In the recently published study, the origins of the dunes were tracked to a wave guide formed within the mesosphere and its boundary, the mesopause. The study also posits that this new auroral form provides researchers with a novel way to investigate conditions in the upper atmosphere.

The study was published in the first issue of the journal AGU Advances.

An unknown fingerprint appears in the sky

Minna Palmroth, Professor of Computational Space Physics at the University of Helsinki, heads a research group developing the world's most accurate simulation of the near-Earth space and space weather that cause auroral emissions.

The sun releases a steady flow of charged particles, known as the solar wind. Reaching Earth's ionised upper atmosphere, the ionosphere, they create auroral emissions by exciting atmospheric oxygen and nitrogen atoms. The excitation state is released as auroral light.

In late 2018, Palmroth published a book entitled 'Revontulibongarin opas' ('A guide for aurora borealis watchers'). The book was born out of Palmroth's cooperation with Northern Lights enthusiasts and the answers she provided to questions about the physics of the phenomenon in the hobbyists' Facebook group.

Thousands of magnificent photographs of the Northern Lights taken by hobbyists were surveyed and categorised for the book. Each auroral form is like a fingerprint, typical only of a certain phenomenon in the auroral zone. During the classification, hobbyists pointed out that a certain auroral form did not fit into any of the pre-existing categories. Palmroth set aside these unusual forms for later consideration.

By an almost unbelievable coincidence, just days after the book was published, the hobbyists saw this unusual form again and immediately informed Palmroth. The form appeared as a green-tinged and even pattern of waves resembling a striped veil of clouds or dunes on a sandy beach.

"One of the most memorable moments of our research collaboration was when the phenomenon appeared at that specific time and we were able to examine it in real time," says Northern Lights and astronomy hobbyist Matti Helin.

Waves newly revealed by the aurora

Investigations into the phenomenon were launched, with hobbyist observations and scientific methods coming together to explain the waves.

"It was like piecing together a puzzle or conducting detective work," says Helin. "Every day we found new images and came up with new ideas. Eventually, we got to the bottom of it..."

The phenomenon was photographed at the same time in both Laitila and Ruovesi, southwest Finland, with the same detail observed in the auroral emission in both images. Maxime Grandin, a postdoctoral researcher in Palmroth's team, identified stars behind the emission and determined the azimuths and elevations of the stars with the help of the astronomy software program Stellarium. This made it possible to use the stars as points of reference when calculating the altitude and extent of the auroral phenomenon.
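
The article does not give the exact geometry, but the principle behind combining the two photographs is classic triangulation: when two observers a known distance apart both measure the elevation angle of the same feature, its altitude follows from the baseline. A simplified 2-D, flat-Earth sketch (all numbers hypothetical, not the study's):

```python
import math

def altitude_from_two_sites(baseline_km, elev1_deg, elev2_deg):
    """Altitude of a feature located between two observers who both
    measure its elevation angle (flat-Earth, 2-D simplification)."""
    cot1 = 1.0 / math.tan(math.radians(elev1_deg))
    cot2 = 1.0 / math.tan(math.radians(elev2_deg))
    # The horizontal offsets from each site sum to the baseline:
    # d = h * (cot(e1) + cot(e2)), so solve for h.
    return baseline_km / (cot1 + cot2)

# Hypothetical example: sites 100 km apart, each seeing the feature 45 degrees up.
print(round(altitude_from_two_sites(100.0, 45.0, 45.0), 6))  # 50.0
```

In the actual study the star field behind the aurora supplied the angles, which is why identifying the stars in both images was the key step.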

Grandin found that the auroral dunes occur at a relatively low altitude of 100 kilometres, in the upper parts of the mesosphere. The wavelength of the wave field was measured to be 45 kilometres.

A total of seven similar events -- where a camera had recorded the same even pattern of waves -- were further identified from the 'Taivaanvahti' ('Sky Watch') service maintained by the Finnish Amateur Astronomer Association, Ursa.

Unexplored region

The part of the auroral zone where Earth's electrically neutral atmosphere meets the edge of space is an extremely challenging environment for satellites and other space-borne instruments. Palmroth says this is why it is one of the least studied places on our planet.

"Due to the difficulties in measuring the atmospheric phenomena occurring between 80 and 120 kilometres in altitude, we sometimes call this area 'the ignorosphere'," she says.

The dunes were observed precisely in that particular region of the auroral zone. The observed phenomenon guided the researchers towards a middle ground between atmospheric research and space research, as the usual methodology of space physics could not explain it alone.

"The differences in brightness within the dune waves could be due to either waves in the precipitating particles coming from space, or in the underlying atmospheric oxygen atoms," says Palmroth. "We ended up proposing that the dunes are a result of increased oxygen atom density."

Next, the team had to determine how the variability in the density of the oxygen atoms caused by gravity waves in the atmosphere results in such an even and widespread field of waves. Normally at the altitude of study there are many different kinds of gravity waves travelling in different directions at different wavelengths, which is why they do not easily form the even wavefields exhibited by the dunes.

The Northern Lights illuminate a tidal bore

The study suggests that the phenomenon in question is a mesospheric bore, a rare and little-studied type of wave that occurs in the mesosphere. It takes its name from the tidal bore, a wave common to many rivers in which the tide travels up the river channel.

Various types of gravity wave are born in the atmosphere and then rise. In very rare cases, gravity waves can get filtered as they rise between the mesopause and an inversion layer that is intermittently formed below the mesopause. The inversion layer makes the filtered waves bend and enables them to travel long distances through the channel without attenuation.

When the oxygen atoms in the bore collide with the electrons precipitating down upon the atmosphere, they become excited. When releasing this excitation, they create the auroral light. This is why mesospheric bores -- a phenomenon thus far considered a very challenging subject of research -- can occasionally be seen with the naked eye.

Space researchers focus on the atmosphere


Prior to this discovery, mesospheric bores had not been observed in the auroral zone, nor had they been investigated via auroral emissions.

"The auroral zone as a whole is usually discounted in studies focused on the bore, as auroral emissions impair the technique used to identify mesospheric bores," says Palmroth.

Traditionally, researchers specialising in the atmosphere and in space have largely investigated their topics of interest separately from each other. This is because there are only a handful of known mechanisms of interaction between the ionosphere, bathed in precipitating electrons, and the neutral atmosphere.

With the help of measuring devices operated by the Finnish Meteorological Institute, the dunes were found to occur simultaneously and in the same region where the electromagnetic energy originating in space is transferred to the ignorosphere.

Read more at Science Daily

Newest solar telescope produces first images

The Daniel K. Inouye Solar Telescope has produced the highest resolution image of the sun's surface ever taken. In this picture, taken at 789 nanometers (nm), we can see features as small as 30km (18 miles) in size for the first time ever. The image shows a pattern of turbulent, 'boiling' gas that covers the entire sun. The cell-like structures -- each about the size of Texas -- are the signature of violent motions that transport heat from the inside of the sun to its surface. Hot solar material (plasma) rises in the bright centers of 'cells,' cools off and then sinks below the surface in dark lanes in a process known as convection. In these dark lanes we can also see the tiny, bright markers of magnetic fields. Never before seen to this clarity, these bright specks are thought to channel energy up into the outer layers of the solar atmosphere called the corona. These bright spots may be at the core of why the solar corona is more than a million degrees.
Just-released first images from the National Science Foundation's Daniel K. Inouye Solar Telescope reveal unprecedented detail of the sun's surface and preview the world-class products to come from this preeminent 4-meter solar telescope. NSF's Inouye Solar Telescope, on the summit of Haleakala, Maui, in Hawai'i, will enable a new era of solar science and a leap forward in understanding the sun and its impacts on our planet.

Activity on the sun, known as space weather, can affect systems on Earth. Magnetic eruptions on the sun can impact air travel, disrupt satellite communications and bring down power grids, causing long-lasting blackouts and disabling technologies such as GPS.

The first images from NSF's Inouye Solar Telescope show a close-up view of the sun's surface, which can provide important detail for scientists. The images show a pattern of turbulent "boiling" plasma that covers the entire sun. The cell-like structures -- each about the size of Texas -- are the signature of violent motions that transport heat from the inside of the sun to its surface. That hot solar plasma rises in the bright centers of "cells," cools, then sinks below the surface in dark lanes in a process known as convection.

"Since NSF began work on this ground-based telescope, we have eagerly awaited the first images," said France Córdova, NSF director. "We can now share these images and videos, which are the most detailed of our sun to date. NSF's Inouye Solar Telescope will be able to map the magnetic fields within the sun's corona, where solar eruptions occur that can impact life on Earth. This telescope will improve our understanding of what drives space weather and ultimately help forecasters better predict solar storms."

Expanding knowledge

The sun is our nearest star -- a gigantic nuclear reactor that burns about 5 million tons of hydrogen fuel every second. It has been doing so for about 5 billion years and will continue for the other 4.5 billion years of its lifetime. All that energy radiates into space in every direction, and the tiny fraction that hits Earth makes life possible. In the 1950s, scientists figured out that a solar wind blows from the sun to the edges of the solar system. They also concluded for the first time that we live inside the atmosphere of this star. But many of the sun's most vital processes continue to confound scientists.
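
The figure of roughly five million tons per second can be sanity-checked with E = mc²: dividing the sun's luminosity (about 3.8 × 10^26 watts, a standard value assumed here rather than taken from the article) by c² gives the mass converted to energy each second:

```python
# Rough sanity check of the mass the sun converts to energy each second,
# using E = m * c^2. The luminosity is a standard assumed value.
L_SUN = 3.8e26   # solar luminosity in watts (joules per second)
C = 3.0e8        # speed of light in metres per second

mass_per_second_kg = L_SUN / C**2            # kg of mass converted per second
mass_per_second_tons = mass_per_second_kg / 1000.0  # metric tons per second

print(f"{mass_per_second_tons:.2e} tons/s")  # ~4e6, same order as the quoted 5 million
```

The result, around four million metric tons per second, is the same order of magnitude as the article's figure.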

"On Earth, we can predict if it is going to rain pretty much anywhere in the world very accurately, and space weather just isn't there yet," said Matt Mountain, president of the Association of Universities for Research in Astronomy, which manages the Inouye Solar Telescope. "Our predictions lag behind terrestrial weather by 50 years, if not more. What we need is to grasp the underlying physics behind space weather, and this starts at the sun, which is what the Inouye Solar Telescope will study over the next decades."

The motions of the sun's plasma constantly twist and tangle solar magnetic fields. Twisted magnetic fields can lead to solar storms that can negatively affect our technology-dependent modern lifestyles. During 2017's Hurricane Irma, the National Oceanic and Atmospheric Administration reported that a simultaneous space weather event brought down radio communications used by first responders, aviation and maritime channels for eight hours on the day the hurricane made landfall.

Finally resolving these tiny magnetic features is central to what makes the Inouye Solar Telescope unique. It can measure and characterize the sun's magnetic field in more detail than ever seen before and determine the causes of potentially harmful solar activity.

"It's all about the magnetic field," said Thomas Rimmele, director of the Inouye Solar Telescope. "To unravel the sun's biggest mysteries, we have to not only be able to clearly see these tiny structures from 93 million miles away but very precisely measure their magnetic field strength and direction near the surface and trace the field as it extends out into the million-degree corona, the outer atmosphere of the sun."

Better understanding the origins of potential disasters will enable governments and utilities to better prepare for inevitable future space weather events. It is expected that notification of potential impacts could occur earlier -- as much as 48 hours ahead of time instead of the current standard, which is about 48 minutes. This would allow more time to secure power grids and critical infrastructure and to put satellites into safe mode.

The engineering

To achieve the proposed science, this telescope required important new approaches to its construction and engineering. Built by NSF's National Solar Observatory and managed by AURA, the Inouye Solar Telescope combines a 13-foot (4-meter) mirror -- the world's largest for a solar telescope -- with unparalleled viewing conditions at the 10,000-foot Haleakala summit.

Focusing 13 kilowatts of solar power generates enormous amounts of heat -- heat that must be contained or removed. A specialized cooling system provides crucial heat protection for the telescope and its optics. More than seven miles of piping distribute coolant throughout the observatory, partially chilled by ice created on site during the night.

The dome enclosing the telescope is covered by thin cooling plates that stabilize the temperature around the telescope, helped by shutters within the dome that provide shade and air circulation. The "heat-stop" (a high-tech, liquid-cooled, doughnut-shaped metal) blocks most of the sunlight's energy from the main mirror, allowing scientists to study specific regions of the sun with unparalleled clarity.

The telescope also uses state-of-the-art adaptive optics to compensate for blurring created by Earth's atmosphere. The design of the optics ("off-axis" mirror placement) reduces bright, scattered light for better viewing and is complemented by a cutting-edge system to precisely focus the telescope and eliminate distortions created by the Earth's atmosphere. This system is the most advanced solar application to date.

"With the largest aperture of any solar telescope, its unique design, and state-of-the-art instrumentation, the Inouye Solar Telescope -- for the first time -- will be able to perform the most challenging measurements of the sun," Rimmele said. "After more than 20 years of work by a large team devoted to designing and building a premier solar research observatory, we are close to the finish line. I'm extremely excited to be positioned to observe the first sunspots of the new solar cycle just now ramping up with this incredible telescope."

New era of solar astronomy

NSF's new ground-based Inouye Solar Telescope will work with space-based solar observation tools such as NASA's Parker Solar Probe (currently in orbit around the sun) and the European Space Agency/NASA Solar Orbiter (soon to be launched). The three solar observation initiatives will expand the frontiers of solar research and improve scientists' ability to predict space weather.

"It's an exciting time to be a solar physicist," said Valentin Pillet, director of NSF's National Solar Observatory. "The Inouye Solar Telescope will provide remote sensing of the outer layers of the sun and the magnetic processes that occur in them. These processes propagate into the solar system where the Parker Solar Probe and Solar Orbiter missions will measure their consequences. Altogether, they constitute a genuinely multi-messenger undertaking to understand how stars and their planets are magnetically connected."

Read more at Science Daily

Meteorite chunk contains unexpected evidence of presolar grains

Illustration of meteor falling to Earth.
An unusual chunk in a meteorite may contain a surprising bit of space history, based on new research from Washington University in St. Louis.

Presolar grains -- tiny bits of solid interstellar material formed before the sun was born -- are sometimes found in primitive meteorites. But a new analysis reveals evidence of presolar grains in part of a meteorite where they are not expected to be found.

"What is surprising is the fact that presolar grains are present," said Olga Pravdivtseva, research associate professor of physics in Arts & Sciences and lead author of a new paper in Nature Astronomy. "Following our current understanding of solar system formation, presolar grains could not survive in the environment where these inclusions are formed."

Curious Marie is a notable example of an "inclusion," or a chunk within a meteorite, called a calcium-aluminum-rich inclusion (CAI). These objects, some of the first to have condensed in the solar nebula, help cosmochemists define the age of the solar system. This particular chunk of meteorite -- from the collection of the Robert A. Pritzker Center for Meteoritics and Polar Studies at the Chicago Field Museum -- was in the news once before, when scientists from the University of Chicago gave it its name to honor chemist Marie Curie.

For the new work, Pravdivtseva and her co-authors, including Sachiko Amari, research professor of physics at Washington University, used noble gas isotopic signatures to show that presolar silicon carbide (SiC) grains are present in Curious Marie.

That's important because presolar grains are generally thought to be too fragile to have endured the high-temperature conditions that existed near the birth of our sun.

But not all CAIs were formed in quite the same way.

"The fact that SiC is present in refractory inclusions tells us about the environment in the solar nebula at the condensation of the first solid materials," said Pravdivtseva, who is part of Washington University's McDonnell Center for the Space Sciences. "The fact that SiC was not completely destroyed in Curious Marie can help us to understand this environment a little bit better.

"Many refractory inclusions were melted and lost all textural evidence of their condensation. But not all."

Like solving a mystery

Pravdivtseva and her collaborators used two mass spectrometers built in-house at Washington University to make their observations. The university has a long history of noble gas work and is home to one of the best-equipped noble gas laboratories in the world. Still, this work was uniquely challenging.

The researchers had 20 mg of Curious Marie to work with, which is a relatively large sample from a cosmochemistry perspective. They heated it incrementally, measuring the composition of four different noble gases released at each of 17 temperature steps.

"Experimentally, it is an elegant work," Pravdivtseva said. "And then we had a puzzle of noble gas isotopic signatures to untangle. For me, it is like solving a mystery."

Others have looked for evidence of SiC in such calcium-aluminum-rich inclusions in meteorites using noble gases before, but Pravdivtseva's team is the first to find it.

"It was beautiful when all noble gases pointed to the same source of the anomalies -- SiC," she said.

Read more at Science Daily

Jan 29, 2020

Scientists find far higher than expected rate of underwater glacial melting

Tidewater glaciers, the massive rivers of ice that end in the ocean, may be melting underwater much faster than previously thought, according to a Rutgers co-authored study that used robotic kayaks.

The findings, which challenge current frameworks for analyzing ocean-glacier interactions, have implications for the rest of the world's tidewater glaciers, whose rapid retreat is contributing to sea-level rise.

The study, published in the journal Geophysical Research Letters, surveyed the ocean in front of the 20-mile-long LeConte Glacier in Alaska. The seaborne robots made it possible for the first time to analyze plumes of meltwater -- the water released when snow or ice melts -- where glaciers meet the ocean. It is a dangerous area for ships because of ice calving, when slabs of ice break from glaciers, crash into the water and spawn huge waves.

"With the kayaks, we found a surprising signal of melting: Layers of concentrated meltwater intruding into the ocean that reveal the critical importance of a process typically neglected when modeling or estimating melt rates," said lead author Rebecca Jackson, a physical oceanographer and assistant professor in the Department of Marine and Coastal Sciences in the School of Environmental and Biological Sciences at Rutgers University-New Brunswick. Jackson led the study when she was at Oregon State University.

Two kinds of underwater melting occur near glaciers. Where freshwater discharge drains at the base of a glacier (from upstream melt on the glacier's surface), vigorous plumes result in discharge-driven melting. Away from these discharge outlets, the glacier melts directly into the ocean waters in a regime called ambient melting.

The study follows one published last year in the journal Science that measured glacier melt rates by pointing sonar at the LeConte Glacier from a distant ship. The researchers found melt rates far higher than expected but couldn't explain why. The new study found for the first time that ambient melting is a significant part of the underwater mix.

Before these studies, scientists had few direct measurements of melt rates for tidewater glaciers and had to rely on untested theory to get estimates and model ocean-glacier interactions. The studies' results challenge those theories, and this work is a step toward better understanding of submarine melt -- a process that must be better represented in the next generation of global models that evaluate sea-level rise and its impacts.

Read more at Science Daily

Brain networks come 'online' during adolescence to prepare teenagers for adult life

New brain networks come 'online' during adolescence, allowing teenagers to develop more complex adult social skills, but potentially putting them at increased risk of mental illness, according to new research published in the Proceedings of the National Academy of Sciences (PNAS).

Adolescence is a time of major change in life, with increasing social and cognitive skills and independence, but also increased risk of mental illness. While it is clear that these changes in the mind must reflect developmental changes in the brain, it has been unclear how exactly the function of the human brain matures as people grow up from children to young adults.

A team based in the University of Cambridge and University College London has published a major new research study that helps us understand more clearly the development of the adolescent brain.

The study collected functional magnetic resonance imaging (fMRI) data on brain activity from 298 healthy young people, aged 14-25 years, each scanned on one to three occasions about 6 to 12 months apart. In each scanning session, the participants lay quietly in the scanner so that the researchers could analyse the pattern of connections between different brain regions while the brain was in a resting state.

The team discovered that the functional connectivity of the human brain -- in other words, how different regions of the brain 'talk' to each other -- changes in two main ways during adolescence.
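
The article does not describe the team's exact pipeline, but resting-state functional connectivity of this kind is commonly estimated as the pairwise correlation between regional fMRI time series. A minimal illustration with synthetic data:

```python
import numpy as np

# Synthetic resting-state time series for 4 "brain regions" over 200 timepoints.
# Regions 0 and 1 share a common signal, so they should appear "connected".
rng = np.random.default_rng(0)
n_t = 200
shared = rng.standard_normal(n_t)
ts = np.column_stack([
    shared + 0.3 * rng.standard_normal(n_t),  # region 0
    shared + 0.3 * rng.standard_normal(n_t),  # region 1 (coupled to region 0)
    rng.standard_normal(n_t),                 # region 2 (independent)
    rng.standard_normal(n_t),                 # region 3 (independent)
])

# Functional connectivity matrix: Pearson correlation between every pair of regions.
fc = np.corrcoef(ts, rowvar=False)
print(fc.shape)        # (4, 4)
print(fc[0, 1] > 0.7)  # the coupled regions are strongly correlated
```

Tracking how such a matrix changes with age is, in essence, what distinguishes the 'conservative' and 'disruptive' patterns described below: strong entries getting stronger versus weight being redistributed across entries.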

The brain regions that are important for vision, movement, and other basic faculties were strongly connected at the age of 14 and became even more strongly connected by the age of 25. This was called a 'conservative' pattern of change, as areas of the brain that were rich in connections at the start of adolescence become even richer during the transition to adulthood.

However, the brain regions that are important for more advanced social skills, such as being able to imagine how someone else is thinking or feeling (so-called theory of mind), showed a very different pattern of change. In these regions, connections were redistributed over the course of adolescence: connections that were initially weak became stronger, and connections that were initially strong became weaker. This was called a 'disruptive' pattern of change, as areas that were poor in their connections became richer, and areas that were rich became poorer.

By comparing the fMRI results to other data on the brain, the researchers found that the network of regions that showed the disruptive pattern of change during adolescence had high levels of metabolic activity typically associated with active re-modelling of connections between nerve cells.

Dr Petra Vértes, joint senior author of the paper and a Fellow of the mental health research charity MQ, said: "From the results of these brain scans, it appears that the acquisition of new, more adult skills during adolescence depends on the active, disruptive formation of new connections between brain regions, bringing new brain networks 'online' for the first time to deliver advanced social and other skills as people grow older."

Professor Ed Bullmore, joint senior author of the paper and head of the Department of Psychiatry at Cambridge, said: "We know that depression, anxiety and other mental health disorders often occur for the first time in adolescence -- but we don't know why. These results show us that active re-modelling of brain networks is ongoing during the teenage years and deeper understanding of brain development could lead to deeper understanding of the causes of mental illness in young people."

Measuring functional connectivity in the brain presents particular challenges, as Dr František Váša, who led the study as a Gates Cambridge Trust PhD Scholar, and is now at King's College London, explained.

Read more at Science Daily

Scientists discover how malaria parasites import sugar

The consumption of sugar is a fundamental source of fuel in most living organisms. In the malaria parasite Plasmodium falciparum, the uptake of glucose is essential to its life cycle. As in other cells, sugar is transported into the parasite by a transport protein -- a door designed for sugar to pass through the cell membrane. The details of how this door works have now been revealed.

"By elucidating the atomic structure of the sugar-transporting protein PfHT1, we can better understand how glucose is transported into the parasite," says David Drew, Wallenberg Scholar at the Department of Biochemistry and Biophysics at Stockholm University, who led the study.

The main goal of the research is basic understanding of this important biological process, but it has the potential to aid development of new antimalarial drugs. Malaria kills almost half a million people each year, according to the WHO. Blocking this door for sugar has been shown to stop the growth of the malaria parasite.

"It's a long process from a compound with antimalarial activity to a drug that can be taken in the clinic. However, with this knowledge one can improve known antimalarial compounds so that they are more specific to the malarial transporter, so they do not have the side-effect of stopping sugar transport into our own cells. As such, this knowledge increases the likelihood that more specific compounds can be developed into a successful drug," says David Drew.

Despite millions of years of evolution separating parasites and humans, the research shows that glucose is, surprisingly, captured by the sugar-transporting protein in malaria parasites in a manner similar to that of transporters in the human brain.

"This conservation reflects the fundamental importance of sugar uptake -- basically, nature hit on a winning concept and stuck with it," says David Drew.

However, the malaria parasite is more flexible. Other sugars, such as fructose, can also be imported. This flexibility could give a selective advantage to the malaria parasite so that it can survive under conditions when its preferred energy source glucose is unavailable.

Read more at Science Daily

Tiny salamander's huge genome may harbor the secrets of regeneration

Axolotl
The type of salamander called axolotl, with its frilly gills and widely spaced eyes, looks like an alien and has other-worldly powers of regeneration. Lose a limb, part of the heart or even a large portion of its brain? No problem: They grow back.

"It regenerates almost anything after almost any injury that doesn't kill it," said Parker Flowers, postdoctoral associate in the lab of Craig Crews, the John C. Malone Professor of Molecular, Cellular, and Developmental Biology and professor of chemistry and pharmacology.

If scientists can find the genetic basis for the axolotl's ability to regenerate, they might be able to find ways to restore damaged tissue in humans. But they have been thwarted in the attempt by another peculiarity of the axolotl -- it has the largest genome of any animal yet sequenced, 10 times larger than that of humans.

Now Flowers and colleagues have found an ingenious way to circumvent the animal's complex genome to identify at least two genes involved in regeneration, they report Jan. 28 in the journal eLife.

The advent of new sequencing and gene-editing technologies has allowed researchers to craft a list of hundreds of candidate genes that could be responsible for limb regeneration. However, the huge size of the axolotl genome, populated by vast stretches of repetitive DNA, has made it difficult to investigate the function of those genes.

Lucas Sanor, a former graduate student in the lab, and fellow co-first author Flowers used gene editing techniques in a multi-step process to essentially create markers that could track 25 genes suspected of being involved in limb regeneration. The method allowed them to identify two genes in the blastema -- a mass of dividing cells that forms at the site of a severed limb -- that were also responsible for partial regeneration of the axolotl tail.

Flowers stressed that many more such genes probably exist. Since humans possess similar genes, the researchers say, scientists may one day discover how to activate them to help speed wound repair or regenerate tissue.

From Science Daily

Jan 28, 2020

Driven by Earth's orbit, climate changes in Africa may have aided human migration

In 1961, John Kutzbach, then a recent college graduate, was stationed in France as an aviation weather forecaster for the U.S. Air Force. There, he found himself exploring the storied caves of Dordogne, including the prehistoric painted caves at Lascaux.

Thinking about the ancient people and animals who would have gathered in these caves for warmth and shelter, he took up an interest in glaciology. "It was interesting to me, as a weather person, that people would live so close to an ice sheet," says Kutzbach, emeritus University of Wisconsin-Madison professor of atmospheric and oceanic sciences and the Nelson Institute for Environmental Studies.

Kutzbach went on to a career studying how changes in Earth's movements through space -- the shape of its orbit, its tilt on its axis, its wobble -- and other factors, including ice cover and greenhouse gases, affect its climate. Many years after reveling at Ice Age cave art, today he's trying to better understand how changes in Earth's climate may have influenced human migration out of Africa.

In a recent study published in the Proceedings of the National Academy of Sciences, Kutzbach and a team of researchers trace changes in climate and vegetation in Africa, Arabia and the Mediterranean going back 140,000 years to aid others studying the influences underlying human dispersal.

The study describes a dynamic climate and vegetation model that explains when regions across Africa, areas of the Middle East, and the Mediterranean were wetter and drier and how the plant composition changed in tandem, possibly providing migration corridors throughout time.

"We don't really know why people move, but if the presence of more vegetation is helpful, these are the times that would have been advantageous to them," Kutzbach says.

The model also illuminates relationships between Earth's climate and its orbit, greenhouse gas concentrations, and its ice sheets.

For instance, the model shows that around 125,000 years ago, northern Africa and the Arabian Peninsula experienced increased and more northerly-reaching summer monsoon rainfall that led to narrowing of the Saharan and Arabian deserts due to increased grassland. At the same time, in the Mediterranean and the Levant (an area that includes Syria, Lebanon, Jordan, Israel and Palestine), winter storm track rainfall also increased.

These changes were driven by Earth's position relative to the sun. The Northern Hemisphere at the time was as close as possible to the sun during the summer, and as far away as possible during the winter. This resulted in warm, wet summers and cold winters.

"It's like two hands meeting," says Kutzbach. "There were stronger summer rains in the Sahara and stronger winter rains in the Mediterranean."

Given the nature of Earth's orbital movements, collectively called Milankovitch cycles, the region should be positioned this way roughly every 21,000 years. Every 10,000 years or so, the Northern Hemisphere would then be at its furthest point from the sun during the summer, and closest during winter.

Indeed, the model showed large increases in rainfall and vegetation at 125,000, at 105,000, and at 83,000 years ago, with corresponding decreases at 115,000, at 95,000 and at 73,000 years ago, when summer monsoons decreased in magnitude and stayed further south.
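
These dates are roughly what a bare ~21,000-year precession cycle would predict. A toy calculation (not the study's climate model) stepping back from the 125,000-year peak:

```python
# Toy sketch of precession-paced wet/dry timing, assuming an idealised
# 21,000-year cycle. Real Milankovitch forcing combines several periods,
# so this is only an order-of-magnitude illustration.
PRECESSION_KYR = 21  # approximate precession period, in thousands of years

# Wet maxima every full cycle, starting from the 125 ka peak;
# dry minima follow half a cycle (~10 kyr) later.
peaks = [125 - i * PRECESSION_KYR for i in range(3)]
troughs = [p - PRECESSION_KYR // 2 for p in peaks]

print(peaks)    # [125, 104, 83] -- close to the modelled 125k, 105k, 83k
print(troughs)  # [115, 94, 73]  -- close to the modelled 115k, 95k, 73k
```

The near-match with the model's wet and dry extremes is what ties the simulated rainfall swings to orbital forcing rather than to other drivers.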

Between roughly 70,000 and 15,000 years ago, Earth was in a glacial period and the model showed that the presence of ice sheets and reduced greenhouse gases increased winter Mediterranean storms but limited the southern retreat of the summer monsoon. The reduced greenhouse gases also caused cooling near the equator, leading to a drier climate there and reduced forest cover.

These changing regional patterns of climate and vegetation could have created resource gradients for humans living in Africa, driving migration outward to areas with more water and plant life.

For the study, the researchers, including Kutzbach's UW-Madison colleagues Ian Orland and Feng He, along with researchers at Peking University and the University of Arizona, used the Community Climate System Model version 3 from the National Center for Atmospheric Research. They ran simulations that accounted for orbital changes alone, combined orbital and greenhouse gas changes, and a third that combined those influences plus the influence of ice sheets.

It was Kutzbach who, in the 1970s and 1980s, confirmed that changes in Earth's orbit can drive the strength of summer monsoons around the globe by influencing how much sunlight, and therefore, how much warming reaches a given part of the planet.

Forty years ago, there was evidence for periodic strong monsoons in Africa, but no one knew why, Kutzbach says. He showed that orbital changes on Earth could lead to warmer summers and thus stronger monsoons. He also read about periods of "greening" in the Sahara, often used to explain early human migration into the typically arid Middle East.

"My early work prepared me to think about this," he says.

His current modeling work mostly agrees with collected data from each region, including observed evidence from old lake beds, pollen records, cave features, and marine sediments. A recent study led by Orland used cave records in the Levant to show that summer monsoons reached into the region around 125,000 years ago.

Read more at Science Daily

Children to bear the burden of negative health effects from climate change

The grim effects that climate change will have on pediatric health outcomes were the focus of a "Viewpoint" article published in the Journal of Clinical Investigation by Susan E. Pacheco, MD, an expert at The University of Texas Health Science Center at Houston (UTHealth).

Pacheco, an associate professor of pediatrics at McGovern Medical School at UTHealth, along with professors from Johns Hopkins Medicine and the George Washington University, authored a series of articles that detail how increased temperatures due to climate change will negatively affect the health of humanity. In the article authored by Pacheco, she shines a light on the startling effects the crisis has on children's health before they are even born.

Pacheco points to research published by the Intergovernmental Panel on Climate Change, which highlights several ways humans will experience adverse health effects from climate change, such as increased mortality and morbidity due to heat waves and fires, increased risk of food- and water-borne illnesses, and malnutrition due to food scarcity.

These negative experiences bring with them psychological trauma and mental health issues that can affect both children and their caretakers. Pacheco wrote that after Hurricane Maria in 2017, many adults in Puerto Rico experienced post-traumatic stress disorder, depression, and anxiety from living weeks and months without access to necessities such as clean water, electricity, and basic medical care.

"Some were not capable of meeting the physical and emotional demands that such a disaster imposed on their children," Pacheco wrote.

The negative health effects inflicted by the climate crisis can begin while a child is still in utero, due to maternal stress, poor nutrition, exposure to air pollution, and exposure to extreme weather events brought on by climate change. Studies of women who experienced major flooding events while pregnant reported an association with outcomes such as preterm birth and low birth weights. Pacheco wrote that pregnant women exposed to climate change experience stress, respiratory disease, poor nutrition, increased infections, heat-associated illnesses, and poverty.

"We will continue to see an increase in heat-associated conditions in children, such as asthma, Lyme disease, as well as an increase in congenital heart defects," Pacheco said.

Pacheco wrote that the picture painted by research on climate change is daunting and now is not the time for indifference. In the article's conclusion, she wrote that everyone in the medical community must reflect on a personal level about what can be done with the knowledge they have on climate change and its negative health effects.

Read more at Science Daily

An egg a day not tied to risk of heart disease

The controversy about whether eggs are good or bad for your heart health may be solved, and about one a day is fine.

A team of researchers from the Population Health Research Institute (PHRI) of McMaster University and Hamilton Health Sciences found the answer by analyzing data from three large, long-term multinational studies.

The results suggest there is no harm from consuming eggs. Given that the majority of individuals in the study consumed one or fewer eggs per day, it would be safe to consume this level, says Mahshid Dehghan, first author and a PHRI investigator.

"Moderate egg intake, which is about one egg per day in most people, does not increase the risk of cardiovascular disease or mortality even if people have a history of cardiovascular disease or diabetes," she said.

"Also, no association was found between egg intake and blood cholesterol, its components or other risk factors. These results are robust and widely applicable to both healthy individuals and those with vascular disease."

The details are published in The American Journal of Clinical Nutrition.

Although eggs are an inexpensive source of essential nutrients, some guidelines have recommended limiting consumption to fewer than three eggs a week due to concerns they increase the risk of cardiovascular disease.

Previous studies on egg consumption and diseases have been contradictory, said Salim Yusuf, principal investigator of the study and director of PHRI.

"This is because most of these studies were relatively small or moderate in size and did not include individuals from a large number of countries," he said.

The researchers analyzed three international studies conducted by the PHRI. Egg consumption was recorded for 146,011 individuals from 21 countries in the PURE study and for 31,544 patients with vascular disease in the ONTARGET and TRANSCEND studies.

The data from these three studies involved populations from 50 countries spanning six continents at different income levels, so the results are widely applicable, said Yusuf.

From Science Daily

Lab turns trash into valuable graphene in a flash

That banana peel, turned into graphene, could help massively reduce the environmental impact of concrete and other building materials. While you're at it, toss in those plastic empties.

A new process introduced by the Rice University lab of chemist James Tour can turn bulk quantities of just about any carbon source into valuable graphene flakes. The process is quick and cheap; Tour said the "flash graphene" technique can convert a ton of coal, food waste or plastic into graphene for a fraction of the cost used by other bulk graphene-producing methods.

"This is a big deal," Tour said. "The world throws out 30% to 40% of all food, because it goes bad, and plastic waste is of worldwide concern. We've already proven that any solid carbon-based matter, including mixed plastic waste and rubber tires, can be turned into graphene."

As reported in Nature, flash graphene is made in 10 milliseconds by heating carbon-containing materials to 3,000 Kelvin (about 5,000 degrees Fahrenheit). The source material can be nearly anything with carbon content. Food waste, plastic waste, petroleum coke, coal, wood clippings and biochar are prime candidates, Tour said. "With the present commercial price of graphene being $67,000 to $200,000 per ton, the prospects for this process look superb," he said.
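The quoted temperatures can be cross-checked with a simple unit conversion. This is a quick sanity check on the article's numbers, not something from the Nature paper itself.

```python
def kelvin_to_fahrenheit(k):
    """Convert a temperature in kelvins to degrees Fahrenheit."""
    return (k - 273.15) * 9 / 5 + 32

# 3,000 K works out to about 4,940 degrees F -- roughly the
# "about 5,000 degrees Fahrenheit" stated in the article.
assert abs(kelvin_to_fahrenheit(3000) - 4940.33) < 0.01
```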

Tour said a concentration of as little as 0.1% of flash graphene in the cement used to bind concrete could lessen its massive environmental impact by a third. Production of cement reportedly emits as much as 8% of human-made carbon dioxide every year.
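Taken together, the two figures in this paragraph imply a rough upper bound on the global effect. This is back-of-envelope arithmetic based only on the numbers quoted above, not a claim from the researchers.

```python
cement_share_of_co2 = 0.08  # cement: up to ~8% of human-made CO2, per the article
impact_reduction = 1 / 3    # graphene-doped cement: ~one-third less impact, per Tour

# If all cement were doped this way, the implied global CO2 saving
# would be on the order of 2.7% of human-made emissions.
global_saving = cement_share_of_co2 * impact_reduction
assert abs(global_saving - 0.0267) < 0.001
```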

"By strengthening concrete with graphene, we could use less concrete for building, and it would cost less to manufacture and less to transport," he said. "Essentially, we're trapping greenhouse gases like carbon dioxide and methane that waste food would have emitted in landfills. We are converting those carbons into graphene and adding that graphene to concrete, thereby lowering the amount of carbon dioxide generated in concrete manufacture. It's a win-win environmental scenario using graphene."

"Turning trash to treasure is key to the circular economy," said co-corresponding author Rouzbeh Shahsavari, an adjunct assistant professor of civil and environmental engineering and of materials science and nanoengineering at Rice and president of C-Crete Technologies. "Here, graphene acts both as a 2D template and a reinforcing agent that controls cement hydration and subsequent strength development."

In the past, Tour said, "graphene has been too expensive to use in these applications. The flash process will greatly lessen the price while it helps us better manage waste."

"With our method, that carbon becomes fixed," he said. "It will not enter the air again."

The process aligns nicely with Rice's recently announced Carbon Hub initiative to create a zero-emissions future that repurposes hydrocarbons from oil and gas to generate hydrogen gas and solid carbon with zero emission of carbon dioxide. The flash graphene process can convert that solid carbon into graphene for concrete, asphalt, buildings, cars, clothing and more, Tour said.

Flash Joule heating for bulk graphene, developed in the Tour lab by Rice graduate student and lead author Duy Luong, improves upon techniques like exfoliation from graphite and chemical vapor deposition on a metal foil that require much more effort and cost to produce just a little graphene.

Even better, the process produces "turbostratic" graphene, with misaligned layers that are easy to separate. "A-B stacked graphene from other processes, like exfoliation of graphite, is very hard to pull apart," Tour said. "The layers adhere strongly together.

"But turbostratic graphene is much easier to work with because the adhesion between layers is much lower. They just come apart in solution or upon blending in composites.

"That's important, because now we can get each of these single-atomic layers to interact with a host composite," he said.

The lab noted that used coffee grounds were transformed into pristine single-layer sheets of graphene.

Bulk composites of graphene with plastic, metals, plywood, concrete and other building materials would be a major market for flash graphene, according to the researchers, who are already testing graphene-enhanced concrete and plastic.

The flash process happens in a custom-designed reactor that heats material quickly and emits all noncarbon elements as gas. "When this process is industrialized, elements like oxygen and nitrogen that exit the flash reactor can all be trapped as small molecules because they have value," Tour said.

He said the flash process produces very little excess heat, channeling almost all of its energy into the target. "You can put your finger right on the container a few seconds afterwards," Tour said. "And keep in mind this is almost three times hotter than the chemical vapor deposition furnaces we formerly used to make graphene, but in the flash process the heat is concentrated in the carbon material and none in a surrounding reactor.

"All the excess energy comes out as light, in a very bright flash, and because there aren't any solvents, it's a super clean process," he said.

Luong did not expect to find graphene when he fired up the first small-scale device to find new phases of material, beginning with a sample of carbon black. "This started when I took a look at a Science paper talking about flash Joule heating to make phase-changing nanoparticles of metals," he said. But Luong quickly realized the process produced nothing but high-quality graphene.

Atom-level simulations by Rice researcher and co-author Ksenia Bets confirmed that temperature is key to the material's rapid formation. "We essentially speed up the slow geological process by which carbon evolves into its ground state, graphite," she said. "Greatly accelerated by a heat spike, it is also stopped at the right instant, at the graphene stage."

Read more at Science Daily

Parkinson's disease may start before birth

People who develop Parkinson's disease before age 50 may have been born with disordered brain cells that went undetected for decades, according to new Cedars-Sinai research. The research points to a drug that potentially might help correct these disease processes.

Parkinson's occurs when brain neurons that make dopamine, a substance that helps coordinate muscle movement, become impaired or die. Symptoms, which get worse over time, include slowness of movement, rigid muscles, tremors and loss of balance. In most cases, the exact cause of neuron failure is unclear, and there is no known cure.

At least 500,000 people in the U.S. are diagnosed with Parkinson's each year, and the incidence is rising. Although most patients are 60 or older when they are diagnosed, about 10% are between 21 and 50 years old. The new study, published in the journal Nature Medicine, focuses on these young-onset patients.

"Young-onset Parkinson's is especially heartbreaking because it strikes people at the prime of life," said Michele Tagliati, MD, director of the Movement Disorders Program, vice chair and professor in the Department of Neurology at Cedars-Sinai. "This exciting new research provides hope that one day we may be able to detect and take early action to prevent this disease in at-risk individuals." Tagliati was a co-author of the study.

To perform the study, the research team generated special stem cells, known as induced pluripotent stem cells (iPSCs), from cells of patients with young-onset Parkinson's disease. This process involves taking adult blood cells "back in time" to a primitive embryonic state. These iPSCs can then produce any cell type of the human body, all genetically identical to the patient's own cells. The team used the iPSCs to produce dopamine neurons from each patient and then cultured them in a dish and analyzed the neurons' functions.

"Our technique gave us a window back in time to see how well the dopamine neurons might have functioned from the very start of a patient's life," said Clive Svendsen, PhD, director of the Cedars-Sinai Board of Governors Regenerative Medicine Institute and professor of Biomedical Sciences and Medicine at Cedars-Sinai. He was the study's senior author.

The researchers detected two key abnormalities in the dopamine neurons in the dish:

  • Accumulation of a protein called alpha-synuclein, which occurs in most forms of Parkinson's disease.
  • Malfunctioning lysosomes, cell structures that act as "trash cans" for the cell to break down and dispose of proteins. This malfunction could cause alpha-synuclein to build up.

"What we are seeing using this new model are the very first signs of young-onset Parkinson's," said Svendsen. "It appears that dopamine neurons in these individuals may continue to mishandle alpha-synuclein over a period of 20 or 30 years, causing Parkinson's symptoms to emerge."

The investigators also used their iPSC model to test a number of drugs that might reverse the abnormalities they had observed. They found that one drug, PEP005, which is already approved by the Food and Drug Administration for treating precancers of the skin, reduced the elevated levels of alpha-synuclein in both the dopamine neurons in the dish and in laboratory mice.

The drug also countered another abnormality they found in the patients' dopamine neurons -- elevated levels of an active version of an enzyme called protein kinase C -- although the role of this enzyme version in Parkinson's is not clear.

For the next steps, Tagliati said the team plans to investigate how PEP005, currently available in gel form, might be delivered to the brain to potentially treat or prevent young-onset Parkinson's. The team also plans more research to determine whether the abnormalities the study found in neurons of young-onset Parkinson's patients also exist in other forms of Parkinson's.

"This research is an outstanding example of how physicians and investigators from different disciplines join forces to produce translational science with the potential to help patients," said Shlomo Melmed, MB, ChB, executive vice president of Academic Affairs and dean of the Medical Faculty at Cedars-Sinai. "This important work is made possible by the dual leadership of Cedars-Sinai as both a distinguished academic institution and an outstanding hospital."

Read more at Science Daily

Jan 27, 2020

Astronomers detect large amounts of oxygen in ancient star's atmosphere

An international team of astronomers from the University of California San Diego, the Instituto de Astrofísica de Canarias (IAC), and the University of Cambridge have detected large amounts of oxygen in the atmosphere of one of the oldest and most elementally depleted stars known -- a "primitive star" scientists call J0815+4729.

This new finding, which was made using W. M. Keck Observatory on Maunakea in Hawaii to analyze the chemical makeup of the ancient star, provides an important clue on how oxygen and other important elements were produced in the first generations of stars in the universe.

The results are published in the January 21, 2020 edition of The Astrophysical Journal Letters.

"This result is very exciting. It tells us about some of the earliest times in the universe by using stars in our cosmic back yard," said Keck Observatory Chief Scientist John O'Meara. "I look forward to seeing more measurements like this one so we can better understand the earliest seeding of oxygen and other elements throughout the young universe."

Oxygen is the third most abundant element in the universe after hydrogen and helium, and is essential for all forms of life on Earth, as the chemical basis of respiration and a building block of carbohydrates. It is also the main elemental component of the Earth's crust. However, oxygen didn't exist in the early universe; it is created through nuclear fusion reactions that occur deep inside the most massive stars, those with masses roughly 10 times the mass of the Sun or greater.

Tracing the early production of oxygen and other elements requires studying the oldest stars still in existence. J0815+4729 is one such star; it resides over 5,000 light years away toward the constellation Lynx.

"Stars like J0815+4729 are referred to as halo stars," explained UC San Diego astrophysicist Adam Burgasser, a co-author of the study. "This is due to their roughly spherical distribution around the Milky Way, as opposed to the more familiar flat disk of younger stars that include the Sun."

Halo stars like J0815+4729 are truly ancient stars, allowing astronomers a peek into element production early in the history of the universe.

The research team observed J0815+4729 using Keck Observatory's High-Resolution Echelle Spectrometer (HIRES) on the 10m Keck I telescope. The data, which required more than five hours of staring at the star over a single night, were used to measure the abundances of 16 chemical species in the star's atmosphere, including oxygen.

"The primitive composition of the star indicates that it was formed during the first hundreds of millions of years after the Big Bang, possibly from the material expelled from the first supernovae of the Milky Way," said Jonay González Hernández, Ramón y Cajal postdoctoral researcher and lead author of the study.

Keck Observatory's HIRES data of the star revealed a very unusual chemical composition. While it has relatively large amounts of carbon, nitrogen, and oxygen -- approximately 10, 8, and 3 percent of the abundances measured in the Sun -- other elements like calcium and iron have abundances around one millionth that of the Sun.
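Astronomers usually express such contrasts as logarithmic abundance ratios relative to the Sun, written [X/Fe]. The sketch below uses only the rounded fractions quoted in this article, so the published values will differ in detail.

```python
import math

# Abundances relative to solar, from the figures quoted above.
relative_to_sun = {"C": 0.10, "N": 0.08, "O": 0.03, "Fe": 1e-6}

def x_over_fe(element):
    """[X/Fe]: log10 of the star's X/Fe ratio relative to the solar X/Fe ratio."""
    return math.log10(relative_to_sun[element] / relative_to_sun["Fe"])

# Carbon is enhanced relative to iron by about five orders of magnitude,
# which is why the star's CNO-to-iron contrast is called enormous.
assert abs(x_over_fe("C") - 5.0) < 1e-9
```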

"Only a few such stars are known in the halo of our galaxy, but none have such an enormous amount of carbon, nitrogen, and oxygen compared to their iron content," said David Aguado, a postdoctoral researcher at the University of Cambridge and co-author of the study.

The search for stars of this type involves dedicated projects that sift through hundreds of thousands of stellar spectra to uncover a few rare sources like J0815+4729, then follow-up observations to measure their chemical composition. This star was first identified in data obtained with the Sloan Digital Sky Survey (SDSS), then characterized by the IAC team in 2017 using the Gran Telescopio Canarias in La Palma, Spain.

Read more at Science Daily

New insights about the brightest explosions in the Universe

Swedish and Japanese researchers have, after ten years, found an explanation for the peculiar emission lines seen in one of the brightest supernovae ever observed -- SN 2006gy. At the same time, they found an explanation for how the supernova arose.

Superluminous supernovae are the most luminous explosions in the cosmos. SN 2006gy is one of the most studied such events, but researchers have been uncertain about its origin. Astrophysicists at Stockholm University have, together with Japanese colleagues, now discovered large amounts of iron in the supernova through spectral lines that have never previously been seen either in supernovae or in other astrophysical objects. That has led to a new explanation for how the supernova arose.

"No one had tried comparing spectra from neutral iron, i.e. iron with all its electrons retained, with the unidentified emission lines in SN 2006gy, because iron is normally ionized (one or more electrons removed). We tried it and saw with excitement how line after line lined up just as in the observed spectrum," says Anders Jerkstrand, Department of Astronomy, Stockholm University.

"It became even more exciting when it quickly turned out that very large amounts of iron were needed to make the lines -- at least a third of the Sun's mass -- which directly ruled out some old scenarios and instead revealed a new one."

The progenitor of SN 2006gy was, according to the new model, a double star consisting of a white dwarf the size of the Earth and a hydrogen-rich massive star as large as our solar system in close orbit. As the hydrogen-rich star expanded its envelope, which happens when new fuel is ignited in the late stages of evolution, the white dwarf was caught in the envelope and spiralled in towards the centre of its companion. When it reached the centre, the unstable white dwarf exploded and a so-called Type Ia supernova was born. This supernova then collided with the envelope that had been flung out during the inspiral, and this gigantic collision gave rise to the light of SN 2006gy.

"That a Type Ia supernova appears to be behind SN 2006gy turns upside down what most researchers have believed," says Anders Jerkstrand.

"That a white dwarf can be in close orbit with a massive hydrogen-rich star, and quickly explode upon falling to the centre, gives important new information for the theory of double star evolution and the conditions necessary for a white dwarf to explode."

Fact: Superluminous supernovae

Superluminous supernovae are the brightest explosions in the Universe. Over a few months they radiate as much energy as the Sun does over its whole lifetime and reach a peak brightness as high as that of an entire galaxy. The origin of this energy, and what kind of star system has exploded, are still unclear and debated.
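The energy comparison can be put in rough numbers. This back-of-envelope sketch uses standard values for the Sun's luminosity and main-sequence lifetime, which are not given in the article.

```python
L_SUN = 3.8e26               # solar luminosity in watts (standard value)
LIFETIME_S = 1e10 * 3.15e7   # ~10 billion years expressed in seconds

# Total energy the Sun radiates over its lifetime: on the order of 10^44 J.
# A superluminous supernova emits a comparable amount in just a few months.
E_sun_lifetime = L_SUN * LIFETIME_S
assert 1e44 < E_sun_lifetime < 2e44
```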

Read more at Science Daily

Horror movies manipulate brain activity expertly to enhance excitement

A Finnish research team has mapped neural activity in response to watching horror movies. The study, conducted by the University of Turku, identified the top horror movies of the past 100 years and examined how they manipulate brain activity.

Humans are fascinated by what scares us, be it sky-diving, roller-coasters, or true-crime documentaries -- provided these threats are kept at a safe distance. Horror movies are no different.

Whilst all movies have our heroes face some kind of threat to their safety or happiness, horror movies up the ante by having some kind of superhuman or supernatural threat that cannot be reasoned with or fought easily.

The research team at the University of Turku, Finland, studied why we are drawn to such things as entertainment. The researchers first established the 100 best and scariest horror movies of the past century (Table 1) and how they made people feel.

Unseen Threats Are Most Scary

Firstly, 72% of people report watching at least one horror movie every six months, and the reasons for doing so, besides the feelings of fear and anxiety, were primarily excitement. Watching horror movies was also an excuse to socialise, with many people preferring to watch horror movies with others rather than on their own.

People found horror that was psychological in nature and based on real events the scariest, and were far more scared by things that were unseen or implied rather than what they could actually see.

"This latter distinction reflects two types of fear that people experience: the creeping, foreboding dread that occurs when one feels that something isn't quite right, and the instinctive response we have to the sudden appearance of a monster that makes us jump out of our skin," says principal investigator Professor Lauri Nummenmaa from Turku PET Centre.

MRI Reveals How Brain Reacts to Different Forms of Fear

Researchers wanted to know how the brain copes with fear in response to this complicated and ever-changing environment. The group had people watch a horror movie whilst measuring neural activity in a magnetic resonance imaging scanner.

During those times when anxiety is slowly increasing, regions of the brain involved in visual and auditory perception become more active, as the need to attend to cues of threat in the environment becomes more important. After a sudden shock, brain activity is more evident in regions involved in emotion processing, threat evaluation, and decision making, enabling a rapid response.

However, these regions are in continuous talk-back with sensory regions throughout the movie, as if the sensory regions were preparing response networks as a scary event was becoming increasingly likely.

Read more at Science Daily

The sexes have equal spatial cognition skills

Men are not better than women at spatial cognition, such as map reading. That is the principal finding from ground-breaking work by researchers at Lero, the Science Foundation Ireland Research Centre for Software, hosted at University of Limerick (UL), Ireland.

Employing cutting-edge eye-tracking technology, researchers Dr Mark Campbell and Dr Adam Toth of the Lero Esports Science Research Lab at UL found that there is no male advantage in mental rotation abilities associated with spatial cognition competences.

Dr Campbell said the skill of spatial cognition, or our ability to navigate our environment, has been the battleground for almost 40 years for researchers claiming that males have a distinct performance advantage on tests of spatial cognition, notably the mental rotations test.

Studying the cognitive proficiency of individuals and gamers is a key aim of the Lero Esports Science Research Lab which opened in 2019 and is the first of its kind in Ireland.

"Better performance on these tests is strongly associated with higher IQ and better performance in STEM (Science Technology Engineering and Maths) subjects in schools and colleges," Dr Campbell explained.

Dr Toth sums up the results: "So males are better than females? Well no, actually. Our study found that there is no male advantage in mental rotation abilities. By lengthening the time allowed to complete the test, the male performance advantage diminished entirely, suggesting that the so-called sex difference in mental rotation is simply not there or may be explained by other factors."

The research published in Nature Scientific Reports also found for the first time that both males and females frequently employed different gaze strategies during the cognitive tests to get to the correct answer. In other words, men and women approach the task in a different way to get the same result.

The research paper is entitled: "Investigating sex differences, cognitive effort, strategy, and performance on a computerised version of the mental rotations test via eye-tracking."

Read more at Science Daily