Dec 28, 2019

New insights into the earliest events of seed germination

Plant seeds may strike the casual observer as unspectacular -- but they have properties that are nothing short of superpowers. In a dry state they can store their energy for years and then suddenly release it for germination when environmental conditions become favourable. One striking example is the "super bloom" in Death Valley National Park, when seeds that have endured the dry, hot desert for decades suddenly germinate after rainfall, followed several months later by a rare and spectacular desert bloom. Seeds preserve a fully formed embryo, which resumes growing only when conditions are right for it to do so. That may be years -- or in more extreme cases even centuries -- later.

Seed germination is controlled by several plant hormones, which have been studied intensively. Far less was known, however, about the processes that must take place for those hormones to function. How is the energy stored in the seed made available? How is energy metabolism started early and efficiently? An international team of researchers has now investigated these questions.

Using a new type of fluorescent biosensor, the researchers observed, in living seed cells, both energy metabolism and the so-called redox metabolism, which relies on sulphur. They discovered that when the seeds came into contact with water, energy metabolism was established in a matter of minutes, and the plant cells' "power stations" -- known as mitochondria -- activated their respiration. The researchers also found out which molecular switches are activated to enable energy to be released efficiently -- with the so-called thiol-redox switches playing a central role.

"By looking into the very early processes of germination control, we can gain a better understanding of the mechanisms driving seed germination," says Prof. Markus Schwarzländer from the University of Münster (Germany), who led the study. "In future we could think about how such switches could be used in crop biotechnology." The results of the study could be of relevance in farming, when seeds need to keep their germination vigour for as long as possible on the one hand, but should also germinate in synch and with minimal losses on the other hand. The study has been published in the journal PNAS (Proceedings of the National Academy of Sciences).

Background and method

In order to observe the activities taking place in energy metabolism, the researchers visualized under the microscope adenosine triphosphate (ATP), the cell's general energy currency, and nicotinamide adenine dinucleotide phosphate (NADPH), its principal electron carrier, in the mitochondria. They compared seeds from thale cress: both dry seeds and seeds "imbibed" with water.

To find out whether the redox switches are important for kick-starting germination, the researchers deactivated specific proteins using genetic methods and then compared the reaction shown by the modified seeds with that of the unmodified ones. The researchers allowed the seeds to age artificially in the laboratory, and they saw that the seeds germinated much less actively if they lacked the relevant proteins.

The researchers' next step was a so-called redox proteome analysis, i.e. they examined the relevant redox proteins in their entirety using biochemical methods. For this purpose, they isolated active mitochondria and flash-froze them in order to study this state directly where the process takes place. The researchers then used mass spectrometry to identify several so-called cysteine peptides that are important for resource efficiency in energy metabolism.

Read more at Science Daily

For restricted eaters, a place at the table but not the meal

Holiday celebrations often revolve around eating, but for those with food restrictions, that can produce an incongruous feeling when dining with friends and loved ones: loneliness.

People with restricted diets -- due to allergies, health issues or religious or cultural norms -- are more likely to feel lonely when they can't share in what others are eating, new Cornell University research shows.

"Despite being physically present with others, having a food restriction leaves people feeling left out because they are not able to take part in bonding over the meal," said Kaitlin Woolley, assistant professor of marketing in the Samuel Curtis Johnson Graduate School of Management and lead author of the research.

Across seven studies and controlled experiments, researchers found that food restrictions predicted loneliness among both children and adults.

The research also offers the first evidence, Woolley said, that having a food restriction causes increased loneliness. For example, in one experiment, assigning unrestricted individuals to experience a food restriction increased reported feelings of loneliness. That suggests such feelings are not driven by non-food issues or limited to picky eaters, Woolley said.

"We can strip that away and show that assigning someone to a restriction or not can have implications for their feeling of inclusion in the group meal," she said.

Further evidence came from a survey of observers of the Jewish holiday of Passover. When reminded during the holiday of the leavened foods they couldn't enjoy with others, participants' loneliness increased. Yet, within their own similarly restricted group, they felt a stronger bond.

Bonding over meals is an inherently social experience, Woolley notes. In previous research, she found that strangers felt more connected and trusting of each other when they shared the same food, and eating food from the same plate increased cooperation between strangers.

But when restricted from sharing in the meal, people suffer "food worries," Woolley said. They fret about what they can eat and how others might judge them for not fitting in.

Those worries generated a degree of loneliness comparable to that reported by unmarried or low-income adults, and stronger than that experienced by schoolchildren who were not native English speakers, according to the research. Compared with non-restricted individuals, having a restriction increased reported loneliness by 19%. People felt lonelier regardless of how severe their restriction was, or whether their restriction was imposed or voluntary.

The study concluded that food restrictions and loneliness are on the rise and "may be related epidemics," warranting further research.

To date, Woolley said, children have been the primary focus of research on the effects of food restrictions. A nationally representative survey she analyzed from the Centers for Disease Control and Prevention did not track the issue among adults.

But increasingly, she said, food restrictions are being carried into adulthood, or adults are choosing restricted diets such as gluten-free, vegetarian and vegan for health or ethical reasons. Up to 30% of all participants in her research deal with restrictions, Woolley said.

Read more at Science Daily

Dec 27, 2019

300-million-year-old atmospheric dust

Dust plays a crucial role in the life and health of our planet. In our modern world, dust-borne nutrients traveling in great dust storms from the Saharan Desert fertilize the soil in the Amazon Rainforest and feed photosynthetic organisms like algae in the Atlantic Ocean. In turn, it is those organisms that breathe in carbon dioxide and expel oxygen.

Mehrdad Sardar Abadi, a researcher in the Mewbourne College of Earth and Energy School of Geosciences, and school director Lynn Soreghan led a study with researchers from Florida State University, the Massachusetts Institute of Technology, Hampton University and the College of Charleston to understand the role of dust in the Earth's atmosphere in deep time -- 300 million years ago.

To do this research, the team needed to find ancient atmospheric dust, which led them to the remnants of a shallow marine ecosystem in modern-day Iran.

Similar to areas of our modern world like the Bahamas, these shallow marine ecosystems cannot survive unless they are in pristine water away from river runoff, Sardar Abadi explained. By targeting the systems, Sardar Abadi and Soreghan knew that silicate particles they found would have been deposited through the air and not from a river.

Sardar Abadi and Soreghan identified and sampled dust trapped in carbonate rocks from two intervals of limestone now preserved in outcroppings in the mountains of northern and central Iran.

The rocks were then subjected to a series of chemical treatments to extract the ancient dust. What remained were silicate minerals such as clay and quartz that had entered the environment as airborne particles -- 300-million-year-old dust.

With the ancient dust in hand, Sardar Abadi could determine how much dust was present in the Late Paleozoic atmosphere. The results suggested that Earth's atmosphere was much dustier during this ancient time. Working with collaborators at Florida State University, he performed geochemical tests to analyze the iron in the samples. Those tests revealed that the ancient dust contained remarkable proportions of highly reactive iron -- a particularly rich source of this key micronutrient.

While iron is not the only micronutrient potentially carried in dust, it is estimated that this ancient dust contained twice the bioavailable iron as the modern dust that fertilizes the Amazon Rainforest.

This potent dust fertilization led to a massive surge in marine photosynthesizers. Fueled by iron-rich dust, algae and cyanobacteria took in carbon dioxide and expelled oxygen. Researchers speculate that this action, operating over millions of years, changed the planet's atmosphere.

"Higher abundances in primary producers like plants and algae could lead to higher carbon capture, helping to explain declines in atmospheric carbon dioxide around 300 million years ago," said Sardar Abadi.

"If what we are seeing from our samples was happening on a global scale, it means that the dust fertilization effect brought down atmospheric carbon dioxide and was a fairly significant part of the carbon cycle during this time in the Earth's history," said Soreghan.

One proposed carbon sequestration method is to add bioavailable iron to isolated parts of the ocean that are so remote and so far from dust-bearing continents that they are essentially biological deserts. Scientists who have attempted this on a small scale have documented the resulting phytoplankton blooms.

But, Soreghan warned, no one knows the unintended consequences of doing this on a large scale. This is why Sardar Abadi and the team of researchers delved into deep time for answers.

"The Earth's geologic record is like a laboratory book. It has run an infinite number of experiments. We can open Earth's lab book, reconstruct what happened in the past and see how the Earth responded to these sometimes very extreme states," said Soreghan.

Data and syntheses like these help constrain and refine computer climate models. The further back into deep time a modeler goes, the more unconstrained variables there are; supplying measured data makes the models more accurate.

Read more at Science Daily

Development of ultrathin durable membrane for efficient oil and water separation

Researchers led by Professor MATSUYAMA Hideto and Professor YOSHIOKA Tomohisa at Kobe University's Research Center for Membrane and Film Technology have succeeded in developing an ultrathin membrane with a fouling-resistant silica surface treatment for high performance separation of oil from water.

Furthermore, this membrane was shown to be versatile; it was able to separate water from a wide variety of different oily substances.

These results were published online in the Journal of Materials Chemistry A on October 3, 2019.

Introduction

The development of technology to separate oil from water is crucial for dealing with oil spills and water pollution generated by various industries. By 2025, it is predicted that two thirds of the world's population won't have sufficient access to clean water. Therefore the development of technologies to filter oily emulsions and thus increase the amount of available clean water is gaining increasing attention.

Compared with traditional purification methods such as centrifugation and chemical coagulation, membrane separation has been proposed as a low-cost, energy-efficient alternative. Although this technology has developed greatly, most membranes suffer from fouling, whereby droplets of oil become irreversibly adsorbed onto the surface. This blocks the membrane's pores, reducing its lifespan and efficiency.

One method of mitigating the fouling issues is to add surface treatments to the membrane. However, many experiments with this method have encountered problems such as changes in the original surface structure and the deterioration of the treated surface layer by strong acid, alkaline and salt solutions. These issues limit the practical applications of such membranes in the harsh conditions during wastewater treatment.

Research Methodology

In this study, the researchers succeeded in developing a membrane consisting of a porous polyketone (PK) support with a 10-nanometer-thick silica layer applied to the top surface. This silica layer was formed on the PK fibrils through electrostatic attraction -- the negatively charged silica was attracted to the positively charged PK.

The PK membrane has a high water permeance thanks to its large pores and high porosity. The silicification process -- the addition of silica to the PK fibrils -- provides a strong oil-repellent coating that protects the surface-modified membrane from fouling.

Another advantage of this membrane is that it requires no large applied pressure to achieve high water throughput. The membrane exhibited water permeation under gravity alone -- even with a water level as low as 10 cm (a pressure of approx. 0.01 atm). In addition, the developed membrane was able to reject 99.9% of oil droplets -- including those as small as 10 nanometers. With a membrane area of 1 m², 6,000 liters of wastewater can be treated in one hour under an applied pressure of 1 atm. It was also shown to be effective at separating water from a variety of other oily emulsions.
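Those throughput figures imply a water permeance that can be checked with simple arithmetic. Below is a minimal Python sketch; the flux, area and pressure values are the ones quoted above, while the assumption that flux scales linearly with applied pressure is ours, not the paper's:

    # Rough sanity check of the reported membrane throughput.
    flux_l_per_h = 6000.0    # litres per hour (quoted above)
    area_m2 = 1.0            # membrane area in m^2 (quoted above)
    pressure_atm = 1.0       # applied pressure in atm (quoted above)

    # Permeance = flux / (area * pressure), in L m^-2 h^-1 atm^-1
    permeance = flux_l_per_h / (area_m2 * pressure_atm)
    print(f"Permeance: {permeance:.0f} L m^-2 h^-1 atm^-1")

    # A 10 cm water column is roughly 0.01 atm; if flux scales
    # linearly with pressure, gravity-driven operation would give:
    gravity_flux = permeance * 0.01 * area_m2
    print(f"Gravity-driven flux: ~{gravity_flux:.0f} L/h per m^2")

Under that linear assumption, the 10 cm water column mentioned above would still drive on the order of 60 litres per hour through each square metre of membrane.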

As mentioned, the silicification provided a strong oil-repellent coating. Experiments carried out to test the membrane's durability against fouling showed that oil did not become adsorbed onto the surface and that oil droplets could easily be cleaned off. The membrane also showed great tolerance against a variety of acidic, alkaline, solvent and salt solutions.

Read more at Science Daily

California's stricter vaccine exemption policy and improved vaccination rates

California's elimination, in 2016, of non-medical vaccine exemptions from school entry requirements was associated with an estimated increase in vaccination coverage at state and county levels, according to a new study published this week in PLOS Medicine by Nathan Lo of the University of California, San Francisco, and colleagues.

Vaccine hesitancy, the reluctance or refusal to receive vaccinations, is a growing public health problem in the United States and globally. The effectiveness of state policies that eliminate non-medical exemptions to childhood vaccination requirements has been unclear. In the new study, researchers used publicly available data from the US Centers for Disease Control and Prevention on coverage of measles, mumps, and rubella (MMR) vaccination and rates of both non-medical and medical exemptions in children entering kindergarten. The dataset included information on 45 states from 2011 to 2017 and county-level data from 17 states spanning 2010 through 2017.

The results of the analysis suggest that after the 2016 implementation of California's new exemption policy, MMR coverage in California increased by 3.3% over the level projected in the absence of the policy. Non-medical vaccination exemptions decreased by 2.4% and medical exemptions increased by 0.4%. The change in MMR vaccination coverage across California counties from 2015 to 2017 ranged from a 6% decrease to a 26% increase, with the largest increases seen in "high-risk" counties with the lowest pre-policy vaccination coverage.
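The paper's statistical model is more elaborate than this, but the logic of an "increase over projected coverage" can be illustrated with a small Python sketch. All numbers below are hypothetical, chosen only to show how an observed post-policy value is compared against a trend extrapolated from pre-policy years:

    import numpy as np

    # Hypothetical pre-policy kindergarten MMR coverage (%)
    years = np.array([2011, 2012, 2013, 2014, 2015])
    coverage = np.array([92.0, 92.3, 92.1, 92.6, 92.8])

    # Fit a linear pre-policy trend and extrapolate to 2017.
    slope, intercept = np.polyfit(years, coverage, 1)
    projected_2017 = slope * 2017 + intercept

    observed_2017 = 96.4  # hypothetical observed post-policy coverage
    effect = observed_2017 - projected_2017
    print(f"Projected 2017 coverage: {projected_2017:.1f}%")
    print(f"Estimated policy effect: {effect:+.1f} percentage points")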

"These study results support the idea that state level governmental policies to remove non-medical exemptions can be effective strategies to increase vaccination coverage across the United States," the authors say.

From Science Daily

Intermittent fasting: Live 'fast,' live longer?

For many people, the New Year is a time to adopt new habits as a renewed commitment to personal health. Newly enthusiastic fitness buffs pack into gyms and grocery stores are filled with shoppers eager to try out new diets.

But, does scientific evidence support the claims made for these diets? In a review article published in the Dec. 26 issue of The New England Journal of Medicine, Johns Hopkins Medicine neuroscientist Mark Mattson, Ph.D., concludes that intermittent fasting does.

Mattson, who has studied the health impact of intermittent fasting for 25 years, and adopted it himself about 20 years ago, writes that "intermittent fasting could be part of a healthy lifestyle." A professor of neuroscience at the Johns Hopkins University School of Medicine, Mattson says his new article is intended to help clarify the science and clinical applications of intermittent fasting in ways that may help physicians guide patients who want to try it.

Intermittent fasting diets, he says, fall generally into two categories: daily time-restricted feeding, which narrows eating times to 6-8 hours per day, and so-called 5:2 intermittent fasting, in which people limit themselves to one moderate-sized meal two days each week.

An array of animal and some human studies have shown that alternating between times of fasting and eating supports cellular health, probably by triggering an age-old adaptation to periods of food scarcity called metabolic switching. Such a switch occurs when cells use up their stores of rapidly accessible, sugar-based fuel, and begin converting fat into energy in a slower metabolic process.

Mattson says studies have shown that this switch improves blood sugar regulation, increases resistance to stress and suppresses inflammation. Because most Americans eat three meals plus snacks each day, they do not experience the switch, or the suggested benefits.

In the article, Mattson notes that four studies in both animals and people found intermittent fasting also decreased blood pressure, blood lipid levels and resting heart rates.

Evidence is also mounting that intermittent fasting can modify risk factors associated with obesity and diabetes, says Mattson. Two studies at the University Hospital of South Manchester NHS Foundation Trust of 100 overweight women showed that those on the 5:2 intermittent fasting diet lost the same amount of weight as women who restricted calories, but did better on measures of insulin sensitivity and reduced belly fat than those in the calorie-reduction group.

More recently, Mattson says, preliminary studies suggest that intermittent fasting could benefit brain health too. A multicenter clinical trial at the University of Toronto in April found that 220 healthy, nonobese adults who maintained a calorie-restricted diet for two years showed signs of improved memory in a battery of cognitive tests. While far more research needs to be done to prove any effects of intermittent fasting on learning and memory, Mattson says that if such proof is found, fasting -- or a pharmaceutical equivalent that mimics it -- may offer interventions that can stave off neurodegeneration and dementia.

"We are at a transition point where we could soon consider adding information about intermittent fasting to medical school curricula alongside standard advice about healthy diets and exercise," he says.

Mattson acknowledges that researchers do "not fully understand the specific mechanisms of metabolic switching" and that "some people are unable or unwilling to adhere" to the fasting regimens. But he argues that with guidance and some patience, most people can incorporate them into their lives. It takes some time for the body to adjust to intermittent fasting, and to get beyond the initial hunger pangs and irritability that accompany it. "Patients should be advised that feeling hungry and irritable is common initially and usually passes after two weeks to a month as the body and brain become accustomed to the new habit," Mattson says.

Read more at Science Daily

Dec 23, 2019

How fish get their shape

The diverse colours, shapes and patterns of fish are captivating. Despite this diversity, one general feature we can observe in fish such as salmon or tuna, once they are served in a dish like sushi, is the distinct 'V' pattern in their meat. While this pattern appears in the muscle arrangement of most fish species, how such a generic 'V' pattern arises has been puzzling.

A team of researchers from the Mechanobiology Institute (MBI) at the National University of Singapore (NUS) investigated the science behind the formation of the 'V' patterns -- also known as chevron patterns -- in the swimming muscles of fish. The study focused on the myotome (a group of muscles served by a spinal nerve root) that makes up most of the fish body. These fish muscles power the fish's side-to-side swimming motion and the chevron pattern is thought to increase swimming efficiency. The research team found that these patterns do not simply arise from genetic instruction or biochemical pathways but actually require physical forces to correctly develop. The findings of the study were published in the journal Proceedings of the National Academy of Sciences of the United States of America on 26 November 2019.

Friction and stress combine to shape patterns in fish muscle

The chevron pattern is not unique to salmon and tuna; it is also present in other fish species such as the zebrafish, as well as in some amphibian species like salamanders and frogs during development. The 'V' shape first appears in the somites -- the precursor building blocks of the myotome, which forms the skeletal muscles. The somites typically form during the first few days of fish development or morphogenesis.

A team of scientists led by MBI Postdoctoral Fellow Dr Sham Tlili and Principal Investigator Assistant Professor Timothy Saunders studied chevron formation in the myotome of zebrafish embryos. Initially, each future developing myotome segment is cuboidal in shape. However, over the course of five hours, it deforms into a pointed 'V' shape. To find out how this deformation actually takes place, the team adopted a combination of different techniques -- imaging of the developing zebrafish myotome at single cell resolution; quantitative analysis of the imaging data; and fitting the quantitative data into biophysical models.

Based on findings from their experimental as well as theoretical studies, the MBI scientists identified certain physical mechanisms that they thought might be guiding chevron formation during fish development.

Firstly, the developing myotomes are physically connected to other embryonic tissues such as the neural tube, notochord, skin and ventral tissues. The strength of their connection to these different tissues varies at different time points of myotome formation, and accordingly, different amounts of friction are generated across the tissue. Effectively, the side regions of the developing myotome are under greater friction than the central region. As new segments push the myotome forward, this leads to the formation of a shallow 'U' shape in the myotome tissue.

Secondly, cells within the future myotome begin to elongate as they form muscle fibres. The research team revealed that this transformation process generates an active, non-uniform force along certain directions within the somite tissue, which results in the 'U' shape sharpening into the characteristic 'V'-shaped chevron. Lastly, orientated cell rearrangements within the future myotome help to stabilise the newly acquired chevron shape.

Deciphering the patterns guiding organ formation

Asst Prof Saunders, a theoretical physicist who applies physical principles to characterise biological processes that take place during development, said, "This work reveals how a carefully balanced interplay between cell morphology and mechanical interactions can drive the emergence of complex shapes during development. We are excited to see if the principles we have revealed are also acting in the shaping of other organs."

Read more at Science Daily

Massive gas disk raises questions about planet formation theory

Astronomers using the Atacama Large Millimeter/submillimeter Array (ALMA) found a young star surrounded by an astonishing mass of gas. The star, called 49 Ceti, is 40 million years old, and conventional theories of planet formation predict that the gas should have disappeared by that age. The enigmatically large amount of gas calls for a reconsideration of our current understanding of planet formation.

Planets are formed in gaseous dusty disks called protoplanetary disks around young stars. Dust particles aggregate together to form Earth-like planets or to become the cores of more massive planets by collecting large amounts of gas from the disk to form Jupiter-like gaseous giant planets. According to current theories, as time goes by the gas in the disk is either incorporated into planets or blown away by radiation pressure from the central star. In the end, the star is surrounded by planets and a disk of dusty debris. This dusty disk, called a debris disk, implies that the planet formation process is almost finished.

Recent advances in radio telescopes have yielded a surprise in this field. Astronomers have found that several debris disks still possess some amount of gas. If the gas remains in a debris disk for a long time, planetary seeds may have enough time and material to evolve into giant planets like Jupiter. The gas in a debris disk therefore affects the composition of the resultant planetary system.

"We found atomic carbon gas in the debris disk around 49 Ceti by using more than 100 hours of observations on the ASTE telescope," says Aya Higuchi, an astronomer at the National Astronomical Observatory of Japan (NAOJ). ASTE is a 10-m diameter radio telescope in Chile operated by NAOJ. "As a natural extension, we used ALMA to obtain a more detailed view, and that gave us the second surprise. The carbon gas around 49 Ceti turned out to be 10 times more abundant than our previous estimation."

Thanks to ALMA's high resolution, the team revealed the spatial distribution of carbon atoms in a debris disk for the first time. Carbon atoms are more widely distributed than carbon monoxide, the second most abundant molecule around young stars after molecular hydrogen. The amount of atomic carbon is so large that the team even detected faint radio waves from a rarer form of carbon, 13C. This is the first detection of 13C emission at 492 GHz in any astronomical object; that emission is usually hidden behind the much stronger emission of normal 12C.

"The amount of 13C is only 1% of 12C, therefore the detection of 13C in the debris disk was totally unexpected," says Higuchi. "It is clear evidence that 49 Ceti has a surprisingly large amount of gas."

Read more at Science Daily

Chronobiology: 'We'll be in later'

Students attending a high school in Germany can decide whether to begin the school day at the normal early time or an hour later. According to chronobiologists at Ludwig-Maximilians-Universitaet (LMU) in Munich, the measure has had a positive effect on both the students' sleep and their learning experience.

They fall asleep too late at night, and are rudely expelled from dreamland by the shrill tones of the alarm clock in the morning. Classes begin early and they must be prepared to show their mettle.

Adolescents are chronically sleep deprived, a phenomenon that can be observed worldwide. Moreover, the problem is no longer confined to certain personality types, and therefore a matter of individual concern; it has become a public health issue. Indeed, the Centers for Disease Control and Prevention in the US have officially designated it as such. The consequences of chronic sleep deficit include not only a reduced ability to concentrate but also an increased risk of accidents on the way to and from school. Studies have also detected higher risks for depression, obesity, diabetes and other chronic metabolic diseases. In light of these findings, it is hardly surprising that calls for school classes to begin later in the morning are growing louder.

But would such a move do any good? Would a later school start actually improve adolescents' sleep and enhance their cognitive performance in class? So far, there have been few studies of this question in Europe. A group of chronobiologists in Munich, led by Eva Winnebeck and Till Roenneberg, studied the issue at a high school in Germany that made an exceptional change to its start-time arrangements. The school instituted a system that allows senior students to decide, day by day, whether to attend the first class of the day or to come to school an hour later.

This form of flexible scheduling is possible because the school has adopted what is known as the Dalton Plan (for which the institution won the German School Prize in 2013). A major component of this idea (which originated in the US) is that students are required to tackle parts of the school curriculum independently during project phases. The school timetable allots 10 hours per week to these activities, half of which are scheduled for the first class at 8 o'clock in the morning. Students who choose to skip this class must work through the material in their free periods during the day or after the end of the regular school day.

Students from the three senior grades (i.e. 15- to 19-year-olds) served as the study population for the LMU researchers from the Institute of Medical Psychology. For three weeks before and six weeks after the introduction of the flexible system at the school in Alsdorf, the team observed how the students reacted and adapted to the change. The participating students were asked to record their sleep patterns daily, and around half of them were equipped with activity monitors for objective sleep monitoring. At the end of the study, the participants provided information on their sleep, their overall level of satisfaction, and their ability to concentrate in class and while studying course content.

Read more at Science Daily

Moms' obesity in pregnancy is linked to lag in sons' development and IQ

A mother's obesity in pregnancy can affect her child's development years down the road, according to researchers who found lagging motor skills in preschoolers and lower IQ in middle childhood for boys whose mothers were severely overweight while pregnant. A team of epidemiologists, nutritionists and environmental health researchers at Columbia University Mailman School of Public Health and the University of Texas at Austin found that the differences are comparable to the impact of lead exposure in early childhood. The findings are published in BMC Pediatrics.

The researchers studied 368 mothers and their children, all from similar economic circumstances and neighborhoods, during pregnancy and when the children were 3 and 7 years of age. At age 3, the researchers measured the children's motor skills and found that maternal obesity during pregnancy was strongly associated with lower motor skills in boys. At age 7, they again measured the children and found that the boys whose mothers were overweight or obese in pregnancy had scores 5 or more points lower on full-scale IQ tests, compared to boys whose mothers had been at a normal weight. No effect was found in the girls.

"What's striking is, even using different age-appropriate developmental assessments, we found these associations in both early and middle childhood, meaning these effects persist over time," said Elizabeth Widen, assistant professor of nutritional sciences at UT Austin and a co-author. "These findings aren't meant to shame or scare anyone. We are just beginning to understand some of these interactions between mothers' weight and the health of their babies."

It is not altogether clear why obesity in pregnancy would affect a child later, though previous research has found links between a mother's diet and cognitive development, such as higher IQ scores in kids whose mothers have more of certain fatty acids found in fish. Dietary and behavioral differences may be driving factors, or fetal development may be affected by some of the things that tend to happen in the bodies of people with a lot of extra weight, such as inflammation, metabolic stress, hormonal disruptions and high amounts of insulin and glucose.

The researchers controlled for several factors in their analysis, including race and ethnicity, marital status, the mother's education and IQ, as well as whether the children were born prematurely or exposed to environmental toxic chemicals like air pollution. What the pregnant mothers ate or whether they breastfed were not included in the analysis.

The team also examined and accounted for the nurturing environment in a child's home, looking at how parents interacted with their children and if the child was provided with books and toys. A nurturing home environment was found to lessen the negative effects of obesity.

According to Widen and senior author Andrew Rundle, DrPH, associate professor of Epidemiology at Columbia Mailman School, while the results showed that the effect on IQ was smaller in nurturing home environments, it was still there.

This is not the first study to find that boys appear to be more vulnerable in utero. Earlier research found lower performance IQ in boys, but not girls, whose mothers were exposed to lead, and a 2019 study suggested that boys whose mothers were exposed to fluoride in pregnancy scored lower on an IQ assessment.

Because childhood IQ is a predictor of education level, socio-economic status and professional success later in life, researchers say there is potential for impacts to last into adulthood.

Read more at Science Daily

Dec 22, 2019

Early-life exposure to dogs may lessen risk of developing schizophrenia

Ever since humans domesticated the dog, the faithful, obedient and protective animal has provided its owner with companionship and emotional well-being. Now, a study from Johns Hopkins Medicine suggests that being around "man's best friend" from an early age may have a health benefit as well -- lessening the chance of developing schizophrenia as an adult.

And while Fido may help prevent that condition, the jury is still out on whether or not there's any link, positive or negative, between being raised with Fluffy the cat and later developing either schizophrenia or bipolar disorder.

"Serious psychiatric disorders have been associated with alterations in the immune system linked to environmental exposures in early life, and since household pets are often among the first things with which children have close contact, it was logical for us to explore the possibilities of a connection between the two," says Robert Yolken, M.D., chair of the Stanley Division of Pediatric Neurovirology and professor of neurovirology in pediatrics at the Johns Hopkins Children's Center, and lead author of a research paper recently posted online in the journal PLOS One.

In the study, Yolken and colleagues at Sheppard Pratt Health System in Baltimore investigated the relationship between exposure to a household pet cat or dog during the first 12 years of life and a later diagnosis of schizophrenia or bipolar disorder. For schizophrenia, the researchers were surprised to see a statistically significant decrease in the risk of a person developing the disorder if exposed to a dog early in life. Across the entire age range studied, there was no significant link between dogs and bipolar disorder, or between cats and either psychiatric disorder.

The researchers caution that more studies are needed to confirm these findings, to search for the factors behind any strongly supported links, and to more precisely define the actual risks of developing psychiatric disorders from exposing infants and children under age 13 to pet cats and dogs.

According to the American Pet Products Association's most recent National Pet Owners Survey, there are 94 million pet cats and 90 million pet dogs in the United States. Previous studies have identified early life exposures to pet cats and dogs as environmental factors that may alter the immune system through various means, including allergic responses, contact with zoonotic (animal) bacteria and viruses, changes in a home's microbiome, and pet-induced stress reduction effects on human brain chemistry.

Some investigators, Yolken notes, suspect that this "immune modulation" may alter the risk of developing psychiatric disorders to which a person is genetically or otherwise predisposed.

In their current study, Yolken and colleagues looked at a population of 1,371 men and women between the ages of 18 and 65 that consisted of 396 people with schizophrenia, 381 with bipolar disorder and 594 controls. Information documented about each person included age, gender, race/ethnicity, place of birth and highest level of parental education (as a measure of socioeconomic status). Patients with schizophrenia and bipolar disorder were recruited from inpatient, day hospital and rehabilitation programs of Sheppard Pratt Health System. Control group members were recruited from the Baltimore area and were screened to rule out any current or past psychiatric disorders.

All study participants were asked if they had a household pet cat or dog or both during their first 12 years of life. Those who reported that a pet cat or dog was in their house when they were born were considered to be exposed to that animal since birth.

The relationship between the age of first household pet exposure and psychiatric diagnosis was defined using a statistical model that produces a hazard ratio -- a measure over time of how often specific events (in this case, exposure to a household pet and development of a psychiatric disorder) happen in a study group compared to their frequency in a control group. A hazard ratio of 1 suggests no difference between groups, while a ratio greater than 1 indicates an increased likelihood of developing schizophrenia or bipolar disorder. Likewise, a ratio less than 1 shows a decreased chance.
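As a toy illustration of how such a ratio is read, consider two hypothetical groups followed for the same amount of person-time. These numbers are invented, not the study's, and the sketch assumes constant hazards so that the hazard ratio reduces to a simple rate ratio:

    # Hypothetical counts: events per 1,000 person-years in each group.
    events_exposed, time_exposed = 30, 1000.0    # had a pet dog
    events_control, time_control = 40, 1000.0    # no pet dog

    rate_exposed = events_exposed / time_exposed
    rate_control = events_control / time_control

    # With constant hazards, the hazard ratio equals the rate ratio.
    hazard_ratio = rate_exposed / rate_control
    print(f"Hazard ratio: {hazard_ratio:.2f}")   # 0.75
    print(f"~{1 - hazard_ratio:.0%} lower hazard in the exposed group")

A ratio of 0.75 in this toy example would correspond to a 25% lower hazard in the exposed group, mirroring the kind of decrease described below.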

Analyses were conducted for four age ranges: birth to 3, 4 to 5, 6 to 8 and 9 to 12.

Surprisingly, Yolken says, the findings suggest that people who are exposed to a pet dog before their 13th birthday are significantly less likely -- by as much as 24% -- to be diagnosed later with schizophrenia.

"The largest apparent protective effect was found for children who had a household pet dog at birth or were first exposed after birth but before age 3," he says.

Yolken adds that if it is assumed that the hazard ratio is an accurate reflection of relative risk, then some 840,000 cases of schizophrenia (24% of the 3.5 million people diagnosed with the disorder in the United States) might be prevented by pet dog exposure or other factors associated with pet dog exposure.
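That 840,000 figure is straightforward arithmetic on the numbers quoted in the preceding sentence, as this short sketch shows:

    us_cases = 3_500_000   # people diagnosed in the US (from the article)
    risk_reduction = 0.24  # 24% lower risk, assuming the hazard ratio
                           # reflects true relative risk (a strong assumption)
    print(f"{us_cases * risk_reduction:,.0f} potentially preventable cases")
    # -> 840,000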

"There are several plausible explanations for this possible 'protective' effect from contact with dogs -- perhaps something in the canine microbiome that gets passed to humans and bolsters the immune system against or subdues a genetic predisposition to schizophrenia," Yolken says.

For bipolar disorder, the study results suggest there is no risk association, either positive or negative, with being around dogs as an infant or young child.

Overall, for all ages examined, early exposure to pet cats was neutral: the study could not link felines with either an increased or decreased risk of developing schizophrenia or bipolar disorder.

"However, we did find a slightly increased risk of developing both disorders for those who were first in contact with cats between the ages of 9 and 12," Yolken says. "This indicates that the time of exposure may be critical to whether or not it alters the risk."

One example of a suspected pet-borne trigger for schizophrenia is the disease toxoplasmosis, a condition in which cats are the primary hosts of a parasite transmitted to humans via the animals' feces. Pregnant women have been advised for years not to change cat litter boxes to eliminate the risk of the illness passing through the placenta to their fetuses and causing a miscarriage, stillbirth, or potentially, psychiatric disorders in a child born with the infection.

In a 2003 review paper, Yolken and colleague E. Fuller Torrey, M.D., associate director of research at the Stanley Medical Research Institute in Bethesda, Maryland, provided evidence from multiple epidemiological studies conducted since 1953 that showed there also is a statistical connection between a person exposed to the parasite that causes toxoplasmosis and an increased risk of developing schizophrenia. The researchers found that a large number of people in those studies who were diagnosed with serious psychiatric disorders, including schizophrenia, also had high levels of antibodies to the toxoplasmosis parasite.

Because of this finding and others like it, most research has focused on investigating a potential link between early exposure to cats and psychiatric disorder development. Yolken says the most recent study is among the first to consider contact with dogs as well.

"A better understanding of the mechanisms underlying the associations between pet exposure and psychiatric disorders would allow us to develop appropriate prevention and treatment strategies," Yolken says.

Read more at Science Daily

Dogs process numerical quantities in similar brain region as humans

Dogs spontaneously process basic numerical quantities, using a distinct part of their brains that corresponds closely to number-responsive neural regions in humans, finds a study at Emory University.

Biology Letters published the results, which suggest that a common neural mechanism has been deeply conserved across mammalian evolution.

"Our work not only shows that dogs use a similar part of their brain to process numbers of objects as humans do -- it shows that they don't need to be trained to do it," says Gregory Berns, Emory professor of psychology and senior author of the study.

"Understanding neural mechanisms -- both in humans and across species -- gives us insights into both how our brains evolved over time and how they function now," says co-author Stella Lourenco, an associate professor of psychology at Emory.

Such insights, Lourenco adds, may one day lead to practical applications such as treating brain abnormalities and improving artificial intelligence systems.

Lauren Aulet, a PhD candidate in Lourenco's lab, is first author of the study.

The study used functional magnetic resonance imaging (fMRI) to scan dogs' brains as they viewed varying numbers of dots flashed on a screen. The results showed that the dogs' parietotemporal cortex responded to differences in the number of the dots. The researchers held the total area of the dots constant, demonstrating that it was the number of the dots, not the size, that generated the response.
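A plausible way to build such stimuli -- varying the number of dots while holding their cumulative area fixed -- is sketched below in Python. This is an illustrative reconstruction, not the study's actual stimulus code, and it ignores details such as preventing dots from overlapping:

    import math
    import random

    def dot_array(n_dots, total_area=1000.0, field=500.0):
        """Return n_dots (x, y, radius) tuples whose summed area is
        always total_area, so only numerosity varies between arrays."""
        area_each = total_area / n_dots           # equal-area dots
        radius = math.sqrt(area_each / math.pi)   # area = pi * r^2
        return [(random.uniform(radius, field - radius),
                 random.uniform(radius, field - radius),
                 radius)
                for _ in range(n_dots)]

    # Arrays with different numerosities but identical total dot area:
    for n in (4, 8, 16):
        dots = dot_array(n)
        total = sum(math.pi * r ** 2 for _, _, r in dots)
        print(f"{n:2d} dots, total area = {total:.1f}")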

The approximate number system supports the ability to rapidly estimate a quantity of objects in a scene, such as the number of predators approaching or the amount of food available for foraging. Evidence suggests that humans primarily draw on their parietal cortex for this ability, which is present even in infancy.

This basic sensitivity to numerical information, known as numerosity, does not rely on symbolic thought or training and appears to be widespread throughout the animal kingdom. Much of the research in non-humans, however, has involved intensive training of the subjects.

Previous research, for example, has found that particular neurons in the parietal cortex of monkeys are attuned to numerical values. Such studies had not clarified whether numerosity is a spontaneous system in non-human primates, because the subjects underwent many trials and received rewards for selecting scenes with greater numbers of dots in preparation for the experiments.

Behavioral studies in dogs that were trained in the task of discriminating between different quantities of objects have also indicated that dogs are sensitive to numerosity.

The Emory researchers wanted to delve further into the neural underpinnings of canine number perception using fMRI.

Berns is founder of the Dog Project, which researches evolutionary questions surrounding man's best and oldest friend. The project was the first to train dogs to voluntarily enter an fMRI scanner and remain motionless during scanning, without restraint or sedation.

Lourenco primarily researches human visual perception, cognition and development.

Eleven dogs of varying breeds took part in the current fMRI experiments. The dogs received no advance training in numerosity. After entering the fMRI scanner, they passively viewed dot arrays that varied in numerical value. Eight of the 11 dogs showed greater activation in the parietotemporal cortex when the numerical values of alternating dot arrays were dissimilar than when they were held constant.

"We went right to the source, observing the dogs' brains, to get a direct understanding of what their neurons were doing when the dogs viewed varying quantities of dots," Aulet says. "That allowed us to bypass the weaknesses of previous behavioral studies of dogs and some other species."

Humans and dogs are separated by 80 million years of evolution, Berns notes. "Our results provide some of the strongest evidence yet that numerosity is a shared neural mechanism that goes back at least that far," he says.

Unlike dogs and other animals, humans are able to build on basic numerosity in order to do more complex math, drawing primarily on the prefrontal cortex. "Part of the reason that we are able to do calculus and algebra is because we have this fundamental ability for numerosity that we share with other animals," Aulet says. "I'm interested in learning how we evolved that higher math ability and how these skills develop over time in individuals, starting with basic numerosity in infancy."

Read more at Science Daily