Feb 20, 2017

These Cutting-Edge Solar Panels Roll Up Like Wrapping Paper

This solar panel certainly doesn't look like one at all. Instead of the classic rigid panel, the flexible solar material created by Albuquerque-based mPower Technology Inc. seems like a cross between metallic wrapping paper and one of those reflective sun shades for car windshields.

The clever mesh design means this solar panel is lightweight, foldable and highly efficient. Microsystems-enabled photovoltaics technology has actually been in development for several years, but it took a big step forward recently, when mPower Technology signed a licensing agreement with Sandia National Laboratories.

Called Dragon SCALES, mPower Technology's product is actually covered in miniature solar cells known as "solar glitter." The SCALES part stands for semi-conductor active layer embedded solar, FastCo.Exist reported. If one cell stops working or becomes shaded, the rest of the cells continue working.

"The key limitation to silicon is that if you bend and flex it, it will crack and shatter," mPower Technology's founder and CEO Murat Okandan said in a press statement. "Our technology makes it virtually unbreakable while keeping all the benefits of high efficiency, high reliability silicon PV."

Okandan envisions the company's tough and bendy solar panels going into satellites and drones as well as biomedical devices and consumer products. Imagine unfurling sheets on rooftops to install photovoltaics in record time or stuffing a roll in your backpack. Hey, is that a yoga mat? Nope, it's my portable solar.

I really thought that with all the technological advancements happening in the last decade-plus, we'd be able to buy rolls of photovoltaics at the hardware store by now. Blending flexibility with efficiency has been an enormous challenge, though. Okandan and his team had to get the microdesign and microfabrication techniques just right.

Prior to starting mPower Technology in 2015, Okandan worked on microsystems-enabled photovoltaics for Sandia as an employee. He launched the company through Sandia's Entrepreneurial Separation to Transfer Technology program, which supports spinoffs.

The new license between mPower Technology and Sandia is expected to help speed up the commercialization of solar glitter. I'm eager to see how it goes, not least because we could use more glitter in general.

From Discovery News

Elephant Numbers in 'Sanctuary' Collapse by 80 Percent Due to Poaching

The forest elephant population within a supposed "sanctuary" in Central Africa declined by more than 80 percent over a decade's time due to poaching, new research finds.

The loss, representing about 25,000 elephant deaths, highlights how sanctuaries — while necessary to separate defined wilderness regions from more populated areas — can also increase threats to certain animals, since their populations then become so concentrated in particular areas.

In this case, documented in the journal Current Biology, the killings occurred at Gabon's Minkébé National Park, one of Central Africa's largest and most important preserves.

"Across Central Africa, forest elephant populations are being more and more restricted to protected areas, and so these will be the areas targeted by poachers," lead author John Poulson told Seeker.

Poulson, an assistant professor of tropical ecology at Duke University's Nicholas School of the Environment, and his colleagues had to overcome challenges in estimating the number of forest elephants in the park. He explained that most other elephants live in savanna regions, where aerial surveys can be used to count them accurately.

In forests, however, researchers must depend upon field teams walking over long distances to count elephant dung piles. Densities of poop piles are then converted to densities of elephants, with great care needed to factor in weather effects, such as the fact that greater rainfall can lead to faster decay of the elephant waste.
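
For the quantitatively minded, the standard dung-count conversion works roughly like the sketch below. This is only an illustration of the general method, not the survey team's actual model, and every number in it is a made-up placeholder.

```python
# Illustrative sketch of a steady-state dung-count conversion; the parameter
# values are hypothetical placeholders, not figures from the Minkébé survey.

def elephants_per_km2(dung_piles_per_km2, decay_rate_per_day, defecation_rate_per_day):
    """Elephant density = dung density x decay rate / defecation rate."""
    return dung_piles_per_km2 * decay_rate_per_day / defecation_rate_per_day

# Hypothetical inputs: 500 piles counted per km^2, piles lasting ~60 days
# (heavy rainfall shortens this, raising the decay rate), and roughly
# 18 piles produced per elephant per day.
density = elephants_per_km2(
    dung_piles_per_km2=500,
    decay_rate_per_day=1 / 60,
    defecation_rate_per_day=18,
)
print(f"Estimated {density:.2f} elephants per square kilometer")
```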

Using this method, the scientists determined that the population of forest elephants in the central and northern parts of the park was essentially erased between 2004 and 2014. Losses also occurred over the same period in the southern part of the park, but were not as substantial.

Numerous eyewitness accounts and the still-flourishing ivory trade offer evidence that poachers killed the forest elephants. Poulson said that Cameroon's national road, which runs very close to the central and northern parts of the park, makes it fairly easy for Cameroonian poachers to access the park and transport their illegal haul back to their nation's largest city, Douala, a major hub of the international ivory trade.

Largely because of reports of poaching in Minkébé National Park, the Gabonese government raised the status of forest elephants to "fully protected" in 2011, Poulson said. That same year, the government doubled the budget of the country's Parks Agency, growing its protection staff by 300 people.

They also created the National Park Police, posting 100 military personnel permanently in Minkébé to shore up protection of the park and to attempt to quell the extermination of elephants. In 2012, Gabon became the first Central African country to burn its entire ivory stock.

Poulson said, "While these efforts are admirable and necessary, by themselves they didn't stop poaching in the area, as evidenced by continued poaching of elephants in recent years."

Guards are posted along the Cameroon-Gabon border and other important entry points, but Poulson said, "The problem is that the area is very large, and poachers will always find a way to slip through if the stakes are high enough."

Poachers obviously have not forgotten the elephants and their ivory, but Poulson and others use the word "forgotten" to describe forest elephants due both to the ongoing threats to their populations and to the species' status within the conservation community. It is estimated that 100,000 forest elephants live in Gabon now.

Poulson believes forest elephants are forgotten primarily "because of the failure of the international community to recognize the forest elephant as a distinct species. Currently, IUCN (International Union for Conservation of Nature) and CITES (Convention on International Trade in Endangered Species) only recognize the African elephant, which includes the forest and savanna subspecies, and which is listed as 'vulnerable.'"

The argument for recognizing forest elephants as a distinct species rests on the fact that the genetic differences separating them from African savanna elephants are comparable in magnitude to the differences between modern Asian elephants and the extinct mammoths.

Forest elephants do not become pregnant for the first time until they are 23 years of age, and they produce only one calf every five to six years. Savanna elephants, on the other hand, begin breeding at around 12 years of age and typically produce young at three- to four-year intervals. Forest elephants also tend to be much smaller, in terms of both stature and weight, than savanna elephants.

Politics could help to explain why forest elephants are "forgotten" and have not been recognized as a unique species.

"There is concern that two different species have not been recognized because that would reduce the estimated population of savanna elephants, and trade of its ivory would no longer be permitted," Poulson said. "This could result in three southern African nations pulling out of CITES because they want to sell their ivory stocks."

Read more at Discovery News

Bee decline threatens US crop production

The first-ever study to map U.S. wild bees suggests they are disappearing in the country's most important farmlands -- from California's Central Valley to the Midwest's corn belt and the Mississippi River valley.

If wild bee declines continue, it could hurt U.S. crop production and raise farmers' costs, said Taylor Ricketts, a conservation ecologist at the University of Vermont, at a Feb. 19 panel at the American Association for the Advancement of Science (AAAS) annual meeting, "Plan Bee: Pollinators, Food Production and U.S. Policy."

"This study provides the first national picture of wild bees and their impacts on pollination," said Ricketts, Director of UVM's Gund Institute for Ecological Economics, noting that each year $3 billion of the U.S. economy depends on pollination from native pollinators like wild bees.

At AAAS, Ricketts briefed scholars, policy makers, and journalists on how the national bee map, first published in the Proceedings of the National Academy of Sciences in late 2015, can help to protect wild bees and pinpoint habitat restoration efforts.

At the event, Ricketts also introduced a new mobile app that he is co-developing to help farmers upgrade their farms to better support wild bees.

"Wild bees are a precious natural resource we should celebrate and protect," said Ricketts, Gund Professor in UVM's Rubenstein School of Environment and Natural Resources. "If managed with care, they can help us continue to produce billions of dollars in agricultural income and a wonderful diversity of nutritious food."

TROUBLE ZONES

The map identifies 139 counties in key agricultural regions of California, the Pacific Northwest, the upper Midwest and Great Plains, west Texas, and the Mississippi River valley, which appear to have the most worrisome mismatch between falling wild bee supply and rising crop pollination demand.

These counties tend to be places that grow specialty crops -- like almonds, blueberries and apples -- that are highly dependent on pollinators. Or they are counties that grow less dependent crops -- like soybeans, canola and cotton -- in very large quantities.

Of particular concern, some of the crops most dependent on pollinators -- including pumpkins, watermelons, pears, peaches, plums, apples and blueberries -- appeared to have the strongest pollination mismatch, growing in areas where wild bee supply is dropping and pollination demand is rising.
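
To make the idea of a "pollination mismatch" concrete, here is a minimal sketch of how counties might be flagged when a wild bee supply trend falls while crop pollination demand rises. It is an illustration of the concept only, not the model published in PNAS; the county names, numbers and thresholds are all hypothetical.

```python
# Toy mismatch flagging; all data and thresholds below are invented.

counties = [
    # (county, fractional change in wild bee supply, fractional change in demand)
    ("Hypothetical County A", -0.23, +0.31),
    ("Hypothetical County B", -0.08, +0.12),
    ("Hypothetical County C", +0.05, +0.02),
    ("Hypothetical County D", -0.15, +0.18),
]

def is_mismatch(supply_change, demand_change,
                supply_threshold=-0.05, demand_threshold=+0.05):
    """Flag a county when supply fell and demand rose beyond arbitrary cutoffs."""
    return supply_change <= supply_threshold and demand_change >= demand_threshold

trouble_zones = [name for name, s, d in counties if is_mismatch(s, d)]
print("Potential trouble zones:", trouble_zones)
```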

Globally, more than two-thirds of the most important crops either benefit from or require pollinators, including coffee, cacao, and many fruits and vegetables.

Pesticides, climate change and diseases threaten wild bees -- but their decline may be caused by the conversion of bee habitat into cropland, the study suggests. In 11 key states where the map shows bees in decline, the amount of land tilled to grow corn spiked by 200 percent in five years -- replacing grasslands and pastures that once supported bee populations.

RISING DEMAND, FALLING SUPPLY

Over the last decade, honeybee keepers facing colony losses have struggled with rising demand for commercial pollination services, pushing up the cost of managed pollinators -- and the importance of wild bees.

"Most people can think of one or two types of bee, but there are 4,000 species in the U.S. alone," said Insu Koh, a UVM postdoctoral researcher who co-hosted the AAAS panel and led the study.

"When sufficient habitat exists, wild bees are already contributing the majority of pollination for some crops," Koh adds. "And even around managed pollinators, wild bees complement pollination in ways that can increase crop yields."

Read more at Science Daily

'Alternative Facts' Have Plagued Science for Decades

Kellyanne Conway, counselor to President Donald Trump, used the phrase "alternative facts" earlier this year during a "Meet the Press" interview, in which she defended White House Press Secretary Sean Spicer's false statement about the size of the crowd at Trump's inauguration. While the phrase is newly infamous as a result, the phenomenon that it describes has a long history in both politics and science, according to an analysis presented today at the annual meeting of the American Association for the Advancement of Science in Boston.

Public radar tends to be up for demonstrably false or implausible claims made by politicians and their spokespeople, but researchers say many people were duped in the past by alternative scientific "facts," and that the problem persists today.

"My main message is that, just as we need to be on the lookout for false or misleading information about politics and pop culture, we also need to be on the lookout for false or misleading information about science," Kevin Elliott, who conducted the analysis, told Seeker.

"Living an hour away from Flint, Mich., I'm especially cognizant of the ways that the lead industry tried to downplay the hazards of lead almost a hundred years ago in order to promote the use of lead in drinking water pipes and in gasoline," said Elliott, an associate professor at Michigan State University and the author of the book "A Tapestry of Values: An Introduction to Values in Science."

He and others contacted by Seeker focused on four general areas where alternative facts concerning scientific matters have been especially prevalent and damaging: the tobacco industry, the drug industry, the manufacturing industry and climate change.

Tobacco

In the 1950s American tobacco producers created the Tobacco Industry Research Committee, whose purpose was to cast doubt on independent scientific research that was increasingly showing the harms of cigarette smoking, said Erik Conway, a historian at NASA's Jet Propulsion Laboratory who co-authored, with Harvard's Naomi Oreskes, the book "Merchants of Doubt."

"They did this by funding critiques of academic or government science," Conway said, "by funding research programs designed to blame other activities for smoking's health effects, and even by publishing their own 'scientific' journal to have an apparently independent venue to publish their misleading studies in."

"They were successful in fending off tobacco regulation this way for nearly 70 years."

The effort led to what Oreskes and Conway later called "The Tobacco Strategy." As they describe it, the strategy involved creating uncertainty about the science behind smoking and health, stating that health claims were exaggerated, arguing that medical innovations and technology could solve any related problems, and emphasizing that there was no need for government interference.

Conway said that the tobacco industry even created "Bad Science: a Resource Book," to help other industries understand "how to play the game."

Drugs

Sergio Sismondo, a professor at Queen's University who has long studied the drug industry, told Seeker that certain players within the pharmaceutical industry "have supported and promoted alternative facts to cloud issues about how effective and safe certain products are, and about the structure of the drug industry as a whole."

As an example, he mentioned a diabetes drug that was introduced in 1999, even though "internal documents already showed that the drug carried heart risks." Sismondo said that several years later, in 2007, the company making the drug "engaged in a quick campaign to cloud the issue."

More generally, he noted that every decade or so, a university's drug development center publishes a figure for the cost of developing a prescription drug. The most recent figure, from 2014, he said, was $2.6 billion.

"The research is supported and then promoted by the drug industry, which uses figures like that to justify high prices, long patents and more, but the figure is grossly misleading, because it rolls in various costs that could be seen as marketing and other business costs," Sismondo said. He added that the figure also focuses on new molecules at the center of research efforts, when actually many companies incur "much lower costs when they repurpose molecules."

Manufacturing

Problems related to the promotion of misleading or false claims extend to some manufacturers of other products too, Elliott said. For example, he said the known health hazards related to asbestos and vinyl chloride, in addition to lead, were withheld from the public so that the manufacturers "could promote false information about the safety of their products."

Like the tobacco industry, these companies became experts about "manufacturing doubt" over scientific evidence challenging the safety and health risks associated with their products. David Michaels, former head of the Occupational Safety and Health Administration, addressed the subject in detail in his book "Doubt Is Their Product."

A recent instance involved German automaker Volkswagen, which Elliott said cheated on emissions tests by installing a device in diesel engines that could detect when a test was being administered, and could change the way the vehicle performed to improve results. The device, he explained, allowed the company to sell its cars in the U.S. while its engines emitted pollutants up to 40 times above what's accepted by the Environmental Protection Agency. The automaker last month admitted guilt and agreed to pay $4.3 billion in criminal and civil penalties, reported CBS News.

Climate Change


Elliott, Oreskes, Conway and many others believe that "The Tobacco Strategy" continues to be followed by big oil companies in response to science concerning climate change. Heather Douglas of the University of Waterloo prefers the expletive "bullshit" instead of "alternative facts," since the phrase popularized by Kellyanne Conway "suggests a possible equivalency between one set of facts and another, and this is just not the case," she told Seeker. "Fact-checking by the media and tracking down sources readily reveals what is reliable and what is not, and the suggested equivalence is illusory.

"There are lots of things to disagree about in the realm of climate change — what are the regional climate projections, what energy policies should we adopt to mitigate the problem, etc. — but whether humans are a substantial cause is no longer one of them."

Polls and regulation concerning the smoking of tobacco products indicate that most people now believe smoking cigarettes poses health risks. A Pew poll last year, however, found that nearly three-quarters of Americans polled do not trust that there is a large "scientific consensus" among climate scientists on human activities being the cause of climate change. In response, Erik Conway said, "No one wants to believe that he or she, personally, is helping to destroy the world's ecosystems, and unlike smoking — a personal choice that can be reversed — no one in the industrialized world can stop emitting, because we're trapped in an infrastructure that leaves us no choice."

"So we live in denial instead," he said. "That's one reason the denialist message resonates so powerfully."

The Problem May Be Getting Worse

Although problems associated with misleading or false science are nothing new, several factors appear to be driving the current use of "alternative facts." One is the political climate, particularly in the U.S., Elliott said. "It's possible that our hyper-partisan political climate and our polarized use of social media has made us all more susceptible to accepting questionable scientific claims when they appeal to our political preferences."

Douglas agrees. She said that "the echo-chambers of social media exacerbate the problem of bullshit."

Genuine disagreement among scientists is an important, inherent part of the discipline, and is critical to the evolution of scientific knowledge. It can therefore be challenging to distinguish legitimate scientific debate from statements made by those who continue to follow "The Tobacco Strategy" or similar deceptive tactics.

"We need to look at who supports different facts," Sismondo said. "When one side of a disagreement is supported primarily by (self)-interested parties, whether directly or indirectly, that should lead us to discount that side. We can never completely discount challenges to accepted facts, but we should also recognize that sometimes the challenges are motivated by greed. These challenges can nitpick away at accepted knowledge, but often this nitpicking is scientifically frivolous."

Read more at Discovery News

Feb 19, 2017

How humans bond: The brain chemistry revealed

In new research published Monday in the journal Proceedings of the National Academy of Sciences, Northeastern University psychology professor Lisa Feldman Barrett found, for the first time, that the neurotransmitter dopamine is involved in human bonding, bringing the brain's reward system into our understanding of how we form human attachments. The results, based on a study with 19 mother-infant pairs, have important implications for therapies addressing postpartum depression as well as disorders of the dopamine system such as Parkinson's disease, addiction, and social dysfunction.

"The infant brain is very different from the mature adult brain -- it is not fully formed," says Barrett, University Distinguished Professor of Psychology and author of the forthcoming book How Emotions Are Made: The Secret Life of the Brain. "Infants are completely dependent on their caregivers. Whether they get enough to eat, the right kind of nutrients, whether they're kept warm or cool enough, whether they're hugged enough and get enough social attention, all these things are important to normal brain development. Our study shows clearly that a biological process in one person's brain, the mother's, is linked to behavior that gives the child the social input that will help wire his or her brain normally. That means parents' ability to keep their infants cared for leads to optimal brain development, which over the years results in better adult health and greater productivity."

To conduct the study, the researchers turned to a novel technology: a machine capable of performing two types of brain scans simultaneously -- functional magnetic resonance imaging, or fMRI, and positron emission tomography, or PET.

fMRI looks at the brain in slices, front to back, like a loaf of bread, and tracks blood flow to its various parts. It is especially useful in revealing which neurons are firing frequently as well as how different brain regions connect in networks. PET uses a small amount of radioactive chemical plus dye (called a tracer) injected into the bloodstream along with a camera and a computer to produce multidimensional images to show the distribution of a specific neurotransmitter, such as dopamine or opioids.

Barrett's team focused on the neurotransmitter dopamine, a chemical that acts in various brain systems to spark the motivation necessary to work for a reward. They tied each mother's level of dopamine to her degree of synchrony with her infant, as well as to the strength of the connection within a brain network called the medial amygdala network that, within the social realm, supports social affiliation.

"We found that social affiliation is a potent stimulator of dopamine," says Barrett. "This link implies that strong social relationships have the potential to improve your outcome if you have a disease, such as depression, where dopamine is compromised. We already know that people deal with illness better when they have a strong social network. What our study suggests is that caring for others, not just receiving caring, may have the ability to increase your dopamine levels."

Before performing the scans, the researchers videotaped the mothers at home interacting with their babies and applied measurements to the behaviors of both to ascertain their degree of synchrony. They also videotaped the infants playing on their own.

Once in the brain scanner, each mother viewed footage of her own baby at solitary play as well as an unfamiliar baby at play while the researchers measured dopamine levels, with PET, and tracked the strength of the medial amygdala network, with fMRI.

The mothers who were more synchronous with their own infants showed both an increased dopamine response when viewing their child at play and stronger connectivity within the medial amygdala network. "Animal studies have shown the role of dopamine in bonding but this was the first scientific evidence that it is involved in human bonding," says Barrett. "That suggests that other animal research in this area could be directly applied to humans as well."

Read more at Science Daily

2,000-Year-Old Seeds Found in Chinese Tomb May Reveal Clues About the Past

A recent discovery of more than 100 seeds in northern China's Inner Mongolia Autonomous Region has puzzled archaeologists. The half-moon shaped seeds resemble those from modern pomegranates, but according to the regional institute of archaeology, their identity is unknown.

The discovery was made during the excavation of an ancient tomb in Dengkou County, western Inner Mongolia, dating from the middle and late Western Han Dynasty (206 B.C.-25 A.D.) to the early Eastern Han Dynasty (25 A.D.-220 A.D.).

"One of the advantages to a find like this is that you may come across a specific variety of plant that's no longer around today, or has kind of faded into obscurity," Craig Barrett, assistant professor of plant evolutionary biology at West Virginia University, told Seeker.

Barrett pointed out that scientists today are very interested in preserving the genetic variation we see in crops through a practice called seed banking, in order to save seed varieties from eradication. The Svalbard seed vault in Norway is the largest example of this, with over 880,000 seed samples from almost every country in the world.

The other advantage of a seed find like this is the potential to gain insight into the diet of people two millennia ago.

"2,000 years ago is recent enough to where we know quite a bit about what people in that part of the world were eating," Barrett said. However, if the seeds are in fact related to the pomegranate plant, "it might be really significant in terms of finding some ancient variety of pomegranate that people were eating," he added.

It's unknown at this point whether the seeds can be revived or not, but there have been several successful attempts at ancient seed revival in recent years.

Archaeologists unearthed a stockpile of 2,000-year-old seeds in the excavation of Herod the Great's palace in Israel in the early 1960s. The seeds remained stored away for over forty years, but in 2005, a botanical researcher decided to plant one and see if it would sprout.

To her surprise, the seed produced a Judean date palm sapling, grown from what is now the oldest known tree seed to germinate. The tree continues to thrive and even produced its first flower in 2011.

In 2015, students in Winnipeg, Canada, successfully grew an ancient variety of squash from seeds that had been discovered in an archaeological dig on First Nations land. The seeds are thought to be about 800 years old, and the school plans to continue saving the seeds from every squash they grow so the variety never goes extinct again.

Barrett thinks a similar approach should be taken in the case of the northern China discovery.

"In terms of actually figuring out what [plant] this is, the first suggestion I'd have would be to stick the seeds in soil and see if it germinates — that's the easy route," he said.

"But there are other routes you could take," Barrett continued. "For example, having a plant expert in this particular group identify the seeds based on their morphology, or in other words, their shape. The last option would be to grind some [seeds] up and try to sequence some DNA from them, if that's even possible."

Read more at Discovery News

Feb 18, 2017

Big improvement to brain-computer interface

When people suffer spinal cord injuries and lose mobility in their limbs, it's a neural signal processing problem. The brain can still send clear electrical impulses and the limbs can still receive them, but the signal gets lost in the damaged spinal cord.

The Center for Sensorimotor Neural Engineering (CSNE) -- a collaboration of San Diego State University with the University of Washington and the Massachusetts Institute of Technology -- is working on an implantable brain chip that can record neural electrical signals and transmit them to receivers in the limb, bypassing the damage and restoring movement. In a study published recently in the journal Scientific Reports, the researchers described a critical improvement to the technology that could make it more durable, last longer in the body and transmit clearer, stronger signals.

The technology, known as a brain-computer interface, records and transmits signals through electrodes, which are tiny pieces of material that read signals from brain chemicals known as neurotransmitters. By recording brain signals at the moment a person intends to make some movement, the interface learns the relevant electrical signal pattern and can transmit that pattern to the limb's nerves, or even to a prosthetic limb, restoring mobility and motor function.
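
To make that decode-and-route idea concrete, here is a toy sketch. It is not the CSNE's actual algorithm: the recordings are synthetic, and a plain linear classifier stands in for whatever decoder a real interface would use.

```python
# Toy brain-computer interface decoder: learn which intended movement a pattern
# of electrode activity corresponds to, then forward the decoded intent as a
# command. Synthetic data only; not the CSNE's implementation.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_channels, n_trials = 16, 200

# Fake "recordings": mean activity on 16 electrodes per trial, labeled with the
# movement the person intended (0 = open hand, 1 = close hand).
labels = rng.integers(0, 2, size=n_trials)
signals = rng.normal(size=(n_trials, n_channels)) + 0.8 * labels[:, None]

decoder = LogisticRegression().fit(signals, labels)

def route_intent(window):
    """Decode one window of electrode activity and emit a stimulation command."""
    intent = decoder.predict(window.reshape(1, -1))[0]
    return "stimulate extensors" if intent == 0 else "stimulate flexors"

# A new window resembling the "close hand" pattern should route accordingly.
print(route_intent(rng.normal(size=n_channels) + 0.8))
```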

The current state-of-the-art material for electrodes in these devices is thin-film platinum. The problem is that these electrodes can fracture and fall apart over time, said one of the study's lead investigators, Sam Kassegne, deputy director for the CSNE at SDSU and a professor in the mechanical engineering department.

Kassegne and colleagues developed electrodes made out of glassy carbon, a form of carbon. This material is about 10 times smoother than granular thin-film platinum, meaning it corrodes less easily under electrical stimulation and lasts much longer than platinum or other metal electrodes.

"Glassy carbon is much more promising for reading signals directly from neurotransmitters," Kassegne said. "You get about twice as much signal-to-noise. It's a much clearer signal and easier to interpret."

The glassy carbon electrodes are fabricated on the SDSU campus. The process involves patterning a liquid polymer into the correct shape, then heating it to 1,000 degrees Celsius, causing it to become glassy and electrically conductive. Once the electrodes are cooked and cooled, they are incorporated into chips that read and transmit signals from the brain and to the nerves.

Researchers in Kassegne's lab are using these new and improved brain-computer interfaces to record neural signals both along the brain's cortical surface and from inside the brain at the same time. "If you record from deeper in the brain, you can record from single neurons," explained Elisa Castagnola, one of the researchers. "On the surface, you can record from clusters. This combination gives you a better understanding of the complex nature of brain signaling."

Mieko Hirabayashi, a doctoral student in Kassegne's lab, is exploring a slightly different application of this technology. She's working with rats to find out whether precisely calibrated electrical stimulation can cause new neural growth within the spinal cord. The hope is that this stimulation could encourage new neural cells to grow and replace damaged spinal cord tissue in humans. The new glassy carbon electrodes will allow her to stimulate the spinal cord, read its electrical signals and detect the presence of neurotransmitters better than ever before.

Read more at Science Daily

Liquid Metal Circuits and Atomic Microchips Could Be the Future of Electronics

In recent years, scientists have been very interested indeed in the concept of two-dimensional materials, sometimes called 2D materials or single-layer materials. As the name suggests, these are structures so thin — down to a single layer of atoms — that they've functionally abandoned the third dimension altogether.

The single-layer variant of carbon known as graphene is the rock star of this particular class of materials, which chemical engineers hope will power the next generation of super-small semiconductors. The tricky part is getting these atomically skinny two-dimensional materials to "plug in" to traditional three-dimensional manufacturing systems.

News out of Australia this week is pointing things in an interesting direction by incorporating liquid metals and a kind of nanoscale version of rust.

Research published today in the journal Nature Communications describes a new technique for creating integrated circuits that are just a few atoms in thickness. The process could potentially allow microchip companies to manufacture circuit wafers as thin as 1.5 nanometers. How skinny is that? Pretty skinny. Consider that a standard sheet of paper is about 100,000 nanometers thick.

"This is a truly revolutionary development," said lead researcher Kourosh Kalantar-zadeh, in an email exchange from his offices at the Royal Melbourne Institute of Technology in Australia. "Our idea will be one of the first steps toward translation of the 2D world into real electronic technologies."

The specifics get complicated indeed — quantum physics is involved — but the essential gist is this: The new technique leverages certain atomic properties of metals with a relatively low melting point — gallium and indium, if you're keeping score at home. These metals naturally form a thin layer of oxide on their surface when in an oxygenated environment. This oxide, a kind of nanoscale variation of rust, can then be transferred onto a pre-treated electronic wafer, creating individual transistors.

"We use nature itself to form atomically thin, self-limiting oxides with no extra manipulation," Kalantar-zadeh said. "It is the force of nature that produces them perfectly and with no ripples and steps. Because the technology comes from the simplest observation in nature, it will impact technologies very rapidly as it is simple, understandable, and easy to implement."

And just in time, too, according to Kalantar-zadeh, who believes the process represents the next big advance for electronics.

"The fundamental technology of car engines has not progressed since 1920 and now the same is happening to electronics," he said in a statement accompanying the research publication. "Mobile phones and computers are no more powerful than five years ago.

"That is why this new 2D printing technique is so important — creating many layers of incredibly thin electronic chips on the same surface dramatically increases processing power and reduces costs. It will allow for the next revolution in electronics."

Read more at Discovery News

Feb 17, 2017

New method uses heat flow to levitate variety of objects

Although scientists have been able to levitate specific types of material, a pair of UChicago undergraduate physics students helped take the science to a new level.

Third-year Frankie Fung and fourth-year Mykhaylo Usatyuk led a team of UChicago researchers who demonstrated how to levitate a variety of objects -- ceramic and polyethylene spheres, glass bubbles, ice particles, lint strands and thistle seeds -- between a warm plate and a cold plate in a vacuum chamber.

"They made lots of intriguing observations that blew my mind," said Cheng Chin, professor of physics, whose ultracold lab in the Gordon Center for Integrative Science was home to the experiments.


In their work, researchers achieved a number of levitation breakthroughs, in terms of duration, orientation and method: The levitation lasted for more than an hour, as opposed to a few minutes; stability was achieved radially and vertically, as opposed to just vertically; and it used a temperature gradient rather than light or a magnetic field. Their findings appeared Jan. 20 in Applied Physics Letters.

"Magnetic levitation only works on magnetic particles, and optical levitation only works on objects that can be polarized by light, but with our first-of-its-kind method, we demonstrate a method to levitate generic objects," said Chin.

In the experiment, the bottom copper plate was kept at room temperature while a stainless steel cylinder filled with liquid nitrogen served as the top plate. The upward flow of heat from the warm to the cold plate kept the particles suspended indefinitely.

"The large temperature gradient leads to a force that balances gravity and results in stable levitation," said Fung, the study's lead author. "We managed to quantify the thermophoretic force and found reasonable agreement with what is predicted by theory. This will allow us to explore the possibilities of levitating different types of objects." (Thermophoresis refers to the movement of particles by means of a temperature gradient.)

"Our increased understanding of the thermophoretic force will help us investigate the interactions and binding affinities between the particles we observed," said Usatyuk, a study co-author. "We are excited about the future research directions we can follow with our system."

The key to obtaining high levitation stability is the geometrical design of the two plates. A proper ratio of their sizes and vertical spacing allows the warm air to flow around and efficiently capture the levitated objects when they drift away from the center. Another sensitivity factor is that the thermal gradient needs to be pointing upward -- even a misalignment of one degree will greatly reduce the levitation stability.
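
A back-of-the-envelope way to read the force balance Fung describes (this is textbook reasoning, not the model used in the paper): levitation requires the upward thermophoretic force to equal the particle's weight, and in the low-pressure, free-molecule limit the thermophoretic force scales with the particle's cross-section and the temperature gradient. Here R is the particle radius, ρ its density, κ_tr the gas's translational thermal conductivity and v̄ the mean thermal speed of the gas molecules.

```latex
% Back-of-the-envelope balance; a Waldmann-type free-molecule scaling is assumed.
F_{\mathrm{th}} = m g = \tfrac{4}{3}\pi R^{3}\rho\, g,
\qquad
F_{\mathrm{th}} \propto \frac{R^{2}\,\kappa_{\mathrm{tr}}}{\bar{v}}\,\nabla T
\quad\Longrightarrow\quad
\nabla T_{\mathrm{required}} \propto R\,\rho .
```

In this reading, larger or denser particles need steeper gradients, which is consistent with the researchers' observation that different particles require fine adjustment of the parameters.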

"Only within a narrow range of pressure, temperature gradient and plate geometric factors can we reach stable and long levitation," Chin said. "Different particles also require fine adjustment of the parameters."

The apparatus offers a new ground-based platform to investigate the dynamics of astrophysical, chemical and biological systems in a microgravity environment, according to the researchers.

Levitation of macroscopic particles in a vacuum is of particular interest due to its wide applications in space, atmospheric and astro-chemical research. And thermophoresis has been utilized in aerosol thermal precipitators, nuclear reactor safety and the manufacturing of optical fibers through vacuum deposition processes, which apply progressive layers of atoms or molecules during fabrication.

The new method is significant because it offers a new approach to manipulating small objects without contacting or contaminating them, said Thomas Witten, the Homer J. Livingston Professor Emeritus of Physics. "It offers new avenues for mass assembly of tiny parts for micro-electro-mechanical systems, for example, and to measure small forces within such systems.

"Also, it forces us to re-examine how 'driven gases,' such as gases driven by heat flow, can differ from ordinary gases," he added. "Driven gases hold promise to create new forms of interaction between suspended particles."

Read more at Science Daily

Scarcity of resources led to violence in prehistoric central California

A longtime Cal Poly Pomona anthropology professor who studies violence among prehistoric people in California has been published in a prestigious journal.

Professor Mark Allen's study, titled "Resource scarcity drives lethal aggression among prehistoric hunter-gatherers in central California," was published in the Proceedings of the National Academy of Sciences of the United States of America, one of the top journals highlighting the general sciences. Allen teamed up with professors at U.C. Davis, the University of Utah and Cal Poly San Luis Obispo, as well as an archeologist for the Sacramento-based Millennia Archeological Consulting.

"You have to have something significant," Allen says of what it takes to be published in the journal. "You have to have good evidence. As archeologists, you don't get the data you want most of the time. We are typically dealing with fragmented evidence."

Allen says there are two views related to the origins of violence and warfare in humans -- one view that humans in earlier times were peaceful and lived in harmony and a second view that there has always been competition for resources, war and violence.

The second view was confirmed in Allen's study. Using an archeological database of burial remains from thousands of central California inhabitants going back more than 1,000 years, Allen and his fellow researchers looked at the wound marks from the physical traumas they suffered. They also compared that evidence to the environment and looked at the way the communities were socially organized, he says.

They found that California had the highest population density in all of North America, with lots of small groups living in close proximity. There were approximately 100 different languages spoken in California at the time. The data showed how scarcity of resources and violence correlate.

"When people are stressed out and worried about protecting the group, they are willing to be aggressive," he says. "Violence is about resources for the group."

The data related to the remains showed that about 7 percent of the population at that time had evidence of violent trauma, whether they were shot with an arrow, stabbed or bludgeoned. For females it was 5 percent and for males it was 11 percent, a rate of violent trauma not reached even during World War II, Allen says.
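
For readers curious how a prevalence figure like that is tallied, here is a minimal sketch of the bookkeeping. The records below are invented; the actual study worked from thousands of documented burials.

```python
# Toy tally of violent-trauma prevalence by sex from burial records.
# The records are invented placeholders, not data from the study.

burials = [
    # (sex, skeleton shows evidence of violent trauma)
    ("male", True), ("male", False), ("female", False),
    ("female", True), ("male", False), ("female", False),
]

def trauma_rate(records, sex=None):
    """Fraction of individuals (optionally restricted to one sex) with trauma."""
    subset = [trauma for s, trauma in records if sex is None or s == sex]
    return sum(subset) / len(subset)

print(f"overall: {trauma_rate(burials):.0%}")
print(f"female:  {trauma_rate(burials, 'female'):.0%}")
print(f"male:    {trauma_rate(burials, 'male'):.0%}")
```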

Allen, who teaches North American and California archeology, says that his research on the origins of violence and warfare speaks to what is happening in modern times.

Read more at Science Daily