Mar 24, 2017

Milky Way-like galaxies in early universe embedded in 'super halos'

Artist's impression of a progenitor of a Milky Way-like galaxy in the early universe, with a background quasar shining through a 'super halo' of hydrogen gas surrounding the galaxy. New ALMA observations of two such galaxies reveal that these vast halos extend well beyond the galaxies' dusty, star-forming disks. The galaxies were initially found through the absorption of background quasar light as it passed through them. ALMA was able to image the ionized carbon in the galaxies' disks, revealing crucial details about their structures.
By harnessing the extreme sensitivity of the Atacama Large Millimeter/submillimeter Array (ALMA), astronomers have directly observed a pair of Milky Way-like galaxies seen when the universe was only eight percent of its current age. These progenitors of today's giant spiral galaxies are surrounded by "super halos" of hydrogen gas that extend many tens of thousands of light-years beyond their dusty, star-filled disks.

Astronomers initially detected these galaxies by studying the intense light from even-more-distant quasars. As this light travels through an intervening galaxy on its way to Earth, it can pick up the unique spectral signature from the galaxy's gas. This technique, however, normally prevents astronomers from seeing the actual light emitted by the galaxy, which is overwhelmed by the much brighter emission from the background quasar.

"Imagine a tiny firefly next to a high-power search light. That's what astronomers are up against when it comes to observing these youthful versions of our home galaxy," said Marcel Neeleman a postdoctoral fellow at the University of California, Santa Cruz, and lead author on a paper appearing in the journal Science. "We can now see the galaxies themselves, which gives us an amazing opportunity to learn about the earliest history of our own galaxy and others like it."

With ALMA, the astronomers were finally able to observe the natural millimeter-wavelength "glow" emitted by ionized carbon in the dense and dusty star-forming regions of the galaxies. This carbon signature, however, is considerably offset from the gas first detected by quasar absorption. This extreme separation indicates that the galaxies' gas content extends well beyond their star-filled disks, suggesting that each galaxy is embedded in a monstrous halo of hydrogen gas.

"We had expected we would see faint emissions right on top of the quasar, and instead we saw strong bright carbon emission from the galaxies at large separations from their background quasars," said J. Xavier Prochaska, professor of astronomy and astrophysics at UC Santa Cruz and coauthor of the paper. The separation from the quasar to the observed galaxy is about 137,000 light-years for one galaxy and about 59,000 light-years for the other.

According to the researchers, the neutral hydrogen gas revealed by its absorption of quasar light is most likely part of a large halo or perhaps an extended disk of gas around the galaxy. "It's not where the star formation is, and to see so much gas that far from the star-forming region means there is a large amount of neutral hydrogen around the galaxy," Neeleman said.

The new ALMA data show that these young galaxies are already rotating, which is one of the hallmarks of the massive spiral galaxies we see in the universe today. The ALMA observations further reveal that both galaxies are forming stars at moderately high rates: more than 100 solar masses per year in one galaxy and about 25 solar masses per year in the other.

"These galaxies appear to be massive, dusty, and rapidly star-forming systems, with large, extended layers of gas," Prochaska said.

"ALMA has solved a decades-old question on galaxy formation," said Chris Carilli, an astronomer with the National Radio Astronomy Observatory in Socorro, N.M., and co-author on the paper. "We now know that at least some very early galaxies have halos that are much more extended that previously considered, which may represent the future material for galaxy growth."

Read more at Science Daily

Gravitational wave kicks monster black hole out of galactic core

This image, taken by NASA's Hubble Space Telescope, reveals an unusual sight: a runaway quasar fleeing from its galaxy's central hub. A quasar is the visible, energetic signature of a black hole. Black holes cannot be observed directly, but they are the energy source at the heart of quasars -- intense, compact gushers of radiation that can outshine an entire galaxy. The green dotted line marks the visible periphery of the galaxy. The quasar, named 3C 186, appears as a bright star just off-center. The quasar and its host galaxy reside 8 billion light-years from Earth. Researchers estimate that it took the equivalent energy of 100 million supernovas exploding simultaneously to jettison the black hole. The most plausible explanation for this propulsive energy is that the monster object was given a kick by gravitational waves unleashed by the merger of two hefty black holes at the center of the host galaxy. The Hubble image combines visible and near-infrared light taken by the Wide Field Camera 3.
Astronomers have uncovered a supermassive black hole that has been propelled out of the center of a distant galaxy by what could be the awesome power of gravitational waves.

Though there have been several other suspected, similarly booted black holes elsewhere, none has been confirmed so far. Astronomers think this object, detected by NASA's Hubble Space Telescope, is a very strong case. Weighing more than 1 billion suns, the rogue black hole is the most massive black hole ever detected to have been kicked out of its central home.

Researchers estimate that it took the equivalent energy of 100 million supernovas exploding simultaneously to jettison the black hole. The most plausible explanation for this propulsive energy is that the monster object was given a kick by gravitational waves unleashed by the merger of two hefty black holes at the center of the host galaxy.

First predicted by Albert Einstein, gravitational waves are ripples in space that are created when two massive objects collide. The ripples are similar to the concentric circles produced when a hefty rock is thrown into a pond. Last year, the Laser Interferometer Gravitational-Wave Observatory (LIGO) helped astronomers prove that gravitational waves exist by detecting them emanating from the union of two stellar-mass black holes, which are several times more massive than the sun.

Hubble's observations of the wayward black hole surprised the research team. "When I first saw this, I thought we were seeing something very peculiar," said team leader Marco Chiaberge of the Space Telescope Science Institute (STScI) and Johns Hopkins University, in Baltimore, Maryland. "When we combined observations from Hubble, the Chandra X-ray Observatory, and the Sloan Digital Sky Survey, it all pointed towards the same scenario. The amount of data we collected, from X-rays to ultraviolet to near-infrared light, is definitely larger than for any of the other candidate rogue black holes."

Chiaberge's paper will appear in the March 30 issue of Astronomy & Astrophysics.

Hubble images taken in visible and near-infrared light provided the first clue that the galaxy was unusual. The images revealed a bright quasar, the energetic signature of a black hole, residing far from the galactic core. Black holes cannot be observed directly, but they are the energy source at the heart of quasars -- intense, compact gushers of radiation that can outshine an entire galaxy. The quasar, named 3C 186, and its host galaxy reside 8 billion light-years away in a galaxy cluster. The team discovered the galaxy's peculiar features while conducting a Hubble survey of distant galaxies unleashing powerful blasts of radiation in the throes of galaxy mergers.

"I was anticipating seeing a lot of merging galaxies, and I was expecting to see messy host galaxies around the quasars, but I wasn't really expecting to see a quasar that was clearly offset from the core of a regularly shaped galaxy," Chiaberge recalled. "Black holes reside in the center of galaxies, so it's unusual to see a quasar not in the center."

The team calculated the black hole's distance from the core by comparing the distribution of starlight in the host galaxy with that of a normal elliptical galaxy from a computer model. The black hole had traveled more than 35,000 light-years from the center, which is more than the distance between the sun and the center of the Milky Way.

Based on spectroscopic observations taken by Hubble and the Sloan survey, the researchers estimated the black hole's mass and measured the speed of gas trapped near the behemoth object. Spectroscopy divides light into its component colors, which can be used to measure velocities in space. "To our surprise, we discovered that the gas around the black hole was flying away from the galaxy's center at 4.7 million miles an hour," said team member Justin Ely of STScI. This measurement is also a gauge of the black hole's velocity, because the gas is gravitationally locked to the monster object.

The astronomers calculated that the black hole is moving so fast it would travel from Earth to the moon in three minutes. That's fast enough for the black hole to escape the galaxy in 20 million years and roam through the universe forever.
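
The quoted figures are easy to sanity-check with back-of-the-envelope arithmetic (the values below use standard constants, not numbers taken from the paper):

```python
# Convert the quoted 4.7 million mph to km/s, then check the Earth-to-moon
# travel time and the distance covered in 20 million years.
speed_kms = 4.7e6 * 1.609344 / 3600            # ~2,100 km/s
moon_km = 384_400                              # mean Earth-moon distance, km
print(f"Earth to moon: {moon_km / speed_kms / 60:.1f} minutes")        # ~3 minutes

km_per_ly = 9.4607e12                          # kilometres per light-year
dist_ly = speed_kms * 20e6 * 3.156e7 / km_per_ly
print(f"Distance covered in 20 Myr: ~{dist_ly:,.0f} light-years")      # ~140,000 ly
```

Roughly 140,000 light-years in 20 million years is indeed far enough to leave a typical galaxy behind, consistent with the escape claim above.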

The Hubble image revealed an interesting clue that helped explain the black hole's wayward location. The host galaxy has faint arc-shaped features called tidal tails, produced by a gravitational tug between two colliding galaxies. This evidence suggests a possible union between the 3C 186 system and another galaxy, each with central, massive black holes that may have eventually merged.

Based on this visible evidence, along with theoretical work, the researchers developed a scenario to describe how the behemoth black hole could be expelled from its central home. According to their theory, two galaxies merge, and their black holes settle into the center of the newly formed elliptical galaxy. As the black holes whirl around each other, gravitational waves are flung out like water from a lawn sprinkler. The hefty objects move closer to each other over time as they radiate away gravitational energy. If the two black holes do not have the same mass and rotation rate, they emit gravitational waves more strongly along one direction. When the two black holes collide, they stop producing gravitational waves. The newly merged black hole then recoils in the opposite direction of the strongest gravitational waves and shoots off like a rocket.

The researchers are lucky to have caught this unique event because not every black-hole merger produces imbalanced gravitational waves that propel a black hole in the opposite direction. "This asymmetry depends on properties such as the mass and the relative orientation of the black holes' rotation axes before the merger," said team member Colin Norman of STScI and Johns Hopkins University. "That's why these objects are so rare."

An alternative explanation for the offset quasar, although unlikely, proposes that the bright object does not reside within the galaxy. Instead, the quasar is located behind the galaxy, but the Hubble image gives the illusion that it is at the same distance as the galaxy. If this were the case, the researchers should have detected a galaxy in the background hosting the quasar.

Read more at Science Daily

Computer program developed to diagnose and locate cancer from a blood sample

DNA from tumour cells is known to end up in the bloodstream in the earliest stages of cancer so offers a unique target for early detection of the disease.
Researchers in the United States have developed a computer program that can simultaneously detect cancer and identify where in the body the cancer is located, from a patient's blood sample. The program is described in research published this week in the open access journal Genome Biology.

Professor Jasmine Zhou, co-lead author from the University of California at Los Angeles, said: "Non-invasive diagnosis of cancer is important, as it allows the early diagnosis of cancer, and the earlier the cancer is caught, the higher chance a patient has of beating the disease. We have developed a computer-driven test that can detect cancer, and also identify the type of cancer, from a single blood sample. The technology is in its infancy and requires further validation, but the potential benefits to patients are huge."

The program works by looking for specific molecular patterns in cancer DNA that is free-floating in the patients' blood and comparing the patterns against a database of tumour epigenetics from different cancer types, collated by the authors. DNA from tumour cells is known to end up in the bloodstream in the earliest stages of cancer, so it offers a unique target for early detection of the disease.

Professor Zhou explained: "We built a database of epigenetic markers, specifically methylation patterns, which are common across many types of cancer and also specific to cancers originating from specific tissue, such as the lung or liver. We also compiled the same 'molecular footprint' for non-cancerous samples so we had a baseline footprint to compare the cancer samples against. These markers can be used to deconvolute the DNA found freely in the blood into tumor DNA and non-tumor DNA."
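
The "deconvolution" step Zhou describes is, at heart, a mixture problem: the observed cell-free DNA methylation signal is modelled as a weighted blend of reference profiles. Below is a minimal sketch of that idea with invented reference profiles and mixing proportions; the paper's actual probabilistic model is more sophisticated.

```python
# Toy reference-based deconvolution of a cell-free DNA methylation signal.
import numpy as np
from scipy.optimize import nnls

# Rows: methylation markers; columns: reference profiles
# (e.g. liver tumour, lung tumour, normal blood). Values are illustrative.
refs = np.array([
    [0.9, 0.1, 0.2],
    [0.8, 0.7, 0.1],
    [0.1, 0.9, 0.3],
    [0.2, 0.2, 0.9],
])
true_mix = np.array([0.15, 0.05, 0.80])   # 20% tumour-derived DNA, 80% normal
observed = refs @ true_mix                # simulated blood-sample signal

weights, _ = nnls(refs, observed)         # non-negative least squares fit
print(weights / weights.sum())            # recovered mixing proportions
```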

In this study, the new computer program and two other methods (called Random Forest and Support Vector Machine) were tested with blood samples from 29 liver cancer patients, 12 lung cancer patients and 5 breast cancer patients. Tests were run 10 times on each sample to validate the results. The Random Forest and Support Vector Machine methods had an overall error rate (the chance that the test produces a false positive) of 0.646 and 0.604 respectively, while the new program obtained a lower error rate of 0.265.
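
For readers unfamiliar with the baseline methods, the comparison amounts to repeatedly training and scoring off-the-shelf classifiers. The schematic below uses synthetic data and scikit-learn defaults; it does not reproduce the paper's samples, features, error-rate definition, or the authors' own method.

```python
# Compare generic Random Forest and SVM classifiers with cross-validation.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# 46 synthetic "patients" across three cancer types, 50 features each
X, y = make_classification(n_samples=46, n_features=50, n_informative=10,
                           n_classes=3, n_clusters_per_class=1, random_state=0)
for name, clf in [("Random Forest", RandomForestClassifier(random_state=0)),
                  ("SVM", SVC())]:
    scores = cross_val_score(clf, X, y, cv=5)              # accuracy per fold
    print(f"{name}: error rate ~ {1 - scores.mean():.3f}")
```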

Twenty-five out of the 29 liver cancer patients and 5 out of the 12 lung cancer patients tested in this study had early-stage cancers, which the program was able to detect in 80% of cases. Although the level of tumour DNA present in the blood is much lower during the early stages of these cancers, the program was still able to make a diagnosis, demonstrating the potential of this method for the early detection of cancer, according to the researchers.

Read more at Science Daily

Astronomers identify purest, most massive brown dwarf

An artist's impression of the new pure and massive brown dwarf.
An international team of astronomers has identified a record-breaking brown dwarf (an object too small to sustain hydrogen fusion) with the 'purest' composition and the highest mass yet known. The object, known as SDSS J0104+1535, is a member of the so-called halo -- the outermost reaches -- of our Galaxy, made up of the most ancient stars. The scientists report the discovery in Monthly Notices of the Royal Astronomical Society.

Brown dwarfs are intermediate between planets and fully-fledged stars. Their mass is too small for full nuclear fusion of hydrogen to helium (with a consequent release of energy) to take place, but they are usually significantly more massive than planets.

Located 750 light years away in the constellation of Pisces, SDSS J0104+1535 is made of gas that is around 250 times purer than the Sun, so it consists of more than 99.99% hydrogen and helium. It is estimated to have formed about 10 billion years ago, and measurements suggest it has a mass equivalent to 90 times that of Jupiter, making it the most massive brown dwarf found to date.
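
A quick consistency check on the purity figure, assuming a solar metal mass fraction of roughly Z☉ ≈ 0.014 (the paper may adopt a slightly different value):

```latex
Z \approx \frac{Z_\odot}{250} \approx \frac{0.014}{250} \approx 5.6\times10^{-5},
\qquad
1 - Z \approx 99.994\%\ \text{hydrogen and helium by mass},
```

which matches the "more than 99.99%" figure quoted above.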

It was previously not known if brown dwarfs could form from such primordial gas, and the discovery points the way to a larger undiscovered population of extremely pure brown dwarfs from our Galaxy's ancient past.

The research team was led by Dr ZengHua Zhang of the Institute of Astrophysics in the Canary Islands. He said: "We really didn't expect to see brown dwarfs that are this pure. Having found one though often suggests a much larger hitherto undiscovered population -- I'd be very surprised if there aren't many more similar objects out there waiting to be found."

SDSS J0104+1535 has been classified as an L type ultra-subdwarf using its optical and near-infrared spectrum, measured using the European Southern Observatory's "Very Large Telescope" (VLT). This classification was based on a scheme very recently established by Dr Zhang.

From Science Daily

New portal to unveil the dark sector of the universe

Portals mix or connect dark sector particles with Standard Model particles, making it possible to explore the dark sector using the Standard Model particles we already know. Portals play a basic and critical role in the study of dark sector particles, both theoretically and experimentally.
Once upon a time, the Universe was just a hot soup of particles. In those days, alongside the visible particles, other particles hidden or dark to us might have formed. Billions of years later, scientists have catalogued 17 types of visible particles, the most recent being the Higgs boson, which together make up the 'Standard Model'. However, they are still struggling to detect the hidden particles, the ones that constitute the dark sector of the Universe.

Scientists at the Center for Theoretical Physics of the Universe, within the Institute for Basic Science (IBS), have proposed a hypothetical portal that connects two possible dark sector particles; their research could open a new perspective on our murky understanding of the dark sector. Published in Physical Review Letters, this study has implications for cosmology and astroparticle physics.

Physicists have plenty of ideas about what these dark sector particles might look like. One candidate is the axion, a very light particle that could solve some theoretical problems of the Standard Model. Another candidate is the dark photon: a very light particle that shares some properties with one of the Standard Model particles, the photon, the constituent of visible light. However, while photons couple to the electromagnetic charge, dark photons couple to a so-called dark charge, which may be carried by other dark sector particles.

Physicists believe that the dark sector communicates with the Standard Model via portals. For example, a vector portal would allow mixing between photons and dark photons, and an axion portal connects axions and photons. Physicists have identified only a handful of possible portals, and each is a major tool in theoretical and experimental searches for dark sector particles. A team of IBS scientists hypothesized the existence of a new portal, which they named the "dark axion portal", connecting dark photons and axions.

The central idea of the dark axion portal is based on the observation that new heavy quarks may also carry a dark charge that couples to the dark photon. Through these heavy quarks, the axion, photon, and dark photon can interact with one another.
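
Schematically, and up to sign and normalisation conventions (this is an illustrative summary, not the paper's exact notation), the portals discussed above couple the photon field strength F, the dark photon field strength F', and the axion a as:

```latex
\mathcal{L} \;\supset\;
\underbrace{-\tfrac{\varepsilon}{2}\, F_{\mu\nu} F'^{\mu\nu}}_{\text{vector portal}}
\;+\;
\underbrace{\tfrac{g_{a\gamma\gamma}}{4}\, a\, F_{\mu\nu} \tilde{F}^{\mu\nu}}_{\text{axion portal}}
\;+\;
\underbrace{\tfrac{G_{a\gamma\gamma'}}{2}\, a\, F_{\mu\nu} \tilde{F}'^{\mu\nu}}_{\text{dark axion portal}}
```

where the tilde denotes the dual field strength; the last term is the new coupling, generated when heavy quarks carrying both electric and dark charge run in the loop.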

Read more at Science Daily

Mar 23, 2017

Most cancer mutations are due to random DNA copying 'mistakes'

No matter how perfect the environment, random DNA copying errors occur.
Johns Hopkins Kimmel Cancer Center scientists report data from a new study providing evidence that random, unpredictable DNA copying "mistakes" account for nearly two-thirds of the mutations that cause cancer. Their research is grounded in a novel mathematical model based on DNA sequencing and epidemiologic data from around the world.

"It is well-known that we must avoid environmental factors such as smoking to decrease our risk of getting cancer. But it is not as well-known that each time a normal cell divides and copies its DNA to produce two new cells, it makes multiple mistakes," says Cristian Tomasetti, Ph.D., assistant professor of biostatistics at the Johns Hopkins Kimmel Cancer Center and the Johns Hopkins Bloomberg School of Public Health. "These copying mistakes are a potent source of cancer mutations that historically have been scientifically undervalued, and this new work provides the first estimate of the fraction of mutations caused by these mistakes."

"We need to continue to encourage people to avoid environmental agents and lifestyles that increase their risk of developing cancer mutations. However, many people will still develop cancers due to these random DNA copying errors, and better methods to detect all cancers earlier, while they are still curable, are urgently needed," says Bert Vogelstein, M.D., co-director of the Ludwig Center at the Johns Hopkins Kimmel Cancer Center.

Tomasetti and Vogelstein conducted the new study described in a report published March 24 in the journal Science.

The researchers say their conclusions are in accord with epidemiologic studies showing that approximately 40 percent of cancers can be prevented by avoiding unhealthy environments and lifestyles. But among the factors driving the new study, say the researchers, is that cancer often strikes people who follow all the rules of healthy living -- nonsmoker, healthy diet, healthy weight, little or no exposure to known carcinogens -- and have no family history of the disease, prompting the pained question "Why me?"

Tomasetti and Vogelstein believe the answer to this question rests in random DNA copying errors. Current and future efforts to reduce known environmental risk factors, they say, will have major impacts on cancer incidence in the U.S. and abroad. But they say the new study confirms that too little scientific attention is given to early detection strategies that would address the large number of cancers caused by random DNA copying errors.

"These cancers will occur no matter how perfect the environment," says Vogelstein.

In a previous study authored by Tomasetti and Vogelstein in the Jan. 2, 2015, issue of Science, the pair reported that DNA copying errors could explain why certain cancers in the U.S., such as those of the colon, occur more commonly than other cancers, such as brain cancer.

In the new study, the researchers addressed a different question: What fraction of mutations in cancer are due to these DNA copying errors?

To answer this question, the scientists took a close look at the mutations that drive abnormal cell growth among 32 cancer types (Supplemental Materials, Table S6). They developed a new mathematical model using DNA sequencing data from The Cancer Genome Atlas and epidemiologic data from the Cancer Research UK database.

According to the researchers, it generally takes two or more critical gene mutations for cancer to occur. In a person, these mutations can be due to random DNA copying errors, the environment or inherited genes. Knowing this, Tomasetti and Vogelstein used their mathematical model to show, for example, that when critical mutations in pancreatic cancers are added together, 77 percent of them are due to random DNA copying errors, 18 percent to environmental factors, such as smoking, and the remaining 5 percent to heredity.
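
One deliberately simplified way to picture the bookkeeping: each driver mutation in a tumour type is assigned a probability of arising from replication error, environment, or heredity, and the per-cancer percentages are averages over those drivers. The numbers below are invented, not the paper's:

```python
# Toy attribution of driver mutations to replication (R), environment (E),
# heredity (H), averaged over the drivers of one hypothetical cancer type.
drivers = [          # (P_R, P_E, P_H) per driver mutation -- invented values
    (0.80, 0.15, 0.05),
    (0.75, 0.20, 0.05),
    (0.76, 0.19, 0.05),
]
n = len(drivers)
r, e, h = (sum(d[i] for d in drivers) / n for i in range(3))
print(f"replication {r:.0%}, environment {e:.0%}, heredity {h:.0%}")
```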

In other cancer types, such as those of the prostate, brain or bone, more than 95 percent of the mutations are due to random copying errors.

Lung cancer, they note, presents a different picture: 65 percent of all the mutations are due to environmental factors, mostly smoking, and 35 percent are due to DNA copying errors. Inherited factors are not known to play a role in lung cancers.

Looking across all 32 cancer types studied, the researchers estimate that 66 percent of cancer mutations result from copying errors, 29 percent can be attributed to lifestyle or environmental factors, and the remaining 5 percent are inherited.

The scientists say their approach is akin to attempts to sort out why "typos" occur when typing a 20-volume book: being tired while typing, which represents environmental exposures; a stuck or missing key in the keyboard, which represents inherited factors; and other typographical errors that randomly occur, which represent DNA copying errors. "You can reduce your chance of typographical errors by making sure you're not drowsy while typing and that your keyboard isn't missing some keys," says Vogelstein. "But typos will still occur because no one can type perfectly. Similarly, mutations will occur, no matter what your environment is, but you can take steps to minimize those mutations by limiting your exposure to hazardous substances and unhealthy lifestyles."

Tomasetti and Vogelstein's 2015 study sparked vigorous debate among scientists, who argued that the previously published analysis did not include breast or prostate cancers and reflected only cancer incidence in the United States.

However, Tomasetti and Vogelstein now report a similar pattern worldwide, supporting their conclusions. They reasoned that the more cells divide, the higher the potential for so-called copying mistakes in the DNA of cells in an organ. They compared total numbers of stem cell divisions with cancer incidence data collected by the International Agency for Research on Cancer on 423 registries of cancer patients from 68 countries other than the U.S., representing 4.8 billion people, or more than half of the world's population. This time, the researchers were also able to include data from breast and prostate cancers. They found a strong correlation between cancer incidence and normal cell divisions among 17 cancer types, regardless of the countries' environment or stage of economic development.
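
The statistical heart of that worldwide comparison is a correlation between (log-scaled) lifetime stem-cell divisions and cancer incidence across tissue types. Below is a schematic with made-up numbers, just to show the shape of the test:

```python
# Correlate synthetic "stem-cell divisions" with synthetic "cancer incidence"
# on log scales, as in the authors' earlier cross-tissue comparisons.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
log_divisions = rng.uniform(9, 13, size=17)                        # 17 cancer types
log_incidence = 0.5 * log_divisions - 8 + rng.normal(0, 0.3, 17)   # invented trend

r, p = pearsonr(log_divisions, log_incidence)
print(f"correlation r = {r:.2f}, p = {p:.1e}")
```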

Tomasetti says these random DNA copying errors will only get more important as societies face aging populations, prolonging the opportunity for our cells to make more and more DNA copying errors. And because these errors contribute to a large fraction of cancer, Vogelstein says that people with cancer who have avoided known risk factors should be comforted by their findings. "It's not your fault," says Vogelstein. "Nothing you did or didn't do was responsible for your illness."

In addition to Tomasetti and Vogelstein, Lu Li, a doctoral student in Tomasetti's laboratory in the Department of Biostatistics at the Johns Hopkins Bloomberg School of Public Health, also contributed to the research. Funding for the research was provided by the John Templeton Foundation, the Lustgarten Foundation for Pancreatic Cancer Research, the Virginia and D.K. Ludwig Fund for Cancer Research, the Sol Goldman Center for Pancreatic Cancer Research, and the National Institutes of Health's National Cancer Institute (CA006973, CA43460, and CA62924).

Read more at Science Daily

Tracing aromatic molecules in the early Universe

In this study, astronomers used data from the Keck and Spitzer telescopes to trace the star forming and dusty regions of galaxies at about 10 billion years ago. The picture in the background shows the GOODS field, one of the five regions in the sky that was observed for this study.
A molecule found in car engine exhaust fumes that is thought to have contributed to the origin of life on Earth has made astronomers heavily underestimate the number of stars that were forming in the early Universe, a University of California, Riverside-led study has found.

That molecule is called polycyclic aromatic hydrocarbon (PAH). On Earth it is also found in coal and tar. In space, it is a component of dust, which along with gas, fills the space between stars within galaxies.

The study, which was just published in the Astrophysical Journal, represents the first time that astronomers have been able to measure variations of PAH emissions in distant galaxies with different properties. It has important implications for the studies of distant galaxies because absorption and emission of energy by dust particles can change astronomers' views of distant galaxies.

"Despite the ubiquity of PAHs in space, observing them in distant galaxies has been a challenging task," said Irene Shivaei, a graduate student at UC Riverside, and leader of the study. "A significant part of our knowledge of the properties and amounts of PAHs in other galaxies is limited to the nearby universe."

The research was conducted as part of the University of California-based MOSDEF survey, a study that uses the Keck telescope in Hawaii to observe the content of about 1,500 galaxies when the universe was 1.5 to 4.5 billion years old. The researchers observed the emitted visible-light spectra of a large and representative sample of galaxies during the peak-era of star formation activity in the universe.

In addition, the researchers incorporated infrared imaging data from the NASA Spitzer Space Telescope and the European Space Agency-operated Herschel Space Observatory to trace the polycyclic aromatic hydrocarbon emission in mid-infrared bands and the thermal dust emission in far-infrared wavelengths.

The researchers concluded that the emission of polycyclic aromatic hydrocarbon molecules is suppressed in low-mass galaxies, which also have a lower fraction of metals (atoms heavier than hydrogen and helium). These results indicate that polycyclic aromatic hydrocarbon molecules are likely to be destroyed in the hostile environment of low-mass, metal-poor galaxies with intense radiation fields.

The researchers also found that the polycyclic aromatic hydrocarbon emission is relatively weaker in young galaxies compared to older ones, which may be due to the fact that polycyclic aromatic hydrocarbon molecules are not produced in large quantities in young galaxies.

They found that the star-formation activity and infrared luminosity in the universe 10 billion years ago are approximately 30 percent higher than previously measured.

Read more at Science Daily

Scientists reveal hidden structures in bacterial DNA

Quick-freeze deep-etch replica TEM imaging of an M. pneumoniae cell.
DNA contains the instructions for life, encoded within genes. Within all cells, DNA is organised into very long lengths known as chromosomes. In animal and plant cells these are double-ended, like pieces of string or shoelaces, but in bacteria they are circular. Whether stringy or circular, these long chromosomes must be organised and packaged inside a cell so that the genes can be switched on or off when they are required.

Working together with colleagues in Spain, Japan and Australia, researchers led by Luis Serrano, ICREA research professor and leader of the Design of Biological Systems laboratory at the Centre for Genomic Regulation, focused their attention on the organisation of DNA within an organism with an extremely small genome -- the pneumonia pathogen Mycoplasma pneumoniae. Its circular chromosome is five times smaller than that of larger bacteria such as the gut bug E. coli.

Using a technique called Hi-C, which reveals the interactions between different pieces of DNA, the researchers created a three-dimensional 'map' of the Mycoplasma chromosome. They then used super-resolution microscopy to prove that this computer-generated map matched up with the real-life chromosome organisation inside bacterial cells.
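
At its core, Hi-C bookkeeping amounts to binning pairs of linked genomic positions into a symmetric contact matrix, which is then interpreted as a proximity map. Here is a minimal sketch with invented read pairs; real pipelines add mapping, filtering and normalisation, and the genome length below is only approximately that of M. pneumoniae.

```python
# Bin Hi-C read pairs from a circular chromosome into a contact matrix.
import numpy as np

genome_len = 816_000                      # ~M. pneumoniae genome size, bp
bin_size = 10_000
n_bins = -(-genome_len // bin_size)       # ceiling division

# each pair: positions of the two ligated fragments on the chromosome
read_pairs = [(1_200, 5_300), (250_000, 255_500), (10_000, 800_500)]

contacts = np.zeros((n_bins, n_bins), dtype=int)
for p1, p2 in read_pairs:
    i, j = (p1 % genome_len) // bin_size, (p2 % genome_len) // bin_size
    contacts[i, j] += 1
    contacts[j, i] += 1                   # keep the matrix symmetric
print(contacts.sum(), "contacts binned into a", contacts.shape, "matrix")
```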

Notably, the CRG team, which drew on the Mycoplasma expertise of Serrano's laboratory and the collaboration of ICREA research professor Marc Marti-Renom at CNAG-CRG, discovered that Mycoplasma's circular chromosome is consistently organised the same way in all the cells, with a region called the Origin (where DNA copying begins) at one end of the structure and the midpoint of the chromosome located at the opposite end. This is a similar arrangement to that seen in some other, larger bacterial species.

The scientists also used the Hi-C technique to study more detailed patterns of organisation within the Mycoplasma genome. In recent years, scientists all over the world have investigated the organisation of chromosomes inside cells from species ranging from larger bacteria to humans. Next Generation Sequencing has allowed scientists to 'read' the DNA sequence of any genome, but this doesn't reveal how genetic information is managed and organised in the crowded and bustling biological environment inside a cell. Now, new tools have revealed complex organisational structures within the genomes of larger organisms, with certain regions of chromosomes clustered together to form domains containing genes that are switched on or off together.

However, it was thought that these domains would not be found in Mycoplasma, because its genome is so small and it only makes around 20 different DNA binding proteins responsible for organising the chromosome, compared to the hundreds made by other bacterial species.

Intriguingly, the CRG team found that even the tiny Mycoplasma chromosome is organised into distinct structural domains, each containing genes that are also turned on or off in a co-ordinated way.

Marie Trussart, the lead author on the paper, said: "Studying bacteria with such a small genome was a big technical challenge, especially because we were using super-resolution microscopy, and it took us five years to complete the project. We had suspected that the Mycoplasma genome might have a similar overall organisation to other bacteria, but we were completely surprised to find that it was also organised into domains, which can be considered regulatory units of chromatin organisation, and that we had identified a previously unknown layer of gene regulation. This research shows that the organisation and control of genes cannot be understood by just looking at the linear sequence of DNA in the genome. Indeed, to get the full picture of gene regulation we need to look at the three-dimensional organisation of the chromatin that also coordinates gene activity."

The discovery suggests that this level of organisation and genetic control is common to all living cells, from the largest to the smallest, and can be achieved with little more than a handful of DNA binding proteins and the structural properties of the DNA itself.

Read more at Science Daily

A new web of life: First full family tree of the world's spiders

Leucauge venusta suspended from its web.
For the first time, biologists have made a full family tree of the world's spiders, giving us knowledge about venoms that can be useful in medicine. And we might be able to develop silk just as good as the spider's.

They may make you cringe in horror, or they may intrigue you. Some people even keep them as pets.

Regardless of how you judge them, spiders are a plentiful and widespread group of animals. They have been around for 400 million years, number some 45,000 species, and crawl around nearly every terrestrial habitat in the world.

Researchers have long tried to unlock the secrets of their evolutionary history, striking diversity and success.

First of its kind

One team, including Dimitar Dimitrov from the Natural History Museum in Oslo, has taken this task to an unrivalled level, sampling 932 spider species from across the globe, representing all but one of the world's 116 known families.

The spiders of the remaining family are extremely small, and including them was too complicated. But they are not really significant in this context, Dimitrov says, and they will be included in further analyses.

Several gene markers were sequenced for each spider and then compared across species, analyses in which Dimitrov was heavily involved. Simply put, the more similar the genetic code of two species, the more closely they are related.

The team was thereby able to order and place the different spider branches in relation to each other, reconstructing their history through a so-called phylogenetic tree (see fact box).
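
The tree-building logic can be sketched in miniature: compute pairwise distances between aligned marker sequences, then cluster. The toy below uses simple average-linkage clustering on invented ten-base sequences; the actual study used far more data and model-based phylogenetic methods.

```python
# Toy distance-based "tree": pairwise sequence differences + average linkage.
import numpy as np
from scipy.spatial.distance import squareform
from scipy.cluster.hierarchy import linkage, dendrogram

seqs = {"sp_A": "ACGTACGTAC", "sp_B": "ACGTACGTTC",
        "sp_C": "ACGAACCTTC", "sp_D": "TCGAACCTTG"}
names = list(seqs)
n = len(names)

# proportion of differing sites between each pair of aligned sequences
dist = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        a, b = seqs[names[i]], seqs[names[j]]
        dist[i, j] = dist[j, i] = sum(x != y for x, y in zip(a, b)) / len(a)

tree = linkage(squareform(dist), method="average")            # UPGMA-style clustering
print(dendrogram(tree, labels=names, no_plot=True)["ivl"])    # leaf order
```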

"It is the most comprehensive study of spiders' evolution until now," says Dimitrov.

Drivers of diversification

One of the main challenges for understanding spider evolution is the identification of the drivers that have led to spider diversity.

"Our findings are important for understanding how different characters such as webs, vision or venoms have evolved and have affected the diversification of different groups that have these characters. For example why do some families have thousands of species and others just a few? Now that we have a large-scale phylogeny we may actually address this question combining information on traits and natural history with the tree," the entomologist explains.

Far reaching applications

The newly spun web of life not only alters our understanding of spiders, but may also impact disciplines such as material science and medicine, the researchers claim. "Spiders' venoms are exceptionally diverse in terms of their components. Thus, having a large tree of spiders will help us understand how those have evolved. We can also use the tree to predict the venom type of spiders that have not been studied. This is also important for medical applications as some of the venom components are used in the pharmaceutical industry."

Another alluring prospect relates to the manufacture of artificial silk: materials scientists are trying to copy the extreme strength and elasticity of the silk produced by spiders.

"As of now there is no artificial fiber that can match the spider silk properties. In the future," Dimitrov explains, "the research team may supply the current tree with even more species and genomic data, which may further resolve uncertain parts of the tree."

Big picture science


"What I like most about this type of studies is that they provide you with the "big picture," a perspective that is hard to gain otherwise. Yet it is necessary to put more specific studies into a general evolutionary context," Dimitrov explains.

"For example, it is really hard to gain a deep understanding on the evolution of traits if one is looking at a specific trait in just a few species."

Envision two separate species. Both of them only thrive in a harsh and arid environment and happen to look alike. Did they adapt to the arid habitat independently or did they inherit this ability through a common ancestor?

"The two options would imply rather radical differences in our understanding of adaptations to arid environments. If it happened independently, one would suggest that this might be common, while the other would suggest it is as a rate event. If we lack a phylogenetic perspective we cannot really tell which one would be the case."

Read more at Science Daily

Under the Dead Sea, warnings of dire drought

Deep below the seabed, drilling revealed thick layers of salt, precipitated out during past warm, dry periods. In this specimen, transparent crystals (left) formed on what was then the bottom during winter; finer white ones (right) formed on the water surface in summer and later sank.
Nearly 1,000 feet below the bed of the Dead Sea, scientists have found evidence that during past warm periods, the Mideast has suffered drought on scales never recorded by humans -- a possible warning for current times. Thick layers of crystalline salt show that rainfall plummeted to as little as a fifth of modern levels some 120,000 years ago, and again about 10,000 years ago. Today, the region is drying again as climate warms, and scientists say it will get worse. The new findings may cause them to rethink how much worse, in this already thirsty and volatile part of the world.

"All the observations show this region is one of those most affected by modern climate change, and it's predicted to get dryer. What we showed is that even under natural conditions, it can become much drier than predicted by any of our models," said lead author Yael Kiro, a geochemist at Columbia University's Lamont-Doherty Earth Observatory. The findings were just published in an early online edition of the journal Earth and Planetary Science Letters.

The landlocked Dead Sea, straddling Israel, Jordan and Palestinian lands, is earth's lowest spot on land. Its current shoreline lies about 1,300 feet below sea level, and its floor extends down another 900 feet. Fed mainly by the Jordan River drainage, which extends also into Syria and Lebanon, it is a dead end for water, and so is extremely salty; its Biblical name in Hebrew is Yam ha-Melah, the sea of salt. In recent years, its level has dropped about four feet a year. But hot, dry weather is not the main cause yet; rather, booming populations in the region need more water than ever, and people are sucking so much from the watershed that very little reaches the Dead Sea, where evaporation outweighs input.

The U.N. Food and Agriculture Organization estimates that much of the region already has per capita water availability only a tenth of the world average. Rainfall has declined about 10 percent since 1950, and existing climate models say it could sink another 20 percent this century, even as population continues to grow. Israel is meeting demand by desalinating Mediterranean seawater, but poorer, landlocked Jordan and the Palestinian territories are desperate for more. In adjoining Syria, a record 1998-2012 drought likely stoked by climate change is believed to have helped spark the ongoing civil war, which has now claimed more than 500,000 lives and affected neighboring nations.

In 2010, scientists from a half-dozen nations drilled 1,500 feet into the deepest part of the seabed, bringing up a cross section of deposits recording 200,000 years of regional climate history -- the longest such archive in the Mideast. (Around-the-clock drilling went for 40 days and 40 nights -- perhaps a respectful bow to the rainfall of the Biblical Flood.) The cores revealed alternating layers of mud washed in with runoff during wet times, and crystallized salt, precipitated out during dry times when the water receded. This instantly made it clear that the region has suffered epic dry periods, but the core was not analyzed in great detail until now.

The new study shows that the salt accumulated rapidly -- an estimated half-inch per year in many cases. The researchers spotted two striking periods. About halfway down they found salty layers some 300 feet thick, indicating a long-term drop below the sea's current level. This came in a period between ice ages, 115,000 to 130,000 years ago, when variations in Earth's orbit brought temperatures about 4 degrees hotter than those of the 20th century -- equivalent to what is projected for the end of the 21st century. The lake refilled when glaciers readvanced in sub-polar regions and the Mideast climate cooled and became moister. The cores show a similar drop in lake level just 6,000 to 10,000 years ago, following the most recent ice age, when temperatures were probably a bit cooler than now.
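
For a sense of scale, if the half-inch-per-year rate held throughout (the text quotes it only "in many cases"), a 300-foot salt layer would represent roughly

```latex
\frac{300~\text{ft} \times 12~\text{in/ft}}{0.5~\text{in/yr}} \approx 7{,}200~\text{years}
```

of net deposition, which fits comfortably within the roughly 15,000-year interglacial described above.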

The chemistry of tiny fluid bubbles within the salt allowed the researchers to extrapolate rainfall and runoff patterns of these periods. They calculated that runoff to the Dead Sea generally declined 50 to 70 percent compared to today, dwarfing current projections for this century. In the most extreme periods, it went down 80 percent, and this lasted for decades to centuries at a time. The declines are probably linked to broader shifts in atmospheric flow patterns. Storms coming in from the Mediterranean could have slackened, as they appear to be doing today; and then as now, higher temperatures increase evaporation of moisture from the land.

To alleviate growing water shortages, Jordan plans to break ground next year on a canal to bring in water from the Red Sea for desalination; leftover brine would be dumped into the Dead Sea, possibly stabilizing its level. But the project is controversial, because it could cause drastic environmental changes in both seas, and could still leave much of the rest of the region with inadequate water.

Read more at Science Daily