
Posts Tagged ‘Ars Technica’

Nobel Awarded to Researcher Who Redefined Crystalline

05 Oct

By John Timmer, Ars Technica

Yesterday, the Physics Nobel Prize went to a group of researchers who found that what we expected about something as basic as the structure of the universe was wrong. Today, the Chemistry Prize has gone to a lone researcher who overturned something even more basic: His discovery of what’s now termed a quasicrystal actually triggered the redefinition of what a crystalline solid is.

It’s easy to find a representation of a typical crystal in any chemistry textbook, which will typically show an orderly arrangement of atoms, spreading out to infinity. These crystals, found as close as the nearest salt shaker, look the same no matter which direction you view them from. There are a limited number of ways to build something with that sort of symmetry, and chemists had pretty much figured they had identified all of them. In fact, the International Union of Crystallography had defined a crystal as, “a substance in which the constituent atoms, molecules, or ions are packed in a regularly ordered, repeating three-dimensional pattern.”

Enter Israel’s Daniel Shechtman, who was working with a rapidly cooled aluminum alloy with about 10 to 15 percent manganese mixed in. Shechtman put his sample under an electron microscope to generate a diffraction pattern, in which electrons are bounced off the atoms in an orderly crystal structure, creating a bunch of bright and dark regions that tell us about the positions of the atoms themselves. The diffraction pattern Shechtman saw, shown above, didn’t make any sense — it showed a tenfold symmetry, something that any chemist, including Shechtman, would know was impossible.
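The impossibility has a crisp mathematical basis: the crystallographic restriction theorem says a periodic lattice can only accommodate a rotation by 2π/n if the rotation matrix’s trace, 2·cos(2π/n), is an integer, which limits n to 1, 2, 3, 4, and 6. A minimal numerical sketch of this (an illustration, not anything from the article):

```python
import math

# Crystallographic restriction: a rotation by 2*pi/n can map a periodic
# lattice onto itself only if its matrix trace, 2*cos(2*pi/n), is an integer.
def lattice_compatible(n):
    trace = 2 * math.cos(2 * math.pi / n)
    return abs(trace - round(trace)) < 1e-9

allowed = [n for n in range(1, 13) if lattice_compatible(n)]
print(allowed)                    # [1, 2, 3, 4, 6]
print(lattice_compatible(10))     # False -- tenfold symmetry is forbidden
```

Fivefold and tenfold rotations fail the integer-trace test (2·cos(36°) is the golden ratio, about 1.618), which is exactly why Shechtman’s diffraction pattern looked impossible for a periodic crystal.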

The question marks in Shechtman's notebook. Image: Iowa State University

In fact, his notebook, which still survives, has three question marks next to the point where he noted the sample’s tenfold symmetry.

His boss apparently thought he had lost it and, according to the Nobel’s press information, bought Shechtman a crystallography handbook to tell him what he already knew. But Shechtman was persistent, and sent his data to others in the field, some of whom took it seriously.

Fortunately, there was some precedent for the sorts of patterns he was seeing. Mathematicians had studied medieval Islamic tilings whose ordered patterns never exactly repeat, and had developed methods of describing them. Penrose tiling (named after Roger Penrose, the British mathematician who devised it) could also be used to describe the sorts of patterns Shechtman was seeing in his crystals.

Despite the mathematical backing, Shechtman’s first publication on the topic met fierce resistance from some in the crystallography community, including Nobel Laureate Linus Pauling. What gradually won the day for him was the fact that other researchers were able to quickly publish related quasicrystalline structures — some of them may have actually seen this years earlier, but didn’t know what to make of the data, so they left it in the file drawer.

Enough labs published results that it became impossible to contend that they all needed a remedial trip to a crystallography textbook, and the consensus in the field went in Shechtman’s favor. Eventually, the International Union of Crystallography even changed its definition of a crystal to accommodate what was once thought to be impossible. And, more recently, researchers have even described a naturally occurring quasicrystal.

The Nobel Prize literature cites a number of interesting properties of these substances that might eventually be turned into useful materials. Quasicrystals, even purely metallic ones, tend to be very hard (although prone to fracturing). Their unusual structures make them poor conductors of heat and electricity, and can help create a nonstick surface. There is some hope that, because of their poor heat conductivity, they’ll make good materials for converting temperature differences directly to electricity, allowing the harvesting of waste heat.

Still, the Prize isn’t being awarded because quasicrystals could have commercial applications. Instead, it’s being awarded because Shechtman demonstrated that he could reliably reproduce what we once thought was impossible.

Top image: The symmetry pattern of electron diffraction in Shechtman’s quasicrystal. (Nobel Media)

Source: Ars Technica


Sheer Numbers Gave Early Humans Edge Over Neanderthals

29 Jul

By Kate Shaw, Ars Technica

Between 35,000 and 45,000 years ago, Neanderthals in Europe and Asia were replaced by the first modern humans. Why and how this transition occurred remains somewhat controversial. New research in the journal Science suggests that sheer numbers may have played a large role in modern humans’ eventual takeover; archaeological data show that early populations of modern humans may have outnumbered Neanderthals by more than 9 to 1.

Two archaeologists from Cambridge University analyzed data from the Aquitaine region of southern France, which has Europe’s highest density of sites from this era, and one of the most complete archaeological records. They used data from three time periods that encompassed the transition between Neanderthals and modern humans: the Mousterian and Chatelperronian eras, during which Neanderthals lived, and the Aurignacian period, which was dominated by modern humans. By examining differences in land use during these time periods, the researchers hoped to determine whether population dynamics played a role in the transition between these two hominins.

Because of the difficulties in estimating long-ago populations, the researchers used a few different proxies for population sizes and densities. They analyzed the number of occupied sites in each era, the size of these sites, and the accumulation rates of stone tools and animal food remains. Through these proxies, the researchers could get good estimates of population dynamics during the transition from Neanderthals to modern humans in Aquitaine.

From the Mousterian to the Chatelperronian era, there was very little increase in the number of rock-shelter sites. There were about 26 sites occupied in the Mousterian era, and 31 in the Chatelperronian period, suggesting that the Neanderthal population was not growing quickly. However, there were about 108 sites occupied by modern humans in the Aurignacian period. The increase is similar for occupied open-air sites. Adjusted for time scales, these figures suggest that, between the last Neanderthal-dominated era and the first era dominated by modern humans, population numbers and densities increased by a factor of about 2.5.

A similar trend was seen in the sizes of occupied areas: the Neanderthal sites averaged less than 200 square meters, while several of the modern human sites reached up to 600 square meters. From the size differences of the sites, the researchers estimate that the population increased by up to a factor of 3 as the Neanderthal-dominated era ended and modern humans occupied their sites.

Finally, the accumulation of stone tools and animal remains tells a similar story: modern humans were far more numerous than the Neanderthals they replaced. The densities of stone tools and animal food remains skyrocketed between the Chatelperronian and Aurignacian eras—according to these differences, the modern human population probably outnumbered the Neanderthals by a factor of about 1.8.

Each of these statistics, taken alone, tells only part of the story. Because these archaeological proxies were developed independently, the estimates can be combined to get a better idea of the different population sizes. When evaluated as a whole, these estimates show that the population size and densities of modern humans may have been more than 9 times those of the Neanderthals around the time of the transition. It’s very likely that a numerical advantage that large played a significant role in modern humans’ dominance over their earlier counterparts.
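To see how the separate proxies can stack up, one can simply multiply the rough factors quoted above. Treating them as independent, multiplicative contributions to population size is an illustrative assumption here, not the authors’ actual statistical procedure, but it already puts the combined ratio comfortably past 9:

```python
# Rough proxy ratios quoted in the text (modern humans vs. Neanderthals).
# Treating them as independent multiplicative factors is an illustrative
# assumption, not the paper's actual statistical method.
site_count_factor = 2.5    # time-adjusted increase in number of occupied sites
site_size_factor = 3.0     # upper-end increase in occupied-site area
tool_density_factor = 1.8  # increase in stone-tool and food-remain densities

combined = site_count_factor * site_size_factor * tool_density_factor
print(combined)       # 13.5
print(combined > 9)   # True -- consistent with "more than 9 times"
```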

While the study did not directly address the features that gave modern humans a population advantage, the authors suggest that it was probably due to a combination of factors such as improved food storage, an increase in social cohesion, and the potential for trade and the exchange of goods.

Source: Ars Technica

Image: A comparison of Neanderthal and human settlement density and size about 30,000 years ago. (Science)


Citation: Science, 2011. DOI: 10.1126/science.1206930


Cold, Lonely Planets More Common Than Sun-Like Stars

19 May

By Christopher Dombrowski, Ars Technica

Seems like every week astronomers find a new exoplanet, one that’s the biggest or the smallest or the hottest or most habitable. This week, however, astronomers are announcing a genuinely new class of exoplanets: Jupiter-sized planets that are in extremely large orbits or completely unbound from a host star altogether. And there appear to be a lot of them, as these planets seem to be more common than main sequence stars.

Finding a planet that is not associated with a star is no easy task. In the new search, a team of researchers used a technique called gravitational microlensing. As you look at a background field of stars, if an object passes between you and one of the stars, there will be a temporary brightening of that star. This occurs as the gravity of the object bends light around itself, which acts as a lens for light from the background star, hence “gravitational lensing.” Microlensing occurs when the foreground object is too small to create measurable distortion of the background star and only a brightening is observed. This makes it an ideal detector for small, dim objects.

The mass of the lensing object determines the duration of the brightening event — the longer the duration, the more massive the lens. A Jupiter-sized object would produce a lensing event with a duration of around one day.
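The duration in question is the Einstein-radius crossing time, which scales with the square root of the lens mass. A back-of-the-envelope sketch (the distances and velocity are typical Galactic-bulge survey values assumed for illustration, not numbers from the paper):

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg
KPC = 3.086e19     # kiloparsec in meters

def einstein_crossing_days(mass_kg, d_source=8 * KPC, d_lens=4 * KPC, v=2.0e5):
    """Einstein-radius crossing time for a point lens.

    r_E = sqrt(4GM/c^2 * d_lens * (d_source - d_lens) / d_source), t_E = r_E / v.
    The distances and transverse velocity are typical bulge-survey assumptions.
    """
    r_e = math.sqrt(4 * G * mass_kg / C**2 * d_lens * (d_source - d_lens) / d_source)
    return r_e / v / 86400.0  # convert seconds to days

t_sun = einstein_crossing_days(M_SUN)           # roughly a month for a solar-mass lens
t_jup = einstein_crossing_days(M_SUN / 1047.0)  # roughly a day for a Jupiter-mass lens
```

Since t_E scales as the square root of the mass, a Jupiter-mass lens (about 1/1047 of a solar mass) gives events roughly 30 times shorter than a stellar lens, which is why day-scale events point to planet-mass objects.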

The odds of a microlensing event occurring are exceedingly small, as the lensing object has to line up exactly between you and the background star. To compensate, astronomers looked at 50 million stars over several years, which yielded 474 microlensing events. Out of those 474, 10 had durations of less than two days, consistent with Jupiter-mass objects.

No host stars were observed within 10 astronomical units of the lensing objects. Previous work from The Gemini Planet Imager had set limits on the population of Jupiter-sized planets in extended orbits. From that data, the astronomers were able to estimate that 75 percent of their observed planets were most likely not bound to a host star at all, and are instead drifting loose within the galaxy.

By creating a galactic mass-density model that takes this new class of object into account, astronomers were able to predict how many of these unbound planets there might be. They found that there are ~1.8 times as many unbound Jupiter-sized objects as there are main sequence stars in our galaxy.

This raises a number of questions. Did these planets form near a star only to be ejected from the system? And if they truly have never been bound to any stars, do these planets represent a new planetary formation process? In any case, these observations have revealed a whole new population of Jupiter-sized planets in the Milky Way, and there are a lot of them.

I wonder if these new planets are like our Jupiter and, like our Jupiter, have moons which are geologically active and warm. If so, these new planets may have significantly increased the number of places that life may exist.

Image: NASA/JPL-Caltech [full-resolution image]

Citation: “Unbound or distant planetary mass population detected by gravitational microlensing.” The Microlensing Observations in Astrophysics (MOA) Collaboration and The Optical Gravitational Lensing Experiment (OGLE) Collaboration. Nature, Vol. 473, Pg. 349–352, 19 May 2011. DOI: 10.1038/nature10092

Source: Ars Technica


Engineered Viruses Boost Memory Recall in Rats

03 Mar

By John Timmer, Ars Technica

Memories fade with time, often to the annoyance of those who can’t recall important details. But scientists have now found a way to boost the recall of memories even after they’ve started to fade. Unfortunately, the method involves injecting an engineered virus directly into the brain, so those of us who are bad with names may want to wait a bit for the technique to be refined.

The work was done in rats, and the memories in question are associations between a specific taste — saccharin, for example — and an unpleasant stimulus, caused by injection of a nausea-inducing drug (the approach is called “conditioned taste aversion”). Unless the unpleasant association is reinforced, the memories will slowly fade with time, although the aversion doesn’t disappear entirely during the two-week period that the authors were looking at.

Two years ago, the same authors found that it was possible to radically accelerate this fading. By injecting a chemical that blocked a specific brain enzyme (protein kinase M ζ), the authors caused the rats to act as if they had never experienced the nausea, even if the memory manipulation took place 25 days after the conditioning. Most chemicals that interfere with memories tend to prevent them from being consolidated for long-term storage, but this chemical seemed to work even after the memory was firmly in place.

That’s potentially helpful, since some people have formed negative associations with harmless or even helpful items. Still, for most of us, it would be nice to think that fading memories could be resuscitated. Apparently, they can. The researchers have now done what’s effectively the converse experiment, and increased the activity of protein kinase M ζ. They did this by engineering a virus to express the gene for the kinase, and then infected specific areas of the brain involved in memory. All the infected cells had additional copies of the gene, and thus made more of its product.

The virus had exactly the effect that the authors would presumably have predicted. The virus was injected a week after the rats were given the aversion conditioning, when the memory would already be starting to fade, and the memory tests were done a week after that, yet the rats showed significantly improved retention of their memories. As the authors point out, the engineered virus boosted a memory that was formed before the virus was even present.

The memory molecule, PKMzeta, overexpressed in rat neurons. Red (left) shows PKMzeta; green (middle) is a fluorescent protein marking nerve cells infected by viruses engineered to boost the memory molecule; yellow (right) shows that the memory molecule and the green fluorescent protein are overexpressed together only at certain locations in the neuron. Image: Weizmann Institute of Science/Science

Actually, you can make that memories, plural. The authors trained rats to avoid both saccharin and salty liquids over the course of three days, and then injected the virus a week after the last training. The memories of both of these trainings were enhanced by the presence of the viral protein kinase M ζ gene.

The authors can’t tell exactly what protein kinase M ζ is doing to increase the recall of memories, and suggest it could be either enhancing the association between taste and the unpleasant experience, or simply enhancing recall in general. Although they don’t mention it, their findings may also be limited to specific classes of memories, like the associations examined here.

That latter point makes the last sentence of the paper a bit over the top, as the authors suggest that a chemical that enhances protein kinase M ζ activity might make for a good treatment for memory disorders like amnesia and age-related decline. Until we have a clearer sense of how many types of memories it works for, that’s a bit premature. Fortunately, there are lots of ways to test the recall abilities of animals, many of which don’t involve negative associations. Hopefully, testing of the virus’ more general impact on memory is already underway.

Image: HIV (green dots), a member of the lentivirus genus. (C. Goldsmith/P. Feorino/E. L. Palmer/W. R. McManus/CDC)

Citation: “Enhancement of Consolidated Long-Term Memory by Overexpression of Protein Kinase Mζ in the Neocortex.” Reut Shema, Sharon Haramati, Shiri Ron, Shoshi Hazvi, Alon Chen, Todd Charlton Sacktor and Yadin Dudai. Science, Vol. 331, March 3, 2011. DOI: 10.1126/science.1200215

Source: Ars Technica
