2008/06/24

Why Sleep?

Phys. Rev. E 77, 011922 (issue of January 2008)
8 January 2008

Getty Images
Concentrate on napping. Researchers aren't sure why animals need to sleep, but a new study suggests that any system is more efficient when it focuses on one task at a time, rather than trying to multitask. With a sleep-wake cycle, the brain collects information during the day and processes it at night.
Why we sleep remains a mystery. Competing theories claim various "house-cleaning" brain activities occur during sleep, but they can't say why we need to power down to accomplish them. A study in the January Physical Review E suggests that a sleep-wake cycle, allowing the brain to focus on one task at a time, may be the most efficient way to operate. The researcher shows mathematically that processing a continuously changing resource--sensory input, in the brain's case--is best done "offline," when there's no input. This sort of analysis may lead to a more precise biological explanation for why sleep and other biological cycles evolved.
Humans spend a third of their lives asleep, and sleep is essential to our health. But scientists do not yet agree on its purpose. One theory is that the brain requires sleep to consolidate information collected during the day, while another theory says that the brain needs to sweep out harmful free radicals that build up during waking hours. But turning off the senses seems impractical, if not outright dangerous. It would seem better for an organism to perform sleep-related tasks in parallel with being awake.
To address this question, Emmanuel Tannenbaum of Ben Gurion University in Beer Sheva, Israel, proposes the concept of temporal differentiation, in which a system focuses on one task at a time, rather than trying to multitask. The advantages of a time-varying strategy have been studied in traffic control, computer programming, and operations research. But Tannenbaum believes he is the first to consider the brain as a "factory" for information processing, for which certain routines are more efficient than others.
In his paper, Tannenbaum analyzes two models. The first involves a tank with two pipes--one for filling and one for emptying--which can be opened one at a time. Assuming the incoming resource flow continuously switches between "on" and "off," Tannenbaum proves mathematically that one way to maximize the flow through the tank is to fill whenever the resource is available and empty when it isn't. The resource is analogous to sensory information that fills the brain and needs to be processed (emptied). Tannenbaum reasons that many animals can only receive visual information when there is light, so an efficient strategy, according to the tank model, is to be alert during daylight hours and devote all one's time in darkness to processing. As a comparison, Tannenbaum calculates the productivity of alternating rapidly between filling and emptying (equivalent to being half-asleep and half-awake simultaneously) and finds this approach less efficient.
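Tannenbaum's fill-when-available conclusion can be checked with a toy simulation. In the sketch below, the unit fill and empty rates, the 50/50 day-night split, and the cycle length are illustrative assumptions rather than details from the paper:

```python
# Toy version of Tannenbaum's tank model: a resource arrives only during
# "day" steps, and at each step the tank can either FILL (collect one
# unit, if available) or EMPTY (process one stored unit), never both.
# Unit rates and the 50/50 day-night split are illustrative assumptions.

def run(strategy, steps=1000, cycle=20):
    tank = processed = 0
    for t in range(steps):
        resource_on = (t % cycle) < cycle // 2   # "day" half of each cycle
        if strategy == "sleep_wake":             # fill by day, empty by night
            filling = resource_on
        else:                                    # "multitask": alternate every step
            filling = (t % 2 == 0)
        if filling:
            if resource_on:
                tank += 1
        elif tank > 0:
            tank -= 1
            processed += 1
    return processed

print(run("sleep_wake"))   # -> 500 units processed
print(run("multitask"))    # -> 250 units processed
```

With these symmetric assumptions, the sleep-wake schedule processes twice as much as rapid alternation, echoing the paper's qualitative conclusion; the exact ratio depends on the assumed rates.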
Certain sleep behaviors, like episodic REM sleep and nocturnal habits, do not fit this picture, so Tannenbaum formulated a more generic model, in which a resource supplied at a fixed rate is processed in three separate steps, such that the initial, intermediate, and final products are all present in varying concentrations. The model bears some resemblance to the cyclic reactions of circadian rhythm proteins, which keep many organisms on 24-hour clocks even in complete darkness. Tannenbaum finds that a temporally differentiated case, where the steps are performed separately, is 33 percent more efficient at producing the final product than an undifferentiated case, where all three steps run simultaneously. This result depends on the details of the model, but he believes that optimization through temporal differentiation might explain why certain cyclic behaviors evolved.
James Krueger, a sleep expert at Washington State University, says that this is definitely a new approach, but he thinks Tannenbaum ignores a host of sleep phenomena, such as the localization of sleep to specific areas of the brain and the fact that some sensory input continues during sleep. Still, he welcomes the effort and admits that "any new idea cannot address everything at once." --Michael Schirber Michael Schirber is a freelance science writer in Montpellier, France.
Temporal Differentiation and the Optimization of System Output Emmanuel Tannenbaum Phys. Rev. E 77, 011922 (issue of January 2008)

From: http://focus.aps.org/story/v21/st1

Squeezed into Darkness

Phys. Rev. Lett. 100, 203601 (issue of 23 May 2008)
8 May 2008

Phys. Rev. Lett. 100, 203601 (2008)
Symmetry breaking. An optical cavity containing a special crystal can emit a beam with this intensity pattern (beam coming toward you). Theorists calculate that a related mode can lead to a beam with unwavering intensity, without some of the usual technical requirements.
According to quantum mechanics, empty space teems with random electromagnetic oscillations that limit the precision of optical measurements. Schemes to "squeeze" light and dodge this quantum limit require a carefully-tuned light intensity and other conditions. In the 23 May Physical Review Letters, Spanish researchers propose an alternative squeezing strategy that should be less finicky. If it proves experimentally feasible, the technique could permit improved measurements of gravitational waves or more practical ways to transmit quantum information.
Physicists often describe quantum-mechanical precision limits using the Heisenberg uncertainty principle, which places strict limits on how well quantities can be measured, even with perfect equipment. But this rule limits the combined uncertainty of pairs of related quantities, like the position and momentum of a particle. Researchers are free to measure the particle's position exactly, as long as they abandon any knowledge of its momentum, or vice versa.
A similar tradeoff applies to light waves, which have an intrinsic variability reflecting their quantum nature. Beginning in the 1980s, researchers learned how to experimentally "squeeze" light, for example, to precisely determine the light wave's amplitude at the expense of its phase, the number that measures the wave's progress through its oscillating cycle. But Germán de Valcárcel, of the University of Valencia in Spain, says that squeezing is significant only when the light intensity is chosen carefully. In his team's new technique the intensity of the input light "need not be tuned in order to obtain, ideally, perfectly squeezed light," he says.
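The amplitude-phase tradeoff can be illustrated with the standard single-mode squeezing relations, which are textbook results rather than this paper's cavity model; the squeezing parameter r below is an arbitrary illustrative value:

```python
import math

# Idealized single-mode squeezing (textbook relations, not the cavity
# model of this paper): with the vacuum variance normalized to 1/4 and
# squeezing parameter r, one quadrature shrinks as e^(-2r) while its
# conjugate grows as e^(+2r), preserving the uncertainty product.
def quadrature_variances(r):
    squeezed = 0.25 * math.exp(-2 * r)      # e.g. the amplitude quadrature
    antisqueezed = 0.25 * math.exp(+2 * r)  # the conjugate phase quadrature
    return squeezed, antisqueezed

s, a = quadrature_variances(1.15)           # r = 1.15, an illustrative value
print(round(10 * math.log10(0.25 / s), 1))  # noise reduction -> 10.0 dB
print(s * a)                                # product stays at ~(1/4)^2 = 0.0625
```

A 10-dB reduction of this kind is the scale of the experimental record cited under Related Information below.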
To generate squeezed light, physicists typically shine laser light into an optical cavity, where it bounces back and forth between two partially transparent mirrors. The cavity contains a "nonlinear" crystal that converts the light into squeezed light of a new color with twice the wavelength. To optimize the effect, the input light must be carefully tuned to have intensity at or near a specific value called the threshold.
De Valcárcel and his colleagues propose using an input intensity well above the threshold and adjusting the mirror spacing so that the new light emerges with an intensity pattern in the shape of a dumbbell: with the beam coming toward you, you might see bright regions above and below, for example, with a dark lane horizontally across the middle. The critical ingredient, says de Valcárcel, is "symmetry breaking": the pattern is free to emerge with any orientation angle. "This angle is arbitrary," he says, so over time the pattern will rotate randomly.
Although the light "chooses" a particular orientation for the dumbbell pattern, or mode, the cavity also allows a second mode, which is identical but rotated by 90 degrees around the beam axis with respect to the first one. The researchers calculate that the completely unknown orientation of the first mode results in perfect precision for one aspect of the second, "dark" mode. Specifically, a component of this mode, the light signal that is exactly a quarter-cycle delayed from the bright mode, should be precisely zero--lacking even the usual quantum-mechanical fluctuations of empty space. Researchers could mix this "super-dark" mode with another laser to make a low-noise beam for precise measurements, such as detecting the tiny motions caused by gravitational waves.
Julio Gea-Banacloche, of the University of Arkansas in Fayetteville, is intrigued by the theoretical intuition that leads to the surprising new result. But he cautions that experimentalists usually avoid working above threshold because any noise reduction in one mode is hard to measure in the presence of very bright and noisy light in the other mode. --Don Monroe Don Monroe is a freelance science writer in Murray Hill, New Jersey.
Related Information:
Demonstration of a tenfold reduction in noise power using traditional squeezing techniques: H. Vahlbruch et al., "Observation of Squeezed Light with 10-dB Quantum-Noise Reduction," Phys. Rev. Lett. 100, 033602 (2008).
Noncritically Squeezed Light Via Spontaneous Rotational Symmetry Breaking Carlos Navarrete-Benlloch, Eugenio Roldán, and Germán J. de Valcárcel Phys. Rev. Lett. 100, 203601 (issue of 23 May 2008)

From: http://focus.aps.org/story/v21/st16

Laser Cooling of Atoms

Phys. Rev. Lett. 61, 169 (issue of 11 July 1988)
Phys. Rev. Lett. 48, 596 (issue of 1 March 1982)
Phys. Rev. Lett. 40, 1639 (issue of 19 June 1978)
2 April 2008
Landmarks: Laser Cooling of Atoms

H. M. Helfer/NIST
Frozen. A cloud of cold sodium atoms (bright spot at center) floats in a trap. Researchers began cooling atoms with lasers in 1978, reaching below 40 Kelvin. They achieved temperatures a million times colder just ten years later, eventually leading to better atomic clocks and the observation of a new ultracold state of matter.
APS has put the entire Physical Review archive online, back to 1893. Focus Landmarks feature important papers from the archive.
In the 1970s and 80s, physicists learned how to use lasers to cool atoms to temperatures just barely above absolute zero. Three papers from that era, all published in Physical Review Letters, highlight some of the essential steps in the development of the technology. In 1978, researchers cooled ions somewhat below 40 Kelvin; ten years later, neutral atoms had gotten a million times colder, to 43 microkelvin. But the basic principle remained the same: use the force of laser light applied to atoms to slow them down. The work led to the creation of a new quantum form of matter called a Bose-Einstein condensate and to modern atomic clocks, as well as at least two Nobel prizes.
The original reason to cool atoms--that is, reduce the speed of their motion--was to allow more precise measurements of atomic spectra, and later, to improve atomic clocks. In 1978 Dave Wineland and his colleagues at what is now the National Institute of Standards and Technology (NIST) in Boulder, Colorado, followed theoretical proposals [1] and managed to laser cool magnesium ions.
As the team described in PRL, they confined the ions in an electromagnetic trap and hit them with a laser tuned to a frequency a bit below a "resonance" frequency for the ions. At rest, the ions absorb photons at the resonance frequency, but if they're moving toward the beam, its lower frequency appears Doppler shifted to the correct frequency, allowing them to absorb photons coming toward them. These photons slow down the ions until the cooling effect is balanced by the small heating that is always present when the laser is on. In later years, this heating--which comes from atoms recoiling every time they randomly emit or absorb a photon in any direction--would ultimately limit the cooling possible with this so-called Doppler cooling technique.
In Boston, William Phillips read Wineland's experimental article and a theoretical paper [2] with great interest. He was just finishing a postdoctoral fellowship at the Massachusetts Institute of Technology and heading to the NIST lab in Gaithersburg, Maryland. "The idea of cooling ions made me think that it might be possible to do the same thing with neutral atoms," says Phillips.
In 1982, Phillips and Harold Metcalf of Stony Brook University in New York published the first paper on laser cooling of neutral atoms. They sent a beam of sodium atoms through a magnetic field that was large at the entrance to the apparatus but became gradually smaller over a distance of 60 centimeters. While moving through the field, the atoms headed directly into an off-resonance laser that used Doppler cooling to reduce the range of atomic velocities among atoms in the beam. The laser also slowed the beam as a whole. During deceleration, the changing magnetic field changed the atoms' resonant frequency, so that the slowing and cooling continued over a long distance, allowing them to reach 40 percent of their initial velocity. Now called a Zeeman slower, this device has become a standard way of decelerating an atomic beam.
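A quick kinematic check of the 1982 result: slowing to 40 percent of the initial speed over 60 centimeters implies a large constant deceleration. The initial speed below, roughly 1000 m/s as is typical for a thermal sodium beam, is an assumed illustrative value, not a figure from the article:

```python
# Constant-deceleration kinematics for the Zeeman-slower result: final
# speed 40% of initial over L = 0.60 m (both from the article).  The
# initial speed of ~1000 m/s, typical for a thermal sodium beam, is an
# assumption.
v_i = 1000.0                      # m/s (assumed initial speed)
v_f = 0.4 * v_i                   # 40% of initial, per the article
L = 0.60                          # m, length of the slower
a = (v_i**2 - v_f**2) / (2 * L)   # from v_f^2 = v_i^2 - 2aL
print(a)                          # about 7.0e5 m/s^2 of deceleration
```

That is comparable to the maximum deceleration radiation pressure can exert on a sodium atom, which is why the slowing had to be spread over tens of centimeters.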
Laser cooling techniques improved, and by the late 1980s, researchers had achieved what they thought were the lowest possible temperatures, according to Doppler cooling theory--240 microkelvin for sodium atoms. Then in 1988, a group led by Phillips accidentally discovered that a technique developed three years earlier at another lab [3] could shatter the Doppler limit. They used three mutually perpendicular pairs of lasers to cool sodium atoms, with laser frequencies somewhat different from other labs. They discovered, using several new temperature measurement techniques, that their atoms were at about 43 microkelvin. Theorists quickly explained the unexpected cooling mechanisms by including more atomic states and the effects of laser polarization; previous cooling models were overly simplistic.
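The Doppler limit quoted for sodium follows from the standard formula T_D = ħΓ/(2k_B). The natural linewidth used below, Γ/2π ≈ 9.79 MHz for sodium's D2 line, is a textbook value, not a number quoted in the article:

```python
import math

# Doppler cooling limit T_D = hbar * Gamma / (2 * k_B).  The linewidth
# Gamma/2pi ~ 9.79 MHz for sodium's D2 line is a standard textbook
# value, assumed here for illustration.
hbar = 1.0546e-34              # reduced Planck constant, J s
k_B = 1.3807e-23               # Boltzmann constant, J/K
Gamma = 2 * math.pi * 9.79e6   # natural linewidth, rad/s

T_D = hbar * Gamma / (2 * k_B)
print(T_D * 1e6)               # in microkelvin: ~235, close to the
                               # 240-microkelvin limit cited above
```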
Guided by the new theory, experimentalists reached much colder temperatures and developed additional cooling techniques. Phillips' "sub-Doppler" cooling was an early step in the 1995 creation of a Bose-Einstein condensate, a new state of matter where gaseous atoms all drop to the lowest possible energy state.
Atomic clocks benefited as well. The latest generation uses techniques derived directly from what Phillips and others did in the 1980s. Phillips and others won the Nobel Prize in 1997 for developing laser cooling; another prize in 2001 was awarded for the creation of Bose-Einstein condensates. --Jason Socrates Bardi Jason Socrates Bardi is a senior science writer at the American Institute of Physics.
References:
[1] D. J. Wineland and H. Dehmelt, Bull. Am. Phys. Soc. 20, 637 (1975); T. W. Hänsch and A. L. Schawlow, "Cooling of Gases by Laser Radiation," Opt. Commun. 13, 68 (1975).
[2] A. Ashkin, "Trapping of Atoms by Resonance Radiation Pressure," Phys. Rev. Lett. 40, 729 (1978).
[3] S. Chu et al., "Three-Dimensional Viscous Confinement and Cooling of Atoms by Resonance Radiation Pressure," Phys. Rev. Lett. 55, 48 (1985).
Related Information:
1997 Nobel Prize in physics
Observation of Atoms Laser Cooled below the Doppler Limit Paul D. Lett, Richard N. Watts, Christoph I. Westbrook, William D. Phillips, Phillip L. Gould, and Harold J. Metcalf Phys. Rev. Lett. 61, 169 (issue of 11 July 1988)
Laser Deceleration of an Atomic Beam William D. Phillips and Harold Metcalf Phys. Rev. Lett. 48, 596 (issue of 1 March 1982)
Radiation-Pressure Cooling of Bound Resonant Absorbers D. J. Wineland, R. E. Drullinger, and F. L. Walls Phys. Rev. Lett. 40, 1639 (issue of 19 June 1978)

From: http://focus.aps.org/story/v21/st11

Dark Physics Beats Light Limit


Intel Corporation
Stamped out. Each of these chips contains over 400 million transistors. Calculations suggest that a new system using multiple lasers might be able to shrink the circuit elements even further.
Current laser-based techniques to make computer chips cannot fashion components much smaller than the light's wavelength, but researchers are devising tricks to beat this so-called diffraction limit. A new idea, detailed in the 22 February Physical Review Letters, is to use a dark state--which requires multiple laser beams--to write patterns in the absorbing material. Calculations show that the technique could create structures far smaller than the beams' wavelengths without using the dangerously high intensities needed with other proposed techniques.


In optical lithography, a "picture" of a microchip circuit is shone onto a semiconductor coated with a light-sensitive material called photoresist. Light-exposed areas of commonly-used photoresists become susceptible to the chemicals that etch out the integrated circuit pattern. According to classical physics, these exposed areas cannot be smaller than half a wavelength of the laser light. Engineers have ways to fudge this limit, such as immersing the semiconductors in liquids that help bend the light further. But to fundamentally break the limit, theorists have proposed systems where the photoresist molecule is activated by two or more photons of light, rather than one. Increasing the excitation energy reduces the effective wavelength, compared with a single photon. But multiphoton absorption requires all of the photons to be in the same place at the same time, which means high laser intensities that could damage materials or equipment.
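The effective-wavelength argument can be made concrete with the textbook scaling: an N-photon process shrinks the minimum feature size from λ/2 to roughly λ/(2N). The 193-nm ArF excimer line below is a common lithography wavelength, chosen for illustration:

```python
# Textbook scaling for the minimum printable feature: lambda/2 for
# one-photon exposure, lambda/(2N) when N photons must be absorbed
# together.  The 193-nm ArF excimer line is a common lithography
# wavelength, used here only as an example.
wavelength_nm = 193.0
features = {n: wavelength_nm / (2 * n) for n in (1, 2, 4)}
for n, size in sorted(features.items()):
    print(f"{n} photon(s): {size} nm")   # 96.5, 48.25, 24.125 nm
```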
Now Suhail Zubairy of Texas A&M University's campus in Qatar and his colleagues from the Max Planck Institute for Nuclear Physics in Heidelberg, Germany, have proposed a new way to get below the diffraction limit without high-intensity lasers. The photoresist molecules would be activated by coherent population trapping (CPT), a process used in slow light and other atomic experiments. In the simplest case, two lasers drive two transitions from different lower energy states to a common excited state, but due to a quantum interference effect, the molecules are never excited. Instead they evolve into a so-called dark state--a stable combination of the lower states that is unaffected by light. With additional upper and lower states, there may exist more complex dark states that combine several low-energy states and that could be populated using additional lasers tuned to the different transitions. CPT does not require multiphoton absorption, so it can work at relatively low intensities.
Zubairy's team showed that one can use the lasers to create sub-wavelength-sized regions on the surface where the molecules are all in a special state--not simply the dark state, but the dark state "favoring" one of its component low-energy states over the others. In the simplest case, each of the two beams would be split in half and reflected back onto itself to form a pattern of light and dark stripes on the photoresist surface. By arranging how these patterns overlapped, some places would be exposed to more of one laser than the other. The researchers calculated how this varying illumination would affect the molecules in the dark state and found that they formed a pattern of their own--stripes alternating between molecules favoring one of the two component states and those favoring the other.
Assuming one state was more susceptible to etching, the process could lead to chip features as narrow as a half wavelength, according to their calculations. To beat the diffraction limit, the team included a third component in the dark state and an additional pair of lasers in their theory, which reduced the etch-sensitive stripes to quarter-wavelength thickness. More complex dark states led to even narrower stripes. By combining stripes of different widths, engineers could make the complicated patterns for a microchip, say the authors.
Jonathan Dowling of Louisiana State University in Baton Rouge thinks this new idea in "quantum trickery" could work, but the required energy level structure may be hard to reproduce in a commercial photoresist. "A lot of chemistry will be needed to translate these ideas into a practical technology," he says. --Michael Schirber Michael Schirber is a freelance science writer in Montpellier, France.

From: http://focus.aps.org/story/v21/st6

2008/06/04

The shape of time

We all have the same question about time: what is it, and could it be reversed? When I was a child, I loved watching the movie "Back to the Future," in which the characters go back to the past and try to return to the future. The idea of "time travel" has stayed in my mind ever since. Could we find a way to travel forward or backward along the time axis? Before thinking about that kind of question, we first need to understand what time is.
It is a debatable issue and still an open discussion. If the physics that describes our universe is somehow symmetric, then that physical symmetry should lead to time symmetry. Why does the time axis run in one direction only, without any possibility of going "backward"? It is because there is no backward causation in human intuition. But that explanation is too anthropocentric for a natural phenomenon. Maybe it is time to free our minds and widen the way we think about our universe. Pursuing an answer to this greatest mystery in nature is a great and very important thing in my life. The video is nice, but not clear enough to explain the physics behind the question of time. I think it was made by some university students; if so, its maker will become a good science film producer one day.
I am sharing it with all my friends who love physics, animals, and the Earth.

http://tw.youtube.com/watch?v=y53hh-LAbLk&feature=related

2008/06/03

World Science Fest: What's behind quantum mechanics?

By John Timmer Published: June 02, 2008 - 09:45AM CT
Friday night's session of the World Science Festival included a program on quantum mechanics entitled "The Invisible Universe," which included a panel discussion moderated by Alan Alda. Festival founder Brian Greene (who's actually a string theorist) provided the introduction to the quantum world, noting that, "100 years ago, one generation of physicists changed our understanding of reality." He said that society has adopted a lot of the lingo of quantum mechanics without really coming to terms with what it actually involved.
Greene tried to get the audience up to speed by talking about the now-famous double-slit experiment, using an example in which waves of water passing through a pool create an interference pattern. He then brought that into the world of electrons, which also form interference patterns when sent through a double slit, even when only one is sent through at a time. What is going through the slits, instead of an actual particle, appears to be a set of probability curves, which can interfere with each other on their way to determining where the electron lands.
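The interference pattern Greene described follows the textbook two-slit intensity formula, I(θ) ∝ cos²(πd·sinθ/λ). The wavelength and slit spacing below are illustrative values:

```python
import math

# Far-field two-slit interference: normalized screen intensity
# I(theta) = cos^2(pi * d * sin(theta) / lambda).  The wavelength and
# slit spacing are illustrative, not from the talk.
wavelength = 500e-9   # m, green light
d = 2e-6              # m, slit separation

def intensity(theta):
    return math.cos(math.pi * d * math.sin(theta) / wavelength) ** 2

print(intensity(0.0))                          # central bright fringe -> 1.0
theta_dark = math.asin(wavelength / (2 * d))   # first dark fringe angle
print(intensity(theta_dark))                   # -> ~0, destructive interference
```

Bright fringes appear wherever the path difference d·sinθ is a whole number of wavelengths; dark fringes at half-integer multiples.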
The second example of quantum strangeness, which came during the ensuing discussion, was entanglement. Using up and down spin as an example, Greene described how two entangled particles could be separated, potentially by the entire length of the universe, and yet have their behavior remain linked. The consequence of the entanglement is that a measurement of the properties of one particle would instantly define the state of the other, no matter how large the separation between the two.
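The linked outcomes Greene described can be mimicked with the standard singlet-state statistics (a textbook rule, not something stated by the panel): for measurement angles a and b, the two outcomes are opposite with probability cos²((a − b)/2), so measurements along the same axis always disagree:

```python
import math
import random

# Singlet-state spin correlations (textbook rule): outcomes along
# measurement angles a and b are opposite with probability
# cos^2((a - b) / 2).  This sampler reproduces those statistics.
def measure_pair(a, b):
    first = random.choice((+1, -1))                          # 50/50 for each outcome
    opposite = random.random() < math.cos((a - b) / 2) ** 2  # quantum rule
    return first, (-first if opposite else first)

# Same axis (a == b): cos^2(0) = 1, so outcomes are always opposite,
# no matter how far apart the particles are.
pairs = [measure_pair(0.0, 0.0) for _ in range(1000)]
print(all(s1 == -s2 for s1, s2 in pairs))                    # -> True
```

Sampling with unequal angles reproduces the quantum correlation E(a, b) = −cos(a − b), since the product of outcomes averages sin²((a−b)/2) − cos²((a−b)/2).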
Quantum probabilities and a concrete world
These things profoundly violate what most people tend to think of as our orderly, causal universe. How do we make sense out of what we think of as a physical particle vanishing into a set of probabilities and then popping out, a particle again, at the far side? The panel included people who argued a number of perspectives on this question.
Max Tegmark from MIT suggested that we register multiple things happening because all of them actually do happen in a series of related universes. According to Tegmark, a mathematical construct called Hilbert space can let us describe quantum behavior in a linear, causal, and 'real' manner but, to work, it requires that all quantum possibilities actually happen. Tegmark thinks they can, in nearby universes that split off and recombine to create the strange effects we measure.
Nobel Laureate Bill Phillips, who's a quantum experimentalist, took what he termed a "shut up and calculate" approach to the question. We're arguing over these different perspectives yet, "when we go into the lab, we get the same results." Phillips pointed out that there's no inherent reason to think that there's anything behind the quantum behavior we observe—it's just the nature of the universe. He noted that the graduate students he gets no longer have any issues with quantum behavior, and used that to suggest that it's really just a habit of mind (a mind that evolved to deal with a very concrete reality) that keeps many from being satisfied with quantum behavior.
The panel included a philosopher of physics, David Albert, who didn't really have a specific response he favored, although he clearly favored some response. Referring to the "shut up and calculate" approach, Albert pointed out that "nobody is born wanting to know the result of specific experiments," so to put the underlying principles off limits violates the nature of science. In Albert's view, "the problem isn't that Max's ideas are wacky—it's pretty clear that the world is wacky." The fear is that the wackiness may exceed our mental capacities; "is it stranger than we know, or stranger than we can know?" Albert asked.
New ideas vs. newfound comfort
Not surprisingly, the conversation frequently returned to Einstein, who was profoundly uncomfortable with quantum mechanics. Everyone agreed that this discomfort was often mischaracterized, though. It's frequently presented as an unease with the random nature of events, but nobody thinks Einstein expected that nature should make him (or anyone else) feel comfortable. Instead, Einstein seemed to have been unable to map quantum behavior onto anything he understood; as David Albert put it, it was like being told, "this bottle of water is Elvis Presley—it's not 'I don't believe that,' it's 'I don't know what that means.'"
Tegmark argued that any correct theory should seem weird, though, because the universe at the small scale is weird, and humans evolved to comprehend the large-scale world. He suggested that progress in the theoretical realm can be seen when people stop saying, "it's strange, and I hate it," and start saying, "I hate it." Tegmark was happy that there are a number of ways being tried to find some logic underlying the quantum world, saying, "it's better to bark up many trees than all going up the same one."
On that, everyone seemed to agree. Brian Greene argued that, as we get more information and ideas, quantum mechanics was likely to make more sense, and Albert suggested that three different proposals all had the potential to provide those ideas. Bill Phillips, who argued the shut up and calculate perspective, even agreed that these competing proposals were great. Calling experimentalists such as himself the real quantum mechanics, he said, "experimentalists love proving theorists wrong; I love having competing theories because somebody's gotta be wrong."

From: http://arstechnica.com/journals/science.ars/2008/06/02/world-science-fest-whats-behind-quantum-mechanics