We look at the world from two slightly different vantage points, which correspond to the positions of our two eyes. These dual vantage points create tiny differences between the two eyes’ images that are proportional to the relative depths of objects in the field of view. The brain can measure those differences, and when it does so the result is stereovision, or stereopsis.
To get an idea of this effect, extend one arm to point at a distant object. While keeping your arm extended, alternately open and close each eye. Notice how your finger shifts in relation to the object, illustrating the horizontal disparity between the eyes.
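To put a number on that shift: to a small-angle approximation, the angular disparity between two objects equals the interocular separation multiplied by the difference of the objects' inverse distances. The short Python sketch below works through the finger-and-wall demonstration; the eye separation and distances are illustrative assumptions, not measurements from the article.

    import math

    # Illustrative calculation (not from the article): approximate binocular
    # disparity between two objects at different distances, using the
    # small-angle formula  disparity ~ a * (1/d_near - 1/d_far)  in radians,
    # where a is the interocular separation. All numbers are assumptions.

    def disparity_deg(a_m: float, d_near_m: float, d_far_m: float) -> float:
        """Angular disparity (degrees) between points at two distances."""
        disparity_rad = a_m * (1.0 / d_near_m - 1.0 / d_far_m)
        return math.degrees(disparity_rad)

    # A fingertip at arm's length (0.6 m) against an object 5 m away,
    # with a typical interocular separation of 6.5 cm:
    print(f"{disparity_deg(0.065, 0.6, 5.0):.2f} degrees")  # ~5.46 degrees

A shift of several degrees is large by visual standards, which is why the demonstration is so easy to see.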
In pursuit of an iPhone-size tool to monitor cancer and diagnose malaria. Researchers hope that a new kind of small portable microscope may give health workers the ability to quickly and cheaply scan blood for tumor cells and life-threatening parasites. What makes the microscope unique is that it scans objects without lenses, and that its design was inspired by a phenomenon that usually clouds vision rather than improving it. A team from the California Institute of Technology (Caltech) has demonstrated that light-sensitive microchips like those found in digital cameras can produce high-resolution images of microscopic beads and of worms about a millimeter long.
Writing in Proceedings of the National Academy of Sciences USA, they report that prototype lensless microscopes resolved details down to approximately 0.8 to 0.9 micron, or thousandths of a millimeter. (One millimeter equals 0.04 inch.) That is good enough to spot cancer cells, which measure 15 to 30 microns. Here’s how it works: Scientists shine light onto a liquid sample flowing through a narrow channel. Below the channel is a series of three-micron-wide apertures, or holes, punched through a layer of metal such as gold or aluminum. The light shines through the holes onto a semiconductor chip studded with sensor pixels. Such chips cost about $10 a pop, says Caltech bioengineer and study leader Changhuei Yang. Objects floating over the apertures block some of the incoming light, and an image of the object can be reconstructed from the resulting variations in light intensity recorded across multiple apertures.
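To make the scanning idea concrete, here is a toy Python simulation of shadow imaging with an aperture array. It is a sketch under simplifying assumptions (one aperture per image row, one pixel of flow per time step, a perfectly opaque object), not the Caltech design itself.

    import numpy as np

    # Toy model of lensless aperture imaging: an opaque object drifts over
    # a column of apertures; each aperture's pixel records transmitted
    # light over time, and stacking the time traces yields a shadow image.
    obj = np.zeros((8, 40))          # opacity map of a toy "cell" (1 = opaque)
    obj[2:6, 10:25] = 1.0

    n_rows, n_steps = obj.shape[0], obj.shape[1] + 20
    recording = np.ones((n_rows, n_steps))   # 1.0 = unobstructed light

    for t in range(n_steps):                 # object advances 1 px per step
        col = t - 10                         # object column over the apertures
        if 0 <= col < obj.shape[1]:
            recording[:, t] -= obj[:, col]   # opacity dims the aperture pixels

    image = 1.0 - recording                  # invert: dark pixels = object
    print((image > 0.5).astype(int))         # reconstructed shadow of the cell

In the actual device, details such as the aperture layout and the flow speed matter for resolution; the sketch only captures the core idea that changes in light intensity over time encode the image.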
Yang says he was inspired by “floaters,” the clumps of dead cells and other debris in the eye that are sometimes visible when staring at a blue sky or another uniform source of light. He says floaters become more common with age and in some eye conditions such as myopia. Normally we see objects because the lens of the eye focuses an image onto the retina, often with help from corrective lenses. But floaters drift right above the retina, where we can see them directly.
Conventional microscopes use lenses to magnify microscopic features for the eye to see, but lenses are hard to make small enough for portable handheld devices, or to multiply for scanning many biological samples simultaneously, which would aid research. With a chip-based microscope, “there’s no lens to break,” Yang says. “A clinician can stick this into his back pocket.”
He sees the technology as the basis for rugged, iPod-size scanners capable of quickly detecting the blood parasites that cause malaria and sleeping sickness (human African trypanosomiasis), both endemic in sub-Saharan Africa.
Another possible application is scanning the blood of cancer patients for cancerous cells circulating in the body that may take root in other organs, a phenomenon known as metastasis that often signals a poor prognosis. Right now such cells can be collected on membranes, but transferring them to microscope slides is relatively complicated and expensive, says pathologist Richard Cote of the Keck School of Medicine of the University of Southern California, who was not part of the Caltech team but is working with Yang to develop a chip-based system to probe those samples directly. Micron resolution is sufficient for basic tasks, he says, such as counting the number of potentially dangerous cancer cells. He says the lensless microscope “would be a way of actually disseminating the technology much more broadly.”
The way a primitive auditory structure in the brain processes visual information may explain how we are fooled by thrown voices.
If you watched football or the final game of the World Series yesterday, you may have noticed the following: when the announcers were speaking on camera, the sound of their voices seemed to come from their mouths. But when the commentary occurred off-screen as the game action was shown, it was quite apparent that the TV speakers were the actual source of the endless color-commentary babble.
This processing phenomenon, in which a visual cue affects how one perceives an auditory stimulus (ventriloquism is another example), may be explained by new research that pinpointed neurons in a primitive brain area that respond to both visual and auditory information. This area, the inferior colliculus in the midbrain, less than half an inch in diameter, is a way station for nearly all auditory signals as they travel from the ear to the cortex (the brain’s central processing area). “It’s important if you’re going to be integrating visual and auditory information that they be on a level playing field, so both are encoded the same way,” says Jennifer Groh, an associate professor at Duke University’s Center for Cognitive Neuroscience and a co-author of the new work published in Proceedings of the National Academy of Sciences USA. “It’s important for the auditory pathway to know where the eye is pointed.”
Groh and her colleagues implanted electrodes in the brains of three monkeys, targeting 180 individual neurons (nerve cells) in the inferior colliculus. The animals were placed in a dark chamber where a light-emitting diode (LED) would switch on in one of several predetermined locations. After the monkeys attended to and fixated on the light for a fraction of a second, a short clip of white noise played from speakers in the chamber. When the researchers examined the time-stamped activity of the individual neurons, they observed that each monkey had a neural response in its inferior colliculus when the LED turned on. In addition, two of the three animals showed activity in the auditory structure as they moved their eyes toward the light. In all, the scientists report, more than 67 percent of the monitored neurons (121 of the 180) showed statistically significant responses to the visual stimulus.
“The implication is that it’s possible that perception involves more interaction between the sensory pathways than we expected and, because they are happening in low-level areas, they may be more automatic,” Groh says. She adds that some cells responded quickly to the light, whereas others showed a slower buildup of activity. She speculates that the quicker-acting cells process the visual information, whereas the slower ones may encode a reward response (a secondary function of the inferior colliculus).
Christoph Kayser, a research scientist at the Max Planck Institute for Biological Cybernetics in Tübingen, Germany, calls the new work “stunning.” “Results like these suggest that the brain does not try to keep the information provided by the different sensory organs as isolated as possible, but rather that an early mixing of sensory information seems to be the rule,” he says. “All this can best be interpreted when seeing the brain as being faced with a flood of sensory information that must be co-registered and merged into a coherent percept.”
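For readers curious what a “statistically significant response” means in practice, here is a hedged Python sketch of one standard approach (not the authors’ actual analysis): compare spike counts before and after LED onset across trials with a permutation test. All the numbers below are synthetic.

    import numpy as np

    # Illustrative analysis, not the study's code: test whether a neuron's
    # firing rises significantly after a visual stimulus by comparing
    # per-trial spike counts in pre- and post-stimulus windows.
    rng = np.random.default_rng(42)
    n_trials = 50

    pre = rng.poisson(lam=5.0, size=n_trials)   # baseline window counts
    post = rng.poisson(lam=7.0, size=n_trials)  # post-LED window counts

    observed = post.mean() - pre.mean()

    # Permutation test: shuffle the pre/post labels and ask how often a
    # difference at least this large arises by chance.
    pooled = np.concatenate([pre, post])
    n_perm = 10_000
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = pooled[n_trials:].mean() - pooled[:n_trials].mean()
        if abs(diff) >= abs(observed):
            count += 1

    p_value = (count + 1) / (n_perm + 1)   # add-one correction avoids p = 0
    print(f"mean increase = {observed:.2f} spikes, p = {p_value:.4f}")

A neuron whose p-value fell below a chosen threshold (such as 0.05) would count as visually responsive, in the spirit of the 121-of-180 figure reported above.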