The original image is in the middle. At left, white-balanced as if the dress is white-gold. At right, white-balanced to blue-black.
(And yes, it’s blue.)
The fact that a single image could polarize the entire Internet into two aggressive camps is, let’s face it, just another Thursday. But for the past half-day, people across social media have been arguing about whether a picture depicts a perfectly nice bodycon dress as blue with black lace fringe or white with gold lace fringe. And neither side will budge. This fight is about more than just social media—it’s about primal biology and the way human eyes and brains have evolved to see color in a sunlit world.
Light enters the eye through the lens—different wavelengths corresponding to different colors. The light hits the retina in the back of the eye where pigments fire up neural connections to the visual cortex, the part of the brain that processes those signals into an image. Critically, though, that first burst of light is made of whatever wavelengths are illuminating the world, reflecting off whatever you’re looking at. Without you having to worry about it, your brain figures out what color light is bouncing off the thing your eyes are looking at, and essentially subtracts that color from the “real” color of the object. “Our visual system is supposed to throw away information about the illuminant and extract information about the actual reflectance,” says Jay Neitz, a neuroscientist at the University of Washington. “But I’ve studied individual differences in color vision for 30 years, and this is one of the biggest individual differences I’ve ever seen.” (Neitz sees white-and-gold.)
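Here is a minimal sketch of that "illuminant discounting" idea, using a simple per-channel (von Kries-style) model. The illuminant and reflectance values are made-up numbers for illustration, not measurements from the actual photo:

```python
import numpy as np

# Toy von Kries-style color constancy: the light reaching the eye is
# (roughly) the object's reflectance multiplied, channel by channel,
# by the color of the illuminant. If the brain can estimate the
# illuminant, it can divide it back out to recover the reflectance.

illuminant = np.array([1.0, 0.9, 0.6])   # hypothetical warm, gold-ish light
reflectance = np.array([0.3, 0.4, 0.8])  # hypothetical blue-ish fabric

observed = reflectance * illuminant      # what actually hits the retina

# Discount the right illuminant and the "real" surface color comes back:
estimated = observed / illuminant
print(estimated)                         # [0.3 0.4 0.8] -- blue again

# Guess the wrong illuminant and the very same pixels shift toward neutral,
# which is roughly what the white-and-gold camp is doing:
wrong_guess = np.array([0.6, 0.8, 1.0])  # hypothetical cool, blue-ish light
print(observed / wrong_guess)            # ~[0.50 0.45 0.48], nearly gray
```

The whole argument of the piece is in that last line: identical retinal input, two different illuminant estimates, two different dresses.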
Usually that system works just fine. This image, though, hits some kind of perceptual boundary. That might be because of how people are wired. Human beings evolved to see in daylight, but daylight changes color, shifting along a chromatic axis that runs from the pinkish red of dawn, up through the blue-white of noontime, and then back down to reddish twilight. “What’s happening here is your visual system is looking at this thing, and you’re trying to discount the chromatic bias of the daylight axis,” says Bevil Conway, a neuroscientist who studies color and vision at Wellesley College. “So people either discount the blue side, in which case they end up seeing white and gold, or discount the gold side, in which case they end up with blue and black.” (Conway sees blue and orange, somehow.)
We asked our ace photo and design team to do a little work with the image in Photoshop, to uncover the actual red-green-blue composition of a few pixels. That, we figured, would answer the question definitively. And it came close.
In the image as presented on, say, BuzzFeed, Photoshop tells us that the places some people see as blue do indeed track as blue. But…that probably has more to do with the background than the actual color. “Look at your RGB values. R 93, G 76, B 50. If you just looked at those numbers and tried to predict what color that was, what would you say?” Conway asks.
So…kind of orange-y?
“Right,” says Conway. “But you’re doing this very bad trick, which is projecting those patches on a white background. Show that same patch on a neutral black background and I bet it would appear orange.” He ran it through Photoshop, too, and now figures that the dress is actually blue and orange.
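To see why those numbers read as orange-y, you can convert the RGB triple to hue. A quick sketch, using the exact pixel values Conway quotes above:

```python
import colorsys

# The RGB values quoted for a "gold" patch of the dress.
r, g, b = 93, 76, 50

# colorsys works on 0-1 floats; hue comes back as a fraction of 360 degrees.
h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
print(f"hue: {h * 360:.0f} deg, saturation: {s:.2f}, value: {v:.2f}")
# hue: 36 deg -- squarely in the orange/brown range, just dark and desaturated
```

In other words, the pixels themselves are a dim orange; whether you experience them as gold or as black depends on the surround your brain supplies.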
The point is, your brain tries to interpolate a kind of color context for the image, and then spits out an answer for the color of the dress. Even Neitz, with his weird white-and-gold thing, admits that the dress is probably blue. “I actually printed the picture out,” he says. “Then I cut a little piece out and looked at it, and completely out of context it’s about halfway in between, not this dark blue color. My brain attributes the blue to the illuminant. Other people attribute it to the dress.”
Even WIRED’s own photo team—driven briefly into existential spasms of despair by how many of them saw a white-and-gold dress—eventually came around to the contextual, color-constancy explanation. “I initially thought it was white and gold,” says Neil Harris, our senior photo editor. “When I attempted to white-balance the image based on that idea, though, it didn’t make any sense.” He saw blue in the highlights, telling him that the white he was seeing was blue, and the gold was black. And when Harris reversed the process, balancing to the darkest pixel in the image, the dress popped blue and black. “It became clear that the appropriate point in the image to balance from is the black point,” Harris says.
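A rough sketch of the kind of click-to-balance workflow Harris describes, in Python with Pillow and NumPy. One simple approach is to scale each channel so a chosen reference pixel reads as neutral; the file name and coordinates below are placeholders, and this is an illustration of the technique, not Harris's actual procedure:

```python
import numpy as np
from PIL import Image

def balance_to_reference(img, xy):
    """Scale each channel so the pixel at (x, y) becomes neutral gray.

    This mimics click-to-balance in a photo editor: the pixel you pick
    is assumed to be truly neutral (white, gray, or black), and the
    per-channel gains that make it so are applied to the whole image.
    """
    arr = np.asarray(img, dtype=np.float64)
    ref = arr[xy[1], xy[0], :3]                 # RGB at the reference pixel
    gains = ref.mean() / np.maximum(ref, 1e-6)  # push the reference toward gray
    balanced = np.clip(arr[..., :3] * gains, 0, 255)
    return Image.fromarray(balanced.astype(np.uint8))

# Hypothetical usage: the file name and pixel coordinates are placeholders.
img = Image.open("dress.jpg").convert("RGB")
balance_to_reference(img, (100, 200)).save("balanced.jpg")
```

Balance from a pixel that should be white and the dress swings one way; balance from the darkest pixel, as Harris did, and it pops blue and black.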
Read more at Wired Science