As if hearing all the hackneyed stereotypical jokes about Asians having small, squinty, slanted eyes weren’t enough, now there are “intelligent” cameras to remind us about it, too. As TIME magazine reports, minority customers have become disgruntled at certain face-detection cameras, which have unintentionally shown a strong bias toward Caucasian facial features.
One of the cameras deemed “racist” is the Nikon Coolpix, which is designed to warn the picture-taker when someone blinks in a photo. These cameras are programmed with complex algorithms that track certain features in a person’s face. Due to technological limitations, they are sometimes unable to distinguish between half-blinks and naturally narrow eyes, which results in a message on the screen—“Did someone blink?”—that minorities, East Asians in particular, may take as racially insensitive. We remember hearing Joz Wang’s story back in May, when she made fun of the camera glitch by posting her photo of the blink warning on her blog along with the title “Racist Camera! No, I did not blink… I’m just Asian!”
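To see why this kind of glitch happens, here’s a minimal sketch—emphatically *not* Nikon’s actual algorithm—of a common blink heuristic: the “eye aspect ratio” (EAR), the eye’s height divided by its width, computed from landmark points around the eye. The landmark coordinates and the threshold below are invented for illustration.

```python
# A naive blink detector based on the eye aspect ratio (EAR):
# eye openness (vertical lid distance) relative to eye width.
# This is an illustrative sketch, not any camera maker's real code.

def eye_aspect_ratio(landmarks):
    """landmarks: six (x, y) points around one eye --
    [left corner, two upper-lid points, right corner, two lower-lid points]."""
    p1, p2, p3, p4, p5, p6 = landmarks
    # vertical distances between upper- and lower-lid landmarks
    v1 = abs(p2[1] - p6[1])
    v2 = abs(p3[1] - p5[1])
    # horizontal distance between the eye corners
    h = abs(p4[0] - p1[0])
    return (v1 + v2) / (2.0 * h)

BLINK_THRESHOLD = 0.2  # an assumed fixed cutoff, as in simple EAR detectors

def did_someone_blink(landmarks):
    return eye_aspect_ratio(landmarks) < BLINK_THRESHOLD

# A wide-open eye: tall relative to its width -> no warning.
open_eye = [(0, 5), (3, 8), (6, 8), (9, 5), (6, 2), (3, 2)]
# A naturally narrow eye: the fixed threshold misfires, because the
# heuristic cannot tell a half-blink from a narrow eye shape.
narrow_eye = [(0, 5), (3, 5.5), (6, 5.5), (9, 5), (6, 4.5), (3, 4.5)]

print(did_someone_blink(open_eye))    # False
print(did_someone_blink(narrow_eye))  # True -- the spurious "Did someone blink?"
```

The point of the sketch is that a single fixed threshold tuned on one population’s eye shapes will systematically misfire on another’s—exactly the failure mode the Coolpix owners ran into.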
This issue isn’t unique to Nikon, since many big companies license software from manufacturers who use the same underlying code. Other electronic devices that have demonstrated unintentional racial biases include the facial-tracking software in Hewlett-Packard webcams, which sparked the well-known “HP computers are racist” YouTube video, now at nearly 2 million views.
Still, there is hope for improvement as technology advances. According to TIME magazine, Nikon has stated that it is working to improve the accuracy of its cameras’ blink-detection function. So until then, try not to take too much offense!
Photo of Joz Wang via TIME
Last modified: January 29, 2010