Facial recognition has a blind spot
BlackTechLogy: Melanin-biased software leaves a lot of room for error
Writer’s note: This post was originally published on Medium’s “ZORA” on Juneteenth 2019.
Online shopping provides something retail stores usually don’t: personalization. When you log in to a store’s smartphone app or website, the algorithms remember who you are; suggest what you’re most likely to purchase; highlight past purchases; and keep track of your favorite brands, sports teams, or foods.
So imagine being able to use that same personalization inside a retail store with facial recognition. Your face is scanned as soon as you walk by a kiosk or camera. You can skip past the retail store clerks suspiciously refolding clothes wherever you’re standing or asking repeatedly if you “need something.” Ideally, you could enter a store and the software’s algorithms would already confirm you’re a repeat customer.
Last year, Cali Group (the owner of CaliBurger) piloted facial recognition software for its in-store loyalty program. Demonstrated at the National Retail Federation’s annual expo, the software could identify registered customers, activate their loyalty accounts, and display their favorite meals from the West Coast restaurant. The restaurant was also in the early stages of planning face-based payments to replace credit cards.
For people of color, facial recognition could be just as stressful as the retail clerk who’s following them around a store.
Samsung and AT&T used facial recognition software at the expo too, Forbes reports. The companies used the software to calculate demographics and store traffic and to send store associates the names of incoming shoppers. Even Walmart has been working on patenting facial recognition software that can detect a shopper’s mood. Similar to Cloverleaf’s shelfPoint, the goal is to identify how customers are feeling when something catches their eye enough to buy it.
This software may help retailers become more familiar with their customers, lower the threat of shoplifting, and bridge the gap between online and brick-and-mortar retail. But for people of color, it could potentially be just as stressful as the retail clerk who might be following them around a store.
According to the Electronic Frontier Foundation, a nonprofit organization defending civil liberties in the digital world, facial recognition software is prone to error, and it is especially error-prone when identifying people of color, women, and younger people. A “false negative” means the software fails to match a person’s face at all; a “false positive” means it misidentifies the person as someone else. For law enforcement, this can pose quite a problem. Since 2013, San Diego police officers have been able to stop and photograph a person of interest and run their face through the Tactical Identification System (TACIDS). That photograph is then analyzed against more than a million booking shots.
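To make the two error types concrete, here is a minimal sketch of how a face-matching system might classify results. Real systems compare numerical face “embeddings”; the person names, similarity scores, and the 0.7 threshold below are illustrative assumptions, not details of TACIDS or any real product.

```python
# Hypothetical face-matching sketch: scores and threshold are made up.
def match_face(similarity_scores, threshold=0.7):
    """Return the best database match at or above the threshold, or None."""
    best_id, best_score = None, 0.0
    for person_id, score in similarity_scores.items():
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id if best_score >= threshold else None

# A "false negative": the true match scores below the threshold,
# so the system fails to identify the person at all.
print(match_face({"person_A": 0.55, "person_B": 0.40}))  # None

# A "false positive": a different person's score clears the threshold,
# so the system confidently names the wrong individual.
print(match_face({"person_A": 0.45, "person_B": 0.82}))  # person_B
```

The threshold is the crux: set it high and false negatives rise; set it low and false positives rise. When a system is less accurate for darker-skinned faces, both error types climb at once, whichever threshold is chosen.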
For White men, the software works correctly 99% of the time. For darker-skinned women, error rates approached 35%.