I Do See Color

Facial recognition has a blind spot

BlackTechLogy: Melanin-biased software leaves a lot of room for error

Shamontiel L. Vaughn
Jun 19, 2019

Photo credit: cottonbro studio/Pexels

This post is part of a series entitled “BlackTechLogy.” Click here for the archived posts.


Online shopping provides something retail stores usually don’t: personalization. When you log in to a store’s smartphone app or website, the algorithms remember who you are; suggest what you’re most likely to purchase; highlight past purchases; and keep track of your favorite brands, sports teams, or foods.

So imagine being able to use that same personalization inside a retail store with facial recognition. Your face is scanned as soon as you walk past a kiosk or camera. You can skip the retail clerks suspiciously refolding clothes wherever you happen to be standing or repeatedly asking if you “need something.” Ideally, you could walk into a store and the software would already confirm you’re a repeat customer.

Last year, Cali Group (the owner of CaliBurger) piloted facial recognition software for its in-store loyalty program. Used at the National Retail Federation’s annual expo, the software could identify registered customers, activate their loyalty accounts, and display their favorite meals from the West Coast restaurant. The restaurant was also in the early stages of planning face-based payments to replace credit cards.

Recommended Read: “Why the Whitley Gilberts and Meghan Markles are threatening to racists ~ Yes, you are about 99% likely to be racist if you hate 'With Love, Meghan'”

Samsung and AT&T used facial recognition software at the expo too, Forbes reports. The companies used the software to calculate demographics and store traffic and to send store associates the names of incoming shoppers. Even Walmart has been working on patenting facial recognition software that can detect a shopper’s mood. Similar to Cloverleaf’s shelfPoint, the goal is to identify how customers are feeling when something catches their eye enough to buy it.

This software may help retailers become more familiar with their customers, lower the threat of shoplifting, and bridge the gap between online and brick-and-mortar retail. But for people of color, it could be just as stressful as the retail clerk who follows them around a store.

Photo credit: Franklin/Pexels

According to the Electronic Frontier Foundation, a nonprofit organization defending civil liberties in the digital world, facial recognition software may be prone to error, and it performs worst when identifying people of color, women, and younger people. If the software returns a “false negative,” it fails to match a person’s face at all, even when that person is in its database.


ADVERTISEMENT ~ Amazon

As an Amazon affiliate, I earn a percentage from purchases with my referral links. I know some consumers are choosing to boycott Amazon for its DEI removal. However, after thinking about this thoroughly, I choose to continue promoting intriguing products from small businesses, women-owned businesses, and (specifically) Black-owned businesses that still feature their items on Amazon. All five of my Substack publications now include a MINIMUM of one product sold by a Black-owned business. (I have visited the seller’s official site, not just the Amazon Black-owned logo, to verify this.) If you still choose to boycott, I 100% respect that decision.
Topicals Faded Brightening Under Eye Masks | Revitalizing Patches to Depuff, Hydrate, and Illuminate | Diminishes Dark Circles and Fine Lines

A “false positive,” on the other hand, matches a person’s face to someone else entirely. For law enforcement, that can pose quite a problem. Since 2013, San Diego police officers have been able to stop and photograph a person of interest and run their face through the Tactical Identification System (TACIDS), which analyzes that photograph against more than a million booking shots.
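
For the technically curious, here is a minimal sketch of how a single similarity threshold can produce both kinds of errors. Everything in it (the function, the scores, the cutoff) is hypothetical and for illustration only; it is not any vendor’s actual matching code.

```python
# Illustrative sketch only: how one similarity threshold yields both
# false negatives and false positives in face matching.
# The scores and the 0.75 cutoff below are made-up numbers.

THRESHOLD = 0.75  # scores at or above this count as a "match"

def is_match(similarity_score: float, threshold: float = THRESHOLD) -> bool:
    """Return True when the system would declare a match."""
    return similarity_score >= threshold

# False negative: the database really does contain this person,
# but the computed score comes back low, so no match is reported.
print(is_match(0.62))  # False -> the system misses a real match

# False positive: the database photo belongs to someone else,
# but the score comes back high, so a wrong match is reported.
print(is_match(0.81))  # True -> the system names the wrong person
```

Run against a database of more than a million booking photos, even a small false positive rate can surface a long list of wrong candidates, which is why the error gap described below matters so much.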

For White men, the software works correctly 99% of the time. For darker-skinned women, error rates climb to nearly 35%.
