
The folly of facial recognition

Within the ranks of technology, GIGO is the term we use when referring to bad outcomes from programs. It stands for Garbage In, Garbage Out. Simply put, any system is only as good as the information that has been put into it: inaccurate data will yield inaccurate results. This seems pretty straightforward, but many times programmers and system designers aren't aware of how bad the input data was until the results come out blatantly skewed and obviously inaccurate.

Recently, Microsoft's AI-based facial recognition system has come under scrutiny for misidentifying women and people of colour, and IBM has said that it will no longer sell general-purpose facial recognition services. While facial recognition software has improved greatly over the last decade thanks to advances in "artificial intelligence", the technology has been shown to suffer from bias along lines of age, race, and ethnicity, which can make the tools unreliable for law enforcement and security and ripe for potential civil rights abuses. In GIGO terms: bad data, thanks to biases in how the input was gathered and categorized.
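To make the GIGO point concrete, here is a minimal sketch in Python. Everything in it is made up for illustration: the synthetic "face embeddings", the group names and offsets, and the deliberately simple nearest-centroid classifier have nothing to do with any real vendor's system or dataset. The point it demonstrates is only this: train a classifier on data where one group vastly outnumbers another, and the underrepresented group gets worse results.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_group(n, offset):
    # Two classes per group, separated along both axes; `offset` shifts the
    # whole group, standing in for systematic differences between groups.
    X = np.vstack([rng.normal(0.0 + offset, 1.0, (n, 2)),
                   rng.normal(3.0 + offset, 1.0, (n, 2))])
    y = np.array([0] * n + [1] * n)
    return X, y

# Garbage in: group A dominates the training set, group B barely appears.
Xa, ya = make_group(500, offset=0.0)
Xb, yb = make_group(10, offset=1.5)
X_train = np.vstack([Xa, Xb])
y_train = np.concatenate([ya, yb])

# A toy nearest-centroid classifier: predict whichever class's
# training-set mean is closest.
centroids = np.array([X_train[y_train == c].mean(axis=0) for c in (0, 1)])

def predict(X):
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)

# Garbage out: on fresh, balanced test samples, the underrepresented
# group's accuracy lags well behind the dominant group's.
for name, offset in (("group A", 0.0), ("group B", 1.5)):
    X_test, y_test = make_group(200, offset)
    accuracy = (predict(X_test) == y_test).mean()
    print(f"{name}: accuracy = {accuracy:.1%}")
```

Notice that nothing in the classifier itself is "biased"; the gap comes entirely from the training data. That's the uncomfortable part: the fix is representative data, not a cleverer algorithm.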


Clearview AI, a company many of us have come into contact with inadvertently, has come under heavy scrutiny since earlier this year, when it was discovered that its facial recognition tool, built with more than 3 billion images compiled in part by scraping social media sites, was being widely used by private sector companies and law enforcement agencies.

Clearview has since been issued numerous cease and desist orders and is at the centre of several privacy lawsuits. Facebook was also ordered in January to pay $550 million to settle a class-action lawsuit over its unlawful use of facial recognition technology. Remember that proliferation of apps that showed what you would look like as a member of the opposite sex, or what you would look like in 20 years? The images you uploaded were gathered, sorted, and sold to the highest bidder, and many of them became part of the databases used to train the "AI" that law enforcement agencies use to identify people in crowds. I wonder how many of the uploaded pictures had Snapchat filters applied? I can picture an FBI agent poring over a screen, looking for a woman in her 20s with rosy cheeks, whiskers, and bunny ears.


It's probably not as bad as my last example, but the fact that Microsoft and IBM have pulled out speaks volumes. Changes to privacy laws and use-specific waivers may make the prospect of developing these systems more attractive, but in our business money talks, and in this case it seems to be saying that facial recognition algorithms are more likely to cost a company money than to make it money.

