AI at retail stores – Is right of privacy compromised?

Click. The camera under the sunshade of a small convenience store at the corner of South 38th Street and Pacific Avenue takes a picture of a patron entering the store. Using artificial intelligence, the image is then matched against a database of known robbers and shoplifters in the area. Denise Diharce, a regular customer at Jacksons Food Store, was taken aback when she learned that the Tacoma location is testing a high-tech system that will compare customers against images of previous crime suspects before they enter the store. According to Diharce, it is a privacy violation: people should be notified before their photos are captured and matched against images of criminals.

According to Jacksons spokesperson Russ Stoddard, the system is currently deactivated after a recent test, but when fully functional it will operate from 8 p.m. to 6 a.m. Once it is running, a sign at the front of the store will inform customers that facial recognition technology is in use, and a speaker will ask them to look at the camera. The door won’t unlock if someone is wearing a mask or has previously been flagged for criminal activity in in-store camera footage.

Another customer, May, who asked that her surname not be used, said she had no issue with the surveillance tool but wondered about its purpose in a corner store, questioning why a small shop needs something like this.

KIRO 7 first reported the testing at Jacksons. It is part of a larger movement in which both retailers and governments are employing AI cameras to tackle criminal activity and to observe people’s habits for other purposes. However beneficial, the advancement has left privacy advocates wondering whether the anti-crime benefits outweigh the civil-liberty risks.

Other major retailers such as Target, Walmart and Lowe’s have also used AI cameras to prevent criminal activity. According to a recent report by research firm CB Insights, stores are not always transparent about their use of the technology. AI cameras do more than watch for thieves: The Associated Press reported that at a Long Island Walmart, thousands of high-resolution cameras were recently installed to monitor inventory on shelves and even the ripeness of fruit.

Human rights groups have warned that if it goes unregulated, the growing use of facial recognition software in stores and elsewhere could entrench biases and lead to unwarranted surveillance. Jacksons’ facial recognition software is made by Blue Line Technology, a Fenton, Mo.-based company that partners with Dell and Axis to provide the video security system.

The system is not connected to a criminal database, so a store manager must flag a suspected shoplifter’s image to receive a notification when that person returns to the store. The system does not collect any personal information, and it saves images of non-flagged customers for only 24 to 48 hours, while images of suspected shoplifters are kept in the database. Retail can be dangerous work: According to The D&D Daily, an industry publication, 424 people died violently in stores in 2017.
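The retention policy described above — flagged images kept indefinitely, non-flagged captures expiring within 24 to 48 hours — can be illustrated with a minimal sketch. This is purely hypothetical: Blue Line’s actual system is proprietary, and every name here is invented for illustration.

```python
# Hypothetical sketch of the stated retention policy. Not Blue Line's code;
# all names are illustrative assumptions.
from datetime import datetime, timedelta

RETENTION = timedelta(hours=48)  # upper bound of the stated 24-48 hour window


class FaceRecord:
    """One captured face image and its flag status."""

    def __init__(self, image_id, captured_at, flagged=False):
        self.image_id = image_id
        self.captured_at = captured_at
        # Flagged by a store manager, not drawn from any criminal database.
        self.flagged = flagged


def purge_expired(records, now):
    """Keep flagged records indefinitely; drop non-flagged ones past retention."""
    return [r for r in records
            if r.flagged or now - r.captured_at <= RETENTION]


now = datetime(2019, 7, 1)
records = [
    FaceRecord("a1", now - timedelta(days=3), flagged=True),   # kept: flagged
    FaceRecord("b2", now - timedelta(days=3), flagged=False),  # dropped: expired
    FaceRecord("c3", now - timedelta(hours=12), flagged=False),  # kept: recent
]
kept = purge_expired(records, now)
print([r.image_id for r in kept])  # → ['a1', 'c3']
```

The key design point is that the flag, set by a human manager, is the only thing that exempts an image from automatic deletion — which is also why a mistaken flag persists until someone removes it.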

More than 20 retail companies that deployed the Blue Line Technology platform across the nation have reported a 95% decrease in police service calls, and shoplifting has been cut in half. A video of an unsuccessful robbery last summer showed two masked people running away after they were refused entry to a Yakima AMPM store that uses Blue Line’s system. Tom Sawyer, a senior partner at Blue Line Technology, said he understands the stigma attached to facial recognition, but all the company wants is to stop robbers.

According to Vice President of Operations Jill Linville, the Jacksons in Tacoma is the Idaho-based company’s second location to be fitted with the video surveillance system. Another, in Portland, Ore., has used the technology for about a year, and company spokesperson Stoddard said criminal activity there has plunged thanks to the system. Once fully installed at the Tacoma store, the AI security system will run on an experimental basis as a pilot for other locations across five Western states, though Linville said it will not be installed in all of the company’s nearly 245 stores.

This security platform takes the limited action of locking a door on suspected shoplifters. By contrast, another AI security camera startup, Austin, Texas-based Athena Security, employs video analytics — a system that tries to analyze what people are doing in real time — to detect criminal activity or guns at businesses, schools and places of worship, then alerts law enforcement to the potential threat. According to the company, it aims to avoid ethnic profiling and the collection of personal data by blurring out subjects’ faces before the AI system analyzes the video. Last month, the startup installed its AI-powered cameras at the Al Noor mosque in Christchurch, New Zealand, after the killing of 50 worshipers at two mosques there in March.

“The core A.I. brain that powers Athena Security is just scratching the surface of what it’s capable of to mitigate crime and save lives, but the road map is also full of evil potholes that we’ll need to size up and take the right course,” Chris Ciabarra, CTO and co-founder of Athena Security, said in an email.

A recent ACLU report on intelligent video-monitoring technology finds that AI security systems are becoming more commonplace as it gets easier and quicker to build systems that recognize people and analyze images. Training an artificially intelligent machine to recognize objects in images takes only a few minutes, according to a Stanford University report.

The proliferation of AI surveillance technology could lead to a society where everyone’s public actions and behavior are exposed to constant and complete evaluation and judgment by agents of authority — a society where everyone is scrutinized, warned Jay Stanley, senior policy analyst with the ACLU Speech, Privacy, and Technology Project. Blue Line Technology footage is available to law enforcement only if store management decides to share it. Neighbors, an app introduced by Amazon’s smart-doorbell company Ring, lets users share neighborhood surveillance footage and tips with law enforcement agencies.

Jake Laperruque, senior counsel at the D.C.-based nonprofit Project on Government Oversight, says many retail stores use surveillance technology to track shopper habits and curb shoplifting, but he called Jacksons’ use of AI technology to grant entry “a little extreme.”

AI is also being used in retail to keep an eye on merchandise. The ACLU reported on Target’s use of AI video analytics to alert store security when customers spend a significant amount of time in front of particular items. Target told The Seattle Times that last year it stopped using the video analytics system and the facial recognition software it had deployed to prevent fraud and theft in a small number of stores.

“As we always do, we’ll continue to experiment and learn from upcoming technologies that can keep our guests and team members safe,” Target spokesperson Danielle Schumann wrote in an emailed statement. Another American retailer, Lowe’s, also chose not to use facial recognition technology for any purpose after a three-month trial at three stores four years ago, said company spokeswoman Maureen Wallace.

Beyond security systems, video analytics is employed in a range of machines for commercial or governmental purposes — for instance, to power robots or autonomous cars. Its unparalleled potential for mass surveillance obliges policymakers to strike a proper balance between the legitimate right to privacy and freedom of expression when formulating regulations, Stanley says.

Government agencies should openly debate the use of the technology before deploying it, to guard against misuse of video analytics systems. Stanley suggests that video analytics should be used only in urgent situations and should not be allowed to make decisions that could change people’s lives without their approval. Nor should private companies use video analytics to collect customer data for marketing purposes. Laperruque of the Project on Government Oversight voiced concern about the absence of regulation over the use of facial recognition in retail settings. Facial recognition technology has been shown to misidentify women and darker-skinned people at higher rates than fair-skinned males.

Although Blue Line Technology representative Sawyer said the software has never misidentified anyone, its use at Jacksons could expose the company to a discrimination lawsuit.
