Sainsbury’s shopper misidentified by staff says he felt like ‘criminal’

Warren Rajah, 42, from Elephant and Castle, south London, says he was in a store on January 27 when staff approached him and asked him to leave.

By contributor Ted Hennessey, Press Association
Warren Rajah was told to leave his local Sainsbury’s after staff mistook him for an offender flagged by facial recognition software (Handout/PA)

A data strategist who was told to leave Sainsbury’s after staff mistook him for an offender flagged by facial recognition software has said he was made to feel like a “criminal”.

Warren Rajah, 42, from Elephant and Castle, south London, says he was in his local store on January 27 when staff approached him, asked him to leave and took his shopping.

When a “distraught” Mr Rajah asked why, staff pointed to a sign showing that the store used facial recognition technology, he said.

In fact, he had been mistaken for another person, who was listed in the system as an offender and was also in the store at the time.

Sainsbury’s said there was no fault with the Facewatch technology, which has been rolled out in seven of its stores, and has apologised to Mr Rajah.

On being misidentified, Mr Rajah told the Press Association: “You feel horrible, you feel like a criminal and you don’t even understand why.”

He said: “To tell you to leave the store without any explanation gives you the impression that you’ve done something wrong.

“If you speak to anyone in the public, that is what they will tell you, when you’ve been forced and excluded from an environment, you automatically think you’ve done something wrong, especially with security.

“That’s just a normal human response.”

Mr Rajah said that after being removed from the store he contacted Facewatch, which confirmed he was not on its database once he had sent a copy of his passport and a photo of himself.

Sainsbury’s later apologised and offered him a £75 shopping voucher.

A spokesperson for the firm said: “We have been in contact with Mr Rajah to sincerely apologise for his experience in our Elephant and Castle store.

“This was not an issue with the facial recognition technology in use but a case of the wrong person being approached in store.”

The UK’s second largest supermarket chain has said the technology is part of its efforts to identify shoplifters and curb a sharp increase in retail crime in recent years.

Its website says that the system has a “99.98% accuracy rate and every alert is reviewed by trained colleagues before any action is taken”.

Sainsbury’s said that the system issues an alert based on criminal behaviour submitted by the store or other retailers using Facewatch nearby.

But Mr Rajah said he now has “no interest” in shopping at Sainsbury’s and wants people to be aware of facial recognition technology being used in stores.

He said: “It’s borderline fascistic as well, how can you just have something done to you and not have an understanding? How can you be excluded from a space and not have an understanding or an explanation?”

A Facewatch spokesperson said: “We’re sorry to hear about Mr Rajah’s experience and understand why it would have been upsetting. This incident arose from a case of human error in store, where a member of staff approached the wrong customer.

“Our data protection team followed the usual process to confirm his identity and verified that he was not on our database and had not been subject to any alerts generated by Facewatch.”

They added that when someone makes a subject access request, the information they submit is not stored or used for any other purpose and is deleted once the individual’s identity has been verified.

Jasleen Chaggar of Big Brother Watch said: “The idea that we are all just one facial recognition mistake away from being falsely accused of a crime or ejected from a store without any explanation is deeply chilling.

“To add insult to injury, innocent people seeking remedy must jump through hoops and hand over even more personal data just to discover what they’re accused of.

“In the vast majority of cases, they are offered little more than an apology when companies are finally forced to admit the tech got it wrong.”

Ms Chaggar said the organisation “regularly hears from members of the public who are left traumatised after being wrongly caught in this net of privatised biometric surveillance”.

The Information Commissioner’s Office (ICO) said: “Facial recognition technology can help retailers detect and prevent crime and has clear benefits in the public interest. However, its use must comply with data protection law.

“Retailers should carefully consider the risks of misidentification and have robust procedures in place to ensure the accuracy and integrity of the personal information they collect and process.

“This is especially important where personal information is used in situations which can have a serious impact on a person.”