I witnessed the Metropolitan Police use automated facial recognition at Notting Hill Carnival last year, and in just five minutes of watching I saw the system wrongly flag two innocent women walking down the street as matches for men on the police's "watch list".
London's Metropolitan Police has tested AFR at a total of three events, including the city's Notting Hill Carnival in 2016 and 2017 and a Remembrance Sunday event in November, the watchdog found.
Two police forces acknowledged they were now testing facial recognition cameras.
Police have been rolling out the software at major events such as sporting fixtures and music concerts, including a Liam Gallagher concert and international rugby matches, aiming to identify wanted criminals and people on watch lists.
Automated facial recognition (AFR) technology used by London's Metropolitan Police is designed to find persons of interest in large crowds by comparing the biometrics of attendees caught on camera with images already stored in law-enforcement databases.
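The matching step described above can be sketched in miniature. This is an illustrative assumption, not the Met's actual system: real AFR pipelines derive face embeddings from deep neural networks, whereas here each "face" is a hypothetical pre-computed feature vector, and the watch-list names and the 0.95 threshold are invented for the example.

```python
import math

# Hypothetical watch-list templates (in practice: embeddings of custody images).
WATCH_LIST = {
    "suspect_a": [0.9, 0.1, 0.3],
    "suspect_b": [0.2, 0.8, 0.5],
}

# Assumed threshold; lowering it catches more suspects but raises false positives.
MATCH_THRESHOLD = 0.95

def cosine_similarity(a, b):
    """Similarity between two feature vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def check_against_watch_list(face_vector):
    """Return (name, score) of the best match above threshold, else None."""
    name, template = max(
        WATCH_LIST.items(),
        key=lambda item: cosine_similarity(face_vector, item[1]),
    )
    score = cosine_similarity(face_vector, template)
    return (name, score) if score >= MATCH_THRESHOLD else None

# A passer-by whose features happen to sit close to a stored template
# triggers an alert -- the false-positive failure mode the article describes.
print(check_against_watch_list([0.88, 0.12, 0.33]))
```

Because an innocent face can land near a template by chance, every alert from a system like this still needs human verification before anyone is stopped.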
The system incorrectly flagged 102 people as potential suspects and led to no arrests.
"When we first deployed and we were learning how to use it. some of the digital images we used weren't of sufficient quality", said Deputy Chief Constable Richard Lewis. If you think this isn't worth worrying about, bear in mind that on the basis of an incorrect match the police have the power to stop you in the street and require you to identify yourself, in order to prove you aren't the person their computer tells them you are.
"If we move forward on this path, these systems will mistakenly identify innocent people as criminals or terrorists and will be used by unscrupulous governments to silence unwelcome voices".
"This system does not include facial recognition but does capture images and licence plate numbers, enabling our loss prevention staff to identify offenders more easily and get on top of theft".
The SWP said that false positives were to be expected while the technology develops, but that the accuracy was improving, and added that no one had been arrested after a false match - again because of human intervention. "Faces in the video stream that do not generate an alert are deleted immediately".
"Real-time facial recognition is a dangerously authoritarian surveillance tool that could fundamentally change policing in the UK".
But Big Brother Watch said it was concerned that facial recognition cameras would affect "individuals' right to a private life and freedom of expression".
Underlying the concerns about the poor accuracy of the kit are complaints about a lack of clear oversight - an issue that has been raised by a number of activists, politicians and independent commissioners in related areas.
What does Big Brother Watch want?
The majority of the people whose faces were scanned automatically were also not notified that the police system had "matched" them as targets.
The privacy group also said: "Automated facial recognition technology is now used by United Kingdom police forces without a clear legal basis, oversight or governmental strategy."
"Should my concerns not be addressed I will consider what legal action is needed to ensure the right protections are in place for the public", said Ms Denham.
"When trialling facial recognition technologies, forces must show regard to relevant policies, including the Surveillance Camera Code of Practices and the Information Commissioner's guide", it said in a statement.