The privacy group also said: "automated facial recognition technology is now used by United Kingdom police forces without a clear legal basis, oversight or governmental strategy".
Information Commissioner Elizabeth Denham said the issue had become a "priority" for her office.
The product used by both police forces is called "NeoFace Watch", made by Japanese firm NEC.
Figures revealed in response to the group's Freedom of Information requests show that, for London's Metropolitan Police, 98 per cent of "matches" identified by the technology were wrong; for South Wales Police the figure was 91 per cent.
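A simple worked example shows why most "matches" can be wrong even if the system itself seems accurate: when almost everyone scanned is innocent, even a small false-alarm rate produces far more wrong alerts than genuine ones. The Python sketch below uses entirely hypothetical figures; the crowd size, watch-list numbers and alert rates are assumptions for illustration, not data from the report.

# Illustrative only: hypothetical figures showing how a low base rate of
# wanted individuals makes most alerts false, even for a seemingly
# accurate system. None of these numbers come from the police data.

crowd_size = 100_000        # faces scanned at an event (assumed)
wanted_in_crowd = 20        # people genuinely on the watch list (assumed)
true_positive_rate = 0.90   # chance a wanted face is flagged (assumed)
false_positive_rate = 0.01  # chance an innocent face is flagged (assumed)

true_alerts = wanted_in_crowd * true_positive_rate                    # 18
false_alerts = (crowd_size - wanted_in_crowd) * false_positive_rate   # ~1,000

share_wrong = false_alerts / (true_alerts + false_alerts)
print(f"Share of alerts that are wrong: {share_wrong:.0%}")           # ~98%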
"For the use of FRT to be legal, the police forces must have clear evidence to demonstrate that the use of FRT in public spaces is effective in resolving the problem that it aims to address, and that no less intrusive technology or methods are available to address that problem".
On 31 occasions, police followed up supposed matches, only to find that innocent people had been stopped because of false identifications.
The report says U.S. research shows the technology is particularly inaccurate when identifying minority ethnic women.
Automated facial recognition is an artificial intelligence (AI) computer system that runs alongside a surveillance camera on the street, recognising people's faces in real time and matching them against watch-lists created by the police. Potential matches are then flagged, allowing officers to investigate further. Two police forces have acknowledged that they are now testing such facial recognition cameras.
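In outline, such a system reduces each detected face to a numerical "embedding" and compares it with embeddings of the watch-list photographs, flagging anything above a similarity threshold. The Python sketch below shows only that matching step, with random stand-ins for real embeddings; the threshold, dimensions and watch-list size are all assumptions, and commercial products such as NeoFace Watch are proprietary and far more sophisticated.

import numpy as np

# Hypothetical sketch of the watch-list matching step. In a real system the
# embeddings would come from a face-recognition model; here they are random.
rng = np.random.default_rng(0)
watchlist = rng.normal(size=(50, 128))   # 50 watch-list faces, 128-dim embeddings
watchlist /= np.linalg.norm(watchlist, axis=1, keepdims=True)

def flag_matches(face_embedding, threshold=0.8):
    """Return watch-list indices whose cosine similarity to the detected face
    exceeds the threshold. The threshold value is an assumption: it trades
    false alerts against missed matches."""
    face = face_embedding / np.linalg.norm(face_embedding)
    similarities = watchlist @ face      # cosine similarity per watch-list entry
    return np.nonzero(similarities > threshold)[0].tolist()

detected = rng.normal(size=128)          # stand-in for one face seen on camera
print(flag_matches(detected))            # any indices returned are "potential matches"

The threshold is the crux of the accuracy debate: set it low and more wanted faces are caught but more passers-by are wrongly flagged; set it high and false alerts fall, but so do genuine matches.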
That the police consider these embarrassing inaccuracy rates acceptable, arguing that "no facial recognition system is 100% accurate", is even more worrying: South Wales Police has already planned further deployments, and the Metropolitan Police is expanding its use of facial recognition throughout 2018. Because of the poor quality of the images used, the system was identifying people wrongly.
The first of these false identifications was at Notting Hill, but the person identified was no longer wanted for arrest because the information used to generate the watch list was out of date.
"If an incorrect match has been made, officers will explain to the individual what has happened and invite them to see the equipment along with providing them with a Fair Processing Notice".
Silkie Carlo, the director of Big Brother Watch, said: "Real-time facial recognition is a dangerously authoritarian surveillance tool that could fundamentally change policing in the UK".
"Regarding "false" positive matches - we do not consider these as false positive matches because additional checks and balances are in place to confirm identification following system alerts", it said in a statement.
Despite a 2012 High Court ruling that keeping images of presumed innocent people on file was unlawful, the government has said it is not possible to automate their removal.
But Big Brother Watch said it was concerned that facial recognition cameras would affect "individuals' right to a private life and freedom of expression".
Adding real-time facial recognition to our surveillance state's already worryingly militaristic arsenal would fundamentally change policing in the United Kingdom, and with it the health of our democracy.
New data protection rules are about to come into force in the United Kingdom, requiring organisations to assess the risks of new technologies, particularly where biometric data is involved, and in some circumstances to provide a data protection impact assessment to Denham's office.

The majority of the people whose faces were scanned automatically were also not notified that the police system had "matched" them as targets.

"Should my concerns not be addressed, I will consider what legal action is needed to ensure the right protections are in place for the public," Ms Denham said.