Guinea pigs and Isis: how AI got it wrong over LCF
INVESTORS trying to get compensation for the £236 million they put into London Capital & Finance claim to have been let down by shocking errors in the AI software used to transcribe phone calls.
The Financial Services Compensation Scheme is charged with deciding who deserves to be paid, based on evidence including transcripts of calls from LCF’s marketing staff.
It uses a voice recognition system to hunt for keywords in phone calls to spot potential mis-selling, but LCF victims have found it makes ridiculous errors which they claim could have resulted in their applications being rejected.
They obtained the AI’s transcripts of their calls through subject access requests, and were appalled at what they say they found. In one case seen by the Evening Standard, the word “ISA” was repeatedly misheard by the computer as “Isis”. A customer who told the firm she had already taken out an ISA for this year saw the statement transcribed as: “I think I’ve already taken my eyes out for this year.” The word “bond” — also a vital keyword — regularly came up as “bomb”.
On one occasion, the LCF representative is transcribed as saying: “Give me callback when you think you’ve done it and then I’ll have another girl, you Guinea pig again.”
Several LCF victims say results of their information requests came back showing the AI had missed or lost large numbers of calls. One said all the data on their first three investments was missing.
Former PE teacher Peter Thornley said he was “badgered” into investing by a high-pressure LCF salesman who would call him on his mobile while he was teaching. Yet none of those calls came up in his transcripts. “Clearly there are big chunks of information not being looked at, which makes it hard to have any faith in the system at all,” he said.
The FSCS said it was using AI software “to reduce how long LCF customers are having to wait”. It said machine learning was being used to iron out early glitches and the system now meets its “high standards of accuracy”.