When no one listens, will the bug be fixed?
The security gaffe Nadeem Douba discovered this spring could have given him access to information about millions of dollars worth of transactions at a handful of banks around the world.
With a combination of keystrokes and code, he said, he could have uploaded a surveillance-gathering file and endangered real people’s savings and personal account information.
It was a vulnerability that had apparently gone unnoticed for years.
“User names and passwords; transaction information; depositing money from where to where; credit card information,” he said. “These kinds of things you can collect quite stealthily.”
Douba is an Ottawa researcher who owns a cybersecurity assessment firm and has developed tools to gauge the soundness of software and systems. He’s accustomed to having people listen when he raises serious concerns.
But that’s not what happened in this case, at least not at first.
Temenos Group AG is one of many vendors that sell core banking systems to financial services companies: the equivalent of a bank in a box.
Such products tie together branches, online banking and ATMs through a general ledger, ensuring that balances remain the same, no matter the venue.
Temenos — a Greek word meaning “sacred space” — is already a major global player. The Geneva company processes the daily transactions of more than 500 million banking customers, and counts more than 2,000 financial services firms, including 38 of the top 50 banks, as clients, according to its marketing materials. Last year, it brought in roughly $468 million in revenue.
Temenos is trying to expand into the U.S. by convincing American banks to replace their older cores with its more modern one — an upgrade it insists can save time and money by offering immediate transaction analytics, among other features.
Security is a priority, the company insists.
“We have official channels for reporting any security issues,” wrote Mark Frampton, a Temenos employee in charge of analyst relations, in an e-mail.
“Clients have access to a web-portal to log and track issues, as well as a 24/7 phone number and specific contact email addresses. We have processes in place which include security reviews at various stages of development requirements, design, code and then security testing.”
Outside researchers
But Temenos isn’t used to dealing with outside researchers.
“There is no formal channels through which researchers can contact us if they are not affiliated with any of our clients or partners,” Frampton said.
Douba said he became interested in Temenos’ products this year after a 10-day gig with a financial services firm — but he wasn’t working with that client when he found the bug.
Curious about Temenos’ systems, he said, he gained access from a contact at a different bank overseas to do some further testing.
Some major tech firms, like AT&T, Microsoft, Google and Facebook, welcome such disclosures from outsiders.
Others tend to be more sensitive to unsolicited revelations, especially in heavily regulated industries such as financial services.
On paper, that philosophy makes sense. Trust only those you know.
This year, Oracle’s chief security officer, Mary Ann Davidson, showed that such attitudes toward hackers are relatively commonplace.
“Please comply with your license agreement and stop reverse engineering our code, already,” she wrote in a rant that appeared on the software company’s website. It was retracted and denounced by Oracle within a day.
The malware detection company FireEye recently won a preliminary injunction against a German security researcher, effectively censoring his presentation about vulnerabilities in its products.
Connecting hackers
“What you’re talking about is the tension in between this idea of quite a free flow of information, clashing up against an industry that’s traditionally been quite closed about security issues,” said Casey Ellis, CEO and co-founder of BugCrowd, a San Francisco company that connects hackers who want to help companies — known as White Hats — with firms that welcome their findings.
Douba said he discovered the Temenos vulnerability in late May.
After scouring search engines and the company’s website, Douba said, the best he could find was a general e-mail address — perhaps for the marketing department, he doesn’t remember. He took to Twitter instead.
“@Temenos, who do I have to call to talk about your software’s security? I need your (chief information security officer).”
The company answered within minutes: “@ndouba Hi, please could you (direct message) with your details, along with which (financial institution) you work for? Thanks.”
The problem was that Douba wasn’t working for anyone but himself. He was reaching out purely out of goodwill, he said.
“If you’re an independent contractor trying to get your security research to them, they say: ‘Hey, you’re not part of the bank.’ They kept asking me: ‘Which bank are you part of? You’re not a bank, we really can’t help you,’ ” said Douba.
“That kind of behavior worries me,” he said, “because who knows what else is in this banking software, I barely touched it.”
Many companies believe their systems are harder to hack if less is known about them, according to one veteran cybersecurity consultant.
“That’s fundamentally flawed,” said the expert, who said he has worked directly with banks using Temenos products in the past, but asked not to be named because he’s afraid his career would be hurt if he spoke out.
There’s nothing stopping a criminal hacker from signing up for an online account, then poking around its system for potential flaws, he added.
Outside researchers are another line of defense.
“There are only three ways organizations find out about vulnerabilities: through their own testing (hired or in-house), from friendly hackers or from a breach,” said Katie Moussouris, a former security strategist at Microsoft and the chief policy officer at HackerOne, in an e-mail.
“Once these industries realize that both the first two methods are preferable by orders of magnitude than the third, I think they will come around.”
Some financial services companies already have.
BBVA’s branchless banking subsidiary, Simple, is one of several financial services companies that operate disclosure programs. They often pay what’s called a bug bounty to hackers who raise flags, as Douba did, when they find legitimate security flaws.
A week after Douba tweeted Temenos, he said, he received the first of three blank e-mails from a member of Temenos’ security team.
About three weeks later, Douba said, he got an e-mail from that Temenos employee reiterating that he should take the issue up with a bank using its products.
With the help of a previous client, Douba arranged a phone call. By then it was early September — more than three months after he alerted Temenos.
Strict regulations
Tech vendors to the financial world may have good reason to move slowly when approached by outsiders. Banks are tightly regulated by a number of agencies around the globe. Vendors face a complex web of guidelines that differ from nation to nation.
They’re also the target of a seemingly unending parade of digital thieves, leaving them wary of trusting strangers.
“I would be too, because you’re skipping an important step in there,” said a former executive at both banks and their technology vendors who said he has worked specifically on Temenos products. He asked that his name not be used because he was not authorized to speak on the record.
There’s also an issue of perception, said Florian Lukavsky, the director of SEC Consult Singapore.
“If they had an open security program where researchers could submit vulnerabilities, then they would have to admit that their system had vulnerabilities in the first place,” said Lukavsky over Skype.
When vendors do issue fixes, it often takes 12 to 18 months for their banking customers to upgrade vulnerable systems, he said.
By reporting a bug that could be exploited, researchers sometimes find themselves treated like criminals.
“People hear a researcher saying: ‘I found a vulnerability. I want to tell you and let you know,’ ” Ellis said. “Oftentimes the reaction from somewhere within the organization — especially if they get the lawyers involved — is ‘We’re being attacked.’ ”
The episode can send an entire company into DEFCON 1.
“It’s going to get bounced around in the company, because no one wants to own it,” said Rob Graham, CEO of Errata Security.
“They haven’t prepared for it, so there is no way for them to respond responsibly.”
Those pressures often leave researchers wondering whether they’re doing something wrong, said Andrew Crocker, a staff attorney on the Electronic Frontier Foundation’s civil liberties team. They’re not, he said. They have a First Amendment right to disclose bugs, according to Crocker. What’s less clear is whether they have legal protection to seek them out.
Doing so without violating laws is a fine line, said Jason Malo, an executive adviser at CEB Towergroup who specializes in cybersecurity.
“For example, as soon as you collect personal information, you’ve broken the currently established laws,” he said.
The specter of legal pressure sometimes shuts down such disclosures even when they’re seemingly in the best interest of a company and its customers.
“Very often, researchers are not sure what behavior does or does not raise the question of liability,” said Crocker. “Often these laws are interpreted in ways that make things more murky and less clear.
“All these things can chill security research when most people would think that the endeavor of security research, finding vulnerabilities when they exist, is to the benefit of society.”
The threat of lawsuits can make it too much trouble for researchers to properly disclose a security gaffe to a company not ready to hear such a warning, Graham said.
“We’ve reached a point in vulnerabilities that if they don’t have a bug bounty, or at least a disclosure policy published, it’s almost certainly going to go bad for you,” said Graham.
Mixed reaction
In Douba’s case, Temenos executives didn’t seem sure how to react.
Now, they insist they are working closely with Douba. They made a point of saying that they’re grateful. Oddly, that didn’t stop those same people from bashing him.
Reacting specifically to Douba’s plan to present his findings this month at the BlackHat Europe conference in Amsterdam, Temenos executives said Douba was sensationalizing the effects of the problems he found.
Douba quietly canceled his talk for personal reasons.
Martin Bailey, Temenos product director for enterprise technology, went one step further.
When told that Douba said he gained legitimate access from another Temenos customer, Bailey implied that Douba was acting inappropriately by not reporting the issue through the bank with which he had previously had a contract.
“How do we know that?” he snapped.
Bailey did acknowledge that the issues raised were valid. “This was quite a serious security vulnerability, despite all the processes that we have in place,” he said.
When asked about the steps that Douba took to report the issue, he said he couldn’t speak to the time frame in which the company was contacted, nor could he answer further questions about security procedures or the size and scope of its security team.
“We’re not pleased that it was found,” Bailey said. “And we’re not pleased that it existed in our products.”
Temenos declined to provide more details about its security process, because such information is too sensitive, a spokesman said.
Frampton, in the e-mail, said the company constantly evaluates ways to refine its security practices. A part of that means it is willing to at least entertain “accredited and ethical” external security researchers on “perceived” security threats, he said.
Douba praised Temenos, but said the reporting process had been an ordeal.
Once the company acknowledged that his research was legitimate and began discussing the issue, Douba said, it was quick to act.
“It wasn’t that they were slow to patch,” Douba said. “It was the fact that they were slow to respond.”
By the end of September, Temenos issued a fix for the bug.