Google's contactless mobile payment application, Google Wallet, has long been considered secure by experts because it handles cardholder credentials and account information in a hard-to-break secure hardware element. But the fledgling app has failed a security test conducted by viaForensics, primarily because it stores too much of the consumer's personal data on the phone. While the app does not store the customer's entire credit card number, it does store the user's name, credit card balance, credit limits, expiration date, and transaction dates and locations on the phone itself, in the application's databases directory. The last four digits of the user's card number and the user's email address are also recoverable from the phone.
Google's response to this test points out that this sensitive information can only be retrieved from a rooted phone, in other words, one whose operating system has been broken into so that system files can be accessed. "The viaForensics study does not refute the effectiveness of the multiple layers of security built into the Android operating system and Google Wallet," says spokesperson Nathan Tyler. "This report focuses on data accessed on a rooted phone, but even in this case, the secure element still protects the payment instruments, including the credit card and card verification value numbers. Android actively protects against malicious programs that attempt to gain root access without users' knowledge."
However, there have been instances of malware, such as DroidDream, that have let attackers break through Android security and gain root access to the phone.
Once such a break-in occurs, the customer information stored on the phone would be sufficient to launch a social engineering attack, according to Andrew Hoog, chief investigative officer at viaForensics. "You could send someone a message containing information about their transactions and balance and say you need to confirm their card number," Hoog explains. "The fact that the sender knew you had conducted a transaction that afternoon would convince most people that it was legitimate."
Having this information available on the consumer's device does provide convenience, Hoog acknowledges. For instance, once the consumer chooses a credit card to use in the Google Wallet, the app displays the card balance and next payment due. "As a consumer, when that popped up, I thought, that's great, because I can never remember what my balance is and when the payment is due and here it is," Hoog says. "I really liked that feature. The problem is they shouldn't store it unencrypted." Google, he argues, should either encrypt the information or not store it on the device at all.
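Hoog's recommended fix can be sketched in a few lines. The following is a minimal illustration, not Google's actual implementation: the cardholder record is hypothetical, and it uses the JDK's standard AES-GCM cipher. In a real app, the key would live in hardware-backed storage (the secure element or a platform keystore), never on disk alongside the data.

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;

// Sketch: encrypt a cardholder record with AES-GCM before writing it to the
// app's local database, rather than storing it as plaintext.
public class WalletRecordCrypto {
    static byte[] encrypt(SecretKey key, byte[] iv, String plaintext) throws Exception {
        Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
        c.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        return c.doFinal(plaintext.getBytes(StandardCharsets.UTF_8));
    }

    static String decrypt(SecretKey key, byte[] iv, byte[] ciphertext) throws Exception {
        Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
        c.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(128, iv));
        return new String(c.doFinal(ciphertext), StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws Exception {
        // In a real app this key would be held in hardware-backed storage,
        // not generated and kept in the same process as the data.
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);
        SecretKey key = kg.generateKey();

        byte[] iv = new byte[12];                  // fresh nonce per record
        new SecureRandom().nextBytes(iv);

        // Hypothetical record mirroring the fields the report found on-device.
        String record = "name=J. Doe;last4=1234;balance=542.10;due=2011-12-28";

        byte[] stored = encrypt(key, iv, record);  // what gets written to disk
        System.out.println(decrypt(key, iv, stored).equals(record)); // true
    }
}
```

The design point is that a rooted phone would then expose only ciphertext; without the key held in protected hardware, the recovered database contents are useless to an attacker.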
A further security issue is that activity tracked by Google Analytics is written to the phone's log, which again could give a cybercriminal insight into the customer's purchasing and account behavior.
Google's is not the only mobile payment app to fail viaForensics' tests — Square and others also have. But although the Square app stores less personal information than Google's does, Google Wallet is more secure than Square, Hoog says. "Square has some pretty big issues that we don't look at in the appWatchdog [the company's security testing service]," he says. appWatchdog only examines which information is securely stored and transmitted. "Square has unencrypted readers and that's a really big deal. Contrast that with what Google Wallet did, which was they invested in near-field communication and a secure element, they put a lot of engineering into controlling access to that data. Square has been going out and capturing market share, so they built cheap, unencrypted credit card readers that they could send out to the masses."