By Ken Munro, partner at Pen Test Partners
Ever had a weird situation where you’ve been talking about something, and shortly afterwards an advert relating to what you just said has popped up on your phone or in your web browser? There’s now a growing bank of anecdotal evidence to suggest that mobile phones can be used to listen in on the user and their conversations for the purpose of targeting advertising.
But how realistic is this and does it constitute an abuse of the app-user relationship? We built a proof-of-concept app to demonstrate the ability to access the audio on mobile phones and potentially use this for targeted advertising.
To target adverts at users the app developer needs access to the audio microphone on the device. That may sound intrusive but it’s already common practice in many mobile apps.
Developers now try to cast the permissions net as widely as possible to capture all eventualities in a bid to futureproof apps. For instance, when Facebook launched its Messenger app, it requested no fewer than 34 permissions from the user, including the right to adjust audio settings, record audio and email guests without the owner’s knowledge; many of these are not essential for the app to function.
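On Android, for instance, microphone access starts with a single manifest entry. The fragment below is a minimal illustration of the kind of declaration involved, not taken from any particular app:

```xml
<!-- Declaring microphone access in an Android app's AndroidManifest.xml.
     On Android 6.0 and later the user must also approve this permission
     at runtime, but many users tap through the prompt without reading it. -->
<uses-permission android:name="android.permission.RECORD_AUDIO" />
```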
It’s this ‘permission creep’ that has paved the way for ‘snooping apps’ and it’s a slippery slope for the industry; users rarely review the permissions required by an app at installation, and they’ll accept pretty much anything just so long as they can have the app.
So, on the one hand, developers need to provide apps that can exploit future revenue opportunities, but on the other they need to protect themselves from being associated with an abuse of mobile functionality, one that ultimately invades the user’s privacy.
If an app were proven to be eavesdropping on its users, the repercussions for responsible providers would be profound. It’s why the likes of Google and Facebook have moved swiftly to deny these allegations. Nonetheless, there’s nothing to say less scrupulous providers won’t be tempted to monetise this access.
We were interested to see if it was technically possible to design an app capable of capturing audio in real time and identifying key words; these would be gold dust for an advertiser, giving them a readily identified, receptive target market.
At first, we suspected that battery use would be high when constantly listening to the microphone and uploading the audio to a voice-to-text service. This turned out not to be the case. Admittedly, the phone’s media stream had to be muted to stop it making sounds whilst recording, and the keyword search proved tricky, with some words misinterpreted by the conversion process. But there was no doubt the app was able to capture and convert speech in real time from users in close proximity to the device; they didn’t even need to be using it.
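Once a speech-to-text service has returned a transcript, the keyword step is little more than string matching against an advertiser’s trigger lists. The sketch below illustrates the idea; the category names and keyword sets are invented for this example and are not taken from the proof-of-concept app:

```python
import re

# Invented mapping of advertising categories to trigger words,
# purely for illustration of the matching step.
AD_KEYWORDS = {
    "travel": {"holiday", "flight", "hotel"},
    "finance": {"tax", "mortgage", "loan"},
}

def spot_keywords(transcript: str) -> set:
    """Return the ad categories triggered by words in a transcript.

    Tokenises the transcript into lowercase words, then checks each
    category's trigger set for any overlap.
    """
    words = set(re.findall(r"[a-z']+", transcript.lower()))
    return {category for category, triggers in AD_KEYWORDS.items()
            if triggers & words}

# Example: a snippet of overheard conversation triggers two categories.
print(spot_keywords("I must sort my tax return before our flight"))
```

In practice the hard part is upstream, as noted above: misrecognised words from the conversion process mean triggers can be missed or spuriously matched, so a real system would likely score confidence rather than match exact tokens.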
Of course, there’s no real evidence as yet that apps are performing this level of capture – the next step would be figuring out a way to review large numbers of apps in the stores to see if any are actually taking your voice data – but many have undoubtedly secured the rather dubious privilege of doing so by virtue of blanket permissions.
The question is whether the mobile industry needs to take a stand now and reassure some rather rattled users, from the woman who discussed tax with her sister, then received targeted adverts from tax experts over social media, to the man who received targeted ads for sanitary towels after discussing menstruation with his wife in the car.
The app development community has a real opportunity here. Responsible providers should publicise the fact that they do not request blanket permissions, perhaps offer checkbox opt-outs, and be transparent about how data is collected and used. Security testing and proof of data protection are all feathers in the cap that an app provider should be proud of.
Yes, apps need to protect future revenue streams, but they also need to protect their relationship with the customer; if that comes into disrepute, the app and the brand could suffer potentially catastrophic reputational damage. After all, we all know nothing good ever came from eavesdropping.
Pen Test Partners is an ethical hacking firm.