Apple tries to peek at user habits without violating privacy


Apple Inc. is tapping new technology to garner insight into user behavior, in an effort to keep pace with rivals’ insights without violating its privacy pledges.

Called “differential privacy,” the technology will be included in a fall update to iOS, Apple’s operating system for iPhone and iPad. It will help the company’s engineers “spot patterns on how multiple users are using their devices,” said Craig Federighi, Apple’s senior vice president of software engineering, at the company’s developer conference earlier this week.

The technology works by adding incorrect information to the data Apple collects. This is done in such a way that Apple’s algorithms can extract useful insights while making it very difficult for anyone to link accurate data back to an individual user.
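Apple hasn’t published its implementation, but a classic textbook illustration of this noise-adding idea is the “randomized response” technique, which predates differential privacy and inspired it. The sketch below is purely illustrative and is not Apple’s method: each user sometimes reports a random answer instead of the truth, so no single report can be trusted, yet the true population rate can still be estimated from the aggregate.

```python
import random

def randomized_response(truth: bool) -> bool:
    """Report the true answer half the time; otherwise flip a fair coin.

    Any individual report is plausibly deniable, but across many users
    the aggregate still reveals the overall rate.
    """
    if random.random() < 0.5:
        return truth                   # report honestly
    return random.random() < 0.5       # report a coin flip instead

def estimate_true_rate(reports: list) -> float:
    """Recover the population rate p from the noisy reports.

    Expected observed rate = 0.5 * p + 0.25, so p = 2 * (observed - 0.25).
    """
    observed = sum(reports) / len(reports)
    return 2 * (observed - 0.25)

# Simulate 100,000 users, 30% of whom have some sensitive attribute.
random.seed(0)
reports = [randomized_response(random.random() < 0.3) for _ in range(100_000)]
print(round(estimate_true_rate(reports), 2))  # close to 0.3
```

The key property: even though every individual answer may be a lie, the statistical correction in `estimate_true_rate` recovers an accurate group-level figure, which is exactly the trade-off the article describes.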

Apple’s short-term ambitions for the technology are limited. The company will use it to keep user data anonymous while analyzing how customers are using emojis or new slang expressions on the phone, or which search queries should pop up “deep links” to apps rather than webpages. It will also improve the company’s Notes software.

In the long term, however, differential privacy could help Apple keep up with competitors such as Alphabet Inc.’s Google that collect user data more aggressively and use it to improve offerings such as image- and voice-recognition programs.

Its privacy stance has left Apple in something of a bind. The company promises not to touch its users’ data and criticizes other tech companies for collecting personal information that is used to target advertising. But that policy has hindered Apple’s ability to develop and improve services for users.

In essence, differential privacy is an effort to tap insights about what a group of users is doing without examining any individuals. Apple hasn’t released any technical details on how its differential-privacy software works.

“Using differential privacy will allow them to do significant machine learning while adhering to privacy promises that they made,” said Cynthia Dwork, a Microsoft Corp. researcher who co-wrote the first major paper on the concept a decade ago. A Microsoft spokeswoman said the company isn’t employing the technology.

Daniel Barth-Jones, a Columbia University epidemiology professor who studies technologies related to the disclosure of sensitive information, said the technology will allow Apple to analyze more data. “They’ve been kind of surpassed by other players who are bigger data collectors—Google would be a good example—in the machine-learning arena and personalization of user interface,” he said.

The technology dovetails with Apple’s business model, which is focused on selling hardware rather than advertising, said Ed Lazowska, an engineering professor at the University of Washington. He said Apple’s approach to privacy is “highly principled.” But, he added, “They can afford to be, because strong privacy protection will have far less impact on their bottom line than for many other companies.”

Apple isn’t the first company to experiment with differential privacy. Google has been using the technology for two years to analyze some data from its Chrome browser.

One reason that differential privacy hasn’t been adopted more widely is that it is difficult to get right. “It’s not quite something you can deploy in a plug-and-play fashion,” said Arvind Narayanan, an assistant professor of computer science at Princeton University. “It requires a certain level of expertise. It needs a lot of careful thinking about your data.”

He also urged Apple to talk more openly about how it is using the technology. “It is a good move that they’re doing this,” he said. “However, the public would have a lot more confidence in this if it were open and were able to be audited by outside privacy experts.”

(THE WALL STREET JOURNAL)