Google should not be in business of war, say employees

APD NEWS

Thousands of Google employees have signed an open letter asking the internet giant to stop working on a project for the US military.

Project Maven involves using artificial intelligence to improve the precision of military drone strikes.

Employees fear Google's involvement will "irreparably damage" its brand.

"We believe that Google should not be in the business of war," says the letter, which is addressed to Google chief executive Sundar Pichai.

"Therefore we ask that Project Maven be cancelled, and that Google draft, publicise and enforce a clear policy stating that neither Google nor its contractors will ever build warfare technology."

No military projects

The letter, which was signed by 3,100 employees - including "dozens of senior engineers", according to the New York Times - says that staff have already raised concerns with senior management internally. Google has more than 88,000 employees worldwide.

In response to concerns raised, the head of Google's cloud business, Diane Greene, assured employees that the technology would not be used to launch weapons, nor would it be used to operate or fly drones.

However, the employees who signed the letter feel that the internet giant is putting users' trust at risk, as well as ignoring its "moral and ethical responsibility".

"We cannot outsource the moral responsibility of our technologies to third parties," the letter says.

"Google's stated values make this clear: every one of our users is trusting us. Never jeopardise that. Ever.

"Building this technology to assist the US government in military surveillance - and potentially lethal outcomes - is not acceptable."

'Non-offensive purposes'

Google confirmed that it was allowing the Pentagon to use some of its image recognition technologies as part of a military project, following an investigative report by tech news site Gizmodo in March.

A Google spokesperson told the BBC: "Maven is a well-publicised Department of Defense project and Google is working on one part of it - specifically scoped to be for non-offensive purposes and using open-source object recognition software available to any Google Cloud customer.

"The models are based on unclassified data only. The technology is used to flag images for human review and is intended to save lives and save people from having to do highly tedious work.

"Any military use of machine learning naturally raises valid concerns. We're actively engaged across the company in a comprehensive discussion of this important topic and also with outside experts, as we continue to develop our policies around the development and use of our machine learning technologies."

The internet giant says it is developing policies around the use of its artificial intelligence technologies.

(BBC)