In the closing line of his Observer Networker column of Sunday 23 March 2014, "Eisenhower's military-industrial warning rings truer than ever", John Naughton (@jjn1) concluded: "we're witnessing the evolution of a military-information complex". John also tweeted the story:
The military-information complex, updated. Now it's the military-industrial-information complex http://t.co/VbYB56STeU
— John Naughton (@jjn1) March 26, 2014
Google’s corporate Code of Conduct may begin with "Don’t be evil", but I think a precautionary principle of considering the potential for evil should also be applied when trying to think through the possible implications of what Google, and other information companies of the same ilk, could do…
This isn’t about paranoid, tin-foil-hat-wearing fantasy – it’s about thinking through how cool it would be to try stuff out… and then oopsing. Or someone realising that whatever-it-is can make shed loads of money, and surely it can’t hurt. Or a government forcing the company to do whatever. Or another company with an evilness-agnostic motto (such as “maximise short term profits”) buying the company.
“Just” and “all you have to do” are phrases that often unpack badly in the tech world (“just” can be really hard, with multiple dependencies; “all you have to do” might mean you have to do pretty much everything). On the other hand, “sure, we can do that” can cover things that are flick-of-a-switch possible, but tend not to be done for policy reasons.
Geeks are optimists – “just” can take hours, days, weeks… “Sure” can be dangerous. “Erm, well we can, but…” can mean game over when “don’t be evil” becomes a realised potential for evil.
What if Google, who sell advertising to influence you, or Amazon, who try to directly influence you to buy stuff, had been running #psyops experiments testing their ability to manipulate your emotions? [They do, of course…] Like advertising company Facebook did – Experimental evidence of massive-scale emotional contagion through social networks:
The experiment manipulated the extent to which people (N = 689,003) were exposed to emotional expressions in their News Feed. This tested whether exposure to emotions led people to change their own posting behaviors, in particular whether exposure to emotional content led people to post content that was consistent with the exposure—thereby testing whether exposure to verbal affective expressions leads to similar verbal expressions, a form of emotional contagion. People who viewed Facebook in English were qualified for selection into the experiment.
[The experiment] was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research. [my emphasis]
(For a defense of the experiment, see: In Defense of Facebook by Tal Yarkoni; I think one reason the chatterati are upset is because they remember undergrad psychology experiments and mention of ethics committees…)
Experian collect shed loads of personal data – including personal data they have privileged access to by virtue of being a credit reference agency – about you, me, all of us. Then we’re classed, demographically. And this information is sold, presumably, to whoever’s buying: (local) government as well as commerce. Do we twitch if Google buys them? Do we care if GCHQ buys data from them?
What about Dunnhumby, processors of supermarket loyalty-card data, among other things? Do we twitch if Amazon buys them?
Tesco Bank has just started to offer a current account. Lots more lovely transaction data there :-)
I don’t know how to think through any of this. If raised in conversation, it always comes across as paranoid fantasy. But if you had access to the data, and all those toys? Well, you’d be tempted, wouldn’t you? To see if the “just” really is a “just”, or whether it’s actually a “can’t”, or whether indeed it’s a straightforward “sure”…