With great power comes great responsibility, and the power accrued by some data companies sometimes leads them to the dark side when openness, accountability and humanity go missing from their worldview, says Alistair Gentry
I'm reasonably confident this is the first time there's been a reference to a Cyndi Lauper song on the Open Data Institute's blog [ed: you’re bang on], but in my work as its artist in residence I've been looking into the ways in which approaches to handling data can come across as either constructive or creepy.
Unfortunately, data handlers – occasionally entire governments, hello China – really do seem to err on the dark side. The data analytics company Palantir Technologies, co-founded and currently chaired by arch-libertarian Peter Thiel, is geekily named after the magical seeing stones from The Lord of the Rings trilogy.
In the throes of nerdgasm, they failed to notice – or perhaps simply don't mind – that communication via palantír is not a neutral medium; the palantír that features most prominently in the books is a tool of the evil Sauron, who corrupts and deceives the wizard Saruman every time he looks into it. Like most well-written antagonists, he's the hero of his own story, but clearly not of the real story. I mean Saruman – who did you think I was talking about?
Two of Palantir's projects are titled Gotham (relational data analysis that can be used for counter-terrorism and fraud detection) and Metropolis (quantitative analysis of financial data, for the likes of financial services firms) – these also being the abodes of Batman and Superman respectively. Or, from another perspective, of supervillains like The Joker and Lex Luthor, who could just as easily be the inspiration given Palantir's form. For yet more final-reel villain-reveal vibes, Palantir's sole initial investor was In-Q-Tel, the CIA's venture capital vehicle.
Elsewhere, Google's founding motto 'Don't be evil' seems increasingly distant from its extremely gymnastic efforts to wriggle out of tax responsibilities. Somewhere between Sergey Brin and Larry Page sitting in a room with a laptop, and Google becoming one of the world's pre-eminent entities, 'you can be amoral though' was apparently added as a silent amendment.
As for Elon Musk with his space missions, disruptive-as-in-disruptive-child approach to technology, flamethrowers, impractical boy-rescuing submarines and egomaniacal monologues, he seems to be recapitulating the plots of Moonraker, Live and Let Die, and A View to a Kill, among others, all on top of each other. Again, he may be 007 in the story in his head but he probably seems more like Blofeld or Goldfinger to his employees.
Of the three examples mentioned above, Google is probably the most informative for thinking about why data handlers can go so wrong, or be perceived as going wrong by so many of their users. This is not to single them out particularly, because these thoughts could easily apply to many companies and civic entities that hold data. Let's leave Thiel and Musk aside, since they actually seem to revel in being bad boys (emphasis on 'boy'), much to the well-publicised chagrin of their staff and board members.
In the early days Google was small enough to balance on the tightrope between mining data for private profit and retaining the trust of its users. To not 'be evil', in short. The relationship between search engines or social media platforms and their users was always asymmetric because they were never entirely straight about what they were giving in return for what they got. There are probably few people online who haven’t clicked the 'OK' button on lengthy and complicated terms they haven’t read.
Achieving some clarity – and, frankly, honesty from some of those responsible for managing data – is an ongoing, long-term process, as the recent introduction of the EU and EEA's General Data Protection Regulation (GDPR) shows. From such an unbalanced start it's not hugely surprising that, on the way to becoming global entities, companies like Google or Facebook would see their moral compass swing towards data hoarding and blatant untrustworthiness – precisely because so many people use them every day whether they’re evil or not, there's little commercial pressure on them to stay honest.
We're almost talking about the difference between lawful evil and chaotic evil, if you're old enough and geeky enough to remember Dungeons & Dragons. There's a huge body of work to be done, by the way, on the influence of role-playing games on the tech industry via nerd culture.
Not being evil isn't merely a semantic nicety or a cute tagline. In a world where data is another layer of vital infrastructure, maintaining trust and being worthy of that trust is at the very least a sound business proposition. More so than, say, raving about child molesters when your crackpot mini submarine concept is rejected.
At best, mutuality and an open, democratic approach to the data that is vital to us all empower rather than disempower, inform rather than obfuscate. We need more data superheroes and fewer – preferably no – data villains. Perhaps it would help some people in policy and tech to think about whether Lex Luthor would deliver a gloating monologue to a kryptonite-disabled Superman about their new concept for gathering all the data of everyone everywhere, or whether their genius strategy has already been bluntly refuted by a large explosion in a James Bond film.
Perhaps they could even dig deep into the nerdy fandom that many of them apparently partake of, and contemplate Jeff Goldblum's famous line as Dr Ian Malcolm in Jurassic Park: “Your scientists were so preoccupied with whether or not they could that they didn't stop to think if they should.”
Alistair Gentry is embedded Research Artist in Residence at the Open Data Institute, as part of its Data as Culture programme
Image credit: CC BY-SA 2.0 by Joost J. Bakker IJmuiden