Author: Alistair Gentry
Is there such a thing as a fair, neutral interface? ODI Research Artist in Residence Alistair Gentry considers the subjectivity of fairness in technology and services, and what that means for their design
Another month as the Open Data Institute's embedded Research Artist in Residence, another trip down a rabbit hole of things that seem simple but aren't...
I still can't give too much away, but for now it suffices to say that I've been looking into whether there's such a thing as a fair, neutral interface – one that doesn't lead its users or betray its maker's ideology.
I started with the science of our natural interface with other people: our faces.
Research at Pennsylvania University (among others) has shown that “We attribute personality, inner-thoughts, and beliefs to others based simply on facial appearance in what appears to be an effortless and nonreflective manner. Yet, these impressions are often nondiagnostic of people’s actual traits and states such as their tendency for sociability, responsibility, caring, happiness, and aggression... most faces cannot aptly be described as emotionally ‘neutral’ even when completely devoid of overt muscle movement related to expression.”
In other words, despite how confidently we all do it, we're frequently wrong when we judge by first appearances, because we project so much even onto neutrality. Furthermore (patriarchal surprise!), “neutral female faces were rated as more submissive, affiliative, naïve, honest, cooperative, babyish, fearful, happy, and less angry than neutral male faces.” (Ibid.)
Things that are not under anyone's conscious control – like the bone structure of their face, or how far apart their eyes are, or the distance between their brow and mouth – “influence perceptions of anger and sadness even for basic schematic, or ‘dot-and-line,’ faces”, according to the Association for Psychological Science. The dot-and-line emoji – officially named ‘neutral face’ or ‘straight faced’ – is “intended to depict a neutral sentiment but often used to convey mild irritation and concern or a deadpan sense of humor.”
I definitely use that emoji in the latter ways, because I rarely have cause to convey or claim that I'm feeling absolutely no emotion without irony or sarcasm. There have also been many occasions when people have asked me what terrible thing has happened, based on my “neutral” expression. Usually there's no terrible thing; that's just what my face looks like when there's no particular emotion being enacted on it. Call it resting catastrophe face.
As is often the case, this is something that artists have been onto for decades, if not centuries.
For example, actors trained in the traditions of French theatre maker Jacques Copeau and, later and more famously, Jacques Lecoq were instructed to start their work with carefully designed neutral masks, but they had to do so in the knowledge that, according to Lecoq, neutrality is “a fulcrum point that does not exist.” The point is to try, not to get there, because actually getting there is impossible. In fact, the closer you get, the more your quirks and biases are exposed. Not only in Western theatre and performance but throughout all times and cultures, masks – especially masks of neutrality – reveal rather than hide. And for ‘masks’ you can also read: user interfaces, content policies, etc.
A panel at the November 2018 ODI Summit on designing for fairness touched on the idea of neutral data and the uneven impact of data, and data management regimes, on different individuals and different sectors of society. This is also one of the key areas of research and production for me as Artist in Residence at the ODI. While data can be used in countless productive ways for good, it can also be hoarded for private benefit, or used inadvertently or deliberately to do harm.
Another fundamental message of what I'm working on came through from the panel: humans make algorithms, choosing what data is valuable or worthless, and thereby influencing whether the way that data is used has a good or a bad effect on people's lives. “The algorithm” has emerged as one of those metonymies that we forget is a figure of speech rather than an actual entity with desires or an agenda, like “the market” or even the one I used above, “society.” The market, society and the algorithm result from the individual and collective decisions of particular people, and one can usually trace exactly who, with varying degrees of effort. If the data's truly open, doing this kind of audit shouldn't be difficult at all.
The panel's Martin Tisné, Managing Director at Luminate, also made the very good point that “fairness is a question of perspective.” Who does and doesn't get to decide what's fair, just, neutral or equitable matters a lot, and definitions differ greatly depending on the politics, ethics and circumstances of whoever is doing the defining.
As the worldwide resurgence of traditionalism and the far right shows, women or minorities adjusting towards some kind of equality – if not neutrality – of discourse by claiming their right to be seen and heard on their own terms is enough to trigger fierce resistance and an escalation of abuse, amid claims by some that they are being favoured rather than levelling the playing field. Lurking crypto-metonymy makes it easier to disregard the “I”'s contribution to the things that “we” do. At the time of writing, both the grassroots and the very organised far right – including women – have been losing their minds over a Gillette advertisement politely suggesting (along with the implied strong hint to buy a disposable razor) that a man who sexually harasses and belittles women, or engages in or tolerates bullying and homophobia, is neither secure in his masculinity nor a suitable role model.
To the beneficiaries of structural inequality, it seems deeply unfair that their privileges might be curtailed when they also conceive of themselves as good people and don't feel that they have personally, actively ‘done’ anything wrong. Their sense of what's neutral is calibrated radically differently from that of somebody who is disadvantaged by lacking that privilege.
For the foreseeable future, there will probably be no way to offer access to, or benefit from, any resource with perfect fairness, given how much people's feelings about what's fair vary – not to mention that the closer you get to fairness, the more the details slow you down. Data ethics is an increasingly important, rigorous and methodical field of study, but asking whether our own notions of fairness and neutrality match what other people consider neutral and fair seems to be the only place to start if you want things to work out better for more people, more of the time.