AI gaydar a nihilistic invasion of privacy

Photograph-analysing algorithm likely to be used as tool for intolerance of LGBTQ people

The Stanford robot was able to correctly classify gay and straight men 81 per cent of the time, and gay and straight women 71 per cent of the time when provided with just one image of the subject.

A new study from Stanford University used artificial intelligence (AI) technology to guess people's sexual orientation by analysing their headshots. The machine turned out to be worryingly accurate.

Researchers have shown how a novel machine-learning algorithm needs nothing more than a few photos of a person’s face to classify them as gay or straight.

Using a sample of more than 35,000 facial images taken from an unnamed online dating website, the robot developed at Stanford was able to correctly classify gay and straight men 81 per cent of the time, and gay and straight women 71 per cent of the time, when provided with just one image of the subject. The success rate increased when more than one image was provided. The researchers used people’s stated preferences on the website as evidence of whether they were in fact gay or straight.
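For illustration only, here is a minimal sketch of the kind of pipeline described above: a binary classifier trained on per-image feature vectors and scored on held-out images. Everything in it is a hypothetical stand-in – the random vectors replace the facial features a face-analysis model would extract from real photos, and the random labels replace users’ stated preferences – so it shows the shape of the method, not the study’s actual code.

```python
# A minimal, hypothetical sketch of the study's general approach:
# train a binary classifier on per-image feature vectors, then measure
# how well it separates the two classes on unseen images.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in data: 1,000 "photos", each reduced to a 128-dimensional
# feature vector, with a binary label standing in for the subject's
# stated preference. Real facial features would go here instead.
X = rng.normal(size=(1000, 128))
y = rng.integers(0, 2, size=1000)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Headline figures such as "81 per cent" are scores of this kind: how
# often the model correctly separates unseen single images. With random
# stand-in data the score hovers around chance (0.5).
print("AUC:", roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1]))
```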

According to the Stanford researchers, gay men and women shared common traits such as “gender-atypical facial morphology, expression, and grooming styles.” Gay men tended to have narrower jaws, longer noses and larger foreheads than heterosexual males. Gay women were found to have larger jaws and smaller foreheads when compared with straight women.

Funding

While the methodology used in this study, and the conclusions drawn from it, are worthy of deeper analysis, the question I’d like to ask is why anyone thought it was a good idea to fund this research in the first place (a study which, by the way, didn’t include any people of colour, or any bisexual or transgender people, in its sample).

Online privacy invasion is a modern-day malaise affecting everyone, but it can be especially thorny for LGBTQ people, particularly those who feel compelled to conceal their true selves (perhaps because of disapproving family members, or the fear that it might negatively affect career choices).

Short of staying indoors alone forever, it's virtually impossible to guarantee that images of you, or references to you, won't appear somewhere online without your permission, regardless of the recent introduction of GDPR data privacy rules. You might be careful not to take selfies when out at the gay bars, but that doesn't mean everyone else isn't taking and sharing snaps that could be seen on Instagram or Facebook by the "wrong" person.

But to be “outed” in this way still requires an individual to actively put herself in a potentially exposed situation. That doesn’t make it any better, but at the very least she knew the risk involved. The news that a machine has been designed with the specific purpose of categorising people based on their sexual orientation, without their consent, serves no positive purpose, in my mind at least.

I’m too long in the tooth to give a stuff what anyone thinks of me now, but I remember vividly what it felt like when I did care about others’ opinions.

From hardline anti-LGBTQ authoritarian regimes to parents wishing to determine their children’s sexual orientation, such a technology would more than likely be used as a tool for intolerance before anything else.

So why develop it in the first place? In the authors' notes included with the research paper – published in the most recent issue of the Journal of Personality and Social Psychology – there is a section in which the group conducting the study admit they "were really disturbed by these results and spent much time considering whether they should be made public at all". "We did not want to enable the very risks that we are warning against," they wrote.

“We did not create a privacy-invading tool, but rather showed that basic and widely used methods pose serious privacy threats.”

Yet inventing something uniquely harmful, only to turn around and warn people about risks that didn't exist until you created them, is akin to the story of Italian car manufacturer Fiat in its heyday. Back in the 1950s Fiat made more than just cars: at one point it was simultaneously the world's largest manufacturer of landmines and prosthetic limbs.

The two authors, Yilun Wang and Michal Kosinski, are hardline pessimists when it comes to internet privacy, which they believe is already "with O'Leary in the grave".

To quote the nihilists: “Essentially, we believe that further erosion of privacy is inevitable, and the safety of gay and other minorities hinges not on the right to privacy but on the enforcement of human rights, and tolerance of societies and governments.”

Devoid of strong feelings on the matter themselves, nihilists have a tendency to overlook the impact certain words and deeds can have on marginalised groups.

I learned this the hard way a few years ago. As a gay man who grew up in a time when “gay” was an alternative adjective for weak or crap, I too began to associate the word more with something lame than with sexual orientation.

Etymology

Several years later I started writing an article essentially arguing that many words change meaning over the years, and that maybe it was time we all just accepted “gay” to mean “bad” rather than homosexual. I spoke to linguists, etymologists and various other academic scholars who, for the most part, agreed with my thesis, or at least could get behind the idea that the meanings of words evolve all the time.

With one last interview to conduct before writing up the piece, I remember feeling delighted with myself and my clever little left-field argument. Confident I already had the article written in my mind, I got on the phone to Michael Nanci Barron, founding director of the LGBTQ youth support service BeLonG To – more recently known for his central role in the marriage equality referendum – and a soft-spoken hero for many LGBTQ Irish people.

I made my intellectual case for letting “gay” evolve to become an acceptable adjective used to describe something unfavourable. Michael asked me how old I was (I was 30 at the time), whether I was in a stable relationship with a man (I was) and whether I considered myself a self-confident person (I did). Then he suggested I think back to when I was 15 years old, when everything considered lame or crap was described as “gay”.

He got me. Just because the word didn’t offend me now didn’t mean it didn’t offend others, particularly younger gay people. The article never saw the light of day.

But that was merely one man’s opinion shelved. The AI technique in question is far more significant. So if we are to allow the development of potentially unhelpful technologies just to prove they can be done, funding should come with a stipulation that researchers also develop ways to override their innovation’s central function should the technology turn out to be gay – I mean bad.