A computer program can tell if someone is gay or not with a high level of accuracy by looking at a photograph, a study claims.
The artificial intelligence system can infer someone’s sexuality with up to 91 per cent accuracy by scanning a photograph of a man or woman.
But critics say that the software could be used to ‘out’ men and women currently in the closet.
In the wrong hands, the software could in theory be used to pick out people who would rather keep their sexuality private.
Google search results show that the term ‘is my husband gay?’ is more common than ‘is my husband having an affair’ or ‘is my husband depressed’.
Computers can detect subtle differences in facial structure that the human eye struggles to pick up, the authors claim in the Economist.
To ‘train’ their computer, the researchers downloaded 130,741 different images of 36,630 individual men’s faces, and 170,360 images of 38,593 women from a US dating website.
The users had all declared their sexuality on their profiles.
After removing images that were not clear enough, they were left with 35,326 pictures of 14,776 people, split evenly between gay and straight, male and female.
Digitally scanning the contours of the face, cheekbones, nose and chin, the computer took a host of measurements of the ratios between different facial features.
It then logged which ones were more likely to appear in gay people than straight people.
Once the patterns associated with homosexuality had been learnt, the system was shown faces it had not seen before.
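The researchers’ own code is not reproduced here, but the general approach the article describes – turning facial measurements into ratios and learning which patterns go with each label – can be sketched in a few lines of Python. Everything below (the landmark layout, the particular ratios, the use of scikit-learn’s logistic regression and the random placeholder data) is an illustrative assumption, not the study’s actual pipeline.

```python
# Illustrative sketch only -- not the researchers' actual pipeline.
# Assumes facial landmark coordinates have already been extracted from each
# photo (e.g. with an off-the-shelf landmark detector); here they are random
# placeholders so the example runs end to end.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_photos = 1000
landmarks = rng.normal(size=(n_photos, 8, 2))   # 8 hypothetical (x, y) points per photo
labels = rng.integers(0, 2, size=n_photos)      # 1 = self-declared gay, 0 = straight

def ratio_features(lm: np.ndarray) -> np.ndarray:
    """Convert raw landmark coordinates into scale-free ratios between
    facial measurements, roughly in the spirit the article describes."""
    face_width  = np.linalg.norm(lm[0] - lm[1])   # cheekbone to cheekbone
    face_height = np.linalg.norm(lm[2] - lm[3])   # brow to chin
    nose_length = np.linalg.norm(lm[4] - lm[5])
    jaw_width   = np.linalg.norm(lm[6] - lm[7])
    return np.array([face_width / face_height,
                     nose_length / face_height,
                     jaw_width / face_width])

# One row of ratio features per photo, then a simple classifier on the labels.
X = np.vstack([ratio_features(lm) for lm in landmarks])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.2, stratify=labels, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```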
The system was tested by showing it pictures of two men, one gay and one straight.
When shown five photos of each man, it correctly identified which one was gay 91 per cent of the time.
The model performed worse with women, telling gay and straight apart with 71 per cent accuracy after looking at one photo, and 83 per cent accuracy after five.
In both cases the level of performance far outstripped human ability to make this distinction.
Using the same images, human viewers could tell gay from straight 61 per cent of the time for men, and 54 per cent of the time for women.
These human figures align with earlier research suggesting people can determine sexuality from faces at rates only just better than chance.
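Read literally, the test above is a pairwise comparison: the model scores each man, averaging over his photos, and is counted correct when the gay man of the pair receives the higher score. A minimal sketch of that evaluation, assuming a fitted scikit-learn-style classifier such as the hypothetical one above, might look like this:

```python
# Pairwise evaluation sketch. Assumes `clf` is a fitted classifier with a
# predict_proba method (e.g. the hypothetical logistic regression above) and
# that each person's photos have already been turned into feature rows.
import numpy as np

def person_score(photo_feature_rows, clf):
    """Average P(gay) over all of one person's photos (one or several)."""
    return clf.predict_proba(np.vstack(photo_feature_rows))[:, 1].mean()

def pairwise_accuracy(gay_people, straight_people, clf):
    """gay_people / straight_people: lists where each element holds the feature
    rows for one man's photos. Returns the fraction of (gay, straight) pairs in
    which the gay man receives the higher average score."""
    pairs = list(zip(gay_people, straight_people))
    correct = sum(person_score(g, clf) > person_score(s, clf) for g, s in pairs)
    return correct / len(pairs)
```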
Researchers Michal Kosinski and Yilun Wang at Stanford University are soon to publish their findings in the Journal of Personality and Social Psychology.
Offering an explanation of how the software works, they say that in the womb, hormones such as testosterone affect the developing bone structure of the foetus.
They suggest that these same hormones have a role in determining sexuality – and the machine is able to pick these signs out.
In findings that may provoke concern – particularly among men and women whose sexuality has been kept secret from friends and family – the machine could also accurately guess the sexuality of people who had not declared it on a dating website.
To test this, the system was shown pictures of 1,000 men at random – with at least five photographs of each man.
The proportion of gay men in the sample was approximately seven in every 100, which the researchers say mirrors the level of homosexuality in the general population.
When asked to pick out the 100 faces it thought most likely to belong to gay men, the machine was much less accurate: only 47 of those it chose actually were gay – meaning it judged some straight men’s faces to be ‘gayer’ than those of some men who actually are gay.
But when asked to pick out the ten faces it considered most likely to be gay, nine out of ten belonged to men who were actually homosexual.
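This last exercise amounts to a precision-at-top-K check on a sample with a realistic base rate: rank 1,000 men by the model’s score and count how many of the top 100, or top 10, really are gay. The sketch below uses synthetic scores purely to show the shape of the calculation; the numbers it prints are not the study’s.

```python
import numpy as np

def precision_at_k(scores: np.ndarray, is_gay: np.ndarray, k: int) -> float:
    """Rank men by predicted probability of being gay and return the fraction
    of the top-k who actually are. With roughly 7 gay men per 100 in a sample
    of 1,000, the article reports about 47/100 at k=100 and 9/10 at k=10."""
    top_k = np.argsort(scores)[::-1][:k]
    return float(is_gay[top_k].mean())

# Synthetic example, just to show the shape of the calculation.
rng = np.random.default_rng(0)
is_gay = rng.random(1000) < 0.07            # ~7 in every 100, as in the article
scores = is_gay * 0.3 + rng.random(1000)    # hypothetical model scores
print(precision_at_k(scores, is_gay, k=100))
print(precision_at_k(scores, is_gay, k=10))
```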
Dr Kosinski has been involved in controversial research. He invented an app that could use information in a person’s Facebook profile to model their personality.
This information was used by the Donald Trump election campaign team to select voters it thought would be receptive to targeting.
In a leading article, the Economist warns: ‘In countries where homosexuality is a crime, software which promises to infer sexuality from a face is an alarming prospect.’