“The computer looks at very stupid differences”
Artificial intelligence that reads your sexuality, intelligence and political affiliation from your face: the future is here, it is said. Ronald Feldhuizen debunks this technology.
Gay or not? A computer with artificial intelligence only needs one image of a face and it knows. Its digital “gaydar” is supposedly better than a human one. Your political preference? The computer knows. Your IQ? It knows that too. Damn, the future is already here.
Except it isn’t. Take a closer look at the science behind computers that read deeply rooted human traits from a person’s face, and it soon becomes clear that we are dealing with optical illusions.
Stanford researcher Michal Kosinski regularly appears in the news with face-reading algorithms. In 2017, he published a study in which he claimed to show that a machine-learning algorithm can determine who is gay or straight from a person’s facial structure.
The idea behind Kosinski’s software makes sense in principle: you train a computer on facial images of a group of people – in this case from an American dating site – and tell it their sexual preference. Kosinski then lets the computer pass judgment on a new batch of facial images, and indeed, it scores quite well.
The problem is that the machine-learning algorithm has no clue what it is looking at. And this is where everything goes wrong, as critics convincingly point out again and again. Google AI researchers, for example, noted that in Kosinski’s photos heterosexual women clearly wear more eye shadow than lesbian women, and that the average gay man more often wears glasses. You get the picture: the computer looks at very superficial and stupid differences that happen to exist, on average, between certain groups of people on an American dating site. No deep facial features, no clever algorithm.
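The critics’ point can be made concrete with a toy simulation (all numbers hypothetical, not from Kosinski’s study): if a superficial grooming feature such as wearing glasses happens to correlate with the label in the training data, a “classifier” that looks only at that feature beats chance, even though facial structure carries no signal at all.

```python
import random

random.seed(0)

# Hypothetical synthetic "dating site" dataset: the glasses feature is
# correlated with the label by construction, while the facial-structure
# measurement is pure noise, identical for both groups.
def make_person(label):
    glasses = random.random() < (0.7 if label else 0.3)  # superficial confounder
    face_ratio = random.gauss(1.0, 0.1)                  # no signal at all
    return {"glasses": glasses, "face_ratio": face_ratio, "label": label}

data = [make_person(i % 2) for i in range(10_000)]

# A "classifier" that just predicts the label from glasses alone...
correct = sum(p["glasses"] == bool(p["label"]) for p in data)
accuracy = correct / len(data)

# ...scores around 70%, comfortably above the 50% chance level,
# without reading any deep facial trait whatsoever.
print(f"glasses-only accuracy: {accuracy:.2f}")
```

This is exactly the illusion at work: an impressive-sounding accuracy number, produced entirely by a stupid, superficial difference between the two groups.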
In short: pseudoscience. It is strongly reminiscent of skull reading, or phrenology, a sham practice we left behind a hundred years ago. Phrenologists often made exactly the kind of superficial, meaningless observations that Kosinski’s algorithms make now. If a prominent phrenologist felt a bump above the ear in a handful of aggressive or criminal individuals, that bump went into the books as a mark of aggression.
Either way, the technology is genuinely dangerous: in China, the government already uses facial algorithms to spy on and control people. A pointed nose – a feature common among minorities in China – can be enough to make you a target. It is old nonsense in a modern, racist guise.
This column also appears in the summer issue of LOOK.