09/20/2017 / By Isabelle Z.
At the recent unveiling of its newest iPhone models, Apple bragged about its use of face-reading artificial intelligence (AI). The phones will use the technology in place of fingerprint readers and numeric passcodes to unlock them, and Apple claims the Face ID system is so smart that it can even identify users when they are wearing masks. However, the approach is drawing a lot of scrutiny from those who feel it infringes too much on their privacy. Some experts are pointing out just how far face-reading AI can go – and it’s a very scary prospect.
Stanford University data scientist and psychologist Dr. Michal Kosinski says that such programs will be capable not only of reading your face for authentication purposes but also of determining your sexual orientation. He also believes that algorithms will soon be able to determine from people’s faces whether they have certain personality traits or are predisposed to criminal behavior.
He put his theory to the test by training an AI on a batch of online dating photos taken from profiles where people had listed their sexual orientation. The software scanned the contours of each person’s face, nose, chin and cheekbones, measured the ratios between features, and logged which ratios appeared more often in straight people and which in gay people. For example, the researchers say gay men tend to have longer noses and narrower jaws, while lesbians tend to have bigger jaws. After learning these patterns, the program was presented with faces it had not seen before, and it was able to determine a person’s sexual orientation with accuracy rates of 91 percent for men and 83 percent for women.
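To make the general approach concrete, here is a minimal, purely hypothetical sketch of that kind of pipeline: a simple classifier trained on facial-measurement ratios and then evaluated on faces it has never seen. This is not Kosinski’s actual code or data; the features, numbers, and labels below are invented placeholders for illustration only.

```python
# Hypothetical sketch: classify a binary label from facial-measurement ratios.
# NOT the study's code; features and data are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Invented features, e.g. nose-length/face-height or jaw-width/face-width ratios.
n_samples, n_features = 2000, 5
X = rng.normal(size=(n_samples, n_features))

# Synthetic labels loosely correlated with the first two ratios, standing in
# for whatever signal (if any) real photos might carry.
logits = 1.5 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(scale=1.0, size=n_samples)
y = (logits > 0).astype(int)

# Train on one set of "faces", then test on faces the model has never seen.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)
clf = LogisticRegression().fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

The point of the sketch is simply that once facial measurements are reduced to numbers, off-the-shelf statistical tools can look for patterns in them – which is exactly why critics worry about how easily such systems can be built and misused.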
Now, he is working on a type of AI software that identifies people’s political beliefs, something he says is possible because studies show that political views are often heritable. The genetic or developmental factors underlying political leanings could cause detectable facial differences.
In fact, he said that past studies have found conservative politicians to be more attractive overall than liberal ones. This could be because attractive people may have an easier time getting ahead in life.
Kosinski also believes that facial recognition could be used to estimate IQ, and that schools could one day use facial scan results when considering prospective students.
There are a lot of concerns about this type of software. For example, in countries where homosexuality is considered a crime, the software could be used to target people for prosecution based on their sexual orientation. It could also be used by bullies to “out” people, possibly exposing them to hate crimes. Making matters even worse is the fact that it’s not 100 percent accurate.
LGBT groups were quick to attack the study, saying its findings could be used as a weapon to harm people who are in situations where coming out of the closet could be dangerous, and they also pointed out that it could bring harm to heterosexuals who are outed inaccurately. They also criticized the study for only using white faces, making inaccurate assumptions, and not being peer-reviewed.
In the case of scanning for IQ in school admissions, it could give people with “better genetics” an unfair advantage. Predicting political beliefs could be just as dangerous if the technology fell into the wrong hands: governments could use it to identify individuals whose beliefs do not align with theirs and take preemptive action against them, which is a very disturbing thought indeed.
You might be tempted to buy into the hype about the facial AI found on the newest iPhones, but keep in mind that it’s a very slippery slope.