Artificial Intelligence (AI) can successfully predict a person’s political orientation based on images of a blank facial expression, a development that researchers say shows that facial recognition technology is “more threatening than previously thought” and poses “serious challenges to privacy.” It also supports the idea that physiognomy, often discredited as ‘pseudoscience,’ may, in fact, be a valid practice.
A recent study published in the journal American Psychologist revealed that an algorithm’s ability to accurately guess a person’s political views is “on par with how well job interviews predict job success, or alcohol drives aggressiveness.” According to lead author Michal Kosinski, the 591 participants filled out a political orientation questionnaire. AI then captured what Kosinski described as numerical “fingerprints” of participants’ faces and compared them to a database to predict political views.
“Participants wore a black T-shirt adjusted using binder clips to cover their clothes. They removed all jewelry and – if necessary – shaved facial hair. Face wipes were used to remove cosmetics until no residues were detected on a fresh wipe. Their hair was pulled back using hair ties, hair pins, and a headband while taking care to avoid flyaway hairs,” the study’s authors wrote.
A facial recognition algorithm — VGGFace2 — then examined the images to determine “face descriptors, or a numerical vector that is both unique to that individual and consistent across their different images.”
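To make the idea of a face descriptor concrete, here is a minimal sketch, assuming descriptors are plain numeric vectors compared by cosine similarity (the dimensionality, noise level, and match threshold below are illustrative assumptions, not values from the study; real VGGFace2 descriptors are far higher-dimensional):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two descriptor vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(1)

# Hypothetical descriptors: two images of the same person should yield
# nearly identical vectors; a different person yields an unrelated one.
person_a_img1 = rng.normal(size=128)
person_a_img2 = person_a_img1 + rng.normal(scale=0.05, size=128)  # small within-person variation
person_b = rng.normal(size=128)

THRESHOLD = 0.8  # assumed match threshold, not taken from the study
same_person = cosine_similarity(person_a_img1, person_a_img2) > THRESHOLD
diff_person = cosine_similarity(person_a_img1, person_b) > THRESHOLD
```

Here `same_person` comes out `True` and `diff_person` `False`: the two images of the same face point in nearly the same direction in descriptor space, while unrelated faces do not.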
‘MORE THREATENING THAN PREVIOUSLY THOUGHT.’
“Descriptors extracted from a given image are compared to those stored in a database. If they are similar enough, the faces are considered a match. Here, we use a linear regression to map face descriptors on a political orientation scale and then use this mapping to predict political orientation for a previously unseen face,” the study said.
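The mapping step the authors describe can be sketched as an ordinary least-squares fit from descriptor vectors to an orientation score. The snippet below uses synthetic data standing in for real VGGFace2 descriptors and questionnaire scores (the sizes and noise level are assumptions for illustration only):

```python
import numpy as np
from numpy.linalg import lstsq

rng = np.random.default_rng(0)

# Synthetic stand-ins: 500 faces with 128-dim descriptors (real descriptors
# are much larger) and a hypothetical continuous orientation score.
n_faces, dim = 500, 128
descriptors = rng.normal(size=(n_faces, dim))
true_w = rng.normal(size=dim)
orientation = descriptors @ true_w + rng.normal(scale=0.1, size=n_faces)

# Fit a linear regression mapping descriptors -> orientation scale.
X = np.hstack([descriptors, np.ones((n_faces, 1))])  # append intercept column
w, *_ = lstsq(X, orientation, rcond=None)

# Use the fitted mapping to predict orientation for a previously unseen face.
unseen = rng.normal(size=dim)
predicted = np.concatenate([unseen, [1.0]]) @ w
```

In this toy setup the fitted weights recover the generating relationship, so the prediction for the unseen face lands close to its true score; the study's real pipeline differs in that the descriptors come from a face-recognition network rather than being simulated.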
The study’s authors observed that an “analysis of facial features associated with political orientation revealed that conservatives tended to have larger lower faces.”
“Perhaps most crucially, our findings suggest that widespread biometric surveillance technologies are more threatening than previously thought,” the study warns. “Our results, suggesting that stable facial features convey a substantial amount of the signal, imply that individuals have less control over their privacy.”