Personality tests, from makeshift Facebook quizzes to serious psychological sorters, are all the rage these days. Many of these are based on the so-called "Big Five" personality traits that most contemporary personality psychologists believe make up our many varied temperaments: extraversion, agreeableness, openness, conscientiousness and neuroticism.
Soon, though, all of these tests could become obsolete. According to a University of South Australia press release, artificial intelligence should be able to infer your personality just from the way your eyes move.
Researchers at the University of South Australia, in collaboration with the University of Stuttgart, Flinders University, and the Max Planck Institute for Informatics in Germany, have developed algorithms capable of reliably identifying four of the Big Five traits in a human subject by merely tracking eye movement.
It seems implausible, not to mention a tad creepy, that so much of who you are can be revealed by looking into your eyes, but it turns out that your eyes might quite literally be the windows into your soul.
"Thanks to our machine-learning approach, we not only validate the role of personality in explaining eye movement in everyday life, but also reveal new eye movement characteristics as predictors of personality traits," said Dr. Tobias Loetscher, one of the lead researchers on the project.
To test the accuracy of their algorithms, the researchers fitted 42 study participants with eye-tracking headsets and asked each to walk through a store and find something to purchase. The algorithms then analyzed the recorded eye movements and predicted each participant's personality, and the researchers compared those predictions against standard personality questionnaires the subjects had also filled out. The machines' predictions matched the questionnaire results significantly better than chance.
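The basic idea of "better than chance" classification can be sketched in miniature. The toy example below is not the researchers' actual method: the features (fixation duration, saccade rate, blink rate), their supposed link to extraversion, and all the numbers are invented for illustration, and a simple nearest-centroid rule stands in for the study's real machine-learning pipeline. It just shows how labelled eye-movement features could be split into training and held-out sets, and how held-out accuracy is compared against the 50% chance baseline.

```python
import random
import statistics

random.seed(0)

# Hypothetical feature profiles: (mean fixation duration in ms,
# saccades per second, blinks per minute). The values and their
# link to extraversion are assumptions made up for this sketch.
HIGH = (240.0, 3.0, 12.0)   # assumed "more extraverted" profile
LOW = (320.0, 2.4, 17.0)    # assumed "less extraverted" profile
SPREAD = (20.0, 0.5, 2.0)   # per-feature noise (standard deviation)

def simulate_subject(extravert):
    """Draw a noisy feature vector around the assumed profile."""
    base = HIGH if extravert else LOW
    return [random.gauss(mu, sd) for mu, sd in zip(base, SPREAD)]

# 42 labelled synthetic "subjects", split into training and held-out sets.
subjects = [(simulate_subject(i % 2 == 0), i % 2 == 0) for i in range(42)]
train, test = subjects[:30], subjects[30:]

def centroid(rows):
    """Mean feature vector of a group of subjects."""
    return [statistics.mean(col) for col in zip(*rows)]

c_high = centroid([x for x, label in train if label])
c_low = centroid([x for x, label in train if not label])

def predict(x):
    """Nearest-centroid rule: the closer class centroid wins."""
    d_high = sum((a - b) ** 2 for a, b in zip(x, c_high))
    d_low = sum((a - b) ** 2 for a, b in zip(x, c_low))
    return d_high < d_low

# Accuracy on subjects the classifier never saw during training.
accuracy = sum(predict(x) == label for x, label in test) / len(test)
print(f"held-out accuracy: {accuracy:.2f} (chance baseline is ~0.50)")
```

Because the synthetic profiles are well separated, the toy classifier lands far above the 50% baseline; with real, noisy human data, the margin over chance is what the researchers' statistical comparison had to establish.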
"There’s certainly the potential for these findings to improve human-machine interactions," said Loetscher. "People are always looking for improved, personalized services. However, today’s robots and computers are not socially aware, so they cannot adapt to non-verbal cues. This research provides opportunities to develop robots and computers so that they can become more natural, and better at interpreting human social signals."
The fact that the study was performed in a store while participants picked something to purchase raises a few alarm bells about how this research might be used for commercial purposes, but it also has a lot of potential to make our interactions with technology far more useful.
As long as it's used with the consent of users, this technology could even help foster more positive social interactions in a world where such interactions increasingly happen online. Maybe we'll even come to understand each other, and ourselves, a little bit better.