Pupils dilate when cognitive effort increases, a demonstration that our bodies mirror what goes on in our minds. Little wonder, then, that body language has its own field of experts who tell us that how we sit in a meeting sends signals about intention or attitude.
That we can read physical signals for intention and attitude is probably a safeguard against attacks both physical and mental. But we are a subtle lot, capable of practicing deceptive body language that might fool another. It isn’t as though we are dogs wagging tails in an obviously nonthreatening greeting; without tell-tale tails, people can hide their true intentions and attitudes, except… Except for the eyes. Apparently, Mother Nature built in some fundamental responses to stimuli, such as that increased pupil diameter during hard thinking.
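For the curious, that pupillometry finding lends itself to a small illustration. Below is a minimal sketch, assuming we already have a series of pupil diameters (in millimeters) from an eye tracker; the baseline value and the dilation threshold are hypothetical choices for illustration, not figures from any cited study.

```python
import numpy as np

def flag_cognitive_effort(pupil_mm, baseline_mm, threshold_mm=0.3):
    """Flag samples where pupil diameter exceeds a resting baseline.

    pupil_mm     : sequence of pupil diameters (mm) from an eye tracker
    baseline_mm  : resting-state mean diameter (mm) for the same person
    threshold_mm : hypothetical dilation threshold marking "hard thinking"
    """
    dilation = np.asarray(pupil_mm) - baseline_mm   # task-evoked change from rest
    return dilation > threshold_mm                  # True where effort is inferred

# Example: a made-up ten-sample trace with a burst of dilation in the middle
trace = [3.1, 3.0, 3.2, 3.6, 3.8, 3.9, 3.7, 3.3, 3.1, 3.0]
print(flag_cognitive_effort(trace, baseline_mm=3.1))
```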
Remember that eyes are “windows of the soul” and that vision is a brain function aided by eyes. We ignore what is insignificant in our field of view in favor of what we perceive to be relevant to our lives. Occasionally, we seem to throw relevance to the wind. I suppose that staring is one of those eye responses that comes from the brain, as daydreaming, for example, takes over and the eyes ignore movements in a scene. “Hey, are you paying attention?” we hear during those moments. And we also know that the brain fills in a scene and ignores or extrapolates for the “blind spots” in our eyes. Staring and dilating pupils (in response to heavy thinking) indicate that the eye-brain relationship is a two-way street, just as my eye to your eye is a two-way street.
I assume all but the blind become rather good at reading eyes through experience. And maybe some of those eye responses are imitations learned from adults when each of us is a child. We do, as we know, have mirror neurons that express themselves outwardly in imitative movements. Maybe brains hard-pressed to comprehend have long dilated their attendant pupils, and groups of brainstorming individuals see the eyes of one another doing whatever eyes do, possibly leading to learning by imitation.
So, we shouldn’t be surprised to see some researchers devoted to studying the human gaze, especially if they think that developing an algorithm that ties gaze to response or intention can open the door to more humanlike artificial intelligence.* Gaze encodes a scene, sends that message to the interpretative brain, and ends in some response. If we could just capture the intricacies of human eye movements, we might be able to make that compassionate (or angry) robot, the Star Trek Data or 2001: A Space Odyssey HAL that, although not truly feeling, acts as though it is feeling. Imagine that perceptive AI that can read in your gaze your intention or attitude just as you read the dog’s wagging tail.
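To make “an algorithm that ties gaze to response” a bit more concrete, here is a minimal sketch of one trick that appears in the gaze-and-imitation literature cited below: turn a recorded gaze point into an attention map and use it to weight what a learned policy sees, so the machine attends roughly where the human demonstrator looked. The function names, the Gaussian spread, and the weighting scheme are my own illustrative assumptions, not the architecture of any of the footnoted papers.

```python
import numpy as np

def gaze_attention_map(h, w, gaze_xy, sigma=10.0):
    """Build a Gaussian attention map centered on a recorded gaze point.

    gaze_xy : (x, y) pixel coordinates of where the human looked
    sigma   : spread of attention in pixels (an illustrative choice)
    """
    ys, xs = np.mgrid[0:h, 0:w]
    gx, gy = gaze_xy
    return np.exp(-((xs - gx) ** 2 + (ys - gy) ** 2) / (2 * sigma ** 2))

def gaze_weighted_observation(frame, gaze_xy):
    """Weight an image frame by the gaze map so a cloned policy 'sees'
    roughly what the demonstrator attended to."""
    attn = gaze_attention_map(frame.shape[0], frame.shape[1], gaze_xy)
    return frame * attn[..., None]  # broadcast the map over color channels

# Toy usage: a flat gray 64x64 frame, with the demonstrator looking near center
frame = np.full((64, 64, 3), 0.5)
weighted = gaze_weighted_observation(frame, gaze_xy=(30, 34))
print(weighted.shape, round(weighted.max(), 2))
```

A gaze-weighted observation like this could then be fed to an ordinary behavioral-cloning learner; the gaze simply biases what the learner treats as relevant, much as our own eyes do.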
Look around. Look at looks. What do you see? What do others see in your looks? And what kind of AI will see as you see, process what is seen as you process, and act as you might act on the basis of visual evolution and experience? I guess we are destined to remain pupils of humanity for as long as there are people or their AI proxies and avatars.
*There are many research articles on this and related topics; here are four.
Brenna D. Argall, Sonia Chernova, Manuela Veloso, and Brett Browning. A survey of robot learning from demonstration. Robotics and Autonomous Systems, 57(5):469–483, 2009.
Congcong Liu, Yuying Chen, Lei Tai, Haoyang Ye, Ming Liu, and Bertram E. Shi. A gaze model improves autonomous driving. In Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications, page 33. ACM, 2019.
Faraz Torabi, Garrett Warnell, and Peter Stone. Behavioral cloning from observation. In IJCAI, pages 4950–4957. AAAI Press, 2018.
Michael F. Land. Vision, eye movements, and natural behavior. Visual Neuroscience, 26(1):51–62, 2009.