Human society is plagued by a problem: the amount of information keeps growing, but our ability to process it does not develop at the same pace. Future smart eyewear offers a solution to this challenge – soon we will be able to monitor the flood of visual information and our perception of it in everyday situations. This opens up possibilities to enhance brain health, neuroergonomics, and well-being, and enables earlier diagnosis of various diseases.
People blink or move their eyes hundreds of thousands of times a day. From these blinks and eye movements, we can identify parameters that carry an enormous amount of information about human visual perception, health, cognitive state, and brain function. Soon, smart eyewear will help us put this information to use.
Thanks to advances in sensor technology, we can pack various sensors into a small space: for instance, a wide range of biosignal sensors can already be integrated into eyeglass frames. With this so-called smart eyewear, we will be able to monitor, for example, eye movements and visual perception unobtrusively in a person's natural environment. This will very likely revolutionize our understanding of human visual perception in everyday situations.
Algorithms and machine learning are key tools
So far, eye movements and visual perception have been studied mainly in laboratory environments, where a person stays still and performs a specified task. As technology gradually makes it possible to measure a moving person reliably, we will gain a huge amount of new, objective information about human visual perception and eye movements. To utilize this information, however, we need new tools.
Visual perception, and thus eye movements, differ between carefully controlled laboratory conditions and real-life situations. Before the eye movements of a freely moving person can be interpreted, we must develop new algorithms that identify parameters describing visual perception. In addition, we need machine learning methods that recognize different cognitive states (such as mental load) from these parameters.
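As an illustration only, the Python sketch below shows what such a pipeline could look like: simple parameters (saccade share, fixation share, blink proportion, pupil size) are derived from gaze samples and fed to an off-the-shelf classifier of mental load. The sampling rate, the 30 deg/s saccade threshold, the synthetic data, and the random-forest model are all assumptions for the example, not a description of the actual EYES or AWARE algorithms.

# A minimal sketch, assuming gaze is recorded as (x, y, pupil) samples at a fixed rate.
# Thresholds, feature choices, and the classifier are illustrative assumptions only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

FS = 120.0  # assumed sampling rate in Hz

def extract_parameters(gaze_x, gaze_y, pupil):
    """Derive example eye-movement parameters from one time window."""
    velocity = np.hypot(np.diff(gaze_x), np.diff(gaze_y)) * FS  # gaze speed in deg/s
    saccade = velocity > 30.0             # assumed saccade velocity threshold
    blink = np.isnan(pupil)               # assume blinks appear as missing pupil samples
    fixation = ~saccade & ~blink[:-1]
    return np.array([saccade.mean(),      # share of samples classified as saccades
                     fixation.mean(),     # share classified as fixations
                     blink.mean(),        # blink proportion (a proxy for blink rate)
                     np.nanmean(pupil)])  # mean pupil size, often linked to mental load

# Synthetic stand-in data: 10-second windows labelled low (0) or high (1) mental load.
rng = np.random.default_rng(0)
windows, labels = [], []
for label in (0, 1):
    for _ in range(20):
        n = int(10 * FS)
        x, y = rng.normal(0, 2 + label, n), rng.normal(0, 2 + label, n)
        pupil = rng.normal(3 + 0.5 * label, 0.2, n)
        pupil[rng.random(n) < 0.02] = np.nan  # simulated blink samples
        windows.append(extract_parameters(x, y, pupil))
        labels.append(label)

# Train a classifier that maps the parameter vectors to cognitive-state labels.
model = RandomForestClassifier(n_estimators=100).fit(np.stack(windows), labels)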
The transition from the laboratory to studying a moving person must be made in a controlled manner, so that we can find the right features in the biosignals and understand what they can tell us. In addition, the visual environment around us and real-life situations affect people in different ways. We therefore need fast, light, and personalized algorithms, as well as efficient machine learning to develop them.
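To make the idea of personalization concrete, the sketch below normalizes each wearer's parameters against their own short calibration baseline before any classification is done. The PersonalBaseline class, the z-scoring, and the example numbers are purely illustrative assumptions, not our published method.

# A minimal sketch of personalization, assuming each wearer first records a short
# calibration baseline (e.g., a minute of relaxed viewing).
import numpy as np

class PersonalBaseline:
    """Normalizes eye-movement parameters against a person's own baseline."""
    def fit(self, baseline_params):
        params = np.stack(baseline_params)      # rows: parameter vectors from calibration
        self.mean_ = params.mean(axis=0)
        self.std_ = params.std(axis=0) + 1e-9   # guard against division by zero
        return self

    def transform(self, window_params):
        # Express a new measurement as a deviation from this person's normal range,
        # so the same downstream classifier can serve different wearers.
        return (np.asarray(window_params) - self.mean_) / self.std_

# Hypothetical usage with made-up parameter vectors (same order as in the sketch above).
calibration = [np.array([0.10, 0.85, 0.02, 3.1]), np.array([0.12, 0.83, 0.03, 3.0])]
live_window = np.array([0.20, 0.70, 0.06, 3.6])
scaled = PersonalBaseline().fit(calibration).transform(live_window)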
Virtual reality forms a bridge between the fixed laboratory environment and real life – it provides a controlled setting for developing algorithms and methods. In virtual reality, we can fully control the visual environment and observe both the stimuli a person encounters and their effects, for example, on the cognitive state. This is exactly what we have researched and developed at VTT in the EYES project funded by the Academy of Finland. The work continues until 2027 in the AWARE project, also funded by the Academy of Finland, in cooperation with the University of Tampere and the University of Lapland.
Many opportunities in the health sector
Advances in eye movement research open up significant opportunities for applications in the health sector. Smart eyewear may be available to consumers in a few years. Its increasing use will enable broader monitoring of eye movements, and the accumulated data can be used to identify, for example, neurodegenerative diseases or attention deficit disorders. In the future, the parameters identified from the eyes will provide information about our cognitive state, help us manage the load on our brain, and promote people's brain health.
Interested?
Learn more about our vision of measuring eye movements, studying visual perception, and evaluating a person's cognitive state in virtual reality.