Researchers Have Found That Hearing Affects Visual Distance Perception

A research team from the National University of Quilmes (UNQ, as per its Spanish acronym) has found that our brain receives and combines information from sight and hearing in order to determine the distance to a visual object.

The study was recently published in Scientific Reports, a well-known journal from Nature Publishing Group. The members of the research team work at the Laboratorio de Acústica y Percepción Sonora (LAPSo, Laboratory of Acoustics and Sound Perception), based at the UNQ University School of Arts, together with members of the Laboratory of Acoustics and Lighting (LAL) of the Buenos Aires Province Scientific Research Commission (CIC-BA).

The researchers were Ramiro Vergara, Pablo Etchemendy, Ezequiel Abregú, Esteban Calcagno and Manuel Eguía from LAPSo, and Nilda Vechiatti and Federico Iasi from LAL. The result is significant because of its potential application to the development of audiovisual virtual environments, a field that has gained great momentum in recent years. It is also important for understanding the brain processes involved in combining and processing information coming from the different senses.

How the Research Was Conducted

When we see an object, our body has several sources of information that let the brain determine the distance to it. For example, the muscles of the eye tense or relax depending on the distance to the observed object so that light rays are focused sharply on the retina. In addition, each eye rotates slightly so that both point at the object, which improves the image each of them perceives. The brain keeps very fine control of these changes in the visual organs and can use this information to estimate the distance to the object with high accuracy. All of this happens naturally; we do not even notice the processes involved.
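
As an illustration of this kind of oculomotor cue, the short sketch below uses simple geometry to recover distance from the angle at which the two eyes converge on a target. It is not taken from the study; the interpupillary distance and the example angle are assumed values.

import math

# Illustrative sketch only: a geometric model of how convergence of the eyes
# could signal distance. The interpupillary distance (IPD) is a typical value
# assumed here, not a figure from the study.
def distance_from_vergence(vergence_deg, ipd_m=0.063):
    # Both eyes fixate a point straight ahead: tan(vergence/2) = (IPD/2) / distance
    half_angle = math.radians(vergence_deg) / 2.0
    return (ipd_m / 2.0) / math.tan(half_angle)

# A convergence angle of about 3.6 degrees corresponds to roughly one metre.
print(round(distance_from_vergence(3.6), 2))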

Because vision is so accurate, it was long believed that the other senses played little or no role in the processing of the information coming from our eyes. However, several research teams have recently shown that this is not the case. Different multi-sensory (or, in scientific terms, "multimodal") interactions have been identified and documented in recent years. The study carried out at UNQ confirms that this type of interaction exists in an aspect of our perception that had not previously been studied from this perspective.

The researchers carried out experiments in two LAL chambers. One experiment was performed in an "anechoic" chamber: a room whose walls, floor and ceiling are designed to absorb sound waves almost completely. The other was carried out in a "reverberation" chamber: a room covered with highly sound-reflective materials, so that a knock or a loud clap can keep sounding for several seconds before dying away.
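
The difference between the two rooms can be made concrete with Sabine's classic reverberation-time formula, a standard acoustics result not detailed in the article. The sketch below uses made-up room dimensions and absorption coefficients; Sabine's formula is only an approximation and is least accurate for very absorbent, anechoic-like rooms.

# Hedged sketch: Sabine's formula for the reverberation time RT60, the time a
# sound takes to decay by 60 dB. Volume, surface area and absorption values
# are arbitrary examples, not the real LAL chambers.
def rt60_sabine(volume_m3, surface_m2, avg_absorption):
    # RT60 = 0.161 * V / (S * alpha), with V in cubic metres and S in square metres
    return 0.161 * volume_m3 / (surface_m2 * avg_absorption)

print(rt60_sabine(200.0, 220.0, 0.95))  # highly absorbent room: ~0.15 s, sound dies almost at once
print(rt60_sabine(200.0, 220.0, 0.02))  # highly reflective room: ~7.3 s, sound rings for seconds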

In this study, led by Dr. Ramiro Vergara, a CONICET researcher at UNQ, 80 volunteers took part in a visual perception experiment: estimating the distance to luminous objects (acrylic rectangles lit with LEDs) placed at fixed positions in both rooms, in complete darkness. Most participants judged the distances to be longer in the reverberation chamber. During the experiment they heard a voice giving instructions through a loudspeaker, as well as their own voice; these sounds took longer to die away in the reverberation chamber. Furthermore, once the experiment was over, participants were asked to report the size of the room in which they had had the experience. Since everything took place in complete darkness, they could rely only on auditory information to "imagine" a place they had never seen before. All of them reported that the reverberation chamber was larger than the anechoic one.
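
To give a sense of the shape of such a comparison (this is not the authors' analysis, and the numbers below are randomly generated stand-ins, not the study's data), distance estimates collected in each chamber could be compared with a standard two-sample test:

import numpy as np
from scipy import stats

# Placeholder data only: random stand-ins for the reported distances (in metres).
rng = np.random.default_rng(0)
anechoic = rng.normal(loc=2.0, scale=0.4, size=40)
reverberant = rng.normal(loc=2.4, scale=0.4, size=40)

# Compare the mean estimate across the two chambers.
t, p = stats.ttest_ind(reverberant, anechoic)
print(f"mean anechoic: {anechoic.mean():.2f} m, mean reverberant: {reverberant.mean():.2f} m")
print(f"t = {t:.2f}, p = {p:.4f}")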

From these results the researchers drew two conclusions. First, an observer can build a mental image of the surrounding space from auditory information alone; in this case, that information came from reverberation. Second, this mental image of the surrounding space can influence the processing of visual information about the position of the observed object, increasing or decreasing the perceived distance depending on whether the surroundings are perceived as a large or a small space. In other words, the information provided by sight (the position of the eyes, muscle tension, etc.) may be interpreted in light of the "image" of the surroundings created through hearing.
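
One way to picture this idea is a purely illustrative toy model, not a model proposed by the authors: the distance suggested by the visual cues is rescaled by a hearing-derived sense of room size. The scaling rule, the reference size and the weight are all assumptions.

# Toy model only: a hearing-derived room-size percept rescales the distance
# suggested by the visual (oculomotor) cues. All parameters are assumptions.
def perceived_distance(visual_cue_distance_m, heard_room_size_m,
                       reference_room_size_m=5.0, weight=0.3):
    # Larger-sounding rooms stretch the estimate outward, smaller ones shrink it.
    scale = (heard_room_size_m / reference_room_size_m) ** weight
    return visual_cue_distance_m * scale

# Same visual cues, different acoustic context:
print(round(perceived_distance(2.0, heard_room_size_m=3.0), 2))   # small-sounding room: ~1.72 m
print(round(perceived_distance(2.0, heard_room_size_m=12.0), 2))  # large-sounding room: ~2.60 m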

