ASSOCIATE PROFESSOR - NMU
BlueSee is a sweet character, reluctant about technology at first, but being very curious, he is eager to learn. A context involving BlueSee has been created for each project.
The effect of Visual, Haptic, and Auditory Signals Perceived from Rumble Strips during Inclement Weather
Rumble strips (RS) offer ideal conditions for multimodality research. Designed to reduce crashes by alerting drowsy or inattentive drivers, their effectiveness in crash reduction is not in question, but little is known about how information from tactile vibrations and auditory rumbling is integrated under low-visibility driving conditions. In this paper, we report descriptive data on participants’ perceptual experience while driving on an RS road during a snowstorm, as well as data collected from participants driving in a simulated snowstorm environment, and we suggest directions for future research.
See Related publications: J32
How color affects temperature perception
The purpose of this study is to understand multisensory integration between the color and the temperature of an object when a conflict arises between the two. We hypothesized that when color and temperature stimuli are incongruent (blue-warm or red-cold), reaction times (RTs) will be slower than when they are congruent (blue-cold or red-warm). We used the Oculus Rift, a head-mounted display, to create a virtual environment that allowed us to control the color cues indicating a cup’s temperature, and a Peltier thermo-device to provide tactile temperature stimuli. The results confirmed our initial expectation: RTs were longer for incongruent stimuli. Participants also rated cold temperatures as warmer when presented simultaneously with a red visual cue, and warm temperatures as cooler when presented simultaneously with a blue visual cue.
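As an illustration of the 2×2 design described above (a sketch with assumed condition labels, not the study’s actual materials), the congruent and incongruent pairings can be enumerated as follows:

```python
from itertools import product

# Illustrative enumeration of the color x temperature design.
COLORS = ["red", "blue"]
TEMPERATURES = ["warm", "cold"]

# Congruent pairings per the hypothesis: red-warm and blue-cold.
CONGRUENT = {("red", "warm"), ("blue", "cold")}

def label(color, temperature):
    """Return 'congruent' or 'incongruent' for a color/temperature pair."""
    return "congruent" if (color, temperature) in CONGRUENT else "incongruent"

trials = [(c, t, label(c, t)) for c, t in product(COLORS, TEMPERATURES)]
for color, temp, lab in trials:
    print(f"{color}-{temp}: {lab}")
```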
See Related publications: A10, J29, J36
Ebbinghaus Illusion in the Tactile Modality
This project reports the first evidence for the existence of the Ebbinghaus illusion in the tactile modality. Blindfolded participants were asked to explore two sets of Ebbinghaus circles bimanually. The results show that participants are more likely to be deceived when the illusory stimulus is present than when the stimulus does not produce the illusion. These results contribute to the perception-action debate and to the two-streams hypothesis, which states that the pathways for action and perception are separate in the visual system.
See Related publications: J27
Multisensory Integration in Potters
We explored the effect of sensory deprivation (visual, auditory, or tactile) during the shaping of a cylindrical vessel on a potter’s wheel. Although there were no significant differences among the three types of deprivation, i.e., the test conditions in which one of the three senses was attenuated or deprived, the results showed a significant difference in the final height of the cylinder between the control condition (all senses available) and the test conditions.
See Related publications: A9, J26
Haptic Hallucinations
We are currently investigating the phenomenon of haptic hallucinations, more specifically formication: the sensation of insects crawling on or beneath the skin when none are actually present. The phenomenon is often experienced by people suffering from psychotic disorders such as schizophrenia. We plan to compare physiological responses in schizophrenic and non-schizophrenic participants while they wear the sleeve and are exposed, through a head-mounted display, to a visual stimulus of an insect crawling on or beneath the skin.
See Related publications: D3, J29
InGrid: Interactive Grid Table
InGrid, an Interactive Grid table, offers users several affordances: they can interact not only with tangible and intangible objects but also with other users. It is based on the concepts of embodied cognition and personal space.
See Related publications: J25, J28
Haptics and BCI
Brain-computer interfaces (BCIs) were initially conceived as a means for individuals with motor deficiencies to activate and control computers and electronic or mechanical devices solely through brain activity. Several techniques have been developed to connect a BCI to a haptic interface: some use the BCI for direct control of a robotic or haptic interface, while others adapt the haptic interface in real time to the user’s mental state. Examples can be found in virtual reality, video games, and robotics.
See Related publications: W1, W2, W3, D5, J31, W4
The effect of virtual plucked string stiffness on loudness perception
In the present study, we tested whether the mechanics of a plucked string affect how the sound it produces is perceived. To test this hypothesis, we simulated the feel of a plucked string using a high-fidelity haptic force-feedback device while simultaneously simulating its acoustic emission. This way, we could independently manipulate the two sensory inputs, how the string felt and how it sounded, with physically correct haptic interaction and accurate synchronization. When the stiffness of the string was low, the sound was perceived as softer. Interestingly, this effect was found only when the first string plucked in a comparison was less stiff than the second. The results are consistent with the inverse-effectiveness principle of multisensory integration.
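The study used a physically accurate string model coupled to a force-feedback device; as a loose illustration of the acoustic side only, plucked-string sound is often approximated with the Karplus-Strong algorithm. The sketch below is not the study’s model, and its `damping` parameter governs decay rather than the stiffness manipulated in the experiment:

```python
from collections import deque
import random

def karplus_strong(frequency_hz, duration_s, sample_rate=44100, damping=0.996):
    """Karplus-Strong plucked-string synthesis: a burst of noise circulates
    in a delay line whose length sets the pitch; a two-point average with
    gain `damping` in the feedback loop makes the tone decay and mellow."""
    n = int(sample_rate / frequency_hz)                       # delay-line length
    buf = deque(random.uniform(-1.0, 1.0) for _ in range(n))  # the "pluck"
    out = []
    for _ in range(int(duration_s * sample_rate)):
        first = buf.popleft()
        out.append(first)
        buf.append(damping * 0.5 * (first + buf[0]))          # low-pass + decay
    return out

samples = karplus_strong(440.0, 0.5)   # half a second of a 440 Hz pluck
```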
See Related publications: J23
Time perception and the progress bar illusion
In the present study, we used a progress bar illusion that allowed us to induce impressions of fast and slow “apparent” motion while the actual speed of motion and the distance covered were equivalent across all progress bars. Our results indicate that the quick steps used to produce the illusion of faster bar progression led to an overestimation of time, whereas progression in large steps produced slower apparent bar progression and thus created the illusion of dilated time. We suggest that the perception of time depends on the nature of the stimulus rather than on the speed of motion or the distance covered by the stimulus.
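The manipulation can be sketched as follows (illustrative parameters, not the study’s actual stimuli): two step schedules that cover the same distance in the same total time and differ only in step size, so apparent speed is decoupled from real speed.

```python
def step_schedule(total_px, total_ms, step_px):
    """(time_ms, position_px) pairs for a bar advancing in jumps of `step_px`
    pixels, covering `total_px` pixels in `total_ms` milliseconds overall."""
    n_steps = total_px // step_px
    dt = total_ms / n_steps
    return [(round(i * dt), i * step_px) for i in range(1, n_steps + 1)]

quick = step_schedule(total_px=300, total_ms=6000, step_px=5)   # many small steps
large = step_schedule(total_px=300, total_ms=6000, step_px=60)  # few large jumps

# Same endpoint, same duration -- only the step pattern differs.
assert quick[-1] == (6000, 300) and large[-1] == (6000, 300)
```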
See Related publications: J24
Visual History and the Peripheral Perception of Color
How is it that we perceive a richly colored world everywhere in our visual field, even though the color signals provided by the peripheral retina are quite poor? We propose that peripheral color perception is mediated by past foveal visual experience. If we have previously viewed an object in foveal vision, then the color information acquired at that time will determine the perception of the object when it is subsequently viewed in the periphery. If an object has never been viewed foveally, then its perceived color will necessarily be based on the low-quality peripheral signals; but these signals will be acquired only if attention is directed to determining the color of the peripheral object.
See Related publications: coming soon
Reflections and the visuo-proprioceptive conflict
The visuomotor system experiences conflicts when confronted with visual deformations. Although it adapts very quickly, the first trials show a higher number of errors and longer reaction times. In this project, we investigated human spatial abilities when dealing with two kinds of visual deformation: a 90° reflection (simple mirror) and a 180° reflection (double mirror). We studied participants’ performance during a drawing/tracing task and during 2D and 3D construction tasks. We believe that the visuomotor mapping of reflection tasks can be compared to a “dynamic mental rotation” when those tasks involve spatial layout manipulation.
See Related publications: A8
Auditory-tactile temporal order judgments during active exploration
This study addressed the effects of voluntary movement by the observer on multisensory perception. Specifically, we examined whether active arm movements affect the sensitivity with which people can tell apart a pair of sequential auditory and tactile stimuli, and whether they cause a systematic shift in the perceptual temporal alignment of the two sensory systems.
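In temporal order judgment (TOJ) studies, sensitivity and the shift in perceptual temporal alignment are conventionally summarized as the just-noticeable difference (JND) and the point of subjective simultaneity (PSS), read off a psychometric function fitted to the responses. A minimal sketch with hypothetical data (an illustrative grid-search fit, not the study’s analysis):

```python
import math

def cum_gauss(x, pss, sigma):
    """Cumulative Gaussian: P("tactile first") as a function of SOA x (ms)."""
    return 0.5 * (1.0 + math.erf((x - pss) / (sigma * math.sqrt(2))))

def fit_toj(soas, p_tactile_first):
    """Least-squares grid search for PSS and sigma; the JND is taken as the
    75%-25% half-width, which for a cumulative Gaussian is 0.6745 * sigma."""
    best = None
    for pss in range(-100, 101, 2):        # candidate PSS values (ms)
        for sigma in range(5, 201, 5):     # candidate slopes (ms)
            err = sum((cum_gauss(x, pss, sigma) - p) ** 2
                      for x, p in zip(soas, p_tactile_first))
            if best is None or err < best[0]:
                best = (err, pss, sigma)
    _, pss, sigma = best
    return pss, 0.6745 * sigma

# Hypothetical proportions of "tactile first" responses at each SOA (ms).
soas = [-120, -80, -40, 0, 40, 80, 120]
p = [0.05, 0.12, 0.30, 0.55, 0.80, 0.93, 0.98]
pss, jnd = fit_toj(soas, p)
```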
See Related publications: A4, J22
Tactile Suppression of Displacement
This project provided evidence for a phenomenon of tactile suppression of displacement equivalent to visual suppression of displacement. We developed a tactile device that activates stimuli according to the fingers’ displacement on the display. Visual suppression of image displacement occurs during a blink or a saccade; by analogy, tactile suppression of displacement occurs during the time intervals when a stimulus falls in the gap between the two fingers. We are currently investigating the neural activity associated with tactile suppression of displacement using event-related potentials.
See Related publications: J20, D2, A7, A6, A5, A3
Visuospatial Map Learning through Action on Cell phones
This research examined observers’ attention by comparing active learning to passive learning in the context of spatial orientation on a mobile phone. Specifically, we examined whether motor action and auditory cues affect and increase attention to the phone display. Participants engaged in a localization and relational space-learning task on a cellphone in three scrolling modes (active, marginally active, and passive) under various auditory conditions: no auditory cues, continuous auditory cues, and non-continuous auditory cues. There were two main findings. First, the active exploration modes (active and marginally active) helped participants keep their attention focused on the display during map navigation in the spatial-orientation learning task. Second, active exploration showed no benefit to spatial abilities over passive observation when a continuous auditory sound was played at an extremely fast tempo.
See Related publications: J21
Information visualization, zoomable interfaces and haptic mobile devices
My PhD work concerned zoomable user interfaces (ZUIs), haptic perception, and mobile devices. I combined these three areas in order to identify and solve practical design problems related to a haptic zoomable mobile interface. ZUIs allow the visualization of large quantities of information in a limited space, which can be problematic with a WIMP display: in a ZUI, the space is infinite in length and width, allowing the user to perform unlimited pans and zooms to navigate this multi-scale space. Implemented on a PDA, a ZUI can solve problems related to the small size of the screen, but this kind of navigation frequently leads to user disorientation (“Desert Fog”). Added to a ZUI, tactile feedback can prevent this disorientation and reduce visual/auditory cognitive load. Users’ performance was evaluated, and solutions were suggested through experimentation and theory (ecology and enaction).
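The core ZUI operation, zooming about a focus point so that the content under the cursor stays in place, can be sketched as follows (an illustration of the general technique, not the thesis implementation):

```python
def zoom_about(offset, scale, focus, factor):
    """Zoom a 1-D view by `factor` about screen point `focus`.
    A world coordinate w maps to screen as s = (w - offset) * scale.
    Rescaling about `focus` keeps the world point under `focus` fixed,
    which is what lets pans and zooms compose without disorienting jumps."""
    world_at_focus = offset + focus / scale    # invert the view transform
    new_scale = scale * factor
    new_offset = world_at_focus - focus / new_scale
    return new_offset, new_scale

# Zoom in 2x about screen x = 100 px.
offset, scale = zoom_about(offset=0.0, scale=1.0, focus=100.0, factor=2.0)
# The world point originally under the cursor is still under it.
assert abs((100.0 - offset) * scale - 100.0) < 1e-9
```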
See Related publications: J30, J19, J18, J17, J16, J15, J14, J13, J12, J11, J9, J8, J7, J6, J5, J4, J3, J1, D1, T2
Acuity and thresholds of human movements
During my master’s, I aimed to specify perceptual limits using a sensory substitution device named Tactos. Blindfolded participants had to recognize objects of different sizes using vibrotactile feedback and a stylus on a graphics tablet. Two acuities were defined: the standard acuity, which corresponds to the Snellen ratio, and the spatial acuity, which corresponds to the relative deviation between the size of a receptor field (sensor) and the size of the letter to identify.
See Related publications: J10, J2, A2, A1, T1