Volume 4, Issue 2
  • ISSN: 2213-3852
  • E-ISSN: 2213-3860

Abstract

Background: When we use our fingers to explore the fine surface of an object (spatial features smaller than 200 μm), the relevant temporal features are encoded by cutaneous mechanoreceptor afferents, and this encoding is widely believed to give rise to the perception of roughness. However, whether visual input influences the haptic perception of fine surfaces, and how the haptic and visual modalities interact, remain open questions. Objective: In the present study, fifteen healthy volunteers participated in a series of unimodal (haptic-haptic (HH) and visual-visual (VV)) and bimodal (haptic-haptic & visual (HHv) and visual-visual & haptic (VVh)) fine surface roughness estimation tasks. Methods: In the HH and VV tasks, subjects estimated the roughness of a test surface by comparing it to a standard surface. In the HHv and VVh tasks, the procedure was the same as in the unimodal tasks, but the haptic and visual surfaces were presented simultaneously. Results: Our results suggest that both the visual and haptic roughness estimates were influenced by information from the other modality. Conclusion: We propose that humans store a modality-independent, dimensionless quantity in the brain when estimating the roughness of a fine surface.

DOI: 10.2174/2213385204666160503161135
Published online: 2016-06-01

  • Article Type:
    Research Article
Keyword(s): fine surface texture; haptic-visual crossmodal; texture perception; touch; vision