Oral hapsis guides accurate hand preshaping for grasping food targets in the mouth.

Research paper by Jenni M. Karl, Lori-Ann R. Sacrey, Jon B. Doan, Ian Q. Whishaw

Indexed on: 12 Jul '12
Published on: 12 Jul '12
Published in: Experimental Brain Research



Abstract

Preshaping the digits and orienting the hand when reaching to grasp a distal target is proposed to be optimal when guided by vision. A reach-to-grasp movement toward an object in one's own mouth is a natural and commonly used movement, but there has been no previous description of how it is performed. The movement requires accuracy but likely depends upon haptic rather than visual guidance, raising the question of whether its kinematics resemble those of visually guided reaching or whether the movement depends upon an alternate strategy. The present study used frame-by-frame video analysis and linear kinematics to analyze hand movements as participants reached for ethologically relevant food targets placed either at a distal location or in the mouth. When reaching for small and medium-sized food items (blueberries and donut balls) that had maximal lip-to-target contact, hand preshaping was equivalent to that used for visually guided reaching. When reaching for a large food item (an orange slice) that extended beyond the edges of the mouth, hand preshaping was suboptimal compared to that under vision. Nevertheless, hapsis from the reaching hand was used to reshape and reorient the hand after first contact with the large target. The equally precise guidance of hand preshaping under oral hapsis is discussed in relation to the idea that hand preshaping, and its requisite neural circuitry, may have originated under somatosensory control, with secondary access by vision.