Real-time social touch gesture recognition for sensate robots

Heather Knight, Robert Toscano, Walter D. Stiehl, Angela Chang, Yi. DOI: 10.1109/IROS.2009.5354169

(Citations: 2)
This paper describes the hardware and algorithms for a real-time social touch gesture recognition system. Early experiments involve a Sensate Bear test rig with full-body touch sensing, sensor visualization, and gesture recognition capabilities. Algorithms are based on data from real humans interacting with a plush bear. In developing a preliminary gesture library with thirteen Symbolic Gestures and eight Touch Subtypes, we have taken the first steps toward a Robotic Touch API, showing that the Huggable robot behavior system will be able to stream currently active sensors to detect regional social gestures and local sub-gestures in real time. The system demonstrates the infrastructure to detect three types of touching: social touch, local touch, and sensor-level touch.

I. INTRODUCTION

The physical nature of robots dictates that detecting different levels of touch is an important area of research. We define sensor-level touch as the robot's knowledge of the activation and location of each individual sensor. This helps the robot be aware of its physical boundaries. Sensor-level touch enables functional tasks, such as allowing robot grippers to operate safely, by letting the robot sense when and where it has made contact with something. As robots become social actors with the ability to physically engage human bodies, we must develop a social touch taxonomy to describe these new realms of interaction. Social touch is defined as touch that carries social value. Prior work has demonstrated the detection of local touch sub-gestures, using increased tactile resolution and gesture profiles, to detect affective content; local touch allows discrimination of a tickle from a poke. In this work, we attach a social value to touch at different body locations to determine symbolic touch, which posits that touch has locational significance, particularly on an anthropomorphic robot's body.
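As a rough illustration of the sensor-level layer described above, the following Python sketch aggregates raw sensor activations into body regions for use by higher gesture layers. The sensor IDs and region layout here are illustrative assumptions, not the paper's actual hardware mapping:

```python
# Sensor-level touch sketch. The sensor-to-region layout below is a
# hypothetical example, not the Sensate Bear's real sensor map.

REGION_OF_SENSOR = {
    0: "head", 1: "head",
    2: "belly", 3: "belly",
    4: "left_side", 5: "right_side",
    6: "left_foot", 7: "right_foot",
}

def active_regions(active_sensors):
    """Aggregate raw sensor activations (sensor-level touch) into the
    body regions currently in contact, for higher-level gesture logic."""
    return {REGION_OF_SENSOR[s] for s in active_sensors}

# Simultaneous contact on both sides yields a region pattern that a
# social layer could interpret as part of a hug.
print(active_regions({4, 5}))
```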
Our hypothesis is that body-awareness of touch, combined with the gesture profile of the touch, can allow a robot to detect the difference between a socially laden gesture (like a hug) and a local gesture (like a poke). This work unites sensor-level touch with the profiling of affective touch to create a system that can infer social meaning from the contact between a human and a teddy bear.
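The hypothesis above — that location (where the touch lands) and profile (how the touch unfolds in time) together disambiguate symbolic gestures from local sub-gestures — can be sketched as a toy rule-based classifier. This is a simplified stand-in, not the paper's actual recognizer; the event fields and thresholds are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    regions: frozenset   # body regions in contact (illustrative names)
    duration_s: float    # contact duration in seconds
    oscillating: bool    # repeated on/off contact (e.g. patting, tickling)

def classify(event: TouchEvent) -> str:
    """Toy classifier combining WHERE (regions) with HOW (temporal
    profile) to separate symbolic gestures from local sub-gestures."""
    if {"left_side", "right_side"} <= event.regions and event.duration_s > 1.0:
        return "hug"      # symbolic gesture: location carries the meaning
    if event.oscillating:
        return "tickle"   # local sub-gesture: profile carries the meaning
    if event.duration_s < 0.3:
        return "poke"     # brief local contact
    return "hold"

# Sustained contact on both sides reads as a hug...
print(classify(TouchEvent(frozenset({"left_side", "right_side"}), 2.0, False)))
# ...while a brief single-region contact reads as a poke.
print(classify(TouchEvent(frozenset({"belly"}), 0.1, False)))
```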
    • ...There have been extensive studies to utilize touch recognition systems to various robot applications such as teaching robot motions intuitively [17], controlling a robot using tactile commands [18], understanding social behaviors of humans [19], conveying human’s emotional states [20] and establishing affective relationship between a robot and humans [21]...

    Young-Min Kim et al. A robust online touch pattern recognition for dynamic human-robot inte...

    • ...Using a teddy bear with 56 capacitive touch sensors, Knight et al distinguish “touch subgestures” (low level touches such as pet, stroke, pat and hold) and “symbolic gestures” (location-dependent, with contextual social meaning such as, for this teddy bear model, feeding, rocking, hug, head-pat etc) [6]...
    • ...Our GRE uses a probabilistic Markovian model with architectural similarities to [6]; the Creature has been assessed in an operational setup for recognition success of gestures analogous to [6]’s “subgesture” class...
    • ...The GRE is most similar to the system described in Knight et al [6], but the contrast is revealed in the systematic differences of the recognized gestures; Knight’s gestures are defined primarily by localization – “head-pat”, “foot-rub”, “side-tickle”...

    Jonathan Chang et al. Gesture Recognition in the Haptic Creature
