MIT’s Subramanian Sundaram has developed a sensor glove that identifies objects through touch. This could improve assistive robot performance and enhance prosthetic design. The inexpensive “scalable tactile glove” (STAG) includes 550 tiny pressure-capturing sensors. A neural network uses the sensor data to classify objects and predict their weights. No visual input is required.
In a Nature paper, the system identified objects, including a soda can, scissors, a tennis ball, a spoon, a pen, and a mug, with 76 percent accuracy.
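The classification step can be illustrated with a minimal sketch. This is not MIT’s actual model: the sensor layout, object set, and nearest-centroid rule below are stand-ins chosen for brevity, whereas the paper trains a neural network on real tactile data.

```python
import numpy as np

# Illustrative sketch only (not the STAG model): classify objects from
# flattened tactile pressure readings using a nearest-centroid rule.
N_SENSORS = 550  # sensor count taken from the article

rng = np.random.default_rng(0)

# Hypothetical "canonical grasp" pressure patterns for three objects.
templates = {
    "mug": rng.random(N_SENSORS),
    "pen": rng.random(N_SENSORS),
    "tennis ball": rng.random(N_SENSORS),
}

def make_pressure_frame(template, noise=0.05):
    """Simulate one glove reading: a grasp pattern plus sensor noise."""
    return np.clip(template + rng.normal(0, noise, N_SENSORS), 0, 1)

def classify(frame):
    """Return the object whose template is closest in Euclidean distance."""
    return min(templates, key=lambda k: np.linalg.norm(frame - templates[k]))

frame = make_pressure_frame(templates["mug"])
print(classify(frame))  # prints "mug" for low-noise frames
```

A real pipeline would replace the centroid rule with a trained network and use recorded pressure frames, but the input/output shape is the same: one vector of sensor readings in, one object label out.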
The tactile sensors could be used in combination with traditional computer vision and image-based datasets to give robots a more human-like understanding of how to interact with objects. The dataset also measured cooperation between regions of the hand during interactions, which could be used to customize prosthetics.
Similar sensor-based gloves typically cost thousands of dollars and include only around 50 sensors. The STAG glove costs approximately $10 to produce.