Berkeley’s Bob Knight discussed (and demonstrated) decoding language from direct brain recordings at ApplySci’s recent Wearable Tech + Digital Health + Neurotech Silicon Valley conference at Stanford.
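The talk did not detail the decoding pipeline, but a common framing is to classify which word was produced from a pattern of electrode activity. A minimal sketch, with entirely synthetic "high-gamma" features and an invented two-word vocabulary (none of this reflects Knight's actual method):

```python
import numpy as np

# Hypothetical sketch: nearest-template word decoding from electrode activity.
# Electrode count, templates, and noise levels are all invented for illustration.
rng = np.random.default_rng(0)

N_ELECTRODES = 16
templates = {
    "yes": rng.normal(1.0, 0.1, N_ELECTRODES),   # fake activity pattern for "yes"
    "no": rng.normal(-1.0, 0.1, N_ELECTRODES),   # fake activity pattern for "no"
}

def decode(trial):
    """Pick the word whose template is closest (Euclidean) to the trial."""
    return min(templates, key=lambda w: np.linalg.norm(trial - templates[w]))

# Simulate a noisy recording of "yes" and decode it.
trial = templates["yes"] + rng.normal(0, 0.2, N_ELECTRODES)
print(decode(trial))  # prints: yes
```

Real systems replace the templates with models trained on many trials, but the decode-by-similarity structure is the same.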
Stanford’s Zhenan Bao has developed an artificial sensory nerve system that can activate the twitch reflex in a cockroach and identify letters in the Braille alphabet. Bao describes it as “a step toward making skin-like sensory...
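Braille identification from an artificial skin reduces to mapping which of a cell's six dots the sensors report as raised. A minimal sketch, assuming a 2x3 array of pressure readings and a thresholding scheme that is invented here (the dot-to-letter patterns themselves are standard Braille):

```python
# Hypothetical sketch: Braille letter identification from six pressure sensors.
# Dots are numbered 1-6 (two columns of three); threshold value is invented.
BRAILLE = {
    frozenset({1}): "a",        # dot 1
    frozenset({1, 2}): "b",     # dots 1, 2
    frozenset({1, 4}): "c",     # dots 1, 4
}

def read_cell(pressures, threshold=0.5):
    """Map six pressure readings to the set of raised dots, then to a letter."""
    dots = frozenset(i + 1 for i, p in enumerate(pressures) if p > threshold)
    return BRAILLE.get(dots, "?")

print(read_cell([0.9, 0.8, 0.1, 0.0, 0.1, 0.0]))  # dots 1+2 -> prints: b
```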
Rice University’s Jacob Robinson, with Yale and Columbia colleagues, is developing FlatScope — a flat, brain-implanted microscope capable of monitoring and triggering neurons modified to be fluorescent when active. While capturing greater detail than...
Ipsihand, developed by Eric Leuthardt and Washington University colleagues, is a brain-controlled glove that helps reroute hand control to an undamaged part of the brain. The system uses a glove or brace on the...
Bolu Ajiboye and Case Western colleagues used an implanted BrainGate2 brain-computer interface to allow a patient with tetraplegia to control arm movements via an implanted functional electrical stimulation (FES) system. A robotic arm, which was needed...
University of Oldenburg student Carlos Filipe da Silva Souto is in the early stages of developing a brain-computer interface that can tell a user whom he or she is listening to in a noisy room. Wearers...
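The teaser does not say how the device works, but a standard approach to this "cocktail party" problem correlates the listener's EEG with each speaker's speech envelope and picks the best match. A minimal sketch with entirely synthetic envelopes and EEG (the signals, mixing weight, and noise level are all invented):

```python
import numpy as np

# Hypothetical sketch: auditory attention decoding by envelope correlation.
# All signals are synthetic; real systems use recorded EEG and speech envelopes.
rng = np.random.default_rng(1)
n = 640  # e.g., 10 s at a 64 Hz envelope rate (assumed)

env_a = np.abs(rng.normal(size=n))  # fake speech envelope, speaker A
env_b = np.abs(rng.normal(size=n))  # fake speech envelope, speaker B

# Simulated EEG: tracks speaker A's envelope, plus noise.
eeg = 0.6 * env_a + rng.normal(scale=0.5, size=n)

def attended(eeg, envelopes):
    """Return the label of the envelope most correlated with the EEG."""
    corr = {k: np.corrcoef(eeg, v)[0, 1] for k, v in envelopes.items()}
    return max(corr, key=corr.get)

print(attended(eeg, {"A": env_a, "B": env_b}))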
University of Minnesota professor Bin He has created a brain-computer interface to control a robotic arm without an implant. In a recent study, EEG alone was used to allow 8 people to move objects in...
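Noninvasive control of this kind typically rests on motor imagery: imagining movement suppresses the mu rhythm (8–12 Hz) over motor cortex, and a classifier maps that band power to a command. A minimal sketch with synthetic one-channel signals (the sampling rate, threshold, and signals are invented, and real systems use trained classifiers over many channels):

```python
import numpy as np

# Hypothetical sketch: "move" vs "rest" from mu-band (8-12 Hz) EEG power.
# Motor imagery suppresses mu power; all values here are synthetic.
rng = np.random.default_rng(2)
fs = 128                      # assumed sampling rate, Hz
t = np.arange(fs) / fs        # one second of samples

def mu_power(signal):
    """Total spectral power in the 8-12 Hz band, via the real FFT."""
    freqs = np.fft.rfftfreq(signal.size, 1 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    return spectrum[(freqs >= 8) & (freqs <= 12)].sum()

def classify(signal, threshold=1000.0):
    """Low mu power (suppression) -> imagined movement; threshold is invented."""
    return "move" if mu_power(signal) < threshold else "rest"

rest = 3 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.3, fs)  # strong mu rhythm
move = rng.normal(0, 0.3, fs)                                    # mu suppressed

print(classify(rest), classify(move))  # prints: rest move
```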