👋 A scientific team has developed a wrist-worn device that lets users interact with computers through hand gestures. https://t.co/UOFtpyU6WA
"In the past decade or so, scientists have found that Wi-Fi signals can be used for various sensing applications, such as seeing through walls, detecting falls, sensing the presence of humans, and recognizing gestures including sign language." Reminds me of Asimov's Robot series
👇‼️Meta very smartly acquired this from Lux family co CTRL-Labs https://t.co/tRwO2hBxaO
Meta’s Reality Labs researchers have unveiled a prototype wristband that translates electrical signals from forearm muscles into computer commands, allowing users to move a cursor, open applications, or write in mid-air without touching a screen. The non-invasive device, detailed in a peer-reviewed paper in the journal Nature, relies on surface electromyography (sEMG) sensors and artificial-intelligence models that interpret muscle activity in real time and transmit inputs wirelessly over Bluetooth.

The decoding system was trained on data collected from about 10,000 volunteers, enabling the wristband to work immediately for new users without extensive calibration. In laboratory tests, the prototype recognized handwriting gestures at an average of 20.9 words per minute and supported a range of point-and-click interactions. Meta has released more than 100 hours of anonymized sEMG recordings from 300 participants to spur further research.

Meta is also collaborating with Carnegie Mellon University to evaluate the technology with people who have spinal-cord injuries, positioning the wristband as a potential assistive tool for those with limited mobility. By offering a camera-free, surgery-free interface, the project provides an alternative to more invasive efforts such as Elon Musk’s Neuralink and underscores Meta’s broader push to develop next-generation input methods for augmented- and virtual-reality devices.
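To make the sEMG-decoding idea concrete, here is a minimal toy sketch of the general pipeline: window the multi-channel muscle signal, extract a simple amplitude feature per channel, and map features to gesture labels. Everything here is illustrative — the gesture names, channel counts, and the nearest-centroid classifier are assumptions for demonstration; Meta's actual system uses deep networks trained on data from roughly 10,000 participants.

```python
import math
import random

def rms_features(window):
    """Root-mean-square amplitude per channel, a standard sEMG feature."""
    return [math.sqrt(sum(s * s for s in ch) / len(ch)) for ch in window]

class CentroidDecoder:
    """Hypothetical nearest-centroid gesture decoder (illustrative only)."""

    def __init__(self):
        self.centroids = {}  # gesture label -> mean feature vector

    def fit(self, labeled_windows):
        sums, counts = {}, {}
        for label, window in labeled_windows:
            f = rms_features(window)
            if label not in sums:
                sums[label], counts[label] = [0.0] * len(f), 0
            sums[label] = [a + b for a, b in zip(sums[label], f)]
            counts[label] += 1
        self.centroids = {
            lab: [v / counts[lab] for v in vec] for lab, vec in sums.items()
        }

    def predict(self, window):
        f = rms_features(window)
        # Pick the gesture whose centroid is nearest in feature space.
        return min(
            self.centroids,
            key=lambda lab: sum(
                (a - b) ** 2 for a, b in zip(f, self.centroids[lab])
            ),
        )

# Simulate two made-up gestures as bursts on different channels.
random.seed(0)

def fake_window(active_ch, n_channels=4, n_samples=64):
    return [
        [random.gauss(0, 1.0 if ch == active_ch else 0.1)
         for _ in range(n_samples)]
        for ch in range(n_channels)
    ]

train = [("pinch", fake_window(0)) for _ in range(20)]
train += [("swipe", fake_window(2)) for _ in range(20)]

decoder = CentroidDecoder()
decoder.fit(train)
print(decoder.predict(fake_window(0)))  # → pinch
print(decoder.predict(fake_window(2)))  # → swipe
```

A real decoder would replace the centroid lookup with a learned model and stream predictions continuously over Bluetooth, but the window-features-classify structure is the same.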