[#]: collector: (lujun9972)
[#]: translator: ( )
[#]: reviewer: ( )
[#]: publisher: ( )
[#]: url: ( )
[#]: subject: (A $399 device that translates brain signals into digital commands)
[#]: via: (https://www.networkworld.com/article/3526446/nextmind-wearable-device-translates-brain-signals-into-digital-commands.html)
[#]: author: (Patrick Nelson https://www.networkworld.com/author/Patrick-Nelson/)
A $399 device that translates brain signals into digital commands
======
Startup NextMind is readying a $399 development kit for its brain-computer interface technology that enables users to interact, hands-free, with computers and VR/AR headsets.
(Image credit: MetamorWorks / Getty Images)
Scientists have long envisioned brain-sensing technology that can translate thoughts into digital commands, eliminating the need for computer-input devices like a keyboard and mouse. One company is preparing to ship its latest contribution to the effort: a $399 development package for a noninvasive, AI-based, brain-computer interface.
The kit will let "users control anything in their digital world by using just their thoughts," claims [NextMind][1], a commercial spinoff of a cognitive neuroscience lab, in a [press release][2].
The company says that its puck-like device inserts into a cap or headband and rests on the back of the head. The dry-electrode receiver then captures the electrical signals generated by neuron activity and uses machine-learning algorithms to convert that signal output into computer controls. The interaction could be with a computer, an augmented-reality or virtual-reality headset, or an [IoT][4] module.
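To make that pipeline concrete, here is a minimal illustrative sketch in Python of how a window of electrode data might be mapped to a command. The sampling rate, window length, log-power features, and scikit-learn-style `classifier` are generic EEG-processing assumptions, not NextMind's proprietary algorithms.

```python
import numpy as np

# Illustrative sketch only: a generic EEG-style decode step (window ->
# features -> command label). The sampling rate, window length, and
# feature choice are assumptions, not NextMind's proprietary pipeline.

SAMPLE_RATE_HZ = 256   # assumed dry-electrode sampling rate
WINDOW_SEC = 0.5       # decode on half-second windows of neural data

def extract_features(window: np.ndarray) -> np.ndarray:
    """Collapse a (channels, samples) window into per-channel log power,
    a common baseline feature in EEG classification."""
    return np.log(np.mean(window ** 2, axis=1) + 1e-12)

def decode_command(window: np.ndarray, classifier) -> str:
    """Map one window of electrode data to a command such as 'select' or 'idle'."""
    features = extract_features(window).reshape(1, -1)
    return classifier.predict(features)[0]  # scikit-learn-style interface assumed
```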
"Imagine taking your phone to send a text message without ever touching the screen, without using Siri, just by using the speed and power of your thoughts," said NextMind founder Sid Kouider in a [video presentation][5] at Helsinki startup conference Slush in late 2019.
Advances in neuroscience are enabling real-time consciousness-decoding, without surgery or a doctor visit, according to Kouider.
One obstacle that has thwarted previous efforts is the human skull, which can act as a barrier to sensors. It's been difficult for scientists to differentiate indicators from noise, and some past efforts have only been able to discern basic things, such as whether or not a person is in a state of sleep or relaxation. New materials, better sensors, and more sophisticated algorithms and modeling have overcome some of those limitations. NextMind's noninvasive technology "translates the data in real time," Kouider says.
Essentially, a person's eyes project an image of what they see onto the visual cortex at the back of the head, a bit like a projector. The NextMind device decodes the neural activity created as the object is viewed and sends that information, via an SDK, back as an input to a computer. So, by fixing one's gaze on an object, one selects that object. For example, a user could select a screen icon by glancing at it.
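From an application developer's point of view, that decode-and-select loop could surface as an event callback. Below is a rough sketch assuming a hypothetical `GazeClient` wrapper; the class and its `on`/`emit` methods are invented for illustration and are not NextMind's real SDK surface.

```python
from typing import Callable, Dict

class GazeClient:
    """Hypothetical stand-in for a BCI SDK that emits a 'focus' event when
    the decoder detects sustained visual attention on a tagged object."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[str], None]] = {}

    def on(self, event: str, handler: Callable[[str], None]) -> None:
        # Register an application callback for a decoder event.
        self._handlers[event] = handler

    def emit(self, event: str, target_id: str) -> None:
        # In a real system this would be driven by the decoded neural signal.
        if event in self._handlers:
            self._handlers[event](target_id)

client = GazeClient()
client.on("focus", lambda target: print(f"selected: {target}"))
client.emit("focus", "screen-icon-42")  # simulated decode of a fixed gaze
```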
"The demos were by no means perfect, but there was no doubt in my mind that the technology worked," [wrote VentureBeat writer Emil Protalinski][7], who tested a pre-release device in January.
Kouider has said it's the "intent" aspect of the technology that's most interesting: if a person focuses on one thing more than another, the technology can decode the neural signals to capture that user's intent.
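One plausible reading of that intent decoding, offered here as an assumption rather than a published detail of NextMind's method, is a comparison of decoded attention scores across candidate targets, acting only when the strongest clearly dominates:

```python
from typing import Dict, Optional

def pick_intent(attention: Dict[str, float], margin: float = 0.2) -> Optional[str]:
    """Return the target with the strongest decoded attention score, but only
    when it beats the runner-up by a clear margin; otherwise do nothing."""
    if not attention:
        return None
    ranked = sorted(attention.items(), key=lambda kv: kv[1], reverse=True)
    if len(ranked) == 1 or ranked[0][1] - ranked[1][1] >= margin:
        return ranked[0][0]
    return None  # ambiguous focus: take no action

# Example: hypothetical attention scores decoded for three on-screen targets.
print(pick_intent({"play": 0.71, "pause": 0.30, "volume": 0.25}))  # -> play
```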
"It really gives you a kind of sixth sense, where you can feel your brain in action, thanks to the feedback loop between your brain and a display," Kouider says in the Slush presentation.
--------------------------------------------------------------------------------
via: https://www.networkworld.com/article/3526446/nextmind-wearable-device-translates-brain-signals-into-digital-commands.html
Author: [Patrick Nelson][a]
Selected by: [lujun9972][b]
Translator: [译者ID](https://github.com/译者ID)
Proofreader: [校对者ID](https://github.com/校对者ID)
This article is originally compiled by [LCTT](https://github.com/LCTT/TranslateProject) and is proudly presented by [Linux中国](https://linux.cn/)
[a]: https://www.networkworld.com/author/Patrick-Nelson/
[b]: https://github.com/lujun9972
[1]: https://www.next-mind.com/
[2]: https://www.businesswire.com/news/home/20200105005107/en/CES-2020-It%E2%80%99s-Mind-Matter
[4]: http://www.networkworld.com/cms/article/3207535
[5]: https://youtu.be/RHuaNDSxH0o
[7]: https://venturebeat.com/2020/01/05/nextmind-is-building-a-real-time-brain-computer-interface-unveils-dev-kit-for-399/