For Memorial Day weekend, three pieces of neuroscience research relevant to the military (and with applications beyond it):
1) Navy seeks to map the mind
On brain-computer interface technology –
The true goal is to make a vehicle or a robot arm just another extension of the human body and brain.
2) PTSD Combat Veterans’ ‘Fear Circuitry’ In Brains Always On High Alert
Even when an individual with PTSD isn’t confronted by a threat or engaged in a relatively taxing mental activity, there’s still PTSD-related activity in certain areas of the brain. What does this mean?
3) Professor finds neuroscience provides insights into brains of complex and adaptive leaders
What do the brains of great leaders look like? Is there really a way to increase leadership strength via neuro-feedback?
1) Real-life Avatar: The first mind-controlled robot surrogate
Tirosh Shapira, an Israeli student, controlled the movements of a small robot over a thousand miles away using only his thoughts.
The fMRI (functional magnetic resonance imaging) scanner reads his thoughts, a computer translates those thoughts into commands, and then those commands are sent across the internet to the robot in France. The system requires training: on its own, an fMRI can only show the real-time blood flow in your brain (pictured below right). Training teaches the system that a particular “thought” (a blood-flow pattern) equates to a certain command.
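The training step described above can be sketched in code. This is a toy illustration only: the voxel counts, command names, and the simple nearest-template decoder are all my assumptions, not the actual system used in the study.

```python
import numpy as np

# Toy sketch: learn that a particular blood-flow pattern ("thought")
# equates to a certain robot command. All details here are illustrative
# assumptions, not the study's real pipeline.

rng = np.random.default_rng(0)

COMMANDS = ["forward", "left", "right"]
N_VOXELS = 50  # stand-in for an fMRI activity pattern

# Training: record several noisy activity patterns per imagined command
# and average them into one template per command.
templates = {}
for i, cmd in enumerate(COMMANDS):
    base = np.zeros(N_VOXELS)
    base[i * 10:(i + 1) * 10] = 1.0  # a distinct region "lights up"
    trials = base + rng.normal(0, 0.2, (20, N_VOXELS))
    templates[cmd] = trials.mean(axis=0)

def decode(pattern):
    """Map a new blood-flow pattern to the closest learned command."""
    return min(templates, key=lambda c: np.linalg.norm(pattern - templates[c]))

# A fresh, noisy "left" pattern should decode to the "left" command,
# which would then be sent over the network to the robot.
test_pattern = np.zeros(N_VOXELS)
test_pattern[10:20] = 1.0
test_pattern += rng.normal(0, 0.2, N_VOXELS)
print(decode(test_pattern))
```

In practice the real system must also cope with the sluggishness of the blood-flow signal, which is part of why fMRI-based control requires so much training.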
Future uses for such technology include medical applications (for people with paralysis, for example) as well as military ones.
Shapira mentions in the article that he “became one with the robot.” How would we come to feel about these robotic extensions of ourselves? If they were damaged or destroyed, would we feel as if a part of us had been killed, or, after some disappointment, would we settle for any replacement?
2) Mind-controlled robot arms show promise
Using implants to record neuronal activity in parts of the brain associated with the intention to move, researchers were able to help two people with tetraplegia manipulate a robotic arm by thinking about certain actions (e.g. lifting up a cup).
The challenge lies in decoding the neural signals picked up by a participant’s implant, and then converting those signals into digital commands that the robotic device can follow to execute the exact intended movement. The more complex the movement, the more difficult the decoding task.
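The decoding step above can be sketched as a simple linear decoder fit during a calibration phase. Everything here is a hypothetical illustration: the channel count, the linear tuning model, and the least-squares fit are my assumptions, not the method the researchers used.

```python
import numpy as np

# Hedged sketch: convert neuronal firing rates from an implant into a
# 3-D velocity command for a robotic arm. Real systems use far more
# elaborate filtering; this only illustrates the idea of decoding.

rng = np.random.default_rng(42)

N_NEURONS = 96  # toy channel count for the implanted array
DIMS = 3        # x, y, z hand velocity

# Assume each neuron's firing rate is roughly a linear function of the
# intended velocity (its "tuning"), plus noise.
true_tuning = rng.normal(0, 1, (N_NEURONS, DIMS))

def firing_rates(velocity):
    return true_tuning @ velocity + rng.normal(0, 0.5, N_NEURONS)

# Calibration: record rates while the participant imagines known
# movements, then fit a linear decoder by least squares.
calib_velocities = rng.normal(0, 1, (200, DIMS))
calib_rates = np.array([firing_rates(v) for v in calib_velocities])
decoder, *_ = np.linalg.lstsq(calib_rates, calib_velocities, rcond=None)

# Decoding: turn a new burst of activity into an arm velocity command.
intended = np.array([0.0, 0.0, 1.0])  # think "lift the cup upward"
decoded = firing_rates(intended) @ decoder
print(np.round(decoded, 2))
```

The point of the sketch is that every extra degree of freedom (more joints, finer grips) adds columns the decoder must disentangle from noisy spikes, which is why more complex movements are harder to decode.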
This is amazing work.