Brain-Machine Interface (BMI) System Identification
Siddharth Dangi and Suraj Gowda

• BMIs decode neural activity into control signals for prosthetic limbs
• Aim to improve quality of life for severely disabled patients suffering from neurological injuries and disease
• Restore a human's ability to move and communicate with the world
"Center-Out" Training Task
• Monkey uses a joystick to move a cursor to targets
• Record neural firing rates and cursor kinematic data
• Train a decoding algorithm on the collected data to predict cursor kinematics
• Switch cursor control from the joystick to the decoder
Echo-State Network (ESN)
• Problem – the relationship between neural signals and limb kinematics is highly nonlinear
• Idea – create a large, recurrent neural network (a "reservoir") with random weights
• Can be used to learn the input-output behavior of a nonlinear system
• Training the connections inside the reservoir is difficult and computationally expensive
• Use supervised learning to train only the output-layer weights
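The reservoir idea above can be sketched in a few lines of NumPy. The sizes, scaling constants, and ridge penalty below are illustrative assumptions, not values from the slides; the key point is that the input and recurrent weights stay fixed and random, and only the output layer is fit by supervised (least-squares) learning.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 4 input channels, 200 reservoir units, 2 outputs
n_in, n_res, n_out = 4, 200, 2

# Random, fixed input and recurrent weights; only W_out is ever trained
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # keep spectral radius < 1

def run_reservoir(U):
    """Drive the reservoir with an input sequence U (T x n_in); return all states."""
    x = np.zeros(n_res)
    states = []
    for u in U:
        x = np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

def train_output(U, Y, ridge=1e-6):
    """Fit only the output-layer weights by ridge regression on reservoir states."""
    X = run_reservoir(U)
    return np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y).T

# Toy usage: learn a linear readout of the driven reservoir
U = rng.standard_normal((500, n_in))
Y = U[:, :n_out]                      # toy target signal
W_out = train_output(U, Y)
pred = run_reservoir(U) @ W_out.T
```

Scaling the recurrent matrix to spectral radius below 1 is the usual way to keep the reservoir's memory of past inputs fading rather than exploding.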
Kalman Filter-based Methods
• Adaptive Kalman filter
  – Allow parameters to auto-adjust
  – Stochastic gradient descent
• Standard model:
    x_t = A x_{t-1} + w_t
    y_t = C x_t + q_t
• State prediction: alternate the Kalman predict and update steps to estimate x_t from the observed firing rates
  (x_t – kinematic state at time t; y_t – firing rates at time t; w_t, q_t – Gaussian noise variables)
• Combined Kalman-ESN method
  – Weight the two decoders' estimates based on their error variances
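A minimal sketch of the standard Kalman decoder follows, assuming the usual linear-Gaussian model (x_t = A x_{t-1} + w_t, y_t = C x_t + q_t) with noise covariances W and Q; the function and variable names are mine, not from the slides.

```python
import numpy as np

def kalman_decode(Y, A, C, W, Q, x0, P0):
    """Estimate kinematic states x_t from firing rates y_t (rows of Y)
    with the standard Kalman predict/update recursion."""
    x, P = x0, P0
    estimates = []
    for y in Y:
        # Predict: propagate state and covariance through the dynamics model
        x = A @ x
        P = A @ P @ A.T + W
        # Update: correct the prediction with the neural observation y
        S = C @ P @ C.T + Q                  # innovation covariance
        K = P @ C.T @ np.linalg.inv(S)       # Kalman gain
        x = x + K @ (y - C @ x)
        P = (np.eye(len(x)) - K @ C) @ P
        estimates.append(x.copy())
    return np.array(estimates)
```

The combined Kalman-ESN method described on the slide would then blend this filter's estimate with the ESN's, weighting each by the inverse of its error variance.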
LMS and Wiener Filter
• Model: x_n = W y_n + e_n
  (x_n – kinematic state at time n; y_n – firing rates at time n; e_n – error term at time n; W – filter weights)
• Wiener filter
  – Rewrite the model equation by tiling the collected data into matrices X = [x_1 ... x_N] and Y = [y_1 ... y_N]
  – Closed-form solution for the weight matrix: W = X Y^T (Y Y^T)^{-1}
• Least-Mean Squares (LMS)
  – Gradient-descent solution for the weight matrix: W_{n+1} = W_n + mu * e_n y_n^T
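Both fits above can be sketched directly from the model x_n = W y_n + e_n; the data layout (one column per time step) and the step size mu below are assumptions for illustration.

```python
import numpy as np

def wiener_fit(Y, X):
    """Closed-form Wiener solution W = X Y^T (Y Y^T)^{-1}.
    Y: firing rates (n_neurons x T), X: kinematics (n_kin x T)."""
    return X @ Y.T @ np.linalg.pinv(Y @ Y.T)

def lms_fit(Y, X, mu=0.01):
    """LMS gradient-descent solution: W <- W + mu * e_n y_n^T,
    applied once per time step n."""
    W = np.zeros((X.shape[0], Y.shape[0]))
    for n in range(Y.shape[1]):
        y, x = Y[:, n], X[:, n]
        e = x - W @ y                 # error term at time n
        W = W + mu * np.outer(e, y)   # gradient step
    return W
```

The closed form needs all the data at once (and a matrix inverse), while LMS updates the same weights one sample at a time, which is why, as the conclusions note, it needs more training data to converge.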
Performance Results

Simulation Parameters
• Trained decoders for 100 seconds on training data
• Measured Mean-Squared Error (MSE) and Correlation Coefficient (CC) on 40 seconds of new data

Prediction Results

Prediction Method        Position MSE   Velocity MSE   Position CC   Velocity CC
LMS Filter               0.2018         0.0182         0.844         0.874
Wiener Filter            0.1714         0.0439         0.789         0.718
Echo-State Network       0.1043         0.0291         0.841         0.745
Adaptive Kalman          0.0895         0.0254         0.907         0.770
Standard Kalman Filter   0.0526         0.0180         0.930         0.836
Combined Kalman-ESN      0.0464         0.0173         0.932         0.842
Classification of Neural State
• Neural firing-rate signals can be treated as (behavior-driven) state-space trajectories
• Experiment – use logistic regression to classify trajectories into higher-level states (e.g., planning vs. not planning)
• Classes:
• Logistic Regression Model:
• Online Estimation Algorithm
• 93.1% classification accuracy
• All errors were "false alarms" just before/after planning periods
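The online estimation step can be sketched as stochastic gradient ascent on the logistic-regression log-likelihood, classifying each firing-rate vector into a binary state (e.g., planning = 1). The learning rate and data layout are assumptions; the slides do not give the algorithm's details.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def online_logistic(Y, labels, lr=0.1):
    """Online logistic regression: one gradient step per firing-rate
    vector y_n, with binary class label c_n in {0, 1}."""
    w = np.zeros(Y.shape[1])
    b = 0.0
    for y, c in zip(Y, labels):
        p = sigmoid(w @ y + b)       # P(state = 1 | y) under current weights
        w += lr * (c - p) * y        # gradient of the log-likelihood
        b += lr * (c - p)
    return w, b
```

Because each update uses only the current sample, this kind of classifier can run alongside the decoder during an experiment rather than requiring a separate batch-training pass.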
Conclusions
• Ranking the methods by the MSE of their position predictions shows that:
  – "Pure" linear regression models (LMS and Wiener) need more training time to perform well
  – Kalman-based models that maintain a state-space/dynamics model perform better than those that don't
  – The combination of linear (Kalman) and nonlinear (ESN) methods performs best, better than any single method alone