Journal List > J Korean Med Sci > v.32(8) > 1108450

Lee, Choi, Lee, Cho, Ahn, Kim, Lee, and Jang: Decoding Saccadic Directions Using Epidural ECoG in Non-Human Primates

Abstract

A brain-computer interface (BCI) can serve as an alternative communication channel for patients suffering from locked-in syndrome. However, most BCI systems are based on SSVEP, P300, or motor imagery, and a greater diversity of BCI protocols is needed to serve various types of patients. In this paper, we trained 2 non-human primates on a choice saccade (CS) task and recorded their brain signals using epidural electrocorticography (eECoG) to predict eye movement direction. We successfully predicted the direction of the upcoming eye movement with a support vector machine (SVM) applied to the brain signals recorded after the directional cue onset and before the saccade execution. The mean accuracies were 80% for 2 directions and 43% for 4 directions. We also quantified the spatio-spectro-temporal contribution ratio using SVM recursive feature elimination (RFE). The channels over the frontal eye field (FEF), supplementary eye field (SEF), and superior parietal lobule (SPL) were dominantly used for classification. In the spectral domain, the α-band was most useful for prediction, as were the time bins just after the directional cue onset and just before the saccade execution. Because saccades map directly onto 2-dimensional space, a saccade-based BCI paradigm will hopefully provide an intuitive and convenient communication platform for users.

Graphical Abstract


INTRODUCTION

Brainstem stroke, traumatic brain injury, and neurodegenerative disorders can result in locked-in syndrome, characterized by near-total paralysis despite intact cognitive function. A brain-computer interface (BCI) can serve as an alternative interface that restores some communication with the outside world for such patients, because it does not depend on peripheral nerves and muscles but directly uses brain signals to assist or repair sensory-motor functions (1-3). Unlike traditional input devices (keyboard, mouse, stylus pen, etc.), a BCI reads brain signals and translates them into actions and commands that can control a computer. The BCI system is thus a promising communication platform for persons suffering from severe paralysis (e.g., amyotrophic lateral sclerosis).
BCI has been studied intensively by numerous researchers over the last decades, and scientific and technological advances have accelerated its development. Different techniques exist for recording brain activity for BCIs, and different approaches have been used to translate brain signals into the output actions of an external effector (1-3). Non-invasive BCI, especially that based on electroencephalogram (EEG) signals, has been developed to implement control over external devices. For instance, analysis of P300 modulations has led to highly accurate decoding of the letters on a screen to which a subject attends (4). However, the main shortcoming of this technique is limited classification accuracy, owing to low spatial resolution and a low signal-to-noise ratio (SNR). Invasive methods, on the other hand, can decode much more accurately because both spatial resolution and SNR increase. BCIs using spiking signals from multiple neurons recorded with intracortical electrode implants have shown the best performance (5-7); however, recording from intracortical implants can be unstable due to the response of brain tissue to the implant (8) and changes in the neuronal activity-behavior relationship across time (9). Thus, intracortical recording-based BCI currently seems to have significant limitations for long-term application. A useful alternative to intracortical neural signals is the electrocorticogram (ECoG). A number of studies have shown that various movement parameters can be decoded using ECoG (10,11). However, most of these BCI studies have focused on arm movement or motor imagery; to increase the usefulness and applicability of ECoG BCI, it is necessary to diversify the types of control signals that can be used to discriminate between multiple classes.
In this study, we adopted a simple eye movement task and attempted to predict a monkey's saccadic eye movement direction using epidural ECoG (eECoG). Since eye movements map directly onto 2-dimensional space, the prediction and decoding of eye movements may be useful as control signals for BCI, allowing for interaction with the environment. Several studies have reported BCI systems based on eye movements; however, most of them decoded ocular muscle activity rather than a brain signal, or used an eye tracker to track gaze movement (12,13). Here, we used a support vector machine (SVM) as a classifier and successfully decoded the 4 possible saccadic directions. We also quantified each feature's contribution ratio to determine which features were most dominantly used for prediction.

MATERIALS AND METHODS

Subject and surgical procedures

Two adult male rhesus monkeys (Macaca mulatta, M14 and M5) were used for the experiments. The monkeys were pair-housed, and the cage size followed the Guide for the Care and Use of Laboratory Animals. The temperature was maintained at 24°C ± 4°C, the humidity at 50% ± 10%, and the lighting on a 12-hour day/12-hour night cycle. For the surgery, the monkeys were prepared with sterile, anesthetic surgical procedures. A licensed veterinarian was present throughout surgery to induce anesthesia and to monitor and record all measured physiological variables. One hour before the surgery, the animal was injected intramuscularly (IM) with atropine sulfate (0.08 mg/kg) to prevent excessive salivation during surgery. One-half hour later, it was sedated with tiletamine-zolazepam (Zoletil®, 10 mg/kg, IM; Virbac Corporation, Carros, France), intubated, and placed under isoflurane anesthesia. A saline drip was maintained through an intravenous catheter placed into a leg vein. Throughout the surgery, body temperature, heart rate, blood pressure, oxygen saturation, and respiratory rate were continuously monitored. The primates were then placed in a stereotaxic frame, the scalp was incised, and a craniotomy of 2.5 cm radius was performed, but the dura was left intact.
The two monkeys were chronically implanted with 2 customized multichannel ECoG electrode arrays containing 32 gold electrodes (300 μm in diameter) with an inter-electrode distance of 3 mm in the epidural space (Fig. 1A). The rectangular (4 by 8) electrode patch was implanted in the left hemisphere for monkey 14, covering the superior parietal cortex including the intraparietal sulcus (IPS) and a portion of the frontal cortex including the frontal eye field (FEF) and supplementary eye field (SEF). For monkey 5, the circular electrode patch (14) was inserted along the midline. The electrodes covered the bilateral FEF, SEF, and superior part of the parietal cortex. A head restraint device was also implanted for each monkey. Finally, the bone flap was replaced on top of the implant and sealed with dental cement, maintaining the implant in position.
Fig. 1
Electrode position and CS task. Two types of electrode patches were implanted over the monkeys' cerebral cortices. (A) The rectangular (4 by 8) electrode patches were implanted in the left hemisphere for monkey 14, and the circular (32-channel) electrode patches were inserted along the midline for monkey 5. Normal channels are shown as yellow circles, and bad channels as gray circles. (B) The associations between color and spatial location shown were pre-trained before the experiments. (C) The visual events in a trial are schematically depicted along the timeline: the appearance of the fixation target at the center, display of four alternative gray targets in the periphery, the onset of a color cue at the center, the color cue turning off to signal when to make the saccade, and the saccade to the color-matched target.
CS = choice saccade.

Behavioral task

The monkey was trained to perform a choice saccade (CS) task (15). Fig. 1B shows the pre-trained location-color association; for example, the red dot in the upper part of the figure indicates the direction in which the monkey should move its eyes when the white dot at the center of the panel changes to red. The trial began when the animal fixated the central white dot shown in Fig. 1C. Subsequently, 4 white target dots appeared in the 4 peripheral visual fields, 7 degrees away from the central fixation point. After 400 or 600 ms, the central disc changed to 1 of the 4 colors associated with a particular target location. After an additional 700 or 500 ms (i.e., 1,100 ms after the alternative-spot onset), the central disc disappeared, cueing the animal to make a saccade response. The mapping between color and location was held constant throughout training and experiments. We collected ECoG data during 3 experimental sessions from each monkey. Each session included 200 trials per direction (800 trials total), on average; the minimum number of trials was 189.

Data analysis

In the experiment, the monkey was seated in a custom-made primate chair facing the visual screen, with its head restrained and a water reward system in place. The saccadic behavioral response was monitored by an eye tracker at a sampling rate of 500 Hz (EyeLink II; SR Research Ltd., Kanata, Canada). Saccade behavior was measured off-line using programs written in MATLAB (The MathWorks, Natick, MA, USA). The onset and offset of saccades were determined by velocity criteria (30°/s radial velocity for onset and 10°/s for offset).
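The velocity-threshold rule described above can be sketched as follows. This is an illustrative Python reimplementation (the original analysis was written in MATLAB); the function name and trace layout are our own choices, not the authors' code:

```python
import numpy as np

def detect_saccades(gaze_deg, fs=500, onset_thresh=30.0, offset_thresh=10.0):
    """Detect saccade onsets/offsets from 2D gaze positions (degrees).

    gaze_deg: (n_samples, 2) array of horizontal/vertical eye position.
    A saccade starts when radial velocity exceeds onset_thresh (deg/s)
    and ends when it drops below offset_thresh, as in the text.
    Returns a list of (onset_index, offset_index) pairs.
    """
    # Radial velocity in deg/s from sample-to-sample displacement.
    vel = np.linalg.norm(np.diff(gaze_deg, axis=0), axis=1) * fs
    saccades, in_saccade, onset = [], False, 0
    for i, v in enumerate(vel):
        if not in_saccade and v > onset_thresh:
            in_saccade, onset = True, i
        elif in_saccade and v < offset_thresh:
            saccades.append((onset, i))
            in_saccade = False
    return saccades
```

A real pipeline would additionally smooth the velocity trace and enforce a minimum saccade duration to suppress noise-triggered detections.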
Electrical recordings started one week after surgery. ECoG signals were digitized at a sampling rate of 512 Hz (Brainbox EEG-1164 amplifier; Braintronics B.V., Almere, the Netherlands). By visual inspection, we rejected trials containing severe motion artifacts caused by vigorous movements of the monkey, which shook the data cable and contaminated the signals with noise; approximately 3–4 of every 200 trials were rejected. Additionally, channels that did not clearly contain ECoG signals (e.g., channels with flat signals or with noise due to broken connections) were removed prior to analysis. Overall, these procedures reduced the total number of channels to 57 in monkey 14 and 56 in monkey 5 (Fig. 1A). Signals were band-pass filtered from 1 to 200 Hz and re-referenced using the common average reference (CAR). The filter was a two-way least-squares finite impulse response (FIR) filter provided by EEGLAB (16), with a length of 3,000 sample points (i.e., 3 × [sampling rate/low-cut frequency]). Then, independent component (IC) analysis decomposition was conducted to remove artifacts. We rejected components showing short, high-amplitude, single-electrode offsets, and we also tested the kurtosis of the ICs, which measures the peakedness of the data (17). Analysis epochs were extracted under 2 conditions, target on (TG) and saccade start (SS). In the TG condition, we aligned the signal to the color-cue change time and extracted the epoch from −600 ms to 750 ms. In the SS condition, the epoch was extracted from −1,100 ms to 450 ms relative to the saccade execution time.
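Of the preprocessing steps above, the common average reference is simple to illustrate. A minimal Python sketch (the original pipeline used EEGLAB in MATLAB), applied after bad channels have been removed:

```python
import numpy as np

def common_average_reference(data):
    """Re-reference multichannel ECoG by subtracting, at every time
    point, the mean across all retained channels (CAR).

    data: (n_channels, n_samples) array after bad-channel removal.
    Returns an array of the same shape whose across-channel mean is
    zero at every sample.
    """
    return data - data.mean(axis=0, keepdims=True)
```

Removing bad channels first matters: a flat or broken channel left in the average would leak its artifact into every other channel.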
Time-frequency representations of the ECoG signals for each electrode were generated by Morlet wavelet transformation (8–100 Hz). The length of the wavelets was 5 cycles of the lowest frequency (i.e., 625 ms). For each channel and each 100-ms time period, the normalized average power spectral densities were computed in 4 frequency bands: the α-band (8–13 Hz), β-band (18–26 Hz), low γ-band (30–50 Hz), and high γ-band (70–100 Hz) (18-20). We then selected the data in 4 time bins after the target onset in the TG condition and 5 time bins before the saccade start in the SS condition. The power values for each frequency band (4), time window (4 for TG and 5 for SS), and channel (57 for M14 and 56 for M5) were used as input features (a feature vector) for the classification analysis. The feature vector obtained from each trial was labeled by the target direction.
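The feature construction can be illustrated roughly as follows. For brevity, this Python sketch substitutes a Butterworth band-pass filter for the Morlet wavelet transform and omits the normalization step, so it is an approximation of, not a reimplementation of, the published pipeline; the band edges and 100-ms bin width follow the text:

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Frequency bands from the text: alpha, beta, low gamma, high gamma.
BANDS = {"alpha": (8, 13), "beta": (18, 26),
         "low_gamma": (30, 50), "high_gamma": (70, 100)}

def band_power_features(epoch, fs=512, bin_ms=100):
    """Build a spatio-spectro-temporal feature vector from one epoch.

    epoch: (n_channels, n_samples). For each band, 100-ms bin, and
    channel, the mean squared amplitude of the band-passed signal
    becomes one feature; the flat vector is the classifier input.
    """
    bin_len = int(fs * bin_ms / 1000)
    n_bins = epoch.shape[1] // bin_len
    feats = []
    for low, high in BANDS.values():
        b, a = butter(4, [low, high], btype="band", fs=fs)
        filtered = filtfilt(b, a, epoch, axis=1)  # zero-phase band-pass
        power = filtered ** 2
        for t in range(n_bins):
            seg = power[:, t * bin_len:(t + 1) * bin_len]
            feats.append(seg.mean(axis=1))  # one value per channel
    return np.concatenate(feats)
```

For a 57-channel TG epoch restricted to 4 bins, this yields 4 bands × 4 bins × 57 channels = 912 features per trial, matching the dimensionality implied by the text.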

Classification

We decoded the brain signals with 3 different comparison types: each of the 4 directions (4d), left vs. right (LR; left: 135° + 225°, right: 45° + 315°), and top vs. bottom (TB; top: 45° + 135°, bottom: 225° + 315°). For classification, we used an SVM (21) with a linear kernel, implemented in the LibSVM toolbox (National Science Council of Taiwan, Taichung, Taiwan) (22). The dimensionality of the feature vectors was reduced by SVM-based recursive feature elimination (SVM-RFE) (23,24): we ranked the features by weight magnitude and selected the top features lying more than 2 standard deviations above the mean. To estimate accuracies, we used 10-fold cross-validation, in which the data were permuted and partitioned into 10 blocks of equal size. In each of the 10 folds, 9 blocks were used to train the classifier, which was then tested on the 1 remaining block; each block was used for testing once.
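The classification scheme can be sketched as below. The authors used LibSVM with SVM-RFE; this illustrative Python version uses scikit-learn (whose `SVC` wraps LIBSVM) and performs only the first, weight-ranking pass of RFE rather than full recursive elimination:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score, StratifiedKFold

def decode_directions(X, y, seed=0):
    """10-fold cross-validated linear-SVM decoding, plus a simple
    weight-magnitude feature ranking in the spirit of SVM-RFE.

    X: (n_trials, n_features) band-power features; y: direction labels.
    Returns (mean CV accuracy, features sorted from most to least useful).
    """
    clf = SVC(kernel="linear")
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=seed)
    acc = cross_val_score(clf, X, y, cv=cv).mean()
    # Rank features by |weight| from a fit on all data (first RFE pass);
    # full SVM-RFE would retrain after dropping the lowest-ranked features.
    clf.fit(X, y)
    w = np.abs(clf.coef_).sum(axis=0)  # aggregate over class pairs
    ranking = np.argsort(w)[::-1]
    return acc, ranking
```

In the multi-class (4d) case, LIBSVM trains one-vs-one binary machines, which is why the weight matrix is aggregated across class pairs before ranking.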
Based on Kübler et al. (25), we set the threshold for correct responses at 70% for the 2 classes because verbal communication with a language support program is possible at that level.

Spatio-spectro-temporal contributions

To determine how much directional information could be extracted from each cortical area, sub-band, and time bin of the eECoG signal, we quantified the spatio-spectro-temporal contribution of brain activity to the prediction of each target direction. We calculated weight-value ratios from the features' weight magnitudes derived from the SVM-RFE algorithm (26). Three contribution measures were calculated from the weights w(ch, freq, time) derived from the SVM-RFE at electrode ‘ch,’ frequency band ‘freq,’ and time bin ‘time’ in each decoding model.
The spatial contribution Ws(ch) of each recording electrode ‘ch’ was quantified as the ratio of the summed weight values over all frequency bins and time lags at that electrode to the total weight over all electrodes, frequency bins, and time lags (Equation 1). The spectral contribution Wf(freq) of each frequency band ‘freq’ and the temporal contribution Wt(time) of each time bin ‘time’ (Equations 2 and 3) were calculated in the same way. Equation 1 quantifies the contribution ratio of each channel to predicting the directions across all frequency and time bins, and Equations 2 and 3 quantify the contribution ratios of each frequency band and time bin, respectively. These values can be interpreted as the degree to which each feature contributes to the decoding performance.
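Equations 1–3 appear only as images in the original article; a plausible reconstruction from the prose description, with w(ch, freq, time) denoting the SVM-RFE weight magnitude, is:

```latex
W_s(\mathrm{ch}) =
  \frac{\sum_{\mathrm{freq}} \sum_{\mathrm{time}} \lvert w_{\mathrm{ch},\mathrm{freq},\mathrm{time}} \rvert}
       {\sum_{\mathrm{ch}'} \sum_{\mathrm{freq}} \sum_{\mathrm{time}} \lvert w_{\mathrm{ch}',\mathrm{freq},\mathrm{time}} \rvert}
  \quad (1)

W_f(\mathrm{freq}) =
  \frac{\sum_{\mathrm{ch}} \sum_{\mathrm{time}} \lvert w_{\mathrm{ch},\mathrm{freq},\mathrm{time}} \rvert}
       {\sum_{\mathrm{ch}} \sum_{\mathrm{freq}'} \sum_{\mathrm{time}} \lvert w_{\mathrm{ch},\mathrm{freq}',\mathrm{time}} \rvert}
  \quad (2)

W_t(\mathrm{time}) =
  \frac{\sum_{\mathrm{ch}} \sum_{\mathrm{freq}} \lvert w_{\mathrm{ch},\mathrm{freq},\mathrm{time}} \rvert}
       {\sum_{\mathrm{ch}} \sum_{\mathrm{freq}} \sum_{\mathrm{time}'} \lvert w_{\mathrm{ch},\mathrm{freq},\mathrm{time}'} \rvert}
  \quad (3)
```

By construction, each set of ratios sums to 1 over its own dimension, so the three measures can be compared directly across channels, bands, and time bins.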

Ethics statement

The study was performed after receiving approval of the Institutional Animal Care and Use Committee (IACUC) in Seoul National University Hospital (IACUC approval No. 13-0314).

RESULTS

We trained the monkeys to make saccades according to the color of the cue. The monkeys responded correctly to the target directions in 89% and 91% of trials on average across 3 sessions, for M14 and M5, respectively.

Decoding accuracy

The decoding performances were significantly higher than chance level for both monkeys (Fig. 2). For M14, the average prediction accuracies over all sessions were 77.0%, 82.7%, and 41.6% for the LR, TB, and 4d classification conditions, respectively (Fig. 2A-C). For M5 (Fig. 2D-F), the accuracies were 78.9%, 86.5%, and 45.7% for LR, TB, and 4d, respectively (Table 1). These values were significantly higher than chance level (50% for 2 classes and 25% for 4 classes). There was no difference between the LR and TB conditions or between sessions.
Fig. 2
Decoding accuracy. The decoding accuracy was calculated for 3 conditions (left vs. right, top vs. bottom, and 4 directions), 3 sessions, and 2 epoch types (TG and SS). The decoding performance was significantly higher than the chance level (50% for 2 classes and 25% for 4 classes) for M14 (A, B, C) and M5 (D, E, F), respectively. The horizontal line indicates the chance level.
TG = target on, SS = saccade start.
Table 1

Decoding accuracy

Subject  Condition        Target on                                          Saccade start
                          Session 1  Session 2  Session 3  Average (SD)      Session 1  Session 2  Session 3  Average (SD)
M14      Left vs. right   81         74         72         76 (4.73)         76         78         77         77 (1.00)
         Top vs. bottom   78         78         79         78 (0.58)         86         82         80         83 (3.06)
         4 directions     45         39         38         41 (3.79)         44         42         39         42 (2.52)
M5       Left vs. right   78         84         73         78 (5.51)         75         84         77         79 (4.73)
         Top vs. bottom   82         87         81         83 (3.21)         86         86         87         86 (0.58)
         4 directions     42         43         43         43 (0.58)         43         49         45         46 (3.06)
We also calculated the confusion matrix to examine the decoding tendency. As shown in Fig. 3, the prediction of 45° was more accurate than that of the other directions for both monkeys. In contrast, the opposite direction, 225°, had the lowest accuracy.
Fig. 3
Confusion matrices for the “TG” and “SS” conditions. The confusion matrix of the actual 4 directions (45°, 135°, 225°, 315°) versus the predicted directions was calculated for 2 epoch types for Monkey 14 (A, B) and Monkey 5 (C, D), respectively.
TG = target on, SS = saccade start.

Weight contribution ratio

In the spatial contribution ratio, the electrodes over the FEF, SEF, and superior parietal lobule (SPL), areas known as the oculomotor areas, showed a higher contribution ratio than the other electrodes for both monkeys. The contribution showed left-hemisphere dominance for M5, meaning that the electrodes in the left hemisphere played a more important role in discrimination. In the spectral contribution, the α-band was the dominant feature for decoding saccadic direction for both monkeys in both time epochs. In the TG period, the signal from the 100 to 200 ms time bin was mainly used to determine saccadic direction, and the contribution gradually decreased as time passed. In contrast, in the SS condition, the temporal contribution ratio gradually increased with proximity to the saccade onset time; that is, the data just before saccade onset were the most informative for decoding.

DISCUSSION

In the present study, we developed a BCI paradigm using eECoG signals related to eye movement and successfully decoded saccadic directions before saccade execution time. Across three sessions, the decoding accuracies were maintained above the chance level for both monkeys.
As shown in Fig. 4, the electrodes over the FEF, SEF, and SPL showed significantly higher contributions for decoding saccadic directions. These results are consistent with previous neuroscientific findings that neurons in the oculomotor areas are mainly related to saccadic movement planning, preparation, and execution (15,27). For M5 in particular, a left-hemispheric bias appeared in the spatial contribution ratio; in other words, the contribution magnitudes of the electrodes in the left hemisphere were higher than those in the right hemisphere. This seems to indicate that left-hemispheric activity is more crucial for decoding eye movement directions, while activity in the right hemisphere is less important for classification. However, this inter-hemispheric imbalance did not produce any difference in decoding accuracy between M14 and M5; therefore, more evidence is required for a more concrete explanation.
Fig. 4
Spatio-spectro-temporal contribution ratio. (A-D) The channel contribution weights are presented as spatial maps. (E-H) The frequency contribution is presented as the weight ratio of each frequency band: α, β, low γ, and high γ. (I-L) The time contribution weight ratio graphs. Error bars indicate SEM.
SEM = standard error of the mean.
In the spectral domain, the α-band was dominantly used for classification. This may be related to oscillatory multiplexing according to function. Several studies have reported that the α frequency band is a hypothesized source of variability in saccadic response latency and of control over oscillations (28,29). Cortical and thalamocortical α oscillations are known to reflect cyclic fluctuations between low- and high-excitability neuronal states (30,31). Patients with schizophrenia generally differ from healthy individuals in the proportion of fast-reaction-time saccades (32,33) and in anti-saccade error rate (33,34). Interestingly, EEG α oscillation dynamics during both the resting state and stimulus-evoked conditions are abnormal in schizophrenia patients (35). Kelly et al. (36) also reported classifying the direction of spatial attention using the α-band power of the parieto-occipital area as the feature. Similarly, van Gerven and Jensen (37) used the α-band power over the parietal area to discriminate the direction of covert spatial attention. Consistent with these previous studies, our results show that α-band oscillations are crucial for discriminating saccadic direction.
In the temporal contributions, the activity just after target onset and just before saccade execution was the highest among the time bins, and the contribution values gradually decreased as the time bins moved farther from the cue time and the saccade execution. This result suggests that the brain signals just after target onset and just before movement execution are the most informative; in other words, the brain signal after the target onset and before the execution contains information suitable for differentiating eye movement direction, and this information decreases as time passes. Although direct neuroscientific interpretation of a BCI result requires caution, brain signals related to saccade planning or to the allocation of covert spatial attention can be decoded during this interval (38).
We selected the epidural type of ECoG as the recording method because its non-penetrating grid electrodes are less invasive. However, with macroelectrodes, neural firing is summated into a field potential, and some information can be lost owing to lower spatial resolution than microelectrode recordings. In addition, because eECoG electrodes are placed epidurally rather than subdurally, some signal may be attenuated by the dura. Thus, the performance of directional prediction was limited, especially in the 4d classification.
The saccade-based BCI paradigm has several potential advantages over conventional BCI systems that use brain signals related to motor imagery or arm movement. First, there is a direct mapping between eye movement space and a 2D cursor system. A single eye movement can be projected into 2D space, making the paradigm well suited for 2D cursor control or alternative target selection. To be practical, BCI systems need to provide users with a sufficient level of accuracy and control, but they also need to be easy to use and stable over the long term. From this point of view, eye movement-based BCI may provide an intuitive and convenient communication platform for users. Second, eye movement is tightly linked with top-down attentional systems. Several previous studies have reported an association between saccadic eye movement and spatial attentional allocation (39,40). Therefore, since decoding eye movements can be interpreted as decoding where the subject places attention, a saccade-based BCI may provide advantages for various applications. Additionally, a saccade BCI might be more advantageous than direct eye tracking because it can predict the direction of an eye movement before its actual execution; the proposed saccade BCI could therefore respond faster than an eye-tracking system.
In conclusion, we developed a saccade-based BCI paradigm in non-human primates using eECoG signals. We demonstrated that brain activity related to saccadic eye movement can serve as a control signal for a BCI that does not require much training time and is easy to use. Our future aim is to advance the paradigm toward actual on-line BCI applications for real-time use.

Notes

Funding: This research was supported by the Brain Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (2016M3C7A1904987) and supported by a grant of the Korea Health Technology R & D Project through the Korea Health Industry Development Institute (KHIDI), funded by the Ministry of Health and Welfare, Republic of Korea (grant number: HI14C3229).

DISCLOSURE: The authors have no potential conflicts of interest to disclose.

AUTHOR CONTRIBUTION: Formal analysis: Lee J, Choi H, Lee S. Investigation: Cho BH, Ahn KH. Supervision: Jang DP. Validation: Kim IY, Lee KM. Writing - original draft: Lee J, Choi H. Writing - review & editing: Kim IY, Lee KM.

References

1. Wolpaw JR, Birbaumer N, McFarland DJ, Pfurtscheller G, Vaughan TM. Brain-computer interfaces for communication and control. Clin Neurophysiol. 2002; 113:767–791. PMID: 12048038.
2. Lebedev MA, Nicolelis MA. Brain-machine interfaces: past, present and future. Trends Neurosci. 2006; 29:536–546. PMID: 16859758.
3. Waldert S, Pistohl T, Braun C, Ball T, Aertsen A, Mehring C. A review on directional information in neural signals for brain-machine interfaces. J Physiol Paris. 2009; 103:244–254. PMID: 19665554.
4. Mak JN, Arbel Y, Minett JW, McCane LM, Yuksel B, Ryan D, Thompson D, Bianchi L, Erdogmus D. Optimizing the P300-based brain-computer interface: current status, limitations and future directions. J Neural Eng. 2011; 8:025003. PMID: 21436525.
5. Santhanam G, Ryu SI, Yu BM, Afshar A, Shenoy KV. A high-performance brain-computer interface. Nature. 2006; 442:195–198. PMID: 16838020.
6. Velliste M, Perel S, Spalding MC, Whitford AS, Schwartz AB. Cortical control of a prosthetic arm for self-feeding. Nature. 2008; 453:1098–1101. PMID: 18509337.
7. Ifft PJ, Shokur S, Li Z, Lebedev MA, Nicolelis MA. A brain-machine interface enables bimanual arm movements in monkeys. Sci Transl Med. 2013; 5:210ra154.
8. Bjornsson CS, Oh SJ, Al-Kofahi YA, Lim YJ, Smith KL, Turner JN, De S, Roysam B, Shain W, Kim SJ. Effects of insertion conditions on tissue strain and vascular damage during neuroprosthetic device insertion. J Neural Eng. 2006; 3:196–207. PMID: 16921203.
9. Dickey AS, Suminski A, Amit Y, Hatsopoulos NG. Single-unit stability using chronically implanted multielectrode arrays. J Neurophysiol. 2009; 102:1331–1339. PMID: 19535480.
10. Schalk G, Miller KJ, Anderson NR, Wilson JA, Smyth MD, Ojemann JG, Moran DW, Wolpaw JR, Leuthardt EC. Two-dimensional movement control using electrocorticographic signals in humans. J Neural Eng. 2008; 5:75–84. PMID: 18310813.
11. Pistohl T, Schulze-Bonhage A, Aertsen A, Mehring C, Ball T. Decoding natural grasp types from human ECoG. Neuroimage. 2012; 59:248–260. PMID: 21763434.
12. Frisoli A, Loconsole C, Leonardis D, Banno F, Barsotti M, Chisari C, Bergamasco M. A new gaze-BCI-driven control of an upper limb exoskeleton for rehabilitation in real-world tasks. IEEE Trans Syst Man Cybern C Appl Rev. 2012; 42:1169–1179.
13. Zander TO, Kothe C, Jatzev S, Gaertner M. Enhancing human-computer interaction with input from active and passive brain-computer interfaces. In : Tan DS, Nijholt A, editors. Brain-computer Interfaces: Applying Our Minds to Human-computer Interaction. London: Springer;2010. p. 181–199.
14. Baek DH, Lee J, Byeon HJ, Choi H, Young Kim I, Lee KM, Jungho Pak J, Pyo Jang D, Lee SH. A thin film polyimide mesh microelectrode for chronic epidural electrocorticography recording with enhanced contactability. J Neural Eng. 2014; 11:046023. PMID: 25024292.
15. Lee KM, Ahn KH, Keller EL. Saccade generation by the frontal eye fields in rhesus monkeys is separable from visual detection and bottom-up attention shift. PLoS One. 2012; 7:e39886. PMID: 22761923.
16. Delorme A, Makeig S. EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. J Neurosci Methods. 2004; 134:9–21. PMID: 15102499.
17. Nolan H, Whelan R, Reilly RB. FASTER: fully automated statistical thresholding for EEG artifact rejection. J Neurosci Methods. 2010; 192:152–162. PMID: 20654646.
18. Mitra S, Nizamie SH, Goyal N, Tikka SK. Evaluation of resting state gamma power as a response marker in schizophrenia. Psychiatry Clin Neurosci. 2015; 69:630–639. PMID: 25854748.
19. Seeber M, Scherer R, Wagner J, Solis-Escalante T, Müller-Putz GR. High and low gamma EEG oscillations in central sensorimotor areas are conversely modulated during the human gait cycle. Neuroimage. 2015; 112:318–326. PMID: 25818687.
20. Ray S, Crone NE, Niebur E, Franaszczuk PJ, Hsiao SS. Neural correlates of high-gamma oscillations (60–200 Hz) in macaque local field potentials and their potential implications in electrocorticography. J Neurosci. 2008; 28:11526–11536. PMID: 18987189.
21. Cortes C, Vapnik V. Support-vector networks. Mach Learn. 1995; 20:273–297.
22. Chang CC, Lin CJ. LIBSVM: a library for support vector machines. ACM Trans Intell Syst Technol. 2011; 2:27.
23. Cho BH, Yu H, Kim KW, Kim TH, Kim IY, Kim SI. Application of irregular and unbalanced data to predict diabetic nephropathy using visualization and feature selection methods. Artif Intell Med. 2008; 42:37–53. PMID: 17997291.
24. Cho BH, Yu H, Lee J, Chee YJ, Kim IY, Kim SI. Nonlinear support vector machine visualization for risk factor analysis using nomograms and localized radial basis function kernels. IEEE Trans Inf Technol Biomed. 2008; 12:247–256. PMID: 18348954.
25. Kübler A, Nijboer F, Mellinger J, Vaughan TM, Pawelzik H, Schalk G, McFarland DJ, Birbaumer N, Wolpaw JR. Patients with ALS can use sensorimotor rhythms to operate a brain-computer interface. Neurology. 2005; 64:1775–1777. PMID: 15911809.
26. Chao ZC, Nagasaka Y, Fujii N. Long-term asynchronous decoding of arm motion using electrocorticographic signals in monkeys. Front Neuroeng. 2010; 3:3. PMID: 20407639.
27. Goldberg ME, Segraves MA. The visual and frontal cortices. Rev Oculomot Res. 1989; 3:283–313. PMID: 2486326.
28. Hamm JP, Dyckman KA, Ethridge LE, McDowell JE, Clementz BA. Preparatory activations across a distributed cortical network determine production of express saccades in humans. J Neurosci. 2010; 30:7350–7357. PMID: 20505102.
29. Hamm JP, Sabatinelli D, Clementz BA. Alpha oscillations and the control of voluntary saccadic behavior. Exp Brain Res. 2012; 221:123–128. PMID: 22782481.
30. Romei V, Brodbeck V, Michel C, Amedi A, Pascual-Leone A, Thut G. Spontaneous fluctuations in posterior α-band EEG activity reflect variability in excitability of human visual areas. Cereb Cortex. 2008; 18:2010–2018. PMID: 18093905.
31. Mathewson KE. Pulsed Out of Awareness: EEG Alpha Oscillations Represent a Pulsed Inhibition of Ongoing Cortical Processing. Urbana, IL: University of Illinois at Urbana-Champaign;2011.
32. McDowell JE, Clementz BA. Ocular-motor delayed-response task performance among schizophrenia patients. Neuropsychobiology. 1996; 34:67–71. PMID: 8904734.
33. Reilly JL, Lencer R, Bishop JR, Keedy S, Sweeney JA. Pharmacological treatment effects on eye movement control. Brain Cogn. 2008; 68:415–435. PMID: 19028266.
34. McDowell JE, Clementz BA. Behavioral and brain imaging studies of saccadic performance in schizophrenia. Biol Psychol. 2001; 57:5–22. PMID: 11454432.
35. Hong LE, Summerfelt A, Mitchell BD, O’Donnell P, Thaker GK. A shared low-frequency oscillatory rhythm abnormality in resting and sensory gating in schizophrenia. Clin Neurophysiol. 2012; 123:285–292. PMID: 21862398.
36. Kelly SP, Lalor EC, Reilly RB, Foxe JJ. Visual spatial attention tracking using high-density SSVEP data for independent brain-computer communication. IEEE Trans Neural Syst Rehabil Eng. 2005; 13:172–178. PMID: 16003896.
37. van Gerven M, Jensen O. Attention modulations of posterior alpha as a control signal for two-dimensional brain-computer interfaces. J Neurosci Methods. 2009; 179:78–84. PMID: 19428515.
38. Brooks JL, List A. Searching for the role of the frontal eye fields in the visual attention network. J Neurosci. 2006; 26:2145–2146. PMID: 16495440.
39. Sheliga BM, Riggio L, Craighero L, Rizzolatti G. Spatial attention-determined modifications in saccade trajectories. Neuroreport. 1995; 6:585–588. PMID: 7766869.
40. Inhoff AW, Radach R, Starr M, Greenberg S. Allocation of visuo-spatial attention and saccade programming during reading. In : Kennedy A, Radach R, Heller D, Pynte J, editors. Reading as a Perceptual Process. Oxford: Elsevier;2000. p. 221–246.