Results Board

EEGNet:

EEGNetWrapper(
  (eegnet): EEGNet(
    (block1): Sequential(
      (0): Conv2d(1, 8, kernel_size=(1, 64), stride=(1, 1), padding=(0, 32), bias=False)
      (1): BatchNorm2d(8, eps=0.001, momentum=0.01, affine=True, track_running_stats=True)
      (2): Conv2dWithConstraint(8, 16, kernel_size=(16, 1), stride=(1, 1), groups=8, bias=False)
      (3): BatchNorm2d(16, eps=0.001, momentum=0.01, affine=True, track_running_stats=True)
      (4): ELU(alpha=1.0)
      (5): AvgPool2d(kernel_size=(1, 4), stride=4, padding=0)
      (6): Dropout(p=0.5, inplace=False)
    )
    (block2): Sequential(
      (0): Conv2d(16, 16, kernel_size=(1, 16), stride=(1, 1), padding=(0, 8), groups=16, bias=False)
      (1): Conv2d(16, 16, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (2): BatchNorm2d(16, eps=0.001, momentum=0.01, affine=True, track_running_stats=True)
      (3): ELU(alpha=1.0)
      (4): AvgPool2d(kernel_size=(1, 8), stride=8, padding=0)
      (5): Dropout(p=0.5, inplace=False)
    )
    (lin): Linear(in_features=64, out_features=5, bias=False)
  )
)
======================================================================
Layer (type:depth-idx)                        Param #
======================================================================
├─EEGNet: 1-1                                 --
|  └─Sequential: 2-1                          --
|  |  └─Conv2d: 3-1                           512
|  |  └─BatchNorm2d: 3-2                      16
|  |  └─Conv2dWithConstraint: 3-3             256
|  |  └─BatchNorm2d: 3-4                      32
|  |  └─ELU: 3-5                              --
|  |  └─AvgPool2d: 3-6                        --
|  |  └─Dropout: 3-7                          --
|  └─Sequential: 2-2                          --
|  |  └─Conv2d: 3-8                           256
|  |  └─Conv2d: 3-9                           256
|  |  └─BatchNorm2d: 3-10                     32
|  |  └─ELU: 3-11                             --
|  |  └─AvgPool2d: 3-12                       --
|  |  └─Dropout: 3-13                         --
|  └─Linear: 2-3                              320
======================================================================
Total params: 1,680
Trainable params: 1,680
Non-trainable params: 0
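For reference, the printed modules can be reassembled into a runnable PyTorch sketch. This is a reconstruction from the repr above, not the project's source: Conv2dWithConstraint is assumed to be the usual max-norm-constrained Conv2d found in EEGNet implementations, and the 125-sample window length used in the shape check is inferred from Linear(in_features=64) rather than stated anywhere above.

```python
import torch
import torch.nn as nn

class Conv2dWithConstraint(nn.Conv2d):
    """Conv2d with a max-norm constraint on the weights (assumed
    to match the Conv2dWithConstraint in the repr above)."""
    def __init__(self, *args, max_norm=1.0, **kwargs):
        self.max_norm = max_norm
        super().__init__(*args, **kwargs)

    def forward(self, x):
        # Renormalize each output filter to at most max_norm (L2) before the conv
        self.weight.data = torch.renorm(self.weight.data, p=2, dim=0,
                                        maxnorm=self.max_norm)
        return super().forward(x)

class EEGNet(nn.Module):
    def __init__(self, n_eeg_channels=16, n_classes=5):
        super().__init__()
        self.block1 = nn.Sequential(
            nn.Conv2d(1, 8, (1, 64), padding=(0, 32), bias=False),   # temporal conv
            nn.BatchNorm2d(8, eps=0.001, momentum=0.01),
            Conv2dWithConstraint(8, 16, (n_eeg_channels, 1),         # depthwise spatial conv
                                 groups=8, bias=False),
            nn.BatchNorm2d(16, eps=0.001, momentum=0.01),
            nn.ELU(),
            nn.AvgPool2d((1, 4), stride=4),
            nn.Dropout(0.5),
        )
        self.block2 = nn.Sequential(
            nn.Conv2d(16, 16, (1, 16), padding=(0, 8),               # depthwise temporal conv
                      groups=16, bias=False),
            nn.Conv2d(16, 16, (1, 1), bias=False),                   # pointwise conv
            nn.BatchNorm2d(16, eps=0.001, momentum=0.01),
            nn.ELU(),
            nn.AvgPool2d((1, 8), stride=8),
            nn.Dropout(0.5),
        )
        self.lin = nn.Linear(64, n_classes, bias=False)

    def forward(self, x):
        # x: (batch, 1, n_eeg_channels, time)
        x = self.block1(x)
        x = self.block2(x)
        return self.lin(x.flatten(1))

model = EEGNet()
n_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(n_params)  # 1680
```

Summing parameter counts reproduces the 1,680 trainable parameters in the summary; the EEGNetWrapper layer from the repr is omitted since the summary attributes no parameters to it.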

1 - EEGNet - Normal split


S06S01

Metric/Label             Training Set   Testing Set   Overall
EEG Data Shape           -              -             (153,586, 2000)
Labels Shape             -              -
Size                     122,868        30,718
forward                  51,687         19,825
reverse                  26,941         -
stop                     38,828         8,928
turn_left                2,838          1,664
turn_right               2,574          301
Average Loss             1.1150         -
Highest Train Accuracy   48.71%         -
Test Accuracy            -              36.50%
EEGNet_S06S01_confusion_matrix_plot_ns.png
EEGNet_S06S01_training_metrics_plot_ns.png
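The Size rows in these tables are consistent with a plain shuffled 80/20 split: for every session with both counts, the test size equals ceil(0.2 * N), e.g. ceil(0.2 * 153,586) = 30,718 for S06S01. A minimal sketch under that assumption ("Normal split" = non-stratified 80/20); normal_split is a hypothetical helper, not the project's code:

```python
import math
import numpy as np

def normal_split(X, y, test_frac=0.2, seed=0):
    """Plain shuffled train/test split (hypothetical helper; assumes
    'Normal split' = non-stratified 80/20, test size rounded up)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))               # shuffle window indices
    n_test = math.ceil(len(X) * test_frac)      # 20% of windows, rounded up
    test_idx, train_idx = idx[:n_test], idx[n_test:]
    return X[train_idx], y[train_idx], X[test_idx], y[test_idx]

# Size check against the S06S01 row counts (122,868 train / 30,718 test)
X = np.zeros((153_586, 1), dtype=np.float32)
y = np.zeros(153_586, dtype=np.int64)
X_tr, y_tr, X_te, y_te = normal_split(X, y)
print(len(X_tr), len(X_te))  # 122868 30718
```

Note that a shuffled split like this mixes windows from the same recording across train and test, which may partly explain the gap between train and test accuracy above; a session-aware split would be the stricter alternative.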

S06S06

Metric/Label             Training Set   Testing Set   Overall
EEG Data Shape           -              -             (91,163, 2000)
Labels Shape             -              -
Size                     72,930         18,233
forward                  29,772         10,195
reverse                  14,180         -
stop                     27,465         7,354
turn_left                168            466
turn_right               1,345          218
Average Loss             0.8834         -
Highest Train Accuracy   60.92%         -
Test Accuracy            -              47.25%
EEGNet_S06S06_confusion_matrix_plot_ns.png
EEGNet_S06S06_training_metrics_plot_ns.png

S06S07

Metric/Label             Training Set   Testing Set   Overall
EEG Data Shape           -              -             (91,111, 2000)
Labels Shape             -              -
Size                     72,888         18,223
forward                  27,183         9,464
reverse                  13,248         -
stop                     31,487         8,212
turn_left                143            404
turn_right               827            143
Average Loss             0.4994         -
Highest Train Accuracy   79.98%         -
Test Accuracy            -              47.55%
EEGNet_S06S07_confusion_matrix_plot_ns.png
EEGNet_S06S07_training_metrics_plot_ns.png
S06S08

Metric/Label             Training Set   Testing Set   Overall
EEG Data Shape           -              -             (153,586, 2000)
Labels Shape             -              -
Size                     122,868        30,718
forward                  51,687         19,825
reverse                  26,941         -
stop                     38,828         8,928
turn_left                2,838          1,664
turn_right               2,574          301
Average Loss             1.1150         -
Highest Train Accuracy   48.71%         -
Test Accuracy            -              36.50%

EEGNet_S06S08_confusion_matrix_plot_ns.png
EEGNet_S06S08_training_metrics_plot_ns.png

S06S09

Metric/Label             Training Set   Testing Set   Overall
EEG Data Shape           -              -             (87,944, 2000)
Labels Shape             -              -
Size                     70,355         17,589
forward                  26,354         9,744
reverse                  13,365         -
stop                     29,725         7,341
turn_left                173            346
turn_right               738            158
Average Loss             0.8630         -
Highest Train Accuracy   61.64%         -
Test Accuracy            -              36.49%
EEGNet_S06S09_confusion_matrix_plot_ns.png
EEGNet_S06S09_training_metrics_plot_ns.png
S06S10

Metric/Label             Training Set   Testing Set   Overall
EEG Data Shape           -
Labels Shape             -
Size                     58,377
forward                  25,227
reverse                  4,554
stop                     27,724
turn_left                247
turn_right               625
Average Loss             0.7739
Highest Train Accuracy   61.70%
Test Accuracy            -

EEGNet_S06S10_confusion_matrix_plot_ns.png
EEGNet_S06S10_training_metrics_plot_ns.png

