Lab 3: EEG Classification

Lab Objective:

In this lab, you will implement two simple EEG classification models, EEGNet and DeepConvNet [1], on a BCI Competition dataset. Additionally, you need to try three kinds of activation functions: ReLU, Leaky ReLU, and ELU.

 
Turn in:

    1. Experiment Report (.pdf)

    2. Source code

Notice: zip all files into one archive and name it 「DLP_LAB3_studentID_name.zip」, e.g. 「DLP_LAB3_0851909_陳昭宇.zip」.

Requirements:

    1. Implement EEGNet and DeepConvNet with three kinds of activation functions: ReLU, Leaky ReLU, and ELU.

    2. In the experiment results, you have to show the highest accuracy (not loss) of the two architectures with the three kinds of activation functions.

    3. To visualize the accuracy trend, you need to plot the accuracy (not loss) of every epoch during both the training phase and the testing phase.

Dataset:

BCI Competition III, Dataset IIIb: cued motor imagery with online feedback (non-stationary classifier), with 2 classes (left hand, right hand) from 3 subjects [2 classes, 2 bipolar EEG channels].

Reference: http://www.bbci.de/competition/iii/desc_IIIb.pdf

Implementation Details:
    • Prepare Data

The training data and testing data have been preprocessed and named [S4b_train.npz, X11b_train.npz] and [S4b_test.npz, X11b_test.npz], respectively. Please download the preprocessed data and put it in the same folder as your code. To read the preprocessed data, refer to dataloader.py.
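
For reference, the sketch below shows what dataloader.py is expected to do. The array keys ('signal', 'label'), the label remapping, and the NaN cleanup are assumptions about the preprocessed files; follow the provided dataloader.py if it differs.

    import numpy as np

    def read_bci_data():
        # Load the two preprocessed recording sessions for train and test.
        S4b_train = np.load('S4b_train.npz')
        X11b_train = np.load('X11b_train.npz')
        S4b_test = np.load('S4b_test.npz')
        X11b_test = np.load('X11b_test.npz')

        # Assumed keys: each .npz holds a 'signal' array and a 'label' array.
        train_data = np.concatenate((S4b_train['signal'], X11b_train['signal']), axis=0)
        train_label = np.concatenate((S4b_train['label'], X11b_train['label']), axis=0)
        test_data = np.concatenate((S4b_test['signal'], X11b_test['signal']), axis=0)
        test_label = np.concatenate((S4b_test['label'], X11b_test['label']), axis=0)

        # Map labels {1, 2} to {0, 1} and reshape each trial to
        # (1, channels=2, timepoints=750) for the Conv2d models.
        train_label = train_label - 1
        test_label = test_label - 1
        train_data = np.transpose(np.expand_dims(train_data, axis=1), (0, 1, 3, 2))
        test_data = np.transpose(np.expand_dims(test_data, axis=1), (0, 1, 3, 2))

        # Replace any NaN samples with the overall mean (assumed cleanup step).
        train_data[np.isnan(train_data)] = np.nanmean(train_data)
        test_data[np.isnan(test_data)] = np.nanmean(test_data)

        return train_data, train_label, test_data, test_label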

    • Model Architecture

You need to implement two simple EEG classification models: EEGNet and DeepConvNet.

EEGNet:

[Figure: overall visualization of the EEGNet architecture]

Reference: Depthwise Separable Convolution

https://towardsdatascience.com/a-basic-introduction-to-separable-convolutions-b99ec3102728
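
The reference above explains the idea: a depthwise convolution filters each channel separately (groups = number of input channels), and a pointwise 1x1 convolution then mixes channels. A minimal illustration in PyTorch (the channel counts and kernel size here are arbitrary examples, not assignment values):

    import torch.nn as nn

    # Depthwise: 16 groups, so each of the 16 channels gets its own filter.
    depthwise = nn.Conv2d(16, 16, kernel_size=(1, 15), groups=16, padding=(0, 7), bias=False)
    # Pointwise: 1x1 convolution mixes the 16 channels into 32.
    pointwise = nn.Conv2d(16, 32, kernel_size=1, bias=False)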

EEGNet implementation details:

[Figure: EEGNet layer-by-layer implementation details]

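In case the details figure does not render, here is a sketch of an EEGNet matching the settings commonly used for this lab; the kernel sizes, pooling factors, dropout rates, and the 736-unit flatten are assumptions taken from the usual spec, so treat the figure as authoritative if yours differs.

    import torch.nn as nn

    class EEGNet(nn.Module):
        def __init__(self, activation=nn.ELU):
            super().__init__()
            # Temporal convolution over the 750 time points.
            self.firstconv = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=(1, 51), padding=(0, 25), bias=False),
                nn.BatchNorm2d(16),
            )
            # Depthwise convolution across the 2 EEG channels (groups=16).
            self.depthwiseConv = nn.Sequential(
                nn.Conv2d(16, 32, kernel_size=(2, 1), groups=16, bias=False),
                nn.BatchNorm2d(32),
                activation(),
                nn.AvgPool2d(kernel_size=(1, 4)),
                nn.Dropout(p=0.25),
            )
            # Separable-convolution block: temporal filtering plus pooling.
            self.separableConv = nn.Sequential(
                nn.Conv2d(32, 32, kernel_size=(1, 15), padding=(0, 7), bias=False),
                nn.BatchNorm2d(32),
                activation(),
                nn.AvgPool2d(kernel_size=(1, 8)),
                nn.Dropout(p=0.25),
            )
            # 32 feature maps x 23 remaining time steps = 736 inputs.
            self.classify = nn.Linear(736, 2)

        def forward(self, x):  # x: (batch, 1, 2, 750)
            x = self.firstconv(x)
            x = self.depthwiseConv(x)
            x = self.separableConv(x)
            return self.classify(x.flatten(start_dim=1))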
DeepConvNet:

You need to implement the DeepConvNet architecture by using the following table, where C = 2, T = 750, and N = 2. The max-norm constraint may be ignored.

[Table: DeepConvNet architecture]
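
In case the table does not render, the sketch below follows the standard DeepConvNet layout from the literature: convolution blocks of 25, 50, 100, and 200 filters with kernel (1, 5), max pooling (1, 2), and dropout p = 0.5. Treat the table as authoritative; the 8600-unit flatten is derived from T = 750 under these assumptions.

    import torch.nn as nn

    class DeepConvNet(nn.Module):
        def __init__(self, activation=nn.ELU, C=2, T=750, N=2):
            super().__init__()

            def block(in_ch, out_ch):
                # Repeated conv -> batchnorm -> activation -> pool -> dropout unit.
                return nn.Sequential(
                    nn.Conv2d(in_ch, out_ch, kernel_size=(1, 5)),
                    nn.BatchNorm2d(out_ch),
                    activation(),
                    nn.MaxPool2d(kernel_size=(1, 2)),
                    nn.Dropout(p=0.5),
                )

            # The first block also convolves across the C EEG channels.
            self.features = nn.Sequential(
                nn.Conv2d(1, 25, kernel_size=(1, 5)),
                nn.Conv2d(25, 25, kernel_size=(C, 1)),
                nn.BatchNorm2d(25),
                activation(),
                nn.MaxPool2d(kernel_size=(1, 2)),
                nn.Dropout(p=0.5),
                block(25, 50),
                block(50, 100),
                block(100, 200),
            )
            # With T = 750 the time axis shrinks to 43, so 200 * 43 = 8600 features.
            self.classify = nn.Linear(8600, N)

        def forward(self, x):  # x: (batch, 1, C, T)
            x = self.features(x)
            return self.classify(x.flatten(start_dim=1))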

    • Activation Functions

ReLU(x) = max(0, x)

Leaky ReLU: LeakyReLU(x) = x if x ≥ 0, and negative_slope · x otherwise. By default, the negative slope = 0.01.

ELU: ELU(x) = x if x > 0, and α · (exp(x) − 1) otherwise. The α value for the ELU formulation defaults to 1.0.

Reference:

https://medium.com/tinymind/a-practical-guide-to-relu-b83ca804f1f7

https://pytorch.org/docs/stable/nn.html

In the PyTorch framework, it is easy to use these activation functions: just type the following code.
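
    import torch.nn as nn

    # The three activations compared in this lab; the negative_slope and
    # alpha arguments shown are the PyTorch defaults.
    relu = nn.ReLU()
    leaky_relu = nn.LeakyReLU(negative_slope=0.01)
    elu = nn.ELU(alpha=1.0)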

    • Hyper Parameters

Batch size: 64
Learning rate: 1e-2
Epochs: 300
Optimizer: Adam
Loss function: torch.nn.CrossEntropyLoss()

You can adjust the hyper-parameters according to your own ideas.
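
A minimal training-loop sketch with these defaults, assuming the read_bci_data and EEGNet sketches shown earlier:

    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader, TensorDataset

    device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
    train_data, train_label, test_data, test_label = read_bci_data()
    train_set = TensorDataset(torch.tensor(train_data, dtype=torch.float32),
                              torch.tensor(train_label, dtype=torch.long))
    loader = DataLoader(train_set, batch_size=64, shuffle=True)

    model = EEGNet(activation=nn.ELU).to(device)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
    criterion = torch.nn.CrossEntropyLoss()

    for epoch in range(300):
        model.train()
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            optimizer.zero_grad()
            loss = criterion(model(x), y)
            loss.backward()
            optimizer.step()
        # Record train/test accuracy here each epoch for the comparison plot.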
    • Result comparison

In this part, you can use the matplotlib library to draw the graph.

Reference: https://matplotlib.org/
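
For instance, a minimal plotting sketch, assuming you collected the per-epoch accuracies in Python lists during training and testing (the dictionary layout here is just an illustration):

    import matplotlib.pyplot as plt

    # acc_curves maps a label such as 'relu_train' to a list of
    # per-epoch accuracies collected during the training/testing phases.
    def plot_comparison(acc_curves, title='Activation function comparison (EEGNet)'):
        plt.figure()
        for label, accs in acc_curves.items():
            plt.plot(range(1, len(accs) + 1), accs, label=label)
        plt.xlabel('Epoch')
        plt.ylabel('Accuracy (%)')
        plt.title(title)
        plt.legend()
        plt.show()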


The comparison figure should look like the example below (EEGNet).

[Figure: example accuracy comparison plot for EEGNet]

Report Spec (60%)

    1. Introduction (20%)

    2. Experiment set up (30%)

        A. The details of your model
            ◦ EEGNet
            ◦ DeepConvNet
        B. Explain the activation functions (ReLU, Leaky ReLU, ELU)

    3. Experimental results (30%)

        A. The highest testing accuracy
            ◦ Screenshots for both models
            ◦ Anything you want to present
        B. Comparison figures
            ◦ EEGNet
            ◦ DeepConvNet

    4. Discussion (20%)

        A. Anything you want to share


---- Criterion of result (40%) ----

Accuracy >= 87% = 100 pts

Accuracy 85~87% = 90 pts

Accuracy 80~85% = 80 pts

Accuracy 75~80% = 70 pts

Accuracy < 75% = 60 pts


Score: 40% experimental results + 60% (report + demo score). P.S. Format errors in the zip file name or the report spec incur a penalty (-5).

In the demo phase, you only need to show the highest testing accuracy of each model.

Reference:

    [1] EEGNet: A Compact Convolutional Neural Network for EEG-based Brain-Computer Interfaces