Category Archives: Lab Meeting


MRI

Introduction by Martin:

The next reading group will be on MRI. Since this is quite a general subject, instead of reading papers, I will make you watch a few introductory videos on selected key topics. I have included questions for each video to make sure you understood what you watched. If you have absolutely no clue, you are expected to watch the videos again or do some research of your own (they are short videos, after all). We’ll go through them on Friday, along with some more advanced topics if we have time.



Motion Detection

Introduction by Yi:

Since some of you mentioned to me that you would like to know more about motion detection, the next reading group will be on this topic. At the same time, since not everyone in the lab works on motion detection, I’d like to keep the reading group general rather than too specific. That means most of the material will be about the basic ideas of motion detection: the widely used methods, features, background model updating strategies, and so on.

Required reading

Take-home message:

  • Motion detection is still an open question: no single method is better than all the others, and there is always room for improvement. Combining different methods is still a hot topic (see the sketch below for one basic building block).
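
As a minimal illustration of the background model ideas mentioned above, here is a sketch of frame differencing against a running-average background, written in plain NumPy; the learning rate alpha and the threshold are illustrative values, not taken from the readings.

import numpy as np

def update_background(background, frame, alpha=0.05):
    # Running-average background model: blend each new frame in slowly.
    return (1 - alpha) * background + alpha * frame

def detect_motion(background, frame, threshold=25):
    # Flag pixels whose absolute difference from the background is large.
    diff = np.abs(frame.astype(np.float32) - background)
    return diff > threshold  # boolean foreground mask

# Usage, assuming an iterable of grayscale frames (hypothetical names):
# background = first_frame.astype(np.float32)
# for frame in frames:
#     mask = detect_motion(background, frame)
#     background = update_background(background, frame)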

CNN Part 6: Transfer Learning

For the sixth reading group on the Stanford University Convolutional Neural Networks class, we went through the following slides:

Take-home message:

  • The middle layers and the depth of a CNN are important.
  • Transfer learning is helpful with small to medium datasets.
  • Rule of thumb (a code sketch follows the table):

                 | Similar dataset                     | Different dataset
    Little data  | Use linear classifier on top layer  | You’re in trouble… Try linear classifier from different stages
    Lots of data | Finetune a few layers               | Finetune a larger number of layers
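
As a concrete illustration of the table, here is a hedged PyTorch sketch of the two “similar dataset” cells; the choice of PyTorch and of ResNet-18 is mine for illustration, not something prescribed by the class.

import torch.nn as nn
from torchvision import models

# Load a network pretrained on a large dataset (ImageNet here).
model = models.resnet18(pretrained=True)

# Little data: freeze everything and train only a linear classifier on top.
for param in model.parameters():
    param.requires_grad = False

num_classes = 10  # hypothetical number of target classes
model.fc = nn.Linear(model.fc.in_features, num_classes)

# Lots of data: unfreeze a few of the top layers and finetune them too, e.g.:
# for param in model.layer4.parameters():
#     param.requires_grad = True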

CNN Part 5: Understanding and visualizing CNNs

For the fifth reading group on the Stanford University Convolutional Neural Networks class, we went through the following slides:

Take-home message:

You can backproject activations into image space in order to visualize what the filters of a CNN respond to. You can fool a CNN by optimizing an image so that it receives a wrong label (this weakness is not specific to CNNs). The vast majority of the parameters of a CNN are in the fully connected layers, no matter how many convolution layers come before them. The CNN codes (the top-layer activations) can be highly discriminative features.
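
As a sketch of the fooling idea, here is gradient ascent on the pixels of an image toward a chosen wrong class, with the network weights left untouched; PyTorch is assumed, and the model, step size, and step count are placeholders.

import torch

def fool_image(model, image, target_class, steps=50, lr=0.1):
    # Gradient ascent on the input pixels toward a chosen class score.
    model.eval()
    image = image.clone().requires_grad_(True)
    for _ in range(steps):
        score = model(image.unsqueeze(0))[0, target_class]
        model.zero_grad()
        score.backward()
        with torch.no_grad():
            image += lr * image.grad  # update the pixels, not the weights
            image.grad.zero_()
    return image.detach()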

CNN Part 4: Parameter Update & Optimization

For the fourth reading group on the Stanford University Convolutional Neural Networks class, we went through the following slides:

Take-home message:

There are different ways to decay your learning rate; decaying it is recommended so that you do not keep stepping over the minimum. Momentum is robust to saddle points, but it builds up speed and can overshoot. There is no principled way to find the optimal hyperparameters, yet tuning them is very important.
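
For reference, here is a plain NumPy sketch of these two updates, following the standard formulas from the class notes; the momentum coefficient mu and decay constant k are typical example values, not recommendations.

import numpy as np

def sgd_momentum_step(w, grad, velocity, lr, mu=0.9):
    # Momentum update: accumulate a velocity, then step along it.
    velocity = mu * velocity - lr * grad
    return w + velocity, velocity

def decayed_lr(lr0, epoch, k=0.1):
    # Exponential learning rate decay: lr = lr0 * exp(-k * epoch).
    return lr0 * np.exp(-k * epoch)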

CNN Part 3: Preprocessing, Initialization and Regularization

For the third reading group on the Stanford University Convolutional Neural Networks class, we went through the following slides:

Take-home message:

The most common regularization is L2, with its strength chosen by cross-validation. A good idea is to combine it with dropout; a dropout probability p of 0.5 is a reasonable default, but it can be tuned. When implementing activation functions from scratch, it is important to run gradient checks and sanity checks to validate your implementation.
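
Here is a minimal sketch of such a gradient check in NumPy: compare the analytic gradient against a centered finite difference and inspect the maximum relative error. The function name and the tolerance in the comment are mine.

import numpy as np

def grad_check(f, analytic_grad, x, h=1e-5):
    # Compare the analytic gradient of f at x with centered differences.
    numeric = np.zeros_like(x)
    it = np.nditer(x, flags=['multi_index'])
    while not it.finished:
        i = it.multi_index
        old = x[i]
        x[i] = old + h
        fp = f(x)
        x[i] = old - h
        fm = f(x)
        x[i] = old  # restore the original value
        numeric[i] = (fp - fm) / (2 * h)
        it.iternext()
    rel_error = np.abs(numeric - analytic_grad) / np.maximum(
        1e-8, np.abs(numeric) + np.abs(analytic_grad))
    return rel_error.max()  # roughly < 1e-5 suggests a correct gradient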

CNN Part 2: Optimization and Backpropagation

For the second reading group on the Stanford University Convolutional Neural Networks class, we went through the following slides:

Take-home message:

In practice, use rectified linear (ReLU) or maxout activation functions, which are both piecewise linear. The bigger the network, the better it tends to perform, but regularization might be required.
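
Both recommended activations are short NumPy one-liners; the maxout shown here is the common two-piece form, and the affine parameters w1, b1, w2, b2 are assumed to be given.

import numpy as np

def relu(x):
    # Rectified linear unit: elementwise max(0, x), piecewise linear.
    return np.maximum(0, x)

def maxout(x, w1, b1, w2, b2):
    # Two-piece maxout: the maximum of two affine functions of the input.
    return np.maximum(x @ w1 + b1, x @ w2 + b2)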

CNN Part 1: Linear Classification

For this first reading group on the Stanford University Convolutional Neural Networks class, we went through the following slides:

Take-home message:

Two loss functions are commonly used: the softmax (cross-entropy) loss and the SVM (hinge) loss. Training then amounts to finding the weights that minimize the chosen loss function.
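
Here is a sketch of both losses for a single example in NumPy, following the standard formulation from the class notes; scores is the vector of class scores and y is the index of the correct class.

import numpy as np

def svm_loss(scores, y, delta=1.0):
    # Multiclass SVM (hinge) loss: penalize classes whose score comes
    # within delta of the correct class score.
    margins = np.maximum(0, scores - scores[y] + delta)
    margins[y] = 0  # the correct class contributes no loss
    return margins.sum()

def softmax_loss(scores, y):
    # Softmax (cross-entropy) loss: -log probability of the correct class.
    shifted = scores - scores.max()  # shift for numerical stability
    log_probs = shifted - np.log(np.exp(shifted).sum())
    return -log_probs[y]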