[Day 44] Batch vs Layer vs Group Normalization and GANs (+ found a free KAIST AI course)

Hello! :) Today is day 44. A quick summary of today:

Found KAIST Professor Choi's Programming for AI lectures
Discovered that there are Layer and Group norm (not only Batch norm)
Learned about GANs

1) Batch vs Layer vs Group normalization methods

In one of Professor Choi's lectures, he explained the above three norm layers (I had not heard of the 2nd and 3rd), and with some searching online, I worked out the differences.

Firstly, batch norm: given a batch of activations for a specific layer, the mean and std for the batch are calculated. Then, the mean is subtracted and the result is divided by the std to normalize the values (an epsilon is added to the variance, inside the square root, for numerical stability). Following that, a scale factor "gamma" and a shift factor "beta", which are learnable parameters, are applied. (There are small sketches of batch and layer norm just below.)

Secondly, layer norm: proposed in 2016, layer norm operates over the feature dimension (i.e., it calculates the mean and variance for each instance separately, over all the features).
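To make batch norm concrete, here is a minimal NumPy sketch I put together (not from the lecture; the function name, toy shapes, and eps value are my own choices):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Statistics are computed per feature, ACROSS the batch (axis=0).
    mean = x.mean(axis=0)                    # per-feature mean over the batch
    var = x.var(axis=0)                      # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)  # normalize; eps for numerical stability
    return gamma * x_hat + beta              # learnable scale and shift

x = np.random.randn(4, 3)                    # batch of 4 instances, 3 features
gamma, beta = np.ones(3), np.zeros(3)        # learnable params at their usual init
print(batch_norm(x, gamma, beta))
```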
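And the layer norm counterpart, where the only real change is the axis the statistics are computed over (again my own sketch, same caveats):

```python
import numpy as np

def layer_norm(x, gamma, beta, eps=1e-5):
    # Statistics are computed per instance, OVER its features (axis=-1),
    # so the result does not depend on the other items in the batch.
    mean = x.mean(axis=-1, keepdims=True)    # per-instance mean over features
    var = x.var(axis=-1, keepdims=True)      # per-instance variance over features
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

x = np.random.randn(4, 3)
gamma, beta = np.ones(3), np.zeros(3)
print(layer_norm(x, gamma, beta))
```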