Deep learning notes – Gradient-based learning – 11 & 13/10 2021
A local linear approximation of the loss can be made from the gradient. The loss is reduced by stepping in the opposite direction of the gradient. With multiple inputs the gradient is used the same way, but it is a vector with one component per input dimension. SGD: SGD uses mini-batches. Instead of using all connections to get the…
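The mini-batch SGD idea above can be sketched as follows. This is a minimal illustration on linear regression with a squared-error loss; the data, names, learning rate, and batch size are all assumptions for the example, not from the notes.

```python
import numpy as np

# Minimal mini-batch SGD sketch for linear regression (assumed example).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))           # 100 samples, 3 input dimensions
true_w = np.array([1.0, -2.0, 0.5])     # illustrative target weights
y = X @ true_w

w = np.zeros(3)                          # weights, one per input dimension
lr = 0.1                                 # learning rate
for epoch in range(200):
    idx = rng.permutation(100)
    for start in range(0, 100, 10):      # mini-batches of 10 instead of all data
        batch = idx[start:start + 10]
        Xb, yb = X[batch], y[batch]
        grad = 2 * Xb.T @ (Xb @ w - yb) / len(batch)  # gradient of batch MSE
        w -= lr * grad                   # step opposite the gradient

print(np.round(w, 2))
```

Each update only looks at a small batch, so the gradient is a cheap, noisy estimate of the full-data gradient; over many steps the weights still converge toward the target.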
Deep learning notes – Convolutional Neural Networks – 8 & 11/10 2021
Last time: key ideas to exploit. Convolution. Visual explanations of convolution. In convolution the entire input is processed by sliding a kernel over it in chunks. From an input string, a matrix is created. 2D image manipulation example. Convolution: instead of learning a full weight matrix, only the few weights of the kernel need to be learned….
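The weight-sharing point above can be made concrete with a small 2D sketch. This is an assumed illustration (the function and data are not from the notes): one tiny kernel is slid over the whole input, so only `kernel.size` weights exist, rather than a full input-to-output matrix. Like most deep learning libraries, it actually computes cross-correlation.

```python
import numpy as np

# Sketch of a 2D "valid" convolution (no padding, stride 1).
# The same small kernel is reused at every position: weight sharing.
def conv2d(image, kernel):
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1         # output height
    ow = image.shape[1] - kw + 1         # output width
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # each output pixel uses the SAME kernel weights
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(16.0).reshape(4, 4)    # toy 4x4 "image"
edge = np.array([[1.0, -1.0]])           # simple horizontal-difference kernel
print(conv2d(image, edge))
```

Here the full matrix alternative would need 16 x 12 weights to map a 4x4 input to a 4x3 output, while the kernel has only 2 learnable weights applied everywhere.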