This type of layer normalizes the activations of each channel across a mini-batch, which can help reduce the network's sensitivity to variations in the data.
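A minimal sketch, assuming the layer being described is batchNormalizationLayer; it is typically placed between a convolution layer and its nonlinearity:
% Assumed layer: batchNormalizationLayer, inserted between a convolution
% layer and a ReLU layer (filter sizes here are illustrative only).
layers = [
    convolution2dLayer(3,16,Padding="same")
    batchNormalizationLayer
    reluLayer];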
(4) layer = reluLayer(Name="relu_1")
creates a rectified linear unit (ReLU) layer named "relu_1". A ReLU layer performs an element-wise threshold operation, setting any input value less than zero to zero.
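As a quick illustration, the dlarray-based relu function applies the same element-wise operation as the layer:
% ReLU applies f(x) = max(x, 0); values below zero become zero.
layer = reluLayer(Name="relu_1");
x = dlarray([-2 -0.5 0 1 3]);
y = relu(x)        % 0  0  0  1  3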
(5) layer = additionLayer(numInputs)
creates an addition layer with the number of inputs specified by numInputs. This layer takes multiple inputs and adds them element-wise, so all inputs must have the same size.
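A minimal sketch of a skip connection built with additionLayer; the layer names (conv_1, relu_1, add_1, and so on) and sizes are illustrative placeholders:
layers = [
    imageInputLayer([28 28 1],Name="input")
    convolution2dLayer(3,8,Padding="same",Name="conv_1")
    reluLayer(Name="relu_1")
    convolution2dLayer(3,8,Padding="same",Name="conv_2")
    additionLayer(2,Name="add_1")
    reluLayer(Name="relu_2")];
lgraph = layerGraph(layers);
% Feed the output of relu_1 into the second input of add_1 (both branches
% produce 28-by-28-by-8 activations, so the element-wise addition is valid).
lgraph = connectLayers(lgraph,"relu_1","add_1/in2");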
(6) layer = fullyConnectedLayer(outputSize)
creates a fully connected layer. outputSize specifies the size of the layer's output. A fully connected layer multiplies the input by a weight matrix and then adds a bias vector.
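For example, a 10-class classification head could be created as follows (the output size of 10 and the name "fc_1" are illustrative assumptions):
% Fully connected layer mapping its (flattened) input to 10 outputs.
layer = fullyConnectedLayer(10,Name="fc_1");
% After training, the layer's operation on a column vector x is:
%   y = layer.Weights * x + layer.Bias;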
(7) layer = softmaxLayer()
creates a softmax layer. This layer applies the softmax function to its input, producing values between 0 and 1 that sum to 1, which makes it useful as the final activation in classification problems.
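Putting the layers above together, a minimal sketch of a small image classification network (the input size [28 28 1] and the 10 classes are illustrative assumptions, not from the original text):
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,16,Padding="same")
    batchNormalizationLayer
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];
analyzeNetwork(layers)   % inspect layer sizes and connections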