Activation function and loss function
(1) Sigmoid function:
The sigmoid function (logistic function), often used as the output of hidden neurons, has a value range of (0, 1): it maps any real number into the interval (0, 1), which makes it suitable for binary classification. Its formula is sigma(z) = 1 / (1 + e^(-z)).
Disadvantages: the gradient saturates (approaches zero) for inputs of large magnitude, which slows learning in deep networks; the output is not zero-centered; and the exponential is relatively expensive to compute.
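A minimal NumPy sketch of the sigmoid and its derivative, illustrating the saturation problem described above (the derivative peaks at 0.25 and vanishes for large |z|):

```python
import numpy as np

def sigmoid(z):
    # maps any real number into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    # derivative: sigma'(z) = sigma(z) * (1 - sigma(z)); maximum is 0.25 at z = 0
    s = sigmoid(z)
    return s * (1.0 - s)

print(sigmoid(0.0))        # 0.5
print(sigmoid_prime(0.0))  # 0.25, the maximum
print(sigmoid_prime(10.0)) # nearly 0: the gradient saturates for large |z|
```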
(3) Softmax function:
Used as the output of multi-class neural networks: it converts a vector of real-valued scores into a probability distribution over the classes, softmax(z)_i = e^(z_i) / sum_j e^(z_j).
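The softmax formula can be sketched in a few lines of NumPy (subtracting the maximum is a standard numerical-stability trick, not part of the mathematical definition):

```python
import numpy as np

def softmax(z):
    # shift by the max so np.exp never overflows; the result is unchanged
    e = np.exp(z - np.max(z))
    return e / e.sum()

probs = softmax(np.array([2.0, 1.0, 0.1]))
print(probs)          # three positive probabilities summing to 1
print(probs.argmax()) # the largest score (index 0) gets the highest probability
```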
(4) Leaky ReLU and Parametric ReLU (PReLU): instead of outputting zero for negative inputs as ReLU does, these variants apply a small negative slope alpha (fixed for Leaky ReLU, learned for PReLU).
(5) ELU function:
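The two ReLU variants above can be sketched side by side (alpha values are the common defaults, chosen here for illustration):

```python
import numpy as np

def leaky_relu(z, alpha=0.01):
    # Leaky ReLU: small slope alpha for negative inputs instead of a hard zero
    return np.where(z > 0, z, alpha * z)

def elu(z, alpha=1.0):
    # ELU: smooth exponential curve for negative inputs, saturating at -alpha
    return np.where(z > 0, z, alpha * (np.exp(z) - 1.0))

z = np.array([-2.0, 0.0, 3.0])
print(leaky_relu(z))  # negative input scaled by 0.01; positives pass through
print(elu(z))         # negative input mapped to alpha * (e^z - 1)
```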
(6) Maxout function:
Maxout is a layer of a deep learning network in the same sense as a pooling layer or a convolution layer: it can be regarded as the network's activation-function layer. Suppose the input feature vector of a certain layer of the network is x = (x1, x2, ..., xd), i.e. the input has d neurons. Each hidden unit in a Maxout layer takes the maximum over k linear pieces: h_i(x) = max over j in [1, k] of (x^T W_ij + b_ij).
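A minimal sketch of that computation, with hypothetical sizes d = 4 inputs, m = 3 hidden units, and k = 5 linear pieces per unit:

```python
import numpy as np

rng = np.random.default_rng(0)
d, m, k = 4, 3, 5                       # input dim, hidden units, pieces per unit
W = rng.standard_normal((d, m, k))      # one weight vector per (unit, piece)
b = rng.standard_normal((m, k))

def maxout(x, W, b):
    # z[i, j] = x . W[:, i, j] + b[i, j]; each unit keeps its best piece
    z = np.einsum('d,dmk->mk', x, W) + b   # shape (m, k)
    return z.max(axis=1)                   # shape (m,)

x = rng.standard_normal(d)
print(maxout(x, W, b).shape)  # (3,): one output per hidden unit
```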
Quadratic cost function: C = 1/(2n) * sum over samples x of ||y(x) - a||^2, where C stands for cost, x stands for a sample, y stands for the actual value, a stands for the network's output value, and n stands for the total number of samples.
Taking a single sample as an example (n = 1): C = ||y - a||^2 / 2.
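The quadratic cost and its single-sample case can be checked directly (the values 0.8 and 1.0 are hypothetical):

```python
import numpy as np

def quadratic_cost(a, y):
    # C = 1/(2n) * sum over the n samples of (y - a)^2
    n = len(y)
    return np.sum((y - a) ** 2) / (2 * n)

# single sample: output a = 0.8, actual y = 1.0  ->  C = (1 - 0.8)^2 / 2 = 0.02
print(quadratic_cost(np.array([0.8]), np.array([1.0])))
```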
Gradient descent algorithm: repeatedly update the parameters in the direction that decreases the cost, w <- w - eta * dC/dw and b <- b - eta * dC/db, where eta is the learning rate.
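The update rule can be demonstrated on a toy one-parameter cost, C(w) = (w - 3)^2 (a hypothetical example, not the network's cost):

```python
# minimize C(w) = (w - 3)^2 by gradient descent; dC/dw = 2 * (w - 3)
w, eta = 0.0, 0.1
for _ in range(100):
    w -= eta * 2 * (w - 3)
print(w)  # converges toward the minimum at w = 3
```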
Without changing the activation function, the quadratic cost function is replaced by the cross-entropy cost function: C = -(1/n) * sum over samples x of [y ln a + (1 - y) ln(1 - a)].
x represents a sample and n represents the total number of samples. Calculating the gradient of parameter w gives dC/dw_j = (1/n) * sum over x of x_j * (sigma(z) - y): the sigma'(z) factor cancels, so the weight learns quickly when the error sigma(z) - y is large.
Gradient of b: dC/db = (1/n) * sum over x of (sigma(z) - y).
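These cross-entropy gradients can be computed and sanity-checked on toy single-feature data (the data and parameter values below are hypothetical):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# toy data: one feature per sample, binary labels
x = np.array([0.5, -1.2, 2.0])
y = np.array([1.0, 0.0, 1.0])
w, b = 0.3, -0.1
n = len(x)

a = sigmoid(w * x + b)
# cross-entropy gradients: the sigma'(z) term cancels, leaving (a - y)
grad_w = np.sum(x * (a - y)) / n
grad_b = np.sum(a - y) / n
print(grad_w, grad_b)
```

A useful check is to compare these against finite differences of the cross-entropy cost itself; the two agree to numerical precision.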