g = mlpbkp(net, x, z, deltas)
g = mlpbkp(net, x, z, deltas) takes a network data structure net together with a matrix x of input vectors, a matrix z of hidden unit activations, and a matrix deltas of the gradient of the error function with respect to the values of the output units (i.e. the summed inputs to the output units, before the activation function is applied). The return value is the gradient g of the error function with respect to the network weights. Each row of x corresponds to one input vector.
This function is provided so that the common backpropagation algorithm can be used by multi-layer perceptron network models to compute gradients for mixture density networks as well as standard error functions.
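As a minimal sketch of typical use, the following lines show mlpbkp combined with a forward pass to compute the gradient of a sum-of-squares error for a network with linear outputs. It assumes the standard Netlab functions mlp and mlpfwd; the network dimensions and random data are illustrative only.

net = mlp(2, 3, 1, 'linear');     % network with 2 inputs, 3 hidden units, 1 linear output
x = randn(10, 2);                 % 10 input vectors, one per row (illustrative data)
t = randn(10, 1);                 % corresponding targets (illustrative data)
[y, z] = mlpfwd(net, x);          % forward pass: outputs y and hidden unit activations z
deltas = y - t;                   % for sum-of-squares error with linear outputs, the gradient
                                  % w.r.t. the summed inputs to the output units is y - t
g = mlpbkp(net, x, z, deltas);    % gradient of the error w.r.t. the network weights

For other error functions, such as those used by mixture density networks, only the computation of deltas changes; the call to mlpbkp is the same.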
See also: mlp, mlpgrad, mlpderiv, mdngrad
Copyright (c) Ian T Nabney (1996-9)