lyngby_nn_eerror - Calculate (cross-)entropic error

function E = lyngby_nn_eerror(T, Y)
Input:   T   Target output
         Y   Neural network output

Output:  E   Error, normalized over patterns and outputs
If T = 1 and Y = -1, the result is not Inf but log(realmin).
The entropic error in this case is also called the 'cross-entropy',
'relative entropy', or 'Kullback' measure.
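The computation can be sketched as follows. This is a hedged illustration, not the toolbox's MATLAB source: it assumes bipolar (-1..1, tanh-style) coding of targets and outputs, averages over all patterns and outputs as the Output description states, and floors log arguments at the smallest positive float (the analogue of MATLAB's realmin) so that T = 1, Y = -1 yields a log(realmin) term rather than -Inf. The function name nn_eerror is chosen here for illustration.

```python
import math
import sys

def nn_eerror(T, Y):
    """Sketch of a cross-entropic error for bipolar (-1..1) coding.

    T, Y: lists of rows (patterns), each row a list of outputs.
    Assumption: targets/outputs are mapped to probabilities via (1+x)/2.
    """
    tiny = sys.float_info.min            # analogue of MATLAB's realmin

    def safe_log(x):
        # Floor the argument so log(0) becomes log(realmin), not -Inf
        return math.log(max(x, tiny))

    total = 0.0
    n = 0
    for t_row, y_row in zip(T, Y):
        for t, y in zip(t_row, y_row):
            total += (1 + t) / 2 * safe_log((1 + y) / 2) \
                   + (1 - t) / 2 * safe_log((1 - y) / 2)
            n += 1
    # Negate and normalize over patterns and outputs
    return -total / n
```

With this coding, a perfect match (T = Y = 1) gives zero error, and the worst case (T = 1, Y = -1) gives an error of magnitude |log(realmin)| instead of infinity.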
See also LYNGBY, LYNGBY_NN_EMAIN, LYNGBY_NN_COST.
$Id: lyngby_nn_eerror.m,v 1.2 2004/06/03 17:34:49 fnielsen Exp $
Produced by mat2html on Wed Jul 29 15:43:40 2009