Saturday, January 07, 2006
"ANN"
Artificial neural networks are hot.
Yi =(Wij)=> Yj
Xj ::= Weighted input at j = sum( Yi*Wij )
Yj ::= Activity Level at j =1/(1 + e^(-Xj))
Yj = 1/(1 + e^(-sum( Yi*Wij )))
E ::= Error = sum( (Yj - Dj)^2 )/2
Dj ::= Desired Activity Level for j
E = sum( (1/(1+e^(-sum(Yi*Wij))) - Dj)^2 )/2
Note that Yj is recursively evaluated from the input layer to the output layer.
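As a sanity check, the forward pass above can be sketched in a few lines of Python. The function and variable names (`forward`, `Y_prev`, `W`) are mine, not from the notes; `W[i][j]` is the weight from unit i to unit j.

```python
import math

def forward(Y_prev, W):
    """One layer of the forward pass:
    Xj = sum_i( Yi * Wij ), then Yj = 1/(1 + e^(-Xj))."""
    n_out = len(W[0])
    # weighted input at each unit j
    X = [sum(Y_prev[i] * W[i][j] for i in range(len(Y_prev)))
         for j in range(n_out)]
    # sigmoid activity level at each unit j
    return [1.0 / (1.0 + math.exp(-x)) for x in X]

# tiny example: 2 input units feeding 1 output unit
Y = forward([1.0, 0.5], [[0.3], [-0.2]])  # Xj = 0.3 - 0.1 = 0.2
```

Stacking calls to `forward` layer by layer is exactly the recursive evaluation from input to output described above.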
EAj ::= Rate of Error change according to Activity change = dE/dYj = Yj - Dj ::= Activity - Desired Activity.
EIj ::= Rate of Error change according to total Input change = dE/dXj = dE/dYj * dYj/dXj = EAj * dYj/dXj = EAj*Yj(1-Yj).
dYj/dXj = Yj(1-Yj) since Yj is the sigmoid function 1/(1+e^(-Xj))
EWij ::= Rate of Error change according to Weight change = dE/dWij = dE/dXj * dXj/dWij = EIj * dXj/dWij = EIj * Yi
Why dXj/dWij = Yi: Xj = sum( Yi*Wij ), and only the single term Yi*Wij depends on Wij, so the partial derivative is just Yi.
EAi ::= Rate of Error change according to Activity level change in previous layer = dE/dYi = sum over j ( dE/dXj * dXj/dYi ) = sum( EIj*Wij )
I first thought these were simple derivatives, but they are partial derivatives: unit i feeds every unit j in the next layer, so the contributions EIj*Wij have to be summed over all j.
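The four quantities EAj, EIj, EWij, EAi can be sketched together in Python, following the notation above. This is a one-layer sketch under my own assumed shapes (`W[i][j]` is the weight from i to j, `D` holds the desired activity levels); a finite-difference check is an easy way to confirm the gradients.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def backprop_layer(Y_prev, W, D):
    """Gradients for one layer, using the post's names:
    EA[j]  = dE/dYj = Yj - Dj
    EI[j]  = dE/dXj = EA[j] * Yj * (1 - Yj)
    EW[i][j] = dE/dWij = EI[j] * Yi
    EA_prev[i] = dE/dYi = sum_j( EI[j] * Wij )"""
    n_in, n_out = len(Y_prev), len(W[0])
    X = [sum(Y_prev[i] * W[i][j] for i in range(n_in)) for j in range(n_out)]
    Y = [sigmoid(x) for x in X]
    EA = [Y[j] - D[j] for j in range(n_out)]
    EI = [EA[j] * Y[j] * (1.0 - Y[j]) for j in range(n_out)]
    EW = [[EI[j] * Y_prev[i] for j in range(n_out)] for i in range(n_in)]
    EA_prev = [sum(EI[j] * W[i][j] for j in range(n_out)) for i in range(n_in)]
    return EW, EA_prev

EW, EA_prev = backprop_layer([1.0, 0.5], [[0.3], [-0.2]], [1.0])
```

Nudging one weight by a small epsilon and re-measuring E = sum((Yj-Dj)^2)/2 should reproduce `EW[i][j]` numerically, which is the standard way to verify a backprop derivation like this one.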