Pacific-Design.com

    

An artificial neural network (ANN) backpropagation example in MATLAB

clc; format shortG;

%----------------- Inputs ---------------------%
%Sample       X       Y      Z    Target output
%----------------------------------------------%
Sample   = [  1.0    0.4    0.7    0.65 ];
%Sample  = [ -0.7    0.9    1.2    0.12 ];
%----------------------------------------------%
Learning_Rate = 0.50;

%--- Weights ---%
W1j =  0.20;                     
W1i =  0.10;                

W2j =  0.30;                
W2i = -0.10;                

W3j = -0.10;            
W3i =  0.20;           

Wjk =  0.10;                           
Wik =  0.50;           

%--------------- Neural Network = Input/Output Computation ---------------%
Node_J          = Sample(1)*W1j + Sample(2)*W2j + Sample(3)*W3j;
Output_Node_J   = 1/(1+exp(-Node_J));

Node_I          = Sample(1)*W1i + Sample(2)*W2i + Sample(3)*W3i;
Output_Node_I   = 1/(1+exp(-Node_I));

Input_Node_K    = Output_Node_J*Wjk + Output_Node_I*Wik;
Output_Node_K   = 1/(1+exp(-Input_Node_K));


Expected_Output = Sample(4);
Error_at_Node_K = (Expected_Output-Output_Node_K)*Output_Node_K*(1-Output_Node_K);


Delta_Wjk       = Learning_Rate*Error_at_Node_K*Output_Node_J;
New_Wjk         = Wjk + Delta_Wjk;

Delta_Wik       = Learning_Rate*Error_at_Node_K*Output_Node_I;
New_Wik         = Wik + Delta_Wik;

Error_at_Node_J = Error_at_Node_K * Wjk * Output_Node_J * (1-Output_Node_J);
Error_at_Node_I = Error_at_Node_K * Wik * Output_Node_I * (1-Output_Node_I);


%--------------------------- Node J --------------------------------------%
Delta_W1j       = Learning_Rate * Error_at_Node_J * Sample(1);
New_W1j         = W1j + Delta_W1j;

Delta_W2j       = Learning_Rate * Error_at_Node_J * Sample(2);
New_W2j         = W2j + Delta_W2j;

Delta_W3j       = Learning_Rate * Error_at_Node_J * Sample(3);
New_W3j         = W3j + Delta_W3j;

%--------------------------- Node I --------------------------------------%
Delta_W1i       = Learning_Rate * Error_at_Node_I * Sample(1);
New_W1i         = W1i + Delta_W1i;

Delta_W2i       = Learning_Rate * Error_at_Node_I * Sample(2);
New_W2i         = W2i + Delta_W2i;

Delta_W3i       = Learning_Rate * Error_at_Node_I * Sample(3);
New_W3i         = W3i + Delta_W3i;

%----------------- Weights after current Sample : ------------------------%
Wjk = New_Wjk 
Wik = New_Wik

W1i = New_W1i
W2i = New_W2i 
W3i = New_W3i

W1j = New_W1j
W2j = New_W2j
W3j = New_W3j

%----------------------- End of (ANN)Computation -------------------------%

MATLAB Output

Wjk =  0.10465
Wik =  0.50455

W1i =  0.10102
W2i = -0.099591
W3i =  0.20072

W1j =  0.2002
W2j =  0.30008
W3j = -0.099858
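For readers without MATLAB, the same single training step can be cross-checked with a short Python translation (a sketch added here, not part of the original script; variable names mirror the MATLAB code):

```python
import math

def sigmoid(v):
    """Logistic activation 1/(1+e^-v), used at every node."""
    return 1.0 / (1.0 + math.exp(-v))

# Sample: inputs X, Y, Z and the target output
x, target = [1.0, 0.4, 0.7], 0.65
lr = 0.50                       # Learning_Rate

w_j = [0.20, 0.30, -0.10]       # W1j, W2j, W3j
w_i = [0.10, -0.10, 0.20]       # W1i, W2i, W3i
wjk, wik = 0.10, 0.50           # hidden-to-output weights

# Forward pass
out_j = sigmoid(sum(a * w for a, w in zip(x, w_j)))
out_i = sigmoid(sum(a * w for a, w in zip(x, w_i)))
out_k = sigmoid(out_j * wjk + out_i * wik)

# Backward pass: delta rule with the sigmoid derivative o*(1-o)
err_k = (target - out_k) * out_k * (1 - out_k)
err_j = err_k * wjk * out_j * (1 - out_j)
err_i = err_k * wik * out_i * (1 - out_i)

# Weight updates
wjk += lr * err_k * out_j
wik += lr * err_k * out_i
w_j = [w + lr * err_j * a for w, a in zip(w_j, x)]
w_i = [w + lr * err_i * a for w, a in zip(w_i, x)]

print(round(wjk, 5), round(wik, 5))   # 0.10465 0.50455, matching the MATLAB output
```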


Neural Network = Sample 1

-------------------- Neural Network = Sample 1 ------------------------

Inputs:            Node 1 = 1.0,   Node 2 = 0.4,   Node 3 = 0.7
Target output:     0.65            Learning rate:  0.5
Initial weights:   W1j =  0.2,  W2j =  0.3,  W3j = -0.1
                   W1i =  0.1,  W2i = -0.1,  W3i =  0.2
                   Wjk =  0.1,  Wik =  0.5

------------- Neural Network = Input/Output Computation ---------------

Input to Node j  = Node1*W1j + Node2*W2j + Node3*W3j = 0.25
Output of Node j = 1/(1+e^-0.25) = 0.5622

Input to Node i  = Node1*W1i + Node2*W2i + Node3*W3i = 0.2
Output of Node i = 1/(1+e^-0.2) = 0.5498

Input to Node k  = Output_j*Wjk + Output_i*Wik = 0.3311
Output of Node k = 1/(1+e^-0.3311) = 0.5820

Difference       = Target output - Output_k = 0.65 - 0.5820 = 0.0680

Error at Node k  = (Target - Output_k) * Output_k * (1 - Output_k) = 0.0165

Delta Wjk = Learning rate * Error_k * Output_j = 0.0046
New Wjk   = Old Wjk + Delta Wjk = 0.1046

Delta Wik = Learning rate * Error_k * Output_i = 0.0045
New Wik   = Old Wik + Delta Wik = 0.5045

Error at Node j = Error_k * initial Wjk * Output_j * (1 - Output_j) = 0.0004
Error at Node i = Error_k * initial Wik * Output_i * (1 - Output_i) = 0.0020

------------- Neural Network = New Weight Computation -----------------

New W1j = initial W1j + Learning rate * Error_j * Node1 =  0.2002
New W2j = initial W2j + Learning rate * Error_j * Node2 =  0.3001
New W3j = initial W3j + Learning rate * Error_j * Node3 = -0.0999
New W1i = initial W1i + Learning rate * Error_i * Node1 =  0.1010
New W2i = initial W2i + Learning rate * Error_i * Node2 = -0.0996
New W3i = initial W3i + Learning rate * Error_i * Node3 =  0.2007
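Repeating this update over many passes drives the network output toward the target. The following self-contained Python loop (an illustration added here, not part of the original page) applies the same rules as the worked example for 500 passes over this one sample:

```python
import math

sigmoid = lambda v: 1.0 / (1.0 + math.exp(-v))

# Same starting point as the worked example above
x, target, lr = [1.0, 0.4, 0.7], 0.65, 0.50
w_j, w_i = [0.20, 0.30, -0.10], [0.10, -0.10, 0.20]
wjk, wik = 0.10, 0.50

for step in range(500):
    # Forward pass
    out_j = sigmoid(sum(a * w for a, w in zip(x, w_j)))
    out_i = sigmoid(sum(a * w for a, w in zip(x, w_i)))
    out_k = sigmoid(out_j * wjk + out_i * wik)
    # Backward pass and update (same rules as the worked example)
    err_k = (target - out_k) * out_k * (1 - out_k)
    err_j = err_k * wjk * out_j * (1 - out_j)
    err_i = err_k * wik * out_i * (1 - out_i)
    wjk += lr * err_k * out_j
    wik += lr * err_k * out_i
    w_j = [w + lr * err_j * a for w, a in zip(w_j, x)]
    w_i = [w + lr * err_i * a for w, a in zip(w_i, x)]

print(f"output after training: {out_k:.3f}")  # close to the 0.65 target
```

The first pass starts at 0.5820; each subsequent pass shrinks the gap to 0.65, since the update moves every weight in the direction that reduces the squared error.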

References: http://ulcar.uml.edu/~iag/CS/Intro-to-ANN.html