More Complex Situation
April 25th, 2018
I modeled a single neuron, with three input connections and one output connection. I then assigned random weights, stored in a 3×1 matrix, with values in the range -1 to 1. The weighted sum of the inputs is passed through the Sigmoid function, which normalises it to a value between 0 and 1. The gradient of the Sigmoid curve (its derivative) indicates how confident the neuron is about an existing weight: outputs near 0 or 1 produce small gradients, so confident weights are adjusted less.
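To make that concrete, here is a minimal standalone sketch (separate from the full class below, with illustrative names) of the Sigmoid function and its derivative expressed in terms of the Sigmoid's output:

```python
from numpy import exp

def sigmoid(x):
    # Squashes any real number into the range (0, 1).
    return 1 / (1 + exp(-x))

def sigmoid_derivative(output):
    # Gradient written in terms of the Sigmoid's output:
    # largest at output = 0.5 (least confident),
    # close to zero when the output is near 0 or 1 (most confident).
    return output * (1 - output)

print(sigmoid(0))                      # 0.5
print(sigmoid_derivative(sigmoid(0)))  # 0.25, the maximum possible gradient
print(sigmoid_derivative(0.99))        # ~0.0099: a confident output barely moves
```

A weight whose current output is already near 0 or 1 therefore receives only a tiny correction, which is exactly the "confidence" behaviour described above.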
After this, the neural network learns through a process of trial and error, adjusting the weights on each pass. To measure how well the neuron is learning, I calculated the error: the difference between the desired output and the predicted output. I then multiplied the error by the input and by the Sigmoid gradient, so that less confident weights are adjusted more, and inputs that are zero cause no change to their weights at all.
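That adjustment can be sketched as a single matrix operation. This is a hedged illustration with made-up predictions rather than a real training pass; the variable names are mine:

```python
from numpy import array, dot

inputs = array([[0, 0, 1], [1, 1, 1]])  # two example rows, three inputs each
desired = array([[0], [1]])             # desired outputs
predicted = array([[0.6], [0.7]])       # pretend predictions from the neuron

error = desired - predicted
# Scale the error by the Sigmoid gradient, then weight it by each input:
adjustment = dot(inputs.T, error * predicted * (1 - predicted))
print(adjustment)
```

Note that the first two inputs of the first example are zero, so that example contributes nothing to the first two weights: zero inputs leave their weights unchanged, as described above.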
I give the neural network a training set, make it run through it many times, and then test it on a new situation.
from numpy import exp, array, random, dot

class NeuralNetwork():
    def __init__(self):
        # Seed the random number generator so the results are reproducible.
        random.seed(1)
        # A 3x1 matrix of weights with values in the range -1 to 1.
        self.synaptic_weights = 2 * random.random((3, 1)) - 1

    def __sigmoid(self, x):
        # Normalise a weighted sum to a value between 0 and 1.
        return 1 / (1 + exp(-x))

    def __sigmoid_derivative(self, x):
        # Gradient of the Sigmoid curve, given the Sigmoid's output.
        return x * (1 - x)

    def train(self, training_set_inputs, training_set_outputs, num_iter):
        for iteration in range(num_iter):
            # Pass the training set through our neural network (a single neuron).
            output = self.think(training_set_inputs)
            error = training_set_outputs - output
            adjustment = dot(training_set_inputs.T, error * self.__sigmoid_derivative(output))
            # Adjust the weights.
            self.synaptic_weights += adjustment

    # The neural network thinks.
    def think(self, inputs):
        # Pass inputs through our neural network (our single neuron).
        return self.__sigmoid(dot(inputs, self.synaptic_weights))

if __name__ == "__main__":
    # Initialise a single-neuron neural network.
    neural_network = NeuralNetwork()
    print("Random starting synaptic weights: ")
    print(neural_network.synaptic_weights)
    # The training set. We have 4 examples, each consisting of 3 input values
    # and 1 output value.
    training_set_inputs = array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
    training_set_outputs = array([[0, 1, 1, 0]]).T
    # Train the neural network using the training set.
    # Do it 10,000 times, making small adjustments each time.
    neural_network.train(training_set_inputs, training_set_outputs, 10000)
    print("New synaptic weights after training: ")
    print(neural_network.synaptic_weights)
    # Test the neural network with a new situation.
    print("Considering new situation [1, 0, 0] -> ?: ")
    print(neural_network.think(array([1, 0, 0])))

# Output:
# Random starting synaptic weights:
# [[-0.16595599]
#  [ 0.44064899]
#  [-0.99977125]]
# New synaptic weights after training:
# [[ 9.67299303]
#  [-0.2078435 ]
#  [-4.62963669]]
# Considering new situation [1, 0, 0] -> ?:
# [ 0.99993704]