
Introduction to the Code

April 25th, 2018

This is the simple code I started with. The neural network runs through 10,000 training iterations and then tries to output the number one for a new input pattern. Here the output is very close, at 0.99993704.

from numpy import exp, array, random, dot

# Training set: four 3-bit input patterns and the output each one should map to
training_set_inputs = array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
training_set_outputs = array([[0, 1, 1, 0]]).T

random.seed(1)
# Start with random weights between -1 and 1, seeded so the run is repeatable
synaptic_weights = 2 * random.random((3, 1)) - 1

for iteration in xrange(10000):
    # Forward pass: sigmoid of the weighted sum of the inputs
    output = 1 / (1 + exp(-(dot(training_set_inputs, synaptic_weights))))
    # Adjust the weights by the error, scaled by the sigmoid gradient output * (1 - output)
    synaptic_weights += dot(training_set_inputs.T, (training_set_outputs - output) * output * (1 - output))

# Ask the trained network about a new input, [1, 0, 0]
print 1 / (1 + exp(-(dot(array([1, 0, 0]), synaptic_weights))))
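The code above is written for Python 2 (xrange and the old print statement). As a rough sketch, the same network in Python 3 looks like this; only the syntax changes, the math is identical:

import numpy as np

training_set_inputs = np.array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
training_set_outputs = np.array([[0, 1, 1, 0]]).T

np.random.seed(1)
synaptic_weights = 2 * np.random.random((3, 1)) - 1

for iteration in range(10000):
    output = 1 / (1 + np.exp(-np.dot(training_set_inputs, synaptic_weights)))
    synaptic_weights += np.dot(training_set_inputs.T,
                               (training_set_outputs - output) * output * (1 - output))

# Should print a value close to 1 for the new input [1, 0, 0]
print(1 / (1 + np.exp(-np.dot(np.array([1, 0, 0]), synaptic_weights))))

Because the single neuron uses a sigmoid activation, its output is always squeezed between 0 and 1, so it can approach 1 but never reach it exactly.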

 

 
