Blog: Neural Networks are simple, almost dumb
Neural networks are all about finding patterns, or in other words, weights.
Weights are the patterns.
Suppose we get two inputs, and to get the right result we need to ignore the second input; so our weight for the second input will be 0. And suppose our first input directly affects the chances of getting A; so our weight for the first input will be 1.
Adding these weighted inputs gives us the chances of getting A:
w1 * i1 + w2 * i2 + b = (chances that it's A)
b here is bias, and it is what it sounds like. (Bias says: I think the prediction is off by a little bit. No worries, why take the trouble of losing weight and making your neighbor gain weight? From now on I'll do that for you.)
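The weighted sum above can be sketched in a few lines of Python. The weights (w1 = 1, w2 = 0) come from the example; the input values and the zero bias are made up for illustration:

```python
# Weighted sum with bias: w1 * i1 + w2 * i2 + b.
# w1=1, w2=0 matches the example in the text; the inputs and b=0 are assumed.
def score(i1, i2, w1=1.0, w2=0.0, b=0.0):
    # the second input is ignored because its weight is 0
    return w1 * i1 + w2 * i2 + b

print(score(0.7, 0.3))  # -> 0.7, only the first input matters
```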
Simple! But there's a slight problem: we want our model's output on a uniform scale, so we normalize the value. (Normalization methods vary depending on the case; a sigmoid like the logistic function won't be terrible. The standard sigmoid is a special case of the logistic function.)
Logistic functions are used to articulate growth (things grow fast and slow down after a case-specific point), which is quite similar to what we want.
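That S-shaped squashing is easy to see in code. Here's a minimal sketch of the standard logistic (sigmoid) function, which maps any real score into (0, 1):

```python
import math

# Standard logistic (sigmoid) function: fast growth near 0,
# flattening out toward 0 and 1 at the extremes.
def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

print(logistic(0))    # -> 0.5, right in the middle
print(logistic(10))   # very close to 1
print(logistic(-10))  # very close to 0
```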
A neural network is a jungle of weights (and biases), made to adapt toward maximum accuracy through repeated training on a wide variety of data.
You can just store these weights and the network structure, and you have an ML model.
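To make that concrete, a "stored model" really can be this small. A hypothetical sketch, where the structure and weight values are made up for illustration:

```python
import json

# A toy "model": just structure (layer sizes), weights, and biases.
model = {"layers": [2, 1], "weights": [[1.0, 0.0]], "biases": [0.0]}

# Saving and loading is just serializing that data.
saved = json.dumps(model)
restored = json.loads(saved)
print(restored == model)  # -> True, nothing was lost
```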
but how does a neural network find these weights (patterns)?
Report cards!! (cost function)
We find which values of the weights and biases most efficiently minimize the cost function for that output, and average that adjustment over all the other training examples.
This is backpropagation, also called the backward pass.
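A toy version of "minimizing the cost function" can be shown with gradient descent on a single weight. The data, the learning rate, and the squared-error cost here are all assumptions for illustration, not anything from the post:

```python
# Cost: mean squared error between prediction w*x and target y.
def cost(w, data):
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

# Gradient of the cost with respect to w (the "which way to nudge w" signal).
def grad(w, data):
    return sum(2 * (w * x - y) * x for x, y in data) / len(data)

data = [(1.0, 2.0), (2.0, 4.0)]  # toy data where the ideal weight is 2
w = 0.0
for _ in range(100):
    w -= 0.1 * grad(w, data)  # repeatedly nudge w downhill on the cost

print(round(w, 3))  # -> 2.0, the weight that minimizes the cost
```

Real backpropagation does the same downhill nudging, just for every weight and bias in the jungle at once.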