Logical Computation With Neurons
* It has one or more binary inputs and one binary output.
* The output activates when a certain number of inputs are active.
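This kind of artificial neuron (a McCulloch-Pitts unit) can be sketched in a few lines; the gate names and thresholds below are illustrative choices, not from the original notes:

```python
def mcp_neuron(inputs, threshold):
    """McCulloch-Pitts neuron: output 1 when at least
    `threshold` of the binary inputs are active, else 0."""
    return 1 if sum(inputs) >= threshold else 0

# AND fires only when both inputs are active (threshold 2);
# OR fires when at least one input is active (threshold 1).
AND = lambda a, b: mcp_neuron([a, b], threshold=2)
OR = lambda a, b: mcp_neuron([a, b], threshold=1)
```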
Perceptron
* Threshold logic unit (TLU)
* Applies a weight to each input, computes the weighted sum, then passes the result through a step function.
* The step function can be the Heaviside step function or the sign function.
* To solve the XOR problem, a multi-layer perceptron (MLP) is needed.
* Called a feedforward neural network because the signal flows in one direction only:
* Input layer -> hidden layer(s) -> output layer
* When there are many hidden layers, it is called a deep neural network; deep learning is the study of these networks.
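A minimal sketch of TLUs with a Heaviside step function, and a hand-wired two-layer MLP that solves XOR (the specific weights below — an OR unit and a NAND unit feeding an AND unit — are one possible choice, not unique):

```python
import numpy as np

def heaviside(z):
    """Step function: 1 where z >= 0, else 0."""
    return (z >= 0).astype(int)

def tlu_layer(x, W, b):
    """A layer of threshold logic units: weighted sum, then step function."""
    return heaviside(x @ W + b)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])

# Hidden layer: first unit is OR, second is NAND.
W_hidden = np.array([[1, -1],
                     [1, -1]])
b_hidden = np.array([-1, 1.5])
# Output layer: AND of the two hidden units -> XOR.
W_out = np.array([[1], [1]])
b_out = np.array([-2])

hidden = tlu_layer(X, W_hidden, b_hidden)
xor = tlu_layer(hidden, W_out, b_out).ravel()  # [0, 1, 1, 0]
```

A single TLU cannot produce this output, since XOR is not linearly separable; stacking two layers is what makes it possible.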
Backpropagation
* An algorithm for training multi-layer perceptrons.
* Work through the instances -> compute the loss -> go backwards through the network to measure each connection's contribution to the error -> tweak the weights to minimise the error.
* It is very important to initialise the weights randomly (otherwise the symmetry between neurons is never broken).
* The activation function must be changed, because the Heaviside and sign functions are flat (their gradient is zero almost everywhere). The original version used the logistic function: sigma(z) = 1 / (1 + exp(-z)).
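The logistic function and its derivative (which is what backpropagation actually uses) can be sketched as:

```python
import numpy as np

def logistic(z):
    """sigma(z) = 1 / (1 + exp(-z)) -- smooth, so it has a usable gradient."""
    return 1.0 / (1.0 + np.exp(-z))

def logistic_deriv(z):
    """sigma'(z) = sigma(z) * (1 - sigma(z)); backprop multiplies this
    into the chain of gradients at each layer."""
    s = logistic(z)
    return s * (1.0 - s)
```

Note that sigma(0) = 0.5 and the derivative peaks at 0.25, unlike a step function whose derivative is zero everywhere it is defined.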
Keras
* A high-level API (wrapper) for TensorFlow.
* See my own code for how it works.
* The flow: create the model -> compile -> train -> evaluate.
* Start with the Sequential API.
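The create -> compile -> train -> evaluate flow can be sketched with the Sequential API; the toy data, layer sizes, and hyperparameters below are illustrative assumptions:

```python
import numpy as np
from tensorflow import keras

# Toy binary-classification data (assumed, for illustration only).
X = np.random.rand(200, 4).astype("float32")
y = (X.sum(axis=1) > 2).astype(int)

# Create the model.
model = keras.Sequential([
    keras.Input(shape=(4,)),
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
# Compile: pick loss, optimizer, and metrics.
model.compile(loss="binary_crossentropy", optimizer="sgd",
              metrics=["accuracy"])
# Train.
model.fit(X, y, epochs=2, verbose=0)
# Evaluate.
loss, acc = model.evaluate(X, y, verbose=0)
```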
Functional API
* The output layer has a direct connection to the input, alongside the stack of hidden layers: a "wide" path next to the "deep" path.
* This enables the network to learn both deep patterns and simple patterns.
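A minimal sketch of this wide-and-deep wiring with the Functional API (input size and layer widths are assumed):

```python
from tensorflow import keras

inputs = keras.Input(shape=(4,))
# Deep path: the usual stack of hidden layers.
hidden1 = keras.layers.Dense(8, activation="relu")(inputs)
hidden2 = keras.layers.Dense(8, activation="relu")(hidden1)
# Wide path: the raw input is concatenated straight into the output,
# bypassing the hidden stack.
concat = keras.layers.Concatenate()([inputs, hidden2])
output = keras.layers.Dense(1)(concat)
model = keras.Model(inputs=inputs, outputs=output)
```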
Subclassing API
* Subclass keras.Model.
* Create your own layers in the constructor and your own forward pass in call().
* Can combine Functional and Sequential styles inside it, but we lose Keras's static analysis advantages (easy inspection, saving, and cloning of the model) because the architecture is hidden in the call() method.
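The same wide-and-deep idea as a subclassed model, a sketch assuming the class name and sizes (the forward pass in call() is now arbitrary Python, which is the flexibility — and the analysis cost — mentioned above):

```python
import tensorflow as tf
from tensorflow import keras

class WideAndDeep(keras.Model):
    """Wide & deep network with full control over the forward pass."""
    def __init__(self, units=8, **kwargs):
        super().__init__(**kwargs)
        self.hidden1 = keras.layers.Dense(units, activation="relu")
        self.hidden2 = keras.layers.Dense(units, activation="relu")
        self.concat = keras.layers.Concatenate()
        self.out = keras.layers.Dense(1)

    def call(self, inputs):
        # Deep path through the hidden stack, then concatenate
        # the raw inputs (wide path) before the output layer.
        h = self.hidden2(self.hidden1(inputs))
        return self.out(self.concat([inputs, h]))

model = WideAndDeep()
y = model(tf.zeros([3, 4]))  # builds the model on first call
```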
Hyperparameter
* Layers: start with one hidden layer, then try up to maybe three. Deeper networks are actually more parameter-efficient: they can model the same functions with fewer parameters than a single wide layer.
* Number of neurons: either a pyramid shape (fewer neurons in each successive layer) or the same number in every layer. The number of layers matters more than the exact neuron counts.
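The parameter-efficiency claim can be checked with a quick count; the layer sizes below are arbitrary examples chosen for illustration:

```python
def dense_params(layer_sizes):
    """Parameter count for a stack of fully connected layers:
    each layer has fan_in * fan_out weights plus fan_out biases."""
    return sum(a * b + b for a, b in zip(layer_sizes, layer_sizes[1:]))

# One wide hidden layer vs. three narrower ones, same input/output sizes.
wide = dense_params([100, 300, 1])         # -> 30601 parameters
deep = dense_params([100, 60, 60, 60, 1])  # -> 13441 parameters
```

The deeper stack here uses well under half the parameters of the single wide layer.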