Instead of outputting a value directly like linear regression, logistic regression turns that value into a probability.
p = logistic(X.theta)
* logistic is the inverse of the logit function.
logistic(t) = 1 / ( 1 + exp(-t) )
y = 0 if p < 0.5, which also means t < 0
y = 1 if p >= 0.5, which also means t >= 0
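A minimal NumPy sketch of this prediction rule (the instances and theta values below are made up for illustration):

import numpy as np

def logistic(t):
    # logistic(t) = 1 / (1 + exp(-t))
    return 1.0 / (1.0 + np.exp(-t))

X = np.array([[1.0, 2.0],
              [1.0, -1.5]])      # made-up instances, first column is the bias term
theta = np.array([0.5, 1.0])     # made-up parameter vector

t = X.dot(theta)                 # t = x.theta for each instance
p = logistic(t)                  # estimated probabilities
y_pred = (p >= 0.5).astype(int)  # y = 1 when p >= 0.5, i.e. when t >= 0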
Its cost function for a single instance is:
c(theta) = -log(p) if y = 1, -log(1-p) if y = 0
* minimizing the cost pushes the estimated probability up for positive instances
* and pushes it down for negative instances
* there is no normal equation to solve it in closed form; a gradient descent approach is needed
* the partial derivative has the same form as in linear regression, with the estimated probability in place of the predicted value (see the sketch below)
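A sketch of the cost and its gradient, assuming X holds one instance per row (with a bias column) and y holds 0/1 labels:

import numpy as np

def logistic(t):
    return 1.0 / (1.0 + np.exp(-t))

def cost(theta, X, y):
    # mean of -log(p) over positive instances and -log(1 - p) over negative ones
    p = logistic(X.dot(theta))
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def gradient(theta, X, y):
    # same form as the linear regression gradient, with the
    # estimated probability p in place of the predicted value
    p = logistic(X.dot(theta))
    return X.T.dot(p - y) / len(y)

# one gradient descent step (eta is a made-up learning rate):
#   theta = theta - eta * gradient(theta, X, y)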
Decision Boundary
* the boundary on which y = 0 and y = 1 are equally likely, i.e. p = 0.5
* LogisticRegression.predict_proba returns two columns: column 0 is the probability of class 0, column 1 is the probability of class 1 (example below)
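An example with scikit-learn, using petal width from the iris dataset to classify Iris virginica (the feature choice and test value are illustrative):

from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

iris = load_iris()
X = iris.data[:, 3:]                # petal width as the single feature
y = (iris.target == 2).astype(int)  # 1 if Iris virginica, else 0

clf = LogisticRegression()
clf.fit(X, y)

proba = clf.predict_proba([[2.0]])  # shape (1, 2)
print(proba)                        # column 0: P(y=0), column 1: P(y=1)
print(clf.predict([[2.0]]))         # class with probability >= 0.5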
Support for Multiple Classes
* use Softmax Regression
* compute a score sk(x) = x.theta(k) for each class k
* each class k has its own parameter vector theta(k); stacking these vectors as rows gives the parameter matrix Theta
* the probability of class k is given by the softmax function: the exponential of sk(x) divided by the sum of the exponentials of all class scores (sketch below)
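A sketch of the softmax computation; the instance and parameter matrix below are made-up values:

import numpy as np

def softmax_proba(X, Theta):
    # Theta holds one parameter vector per class, one per row: shape (K, n)
    scores = X.dot(Theta.T)                                   # sk(x) = x.theta(k) for every class k
    exps = np.exp(scores - scores.max(axis=1, keepdims=True)) # shift for numerical stability
    return exps / exps.sum(axis=1, keepdims=True)

X = np.array([[1.0, 2.0, 0.5]])     # one made-up instance (bias included)
Theta = np.array([[0.1, 0.2, 0.3],  # made-up parameter matrix, K = 3 classes
                  [0.0, -0.5, 1.0],
                  [0.4, 0.1, -0.2]])
print(softmax_proba(X, Theta))      # one row of K probabilities summing to 1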
Cost function of Softmax Regression
* when K = 2, it is equivalent to logistic regression
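The softmax cost is the cross entropy between the target class and the predicted probabilities. A sketch, assuming one-hot targets (eps is a made-up guard against log(0)):

import numpy as np

def cross_entropy(P, Y):
    # P: predicted probabilities, shape (m, K); Y: one-hot targets, shape (m, K)
    eps = 1e-12
    return -np.mean(np.sum(Y * np.log(P + eps), axis=1))

# With K = 2 and one-hot Y, the inner sum for one instance is
#   y * log(p) + (1 - y) * log(1 - p)
# which is exactly the logistic regression cost above.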