Commit
Update SupportVectorMachinesQuiz.md
mGalarnyk authored Dec 6, 2020
1 parent 22c6843 commit 6a4f576
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions Stanford_Machine_Learning/Week7/SupportVectorMachinesQuiz.md
@@ -58,6 +58,6 @@ Question 5
True/False | Answer | Explanation
--- | --- | ---
none yet | If the data are linearly separable, an SVM using a linear kernel will return the same parameters Θ regardless of the chosen value of C (i.e., the resulting value of Θ does not depend on C) | none yet
none yet | The maximum value of the Gaussian kernel (i.e., sim(x,l<sup>(1)</sup>)) is 1. | none yet
none yet | Suppose you had 2D input examples (i.e. x<sup>(i)</sup> ∈ R<sup>2</sup>). The decision boundary of the SVM (with the linear kernel) is a straight line. | none yet
True | The maximum value of the Gaussian kernel (i.e., sim(x,l<sup>(1)</sup>)) is 1. | When x = l<sup>(1)</sup>, the Gaussian kernel has value exp(0) = 1, and it is less than 1 otherwise.
True | Suppose you had 2D input examples (i.e. x<sup>(i)</sup> ∈ R<sup>2</sup>). The decision boundary of the SVM (with the linear kernel) is a straight line. | The SVM without any kernel (i.e., the linear kernel) predicts output based only on Θ<sup>T</sup>x, so it gives a linear/straight-line decision boundary, just as logistic regression does.
none yet | If you are training multi-class SVMs with the one-vs-all method, it is not possible to use a kernel | none yet
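
The Gaussian-kernel claim in the table can be checked numerically. A minimal sketch (the function name, landmark values, and σ are illustrative, not part of the quiz):

```python
import numpy as np

def gaussian_kernel(x, l, sigma=1.0):
    # sim(x, l) = exp(-||x - l||^2 / (2 * sigma^2))
    return np.exp(-np.sum((x - l) ** 2) / (2 * sigma ** 2))

x = np.array([1.0, 2.0])

# When x equals the landmark, the exponent is 0, so sim = exp(0) = 1.
print(gaussian_kernel(x, np.array([1.0, 2.0])))  # 1.0

# For any other landmark the exponent is negative, so sim < 1.
print(gaussian_kernel(x, np.array([3.0, 4.0])) < 1.0)  # True
```

Since the squared distance is nonnegative, the exponent is at most 0, which is why 1 is the maximum value of the kernel.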
