\section*{Classification Example}
$k$ nearest neighbors (kNN) is one of the simplest learning strategies: given a new, unknown observation, look up which observations in your reference database have the closest features and assign the predominant class among them.
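To make the ``look up'' step concrete, here is a minimal NumPy sketch of the 1-nearest-neighbor rule (a hand-rolled illustration, not scikit-learn's actual implementation):
\begin{framed}
\begin{verbatim}
import numpy as np

def predict_1nn(X_train, y_train, x_new):
    # distance from the new point to every reference point
    distances = np.linalg.norm(X_train - np.asarray(x_new), axis=1)
    # the label of the single closest reference point
    return y_train[np.argmin(distances)]
\end{verbatim}
\end{framed}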
Let's try it out on our iris classification problem:
\begin{framed}
\begin{verbatim}
from sklearn import neighbors, datasets

iris = datasets.load_iris()
X, y = iris.data, iris.target

# create the model
knn = neighbors.KNeighborsClassifier(n_neighbors=1)
\end{verbatim}
\end{framed}
\begin{framed}
\begin{verbatim}
# fit the model
knn.fit(X, y)
\end{verbatim}
\end{framed}
\begin{framed}
\begin{verbatim}
# What kind of iris has a 3cm x 5cm sepal and a 4cm x 2cm petal?
# Call the "predict" method:
result = knn.predict([[3, 5, 4, 2]])
print(iris.target_names[result])
# Output: ['virginica']
\end{verbatim}
\end{framed}
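Most scikit-learn classifiers, including KNeighborsClassifier, also expose a predict\_proba method that reports class probabilities. With n\_neighbors=1 these are always 0 or 1, so the sketch below uses 5 neighbors (an arbitrary choice on my part) to show a graded vote:
\begin{framed}
\begin{verbatim}
# with k > 1 neighbors, predict_proba returns the fraction
# of neighbors voting for each class
knn5 = neighbors.KNeighborsClassifier(n_neighbors=5)
knn5.fit(X, y)
print(knn5.predict_proba([[3, 5, 4, 2]]))
\end{verbatim}
\end{framed}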
The helper below plots the fitted kNN classifier on the iris data:
\begin{framed}
\begin{verbatim}
# plot_iris_knn is a local helper module (not part of scikit-learn)
from fig_code import plot_iris_knn
plot_iris_knn()
\end{verbatim}
\end{framed}
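If the fig\_code helper is not available, a rough equivalent can be built directly with matplotlib. The sketch below (the variable names and styling are my own choices) trains on the first two features only, so the decision regions can be drawn in 2-D:
\begin{framed}
\begin{verbatim}
import numpy as np
import matplotlib.pyplot as plt

# use only sepal length and width so we can plot in 2-D
X2 = iris.data[:, :2]
knn2 = neighbors.KNeighborsClassifier(n_neighbors=1)
knn2.fit(X2, y)

# evaluate the classifier on a grid covering the data
xx, yy = np.meshgrid(
    np.linspace(X2[:, 0].min() - 1, X2[:, 0].max() + 1, 200),
    np.linspace(X2[:, 1].min() - 1, X2[:, 1].max() + 1, 200))
Z = knn2.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)

plt.pcolormesh(xx, yy, Z, alpha=0.3)
plt.scatter(X2[:, 0], X2[:, 1], c=y, edgecolor='k')
plt.xlabel(iris.feature_names[0])
plt.ylabel(iris.feature_names[1])
plt.show()
\end{verbatim}
\end{framed}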
Exercise: now use sklearn.svm.SVC as the estimator on the same problem.
Note that you don't have to know what it is to use it.
If you finish early, try producing the same plot as above for this classifier.
\begin{framed}
\begin{verbatim}
from sklearn.svm import SVC

model = SVC()
model.fit(X, y)
result = model.predict([[3, 5, 4, 2]])
print(iris.target_names[result])
# Output: ['virginica']
\end{verbatim}
\end{framed}
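The exercise works without any knowledge of SVC's internals because scikit-learn estimators share a uniform interface: every classifier is driven through the same fit/predict pair. A small sketch making the swap explicit:
\begin{framed}
\begin{verbatim}
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# any estimator can be dropped into the same fit/predict loop
for model in [KNeighborsClassifier(n_neighbors=1), SVC()]:
    model.fit(X, y)
    result = model.predict([[3, 5, 4, 2]])
    print(model.__class__.__name__, iris.target_names[result])
\end{verbatim}
\end{framed}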
%===================================================================== %
%Regression Example