Datasets in sklearn.datasets

``````
>>> from sklearn.datasets import load_digits
>>> digits = load_digits()
>>> digits.data
array([[ 0.,  0.,  5., ...,  0.,  0.,  0.],
       [ 0.,  0.,  0., ..., 10.,  0.,  0.],
       [ 0.,  0.,  0., ..., 16.,  9.,  0.],
       ...,
       [ 0.,  0.,  1., ...,  6.,  0.,  0.],
       [ 0.,  0.,  2., ..., 12.,  0.,  0.],
       [ 0.,  0., 10., ..., 12.,  1.,  0.]])
>>> digits.target
array([0, 1, 2, ..., 8, 9, 8])
>>> digits.DESCR
".. _digits_dataset:\n\nOptical recognition of handwritten digits dataset\n--------------------------------------------------\n\n**Data Set Characteristics:**\n\n    :Number of Instances: 1797\n    :Number of Attributes: 64\n    :Attribute Information: 8x8 image of integer pixels in the range 0..16....(omitted)
>>>
``````
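The shapes of these attributes make the layout clear: each row of `data` is one 8x8 image flattened into 64 pixel values, and `images` holds the same pixels already shaped as 8x8 arrays. A quick check:

```python
from sklearn.datasets import load_digits

digits = load_digits()

# Each row of data is one flattened 8x8 image (64 pixel values)
print(digits.data.shape)    # (1797, 64)
# One integer label (0-9) per image
print(digits.target.shape)  # (1797,)
# The same pixels, already shaped as 8x8 arrays
print(digits.images.shape)  # (1797, 8, 8)
```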

``````
from sklearn.datasets import load_digits
import matplotlib.pyplot as plt

digits = load_digits()

plt.gray()

# Show the first ten digits as 8x8 images in a 2x5 grid
for i in range(10):
    plt.subplot(2, 5, i + 1)
    plt.imshow(digits.data[i].reshape((8, 8)))

plt.show()
``````

``````
import numpy as np
import matplotlib.pyplot as plt
from matplotlib import cm

from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

digits = load_digits()

imgs_training_data, img_test_data, lb_training_data, lb_test_data = train_test_split(
    digits.data, digits.target, stratify=digits.target, random_state=1
)

mlp = MLPClassifier()  # the defaults are fine here; see the docs for their values
mlp.fit(imgs_training_data, lb_training_data)

# Evaluate on the test set
plt.text(0, 8.5,
         "Score: " + str(mlp.score(img_test_data, lb_test_data)))

# Try one of the test images
plt.imshow(img_test_data[0].reshape((8, 8)), cmap=cm.gray)
# Predicted value
plt.text(5, 8.5,
         "Predict: " + str(mlp.predict([img_test_data[0]])))

plt.show()
``````
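The single accuracy score hides which digits the model confuses with which. A sketch of a per-class check using `sklearn.metrics.confusion_matrix` (an addition to the walk-through above, not part of the original):

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import confusion_matrix

digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, stratify=digits.target, random_state=1
)

mlp = MLPClassifier(random_state=1).fit(X_train, y_train)

# Rows are the true digits, columns the predictions;
# a strong diagonal means few misclassifications.
cm = confusion_matrix(y_test, mlp.predict(X_test))
print(cm.shape)  # (10, 10)
print(cm)
```

The diagonal entries count correct predictions per digit, so `cm.trace()` equals the number of correctly classified test images.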

The `load_iris` function in `sklearn.datasets` loads the iris flower dataset. For example:

``````
from sklearn.datasets import load_iris

iris = load_iris()

print('Sepal length/sepal width/petal length/petal width:\n', iris.data)
print('Species names:\n', iris.target_names)
print('Species labels:\n', iris.target)
``````

``````
Sepal length/sepal width/petal length/petal width:
 [[5.1 3.5 1.4 0.2]
 [4.9 3.  1.4 0.2]
 [4.7 3.2 1.3 0.2]
 [4.6 3.1 1.5 0.2]
 [5.  3.6 1.4 0.2]
 ...(omitted)
 [6.2 3.4 5.4 2.3]
 [5.9 3.  5.1 1.8]]

Species names:
 ['setosa' 'versicolor' 'virginica']

Species labels:
 [0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 2 2 2 2 2 2 2 2 2 2 2
 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
 2 2]
``````
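The integer labels index directly into `target_names`, so the whole label array can be mapped back to species names with one NumPy indexing expression:

```python
from sklearn.datasets import load_iris

iris = load_iris()

# Fancy indexing: target_names[i] for every label i in target
names = iris.target_names[iris.target]
print(names[:3])        # ['setosa' 'setosa' 'setosa']

# 150 samples, 4 features each (the sepal/petal measurements above)
print(iris.data.shape)  # (150, 4)
```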