Machine Learning: the KNN Algorithm (Super Simple)
2021/8/1 14:06:00
1. What is the KNN algorithm?
KNN (k-nearest neighbors) is arguably the simplest machine learning algorithm, and the one that requires the least mathematical background.
Algorithm steps:
- Given the training data, do nothing with it; just store it.
- Given a new sample, compute the Euclidean distance (see the formula below) from it to every training sample, then sort the distances in ascending order.
- Take the K smallest distances and count which label appears most often among those K nearest neighbors.
- Assign the new sample to that majority label.
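The distance used in step 2 is the ordinary Euclidean distance. For two samples with n features, x = (x_1, ..., x_n) and y = (y_1, ..., y_n), it is

$$
d(x, y) = \sqrt{\sum_{i=1}^{n} (x_i - y_i)^2}
$$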
The full implementation, together with a small demo on ten hand-made 2D points:

```python
import numpy as np
from math import sqrt
from collections import Counter
import matplotlib.pyplot as plt


class KNNClassifier:

    def __init__(self, k):
        """Initialize the kNN classifier"""
        assert k >= 1, "k must be valid"
        self.k = k
        self._X_train = None
        self._y_train = None

    def fit(self, X_train, y_train):
        """Train the kNN classifier on the training set X_train, y_train"""
        assert X_train.shape[0] == y_train.shape[0], \
            "the size of X_train must be equal to the size of y_train"
        assert self.k <= X_train.shape[0], \
            "the size of X_train must be at least k."
        self._X_train = X_train
        self._y_train = y_train
        return self

    def predict(self, X_predict):
        """Given a data set X_predict, return the vector of predicted labels"""
        assert self._X_train is not None and self._y_train is not None, \
            "must fit before predict!"
        assert X_predict.shape[1] == self._X_train.shape[1], \
            "the feature number of X_predict must be equal to X_train"
        y_predict = [self._predict(x) for x in X_predict]
        return np.array(y_predict)

    def _predict(self, x):
        """Given a single sample x, return its predicted label"""
        assert x.shape[0] == self._X_train.shape[1], \
            "the feature number of x must be equal to X_train"
        # Euclidean distance from x to every training sample
        distances = [sqrt(np.sum((x_train - x) ** 2)) for x_train in self._X_train]
        # Indices of training samples sorted by distance (ascending)
        nearest = np.argsort(distances)
        # Labels of the k nearest neighbors
        topK_y = [self._y_train[i] for i in nearest[:self.k]]
        # Majority vote among the k neighbors
        votes = Counter(topK_y)
        return votes.most_common(1)[0][0]

    def __repr__(self):
        return "KNN(k=%d)" % self.k


raw_data_X = [[3.393533211, 2.331273381],
              [3.110073483, 1.781539638],
              [1.343808831, 3.368360954],
              [3.582294042, 4.679179110],
              [2.280362439, 2.866990263],
              [7.423436942, 4.696522875],
              [5.745051997, 3.533989803],
              [9.172168622, 2.511101045],
              [7.792783481, 3.424088941],
              [7.939820817, 0.791637231]]
raw_data_y = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]

X_train = np.array(raw_data_X)
y_train = np.array(raw_data_y)

# New sample to classify
x = np.array([8.093607318, 3.365731514])
X_predict = x.reshape(1, -1)

knn_clf = KNNClassifier(3)
knn_clf.fit(X_train, y_train)
y_predict = knn_clf.predict(X_predict)
print(y_predict[0])

# Plot the two classes and the new sample
plt.scatter(X_train[y_train == 0, 0], X_train[y_train == 0, 1])
plt.scatter(X_train[y_train == 1, 0], X_train[y_train == 1, 1])
plt.scatter(X_predict[0][0], X_predict[0][1])
plt.show()
```
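As a cross-check (not part of the original article), the same prediction can be reproduced with scikit-learn's KNeighborsClassifier, assuming scikit-learn is installed. This is only a sketch using the same ten training points and the same k = 3:

```python
# Sketch: verifying the result with scikit-learn (assumes scikit-learn is installed)
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

X_train = np.array([[3.393533211, 2.331273381],
                    [3.110073483, 1.781539638],
                    [1.343808831, 3.368360954],
                    [3.582294042, 4.679179110],
                    [2.280362439, 2.866990263],
                    [7.423436942, 4.696522875],
                    [5.745051997, 3.533989803],
                    [9.172168622, 2.511101045],
                    [7.792783481, 3.424088941],
                    [7.939820817, 0.791637231]])
y_train = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])

sk_knn = KNeighborsClassifier(n_neighbors=3)  # same k as the hand-rolled classifier
sk_knn.fit(X_train, y_train)
print(sk_knn.predict([[8.093607318, 3.365731514]])[0])
```

Since the three training points closest to the new sample all carry label 1, both the hand-rolled classifier and the scikit-learn version should print 1.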