
    using knn with cross validation


      I don't understand why cross validation would enhance the performance of KNN. I understand this for other models, but not for KNN. In every case KNN just compares a new point against the saved training points, so no matter how many folds I use, cross_val_score will not improve anything related to the generalization of the model. The only thing that would improve KNN results is choosing K, for example, and choosing K can be done with knn.fit normally.

      In other words, there is no cost function here to optimize or generalize using the concept of cross validation.

        • 1. Re: using knn with cross validation

          Hi Rahman,


          Thanks for your question.


          Cross validation can be used for various purposes. As far as your question is concerned, you could use it to choose the optimal value of k for KNN. For instance, you can run K-fold cross validation for various values of k. For each value of k, compute the average cross validation error across the K folds, and then select the value of k with the lowest average error.
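
          For instance, here is a rough sketch of what that could look like with scikit-learn's cross_val_score (the dataset and the range of k values below are just placeholders, not part of your setup):

          # Sketch: choosing k for KNN via 5-fold cross validation (scikit-learn).
          # X and y here come from a placeholder dataset; substitute your own data.
          from sklearn.datasets import load_iris
          from sklearn.model_selection import cross_val_score
          from sklearn.neighbors import KNeighborsClassifier

          X, y = load_iris(return_X_y=True)

          best_k, best_score = None, -1.0
          for k in range(1, 21):
              knn = KNeighborsClassifier(n_neighbors=k)
              score = cross_val_score(knn, X, y, cv=5).mean()  # mean accuracy over the 5 folds
              if score > best_score:
                  best_k, best_score = k, score

          print("best k =", best_k, "with mean CV accuracy =", round(best_score, 3))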


          Basically, you use cross validation to avoid overfitting: you train the model on the training folds, compute its performance on the held-out validation fold, and repeat this K times, once per fold. The results are then averaged.
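
          If it helps, this is roughly what happens under the hood for one setting of k (again only a sketch, with a placeholder dataset): the data is split into K folds, the model is fit on K-1 of them, scored on the remaining fold, and the K scores are averaged.

          from sklearn.datasets import load_iris
          from sklearn.model_selection import KFold
          from sklearn.neighbors import KNeighborsClassifier
          import numpy as np

          X, y = load_iris(return_X_y=True)  # placeholder data
          kf = KFold(n_splits=5, shuffle=True, random_state=0)

          scores = []
          for train_idx, val_idx in kf.split(X):
              knn = KNeighborsClassifier(n_neighbors=5)
              knn.fit(X[train_idx], y[train_idx])               # "training" = storing the training fold
              scores.append(knn.score(X[val_idx], y[val_idx]))  # accuracy on the held-out fold

          print("mean CV accuracy:", np.mean(scores))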


          Hope this helps.


          Thanks & Regards,