I have a table with two classes and binary data. D1 is the expected output, and I have to perform AdaBoost to combine the classifiers in order to get the best result. (For example, in row 1, BN(N3) and NN(N2) give 1 when the expected output should be 0, so they give the wrong answer.)
link to the example data table
So far my approach is as follows: I calculate the probability of each classifier giving a wrong answer, e.g. P(KNN | 0) = 2/4, P(BN(N1) | 0) = 1/4, and so on. Since SVM(N1) and SVM(N2) are already perfect classifiers, I have excluded them and work with the rest. Then I initialize equal weights of 1/6 for each data point (since I have excluded SVM(N1) and SVM(N2)) and start with the classifier that gives the smallest probability of error on the data set (in this case I start with BN(N1), which gives 1/4, taking order of appearance). Then, for the given number of rounds, I update the weights and calculate alpha.
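For reference, here is a minimal sketch of a single AdaBoost round in the standard {-1, +1} formulation: weighted error, alpha, and the weight update. The arrays below are made-up toy values, not the actual table from the question.

```python
import numpy as np

# Toy sketch of one AdaBoost round, assuming labels in {-1, +1}.
# `y` stands in for the expected output (D1) and `preds` for the predictions
# of the classifier chosen this round; both are placeholder values.
y     = np.array([-1,  1, -1,  1])     # expected output, recoded from {0, 1} to {-1, +1}
preds = np.array([-1,  1,  1,  1])     # predictions of the chosen classifier
w     = np.full(len(y), 1.0 / len(y))  # equal initial weights over the data points

# Weighted error of the chosen classifier under the current weights
err = np.sum(w[preds != y])

# Classifier weight (alpha); larger when the weighted error is smaller
alpha = 0.5 * np.log((1.0 - err) / err)

# Increase the weights of misclassified points, decrease the rest, then renormalize
w = w * np.exp(-alpha * y * preds)
w = w / np.sum(w)

print(alpha, w)
```

Note that in this textbook formulation the weights are initialized uniformly over the data points (1/N for N rows), independently of how many classifiers remain after excluding the perfect ones.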
Am I taking the correct approach?