Sunday, 8 January 2017

Improving Localized Multiple Kernel Learning via Radius-Margin Bound

Localized multiple kernel learning (LMKL) is an effective multiple kernel learning (MKL) method. It learns the optimal kernel from a set of predefined base kernels by directly applying the maximum-margin principle embodied in the support vector machine (SVM). However, LMKL does not consider the radius of the minimum enclosing ball (MEB), which, together with the separating margin, determines the error bound of the SVM. In this paper, we propose an improved version of LMKL, named ILMKL. The proposed method explicitly takes both the margin and the radius into consideration and thus achieves better performance than its counterpart. Moreover, it automatically tunes the regularization parameter while learning the optimal kernel, avoiding the time-consuming cross-validation otherwise needed to choose this parameter. Comprehensive experiments demonstrate the effectiveness and efficiency of the proposed method.
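For context, the following is a minimal sketch of the standard formulations this line of work builds on: the classical Vapnik radius-margin bound and the per-instance gated kernel typically used in LMKL. These are the usual textbook forms, not necessarily the exact objective optimized by ILMKL.

\[
\mathbb{E}\big[\mathrm{err}_{\mathrm{LOO}}\big] \;\le\; \frac{1}{n}\,\frac{R^{2}}{\gamma^{2}} \;=\; \frac{1}{n}\,R^{2}\,\lVert w\rVert^{2},
\qquad
k_{\eta}(x_i, x_j) \;=\; \sum_{m=1}^{M} \eta_m(x_i)\,\eta_m(x_j)\,k_m(x_i, x_j),
\]

where \(R\) is the radius of the MEB of the training data in the induced feature space, \(\gamma = 1/\lVert w\rVert\) is the margin, and \(\eta_m(\cdot)\) are the gating functions weighting the \(M\) base kernels \(k_m\). Because \(R\) also depends on the learned gating, maximizing the margin alone can be misleading; jointly controlling \(R^{2}\lVert w\rVert^{2}\) is precisely what a radius-margin formulation adds.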

