A Nested Heuristic for Parameter Tuning in Support Vector Machines

Emilio Carrizosa, Belén Martín-Barragán, Dolores Romero Morales

Research output: Contribution to journal › Journal article › Research › peer-review

Abstract

The default approach for tuning the parameters of a Support Vector Machine (SVM) is a grid search in the parameter space. Different metaheuristics have recently been proposed as a more efficient alternative, but they have only been shown to be useful in models with a small number of parameters. Complex models, involving many parameters, can be seen as extensions of simpler, easy-to-tune models, yielding a nested sequence of models of increasing complexity. In this paper we propose an algorithm that successfully exploits this nested property, with two main advantages over the state of the art. First, our framework is general enough to address, with the very same method, several popular SVM parameter models encountered in the literature. Second, the only algorithmic requirement is either an SVM library or any routine for the minimization of convex quadratic functions under linear constraints. In the computational study, we address Multiple Kernel Learning tuning problems for which grid search would clearly be infeasible, while our classification accuracy is comparable to that of ad hoc, model-dependent benchmark tuning methods.
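For a concrete picture of the grid-search baseline the abstract refers to, the following minimal Python sketch cross-validates an RBF-kernel SVM over a two-parameter grid. The library (scikit-learn), the synthetic data, and the parameter ranges are illustrative assumptions, not choices made in the paper. Even this small model needs one cross-validated fit per grid point, which is why grids become infeasible for many-parameter models such as Multiple Kernel Learning.

    # Grid-search baseline for SVM parameter tuning (illustrative sketch;
    # scikit-learn and the parameter ranges are assumptions, not the paper's).
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import GridSearchCV
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=200, n_features=10, random_state=0)

    param_grid = {
        "C": np.logspace(-2, 2, 5),      # regularization strength
        "gamma": np.logspace(-3, 1, 5),  # RBF kernel width
    }
    # 5 x 5 grid x 5 folds = 125 SVM trainings for just two parameters.
    search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
    search.fit(X, y)
    print(search.best_params_, search.best_score_)

The nested idea can be sketched in the same spirit. The sketch below is an illustration under stated assumptions, not the authors' exact algorithm: tune a simple one-parameter model first, then use its solution to seed a variable-neighborhood-style local search over a richer model that contains it, here an anisotropic RBF kernel with one width per feature.

    # Hedged sketch of nested tuning with a VNS-style loop (illustrative only;
    # the hypothetical per-feature-width model stands in for the paper's
    # general nested model sequences).
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X, y = make_classification(n_samples=200, n_features=10, random_state=0)

    def cv_accuracy(gammas):
        # Per-feature RBF widths via feature scaling: with gamma=1 after
        # scaling by sqrt(gammas), the kernel equals the anisotropic RBF
        # exp(-sum_j gammas_j * (x_j - x'_j)^2).
        return cross_val_score(SVC(kernel="rbf", gamma=1.0),
                               X * np.sqrt(gammas), y, cv=5).mean()

    # Level 1: simple nested model -- one common width, tuned by a coarse grid.
    best_g, best_acc = max(((g, cv_accuracy(np.full(10, g)))
                            for g in np.logspace(-3, 1, 5)),
                           key=lambda t: t[1])

    # Level 2: richer model -- per-feature widths, seeded at the level-1
    # solution and explored with growing random neighborhoods (basic VNS).
    gammas, k = np.full(10, best_g), 1
    while k <= 5:
        trial = gammas * np.exp(rng.normal(scale=0.3 * k, size=10))
        acc = cv_accuracy(trial)
        if acc > best_acc:
            gammas, best_acc, k = trial, acc, 1  # improved: smallest neighborhood
        else:
            k += 1                               # no improvement: enlarge it
    print(best_acc)

Seeding the richer model at the simpler model's optimum is what makes the search tractable: the local search starts from a point already known to classify well, rather than from an arbitrary corner of a high-dimensional parameter space.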
Language: English
Journal: Computers & Operations Research
Volume: 43
Pages: 328–334
ISSN: 0305-0548
DOI: 10.1016/j.cor.2013.10.002
Publisher: Pergamon Press
State: Published - 2014
Externally published: Yes

Keywords

Supervised classification, Support Vector Machines, Parameter tuning, Nested heuristic, Variable neighborhood search, Multiple kernel learning

    Cite this

    Carrizosa, E., Martín-Barragán, B., & Romero Morales, D. (2014). A Nested Heuristic for Parameter Tuning in Support Vector Machines. Computers & Operations Research, 43, 328–334. https://doi.org/10.1016/j.cor.2013.10.002
