Abstract
Searching for an optimal feature subset in a high-dimensional feature space is known to be an NP-hard problem. We present a hybrid algorithm, SAGA, for this task. SAGA combines the ability of simulated annealing to avoid becoming trapped in local minima, the fast convergence of the genetic-algorithm crossover operator, the strong local search ability of greedy algorithms, and the high computational efficiency of generalized regression neural networks. We compare the performance over time of SAGA and well-known algorithms on synthetic and real datasets. The results show that SAGA outperforms the existing algorithms.
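The abstract describes a hybrid search: a population of candidate feature subsets evolved with GA crossover, refined by greedy local search, and filtered by a simulated-annealing acceptance rule, with a generalized regression neural network used to score subsets. The sketch below is only an illustration of that general hybrid structure under stated assumptions, not the authors' SAGA implementation: the `fitness` function, the uniform crossover, the single-flip mutation and greedy moves, and all parameter values are hypothetical placeholders (in particular, the real algorithm evaluates subsets with a GRNN rather than a toy scoring rule).

```python
import math
import random

# Hypothetical stand-in for the paper's GRNN-based evaluation: fitness is the
# number of "useful" features selected minus a small penalty for subset size.
USEFUL = {1, 3, 5, 8, 13}

def fitness(subset):
    return len(subset & USEFUL) - 0.1 * len(subset)

def crossover(a, b, n_features):
    # GA-style uniform crossover of two feature subsets.
    return {f for f in range(n_features) if f in (a if random.random() < 0.5 else b)}

def mutate(subset, n_features):
    # Neighbour move: flip membership of one random feature.
    return subset ^ {random.randrange(n_features)}

def greedy_improve(subset, n_features):
    # Greedy local search: accept any single-feature flip that improves fitness.
    improved = True
    while improved:
        improved = False
        for f in range(n_features):
            candidate = subset ^ {f}
            if fitness(candidate) > fitness(subset):
                subset, improved = candidate, True
    return subset

def hybrid_search(n_features=20, pop_size=10, iters=200, temp=1.0, cooling=0.98):
    population = [set(random.sample(range(n_features), n_features // 2))
                  for _ in range(pop_size)]
    best = max(population, key=fitness)
    for _ in range(iters):
        a, b = random.sample(population, 2)
        child = greedy_improve(mutate(crossover(a, b, n_features), n_features), n_features)
        worst = min(population, key=fitness)
        delta = fitness(child) - fitness(worst)
        # Simulated-annealing acceptance: occasionally keep a worse child, with a
        # probability that shrinks as the temperature cools, to escape local minima.
        if delta > 0 or random.random() < math.exp(delta / max(temp, 1e-9)):
            population.remove(worst)
            population.append(child)
        temp *= cooling
        best = max(best, max(population, key=fitness), key=fitness)
    return best

if __name__ == "__main__":
    print(sorted(hybrid_search()))
```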
| Original language | English |
| --- | --- |
| Pages (from-to) | 5-13 |
| Number of pages | 9 |
| Journal | Pattern Recognition |
| Volume | 43 |
| Issue number | 1 |
| Early online date | 24 Jun 2009 |
| DOIs | |
| Publication status | Published - Jan 2010 |
| Externally published | Yes |