The support vector machine (SVM), recently introduced by Boser, Guyon, and Vapnik, is useful for supervised classification in high dimensions. The authors discuss the SVM and its application to high-dimensional hyperspectral data taken from NASA's AVIRIS sensor (224 bands) and from a commercially available sensor called AISA (20-40 bands), built by SPECIM of Finland. Traditionally, classifiers model the density of each class and then find a separating surface. However, density estimation in high dimensions suffers from the Hughes effect, necessitating a feature-selection step to reduce the dimensionality of the data. The SVM approach does not suffer this limitation because it directly seeks a separating surface through an optimization procedure that finds the exemplars forming the boundaries of the classes. These exemplars are called the support vectors. This is significant because usually only a small subset of the training data is involved in defining the separating surface, namely those examples that are closest to it. In addition, the SVM approach uses the kernel method, discussed below, to map the data with a non-linear transformation to a higher-dimensional space and, in that space, attempts to find a linear surface separating the two classes. Why the curse of dimensionality is not a problem for the kernel method is discussed below.
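The ideas above can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes scikit-learn's `SVC` as a stand-in for the SVM, and the "hyperspectral" data is simulated as two Gaussian clouds in a 224-dimensional space (the AVIRIS band count). The RBF kernel plays the role of the non-linear map to a higher-dimensional space, and only a subset of the training exemplars end up as support vectors.

```python
# Sketch only: scikit-learn SVC on simulated 224-band "spectra",
# not the data or code from the paper.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Two classes, each a Gaussian cloud around a different mean spectrum.
n_per_class, n_bands = 100, 224
X = np.vstack([
    rng.normal(0.0, 1.0, (n_per_class, n_bands)),
    rng.normal(0.5, 1.0, (n_per_class, n_bands)),
])
y = np.array([0] * n_per_class + [1] * n_per_class)

# RBF kernel: an implicit non-linear map to a higher-dimensional space,
# where a linear separating surface is sought.
clf = SVC(kernel="rbf", gamma="scale", C=1.0).fit(X, y)

# Only some of the 200 training exemplars define the separating surface.
print("support vectors per class:", clf.n_support_)
print("training accuracy:", clf.score(X, y))
```

Note that no feature-selection step is applied before training: the optimization works directly in the 224-dimensional input space, which is the point the abstract makes about avoiding the Hughes effect.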