General
Feature extraction involves reducing the amount of resources required to describe a large set of data accurately. When performing analysis of complex data, one of the major problems stems from the number of variables involved. Analysis with a large number of variables generally requires a large amount of memory and computational power, or a classification algorithm that overfits the training sample and generalizes poorly to new samples. Feature extraction is a general term for methods of constructing combinations of the variables that get around these problems while still describing the data with sufficient accuracy.
Best results are achieved when an expert constructs a set of application-dependent features. Nevertheless, if no such expert knowledge is available, general dimensionality reduction techniques may help. These include:
- Principal component analysis
- Semidefinite embedding
- Multifactor dimensionality reduction
- Multilinear subspace learning
- Nonlinear dimensionality reduction
- Isomap
- Kernel PCA
- Multilinear PCA
- Latent semantic analysis
- Partial least squares
- Independent component analysis
- Autoencoder
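Principal component analysis, the first technique in the list above, can be sketched in a few lines. This is a minimal illustration using NumPy's SVD, not a production implementation (the function name `pca` and the sample data are our own): the data is centered, the leading right-singular vectors are taken as principal directions, and the data is projected onto them.

```python
import numpy as np

def pca(X, n_components):
    """Project X onto its n_components leading principal components."""
    # Center each variable (column) at zero mean
    Xc = X - X.mean(axis=0)
    # SVD of the centered matrix; rows of Vt are the principal directions,
    # ordered by decreasing singular value (i.e. decreasing variance explained)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:n_components]
    # The projected data is the low-dimensional feature representation
    return Xc @ components.T

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))   # 100 samples, 10 original variables
Z = pca(X, 2)                    # reduced to 2 extracted features
print(Z.shape)                   # (100, 2)
```

The extracted columns of `Z` are ordered by the variance they capture, so the first component is the direction along which the data varies most.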