ABSTRACT
A geometric and non-parametric procedure for testing whether two finite sets of points are linearly separable is proposed. The Linear Separability Test is equivalent to a test that determines whether a strictly positive point h > 0 exists in the range of a matrix A (constructed from the points in the two sets). The algorithm proposed in the paper iteratively checks whether a strictly positive point exists in a subspace by projecting a strictly positive vector with equal coordinates, p, onto that subspace. At the end of each iteration, the subspace is reduced to one of lower dimension. The test completes within r ≤ min(n, d + 1) steps for both linearly separable and non-separable problems, where r is the rank of A, n is the number of points, and d is the dimension of the space containing the points. The worst-case time complexity of the algorithm is O(nr^3) and its space complexity is O(nd). A short review of some prominent algorithms and their time complexities is included. The worst-case computational complexity of our algorithm is lower than that of the Simplex, Perceptron, Support Vector Machine, and Convex Hull algorithms when d < n^(2/3).
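To make the reduction concrete, the sketch below shows one common way to build the matrix A from the two point sets and to test whether A w > 0 has a solution. This is a minimal illustration, not the paper's projection-based procedure: it checks feasibility with an off-the-shelf LP solver (scipy.optimize.linprog), and the helper names build_matrix and is_linearly_separable are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

def build_matrix(X_pos, X_neg):
    """Stack both classes into a matrix A: each point is augmented with
    a bias coordinate of 1, and the second class is negated, so that
    A @ w > 0 for some w iff the two classes are linearly separable."""
    P = np.hstack([X_pos, np.ones((len(X_pos), 1))])
    N = -np.hstack([X_neg, np.ones((len(X_neg), 1))])
    return np.vstack([P, N])

def is_linearly_separable(X_pos, X_neg):
    """LP feasibility check: does some w satisfy A @ w >= 1 row-wise?
    By rescaling w, 'strictly positive' is equivalent to '>= 1'."""
    A = build_matrix(X_pos, X_neg)
    n, d1 = A.shape
    # linprog solves min c @ w subject to A_ub @ w <= b_ub;
    # rewrite A @ w >= 1 as -A @ w <= -1 with a zero objective.
    res = linprog(c=np.zeros(d1), A_ub=-A, b_ub=-np.ones(n),
                  bounds=[(None, None)] * d1, method="highs")
    return res.success

# Example: two shifted clusters are separable; XOR-labelled points are not.
rng = np.random.default_rng(0)
blob_a = rng.normal(loc=0.0, size=(20, 2))
blob_b = rng.normal(loc=5.0, size=(20, 2))
print(is_linearly_separable(blob_a, blob_b))                    # True
xor_pos = np.array([[0, 0], [1, 1]])
xor_neg = np.array([[0, 1], [1, 0]])
print(is_linearly_separable(xor_pos, xor_neg))                  # False
```

The LP route is the standard baseline the paper compares against (Simplex-style solvers); the proposed method instead answers the same h > 0 question via iterative projections onto shrinking subspaces.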