ABSTRACT
Support Vector Machines (SVMs) deliver state-of-the-art performance in many real-world applications and are established as one of the standard tools for machine learning and data mining. A key problem with these methods is the choice of an optimal kernel function. Real-world applications have also emphasized the need to adapt the kernel to the characteristics of heterogeneous data in order to boost classification accuracy. Our goal is therefore to automatically search for a task-specific kernel function. We use reinforcement-learning-based search to discover custom kernel functions and verify the effectiveness of our approach through an empirical evaluation of the discovered kernel on MNIST classification. Our experiments show that the discovered kernel achieves significantly better classification performance than well-known classic kernels. Our solution is particularly effective for resource-constrained systems with a low memory footprint that rely on traditional machine learning algorithms such as SVMs for classification tasks.
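The workflow the abstract describes, training an SVM with a searched, non-standard kernel on a digit-classification task, can be sketched as follows. The abstract does not reproduce the discovered kernel itself, so the composite kernel below (an RBF-like term multiplied by a polynomial-like term, both of which are valid kernels, so their product is too) is purely a hypothetical stand-in, and scikit-learn's small digits dataset stands in for MNIST:

```python
import numpy as np
from sklearn import datasets, svm
from sklearn.model_selection import train_test_split

def custom_kernel(X, Y):
    """Hypothetical searched kernel: RBF term times (1 + linear) term.

    Not the kernel discovered in the paper; it only illustrates how a
    custom kernel plugs into an off-the-shelf SVM via a callable.
    """
    gamma = 0.05
    sq_dists = (
        np.sum(X ** 2, axis=1)[:, None]
        + np.sum(Y ** 2, axis=1)[None, :]
        - 2.0 * X @ Y.T
    )
    return np.exp(-gamma * sq_dists) * (1.0 + X @ Y.T)

# Small digits dataset as a stand-in for MNIST; pixels scaled to [0, 1].
X, y = datasets.load_digits(return_X_y=True)
X = X / 16.0
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# scikit-learn accepts any callable that returns a Gram matrix.
clf = svm.SVC(kernel=custom_kernel).fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.3f}")
```

In a search setting, a controller (e.g. an RL policy, as in the cited neural optimizer and activation-function search work) would propose candidate kernel expressions, and the held-out accuracy of the resulting SVM would serve as the reward signal.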