Paper: An Approximate Approach for Training Polynomial Kernel SVMs in Linear Time

ACL ID P07-2017
Title An Approximate Approach for Training Polynomial Kernel SVMs in Linear Time
Venue Annual Meeting of the Association for Computational Linguistics
Session System Demonstration
Year 2007
Authors

Kernel methods such as support vector machines (SVMs) have attracted a great deal of attention in the machine learning and natural language processing (NLP) communities. Polynomial kernel SVMs have shown very competitive accuracy on many NLP problems, such as part-of-speech tagging and chunking. However, these methods are usually too inefficient to be applied to large datasets or used in real-time settings. In this paper, we propose an approximate method that emulates the polynomial kernel with efficient data mining approaches. To avoid testing-time complexity that scales exponentially, we also present a new method for speeding up SVM classification that is independent of the polynomial degree d. The experimental results showed that our method is 16.94 and 450 times faster than traditional polynomial kernel ...
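The abstract does not spell out the approximation itself (the paper relies on data-mining-based feature combination, which is not reproduced here). As a minimal sketch of the general idea only, the following Python snippet shows how a degree-2 polynomial kernel SVM can be traded for an explicit feature expansion followed by a linear SVM, so that training scales with the linear solver and classification reduces to a single dot product with the weight vector, independent of the number of support vectors. The scikit-learn classes, toy data, and parameter values are illustrative assumptions, not the paper's implementation.

```python
# Illustrative sketch (not the paper's algorithm): replace a degree-2
# polynomial kernel SVM with an explicit feature expansion plus a linear SVM.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.svm import SVC, LinearSVC

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))          # toy data, stands in for sparse NLP features
y = (X[:, 0] * X[:, 1] > 0).astype(int)  # label depends on a feature conjunction

# Baseline: kernelized SVM with K(x, z) = (x . z + 1)^2.
kernel_svm = SVC(kernel="poly", degree=2, coef0=1.0, gamma=1.0).fit(X, y)

# Linear-SVM alternative: expand to all degree<=2 feature products once,
# then train a linear SVM on the expanded representation.
expand = PolynomialFeatures(degree=2, include_bias=False)
X2 = expand.fit_transform(X)
linear_svm = LinearSVC(C=1.0, max_iter=5000).fit(X2, y)

print("poly-kernel SVM accuracy:", kernel_svm.score(X, y))
print("expanded linear SVM accuracy:", linear_svm.score(X2, y))
```

For dense data this explicit expansion is exact for the degree-2 kernel (up to coefficient weighting); the paper's contribution is an approximation that avoids materializing all feature combinations, which matters when d or the feature dimension is large.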