This paper introduces a packet classification algorithm called HyperCuts. Like the previously best known algorithm, HiCuts, HyperCuts is based on a decision tree structure. Unlike HiCuts, however, in which each node in the decision tree represents a hyperplane, each node in the HyperCuts decision tree represents a k-dimensional hypercube, where k > 1. Using this extra degree of freedom and a new set of heuristics to find optimal hypercubes for a given amount of storage, HyperCuts can provide an order of magnitude improvement over existing classification algorithms. It uses 2 to 10 times less memory than HiCuts optimized for memory, while its worst-case search time is 50-500% better than that of HiCuts optimized for speed. Compared with EGT-PC, another scheme recently introduced at Infocom 2003, HyperCuts uses 1.8-7 times less memory, while its worst-case search time is up to 5 times smaller. More importantly, unlike EGT-PC, HyperCuts can be fully pipelined to deliver one classification result per memory access time, and it supports fast updates.
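To illustrate the core idea, the following is a minimal sketch (not the paper's implementation, and omitting its heuristics for choosing cut dimensions and counts): each internal node cuts its covered hypercube along several dimensions at once, so a search point's coordinates index directly into a child array and a lookup costs one array access per tree level, ending with a linear scan over the few rules left at a leaf.

```python
# Illustrative HyperCuts-style decision tree (assumed names and structure):
# each internal node partitions its hypercube with equal-width cuts along
# one or more dimensions simultaneously.

class Node:
    def __init__(self, lo, hi, cuts=None):
        self.lo, self.hi = lo, hi   # covered hypercube: per-dimension bounds
        self.cuts = cuts or {}      # dim -> number of equal-width cuts
        self.children = []          # child nodes, flattened over cut dims
        self.rules = []             # rules stored at a leaf

    def child_index(self, point):
        # Combine the per-dimension cut indices into one flat child index.
        idx = 0
        for d, n in self.cuts.items():
            width = (self.hi[d] - self.lo[d]) / n
            i = min(int((point[d] - self.lo[d]) / width), n - 1)
            idx = idx * n + i
        return idx

def classify(root, point):
    node = root
    while node.children:            # one array access per tree level
        node = node.children[node.child_index(point)]
    # Linear scan of the few rules remaining at the leaf; each rule is a
    # dict mapping dimension -> (lo, hi) range.
    for rule in node.rules:
        if all(lo <= point[d] <= hi for d, (lo, hi) in rule.items()):
            return rule
    return None
```

Because every level of the tree needs only the current node's data, the traversal can be pipelined, one level per stage, which is the property the abstract highlights over EGT-PC.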
The authors of these documents have submitted their reports to this technical report series for the purpose of non-commercial dissemination of scientific work. The reports are copyrighted by the authors, and their existence in electronic format does not imply that the authors have relinquished any rights. You may copy a report for scholarly, non-commercial purposes, such as research or instruction, provided that you agree to respect the author's copyright. For information concerning the use of this document for other than research or instructional purposes, contact the authors. Other information concerning this technical report series can be obtained from the Computer Science and Engineering Department at the University of California at San Diego, firstname.lastname@example.org.