TY - JOUR
AU - Lei, Zijian
AU - Lan, Liang
PY - 2020/04/03
Y2 - 2024/03/28
TI - Improved Subsampled Randomized Hadamard Transform for Linear SVM
JF - Proceedings of the AAAI Conference on Artificial Intelligence
JA - AAAI
VL - 34
IS - 04
SE - AAAI Technical Track: Machine Learning
DO - 10.1609/aaai.v34i04.5880
UR - https://ojs.aaai.org/index.php/AAAI/article/view/5880
SP - 4519-4526
AB - Subsampled Randomized Hadamard Transform (SRHT), a popular random projection method that can efficiently project d-dimensional data into an r-dimensional space (r ≪ d) in O(d log(d)) time, has been widely used to address the challenge of high dimensionality in machine learning. SRHT works by rotating the input data matrix X ∈ ℝ^(n×d) with the Randomized Walsh-Hadamard Transform and then uniformly sampling columns of the rotated matrix. Despite these advantages, one limitation of SRHT is that it generates the new low-dimensional embedding without considering any specific properties of a given dataset. Therefore, this data-independent random projection method may result in inferior and unstable performance when used for a particular machine learning task, e.g., classification. To overcome this limitation, we analyze the effect of using SRHT for random projection in the context of linear SVM classification. Based on our analysis, we propose importance sampling and deterministic top-r sampling to produce effective low-dimensional embeddings in place of the uniform sampling used in SRHT. In addition, we also propose a new supervised non-uniform sampling method. Our experimental results demonstrate that our proposed methods achieve higher classification accuracies than SRHT and other random projection methods on six real-life datasets.
ER -
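
For reference, the abstract describes the baseline SRHT projection: random sign flips (D), a Walsh-Hadamard rotation (H), then uniform column sampling with rescaling. The sketch below is a minimal NumPy illustration of that baseline only, under assumed names (fwht, srht) and with zero-padding to a power-of-2 dimension; it does not implement the paper's proposed importance, top-r, or supervised sampling variants.

import numpy as np

def fwht(a):
    # Fast Walsh-Hadamard transform along the last axis (length must be a power of 2).
    a = a.copy()
    n = a.shape[-1]
    h = 1
    while h < n:
        for i in range(0, n, h * 2):
            x = a[..., i:i + h].copy()
            y = a[..., i + h:i + 2 * h].copy()
            a[..., i:i + h] = x + y
            a[..., i + h:i + 2 * h] = x - y
        h *= 2
    return a

def srht(X, r, seed=None):
    # Baseline SRHT sketch: project an n x d matrix X to n x r (r << d) by
    # random sign flips, a normalized Walsh-Hadamard rotation, and uniform
    # column sampling with rescaling. Names and padding choice are assumptions.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    d_pad = 1 << (d - 1).bit_length()            # pad d up to the next power of 2
    Xp = np.zeros((n, d_pad))
    Xp[:, :d] = X
    signs = rng.choice([-1.0, 1.0], size=d_pad)  # diagonal of D
    rotated = fwht(Xp * signs) / np.sqrt(d_pad)  # normalized Hadamard rotation
    cols = rng.choice(d_pad, size=r, replace=False)  # uniform column sampling
    return rotated[:, cols] * np.sqrt(d_pad / r)     # rescale to preserve norms in expectation

# Example: project 1000 x 512 data down to 64 dimensions.
X = np.random.default_rng(0).standard_normal((1000, 512))
X_low = srht(X, r=64, seed=1)
print(X_low.shape)  # (1000, 64)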