Indexed on: 01 Jun '18. Published on: 28 Mar '18. Published in: International Journal of Computational Intelligence and Applications.
International Journal of Computational Intelligence and Applications, Volume 17, Issue 01, March 2018. Data in a high-dimensional space may reside on a low-dimensional manifold embedded within that space. Manifold learning discovers such intrinsic manifold structures to facilitate dimensionality reduction. We propose a novel manifold learning technique called fast K selection for locally linear embedding, or FSLLE, which judiciously chooses an appropriate number (i.e., parameter K) of neighboring points for which the local geometric properties are maintained by the locally linear embedding (LLE) criterion. To measure the spatial distribution of a group of neighboring points, FSLLE relies on relative variance and mean difference to form a spatial correlation index characterizing the neighbors' data distribution. The goal of FSLLE is to quickly identify the value of parameter K that minimizes the spatial correlation index; FSLLE optimizes K by using this index to discover the intrinsic structure of each data point's neighborhood. After implementing FSLLE, we conduct extensive experiments to validate its correctness and evaluate its performance. Our experimental results show that FSLLE outperforms existing solutions (i.e., LLE and ISOMAP) in manifold learning and dimensionality reduction. We also apply FSLLE to face recognition, where it achieves higher accuracy than state-of-the-art face recognition algorithms because it strikes a good tradeoff between classification precision and performance.
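The abstract does not give the exact formula for the spatial correlation index, so the following is only a minimal sketch of the K-selection idea: for each candidate K, score each point's K-nearest-neighbor distances with an illustrative index built from relative variance and mean difference (both stand-ins for the paper's definitions), then pick the K with the smallest score. The function names `spatial_correlation_index` and `select_k` are hypothetical.

```python
import numpy as np

def spatial_correlation_index(X, K):
    """Illustrative index over each point's K-neighborhood distances.
    Combines a relative-variance term and a mean-difference term;
    the paper's actual formula is not given in the abstract."""
    # Pairwise Euclidean distances between all points.
    D = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    # Indices of the K nearest neighbors per point (column 0 is the point itself).
    idx = np.argsort(D, axis=1)[:, 1:K + 1]
    d = np.take_along_axis(D, idx, axis=1)   # neighbor distances, shape (n, K)
    mu = d.mean(axis=1)                      # mean neighbor distance per point
    var = d.var(axis=1)                      # variance of neighbor distances
    # Relative variance plus mean difference to the global mean distance.
    return float(np.mean(var / (mu ** 2 + 1e-12)) +
                 np.mean(np.abs(mu - mu.mean())))

def select_k(X, k_min=4, k_max=12):
    """Pick the neighborhood size K that minimizes the index,
    in the spirit of FSLLE's fast K selection."""
    scores = {K: spatial_correlation_index(X, K)
              for K in range(k_min, k_max + 1)}
    return min(scores, key=scores.get)

# Toy usage on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
K = select_k(X)
```

The selected K would then be passed to a standard LLE solver (e.g., scikit-learn's `sklearn.manifold.LocallyLinearEmbedding` via its `n_neighbors` parameter) to produce the low-dimensional embedding.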