An NN-based SRD decomposition algorithm and its application in nonlinear compensation.

Research paper by Honghang Yan, Fang Deng, Jian Sun, Jie Chen

Indexed on: 19 Sep '14
Published on: 19 Sep '14
Published in: Sensors (Basel, Switzerland)


In this study, a neural network-based square root of descending (SRD) order decomposition algorithm for compensating for the nonlinearity of sensor data is presented. The study aims at exploring the optimized decomposition of data and minimizing the computational complexity and memory cost of the training process. A linear decomposition algorithm, which automatically finds the optimal decomposition number N and reduces the training time to 1/√N and the memory cost to 1/N, has been applied to nonlinear data obtained from an encoder. Particular focus is given to the theoretical analysis of estimating the number of hidden nodes and the precision of varying the decomposition method. Numerical experiments are designed to evaluate the effect of this algorithm. Moreover, a device designed for angular sensor calibration is presented. We conduct an experiment that samples data from an encoder and compensates for the encoder's nonlinearity to verify this novel algorithm.
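The decomposition idea above can be sketched in a few lines. This is only an illustrative sketch, not the paper's algorithm: it assumes the data is split into N contiguous segments (so each sub-network trains on a smaller problem) and uses N ≈ √(dataset size) as a stand-in heuristic, whereas the paper finds the optimal N automatically. The function name `decompose` and the choice of default N are assumptions for illustration.

```python
import math

def decompose(data, n=None):
    """Split a dataset into n contiguous segments.

    Assumption for illustration: n defaults to round(sqrt(len(data))),
    a common square-root-decomposition heuristic; the paper instead
    derives the optimal decomposition number N analytically.
    """
    if n is None:
        n = max(1, round(math.sqrt(len(data))))
    size = math.ceil(len(data) / n)  # segment length (last may be shorter)
    return [data[i:i + size] for i in range(0, len(data), size)]

samples = list(range(100))      # stand-in for sampled encoder readings
segments = decompose(samples)
print(len(segments))            # -> 10 segments of 10 samples each
```

With the data split this way, each segment can be fitted by a smaller network, which is the source of the reduced training time and memory cost claimed in the abstract.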