Research on Point-wise Gated Deep Networks

Research paper by Nan Zhang, Shifei Ding, Jian Zhang, Yu Xue

Indexed on: 11 Oct '16 · Published on: 17 Sep '16 · Published in: Applied Soft Computing



Abstract

Stacking Restricted Boltzmann Machines (RBMs) to create deep networks, such as Deep Belief Networks (DBNs) and Deep Boltzmann Machines (DBMs), has become one of the most important research directions in deep learning. DBNs and DBMs provide state-of-the-art results in many fields, such as image recognition, but they do not learn better than an RBM when the data contain irrelevant patterns. Point-wise Gated Restricted Boltzmann Machines (pgRBMs) can effectively find task-relevant patterns in data containing irrelevant patterns and thus achieve satisfactory classification results. To address the limitations of the DBN and the DBM in processing such data, we introduce the pgRBM into both models and present Point-wise Gated Deep Belief Networks (pgDBNs) and Point-wise Gated Deep Boltzmann Machines (pgDBMs). The pgDBN and the pgDBM both use a pgRBM instead of an RBM to pre-train the weights connecting the network's visible layer and first hidden layer, and apply the task-relevant data subset learned by the pgRBM to the traditional networks. This paper then discusses the validity of dropout and weight-uncertainty methods for preventing overfitting in pgRBM, pgDBN, and pgDBM networks. Experimental results on MNIST variation datasets show that the pgDBN and the pgDBM are effective deep neural networks for learning from data containing irrelevant patterns.
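The point-wise gating idea described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it assumes a pgRBM with per-pixel binary switch units `z` that route each visible unit either to a task-relevant weight matrix or to a second matrix that absorbs irrelevant patterns. All dimensions, variable names, and the toy switch assignment are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy dimensions (hypothetical; not from the paper).
n_visible, n_hidden = 6, 4

# Two weight matrices: one models task-relevant structure, the other
# absorbs irrelevant patterns -- the core of point-wise gating.
W_rel = rng.normal(0.0, 0.1, (n_visible, n_hidden))
W_irr = rng.normal(0.0, 0.1, (n_visible, n_hidden))
b_hid = np.zeros(n_hidden)

def gated_hidden_probs(v, z):
    """Hidden-unit probabilities where each visible unit v_i is routed
    by its point-wise switch z_i: z_i = 1 -> relevant component,
    z_i = 0 -> irrelevant component."""
    h_rel = sigmoid((z * v) @ W_rel + b_hid)        # sees only gated-in pixels
    h_irr = sigmoid(((1.0 - z) * v) @ W_irr + b_hid)  # sees the rest
    return h_rel, h_irr

v = rng.integers(0, 2, n_visible).astype(float)  # toy binary input
z = np.array([1, 1, 1, 0, 0, 0], dtype=float)    # toy switch assignment
h_rel, h_irr = gated_hidden_probs(v, z)
```

In a pgDBN or pgDBM as described in the abstract, only the relevant-component representation (here `h_rel`) would feed the subsequent stacked layers, so the higher layers are pre-trained on the task-relevant subset rather than the raw input.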

Graphical abstract and Figures 0–6: see DOI 10.1016/j.asoc.2016.08.056.