aPaperADay
12 Dropout - Conclusions

Dropout is a simple technique, but it fundamentally changes the learning environment, and for the better. Multiplying each unit by a random binary vector (each entry drawn from a Bernoulli distribution) builds a learning environment robust to the brittle co-adaptations native to standard backpropagation. Random dropout was shown to be useful across many different neural net application domains, evidence that the technique changes training at a fundamental level.
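As a rough illustration of that random binary vector, here is a minimal NumPy sketch (not the paper's implementation; the function name, the p=0.5 default, and the test-time scaling-by-p convention are my own choices, following the common description of the method):

```python
import numpy as np

def dropout_forward(h, p=0.5, train=True, rng=np.random.default_rng(0)):
    """Dropout on a layer's activations h.

    Training: keep each unit with probability p via a random binary
    mask r ~ Bernoulli(p) -- this is what breaks the co-adaptations.
    Test: keep every unit but scale by p, so the expected input to
    the next layer matches what it saw during training.
    """
    if train:
        r = rng.binomial(1, p, size=h.shape)  # fresh random binary vector
        return h * r
    return h * p

h = np.array([0.3, 1.2, -0.7, 2.1])
print(dropout_forward(h, train=True))   # some units zeroed at random
print(dropout_forward(h, train=False))  # all units kept, scaled by p
```

Because the mask is resampled on every forward pass, each training step effectively trains a different thinned sub-network, which is where the robustness comes from.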

Dropout is fundamental enough that it also benefits Restricted Boltzmann Machines.

The main drawback of dropout is increased training time. Because dropout breaks the co-adaptations that form under standard backpropagation, the parameter updates are noisier, so training takes longer to converge. Still, the improvement gained from dropout is quite compelling, and future work in this area will probably focus on reducing the training-time penalty.