Abstract: Dropout has proven to be an effective method for reducing overfitting in deep artificial neural networks. We present three new alternative methods for performing dropout on a deep neural network that improve the effectiveness of dropout over the same training period. These methods select neurons to drop using statistics computed from a neuron's change in weight, the average size of a neuron's weights, and the output variance of a neuron. We found that increasing the probability of dropping neurons with smaller values of these statistics, and decreasing the probability for those with larger values, gave improved results in training over 10,000 epochs. The most effective of these was the Output Variance method, giving an average improvement of 1.17% accuracy over traditional dropout.
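The abstract's core idea, biasing a neuron's drop probability inversely to a per-neuron statistic such as output variance, can be sketched as follows. This is a minimal NumPy illustration under assumed details (rank-based probability scaling, a `base_rate` parameter, and the function name `variance_dropout_mask` are all hypothetical, not taken from the paper):

```python
import numpy as np

def variance_dropout_mask(activations, base_rate=0.5, rng=None):
    """Sample a dropout mask whose per-neuron drop probability is higher
    for neurons with lower output variance (hypothetical sketch).

    activations: (batch, neurons) array of a layer's outputs.
    base_rate:   average fraction of neurons to drop.
    """
    rng = np.random.default_rng() if rng is None else rng
    var = activations.var(axis=0)        # output variance per neuron
    # Rank-based scaling: the lowest-variance neuron gets the highest
    # drop probability, the highest-variance the lowest; the mean drop
    # probability across neurons equals base_rate.
    ranks = var.argsort().argsort()      # rank 0 = smallest variance
    n = var.size
    drop_prob = np.clip(base_rate * 2 * (n - ranks) / (n + 1), 0.0, 1.0)
    keep = rng.random(n) >= drop_prob    # True = neuron kept
    return keep.astype(activations.dtype)

# Usage: mask a layer's activations during training.
acts = np.random.default_rng(0).normal(size=(32, 8))
mask = variance_dropout_mask(acts, base_rate=0.5)
dropped = acts * mask
```

The rank-based mapping is one plausible way to realise "increase the probability of dropping neurons with smaller statistics"; the paper's actual probability assignment may differ.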
© Erik Barrow