I've been working with your code lately and I've noticed that in keras_mlp.py, in both models, the last hidden layer never gets dropout applied:
model = Sequential()
model.add( Dense( params['layer_1_size'], init = params['init'],
    activation = params['layer_1_activation'], input_dim = input_dim ))

for i in range( int( params['n_layers'] ) - 1 ):

    extras = 'layer_{}_extras'.format( i + 1 )

    if params[extras]['name'] == 'dropout':
        model.add( Dropout( params[extras]['rate'] ))
    elif params[extras]['name'] == 'batchnorm':
        model.add( BatchNorm())

    model.add( Dense( params['layer_{}_size'.format( i + 2 )], init = params['init'],
        activation = params['layer_{}_activation'.format( i + 2 )]))

model.add( Dense( 1, init = params['init'], activation = 'linear' ))
As can be seen in the code, the last hidden layer can never have dropout, since the dropout (or batchnorm) for layer i + 1 is added before the Dense layer i + 2 inside the loop, and nothing is added after the final hidden layer. Is this intentional or is it undesired behaviour?
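For reference, one way the final hidden layer could also pick up its extras would be to repeat the extras block once more after the loop, before the output layer. This is only a sketch of that idea, not the project's actual code: it assumes params also contains a 'layer_{n_layers}_extras' entry for the last hidden layer, that params and input_dim come from the surrounding training function as in the quoted snippet, and the imports simply mirror what the module appears to use (old Keras 1 API).

from keras.models import Sequential
from keras.layers import Dense, Dropout
from keras.layers.normalization import BatchNormalization as BatchNorm

model = Sequential()
model.add( Dense( params['layer_1_size'], init = params['init'],
    activation = params['layer_1_activation'], input_dim = input_dim ))

for i in range( int( params['n_layers'] ) - 1 ):

    # extras (dropout / batchnorm) for the hidden layer added just above
    extras = 'layer_{}_extras'.format( i + 1 )

    if params[extras]['name'] == 'dropout':
        model.add( Dropout( params[extras]['rate'] ))
    elif params[extras]['name'] == 'batchnorm':
        model.add( BatchNorm())

    model.add( Dense( params['layer_{}_size'.format( i + 2 )], init = params['init'],
        activation = params['layer_{}_activation'.format( i + 2 )]))

# hypothetical addition: apply the last hidden layer's extras before the output layer,
# assuming params defines 'layer_{n_layers}_extras' for it
extras = 'layer_{}_extras'.format( int( params['n_layers'] ) )
if params[extras]['name'] == 'dropout':
    model.add( Dropout( params[extras]['rate'] ))
elif params[extras]['name'] == 'batchnorm':
    model.add( BatchNorm())

model.add( Dense( 1, init = params['init'], activation = 'linear' ))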