link: https://www.zhihu.com/question/67209417/answer/302434279
Just stepped into this pit and almost cried TT — I clearly added dropout, so why didn't the results change at all?
When using F.dropout (torch.nn.functional.dropout), you must keep its training flag consistent with the training state of the model as a whole.
Such as:
import torch.nn as nn
import torch.nn.functional as F

class DropoutFC(nn.Module):
    def __init__(self):
        super(DropoutFC, self).__init__()
        self.fc = nn.Linear(100, 20)

    def forward(self, input):
        out = self.fc(input)
        out = F.dropout(out, p=0.5)
        return out

Net = DropoutFC()
Net.train()
# train the Net
The F.dropout in this code does nothing, because its training flag is left at its default of False. (In newer PyTorch releases the default is True, but the underlying problem is the same: the functional call does not follow the module's train/eval state.) Since F.dropout is just an externally referenced function, switching the whole model between train and eval modes never touches it. So here, out = F.dropout(out) is simply out = out. Ref: https://github.com/pytorch/pytorch/blob/master/torch/nn/functional.py#L535
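This is easy to check directly (a minimal standalone sketch, assuming a recent PyTorch install): with training=False, F.dropout returns its input unchanged.

```python
import torch
import torch.nn.functional as F

x = torch.randn(4, 5)

# training=False: dropout is a no-op, the input passes through untouched.
out_eval = F.dropout(x, p=0.5, training=False)
print(torch.equal(out_eval, x))  # True

# training=True: each element is zeroed with probability p and the
# survivors are scaled by 1/(1 - p), so the output (almost surely) differs.
out_train = F.dropout(x, p=0.5, training=True)
print(torch.equal(out_train, x))
```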
The correct usage is to pass the model's training state into the dropout call:
class DropoutFC(nn.Module):
    def __init__(self):
        super(DropoutFC, self).__init__()
        self.fc = nn.Linear(100, 20)

    def forward(self, input):
        out = self.fc(input)
        out = F.dropout(out, p=0.5, training=self.training)
        return out

Net = DropoutFC()
Net.train()
# train the Net
Or simply use nn.Dropout() (nn.Dropout is in fact a thin wrapper around F.dropout that passes self.training in for you). Ref: https://github.com/pytorch/pytorch/blob/master/torch/nn/modules/dropout.py#L46
class DropoutFC(nn.Module):
    def __init__(self):
        super(DropoutFC, self).__init__()
        self.fc = nn.Linear(100, 20)
        self.dropout = nn.Dropout(p=0.5)

    def forward(self, input):
        out = self.fc(input)
        out = self.dropout(out)
        return out

Net = DropoutFC()
Net.train()
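Because nn.Dropout is registered as a submodule, Net.train() and Net.eval() toggle its self.training flag automatically, and in eval mode it becomes the identity. A quick sketch of that behavior with a bare nn.Dropout layer:

```python
import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)
x = torch.randn(4, 5)

drop.train()  # training mode: dropout is active
# the output here is random, so there is no fixed value to compare against

drop.eval()   # eval mode: dropout is the identity
print(torch.equal(drop(x), x))  # True
```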