Who knew an error could surface like this: after so many years of using torch, this is the first time I've hit this NotImplementedError. And I'm not even on a nightly build.
Traceback (most recent call last):
  File "xxxxx\x.py", line 268, in <module>
    print(x(y).shape)
  File "xxxxx\lib\site-packages\torch\nn\modules\module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "xxxxx\x.py", line 259, in forward
    x = self.features(x)
  File "xxxxx\lib\site-packages\torch\nn\modules\module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "xxxxx\lib\site-packages\torch\nn\modules\container.py", line 119, in forward
    input = module(input)
  File "xxxxx\lib\site-packages\torch\nn\modules\module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "xxxxx\lib\site-packages\torch\nn\modules\module.py", line 201, in _forward_unimplemented
    raise NotImplementedError
NotImplementedError
The traceback ends in _call_impl, at result = self.forward(*input, **kwargs): when you call a module, nn.Module dispatches the call to self.forward. If you inherit from nn.Module but never implement forward, the lookup falls through to the default _forward_unimplemented, which simply raises NotImplementedError.
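A minimal sketch that reproduces the mechanism (the class name Broken is made up for illustration):

import torch
import torch.nn as nn

class Broken(nn.Module):  # inherits nn.Module but defines no forward()
    def __init__(self):
        super(Broken, self).__init__()

m = Broken()
try:
    m(torch.randn(1, 3))  # __call__ -> _call_impl -> self.forward, which is unimplemented
except NotImplementedError:
    print("NotImplementedError, as in the traceback above")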
Sure enough, the module I was using really has no forward method:
import torch
import torch.nn as nn
import torch.nn.functional as F

class Hswish(nn.Module):
    def __init__(self, inplace=True):
        super(Hswish, self).__init__()
        self.inplace = inplace

    def __swish(self, x, beta):
        # This swish is not actually used by H-swish.
        # It is called H-swish because the sigmoid is made "hard":
        # approximated by ReLU6(x + 3) / 6, which reduces the
        # computational cost for embedded deployment.
        return x * torch.sigmoid(beta * x)  # sigmoid has no inplace variant

    @staticmethod
    def Hsigmoid(x, inplace=True):
        return F.relu6(x + 3, inplace=inplace) / 6

    def foward(self, x):
        return x * self.Hsigmoid(x, self.inplace)
The cause: forward was misspelled as foward, so nn.Module's default forward was never overridden, and calling the module hit the unimplemented stub. Renaming the method fixes the error.
…
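With the method renamed to forward, the module works as expected; a quick sanity check (the test shape is arbitrary):

import torch
import torch.nn as nn
import torch.nn.functional as F

class Hswish(nn.Module):  # same module as above, with the typo fixed
    def __init__(self, inplace=True):
        super(Hswish, self).__init__()
        self.inplace = inplace

    @staticmethod
    def Hsigmoid(x, inplace=True):
        return F.relu6(x + 3, inplace=inplace) / 6

    def forward(self, x):  # correctly spelled: now overrides nn.Module.forward
        return x * self.Hsigmoid(x, self.inplace)

x = torch.randn(2, 16)
print(Hswish()(x).shape)  # torch.Size([2, 16]); H-swish is elementwise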