[Solved] RuntimeError: version_ <= kMaxSupportedFileFormatVersion INTERNAL ASSERT FAILED at /pytorch/caffe2/serialize/inline_container.cc

Error Messages:

RuntimeError: version_ <= kMaxSupportedFileFormatVersion INTERNAL ASSERT FAILED at /pytorch/caffe2/serialize/inline_container.cc:132, please report a bug to PyTorch. Attempted to read a PyTorch file with version 3, but the maximum supported version for reading is 2. Your PyTorch installation may be too old. (init at /pytorch/caffe2/serialize/inline_container.cc:132)
frame #0: c10::Error::Error(c10::SourceLocation, std::string const&) + 0x33 (0x7fdcd0189193 in /home/a430/intel/intelpython3/envs/zayn/lib/python3.6/site-packages/torch/lib/libc10.so)
frame #1: caffe2::serialize::PyTorchStreamReader::init() + 0x1f5b (0x7fdcd33119eb in /home/a430/intel/intelpython3/envs/zayn/lib/python3.6/site-packages/torch/lib/libtorch.so)
frame #2: caffe2::serialize::PyTorchStreamReader::PyTorchStreamReader(std::string const&) + 0x64 (0x7fdcd3312c04 in /home/a430/intel/intelpython3/envs/zayn/lib/python3.6/site-packages/torch/lib/libtorch.so)
frame #3: <unknown function> + 0x6c53a6 (0x7fdd1b2423a6 in /home/a430/intel/intelpython3/envs/zayn/lib/python3.6/site-packages/torch/lib/libtorch_python.so)
frame #4: <unknown function> + 0x2961c4 (0x7fdd1ae131c4 in /home/a430/intel/intelpython3/envs/zayn/lib/python3.6/site-packages/torch/lib/libtorch_python.so)
<omitting python frames>

Some people say the problem is the torch version. My torch version was 1.4.0; after upgrading torch to 1.6.0 and torchvision to 0.7.0, this error is reported instead:

RuntimeError: Attempting to deserialize object on a CUDA device but torch.cuda.is_available() is False. If you are running on a CPU-only machine, please use torch.load with map_location=torch.device('cpu') to map your storages to the CPU.

Just follow the hint and pass map_location=torch.device('cpu') when loading the model, as follows:
best_model = torch.load("../weights/September01-Unet-se_resnext50_32x4d/checkpoint.pth.tar", map_location=torch.device('cpu'))
