
Commit 26ec25d

fix simplenet for timm finetuning recipe
If, during model creation, the number of classes differed from ImageNet's 1000 classes, SimpleNet would crash because the pretrained weights would not match. Now, when generating architectures, we check for this case and, if it is encountered, swap the classifier only after the model has been loaded with the original weights.
1 parent 665f27b commit 26ec25d
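The commit message describes a common fine-tuning pattern: build the model with the head the checkpoint expects, load the pretrained weights, and only then replace the classifier. A minimal, hypothetical sketch of that ordering, using plain dicts of parameter shapes as a stand-in for a real state_dict (the names `classifier.weight`/`classifier.bias` and the helper itself are assumptions, not the repo's actual API):

```python
# Hypothetical illustration of "load first, swap the head after".
# `pretrained` maps parameter names to shapes, standing in for a state_dict.

def load_pretrained_then_swap(pretrained, num_classes, original_classes=1000):
    # Build the model with the original 1000-class head so every
    # pretrained tensor matches and loading cannot crash.
    model = dict(pretrained)

    # Only after the weights are in place, replace the classifier
    # when the requested class count differs from the checkpoint's.
    if num_classes != original_classes:
        in_features = model["classifier.weight"][1]
        model["classifier.weight"] = (num_classes, in_features)
        model["classifier.bias"] = (num_classes,)
    return model
```

Doing the swap before loading is what used to crash: the checkpoint's 1000-row classifier tensor no longer matched the freshly built head.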

File tree

1 file changed (+1, -1)
  • ImageNet/training_scripts/imagenet_training/timm/models


ImageNet/training_scripts/imagenet_training/timm/models/simplenet.py

Lines changed: 1 addition & 1 deletion
@@ -344,7 +344,7 @@ def forward_features(self, x: torch.Tensor) -> torch.Tensor:
         return self.features(x)
 
     def forward_head(self, x: torch.Tensor, pre_logits: bool = False):
-        x_ = self.forward_features(x)
+        x = self.forward_features(x)
         if pre_logits:
             return x
         else:
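The one-line change in the diff fixes a variable-name bug: the old code assigned the extracted features to `x_` but then returned `x`, so the `pre_logits=True` path handed back the raw input instead of the features. A minimal stand-in (plain lists instead of tensors, and a hypothetical `Head` class, not the real SimpleNet) showing the corrected control flow:

```python
# Hypothetical reduction of the fixed forward_head logic.

class Head:
    def __init__(self, classifier):
        self.classifier = classifier

    def forward_features(self, x):
        # Stand-in for self.features(x): doubles every element.
        return [v * 2 for v in x]

    def forward_head(self, x, pre_logits=False):
        x = self.forward_features(x)  # was: x_ = self.forward_features(x)
        if pre_logits:
            return x                  # now returns the features, not the input
        else:
            return self.classifier(x)
```

With the old `x_` assignment, `pre_logits=True` would have returned the untouched input, silently skipping feature extraction.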
