This repository was archived by the owner on Jul 1, 2024. It is now read-only.

Commit c4d9725

mannatsingh authored and facebook-github-bot committed
Fix SiLU breakage in RegNets for PT < 1.7 (#725)
Summary: Pull Request resolved: #725. SiLU is only available from PyTorch >= 1.7. Now that our tests work fine, we can finally detect and fix issues like these.

Reviewed By: kazhang
Differential Revision: D27236649
fbshipit-source-id: ed5bc81a40d1a21def13d81b83b181c55e86f6f5
1 parent 4110064 commit c4d9725

File tree

1 file changed: +5 −7 lines changed


classy_vision/models/regnet.py

Lines changed: 5 additions & 7 deletions
```diff
@@ -447,17 +447,15 @@ class RegNet(ClassyModel):
     def __init__(self, params: RegNetParams):
         super().__init__()

-        if params.activation_type == ActivationType.SILU and get_torch_version() < [
-            1,
-            7,
-        ]:
-            raise RuntimeError("SiLU activation is only supported since PyTorch 1.7")
-
+        silu = None if get_torch_version() < [1, 7] else nn.SiLU()
         activation = {
             ActivationType.RELU: nn.ReLU(params.relu_in_place),
-            ActivationType.SILU: nn.SiLU(),
+            ActivationType.SILU: silu,
         }[params.activation_type]

+        if activation is None:
+            raise RuntimeError("SiLU activation is only supported since PyTorch 1.7")
+
         # Ad hoc stem
         self.stem = {
             StemType.RES_STEM_CIFAR: ResStemCifar,
```
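The key change is the gating pattern: instead of referencing `nn.SiLU` unconditionally (which would fail to resolve on PyTorch < 1.7), the fix builds the version-dependent object as `None` on old versions and raises only if the caller actually selects it. A minimal sketch of that pattern, torch-free so it runs anywhere; `parse_version` is a hypothetical stand-in for the repo's `get_torch_version()` helper, and the string values stand in for the `nn` activation modules:

```python
def parse_version(version_str):
    # Return the leading numeric components of a version string as a list,
    # e.g. "1.6.0" -> [1, 6, 0], so lists compare the same way the
    # get_torch_version() < [1, 7] check does in the diff above.
    parts = []
    for piece in version_str.split("."):
        if piece.isdigit():
            parts.append(int(piece))
        else:
            break
    return parts


def build_activation(name, version_str):
    # On old versions, represent the unavailable activation as None rather
    # than raising eagerly; only requesting it triggers the error, so users
    # of other activations are unaffected.
    silu = "SiLU()" if parse_version(version_str) >= [1, 7] else None
    activation = {
        "relu": "ReLU()",
        "silu": silu,
    }[name]
    if activation is None:
        raise RuntimeError("SiLU activation is only supported since PyTorch 1.7")
    return activation
```

With this shape, `build_activation("relu", "1.6.0")` works on any version, while `build_activation("silu", "1.6.0")` raises the same `RuntimeError` message as the patched code.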
