If a module is passed to a sub-module, for example:
```python
import torch.nn as nn

import ptflops


class Block(nn.Module):
    def __init__(self, linear_layer) -> None:
        super().__init__()
        self.linear_layer = linear_layer

    def forward(self, x):
        return self.linear_layer(x)


class Test_model(nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.linear_layer = nn.Linear(1000, 1000)
        self.block = Block(self.linear_layer)

    def forward(self, x):
        out = self.linear_layer(x)
        out = self.block(out)
        return out


net = Test_model()
print(ptflops.get_model_complexity_info(net, (20, 1000)))
```
then the FLOPs of the `nn.Linear(1000, 1000)` module are counted twice, once under `Test_model` and once under `Block`:
```
Warning: variables __flops__ or __params__ are already defined for the module Linear ptflops can affect your code!
Test_model(
  2.0 M, 200.000% Params, 80.0 MMac, 100.000% MACs,
  (linear_layer): Linear(1.0 M, 100.000% Params, 40.0 MMac, 50.000% MACs, in_features=1000, out_features=1000, bias=True)
  (block): Block(
    1.0 M, 100.000% Params, 40.0 MMac, 50.000% MACs,
    (linear_layer): Linear(1.0 M, 100.000% Params, 40.0 MMac, 50.000% MACs, in_features=1000, out_features=1000, bias=True)
  )
)
('80.0 MMac', '1.0 M')
```
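The root cause can be seen with plain PyTorch traversal (a sketch of the mechanism, not necessarily what ptflops does internally): the same `Linear` instance is reachable through two parents, so any walk over the module tree that does not deduplicate visits it twice.

```python
# Not ptflops internals, just an illustration of the traversal issue:
# the shared Linear instance sits in the _modules dict of both parents.
net = Test_model()
print(len(list(net.modules())))  # 3 -- modules() deduplicates shared instances
print(len(list(net.named_modules(remove_duplicate=False))))  # 4 -- the shared Linear appears twice
```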
CocytusDuo changed the title from "flops are counted multiple times if a module is passed to a sub-module" to "flops are counted multiple times if a module is shared by other modules" (Mar 30, 2023)
Hi! Yeah, that's a drawback of the tracing process. Since it doesn't affect the output value, I'd consider it a minor one. I'll have a look to see whether a simple fix is possible.
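For what it's worth, here is a purely illustrative sketch of one direction such a fix could take. This is not ptflops code; it only assumes per-module statistics are stored in attributes such as `__flops__`, as the warning above suggests, and deduplicates by object identity while aggregating.

```python
# Illustrative only: aggregate a per-module statistic while counting each
# shared module instance exactly once. Assumes stats live in attributes
# like __flops__ (hypothetical here, suggested by the ptflops warning).
def aggregate_unique(model, attr="__flops__"):
    seen, total = set(), 0
    for _, module in model.named_modules(remove_duplicate=False):
        if id(module) not in seen:  # skip modules already visited via another parent
            seen.add(id(module))
            total += getattr(module, attr, 0)
    return total
```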
> Since it doesn't affect the output value, I'd consider it a minor one
Hi @sovrasov,
Why do you say it doesn't affect the output value? There is only one linear_layer, yet its MACs are counted twice. I think that's wrong. Please correct me if I've misunderstood.
By the output value I mean the return value of get_model_complexity_info, and it is not affected. For the extended statistics shown during model printing, I use a different mechanism, which cannot handle shared submodules and therefore counts the parameters of shared submodules several times.
At the same time, inference of those shared submodules is conducted several times, so they contribute to the final MACs several times as well.
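One way to check that second point is a quick sketch using only PyTorch hooks, not ptflops (`Test_model` is the class from the example above): register a single forward hook on the shared `Linear` and count the MACs it actually performs.

```python
import torch

mac_counts = []

def count_linear_macs(module, inputs, output):
    # MACs for nn.Linear: number of output elements times in_features
    mac_counts.append(output.numel() * module.in_features)

net = Test_model()
handle = net.linear_layer.register_forward_hook(count_linear_macs)
net(torch.randn(20, 1000))
handle.remove()

print(len(mac_counts))        # 2 -- the shared layer really runs twice
print(sum(mac_counts) / 1e6)  # 40.0 MMac of actual compute across both calls
```

The two calls together account for the 40.0 MMac shown on each `Linear` line of the printout; the 80.0 MMac total then appears to come from summing that same shared instance under both parents.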