About SE layer in Fused-MBConv #5
Comments
OK... I guess the missing SE parameter in the architecture table implies that. Never mind.
I also noticed that this implementation lacks the SE layer in Fused-MBConv. Could you explain this further?
In the architecture table, "SE" appears only with the normal MBConv, so I guess it is not used with Fused-MBConv.
Oh, thanks!

I understand that the SE layer might have been removed in the original fusion attempt: https://ai.googleblog.com/2019/08/efficientnet-edgetpu-creating.html. But according to Fig. 2 in the paper, the SE layer seems to be preserved in Fused-MBConv. So I was quite confused when I found the SE layer missing in your implementation. Have you tried the implementation with an SE layer, or am I missing something in the paper?
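For context, here is a minimal NumPy sketch of the squeeze-and-excitation operation that an MBConv block applies between its depthwise convolution and its projection; the discussion above is about whether Fused-MBConv simply skips this step. The function name, shapes, and reduction ratio are illustrative assumptions, not the repository's actual code.

```python
import numpy as np

def squeeze_excite(x, w1, b1, w2, b2):
    """Squeeze-and-Excitation: recalibrate the channels of x with shape (N, C, H, W).

    w1/b1 form the reduction FC layer (C -> C//r); w2/b2 expand back (C//r -> C).
    Hypothetical sketch: EfficientNet uses SiLU (swish) here, ReLU is used for brevity.
    """
    s = x.mean(axis=(2, 3))                    # squeeze: global average pool -> (N, C)
    e = np.maximum(s @ w1 + b1, 0.0)           # reduce + nonlinearity
    e = 1.0 / (1.0 + np.exp(-(e @ w2 + b2)))   # expand + sigmoid gate in (0, 1) -> (N, C)
    return x * e[:, :, None, None]             # excite: rescale each channel of x

# hypothetical shapes: 2 images, 8 channels, reduction ratio r = 4
rng = np.random.default_rng(0)
x = rng.standard_normal((2, 8, 5, 5))
w1, b1 = rng.standard_normal((8, 2)), np.zeros(2)
w2, b2 = rng.standard_normal((2, 8)), np.zeros(8)
y = squeeze_excite(x, w1, b1, w2, b2)
assert y.shape == x.shape  # SE preserves the feature-map shape
```

Because the gate is a per-channel sigmoid, SE never changes tensor shapes; removing it from Fused-MBConv (as this implementation does) therefore only drops the channel-recalibration step, not any spatial processing.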