How to modify or add layers to a pretrained Brevitas model?
I am working with the CNV pretrained model from Brevitas to use it with FINN. My goal is to modify a layer or add a custom one. For example, I want to replace the QuantReLU with my custom MyQuantReLU or substitute a QuantMaxPool with a layer that calculates a minpool instead.
Could you advise on the best approach to achieve this?
Looking forward to your support.
1. (easiest): Rewrite the CNV model to use your custom layers. If the state_dicts are compatible, you can simply load the state_dict from the pretrained version into your custom version; if they're not compatible, you might need to write a custom _load_from_state_dict method for your custom module (see the sketch below this list).
2. (hard): Use our module-to-module transformation infrastructure to convert the CNV modules of interest into the modules you want. This can be tricky because our tools try to reconstruct the kwargs used to instantiate the original modules from their attributes, and it can quickly turn into quite the rabbit hole. I wouldn't recommend it for a small model like CNV (a rough sketch appears further below).
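For option 1, here is a minimal, generic PyTorch sketch of the state_dict hand-off. It is not the actual CNV definition; MyQuantReLU and the two Sequential models are placeholders standing in for the pretrained network and your rewritten copy of it.

```python
import torch
import torch.nn as nn

class MyQuantReLU(nn.Module):
    # Placeholder custom activation. It has no parameters of its own,
    # so swapping it in does not change the state_dict keys.
    def forward(self, x):
        return torch.clamp(x, min=0.0)

# Stand-ins for the pretrained model and your rewritten copy of it.
pretrained = nn.Sequential(nn.Conv2d(3, 64, 3), nn.ReLU())
rewritten = nn.Sequential(nn.Conv2d(3, 64, 3), MyQuantReLU())

# Parameter names and shapes line up, so the pretrained weights load directly.
missing, unexpected = rewritten.load_state_dict(pretrained.state_dict(), strict=False)
print(missing, unexpected)  # both empty -> fully compatible
```

If missing/unexpected come back non-empty, the two definitions have diverged, and that is when a custom _load_from_state_dict override (or a small key-remapping script) becomes necessary.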
If you choose to do option 2, we will only provide limited support as this is for advanced users.
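For completeness, option 2 would look roughly like the sketch below. It assumes the ModuleToModuleByClass rewriter in brevitas.graph.base and its constructor pattern; verify both against the Brevitas version you're using, and note that the kwargs reconstruction mentioned above may already require extra work for a custom class like this.

```python
import torch.nn as nn
import brevitas.nn as qnn
from brevitas.graph.base import ModuleToModuleByClass  # assumed import path

class MyQuantReLU(qnn.QuantReLU):
    # Hypothetical custom activation derived from QuantReLU.
    pass

# Toy model standing in for the pretrained CNV.
model = nn.Sequential(nn.Conv2d(3, 64, 3), qnn.QuantReLU())

# Replace every QuantReLU instance with MyQuantReLU. The rewriter tries to
# rebuild the constructor kwargs from the old module's attributes, which is
# exactly where this approach can turn into a rabbit hole.
model = ModuleToModuleByClass(qnn.QuantReLU, MyQuantReLU).apply(model)
print(model)
```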
Note, we do not have a QuantMaxPool or QuantReLU in CNV - are you sure you're looking at the right network?
Out of curiosity, is there something that you need in your custom QuantReLU that isn't provided in the regular one? If so, we may consider adding it - please let us know!