
Conversation

@nhuet (Contributor) commented Sep 9, 2025

No description provided.

Unit, Layer and Group normalization are not linear.
For now, we remove the linear attribute from their decomon counterparts
and do not yet implement their ibp/affine bounds, nor test them.
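The non-linearity claim above can be checked numerically. A linear map f satisfies f(2x) = 2 f(x), which Layer normalization violates (it is in fact scale-invariant). A minimal numpy sketch, independent of decomon's actual implementation:

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Layer normalization without learnable scale/shift: (x - mean) / std
    mu = x.mean()
    var = x.var()
    return (x - mu) / np.sqrt(var + eps)

x = np.array([1.0, 2.0, 4.0])
# A linear map would give layer_norm(2 * x) == 2 * layer_norm(x);
# here the left side is (almost) unchanged while the right side doubles.
print(np.allclose(layer_norm(2 * x), 2 * layer_norm(x)))  # → False
```

This is why the affine-bound machinery used for linear layers does not apply directly to these normalizations.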
- layer_pos and layer_neg must not take the bias into account (it is already
  added in forward_ibp_propagate())
  => center and add_moving_mean
     default to False in BatchNormalizationKernelConstraint
  => the docstrings for layer_neg and layer_pos in DecomonLayer are updated
     to make this explicit
- if scale=False, layer_neg is set to 0 (gamma is None but should be
  treated as 1, so only layer_pos has to be considered)
- in tests, when randomizing layer weights, we must be careful not to
  assign a negative value to BatchNormalization.moving_variance
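The points above can be illustrated with a standalone sketch of interval (IBP) propagation through BatchNormalization at inference time. This is not decomon's code; the function name and signature are hypothetical, but it shows the conventions described: the per-feature weight is split into positive and negative parts (the layer_pos / layer_neg roles), the bias is added exactly once, gamma is treated as 1 when scale=False, and moving_variance must be non-negative for the square root to be defined.

```python
import numpy as np

def batchnorm_ibp_bounds(lower, upper, gamma, beta, moving_mean,
                         moving_variance, eps=1e-3):
    """IBP through inference-mode BatchNormalization (illustrative sketch).

    BatchNorm is affine per feature: y = w * x + b, with
        w = gamma / sqrt(moving_variance + eps)
        b = beta - w * moving_mean
    """
    if gamma is None:  # scale=False: gamma behaves as the constant 1
        gamma = np.ones_like(moving_variance)
    w = gamma / np.sqrt(moving_variance + eps)  # needs moving_variance >= 0
    b = beta - w * moving_mean
    w_pos = np.maximum(w, 0.0)  # layer_pos role: bias NOT included here
    w_neg = np.minimum(w, 0.0)  # layer_neg role: bias NOT included here
    # Bias is added exactly once, to each bound:
    new_lower = w_pos * lower + w_neg * upper + b
    new_upper = w_pos * upper + w_neg * lower + b
    return new_lower, new_upper

lower = np.array([-1.0, 0.0])
upper = np.array([1.0, 2.0])
gamma = np.array([2.0, -1.0])
beta = np.array([0.0, 1.0])
lo, up = batchnorm_ibp_bounds(lower, upper, gamma, beta,
                              moving_mean=np.zeros(2),
                              moving_variance=np.ones(2))
```

Any concrete input inside [lower, upper] maps to an output inside [lo, up]; a negative moving_variance would make the sqrt produce NaN, which is why the randomized test weights must avoid it.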
@ducoffeM merged commit 45657ab into airbus:refactor Sep 9, 2025
16 of 22 checks passed
@nhuet deleted the normalization branch September 9, 2025 12:12