Description
From this Stack Overflow post it looks like we can phrase max(a, b) as
$$
max(a,b) = \frac{a + b + |a-b|}{2}
$$
and similarly for min:
$$
min(a,b) = \frac{a + b - |a-b|}{2}
$$
Both expressions are differentiable except for the case when $a = b$. Differentiating the max with respect to $a$ gives
$$
\frac{\partial}{\partial a} max(a,b) = \frac{1 + sign(a-b)}{2}
$$
so if $a > b$ the gradient is 1, if $a < b$ it is 0, and at $a = b$ (taking $sign(0) = 0$) it is 1/2. For min we have
$$
\frac{\partial}{\partial a} min(a,b) = \frac{1 - sign(a-b)}{2}
$$
where if $a > b$ the gradient is 0, if $a < b$ it is 1, and at $a = b$ it is again 1/2.
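As a quick sanity check, here is a minimal Python sketch of the max formula and its piecewise gradient (the function names are just for illustration, not existing API):

```python
def max_formula(a, b):
    # max(a, b) = (a + b + |a - b|) / 2
    return (a + b + abs(a - b)) / 2

def max_grad_a(a, b):
    # d/da max(a, b) = (1 + sign(a - b)) / 2, with sign(0) = 0,
    # so ties split the gradient evenly (1/2 to each input).
    sign = (a > b) - (a < b)
    return (1 + sign) / 2
```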
From this it seems like PyTorch does something similar to the above. I think this would be nice to implement. We can also write min/max over a vector in terms of these pairwise versions.
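A vector min could then be built by folding the pairwise formula over the elements (a sketch under that assumption; `min_formula` and `vec_min` are hypothetical names):

```python
from functools import reduce

def min_formula(a, b):
    # min(a, b) = (a + b - |a - b|) / 2
    return (a + b - abs(a - b)) / 2

def vec_min(xs):
    # Fold the pairwise min across the whole vector.
    return reduce(min_formula, xs)
```

Note that with repeated values, the gradient a fold produces at ties depends on the fold order, which may differ from how a framework distributes gradients for its built-in reduction.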