At the moment we do not check whether, e.g., an optimizer that provides a GradientFunction also provides a HessianFunction. As a result, we could end up with manual derivatives in one case (GradientFunction) and automatic ones in another (HessianAutodiff). Should we leave it that way?
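To make the scenario concrete, here is a minimal sketch of the kind of mix in question. The class and method names (`Problem`, `gradient`, `hessian`) are placeholders and not the library's actual API; the "automatic" Hessian is stood in for by a finite-difference of the manual gradient, purely to illustrate that nothing forces the two derivative sources to come from the same place.

```python
import numpy as np

# Hypothetical sketch, not the real interface: a problem that supplies a
# hand-written gradient (analogue of GradientFunction) but derives its
# Hessian automatically (analogue of HessianAutodiff). No check ties the
# two together.
class Problem:
    def value(self, x):
        # Rosenbrock objective as a stand-in.
        return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

    def gradient(self, x):
        # Manually derived gradient.
        return np.array([
            -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
            200 * (x[1] - x[0] ** 2),
        ])

    def hessian(self, x, eps=1e-6):
        # "Automatic" Hessian: central differences of the manual gradient.
        # If the gradient above had a bug, this Hessian would silently
        # inherit it, and nothing would flag the inconsistency.
        n = x.size
        H = np.zeros((n, n))
        for i in range(n):
            step = np.zeros(n)
            step[i] = eps
            H[:, i] = (self.gradient(x + step) - self.gradient(x - step)) / (2 * eps)
        return H

p = Problem()
x = np.array([1.2, 1.0])
print(p.gradient(x))
print(p.hessian(x))
```

If such mixing is allowed, a possible middle ground would be to emit a warning (rather than an error) when only one of the two derivative sources is user-supplied, so the behavior stays flexible but visible.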