I use `@btime` (and hypertime) to time code, looking at the minimum time. Both take a while to run, longer than `@time`, which isn't reliable.
I was wondering: would it be valuable to run task A, task B, A, B, etc. to see which is faster? Or, because of caching effects, maybe A, A, ..., A, B, B, ..., B, and perhaps again A..., B...?
Often I don't really care about the exact minimum value; I just want to know which is faster, since I'm optimizing. I figure that might need far fewer runs while still being statistically significant.
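The alternating idea could be sketched roughly like this (nothing here is BenchmarkTools API; `compare_interleaved` and the example tasks are made up for illustration):

```julia
# Sketch: interleave runs of two zero-argument tasks and compare their
# minimum observed times, instead of fully benchmarking each one separately.
function compare_interleaved(f, g; rounds=100)
    tf = tg = typemax(UInt64)
    f(); g()  # warm up so compilation time is not measured
    for _ in 1:rounds
        t0 = time_ns(); f(); tf = min(tf, time_ns() - t0)
        t0 = time_ns(); g(); tg = min(tg, time_ns() - t0)
    end
    return tf < tg ? :first : :second
end

# Illustrative usage (the tasks are arbitrary examples):
A = rand(1000)
compare_interleaved(() -> sum(A), () -> foldl(+, A))
```

Interleaving means both tasks see roughly the same machine state (cache warmth, frequency scaling, background load), which is the point of running them A, B, A, B rather than in two separate blocks.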
One more idea for this: if the two timings appear statistically indistinguishable, look at the number and volume of allocations as a tie-breaker. They would usually be the same too, but if they differ for some reason, it might be useful to point that out.
Conversely, if the two timings differ only because the GC kicked in, the difference may not be meaningful. That said, this would likely not affect the minimum time.
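The allocation tie-break might look like this (`compare_allocs` is a hypothetical helper, not existing API; `@allocated` is standard Julia):

```julia
# Sketch: when timings tie, compare how much each task allocates.
# Warm-up calls first, so compilation allocations are not counted.
function compare_allocs(f, g)
    f(); g()
    af = @allocated f()
    ag = @allocated g()
    return (first=af, second=ag)
end

# Illustrative usage: the first task allocates a much larger array.
compare_allocs(() -> zeros(1000), () -> fill(0.0, 10))
```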
Today you would write

```julia
no_dot_timing = @benchmark no_dot(A, B)
with_dot_timing = @benchmark with_dot(A, B)
```

and then compare the results yourself. I would rather have something like:

```julia
@bfaster no_dot(A, B) with_dot(A, B)
```
Another benefit could be early exit, which isn't possible the other way. It is often very obvious which is faster after just a few trials. A first implementation could simply do the above, reusing the existing code.
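The early-exit behavior could be sketched as follows (`bfaster` here is a hypothetical function, not the proposed macro, and the streak-based stopping rule is just one possible heuristic):

```julia
# Sketch: alternate the two tasks and stop as soon as one of them has been
# faster for `streak` consecutive rounds, rather than running a full benchmark.
function bfaster(f, g; streak=10, max_rounds=10_000)
    f(); g()  # warm up so compilation time is not measured
    wins_f = wins_g = 0
    for _ in 1:max_rounds
        t0 = time_ns(); f(); tf = time_ns() - t0
        t0 = time_ns(); g(); tg = time_ns() - t0
        if tf < tg
            wins_f += 1; wins_g = 0
        else
            wins_g += 1; wins_f = 0
        end
        wins_f >= streak && return :first
        wins_g >= streak && return :second
    end
    return :inconclusive
end
```

When one task is clearly faster, this returns after only a handful of rounds; when the two are close, it keeps alternating and eventually gives up, which is exactly the case where a full statistical comparison would be needed anyway.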