MNT: update contour for mpl v3.8 #2213
Conversation
lib/cartopy/crs.py (Outdated)
@@ -1249,7 +1249,10 @@ def quick_vertices_transform(self, vertices, src_crs):
        """
        return_value = None

        if self == src_crs:
            if vertices.size == 0:
                return_value = vertices
I have not investigated why but, once we call `get_datalim` directly on `GeoContourSet`, we end up passing some empty arrays to this method, which then breaks at the `x.min()` step below.
Should we actually return None in this case? (I don't have a good sense for what we would expect here.)
Also, just looking at this quickly, I think you can clean it up: get rid of the `return_value = None` and just return the proper quantity directly in the two cases here instead.
If this method returns `None`, the calling function goes into some more involved logic. So I think returning an empty array would be preferable.
cartopy/lib/cartopy/mpl/geoaxes.py, lines 166 to 172 in 41880a8:
        new_vertices = self.target_projection.quick_vertices_transform(
            src_path.vertices, self.source_projection)
        if new_vertices is not None:
            if new_vertices is src_path.vertices:
                return src_path
            else:
                return mpath.Path(new_vertices, src_path.codes)
Happy to do the suggested cleanup 👍
Cleanup done.
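For anyone following along, a minimal sketch of the early-return structure being suggested (a paraphrase under the assumptions noted in the comments, not Cartopy's actual method):

```python
import numpy as np

def quick_vertices_transform_sketch(vertices, same_crs):
    """Paraphrased shape of the cleaned-up logic; not the real Cartopy code."""
    if same_crs:
        # Hand an empty array straight back, so get_datalim on an empty
        # GeoContourSet never reaches an x.min() call on a zero-length array.
        if vertices.size == 0:
            return vertices
        # ... the real method goes on to compare x.min()/x.max()/y.min()/y.max()
        # against the projection bounds before returning the vertices ...
        return vertices
    # None tells the caller (see the geoaxes.py snippet above) to fall back
    # to the full transform logic.
    return None

print(quick_vertices_transform_sketch(np.empty((0, 2)), same_crs=True).shape)  # (0, 2)
```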
        new_paths = [col_to_data.transform_path(path) for path in paths]
        new_paths = [path for path in new_paths if path.vertices.size >= 1]
        paths[:] = new_paths
`set_paths` is not implemented for the `ContourSet`, although it is for other collections. I wonder if we should raise a feature request, although this use-case is pretty niche. I guess we would only have a problem if `get_paths` started returning a copy.
Yeah, I think this could make sense. It does seem a bit suspect to rely on changing the returned object here like this.
Feature request now at matplotlib/matplotlib#26340.
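To spell out the pattern being discussed, here is a hedged, self-contained sketch (assuming Matplotlib >= 3.8, where the ContourSet is itself a Collection; the Affine2D offset is only a stand-in for Cartopy's col_to_data transform):

```python
import matplotlib.pyplot as plt
import matplotlib.transforms as mtransforms
import numpy as np

fig, ax = plt.subplots()
cs = ax.contour(np.random.default_rng(0).random((10, 10)))

# Mutate the list returned by get_paths() in place. This only keeps working
# while get_paths() hands back the collection's internal list, not a copy.
offset = mtransforms.Affine2D().translate(1.0, 0.0)   # stand-in for col_to_data
paths = cs.get_paths()
new_paths = [offset.transform_path(p) for p in paths]
new_paths = [p for p in new_paths if p.vertices.size >= 1]
paths[:] = new_paths

# The feature request (matplotlib/matplotlib#26340) asks for an explicit
# setter, roughly: cs.set_paths(new_paths)
```

The slice assignment keeps the same list object alive inside the collection, which is exactly the dependency on `get_paths` behaviour flagged above.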
There is one remaining problem here (that I know of): If I set
From this line in Matplotlib. Which I think indicates that it's trying to put the label near to the first vertex (so
            extent = mtransforms.Bbox.union(bboxes)
            self.update_datalim(extent.get_points())
        else:
            self.update_datalim(result.get_datalim(self.transData))
Do we need to update the datalim here or is that already taken care of because it is a full collection that is added to the axes now? The other collection methods like scatter and pcolormesh don't update the datalims from their return objects. Maybe this has to do with the contour wrapping stuff and not getting the bounds until actually calling update_datalim()? This is mostly just my curiosity here.
I saw several test failures when I tried not having it. But now that I look more closely, I am questioning whether they matter. For example:
E AssertionError:
E Arrays are not almost equal to 6 decimals
E
E Mismatched elements: 1 / 4 (25%)
E Max absolute difference: 1.
E Max relative difference: 0.00555556
E x: array([-179., 180., -25., 25.])
E y: array([-180, 180, -25, 25])
which comes from the second assertion here. So it fails because the non-transformed-first version is now the same as the transformed-first version. Without much context, more consistency seems like an improvement?
cartopy/lib/cartopy/tests/mpl/test_contour.py, lines 137 to 146 in 41880a8:
    # When calculating the contour in projection-space the extent
    # will now be the extent of the transformed points (-179, 180, -25, 25)
    test_func(xx, yy, z, transform=ccrs.PlateCarree(),
              transform_first=True)
    assert_array_almost_equal(ax.get_extent(), (-179, 180, -25, 25))

    # The extent without the transform_first should be all the way out to -180
    test_func(xx, yy, z, transform=ccrs.PlateCarree(),
              transform_first=False)
    assert_array_almost_equal(ax.get_extent(), (-180, 180, -25, 25))
There are also some image test failures which, by eye, look like they might be because of a similar extent change to the above.
We would need to add in something like `self.ignore_existing_data_limits = False` to cover my special dodgy latitude case from #2129.
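For context, here is roughly what that data-limit handling looks like on a plain Matplotlib Axes (a sketch assuming Matplotlib >= 3.8; the figure and data are placeholders, `ignore_existing_data_limits` is the standard Axes attribute referred to above, and this mirrors the intent of the diff rather than quoting Cartopy):

```python
import matplotlib.pyplot as plt
import numpy as np

fig, ax = plt.subplots()
cs = ax.contour(np.random.default_rng(1).random((20, 30)))

# Setting this before the update means limits that are already in place are
# merged with the new ones rather than overwritten (the "dodgy latitude"
# case from #2129).
ax.ignore_existing_data_limits = False

# Ask the ContourSet (now a single Collection) for its bounding box in data
# space and fold it into the axes data limits, as the else-branch above does.
bbox = cs.get_datalim(ax.transData)
ax.update_datalim(bbox.get_points())
ax.autoscale_view()
```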
Ahh, OK. I think we definitely need this then, thanks for checking.
The two cases should definitely be different, and that is preferable here because their extents truly are different. The gallery kind of shows that: if we force the extent to be global on the transformed-first one, there is a gap, and the extent is only -179/179 and doesn't touch the extreme edges. But if we let the path logic do its magic, it should go out to -180 and fill that gap.
-@pytest.mark.mpl_image_compare(filename='contour_label.png', tolerance=0.5)
+@pytest.mark.mpl_image_compare(
+    filename='contour_label.png',
+    tolerance=3.9 if MPL_VERSION.release[:2] < (3, 8) else 0.5)
These images look really similar to me... I'd probably reverse this and leave the image out and just boost the tolerance to 3.9 for MPL3.8+. But, this seems fine as well.
OK I reverted the test image. In Iris we use a perceptual image hash, so I don't have the experience with RMS to know what counts as a big number.
I'm not sure there is a good understanding of what "big number" is here either :) I was just basing it off of a quick look at the images. I think it will be quite a bit greater than 10 if there is a big difference (i.e. not just font offsets)
I thought @bjlittle thought about adding those hashes to Cartopy at some point, but I can't find that issue/PR... Maybe it was upstream in pytest-mpl instead: matplotlib/pytest-mpl#150.
If that PR lands, it would probably be easy to add here as well, I'd guess, since we are using pytest-mpl already.
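If the hash-based comparison does land, the usage might look roughly like the sketch below (purely illustrative; the `hash_library` argument, the file name, and the test itself are assumptions about that pytest-mpl feature, not anything in this PR):

```python
import matplotlib.pyplot as plt
import numpy as np
import pytest

# Hypothetical hash-based comparison: the figure's hash is checked against an
# entry in a committed hash library instead of an RMS tolerance on a baseline PNG.
@pytest.mark.mpl_image_compare(hash_library="tests/mpl/image_hashes.json")
def test_contour_label_hash():
    fig, ax = plt.subplots()
    cs = ax.contour(np.random.default_rng(2).random((10, 10)))
    ax.clabel(cs)
    return fig
```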
lib/cartopy/crs.py
Maybe a contour of only 3 points or something like that would get this case? Or making a contour collection and artificially setting the paths, like you're doing here with your "bad" path with only one MOVETO that you extract from here. This might also just be a Cartopy artifact where we aren't producing proper paths when doing our interpolations? The comment in MPL indicates there are two MOVETOs, one to start and one to get back to the start after moving, so we have a path with only one, which seems odd; but perhaps it is also allowable, so we should add that fallback to Matplotlib... I don't see much of a harm in raising that case upstream and putting another IndexError check in like you already did in your other PR over there, even if you don't have a self-contained reproducer.
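For reference, the kind of degenerate path described above (a single MOVETO and nothing after it) is straightforward to construct by hand, which might be enough for a self-contained upstream reproducer (a sketch, not code from this PR):

```python
import numpy as np
import matplotlib.path as mpath

# A "bad" path with exactly one vertex and one MOVETO code. The Matplotlib
# comment quoted above implies it normally expects at least a second code,
# so label placement can index past the end of a path like this.
bad_path = mpath.Path(np.array([[10.0, 20.0]]),
                      np.array([mpath.Path.MOVETO], dtype=mpath.Path.code_type))
print(bad_path.vertices.shape, bad_path.codes)   # (1, 2) [1]
```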
Rationale
At Matplotlib v3.8 the `ContourSet` becomes a standard collection, rather than a container for a series of `PathCollection`s (matplotlib/matplotlib#25247). This requires two updates in Cartopy, which are also simplifications:
1. We can no longer loop through the `PathCollection`s to figure out the data limits, but we can get them directly from the `ContourSet`.
2. The workaround for contour labelling introduced at Matplotlib contour labelling #1257 no longer works. A simpler workaround that still transforms the paths in place is still required (an explanation of why is in the code comment). It is no longer necessary to split the paths into smaller ones, as that is done for us by the `_iter_connected_components` method here.

The plots produced by old and new versions are sufficiently different to cause a test failure. I updated the reference image to what is produced with mpl 3.8 and increased the tolerance for older versions. This seems to be consistent with the approach used for other image tests in Cartopy.
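To make the second point concrete, here is a small sketch of what splitting a compound contour path into "connected components" means (the split is done manually here; the private Matplotlib helper named above handles this for us in 3.8):

```python
import numpy as np
import matplotlib.path as mpath

# A compound path with two sub-paths, each opened by a MOVETO, similar to a
# multi-segment contour line stored as a single Path in mpl 3.8.
verts = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0],
                  [3.0, 3.0], [4.0, 3.0]])
codes = np.array([mpath.Path.MOVETO, mpath.Path.LINETO, mpath.Path.LINETO,
                  mpath.Path.MOVETO, mpath.Path.LINETO],
                 dtype=mpath.Path.code_type)
compound = mpath.Path(verts, codes)

# Manual equivalent of splitting into connected components: every MOVETO
# starts a new component, so Cartopy no longer has to do this itself.
starts = np.flatnonzero(compound.codes == mpath.Path.MOVETO)
for component_vertices in np.split(compound.vertices, starts[1:]):
    print(component_vertices)
```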
Closes #2207
Implications