Consider making end_datetime exclusive #1283
Comments
@LiamBindle If I get a start and end datetime for a capture, where both should be included in the range and the end_datetime is exclusive, I have the same issue: it's just that I need to add an undefined amount of time instead of subtracting it. I guess both cases are valid and we can always find use cases for both, but we should make the spec unambiguous, so we have to choose one or the other, and in any case someone will complain unless we add a property that makes the convention explicit. So I'm not sure how to proceed with this issue. If you feel strongly about it, maybe start a discussion about it in one of the next STAC community calls, where it potentially gets a bit more visibility than in this issue...
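As a rough illustration of the "add an undefined amount of time" case, here is a minimal Python sketch (hypothetical timestamps, not from any real capture): if a source's metadata records the first and last covered instants inclusively, converting to an exclusive end forces an arbitrary choice of epsilon.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical source metadata where both instants are meant to be covered
# (e.g., timestamps of the first and last sample in a capture).
first_sample = datetime(2022, 1, 1, 0, 0, 0, tzinfo=timezone.utc)
last_sample = datetime(2022, 1, 1, 0, 0, 2, tzinfo=timezone.utc)

# Under an exclusive-end convention, last_sample itself would not be
# covered, so some epsilon must be added -- and its size is arbitrary:
for epsilon in (timedelta(seconds=1), timedelta(microseconds=1)):
    print((last_sample + epsilon).isoformat())
# 2022-01-01T00:00:03+00:00
# 2022-01-01T00:00:02.000001+00:00
```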
Hey @m-mohr, thanks for the suggestion to raise it at the next community call. That makes sense, so I'll try to attend the next one. Cheers!
Sorry I wasn't able to attend the meeting yesterday; I was off because it was a holiday in Canada. I'll try to attend the next one.
Dang, apologies for missing the meeting. I forgot about it this morning. I'll try to attend the next one.
I'm 👍🏼 on the inclusive end_datetime convention (CoC principle)
Let's try to find other examples of standards that describe data (i.e. not filter expressions), e.g.:
I did some research on standards / specifications that describe data (i.e. I excluded filtering because that's not our use case here). Inclusive:
Exclusive:
Undefined:
It seems to lean towards inclusive ends, but I think I'd ultimately make it dependent on what OGC API - Features says.
Due to opengeospatial/ogcapi-features#934 we'll inherit inclusive bounds for temporal extents in Collections. JSON FG is also inclusive; OGC API - Records is based on it, and we try to align with Records as much as possible.
We should probably also clarify that for the collection extent; see opengeospatial/ogcapi-features#934.
Discussed in STAC community meeting: We think inclusive is the right choice based on the research I did above. |
First off, thank you all for your hard work! We're using STAC with great success and it's been a terrific tool for organizing our data.
Next, sorry @m-mohr for missing your response and question in #1255. My bad, and I see this has already gone ahead. I'm a bit concerned that #1280 introduces a logical flaw into the spec: it makes it impossible to represent a time series where every instant in time is covered by exactly one item. Feel free to close this if you don't think the topic needs any more discussion, but I just wanted to advocate for an exclusive end_datetime once more.
@m-mohr You raised this question in #1280:
In this case, the end date in the source's metadata is already exclusive, isn't it? The period [2022-01-01T00:00:00Z, 2022-01-01T00:00:02Z) with an exclusive end date has a duration of exactly 2 seconds.
But what happens if there is another capture for the next 2 seconds, 2022-01-01T00:00:02Z to 2022-01-01T00:00:04Z? If the end date is inclusive, then two items claim to cover the 2022-01-01T00:00:02Z instant in time, whereas an exclusive end date handles it cleanly. If the end date is inclusive, you can't have a time series with exactly one item for every instant in time.
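To make the tiling argument concrete, here is a minimal Python sketch (the items and the `covers` helper are hypothetical, not part of the spec) that counts how many back-to-back items cover the shared boundary instant under each convention:

```python
from datetime import datetime, timezone

# Two hypothetical 2-second captures, back to back.
item_a = {"start_datetime": datetime(2022, 1, 1, 0, 0, 0, tzinfo=timezone.utc),
          "end_datetime":   datetime(2022, 1, 1, 0, 0, 2, tzinfo=timezone.utc)}
item_b = {"start_datetime": datetime(2022, 1, 1, 0, 0, 2, tzinfo=timezone.utc),
          "end_datetime":   datetime(2022, 1, 1, 0, 0, 4, tzinfo=timezone.utc)}

def covers(item, t, end_inclusive):
    """Does an item's interval contain instant t under the given convention?"""
    if end_inclusive:
        return item["start_datetime"] <= t <= item["end_datetime"]
    return item["start_datetime"] <= t < item["end_datetime"]

shared = datetime(2022, 1, 1, 0, 0, 2, tzinfo=timezone.utc)

# Inclusive ends: both items claim the shared instant (double coverage).
print(sum(covers(i, shared, end_inclusive=True) for i in (item_a, item_b)))   # 2
# Exclusive ends: exactly one item covers it (a clean partition of time).
print(sum(covers(i, shared, end_inclusive=False) for i in (item_a, item_b)))  # 1
```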
From #1280
I don't think so. Say you have an item that represents an average for the year 2018. When start is inclusive and end is exclusive, you have start_datetime="2018-01-01T00:00:00.000000000Z" and end_datetime="2019-01-01T00:00:00.000000000Z". If the end date is inclusive, then you need to subtract an undefined amount of time (1 ns?) from the ending date. I.e., should it be end_datetime="2018-12-31T23:59:59.999999999Z"?
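A short Python sketch of the resolution problem described above (illustrative only): the inclusive end you would derive from the unambiguous exclusive end depends entirely on the tick size you pick, and Python's datetime cannot even represent the 1 ns variant from the quote.

```python
from datetime import datetime, timedelta, timezone

# The exclusive end of "the year 2018" is unambiguous:
exclusive_end = datetime(2019, 1, 1, tzinfo=timezone.utc)

# An inclusive end requires picking a tick size, and every choice yields a
# different timestamp (Python's datetime bottoms out at microseconds):
for tick in (timedelta(seconds=1),
             timedelta(milliseconds=1),
             timedelta(microseconds=1)):
    print((exclusive_end - tick).isoformat())
# 2018-12-31T23:59:59+00:00
# 2018-12-31T23:59:59.999000+00:00
# 2018-12-31T23:59:59.999999+00:00
```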