Hi,
We have a website with thousands of pages of documentation, and it seems many bots on the internet request pages that don't exist.
Our backend uses boto3 to fetch each doc from S3 with client.get_object, and for those requests it hits a NoSuchKey error.
That's fine, everything works.
The question I have is: are such errors "expensive" in any way?
What's the fastest, easiest way to deal with large numbers of them?
Just try to retrieve the page, and if it raises an error, treat it as a 404 and move on.
or
Implement a preliminary call to S3 to check whether the object exists, and only attempt the full get_object if it does. But does that actually save any time, or does it just add another request without removing any work?
or
Adjust the exception handling somehow, so that handling these errors is as cheap as possible.