Finding the costs of extracting a page #1646
Unanswered
JonasPapinigis
asked this question in
Forums - Q&A
Replies: 0 comments
I'm looking to use this library for a relatively large-scale seeded extraction. I wanted to know whether there is a way to expose the number of tokens used in a single- (or even multi-) page extraction/crawl. I know I get a CrawlResult for each page, but I don't see any input/output token consumption metrics anywhere (possibly in metadata?).
My inference provider does not have a way to view EXACT costs, but I want to be able to make something like this for myself: https://github.com/orkunkinay/openai_cost_calculator
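For context, here is a minimal sketch of the kind of calculator I mean: given input/output token counts for each page (however they end up being exposed), multiply by per-token prices. The model name and prices below are hypothetical placeholders, not real rates.

```python
# Hypothetical USD prices per 1M tokens -- placeholders, not real rates.
PRICING = {
    "example-model": {"input": 0.15, "output": 0.60},
}

def page_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate the cost of one page given its token counts."""
    p = PRICING[model]
    return (input_tokens * p["input"] + output_tokens * p["output"]) / 1_000_000

# Example: sum the cost over three pages with identical (made-up) usage.
total = sum(page_cost("example-model", 12_000, 800) for _ in range(3))
print(f"${total:.4f}")
```

If CrawlResult (or its metadata) exposed prompt/completion token counts per page, plugging them into something like this would be trivial.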
Thanks for any help!