Is your feature request related to a problem? Please describe.
I don't believe there is a way for the client to retrieve the model chunk size. It can be set via AI.CONFIG, but I see no way to get the current setting back out.
Describe the solution you'd like
The cleanest approach would be to add a new command, say AI.RETRIEVECONFIG, that returns this information. For symmetry with AI.CONFIG, it could also indicate whether the various backends have been loaded.
Describe alternatives you've considered
I'm not aware of any alternatives
Additional context
We're implementing model chunking support in SmartSim and are concerned that if one client sets a model chunk size, there is no way for a second to realize that the chunk size has been adjusted. While we're really only implementing AI.CONFIG to test chunking behavior, there could be a larger role for it someday.
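To make the concern concrete: a client sending a large model has to split the serialized blob into pieces no larger than the server's chunk size, so two clients that disagree on MODEL_CHUNK_SIZE can disagree on how a model should be chunked. A minimal, hypothetical sketch of the client-side splitting step (the function name is illustrative, not part of SmartSim or RedisAI):

```python
def chunk_model(model_blob: bytes, chunk_size: int) -> list:
    """Split a serialized model into pieces of at most chunk_size bytes,
    as a client would before sending each piece to the server."""
    if chunk_size <= 0:
        raise ValueError("chunk_size must be positive")
    return [model_blob[i:i + chunk_size]
            for i in range(0, len(model_blob), chunk_size)]
```

If a second client re-chunks or validates the model using a stale chunk size, its pieces will not line up with the server's expectation, which is why a way to query the current setting matters.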
Hey, the new AI.CONFIG GET sub-command was recently added to RedisAI master (#918); it allows you to retrieve the MODEL_CHUNK_SIZE configuration (see the docs: https://oss.redis.com/redisai/master/commands/#aiconfig).
Note that this option will also be available via the redisai-py client soon.
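On a build that includes #918, the setting can be read back with that sub-command from any generic Redis client. A sketch using redis-py's execute_command (the helper name and the stub-friendly duck typing are my own; the reply type can vary by client, so it is normalized to an int):

```python
def get_model_chunk_size(client) -> int:
    """Read the server's current MODEL_CHUNK_SIZE (in bytes) back out.

    `client` is anything exposing redis-py's execute_command, e.g.
    redis.Redis(host="localhost", port=6379) -- host and port here
    are illustrative assumptions.
    """
    reply = client.execute_command("AI.CONFIG", "GET", "MODEL_CHUNK_SIZE")
    # Depending on the client, the reply may be bytes (b"536870912")
    # or already an integer; int() handles both.
    return int(reply)
```

A second client can call this before chunking a model, so it picks up a chunk size that another client may have changed in the meantime.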