Is your feature request related to a problem? Please describe.
Our team has been working on an Intel x86 device with RedisEdge (Redis 6.2.6, RedisAI 1.2.5, RedisGears 1.2.4, and RedisTimeSeries 1.6.11), and we are really impressed with the performance, especially since that device is only equipped with an outdated 4th Gen i7 processor without Intel® DL Boost.
The current redisfab/redisedgevision:0.4.0-jetson-arm64v8-bionic image only ships RedisAI 1.0.2, and when we ran it with the Torch backend (the ONNX backend could not be loaded), it also threw the error message "GPU requested but Torch couldn't find CUDA".
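For reference, this is roughly how we reproduce it on the Jetson. The container name, the model key, and model.pt below are placeholders for our own setup, and the AI.MODELSET syntax is as we recall it from the RedisAI 1.0.x docs:

```sh
# Start the RedisEdge vision image on the Jetson with the NVIDIA container
# runtime (assumes the NVIDIA runtime is installed via JetPack).
docker run -d --name redisedge --runtime nvidia -p 6379:6379 \
    redisfab/redisedgevision:0.4.0-jetson-arm64v8-bionic

# Confirm the RedisAI module version that ships in the image (1.0.2).
redis-cli MODULE LIST

# Storing a TorchScript model on the GPU is what produces
# "GPU requested but Torch couldn't find CUDA".
# "vision:model" and model.pt are placeholders for our own model.
redis-cli -x AI.MODELSET vision:model TORCH GPU BLOB < model.pt
```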
We would like to request the following.

Describe the solution you'd like
We would like help getting a dockerfile.jetson for RedisAI 1.2.5 with the ONNX backend, so that we can compare performance across device platforms using the same code. We noticed that a few dockerfile.jetson files exist across the RedisEdge and RedisAI GitHub repositories (branches), but they pin the Jetson model to Nano or Xavier. Would it be possible to open this up to other Jetson models?
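To make the request more concrete, below is a rough sketch of what we have in mind. It assumes RedisAI's usual get_deps.sh/make build flow and NVIDIA's l4t-base images; the base tag, the JETSON_MODEL argument, and the build flags are our assumptions, not an existing recipe in either repository:

```dockerfile
# Hypothetical dockerfile.jetson sketch: build RedisAI 1.2.5 with the Torch
# and ONNX backends on top of an NVIDIA L4T base image. The base tag, the
# JETSON_MODEL argument, and the build flags are assumptions on our side.
ARG L4T_TAG=r32.6.1
FROM nvcr.io/nvidia/l4t-base:${L4T_TAG}

# Intended to select Nano / Xavier NX / AGX Xavier / Orin at build time.
ARG JETSON_MODEL=nano

RUN apt-get update && apt-get install -y \
    build-essential cmake git wget python3 python3-pip

# Fetch RedisAI 1.2.5 and its backend dependencies (Torch, ONNXRuntime).
RUN git clone --recursive --branch v1.2.5 \
        https://github.com/RedisAI/RedisAI.git /build/RedisAI
WORKDIR /build/RedisAI

# Build flow as described in the RedisAI docs (exact targets may differ).
RUN bash get_deps.sh gpu
RUN make -C opt build GPU=1

# The resulting redisai.so and backends/ directory would then be copied
# into the RedisEdge image and loaded with --loadmodule.
```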