How can I leverage capacitor-mlkit to run a custom TF lite model? #102
-
Hey everyone, I currently have a custom-trained TensorFlow.js model which I'm running through Capacitor in my iOS/Android apps. To improve inference time, I'm aiming to run it using ML Kit and was wondering if anyone could guide me on the best way to do that. I can potentially convert the TFJS model to a TFLite model, and am looking for a Capacitor plugin that can use ML Kit. I looked at all 5 plugins in the repo, and all of them seem to be using the pod provided by the TensorFlow team. How would you recommend I go about this in my case?
-
This is not yet possible with these plugins. You are welcome to create a feature request. Please note that only the Image Labeling and the Object Detection & Tracking APIs offer support for custom models (see here). Otherwise, a custom Capacitor plugin with a specific TensorFlow implementation would be necessary.
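For reference, here is roughly what a custom Capacitor plugin would wrap on the Android side: ML Kit's custom image labeling API can load a bundled TFLite model directly. This is a minimal sketch of the native (non-Capacitor) usage, assuming your converted model is shipped as `model.tflite` in the app's assets; the file path and thresholds are illustrative.

```kotlin
import com.google.mlkit.common.model.LocalModel
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.label.ImageLabeling
import com.google.mlkit.vision.label.custom.CustomImageLabelerOptions

// Point ML Kit at the bundled custom TFLite model (path is an assumption).
val localModel = LocalModel.Builder()
    .setAssetFilePath("model.tflite")
    .build()

// Configure the custom image labeler; threshold/result count are examples.
val options = CustomImageLabelerOptions.Builder(localModel)
    .setConfidenceThreshold(0.5f)
    .setMaxResultCount(5)
    .build()

val labeler = ImageLabeling.getClient(options)

fun label(image: InputImage) {
    labeler.process(image)
        .addOnSuccessListener { labels ->
            for (label in labels) {
                println("${label.text}: ${label.confidence}")
            }
        }
        .addOnFailureListener { e ->
            println("Labeling failed: $e")
        }
}
```

A custom plugin would expose something like this through a Capacitor method and do the equivalent on iOS via the MLKit pods. Note this only works if your model fits the image-labeling input/output contract that ML Kit expects for custom models.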