Create tflite model for object detection for custom dataset #152
Hi,
Hello @param-esec

Yes, you can train your own object detection model. The only takeaway you need from the TensorFlow to Larod example is the function that shows how to quantize the model and fine-tune it; you can replace the rest (model architecture and dataset) with your own.

The Artpec8 DLPU supports per-tensor quantization, which is only possible with TF1. Unfortunately, this means you won't be able to train on Colab anymore without TF1; you'll have to use another service such as AWS or train on your own GPU. You could ignore this guideline and deploy a model quantized with TF2 on the device, but it will run 2 to 30 times slower and in some cases also lose some accuracy. At that point it might be better to use the device's CPU instead.
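For reference, here is a minimal sketch of what per-tensor quantization-aware fine-tuning plus TFLite export looks like in TF1, assuming TF 1.15 and the `tf.contrib.quantize` API. The backbone, input size, and node names below are placeholders, and the actual quantization function in the TensorFlow to Larod example may be organized differently.

```python
# Minimal sketch, assuming TF 1.15 and the tf.contrib.quantize API, of
# per-tensor quantization-aware fine-tuning followed by a TFLite export.
# The backbone below is a placeholder; the real function in the
# TensorFlow to Larod example may be organized differently.
import tensorflow as tf  # TensorFlow 1.x

def build_model(images):
    # Placeholder network: swap in your own detection architecture.
    net = tf.layers.conv2d(images, 32, 3, padding="same", activation=tf.nn.relu)
    net = tf.layers.max_pooling2d(net, 2, 2)
    net = tf.layers.flatten(net)
    return tf.layers.dense(net, 10, name="output")

# 1) Training graph: insert fake-quant ops so fine-tuning learns the
#    per-tensor ranges that the DLPU expects.
train_graph = tf.Graph()
with train_graph.as_default():
    images = tf.placeholder(tf.float32, [None, 300, 300, 3], name="input")
    logits = build_model(images)
    # ... define your loss here ...
    tf.contrib.quantize.create_training_graph(input_graph=train_graph,
                                              quant_delay=0)
    # ... define your optimizer, fine-tune, save a checkpoint ...

# 2) Eval graph: same architecture with inference fake-quant ops; freeze it
#    together with the fine-tuned checkpoint, then convert to TFLite.
eval_graph = tf.Graph()
with eval_graph.as_default():
    images = tf.placeholder(tf.float32, [None, 300, 300, 3], name="input")
    logits = build_model(images)
    tf.contrib.quantize.create_eval_graph(input_graph=eval_graph)
    # After freezing eval_graph + checkpoint into frozen_eval_graph.pb:
    # converter = tf.lite.TFLiteConverter.from_frozen_graph(
    #     "frozen_eval_graph.pb", ["input"], ["output/BiasAdd"])
    # converter.inference_type = tf.lite.constants.QUANTIZED_UINT8
    # converter.quantized_input_stats = {"input": (128.0, 128.0)}  # mean, std
    # open("model_quant.tflite", "wb").write(converter.convert())
```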
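By contrast, "a model quantized with TF2" usually means post-training integer quantization with a representative dataset, which defaults to per-channel weight quantization. A hedged sketch of that path is below for comparison; the saved-model path, input shape, and calibration data are placeholders.

```python
# Sketch, assuming TF 2.x, of the post-training integer quantization that the
# reply advises against for the Artpec8 DLPU (TF2 defaults to per-channel
# weight quantization, which the DLPU cannot run at full speed).
# "my_detector_saved_model" and the input shape are placeholders.
import numpy as np
import tensorflow as tf

def representative_dataset():
    # Replace with ~100 real, preprocessed images from your dataset.
    for _ in range(100):
        yield [np.random.rand(1, 300, 300, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_saved_model("my_detector_saved_model")
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

with open("detector_tf2_quant.tflite", "wb") as f:
    f.write(converter.convert())
```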
Hi @Corallo,

These are my classes:

I ran these commands and got my tflite model:

I then implemented the model in the object detection example but got this error:

Am I missing any steps for training my custom model? Should I make a change in another .py file?