-
I want to preface by saying, I'm really excited for the upcoming Frigate+. Unfortunately, what I want to detect right now doesn't fit its current model. Crossing my fingers for custom model support as it matures! I have a camera set up inside a bearded dragon enclosure, and I want to create a custom model to detect the lizard's movements to different parts of his enclosure. I'm able to train a TensorFlow model and get detection independent of Frigate, but I'm struggling to get it into a format that Frigate will accept. I've tried a few methods without much luck:
GPU TensorRT
Config:
Output:
CPU and TensorFlow Lite
Since I wasn't having much luck with TRT, I decided to try a CPU-based model.
Config:
Output:
Thoughts
The root of the issue is that I'm clearly not an AI / deep learning expert. I've just gotten sucked into this rabbit hole because I want to tinker and know when my lizard eats! I generated the models using Teachable Machine because its barrier to entry was lowest. This is likely the main issue, since the models that work with Frigate are YOLO-based. I haven't seen any discussions of anyone successfully training a model on a custom dataset outside of the YOLO labels. I'd love to hear any successes, and any documents / tips you may have to help me model my lizard in a way that Frigate can work with. I'm going to try this method: https://blog.paperspace.com/yolov7/ and report back when I have time!
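For anyone going down the same YOLO-training path: YOLO-style labels are just plain-text rows of normalized box coordinates, one file per image. A minimal sketch of producing one (the class id, image size, and box coordinates below are made-up examples, not from my actual setup):

```python
def to_yolo_line(class_id, x_min, y_min, x_max, y_max, img_w, img_h):
    """Convert a pixel-space bounding box to a YOLO label line:
    "<class> <x_center> <y_center> <width> <height>", all normalized to 0-1."""
    xc = (x_min + x_max) / 2 / img_w
    yc = (y_min + y_max) / 2 / img_h
    w = (x_max - x_min) / img_w
    h = (y_max - y_min) / img_h
    return f"{class_id} {xc:.6f} {yc:.6f} {w:.6f} {h:.6f}"

# Hypothetical example: a 1280x720 frame with the lizard at (400, 300)-(700, 550)
print(to_yolo_line(0, 400, 300, 700, 550, 1280, 720))
# -> 0 0.429688 0.590278 0.234375 0.347222
```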
-
The Coral / CPU models are based on MobileDet by default. @NateMeyer / @blakeblackshear might be able to offer more specific advice; there are other users on Frigate who use custom-trained models as well.
-
The shape of the tensors in your model is different from what the TensorRT detector is expecting. One option is to copy the detector from
Make sure you pass in a valid label map that includes a row for every possible class the model can return. It looks like your tflite model returned a class that didn't exist in your label map.
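To catch that label-map mismatch before Frigate hits it at runtime, a quick sanity check along these lines can help (the class names here are hypothetical, and Frigate's real label map is a text file with one name per class id; this is just the idea):

```python
# Hypothetical label map for a one-lizard enclosure model.
labelmap = {0: "lizard", 1: "food_bowl"}

def missing_classes(class_ids, labelmap):
    """Return any class ids the model emitted that the label map doesn't cover."""
    return sorted(set(int(c) for c in class_ids) - set(labelmap))

# A model that emits class id 2 here would fail lookup in Frigate's
# post-processing, which is the symptom described above.
print(missing_classes([0, 0, 2], labelmap))  # -> [2]
```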
-
I believe I finally got something working! Using YOLOv8, even!
I don't know how well it works in Frigate yet, but I believe that's a model / training issue, not a Frigate implementation issue. The only thing I don't fully understand is why the TensorRT file generated by Ultralytics was not just plug and play; I needed to pass an ONNX file through
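Part of why an exported engine isn't plug-and-play, as I understand it: a YOLOv8 export emits a raw head of shape (4 + num_classes, num_anchors) with no NMS applied, while Frigate's detectors consume post-processed rows of [class_id, score, y_min, x_min, y_max, x_max]. A toy pure-Python sketch of that decoding step (the shapes, threshold, and layout here are my assumptions, and a real pipeline also needs NMS):

```python
CONF_THRESH = 0.5  # assumed score cutoff, not a Frigate default

def decode(raw, img_size):
    """raw: per-channel lists, shape (4 + num_classes, num_anchors), where the
    first four channels are [cx, cy, w, h] in pixels and the rest are class
    scores. Returns [class_id, score, y_min, x_min, y_max, x_max] rows with
    coordinates normalized to 0-1."""
    detections = []
    for a in range(len(raw[0])):
        cx, cy, w, h = (raw[c][a] for c in range(4))
        scores = [raw[c][a] for c in range(4, len(raw))]
        best = max(range(len(scores)), key=lambda i: scores[i])
        if scores[best] < CONF_THRESH:
            continue
        detections.append([
            best, scores[best],
            (cy - h / 2) / img_size, (cx - w / 2) / img_size,
            (cy + h / 2) / img_size, (cx + w / 2) / img_size,
        ])
    return detections

# Toy (5, 2) tensor: two anchors, one class; only the first clears the threshold.
raw = [[160, 320], [120, 300], [80, 40], [60, 20], [0.9, 0.2]]
print(decode(raw, 320))  # one row for class 0 with score 0.9
```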
It's not refined by any means, but it compiles and plugs into Frigate with no errors.