FEAT: Support Rockchip TPU #3626
Conversation
@NickM-27 which Rockchip parts are you hoping to target with this? It looks like Rockchip has split newer products into a v2 of the toolchain. The good news is the rknn-toolkit-lite2 package supports Python 3.9. The bad news is I don't think the older parts are supported; those need the original library.
Things are looking dicey. I've only found an rknn-toolkit-lite2 build for Python 3.9 on arm, and was told by the owners that there are no plans for an amd64 3.9 variant. I've put this down for now (not in scope for 0.12) and am planning to revisit later to see what has changed. Either way, it'll have to be the lite version, and we'll need a separate process that builds / converts the model outside of Frigate's build process.
My understanding was that the lite package is just for running inference locally on the part. The amd64 package would prepare the model and could connect to an RKNPU as an accelerator. So yes, it would be hard to use in the second scenario, as an add-on. But I think the various RK3588 SBCs coming out can run locally with the lite toolchain. I have a Rock 5B on order I plan on playing with in a few weeks.
The Rockchip, using the lite toolkit, can load a model and run inferences; it just can't convert a model (which we can do beforehand anyway). Either way, right now it's looking like amd64 won't support it at all, unfortunately. They also don't have any documentation on how to get the tensor outputs for what the model returns, and that's where I was stuck when I decided to put it down.
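For reference, the load-and-run flow described above can be sketched roughly as below. This is a hedged sketch, assuming rknn-toolkit-lite2's `RKNNLite` API (`load_rknn` / `init_runtime` / `inference`); the `.rknn` model is assumed to have been converted ahead of time with the full toolkit, since the lite package can only load and run it.

```python
# Hedged sketch, not a tested implementation: assumes the RKNNLite API
# from rknn-toolkit-lite2, which is only installable on supported arm boards.
try:
    from rknnlite.api import RKNNLite
except ImportError:
    RKNNLite = None  # not available on amd64, per the discussion above


def run_inference(model_path, frame):
    """Load a pre-converted .rknn model and run one inference.

    `frame` is assumed to be a preprocessed array matching the model's
    input shape. The return value is the raw list of output tensors,
    which would still need to be parsed into Frigate detections.
    """
    if RKNNLite is None:
        raise RuntimeError("rknn-toolkit-lite2 is not available on this platform")
    rknn = RKNNLite()
    if rknn.load_rknn(model_path) != 0:
        raise RuntimeError(f"failed to load {model_path}")
    if rknn.init_runtime() != 0:
        raise RuntimeError("failed to init the RKNN runtime")
    try:
        return rknn.inference(inputs=[frame])
    finally:
        rknn.release()
```

The import guard mirrors the platform split discussed here: on amd64 the function fails fast rather than crashing at import time.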
I am way out of my depth here, but I saw the todo comment in the commit. From the examples it looks like
Given the discussion around the community-supported boards framework, it seems others may pick this up. Regardless, I am not looking to personally support this, as I won't use it myself.
This is a super early WIP of potentially supporting the rockchip TPU. It may turn out that it can't be supported.
To-Do:

- `rknn.inference`: get results and parse them out for Frigate
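Since the remaining to-do is turning raw inference output into something Frigate can use, here is a hedged sketch of that step. It assumes the model emits rows of `[x1, y1, x2, y2, score, class_id]` in input-resolution pixels (an assumption, since the actual output tensors are undocumented, per the discussion above), and targets the fixed-size detection layout Frigate's detectors use: rows of `[class_id, score, ymin, xmin, ymax, xmax]` with normalized coordinates, capped at 20 detections.

```python
# Hedged sketch: the input row layout is assumed, since the rknn model's
# output tensors are undocumented. Output rows follow Frigate's
# [class_id, score, ymin, xmin, ymax, xmax] detection format.
def parse_detections(raw_boxes, width, height,
                     score_threshold=0.4, max_detections=20):
    """Convert assumed [x1, y1, x2, y2, score, class_id] rows into
    Frigate-style rows, normalized to 0..1 and zero-padded so the
    result always has a fixed length."""
    detections = []
    for x1, y1, x2, y2, score, class_id in raw_boxes:
        if score < score_threshold:
            continue  # drop low-confidence boxes
        detections.append([
            float(class_id),
            float(score),
            y1 / height, x1 / width,   # top-left, normalized
            y2 / height, x2 / width,   # bottom-right, normalized
        ])
        if len(detections) == max_detections:
            break
    # pad so downstream code always sees a fixed-shape result
    while len(detections) < max_detections:
        detections.append([0.0] * 6)
    return detections
```

The fixed shape and zero padding match how other Frigate detectors return results, so the Rockchip path could plug into the same post-processing.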