Deepstack/CodeProject.AI detector #6522
---
First off, thank you to everyone involved in this project. I didn't expect this to work on the first try, since it's a work in progress and very beta. I've been playing with CodeProject.AI after seeing that @skrashevich opened #6143. I tested using the CodeProject.AI Explorer and ran its object detection against some of Frigate's incorrect matches; ipcam-combined actually returns the correct object, e.g. Frigate reported cat but ipcam-combined correctly identifies it as dog - excellent!

I'm wondering how to go about testing this in Frigate. I'm currently running the latest Frigate dev image, frigate:dev-b568a29-tensorrt, and my local CodeProject.AI instance runs from the same docker-compose.yml file as Frigate. I used the example configuration from #6143; changing the IP to localhost and removing the tensorrt detector and model section resulted in the following error:

Config Validation Errors:

frigate | 2023-05-17 09:38:17.723965344 *************************************************************
frigate | 2023-05-17 09:38:17.723968902 *************************************************************
frigate | 2023-05-17 09:38:17.723972552 *** Your config file is not valid! ***
frigate | 2023-05-17 09:38:17.723974398 *** Please check the docs at ***
frigate | 2023-05-17 09:38:17.723975636 *** https://docs.frigate.video/configuration/index ***
frigate | 2023-05-17 09:38:17.723990110 *************************************************************
frigate | 2023-05-17 09:38:17.723991274 *************************************************************
frigate | 2023-05-17 09:38:17.723992697 *** Config Validation Errors ***
frigate | 2023-05-17 09:38:17.723993845 *************************************************************
frigate | 2023-05-17 09:38:17.723994838 expected str, bytes or os.PathLike object, not NoneType
frigate | 2023-05-17 09:38:17.724480008 Traceback (most recent call last):
frigate | 2023-05-17 09:38:17.724482207 File "/opt/frigate/frigate/app.py", line 401, in start
frigate | 2023-05-17 09:38:17.724483269 self.init_config()
frigate | 2023-05-17 09:38:17.724484371 File "/opt/frigate/frigate/app.py", line 94, in init_config
frigate | 2023-05-17 09:38:17.724485438 self.config = user_config.runtime_config(self.plus_api)
frigate | 2023-05-17 09:38:17.724486543 File "/opt/frigate/frigate/config.py", line 1070, in runtime_config
frigate | 2023-05-17 09:38:17.724487493 detector_config.model.compute_model_hash()
frigate | 2023-05-17 09:38:17.724488613 File "/opt/frigate/frigate/detectors/detector_config.py", line 121, in compute_model_hash
frigate | 2023-05-17 09:38:17.724489578 with open(self.path, "rb") as f:
frigate | 2023-05-17 09:38:17.724490662 TypeError: expected str, bytes or os.PathLike object, not NoneType
frigate | 2023-05-17 09:38:17.724491500
frigate | 2023-05-17 09:38:17.724492538 *************************************************************
frigate | 2023-05-17 09:38:17.724509986 *** End Config Validation Errors ***
frigate | 2023-05-17 09:38:17.724511128 *************************************************************

I know that the frigate container is able to connect to localhost:32168, because I can curl the URL from the frigate container successfully:

docker compose exec frigate curl -v "http://localhost:32168/v1/vision/detection"
* Trying 127.0.0.1:32168...
* Connected to localhost (127.0.0.1) port 32168 (#0)
> GET /v1/vision/detection HTTP/1.1
> Host: localhost:32168
> User-Agent: curl/7.74.0
> Accept: */*
>
* Mark bundle as not supporting multiuse
< HTTP/1.1 405 Method Not Allowed
< Content-Length: 0
< Date: Wed, 17 May 2023 17:02:29 GMT
< Server: Kestrel
< Allow: POST
<
* Connection #0 to host localhost left intact
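As a possible follow-up check (not part of the original post): the 405 response above advertises `Allow: POST`, so a real detection request should be a POST. The Deepstack-compatible API takes a multipart form upload in a field named `image`; the snapshot path below is a placeholder.

```shell
# Hypothetical sanity check: POST an image to the detection endpoint
# the way a Deepstack-style client would. /path/to/snapshot.jpg is a
# placeholder for any local JPEG.
docker compose exec frigate curl -s -X POST \
  -F "image=@/path/to/snapshot.jpg" \
  "http://localhost:32168/v1/vision/detection"
```

If the server and module are healthy, this should return a JSON body with `success` and a `predictions` list rather than a 405.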
Here are my settings:

docker-compose.yml:

version: "3.9"
services:
  frigate:
    container_name: frigate
    privileged: true
    restart: unless-stopped
    image: ghcr.io/blakeblackshear/frigate:dev-b568a29-tensorrt
    network_mode: host
    shm_size: "80mb"
    volumes:
      - /etc/localtime:/etc/localtime:ro
      - /media/RAID/frigate:/media/frigate
      - /media/RAID/frigate/trt-models:/trt-models
      - ./config:/config
      - type: tmpfs
        target: /tmp/cache
        tmpfs:
          size: 1000000000
    environment:
      FRIGATE_RTSP_PASSWORD: "<redacted>"
      PLUS_API_KEY: "<redacted>"
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]
  CodeProject.AI:
    image: codeproject/ai-server:gpu
    container_name: CodeProject.AI
    environment:
      - TZ=America/Vancouver
    volumes:
      - ./deepstack/settings:/etc/codeproject/ai
      - ./deepstack/modules:/app/modules
    ports:
      - 32168:32168
    restart: unless-stopped
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]

config.yml:

mqtt:
  host: 192.168.1.10
  user: <redacted>
  password: <redacted>
ffmpeg:
  output_args:
    record: preset-record-generic-audio-copy
snapshots:
  enabled: True
  clean_copy: True
  timestamp: True
  bounding_box: True
go2rtc:
  streams:
    frontdoor: # Reolink RLC-810A - IPC_523128M8MP - v3.1.0.956_22041503
      - rtsp://<redacted>:<redacted>@192.168.1.150:554/h264Preview_01_main
      - ffmpeg:frontdoor#audio=opus
    duoleft: # Reolink Duo PoE - IPC_528B174MP - v3.0.0.1388_22100600
      - rtsp://<redacted>:<redacted>@192.168.1.151:554/h264Preview_01_main
      - ffmpeg:duoleft#audio=opus
    duoright: # Reolink Duo PoE - IPC_528B174MP - v3.0.0.1388_22100600
      - rtsp://<redacted>:<redacted>@192.168.1.151:554/h264Preview_02_main
      - ffmpeg:duoright#audio=opus
    inside: # Amcrest IP2M-841B - V2.420.AC00.18.R
      - rtsp://<redacted>:<redacted>@192.168.1.153:554/cam/realmonitor?channel=1&subtype=0
      - ffmpeg:inside#audio=opus
cameras:
  inside:
    enabled: true
    ffmpeg:
      hwaccel_args: preset-nvidia-h264
      inputs:
        - path: rtsp://127.0.0.1:8554/inside?video=copy&audio=aac
          input_args: preset-rtsp-restream
          roles:
            - record
    onvif:
      host: 192.168.1.153
      port: 80
      user: <redacted>
      password: <redacted>
  frontdoor:
    timestamp_style:
      position: "tl"
    ffmpeg:
      hwaccel_args: preset-nvidia-h264
      inputs:
        - path: rtsp://127.0.0.1:8554/frontdoor?video=copy&audio=aac
          input_args: preset-rtsp-restream
          roles:
            - record
            - detect
    detect:
      width: 1280
      height: 720
      fps: 15
    objects:
      track:
        - bear
        - bird
        - cat
        - dog
        - person
      filters:
        bird:
          mask:
            - 934,0,960,0,974,65,900,65
        person:
          min_area: 5000
          mask:
            - 934,0,960,0,974,65,900,65
            - 1280,316,1280,622,1158,504
        cat:
          mask:
            - 1280,316,1280,622,1158,504
    motion:
      mask:
        - 934,0,960,0,974,65,900,65
        - 1280,316,1280,622,1158,504
    mqtt:
      timestamp: False
      bounding_box: False
      crop: True
      quality: 100
      height: 1000
  duoleft:
    timestamp_style:
      position: "br"
    ffmpeg:
      hwaccel_args: preset-nvidia-h264
      inputs:
        - path: rtsp://127.0.0.1:8554/duoleft?video=copy&audio=aac
          input_args: preset-rtsp-restream
          roles:
            - record
            - detect
    detect:
      width: 1280
      height: 720
      fps: 15
    objects:
      filters:
        bird:
          max_area: 20000
        car:
          min_area: 3500
          mask:
            - 1280,720,1280,85,961,93,935,0,686,0,334,85,0,273,0,720
        person:
          min_area: 2500
          mask:
            - 648,260,666,343,608,379,568,330,586,245
            - 241,300,392,202,339,129,249,156,206,214
    mqtt:
      timestamp: False
      bounding_box: False
      crop: True
      quality: 100
      height: 1000
  duoright:
    timestamp_style:
      position: "bl"
    ffmpeg:
      hwaccel_args: preset-nvidia-h264
      inputs:
        - path: rtsp://127.0.0.1:8554/duoright?video=copy&audio=aac
          input_args: preset-rtsp-restream
          roles:
            - record
            - detect
    detect:
      width: 1280
      height: 720
      fps: 15
    objects:
      filters:
        person:
          min_area: 2000
        bear:
          min_area: 3000
        car:
          min_area: 1500
          mask:
            - 1280,720,1280,450,1000,450,1000,720
            - 172,21,173,0,439,0,546,86
    mqtt:
      timestamp: False
      bounding_box: False
      crop: True
      quality: 100
      height: 1000
detectors:
  tensorrt:
    type: tensorrt
    device: 0
  # deepstack:
  #   api_url: http://localhost:32168/v1/vision/detection
  #   type: deepstack
  #   api_timeout: 0.1 # seconds
model:
  path: /trt-models/yolov7x-320.trt
  labelmap_path: /trt-models/coco_91cl.txt
  input_tensor: nchw
  input_pixel_format: rgb
  width: 320
  height: 320
  model_type: yolox
record:
  enabled: True
  retain:
    days: 7
    mode: all
  events:
    retain:
      default: 14
      mode: active_objects
      objects:
        dog: 7
        cat: 7
        car: 7
objects:
  track:
    - bear
    - bird
    - car
    - cat
    - dog
    - person
birdseye:
  enabled: True
  mode: continuous
timestamp_style:
  format: "%Y-%m-%d %H:%M:%S"
  effect: shadow
For my test I was swapping this:

detectors:
  tensorrt:
    type: tensorrt
    device: 0
  # deepstack:
  #   api_url: http://localhost:32168/v1/vision/detection
  #   type: deepstack
  #   api_timeout: 0.1 # seconds
model:
  path: /trt-models/yolov7x-320.trt
  labelmap_path: /trt-models/coco_91cl.txt
  input_tensor: nchw
  input_pixel_format: rgb
  width: 320
  height: 320
  model_type: yolox

for this:

detectors:
  # tensorrt:
  #   type: tensorrt
  #   device: 0
  deepstack:
    api_url: http://localhost:32168/v1/vision/detection
    type: deepstack
    api_timeout: 0.1 # seconds
# model:
#   path: /trt-models/yolov7x-320.trt
#   labelmap_path: /trt-models/coco_91cl.txt
#   input_tensor: nchw
#   input_pixel_format: rgb
#   width: 320
#   height: 320
#   model_type: yolox

Thanks for reading!
Replies: 3 comments 7 replies
---
Looks like the changes to implement Frigate+ models aren't compatible with your config. It's looking for a path to a model file, but there isn't one.
---
hotfix in #6525
---
Thanks guys!
> hotfix in #6525

But it's an architecture problem: Frigate expects a local model file so that it can compute its hash, which doesn't apply to remote detectors like deepstack. How to solve this most correctly is a question for @blakeblackshear.
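To illustrate the shape of a fix (a minimal sketch under my own assumptions, not Frigate's actual code - the real change is in #6525): the hash computation could simply be skipped when no local model path is configured, so remote detectors such as deepstack don't crash on startup.

```python
import hashlib

def compute_model_hash(path):
    """Hypothetical None-tolerant model hash (not Frigate's real
    implementation): remote detectors like deepstack have no local
    model file, so there is nothing to hash."""
    if path is None:
        return None  # remote detector: skip hashing instead of crashing
    with open(path, "rb") as f:
        return hashlib.md5(f.read()).hexdigest()
```

With a guard like this, calling the hash step with an unset path would return None instead of raising `TypeError: expected str, bytes or os.PathLike object, not NoneType`.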