[FR] Add ability to record on any movement #803

Closed
ljmerza opened this issue Feb 16, 2021 · 100 comments · Fixed by #10245
Labels
enhancement New feature or request pinned

Comments

@ljmerza
Contributor

ljmerza commented Feb 16, 2021

My Frigate instance has way too many false positives, so I've had to raise the threshold, but now I get a lot of false negatives. Since Frigate only records either "clips" on object detection or 24/7, there's no realistic way to sift through the footage that's actually useful. I either miss important events in clips because of false negatives, or I have to dig through the 24/7 recordings.

I think a good balance would be to add the ability to record on any movement.

@blakeblackshear
Owner

If you were struggling to sift through the false positives with detection, this won't help. Recording on all motion would just increase the number of false positives.

@ljmerza
Contributor Author

ljmerza commented Feb 22, 2021

I struggle with false negatives, not false positives... That's the point of recording on motion. With 24/7 recording, how could I know where to look if someone stole something off my back porch, I didn't notice for 3 days, and the object detection didn't pick it up? That's 3 days' worth of 24/7 recordings I now have to sift through to find when it was stolen. I want to be able to record on all motion; then false negatives won't matter as much. Recording on motion is standard on NVR systems anyway.

@muzzak123

I'm not sure if this is the same thing, but I would like to be able to capture a pic and clip based on ANY motion in a defined zone.
It seems like Frigate will only do this if it recognises an object. For example, I would like to know when my garage camera "sees" the garage door open. I've checked the camera in Frigate with Motion Boxes turned on and it does detect the motion, but obviously doesn't recognise it as an object. Perhaps an object of "Motion" could be used?
I know FFmpeg Motion https://www.home-assistant.io/integrations/ffmpeg_motion/ will do this, but it would be nice to have it all bundled into the amazing Frigate UI.

@ronschaeffer
Contributor

+1 for this. It would be great for consolidating camera management with Frigate's awesome UI/HA integration.

@chrisgott

I have similar problems at nighttime (no IR-capable cam). I guess it will detect the motion, but it can't detect either a person or a car, so there is no alerting. Advising burglars to visit me at daytime only is maybe not the best option, so I'm asking: is it possible somehow to set an alert via MQTT on motion only? Or maybe set the score lower so that humanoid shadows will trigger?

@blakeblackshear
Owner

You can use the detection fps sensor to alert on motion if you want.
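
For anyone wanting to wire that up, a minimal Home Assistant automation along these lines might look like the sketch below. The entity id and notify service are assumptions; they depend on your camera name in the Frigate integration and on your own notification setup.

alias: Alert on camera motion via detection fps
trigger:
  - platform: numeric_state
    entity_id: sensor.front_cam_detection_fps   # assumed entity name
    above: 0
action:
  - service: notify.mobile_app_my_phone          # assumed notify target
    data:
      message: Motion detected on the front camera
mode: single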

@ljmerza
Contributor Author

ljmerza commented May 13, 2021

Alert on movement might be too much. Can I record on movement instead?

@blakeblackshear
Owner

If you could, this issue would already be closed.

@chrisgott

You can use the detection fps sensor to alert on motion if you want.

It increases from 0 to 0.2 fps on the first motion (a person walking) and stays at that value for a long time before falling back to 0. A second motion (a moving car) just seconds later leads to no change. Anyway, I'll give that a try because it's better than nothing.

@stale

stale bot commented Jun 12, 2021

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the stale label Jun 12, 2021
@ljmerza
Contributor Author

ljmerza commented Jun 12, 2021

Don't close it. Still an issue.

@stale stale bot removed the stale label Jun 12, 2021
@osos

osos commented Jun 28, 2021

I was looking for this as well.

Would love to just track "motion" in particular zones, and objects in other zones.

@stale

stale bot commented Jul 28, 2021

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the stale label Jul 28, 2021
@ljmerza
Contributor Author

ljmerza commented Jul 28, 2021

Don't close

@stale stale bot removed the stale label Jul 28, 2021
@blakeblackshear
Owner

0.9.0 will lay the groundwork for this. In a future release, motion will be available as an option for the recordings retention policy.
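
For reference, the retention option that eventually shipped is configured roughly like this (a sketch; exact behaviour depends on the Frigate version):

record:
  enabled: true
  retain:
    days: 7
    mode: motion   # keep only the recording segments where motion was detected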

@muzzak123

Great news

@Numline1

Numline1 commented Sep 5, 2021

If anyone's interested, there's a slightly clunky but pretty fair workaround: the motionEye addon can already record on any movement, and it also supports re-streaming. You could temporarily use that for motion detection and then feed the stream to Frigate for object detection. Not the best solution ever, but it should work fairly well.

@muzzak123

@Numline1 I'm not sure what you mean. Perhaps you can elaborate. From my perspective, Frigate can detect the motion (the fps sensor changes), but it doesn't record it because it doesn't recognise the object. I want to be able to record even if the object is not recognised by Frigate.

@Numline1

Numline1 commented Sep 6, 2021

@muzzak123 I'm sorry, I kind of assumed everyone's using Frigate as a Home Assistant addon, which obviously isn't true. Either way, motionEye is separate software (motionEye is the frontend, motion is the backend - https://motion-project.github.io). Also, keep in mind this is a workaround / possible solution for the problem, not a Frigate integration or anything like that.

As I mentioned in my previous comment, you can temporarily (or permanently) install motion to handle your motion detection/recording and then re-stream the RTSP feed into Frigate for object detection. Once Frigate has its own motion-based recording in 0.9.x, you can get rid of motion (or keep it, if you'd like).

I'd also like to note that I'm in no way associated with either the motion or Frigate projects; hopefully the Frigate author doesn't mind me mentioning a third-party project. It's just something that works well for me, since I find both projects pretty awesome.
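
As a rough sketch of that workaround, Frigate would simply consume whatever stream URL motion/motionEye exposes as a normal camera input (the URL below is a placeholder, not a real motionEye default):

cameras:
  back_porch:
    ffmpeg:
      inputs:
        - path: rtsp://motioneye.local:8554/back_porch   # placeholder restream URL
          roles:
            - detect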

@yuchangyuan

It's quite hard to detect a baby under a quilt as a person when the camera is in IR mode. So I think recording all motion may be a better choice than trying to train the model myself.

@cianpdx

cianpdx commented Nov 11, 2021

Could a workaround be to set the threshold for an object you don't care about (hairdryer) so low that it always gets a positive from the detection? Then it is saved.

But you can run your automations off the person or automobile detection, which has normal thresholds.

Or add a watermark of an object that will always get a hit and record that snapshot.
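
For what it's worth, the first idea would look roughly like the sketch below ("hair drier" is the COCO label spelling; the values are illustrative only, and see the caveats raised in the replies):

objects:
  track:
    - person       # normal thresholds, used for automations
    - hair drier   # sacrificial label with a deliberately low bar
  filters:
    hair drier:
      min_score: 0.01
      threshold: 0.02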

@AdmiralNemo
Sponsor

That's similar to my workaround. There is always a car in my camera's field of view, so I have Frigate set to record when it detects a car. It works since Frigate only cares that there was motion and that a car was detected, not necessarily that the car was in motion.

@Junkbite

Could a workaround be to set the threshold for an object you don't care about (hairdryer) so low that it always gets a positive from the detection? Then it is saved.

But you can run your automations off the person or automobile detection, which has normal thresholds.

No, because then it would think all sorts of stationary items are objects and record constantly. If you set your threshold really low, all kinds of stuff in the picture will trigger. A garden hose sitting in my back yard would trigger as a person when I set the threshold low, even though it wasn't moving. This would not work as a substitute.

@Ataman

Ataman commented Dec 3, 2021

New Frigate user here. My installation has difficulty detecting cars (or people) in the darker areas at night. The problem is, the suspicious activity we're trying to record mostly happens at night.

Having the ability to mark a zone that records on motion even without detection, but only at specific times, would be a great fallback. Yes, it would cause false positives, but since it's night it should be manageable.

@NickM-27
Sponsor Collaborator

And since person is the default, it becomes odd that 'motion' isn't recording all motion events when you explicitly leave out all references to detect people/etc.

Events are tracked objects; you can still easily have motion-based 24/7 recording along with events.

A future version will show a timeline in the recordings view with the points in time where motion was detected, easily seekable.

@incith

incith commented Jul 26, 2023

Right! I guess that's what I was trying to agree with above: a motion event that gets logged to the same events page would be used by more than a few people, I think.

I guess my concern is that if something amazing happens outside, I don't want to miss it because a tracked object wasn't in the shot.

@NickM-27
Sponsor Collaborator

Yeah, the events page is for tracked objects. Motion detection is unreliable and would just clog the events with clutter and make it difficult to see the actually important things that happened.

@incith

incith commented Jul 26, 2023

You know, I don't disagree.

I think my best scenario is recording everything, then using tracked objects on top of that.

Appreciate everything! Thank you!

@NickM-27
Sponsor Collaborator

I'm a little confused by that last part, so I want to make sure it's clear. You can easily record events with tracked objects AND record 24/7 on any motion (or record 24/7 and keep everything even if there was no motion). That way, if something happened that wasn't a tracked object, it was still recorded.
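
Concretely, that combination looks roughly like this in the config (key names as in Frigate 0.12/0.13; values are illustrative):

record:
  enabled: true
  retain:
    days: 7
    mode: motion             # 24/7 recording, keeping any segment with motion
  events:
    retain:
      default: 30
      mode: active_objects   # clips around tracked objects, kept longer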

@incith

incith commented Jul 26, 2023

Yup, exactly. That's how I plan to do it now. Thanks again!

@doug62

doug62 commented Jul 26, 2023

+1 this feature, image detection isn't good enough to track all that is needed - Record on any motion (within zones) would be a very critical feature.

@BBaoVanC

+1 this feature, image detection isn't good enough to track all that is needed - Record on any motion (within zones) would be a very critical feature.

Record on motion is already a thing, in the sense that you can configure Frigate's 24/7 recording retention to only keep the footage where motion was detected.

For me, most of the motivation for wanting motion to show up in Events goes away given the ability to create custom events from the recordings view (which, iirc, is coming in the next release) for stuff that didn't get detected, and bonus if there were a nicer UI for recordings.

@doug62

doug62 commented Aug 15, 2023

+1 - I created a similar request quite some time ago. I have so many false positives and false negatives with image detection that I can really only count on motion; I actually had a car stolen that wasn't detected...
If someone could create a sensor/class called "motion" and have it work within the same object detection flow, that would be brilliant.
I wonder if my problems are caused by using a CPU rather than a card for detection. I run this in a K8s cluster with many CPUs, so if quality depends on available CPU, that shouldn't be my problem (i.e. does this use CPU threads properly?).

@NickM-27
Sponsor Collaborator

+1 - I created a similar request quite some time ago. I have so many false positives and false negatives with image detection that I can really only count on motion; I actually had a car stolen that wasn't detected...

If someone could create a sensor/class called "motion" and have it work within the same object detection flow, that would be brilliant.

I wonder if my problems are caused by using a CPU rather than a card for detection. I run this in a K8s cluster with many CPUs, so if quality depends on available CPU, that shouldn't be my problem (i.e. does this use CPU threads properly?).

It doesn't matter how fast the CPU is; it is still orders of magnitude slower than a Coral or GPU at running the model. This is likely the issue with your setup.

@doug62

doug62 commented Aug 16, 2023

@NickM-27 Re: "It doesn't matter how fast the CPU" - I have 6 cams and it never seems to use more than 6 CPUs when 32 are available... So if my poor detection is caused by resources, it might be because the software isn't threading/scaling optimally.

@NickM-27
Sponsor Collaborator

NickM-27 commented Aug 16, 2023

@NickM-27 Re: "It doesn't matter how fast the CPU" - I have 6 cams and it never seems to use more than 6 CPUs when 32 are available... So if my poor detection is caused by resources, it might be because the software isn't threading/scaling optimally.

That's not how it works. It doesn't matter that your CPU is not fully utilized; CPUs run the inferences slowly because they require many more instructions. Meanwhile, a TPU or GPU runs these operations much more efficiently because it has dedicated hardware for this specific task.

It's the same reason a GPU decodes/encodes video faster than a CPU.

@doug62

doug62 commented Aug 16, 2023

@NickM-27 I'm wondering why object detection is so unreliable, which is why I'm hoping we can make motion detection a first-class citizen. I was GUESSING that poor object detection might be caused by not utilizing the available compute, perhaps because the code doesn't utilize threads/maximum CPU/resources? (From what I have seen, FFmpeg/OpenCV don't do this well.)

Take this line for example: (I'm a C# guy, this is PLINQ)
frames.AsParallel()
      .WithDegreeOfParallelism(50)
      .ForAll(frame => ProcessFrame(frame));

This code instantly starts up to 50 threads on 50 cores if they are there (otherwise it will manage them). I have 200 cores in a Kubernetes cluster; why would I pay for custom hardware?

I see that you are a Sponsor/Collaborator - thanks for your good work here.

@NickM-27
Sponsor Collaborator

I don't see how I can explain it more clearly: extra cores don't help; dedicated hardware made for the task does. You can easily see this in the inference times on the system page and in the docs: https://docs.frigate.video/frigate/hardware#detectors

@NickM-27
Sponsor Collaborator

Another part of the explanation for suboptimal object detection is that the default model is a Google demo model trained on the COCO dataset. If you look at some example images from the COCO dataset, they do not look like security camera images at all. This leads to false positives.

There are a number of filters and other options to reduce false positives.

Better models, like what Frigate+ is planning to offer, that are trained on actual security camera examples, offer higher accuracy and fewer issues with false positives.
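
Those filter options look roughly like the sketch below (illustrative values; see the object filter docs for the full list):

objects:
  filters:
    person:
      min_score: 0.6    # minimum score for a detection to be considered at all
      threshold: 0.75   # minimum computed score for a tracked object to count as a true positive
      min_area: 5000    # ignore bounding boxes smaller than this many pixels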

@doug62

doug62 commented Aug 16, 2023

I think we are saying the same thing: extra cores don't help because the software doesn't take advantage of the available resources, AND proprietary hardware is required for object detection.

@BBaoVanC

I think we are saying the same thing: extra cores don't help because the software doesn't take advantage of the available resources, AND proprietary hardware is required for object detection.

If I understand correctly, "slowly" refers to the latency. With software video encoding, for example, adding more CPU cores doesn't really make it process any individual part of the video faster; it just processes more of the video simultaneously. I think object detection is similar: with more cores, it will still take the same 100 ms to run a detection, but you'd be able to handle more cameras simultaneously.

So it can't take advantage of more available resources because there's no more work left to feed it that isn't already in progress.

@NickM-27
Sponsor Collaborator

NickM-27 commented Aug 17, 2023

I think we are saying the same thing: extra cores don't help because the software doesn't take advantage of the available resources, AND proprietary hardware is required for object detection.

If I understand correctly, "slowly" refers to the latency. With software video encoding, for example, adding more CPU cores doesn't really make it process any individual part of the video faster; it just processes more of the video simultaneously. I think object detection is similar: with more cores, it will still take the same 100 ms to run a detection, but you'd be able to handle more cameras simultaneously.

So it can't take advantage of more available resources because there's no more work left to feed it that isn't already in progress.

Exactly. Running a single model inference on a single frame is a synchronous action; multi-threading does not improve this. It simply requires many more instructions on a general computing device (CPU) than it does on a device dedicated to these inferences (TPU, GPU).

@NickM-27 NickM-27 added the enhancement New feature or request label Aug 18, 2023
@doug62

doug62 commented Sep 8, 2023

@NickM-27 Sorry to keep nagging. I noticed that my Frigate reports 103% utilization while only using one CPU when about 20 are available. I really want this to work. Since you have more experience with the code, could you point me to the file that does the frame processing? I would like to take a crack at fixing the multi-threading so that this could work with more than 3 cameras on a high-powered PC... Thanks in advance.

@NickM-27
Sponsor Collaborator

NickM-27 commented Sep 8, 2023

If you want to run object detection on multiple CPUs, you would set up multiple CPU detectors.

But again, this won't improve inference times.
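
A sketch of what multiple CPU detectors look like in the config (the detector names are arbitrary):

detectors:
  cpu1:
    type: cpu
    num_threads: 3
  cpu2:
    type: cpu
    num_threads: 3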

@ivanjx

ivanjx commented Nov 27, 2023

This would be far more useful for my use case, since the detection AI model is not good enough with my camera. Better to have false positives than false negatives.

Currently I'm relying on HA's history feature, but it is quite annoying to keep switching between the history and recordings views.

@vw-kombi

For my config, I have a front camera with three zones: front left, front path and driveway.
I want person detection and recording events on front left and driveway, but the front path, which includes my letterbox, never records the postman, as they are on a strange electric bike contraption and it is never identified as anything, so I don't get any recordings of deliveries. So I want this zone to record on ANY MOVEMENT. It's important to capture deliveries so that when they say something was delivered, I can prove it was not. Any ideas?

@NickM-27
Sponsor Collaborator

https://docs.frigate.video/configuration/record#reduced-storage-only-saving-video-when-motion-is-detected

@vw-kombi

https://docs.frigate.video/configuration/record#reduced-storage-only-saving-video-when-motion-is-detected

Hi - is this directed at my comment? I see nothing in there that will help with what I want to do - or am I not understanding it?

@NickM-27
Sponsor Collaborator

NickM-27 commented Mar 20, 2024

You said

So I want this zone to record on ANY MOVEMENT.

The docs I linked explain directly how to configure Frigate to record on any movement; there is no way to do zone-based motion recording. Frigate's motion detection is not designed to work with zones, as it is not that accurate.

@vw-kombi

vw-kombi commented Mar 20, 2024

I am sorry - I had a stroke some time back, and it took a while of googling to get what I have now working almost perfectly (just missing the postman on his cart). Maybe all the config stuff I picked up along the way is conflicting here. This is my config below, and it's the front-path zone that I want to trigger not just on a person but on any movement, and then create a snapshot, an event, and start recording while there is movement:

cameras:
  Front-Cam:
    enabled: true
    ffmpeg:
      output_args:
        record: preset-record-generic-audio-copy
      inputs:
      - path: rtsp://hikvision:[email protected]:554/Streaming/Channels/102   # <----- The stream you want to use for detection
        input_args: preset-rtsp-restream
        roles:
        - detect
        - rtmp
      - path: rtsp://hikvision:[email protected]:554/Streaming/Channels/101   # <----- The stream you want to use for recording
        input_args: preset-rtsp-restream
        roles:
        - record
    detect:
      enabled: true # <---- disable detection until you have a working camera feed
      width: 1280
      height: 720
      fps: 6
    objects:
      track:
      - person
      - motorcycle
      filters:
        motorcycle:
          min_score: 0.4   # min_score is a fraction between 0 and 1
    record:
      enabled: true
      retain:
        days: 7
        mode: motion
      events:
        required_zones:
        - front-left
        - front-path
        - driveway
        retain:
          default: 7
          mode: motion
    snapshots:
      enabled: true
      required_zones:
      - front-left
      - front-path
      - driveway
      bounding_box: true
      timestamp: true
      retain:
        default: 7
    motion:
      mask:
      - 184,676,426,673,425,720,0,720,0,671
      - 1024,0,1024,31,711,28,711,0
      - 967,720,1024,720,1024,34,649,27,643,47,879,92,842,282
      - 327,375,683,462,670,222,548,190,418,182
      - 0,372,155,410,182,315,221,179,112,177
      - 0,204,128,108,129,0,0,0
    zones:
      driveway:
        coordinates: 0,365,0,664,407,663,418,692,673,694,690,210,247,171
      front-path:
        coordinates: 
          863,117,855,217,828,280,851,346,929,634,873,720,693,720,708,414,708,135,530,96,533,57
      front-left:
        coordinates: 196,105,338,89,343,49,192,68

@NickM-27
Sponsor Collaborator

NickM-27 commented Mar 20, 2024

then create a snapshot, an event, and start recording while there is movement:

Right, I think this is the disconnect here. That is not supported, as explained above. In Frigate 0.13 you will only get recordings for this scenario, which you will need to view using the recordings viewer.

[screenshot: the recordings viewer]

This issue is closed because in the next version of Frigate (0.14) the UI has been rewritten and there is a dedicated place to view a timeline showing motion activity and the times where objects were detected, so it will be easy to review things that Frigate did not detect.
