US20150092052A1 - Image monitoring system and surveillance camera - Google Patents

Image monitoring system and surveillance camera

Info

Publication number
US20150092052A1
Authority
US
United States
Prior art keywords
abnormal
sound source
image
abnormal sound
abnormal image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US14/480,750
Other versions
US10204275B2 (en)
Inventor
Dong Hak SHIN
Ki Yong Jeon
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hanwha Vision Co Ltd
Original Assignee
Samsung Techwin Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Techwin Co Ltd filed Critical Samsung Techwin Co Ltd
Assigned to SAMSUNG TECHWIN CO., LTD. reassignment SAMSUNG TECHWIN CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: JEON, KI YONG, SHIN, DONG HAK
Publication of US20150092052A1 publication Critical patent/US20150092052A1/en
Assigned to HANWHA TECHWIN CO., LTD. reassignment HANWHA TECHWIN CO., LTD. CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: SAMSUNG TECHWIN CO., LTD.
Assigned to HANWHA AEROSPACE CO., LTD. reassignment HANWHA AEROSPACE CO., LTD. CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: HANWHA TECHWIN CO., LTD
Assigned to HANWHA AEROSPACE CO., LTD. reassignment HANWHA AEROSPACE CO., LTD. CORRECTIVE ASSIGNMENT TO CORRECT THE APPLICATION NUMBER 10/853,669. IN ADDITION PLEASE SEE EXHIBIT A PREVIOUSLY RECORDED ON REEL 046927 FRAME 0019. ASSIGNOR(S) HEREBY CONFIRMS THE CHANGE OF NAME. Assignors: HANWHA TECHWIN CO., LTD.
Application granted granted Critical
Publication of US10204275B2 publication Critical patent/US10204275B2/en
Assigned to HANWHA TECHWIN CO., LTD. reassignment HANWHA TECHWIN CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HANWHA AEROSPACE CO., LTD.
Assigned to HANWHA VISION CO., LTD. reassignment HANWHA VISION CO., LTD. CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: HANWHA TECHWIN CO., LTD.
Active legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06K 9/00771
    • G06K 9/00718
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/76 Television signal recording
    • H04N 5/765 Interface circuits between an apparatus for recording and another apparatus
    • H04N 5/77 Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/183 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/79 Processing of colour television signals in connection with recording
    • H04N 9/80 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N 9/82 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
    • H04N 9/8205 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/79 Processing of colour television signals in connection with recording
    • H04N 9/80 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N 9/82 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
    • H04N 9/8205 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
    • H04N 9/8211 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal the additional signal being a sound signal
    • G06K 2009/00738
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/40 Scenes; Scene-specific elements in video content
    • G06V 20/44 Event detection


Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Alarm Systems (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Burglar Alarm Systems (AREA)

Abstract

An image monitoring system including a surveillance camera is provided. The surveillance camera includes: an image analyzer configured to analyze an input image; a sound source analyzer configured to analyze a sound source in the image or a sound source that is input separately; and a determiner configured to determine whether an abnormal image and an abnormal sound source exist, and when the abnormal image and the abnormal sound source are obtained, according to a result of the image analysis and the sound source analysis, to generate metadata and alarm information based on a result of the determination, wherein the abnormal image and the abnormal sound source are predefined.

Description

    CROSS-REFERENCE TO THE RELATED APPLICATION
  • This application claims priority from Korean Patent Application No. 10-2013-0115708, filed on Sep. 27, 2013, in the Korean Intellectual Property Office, the disclosure of which is incorporated herein in its entirety by reference.
  • BACKGROUND
  • 1. Field
  • Apparatuses and methods consistent with exemplary embodiments of the inventive concept relate to an image monitoring system that analyzes an image and/or a sound source to generate, store, and display metadata about an abnormal situation, a surveillance camera included in the image monitoring system, and a method of operating the surveillance camera.
  • 2. Description of the Related Art
  • An image monitoring system generally includes a surveillance camera and an image recording apparatus such as a network video recorder (NVR) or a digital video recorder (DVR), and analyzes an image acquired via the surveillance camera by using an image analysis function to determine whether a surveillance scene is abnormal according to predefined rules.
  • As the number of surveillance cameras that simultaneously operate in an image monitoring system increases, it becomes difficult for a user to check in real time whether all surveillance scenes are abnormal. Also, although a moving object can be detected through image analysis, it is difficult to determine whether an abnormal situation has occurred. In addition, even when an abnormal situation is detected through image analysis, the detection result may be false.
  • SUMMARY
  • One or more embodiments of the inventive concept include an image monitoring system that analyzes an image and/or sound source to generate, store, and display metadata about an abnormal situation so as to increase the monitoring efficiency, a surveillance camera included in the image monitoring system, and a method of operating the surveillance camera.
  • Various aspects of exemplary embodiments will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the embodiments.
  • According to an aspect of an exemplary embodiment, there is provided a surveillance camera which may include: an image analyzer configured to analyze an input image; a sound source analyzer configured to analyze a sound source in the image or a sound source that is input separately; and a determiner configured to determine whether an abnormal image and an abnormal sound source exist, and when the abnormal image and the abnormal sound source are obtained, according to a result of the image analysis and the sound source analysis, to generate metadata and alarm information based on a result of the determination, wherein the abnormal image and the abnormal sound source are predefined.
  • With regard to the determiner, if it is determined that the abnormal image and the abnormal sound source are simultaneously obtained, the determiner may further determine whether the abnormal sound source is input from the abnormal image or an outside of the abnormal image.
  • If it is determined by the determiner that the abnormal image and the abnormal sound source are simultaneously obtained, and the abnormal sound source is obtained from the outside of the abnormal image, a metadata generator included in the surveillance camera may generate metadata about each of the abnormal image and the abnormal sound source.
  • If it is determined by the determiner that the abnormal image and the abnormal sound source are simultaneously obtained, and the abnormal sound source is obtained from the abnormal image, the determiner may further determine whether the abnormal image and the abnormal sound source indicate a same event.
  • The surveillance camera may further include the metadata generator configured to generate the metadata comprising times when the abnormal image and the abnormal sound source are obtained, respectively, types of the abnormal image and the abnormal sound source, a number of events indicated by the abnormal image and the abnormal sound source, whether the abnormal sound source is obtained from the abnormal image or an outside of the abnormal image, and whether the abnormal image and the abnormal sound source indicate a same event, with reference to the determination result of the determiner. The surveillance camera may also include an alarm information generator configured to generate the alarm information about the abnormal image and the abnormal sound source with reference to the determination result of the determiner and the metadata.
  • The types of the abnormal sound source and the abnormal image each may include a plurality of candidates and accuracy information about the candidates, and the determiner may determine whether the abnormal image and the abnormal sound source indicate the same event with reference to the plurality of candidates corresponding to the types of the abnormal image and the abnormal sound source and the accuracy information.
  • If it is determined by the determiner that a sum of a probability of a type of the abnormal image corresponding to a preset event and a probability of a type of the abnormal sound source corresponding to the preset event exceeds a preset threshold value, the determiner may further determine that the abnormal image and the abnormal sound source indicate the preset event.
  • If it is determined by the determiner that the abnormal image and the abnormal sound source are simultaneously obtained, the determiner may further determine whether the abnormal sound source is obtained from the abnormal image or an outside of the abnormal image.
  • If it is determined by the determiner that the abnormal image and the abnormal sound source are simultaneously obtained, and the abnormal sound source is obtained from the outside of the abnormal image, the metadata generator may generate the metadata about each of the abnormal image and the abnormal sound source.
  • If it is determined by the determiner that the abnormal image and the abnormal sound source are simultaneously obtained, and the abnormal sound source is obtained from the abnormal image, the determiner may further determine whether the abnormal image and the abnormal sound source indicate the same event, and if it is determined by the determiner that the abnormal image and the abnormal sound source indicate the same event, the metadata generator may reduce the number of events indicated by the abnormal image and the abnormal sound source from the metadata.
  • If it is determined by the determiner that the abnormal image and the abnormal sound source are simultaneously obtained, the abnormal sound source is obtained from the abnormal image, and the abnormal sound source and the abnormal image indicate different events, the metadata generator may generate the metadata about each of the abnormal image and the abnormal sound source.
  • According to an aspect of another exemplary embodiment, there is provided a monitoring system including the above surveillance camera; an image recording apparatus configured to receive the image, the metadata and the alarm information from the camera, store the image, the metadata and the alarm information, and perform processing for displaying the alarm information; and a display apparatus configured to display the image, the metadata, and the alarm information.
  • According to an aspect of still another exemplary embodiment, there is provided a method of controlling a surveillance camera, the method including: analyzing an input image; analyzing a sound source in the image or a sound source that is input separately; determining whether an abnormal image and an abnormal sound source exist, and when the abnormal image and the abnormal sound source are obtained, according to a result of the image analysis and the sound source analysis; generating metadata and alarm information based on a result of the determining, wherein the abnormal image and the abnormal sound source are predefined.
  • The determining may include determining whether the abnormal sound source is obtained from the abnormal image or an outside of the abnormal image.
  • The determining may also include determining whether the abnormal image and the abnormal sound source indicate a same event.
  • The metadata includes information about when the abnormal image and the abnormal sound source are obtained, respectively, types of the abnormal image and the abnormal sound source, a number of events indicated by the abnormal image and the abnormal sound source, whether the abnormal sound source is obtained from the abnormal image or an outside of the abnormal image, and whether the abnormal image and the abnormal sound source indicate a same event, with reference to a result of the determining.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • These and/or other aspects will become apparent and more readily appreciated from the following description of the exemplary embodiments, taken in conjunction with the accompanying drawings, in which:
  • FIG. 1 is a block diagram illustrating an image monitoring system according to an exemplary embodiment;
  • FIG. 2 is a block diagram illustrating a detailed structure of a surveillance camera of the image monitoring system of FIG. 1, according to an exemplary embodiment;
  • FIGS. 3A and 3B are views illustrating metadata that are generated by the surveillance camera of the image monitoring system of FIG. 1, according to an exemplary embodiment; and
  • FIG. 4 is a flowchart of a method of operating an image monitoring system, according to an exemplary embodiment.
  • DETAILED DESCRIPTION OF THE EXEMPLARY EMBODIMENTS
  • Reference will now be made in detail to exemplary embodiments in reference to the accompanying drawings, in which like reference numerals refer to like elements throughout. In this regard, the present embodiments may have different forms and should not be construed as being limited to the descriptions set forth herein. Accordingly, the embodiments are merely described below, by referring to the figures, to explain various aspects of the inventive concept. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items. Expressions such as “at least one of,” when preceding a list of elements, modify the entire list of elements and do not modify the individual elements of the list.
  • FIG. 1 is a block diagram illustrating an image monitoring system according to an exemplary embodiment.
  • Referring to FIG. 1, the image monitoring system includes a surveillance camera 100, a microphone 200, an image recording apparatus 300, and a display apparatus 400.
  • The surveillance camera 100 may be a single fixed camera including a fixed lens and having a fixed capturing range, or a pan-tilt-zoom (PTZ) camera having a variable capturing range. If the surveillance camera 100 is a PTZ camera, the surveillance area may be easily changed through a pan motion in a horizontal direction, a tilt motion in a vertical direction, and a zoom-in and/or zoom-out motion. In comparison with the single fixed camera, the PTZ camera may have a uniform resolution and may be rotated horizontally and vertically to thereby acquire images in all directions. The surveillance camera 100 is connected to the image recording apparatus 300, which may be a DVR. If the surveillance camera 100 is a network camera, the surveillance camera 100 may be connected to the image recording apparatus 300 through a network; in this case, the image recording apparatus 300 may be an NVR.
  • In the present exemplary embodiment, although only one surveillance camera 100 is shown in the image monitoring system of FIG. 1, the image monitoring system may include a plurality of surveillance cameras 100, each of which analyzes an input image and/or a sound source to determine whether an abnormal image exists, an abnormal sound source exists, or an abnormal image and an abnormal sound source simultaneously exist, in order to generate metadata and alarm information. The sound source may be included in or obtained from an image captured by the surveillance camera 100, or may be directly input to the surveillance camera 100 through the microphone 200. The surveillance camera 100 will be described in detail with reference to FIG. 2.
  • The image recording apparatus 300 receives the image, the metadata, and the alarm information from the surveillance camera 100, stores the image, the metadata, and the alarm information, and performs display processing of the alarm information. The display processing of the alarm information refers to processing, such as flickering or highlighting, performed on the image or the metadata displayed on the display apparatus 400. As presented above, the image recording apparatus 300 may be a DVR or an NVR according to a type of the surveillance camera 100.
  • The display apparatus 400 displays the image, the metadata, and the processed alarm information output from the image recording apparatus 300. The display apparatus 400 may divide a screen according to the number of surveillance cameras 100 and display images respectively captured by each of the surveillance cameras 100 through the image recording apparatus 300.
  • FIG. 2 is a block diagram illustrating a detailed structure of the surveillance camera 100 of FIG. 1. Referring to FIG. 2, the surveillance camera 100 includes an image sensor 110, an image analyzer 120, a sound source analyzer 130, a determiner 140, a metadata generator 150, an alarm information generator 160, a driving controller 170, and a tracking controller 180.
  • The image sensor 110 converts an optical signal reflected from a subject and passing through a lens (not shown) of the surveillance camera 100 into an electric signal (an image signal) and outputs the electric signal. The image sensor 110 may be, for example, a complementary metal-oxide-semiconductor (CMOS) module or a charge-coupled device (CCD) module.
  • Image processing may be performed on an image output from the image sensor 110. The image processing may be performed by the image analyzer 120 or another module. In the present exemplary embodiment, for convenience of description, the image processing will be described as being performed by the image analyzer 120.
  • The image analyzer 120 reduces noise in the image signal output from the image sensor 110 and may perform image signal processing for improving image quality, such as gamma correction, color filter array interpolation, color matrix processing, color correction, and color enhancement. The image analyzer 120 may also functionally perform color processing, blur processing, edge enhancement processing, image analysis processing, image recognition processing, image effect processing, or the like on the image signal output from the image sensor 110. The image analyzer 120 may perform face recognition, scene recognition, or the like through the image recognition processing. For example, the image analyzer 120 may perform luminance level adjustment, color correction, contrast adjustment, contour emphasis adjustment, screen division processing, character image generation, image synthesis processing, or the like.
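  • As a rough illustration of the image-quality processing mentioned above, the sketch below applies a simple noise-reducing blur and a gamma correction to a grayscale frame with NumPy. It is a minimal example under assumed parameters (3x3 blur, gamma 2.2), not the processing chain actually used by the image analyzer 120.

```python
import numpy as np

def correct_frame(frame: np.ndarray, gamma: float = 2.2) -> np.ndarray:
    """Apply simple noise reduction and gamma correction to an 8-bit grayscale frame.

    Illustrative sketch only; color filter array interpolation, color matrix
    processing, and the other corrections named in the text are not shown.
    """
    # Light noise reduction: 3x3 box blur built from shifted copies of the frame.
    padded = np.pad(frame.astype(np.float32), 1, mode="edge")
    blurred = sum(
        padded[dy:dy + frame.shape[0], dx:dx + frame.shape[1]]
        for dy in range(3) for dx in range(3)
    ) / 9.0

    # Gamma correction: normalize to [0, 1], apply the power curve, rescale.
    normalized = blurred / 255.0
    corrected = np.power(normalized, 1.0 / gamma) * 255.0
    return corrected.clip(0, 255).astype(np.uint8)

if __name__ == "__main__":
    test_frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
    print(correct_frame(test_frame).shape)  # (480, 640)
```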
  • Also, the image analyzer 120 performs an image analysis on the image signal output from the image sensor 110, and if the analysis result satisfies a preset event generation condition, generates an event. According to an exemplary embodiment, the image analysis refers to detecting the disappearance or appearance of an object in a screen, tracking an object similar to a particular image input by a user, sensing a motion of an object, detecting a screen blackout, or the like, and if the image analysis result satisfies the preset event generation condition, an abnormal image is detected and an event is generated.
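  • The event-generating image analysis can be illustrated with a small sketch. The snippet below detects motion by frame differencing and a screen blackout by mean luminance; the thresholds and function names are illustrative assumptions rather than the patent's algorithm.

```python
import numpy as np
from typing import List

MOTION_THRESHOLD = 25      # per-pixel difference treated as motion (assumed)
MOTION_RATIO = 0.02        # fraction of changed pixels that raises an event (assumed)
BLACKOUT_LUMINANCE = 10    # mean luminance below which the screen is "black" (assumed)

def analyze_image(prev_frame: np.ndarray, frame: np.ndarray) -> List[str]:
    """Return a list of image events detected between two grayscale frames."""
    events = []

    # Motion: count pixels whose absolute difference exceeds the threshold.
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed = np.count_nonzero(diff > MOTION_THRESHOLD) / diff.size
    if changed > MOTION_RATIO:
        events.append("motion")

    # Screen blackout: the whole frame is close to black.
    if frame.mean() < BLACKOUT_LUMINANCE:
        events.append("blackout")

    return events
```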
  • The sound source analyzer 130 analyzes a sound source that is included in or obtained from the image signal output from the image sensor 110 or is directly input through the microphone 200, and if the analysis result satisfies an event generation condition, generates an event. In this case, the sound source analysis generates the event for a sound source having a level greater than or equal to a threshold value designated by the user or for a sound source specified by the user. In addition, several types of abnormal sound sources may be stored in a database (DB), the analyzed sound source may be compared with the sound sources stored in the DB, and if the analyzed sound source corresponds to one of the stored sound sources, an event may be generated.
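  • A minimal sketch of the two checks described here, a loudness threshold and a comparison against a database of stored abnormal sounds, might look as follows. The feature vectors, cosine-similarity comparison, and thresholds are simplifying assumptions; a real system would use proper acoustic features.

```python
import numpy as np
from typing import List

AMPLITUDE_THRESHOLD = 0.6   # normalized level above which an event is raised (assumed)
MATCH_THRESHOLD = 0.8       # similarity above which a DB template is matched (assumed)

# Hypothetical database of labeled abnormal-sound templates (feature vectors).
sound_db = {
    "glass_breaking": np.array([0.9, 0.1, 0.7, 0.2]),
    "tire_skid":      np.array([0.2, 0.8, 0.6, 0.9]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

def analyze_sound(samples: np.ndarray, features: np.ndarray) -> List[str]:
    """Return sound events: loud sounds and sounds matching stored abnormal types."""
    events = []
    if np.max(np.abs(samples)) >= AMPLITUDE_THRESHOLD:
        events.append("loud_sound")
    for label, template in sound_db.items():
        if cosine_similarity(features, template) >= MATCH_THRESHOLD:
            events.append(label)
    return events
```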
  • The event generation condition used in the image analysis corresponds to an appearance of an object, a generation of an image specified by the user (for example, an appearance of an unrecognizable face), a change in a screen color, a motion occurring in a set area, or the like, and may be preset. The event generation condition used in the sound source analysis corresponds to generation of an abnormal sound (a friction sound (skid) of a car tire, a glass breaking sound, an alarm sound, a collision sound, or the like), generation of a sound specified by the user (for example, a man's scream, a woman's scream, a baby's cry, or the like), generation of a sound with an amplitude higher than or equal to a threshold value, or the like, and may be preset.
  • The determiner 140 may determine whether an abnormal image exists, an abnormal sound source exists, or an abnormal image and an abnormal sound source simultaneously exist, according to an image analysis result and/or a sound source analysis result. In particular, if it is determined that an abnormal image and an abnormal sound source simultaneously exist, the determiner 140 determines whether the abnormal sound source is obtained from the abnormal image or from outside the abnormal image. Also, if the abnormal sound source is obtained from the abnormal image, the determiner 140 determines whether the abnormal image and the abnormal sound source indicate a same event. The determiner 140 performs these multiple determinations because the metadata to be generated varies according to the determination result of the determiner 140.
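  • The chain of determinations described for the determiner 140 can be summarized as a small decision routine. The sketch below is one interpretation of that flow; the two helper callables stand in for sound-source localization and the cross check, and their names are assumptions.

```python
def determine(abnormal_image, abnormal_sound,
              sound_is_inside_image, indicates_same_event) -> dict:
    """Reproduce the determiner's decision flow as described in the text.

    `sound_is_inside_image` and `indicates_same_event` are assumed callables
    standing in for sound-source localization and the cross check.
    """
    result = {
        "image_exists": abnormal_image is not None,
        "sound_exists": abnormal_sound is not None,
        "simultaneous": abnormal_image is not None and abnormal_sound is not None,
        "multi_detection": None,   # is the sound source obtained from inside the abnormal image?
        "same_event": None,        # cross-check result
    }
    if result["simultaneous"]:
        result["multi_detection"] = sound_is_inside_image(abnormal_image, abnormal_sound)
        if result["multi_detection"]:
            result["same_event"] = indicates_same_event(abnormal_image, abnormal_sound)
    return result

# Toy usage with stub checks: an off-screen sound source never reaches the cross check.
print(determine("image-event", "sound-event",
                sound_is_inside_image=lambda i, s: False,
                indicates_same_event=lambda i, s: True))
```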
  • The metadata generator 150 generates metadata about the abnormal image, metadata about the abnormal sound source, and metadata about the abnormal image and the abnormal sound source with reference to the determination result of the determiner 140. The metadata refers to data that provides information about original data (an image, a sound source, or the image and the sound source) and describes other data. In the present exemplary embodiment, the metadata generator 150 generates metadata including a number of times the abnormal image and/or the abnormal sound source are generated, analysis results and types of the abnormal image and/or the abnormal sound source, a number of generated events indicated by the abnormal image and/or the abnormal sound source, whether the abnormal sound source is obtained from the abnormal image, and whether the abnormal image and the abnormal sound source indicate a same event, with reference to the determination result of the determiner 140.
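  • The metadata items listed above map naturally onto a small record. The dataclass below is a hypothetical layout following the fields named in the text and in FIG. 3A; the field names, types, and example values are assumptions.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Dict, Optional

@dataclass
class EventMetadata:
    """Hypothetical metadata record for an abnormal image and/or abnormal sound source."""
    generation_time: datetime                 # when the abnormality was obtained
    abnormal_sound_type: Dict[str, float]     # candidate types -> accuracy, e.g. {"collision": 0.55}
    abnormal_image_type: Dict[str, float]     # candidate types -> accuracy, e.g. {"explosion": 0.95}
    multi_detection: bool = False             # was the sound obtained from inside the abnormal image?
    cross_check: Optional[str] = None         # common event type if both indicate the same event
    event_count: int = 0                      # number of events indicated by the image and the sound

# Illustrative values loosely following the same-event case of FIG. 3A/3B.
record = EventMetadata(
    generation_time=datetime.now(),
    abnormal_sound_type={"collision": 0.55, "explosion": 0.40},
    abnormal_image_type={"explosion": 0.95, "collision": 0.05},
    multi_detection=True,
    cross_check="explosion",
    event_count=1,   # reduced from two to one because both indicate the same event
)
print(record.cross_check, record.event_count)
```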
  • FIGS. 3A and 3B are views illustrating metadata that are generated by the metadata generator 150, according to an exemplary embodiment.
  • Referring to FIG. 3A, the metadata is classified into items of a generation time, an abnormal sound source type, an abnormal image type, a cross check, a number of event generations, and multi-detection. The multi-detection item indicates a determination result of whether an abnormal sound source is obtained from an abnormal image. Specifically, the multi-detection item indicates a result of recognizing the position at which the abnormal sound source is generated and determining whether the recognized position is within the abnormal image. The cross check item indicates a determination result of whether the abnormal image and the abnormal sound source indicate a same event, in other words, whether the abnormal image and the abnormal sound source are the same type of event.
  • FIG. 3B illustrates a process in which the determiner 140 of FIG. 2 performs a cross check with reference to an abnormal image and an abnormal sound source, according to an exemplary embodiment. Referring to FIG. 3B, the type of the abnormal sound source and the type of the abnormal image are not determined as a single type; each may include a plurality of candidates together with accuracies of the respective candidates. For example, the first metadata shown in FIG. 3B indicates that a probability of the abnormal sound source corresponding to a collision is 55%, a probability of the abnormal sound source corresponding to an explosion is 40%, and a probability of the abnormal sound source corresponding to a man's scream is 10%. Also, the first metadata shown in FIG. 3B indicates that a probability of the abnormal image corresponding to an explosion is 95% and a probability of the abnormal image corresponding to a collision is 5%. A cross check result obtained from these analysis results may indicate an explosion.
  • If the type of the abnormal sound source and the type of the abnormal image are each given as multiple candidates with accuracies, the cross check may be performed with reference to those candidates and their accuracies. For example, in the first metadata shown in FIG. 3B, the most probable type of the abnormal sound source is a collision while the most probable type of the abnormal image is an explosion, and thus the types of the abnormal sound source and the abnormal image do not correspond to each other. However, if a sum of the probability (40%) of the abnormal sound source being an explosion and the probability (95%) of the abnormal image being an explosion exceeds a preset threshold value (for example, 130%), the result of the cross check may be determined as an explosion. Likewise, with reference to the second metadata shown in FIG. 3B, if a sum of the probability (70%) of the abnormal sound source being a man's scream, the probability (30%) of the abnormal sound source being a woman's scream, and the probability (80%) of the abnormal image being a violent event exceeds a preset threshold value, the result of the cross check may be determined as a violent event. In this case, a man's scream and a woman's scream may be regarded as the same type of event, and their accuracies may be summed, or a maximum value or an average value of the accuracies may be used.
  • A detailed method of performing the cross check by using the accuracies of the candidates is not limited to the above-described methods, and other various methods may be used.
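  • One way to read the cross check described above is as a per-candidate sum of accuracies across the two modalities compared against a threshold. The sketch below reproduces the first example of FIG. 3B (sound: collision 55%, explosion 40%, man's scream 10%; image: explosion 95%, collision 5%) under the assumed 130% threshold mentioned above; the grouping table for related types is also an assumption.

```python
# Candidate accuracies (percent) for the first metadata example of FIG. 3B.
sound_candidates = {"collision": 55, "explosion": 40, "mans_scream": 10}
image_candidates = {"explosion": 95, "collision": 5}

# Optional grouping of candidate types regarded as the same event (assumed).
same_event_groups = {"mans_scream": "scream", "womans_scream": "scream"}

def cross_check(sound: dict, image: dict, threshold: float = 130.0):
    """Return the event type whose combined accuracy exceeds the threshold, if any."""
    combined = {}
    for candidates in (sound, image):
        for label, accuracy in candidates.items():
            key = same_event_groups.get(label, label)
            combined[key] = combined.get(key, 0.0) + accuracy
    best_label, best_score = max(combined.items(), key=lambda item: item[1])
    return best_label if best_score > threshold else None

print(cross_check(sound_candidates, image_candidates))  # "explosion" (40 + 95 = 135 > 130)
```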
  • If the determiner 140 determines that the abnormal sound source is obtained from the abnormal image and that the abnormal image and the abnormal sound source indicate the same event, the metadata generator 150 reduces the number of generated events recorded in the metadata. Referring to reference numeral 310 of FIG. 3A, since the abnormal sound source and the abnormal image indicate the same event, the number of generated events recorded in the metadata is reduced; that is, the number of events is changed from two to one. Referring to reference numeral 320 of FIG. 3A, since the abnormal sound source and the abnormal image indicate different events, the number of generated events is maintained.
  • However, if the determiner 140 determines that the abnormal image and the abnormal sound source are not obtained simultaneously, that the abnormal sound source is obtained from outside the abnormal image, or that the abnormal sound source and the abnormal image indicate different events, the metadata generator 150 generates metadata about each of the abnormal image and the abnormal sound source.
  • The alarm information generator 160 generates alarm information about the abnormal image and/or the abnormal sound source with reference to the determination result of the determiner 140 and the generated metadata. For example, the alarm information generator 160 may generate alarm information about whichever of the abnormal image and the abnormal sound source indicates the larger number of events, or alarm information about a particular abnormal image (e.g., a collision) or a particular abnormal sound source (e.g., a woman's scream).
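  • As a rough sketch, alarm generation might favor whichever abnormality indicates more events, plus any type on a user-configured watch list. The rule set below is an assumption for illustration, not the patent's alarm policy.

```python
from typing import List

WATCHED_TYPES = {"collision", "womans_scream"}   # types that always raise an alarm (assumed)

def generate_alarm(image_events: int, sound_events: int,
                   image_type: str, sound_type: str) -> List[str]:
    """Return alarm messages for the abnormal image and/or abnormal sound source."""
    alarms = []
    if image_events >= sound_events or image_type in WATCHED_TYPES:
        alarms.append(f"ALARM(image): {image_type} x{image_events}")
    if sound_events >= image_events or sound_type in WATCHED_TYPES:
        alarms.append(f"ALARM(sound): {sound_type} x{sound_events}")
    return alarms

print(generate_alarm(1, 2, image_type="explosion", sound_type="womans_scream"))
```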
  • The driving controller 170 controls driving of the surveillance camera 100. For example, the driving controller 170 controls panning, tilting, and zooming operations, etc. of the surveillance camera 100. If the surveillance camera is a fixed camera, the driving controller 170 may be omitted.
  • The tracking controller 180 outputs a tracking control signal to the driving controller 170 with reference to the determination result of the determiner 140. The surveillance camera 100 is driven according to the tracking control signal of the tracking controller 180 to track a target. For example, if the surveillance camera 100 is a zoom camera, and the abnormal sound source exists outside the abnormal image according to a multi detection determination result of the determiner 140, the tracking controller 180 may output a control signal to the driving controller 170 to enable the surveillance camera 100 to zoom out and then capture a wider area.
  • According to another exemplary embodiment, if the surveillance camera 100 is a PTZ camera, and the abnormal sound source exists outside the abnormal image according to the multi detection determination result of the determiner 140, the tracking controller 180 may output a control signal to the driving controller 170 to track a position where the abnormal sound source is generated. If a plurality of abnormal sound sources are simultaneously generated, tracking may be performed according to priorities with reference to metadata of the respective abnormal sound sources. For example, if a plurality of events simultaneously occur, tracking may be performed according to types and priorities of the events.
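  • A minimal sketch of the tracking decision described in the two preceding paragraphs is shown below; the camera-type strings, priority table, and control-signal tuples are hypothetical names introduced only for illustration and are not part of the disclosed embodiments.
```python
# Hypothetical event priorities; higher values are tracked first.
EVENT_PRIORITY = {"explosion": 3, "violence": 2, "collision": 1}

def tracking_command(camera_type, sound_outside_image, sound_events):
    """Decide how to drive the camera when abnormal sound sources are
    detected outside the area covered by the abnormal image."""
    if not sound_outside_image:
        return None                       # nothing to track beyond the image
    if camera_type == "zoom":
        return ("zoom_out", None)         # capture a wider area
    if camera_type == "ptz":
        # Track the highest-priority abnormal sound source first.
        target = max(sound_events,
                     key=lambda e: EVENT_PRIORITY.get(e["type"], 0))
        return ("pan_tilt_to", target["direction_deg"])
    return None                           # fixed camera: no tracking control

print(tracking_command("ptz", True, [
    {"type": "collision", "direction_deg": 40},
    {"type": "explosion", "direction_deg": 250},
]))  # -> ('pan_tilt_to', 250)
```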
  • As a tracking result, an image and data may be displayed on the display apparatus 400 or may be stored in an additional storage device. If the surveillance camera 100 is a fixed camera, tracking and driving controls may not be performed.
  • The image, the metadata, and the alarm information generated by the surveillance camera 100 are displayed on the display apparatus 400 through the image recording apparatus 300. In this case, the image recording apparatus 300 performs processing for the display of the alarm information.
  • As described above, metadata about an abnormal situation may be generated, stored, and displayed through an image and/or sound source analysis to increase monitoring efficiency. An abnormal situation occurring in a blind area of the surveillance camera 100 may be checked, and the generated metadata makes a subsequent search for the abnormal situation easier. Also, when the metadata is generated, whether the abnormal sound source is obtained from the abnormal image and whether the abnormal image and the abnormal sound source indicate the same event may be determined, thereby reducing repeated or false alarms.
  • FIG. 4 is a flowchart of a method of operating an image monitoring system, according to an exemplary embodiment. In the following description, descriptions of parts overlapping the descriptions of FIGS. 1 through 3 are omitted.
  • Referring to FIG. 4, in operation S410, the surveillance camera 100 determines whether an abnormal image exists, an abnormal sound source exists, or an abnormal image and an abnormal sound source simultaneously exist, according to an analysis of an input image and/or a sound source input through the microphone 200.
  • If it is determined in operation S420 that the abnormal image and the abnormal sound source do not simultaneously exist, the surveillance camera 100 generates metadata of each of the abnormal image and the abnormal sound source in operation S430.
  • However, if it is determined in operation S420 that the abnormal image and the abnormal sound source simultaneously exist, the surveillance camera 100 determines whether the abnormal sound source exists in the abnormal image in operation S440.
  • If it is determined in operation S440 that the abnormal sound source exists outside the abnormal image, the surveillance camera 100 generates metadata about each of the abnormal image and the abnormal sound source in operation S430.
  • However, if it is determined in operation S440 that the abnormal sound source exists in the abnormal image, the surveillance camera 100 determines whether the abnormal image and the abnormal sound source are generated in a same event in operation S450.
  • If it is determined in operation S450 that the abnormal image and the abnormal sound source are generated in different events, the surveillance camera 100 generates metadata about each of the abnormal image and the abnormal sound source in operation S430.
  • If it is determined in operation S450 that the abnormal image and the abnormal sound source are generated in the same event, the surveillance camera 100 reduces the number of events recorded in the metadata (e.g., from two to one) when generating the metadata in operation S460.
  • When the metadata is completely generated, the surveillance camera 100 generates alarm information with reference to the determination result and the metadata in operation S470.
  • In operation S480, the image recording apparatus 300 receives the image, the metadata, and the alarm information from the surveillance camera 100, stores the image, the metadata, and the alarm information, performs processing for a display of the alarm information, and displays the alarm information on the display apparatus 400.
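  • The decision flow of FIG. 4 can be summarized as follows. The function, argument, and field names are assumptions introduced for illustration, and the alarm rule at the end is simplified; operation S480 (storage and display by the image recording apparatus 300 and the display apparatus 400) is outside this sketch.
```python
def process_detections(abnormal_image, abnormal_sound, sound_in_image, same_event):
    """One pass through operations S420-S470 of FIG. 4, given the S410 results."""
    def separate():
        # Operation S430: metadata about each detection individually.
        items = []
        if abnormal_image:
            items.append({"image": abnormal_image, "sound": None, "events": 1})
        if abnormal_sound:
            items.append({"image": None, "sound": abnormal_sound, "events": 1})
        return items

    if abnormal_image and abnormal_sound:       # S420: simultaneous?
        if sound_in_image:                      # S440: sound source in the image?
            if same_event:                      # S450: same event?
                # S460: two detections of one event are counted once.
                metadata = [{"image": abnormal_image,
                             "sound": abnormal_sound, "events": 1}]
            else:
                metadata = separate()           # different events
        else:
            metadata = separate()               # sound source outside the image
    else:
        metadata = separate()                   # not simultaneous

    alarm = {"total_events": sum(m["events"] for m in metadata)}   # S470 (simplified)
    return metadata, alarm

print(process_detections("explosion", "explosion sound", True, True))
```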
  • As described above, according to one or more of the above embodiments, metadata about an abnormal situation may be generated, stored, and displayed through an analysis of an image and/or sound source to increase monitoring efficiency. Also, an abnormal situation occurring in a blind area of a camera may be checked, and a subsequent search for the abnormal situation may become easier owing to the generation of the metadata. In addition, whether an abnormal sound source exists in an abnormal image and whether the abnormal image and the abnormal sound source are generated in the same event may be determined to reduce repeated or false alarms.
  • It should be understood that the exemplary embodiments described herein should be considered in a descriptive sense only and not for purposes of limitation. Descriptions of features or aspects within each embodiment should typically be considered as available for other similar features or aspects in other embodiments.
  • While one or more embodiments have been described with reference to the figures, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope as defined by the following claims.

Claims (20)

What is claimed is:
1. A surveillance camera comprising:
an image analyzer configured to analyze an input image;
a sound source analyzer configured to analyze a sound source in the image or a sound source that is input separately; and
a determiner configured to determine whether an abnormal image and an abnormal sound source exist, and when the abnormal image and the abnormal sound source are obtained, according to a result of the image analysis and the sound source analysis, to generate metadata and alarm information based on a result of the determination,
wherein the abnormal image and the abnormal sound source are predefined.
2. The surveillance camera of claim 1, wherein if it is determined by the determiner that the abnormal image and the abnormal sound source are simultaneously obtained, the determiner further determines whether the abnormal sound source is input from the abnormal image or an outside of the abnormal image.
3. The surveillance camera of claim 2, further comprising a metadata generator,
wherein if it is determined by the determiner that the abnormal image and the abnormal sound source are simultaneously obtained, and the abnormal sound source is obtained from the outside of the abnormal image, the metadata generator generates metadata about each of the abnormal image and the abnormal sound source.
4. The surveillance camera of claim 2, wherein if it is determined by the determiner that the abnormal image and the abnormal sound source are simultaneously obtained, and the abnormal sound source is obtained from the abnormal image, the determiner further determines whether the abnormal image and the abnormal sound source indicate a same event.
5. The surveillance camera of claim 1, further comprising:
a metadata generator configured to generate the metadata comprising times when the abnormal image and the abnormal sound source are obtained, respectively, types of the abnormal image and the abnormal sound source, a number of events indicated by the abnormal image and the abnormal sound source, whether the abnormal sound source is obtained from the abnormal image or an outside of the abnormal image, and whether the abnormal image and the abnormal sound source indicate a same event, with reference to the determination result of the determiner; and
an alarm information generator configured to generate the alarm information about the abnormal image and the abnormal sound source with reference to the determination result of the determiner and the metadata.
6. The surveillance camera of claim 5, wherein the types of the abnormal sound source and the abnormal image each comprises a plurality of candidates and accuracy information about the candidates, and
wherein the determiner determines whether the abnormal image and the abnormal sound source indicate the same event with reference to the plurality of candidates corresponding to the types of the abnormal image and the abnormal sound source and the accuracy information.
7. The surveillance camera of claim 6, wherein if it is determined by the determiner that a sum of a probability of a type of the abnormal image corresponding to a preset event and a probability of a type of the abnormal sound source corresponding to the preset event exceeds a preset threshold value, the determiner further determines that the abnormal image and the abnormal sound source indicate the preset event.
8. The surveillance camera of claim 5, wherein if it is determined by the determiner that the abnormal image and the abnormal sound source are simultaneously obtained, the determiner further determines whether the abnormal sound source is obtained from the abnormal image or an outside of the abnormal image.
9. The surveillance camera of claim 8, wherein if it is determined by the determiner that the abnormal image and the abnormal sound source are simultaneously obtained, and the abnormal sound source is obtained from the outside of the abnormal image, the metadata generator generates the metadata about each of the abnormal image and the abnormal sound source.
10. The surveillance camera of claim 8, wherein if it is determined by the determiner that the abnormal image and the abnormal sound source are simultaneously obtained, and the abnormal sound source is obtained from the abnormal image, the determiner further determines whether the abnormal image and the abnormal sound source indicate the same event, and if it is determined by the determiner that the abnormal image and the abnormal sound source indicate the same event, the metadata generator reduces the number of events indicated by the abnormal image and the abnormal sound source from the metadata.
11. The surveillance camera of claim 10, wherein if it is determined by the determiner that the abnormal image and the abnormal sound source are simultaneously obtained, the abnormal sound source is obtained from the abnormal image, and the abnormal sound source and the abnormal image indicate different events, the metadata generator generates the metadata about each of the abnormal image and the abnormal sound source.
12. A monitoring system comprising:
the surveillance camera of claim 1;
an image recording apparatus configured to receive the image, the metadata and the alarm information from the camera, store the image, the metadata and the alarm information, and perform processing for displaying the alarm information; and
a display apparatus configured to display the image, the metadata, and the alarm information.
13. The monitoring system of claim 12, wherein if it is determined by the determiner that the abnormal image and the abnormal sound source are simultaneously obtained, the determiner further determines whether the abnormal sound source is input from the abnormal image or an outside of the abnormal image.
14. The monitoring system of claim 13, wherein the camera further comprises a metadata generator, and
wherein if it is determined by the determiner that the abnormal image and the abnormal sound source are simultaneously obtained, and the abnormal sound source is obtained from the outside of the abnormal image, the metadata generator generates metadata about each of the abnormal image and the abnormal sound source.
15. The monitoring system of claim 13, wherein if it is determined by the determiner that the abnormal image and the abnormal sound source are simultaneously obtained, and the abnormal sound source is obtained from the abnormal image, the determiner further determines whether the abnormal image and the abnormal sound source indicate a same event.
16. A method of controlling a surveillance camera, the method comprising:
analyzing an input image;
analyzing a sound source in the image or a sound source that is input separately;
determining whether an abnormal image and an abnormal sound source exist, and when the abnormal image and the abnormal sound source are obtained, according to a result of the image analysis and the sound source analysis;
generating metadata and alarm information based on a result of the determining, wherein the abnormal image and the abnormal sound source are predefined.
17. The method of claim 16, wherein the determining comprises determining whether the abnormal sound source is obtained from the abnormal image or an outside of the abnormal image.
18. The method of claim 17, wherein the determining comprises determining whether the abnormal image and the abnormal sound source indicate a same event.
19. The method of claim 16, wherein the metadata comprises times when the abnormal image and the abnormal sound source are obtained, respectively, types of the abnormal image and the abnormal sound source, a number of events indicated by the abnormal image and the abnormal sound source, whether the abnormal sound source is obtained from the abnormal image or an outside of the abnormal image, and whether the abnormal image and the abnormal sound source indicate a same event, with reference to a result of the determining.
20. The method of claim 19, wherein the types of the abnormal sound source and the abnormal image each comprises a plurality of candidates and accuracy information about the candidates, and
wherein the determining comprises determining whether the abnormal image and the abnormal sound source indicate the same event with reference to the plurality of candidates corresponding to the types of the abnormal image and the abnormal sound source and the accuracy information.
US14/480,750 2013-09-27 2014-09-09 Image monitoring system and surveillance camera Active 2035-12-06 US10204275B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2013-0115708 2013-09-27
KR1020130115708A KR102066939B1 (en) 2013-09-27 2013-09-27 Image monitoring system

Publications (2)

Publication Number Publication Date
US20150092052A1 true US20150092052A1 (en) 2015-04-02
US10204275B2 US10204275B2 (en) 2019-02-12

Family

ID=52739776

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/480,750 Active 2035-12-06 US10204275B2 (en) 2013-09-27 2014-09-09 Image monitoring system and surveillance camera

Country Status (3)

Country Link
US (1) US10204275B2 (en)
KR (1) KR102066939B1 (en)
CN (1) CN104519318B (en)


Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107347145A (en) * 2016-05-06 2017-11-14 杭州萤石网络有限公司 A kind of video frequency monitoring method and pan-tilt network camera
CN107040742B (en) * 2017-03-10 2019-10-18 浙江宇视科技有限公司 A kind of method for detecting abnormality and network hard disk video recorder NVR and video server
CN109698895A (en) * 2017-10-20 2019-04-30 杭州海康威视数字技术股份有限公司 A kind of analog video camera, monitoring system and data transmission method for uplink
WO2019076076A1 (en) * 2017-10-20 2019-04-25 杭州海康威视数字技术股份有限公司 Analog camera, server, monitoring system and data transmission and processing methods
CN110062198B (en) * 2018-01-18 2022-04-05 杭州海康威视数字技术股份有限公司 Monitoring evidence obtaining method, device and system, electronic equipment and storage medium
CN109300471B (en) * 2018-10-23 2021-09-14 中冶东方工程技术有限公司 Intelligent video monitoring method, device and system for field area integrating sound collection and identification
GB2628675A (en) * 2023-05-03 2024-10-02 Intelligent Instr Ltd Noise camera, server for processing documentary evidence from noise-cameras and methods


Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004070564A (en) 2002-08-05 2004-03-04 Nippon Denka:Kk Security system
KR100473758B1 (en) 2002-08-14 2005-03-10 엘지전자 주식회사 Method for detecting audio event in digital video recorder for monitoring
KR100876494B1 (en) * 2007-04-18 2008-12-31 한국정보통신대학교 산학협력단 Integrated file format structure composed of multi video and metadata, and multi video management system based on the same
KR101008060B1 (en) 2008-11-05 2011-01-13 한국과학기술연구원 Apparatus and Method for Estimating Sound Arrival Direction In Real-Time
JP5189536B2 (en) 2009-03-26 2013-04-24 池上通信機株式会社 Monitoring device
KR100999655B1 (en) * 2009-05-18 2010-12-13 윤재민 Digital video recorder system and application method thereof
KR20110095113A (en) 2010-02-16 2011-08-24 윤재민 Digital video recorder system displaying sound fields and application method thereof

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6028626A (en) * 1995-01-03 2000-02-22 Arc Incorporated Abnormality detection and surveillance system
US20050280704A1 (en) * 2003-10-16 2005-12-22 Canon Europa Nv Method of video monitoring, corresponding device, system and computer programs
US20120300022A1 (en) * 2011-05-27 2012-11-29 Canon Kabushiki Kaisha Sound detection apparatus and control method thereof

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104700536A (en) * 2015-04-07 2015-06-10 成都爱可信科技有限公司 Outdoor important facility safety protection device
US10638098B2 (en) * 2015-07-07 2020-04-28 Hanwha Defense Co., Ltd. Surveillance method
US10986314B2 (en) 2015-11-27 2021-04-20 Hanwha Techwin Co., Ltd. On screen display (OSD) information generation camera, OSD information synthesis terminal, and OSD information sharing system including the same
CN108141568A (en) * 2015-11-27 2018-06-08 韩华泰科株式会社 Osd information generation video camera, osd information synthesis terminal device 20 and the osd information shared system being made of it
EP3383030A4 (en) * 2015-11-27 2019-07-24 Hanwha Techwin Co., Ltd. Osd information generation camera, osd information synthesis terminal (20), and osd information sharing system comprising same
US20180373738A1 (en) * 2015-12-11 2018-12-27 Canon Kabushiki Kaisha Information presentation method and apparatus
US11182423B2 (en) * 2015-12-11 2021-11-23 Canon Kabushiki Kaisha Information presentation method and apparatus
US10665071B2 (en) 2016-03-22 2020-05-26 Sensormatic Electronics, LLC System and method for deadzone detection in surveillance camera network
US10475315B2 (en) 2016-03-22 2019-11-12 Sensormatic Electronics, LLC System and method for configuring surveillance cameras using mobile computing devices
US10764539B2 (en) 2016-03-22 2020-09-01 Sensormatic Electronics, LLC System and method for using mobile device of zone and correlated motion detection
US10318836B2 (en) 2016-03-22 2019-06-11 Sensormatic Electronics, LLC System and method for designating surveillance camera regions of interest
US10347102B2 (en) 2016-03-22 2019-07-09 Sensormatic Electronics, LLC Method and system for surveillance camera arbitration of uplink consumption
US20180218209A1 (en) * 2016-03-22 2018-08-02 Sensormatic Electronics, LLC Method and system for conveying data from monitored scene via surveillance cameras
US11216847B2 (en) 2016-03-22 2022-01-04 Sensormatic Electronics, LLC System and method for retail customer tracking in surveillance camera network
US10192414B2 (en) 2016-03-22 2019-01-29 Sensormatic Electronics, LLC System and method for overlap detection in surveillance camera network
US20170277947A1 (en) * 2016-03-22 2017-09-28 Sensormatic Electronics, LLC Method and system for conveying data from monitored scene via surveillance cameras
US10733231B2 (en) 2016-03-22 2020-08-04 Sensormatic Electronics, LLC Method and system for modeling image of interest to users
US11601583B2 (en) 2016-03-22 2023-03-07 Johnson Controls Tyco IP Holdings LLP System and method for controlling surveillance cameras
US10977487B2 (en) * 2016-03-22 2021-04-13 Sensormatic Electronics, LLC Method and system for conveying data from monitored scene via surveillance cameras
US9965680B2 (en) * 2016-03-22 2018-05-08 Sensormatic Electronics, LLC Method and system for conveying data from monitored scene via surveillance cameras
US10913533B2 (en) * 2016-12-12 2021-02-09 Optim Corporation Remote control system, remote control method and program
US11495103B2 (en) * 2017-01-23 2022-11-08 Hanwha Techwin Co., Ltd. Monitoring apparatus and system
US20190304276A1 (en) * 2017-01-23 2019-10-03 Hanwha Techwin Co., Ltd Monitoring apparatus and system
US11200427B2 (en) * 2017-03-31 2021-12-14 Yokogawa Electric Corporation Methods and systems for image based anomaly detection
WO2018181837A1 (en) * 2017-03-31 2018-10-04 Yokogawa Electric Corporation Methods and systems for image based anomaly detection
US11388333B2 (en) 2017-11-30 2022-07-12 SZ DJI Technology Co., Ltd. Audio guided image capture method and device
CN108702458A (en) * 2017-11-30 2018-10-23 深圳市大疆创新科技有限公司 Image pickup method and device
CN109559477A (en) * 2018-12-24 2019-04-02 绿瘦健康产业集团有限公司 Unmanned gymnasium indoor intelligent monitoring and managing method, device, terminal device and medium
WO2023238721A1 (en) * 2022-06-08 2023-12-14 富士フイルム株式会社 Information creation method and information creation device

Also Published As

Publication number Publication date
KR102066939B1 (en) 2020-01-16
CN104519318A (en) 2015-04-15
KR20150035322A (en) 2015-04-06
US10204275B2 (en) 2019-02-12
CN104519318B (en) 2019-03-29

Similar Documents

Publication Publication Date Title
US10204275B2 (en) Image monitoring system and surveillance camera
US10070053B2 (en) Method and camera for determining an image adjustment parameter
US11308777B2 (en) Image capturing apparatus with variable event detecting condition
JP2023526207A (en) Maintaining a constant size of the target object in the frame
JP4966012B2 (en) System and method for searching for changes in surveillance video
CN112805996B (en) Device and method for generating slow motion video clip
US10334150B2 (en) Camera system and method of tracking object using the same
US8437504B2 (en) Imaging system and imaging method
US7609290B2 (en) Surveillance system and method
JP2012185684A (en) Object detection device and object detection method
KR101212082B1 (en) Image Recognition Apparatus and Vison Monitoring Method thereof
JP2006107457A (en) Image processing apparatus and image processing method
US10937124B2 (en) Information processing device, system, information processing method, and storage medium
US20130236056A1 (en) Event detection system and method using image analysis
JP4889668B2 (en) Object detection device
JP2008035096A (en) Monitoring apparatus, monitoring method and program
KR101706221B1 (en) Security camera and Method of controlling thereof
US20160353052A1 (en) Adjusting length of living images
KR101272631B1 (en) Apparatus for detecting a moving object and detecting method thereof
JP2015097089A (en) Object detection device and object detection method
KR20090119668A (en) System for processing imaging for detecting the invasion and fire of building and method thereof
KR20160061706A (en) Camera apparatus and camera control system
JP2003319384A (en) Digital video image supervisory system having double function of automatically adjusting lens aperture and for detecting movement or the like of object

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG TECHWIN CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SHIN, DONG HAK;JEON, KI YONG;REEL/FRAME:033697/0298

Effective date: 20140827

AS Assignment

Owner name: HANWHA TECHWIN CO., LTD., KOREA, REPUBLIC OF

Free format text: CHANGE OF NAME;ASSIGNOR:SAMSUNG TECHWIN CO., LTD.;REEL/FRAME:036233/0327

Effective date: 20150701

AS Assignment

Owner name: HANWHA AEROSPACE CO., LTD., KOREA, REPUBLIC OF

Free format text: CHANGE OF NAME;ASSIGNOR:HANWHA TECHWIN CO., LTD;REEL/FRAME:046927/0019

Effective date: 20180401

AS Assignment

Owner name: HANWHA AEROSPACE CO., LTD., KOREA, REPUBLIC OF

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE APPLICATION NUMBER 10/853,669. IN ADDITION PLEASE SEE EXHIBIT A PREVIOUSLY RECORDED ON REEL 046927 FRAME 0019. ASSIGNOR(S) HEREBY CONFIRMS THE CHANGE OF NAME;ASSIGNOR:HANWHA TECHWIN CO., LTD.;REEL/FRAME:048496/0596

Effective date: 20180401

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: HANWHA TECHWIN CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HANWHA AEROSPACE CO., LTD.;REEL/FRAME:049013/0723

Effective date: 20190417

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

AS Assignment

Owner name: HANWHA VISION CO., LTD., KOREA, REPUBLIC OF

Free format text: CHANGE OF NAME;ASSIGNOR:HANWHA TECHWIN CO., LTD.;REEL/FRAME:064549/0075

Effective date: 20230228