US20240040215A1 - Image processing apparatus, image capturing apparatus, image processing method, and storage medium


Info

Publication number
US20240040215A1
Authority
US
United States
Prior art keywords
image
subject
temperature
temperature distribution
obstacle
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US18/356,326
Inventor
Kosuke Nobuoka
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Application filed by Canon Inc
Publication of US20240040215A1


Classifications

    • H04N 23/11: Cameras or camera modules comprising electronic image sensors; control thereof for generating image signals from visible and infrared light wavelengths
    • H04N 23/611: Control of cameras or camera modules based on recognised objects, where the recognised objects include parts of the human body
    • H04N 23/672: Focus control based on electronic image sensor signals, based on the phase difference signals
    • G01J 5/026: Control of working procedures of a pyrometer, other than calibration; bandwidth calculation; gain control
    • G01J 5/08: Radiation pyrometry; constructional details; optical arrangements
    • G01J 5/0846: Optical arrangements having multiple detectors for performing different types of detection, e.g. using radiometry and reflectometry channels
    • G01J 5/0025: Radiation pyrometry for sensing the radiation of moving bodies; living bodies
    • G01J 5/0066: Radiation pyrometry for hot spots detection
    • G01J 2005/0077: Radiation pyrometry; imaging
    • G06T 5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 5/94: Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06T 2207/10024: Image acquisition modality: color image
    • G06T 2207/10048: Image acquisition modality: infrared image
    • G06T 2207/20081: Special algorithmic details: training; learning
    • G06T 2207/20084: Special algorithmic details: artificial neural networks [ANN]

Definitions

  • The animal detection NN 301 of FIG. 3 performs computation according to a pre-configured neural network structure using the animal detection DB 302, thereby generating animal position information D301 indicating the position of a specific animal (a subject to be surveilled) included in the visible light image D105.
  • The animal detection DB 302 has been generated in advance using training images of the specific animal and a predetermined training algorithm.
  • The obstacle evaluation NN 303 analyzes whether an obstacle, such as grass, tree branches and leaves, or a cloud of dust, exists in front of the specific animal in the visible light image D105.
  • The obstacle evaluation NN 303 generates obstacle attribute information D302 indicating the type of the obstacle, as well as obstacle intensity information D303 indicating the density degree of the obstacle, as a result of such analysis (a result of detection of the obstacle).
  • The obstacle evaluation DB 304 has been generated in advance using a plurality of types of training object images that can be an obstacle to animal detection, such as grass, tree branches and leaves, and a cloud of dust, as well as a predetermined training algorithm.
  • The data output unit 305 groups the obstacle attribute information D302 and the obstacle intensity information D303 in accordance with a predetermined format, and outputs them as the image analysis data D106.
  • The data separation unit 401 of FIG. 4 separates the obstacle attribute information D302 and the obstacle intensity information D303 from the image analysis data D106.
  • The temperature setting NN 402 generates the temperature parameters D107 by using the temperature setting DB 403 to estimate how the expected body surface temperature of the specific animal will change under the obstacle condition indicated by the obstacle attribute information D302 and the obstacle intensity information D303.
  • The temperature parameters D107 include, for example, Center (° C.), W (° C.), and the expected body surface temperature of the specific animal.
  • Center (° C.) and W (° C.) are examples of parameters used by the infrared light development unit 111 and the subject region identification unit 113 among the parameters included in the temperature parameters D107.
  • Center (° C.) denotes the expected temperature of an object to be surveilled (the specific animal), and W (° C.) denotes the range of temperatures to be surveilled, centered at the expected temperature.
  • For example, Center is set to 36° C. in a case where the body surface temperature of the specific animal is expected to be 36° C. and there is no influence of the obstacle.
  • The details of the processing of the infrared light development unit 111 that uses Center (° C.) and W (° C.) will be described later with reference to FIG. 5.
  • The expected body surface temperature of the specific animal is an example of a parameter used by the image output unit 114 among the parameters included in the temperature parameters D107.
  • The details of the processing of the image output unit 114 that uses the expected body surface temperature of the specific animal will be described later.
  • The temperature setting DB 403 has been generated in advance using a predetermined training algorithm together with the attribute and the density degree of an obstacle such as grass, tree branches and leaves, or a cloud of dust, and the body surface temperature of the specific animal in a case where no such obstacle exists.
  • Any known structures, training methods, and operations can be used as the structures, training methods, and operations of the above-described animal detection NN 301, obstacle evaluation NN 303, and temperature setting NN 402.
  • The animal detection NN 301 generates and outputs the animal position information D301. The animal detection NN 301 may also output information (animal type information) indicating the type of the animal (the type of the subject that has been detected), in addition to the animal position information D301.
  • In that case, the obstacle evaluation NN 303 can transfer the animal type information to the data output unit 305, and the data output unit 305 can include the animal type information in the image analysis data D106.
  • The data separation unit 401 then separates the animal type information from the image analysis data D106, and transfers the animal type information to the temperature setting NN 402.
  • The temperature setting NN 402 can estimate the temperature of a surveillance target based on not only the obstacle attribute information D302 and the obstacle intensity information D303, but also the animal type information.
  • In this way, appropriate temperature estimation can be performed in accordance with the type of animal actually detected, even in a case where the type of animal to be detected and its unobstructed temperature are unknown beforehand. This makes it possible to realize an image capturing apparatus that can surveil a plurality of types of animals (subjects). A simplified sketch of this estimation step is shown below.
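  • As a purely illustrative sketch of this estimation step (not part of the patent text), a table-driven stand-in for the temperature setting NN 402 might look as follows in Python; the base temperatures, attenuation values, and all names are invented for illustration:

        # Hypothetical stand-in for the temperature setting NN 402 (illustration only).
        BASE_TEMP_C = {"lion": 36.0, "deer": 38.5}                     # invented reference temperatures
        ATTENUATION_C = {"grass": 8.0, "branches": 5.0, "dust": 3.0}   # invented temperature drop at density 1.0

        def estimate_temperature_parameters(animal_type, obstacle_attr, obstacle_density):
            """Return (Center, W) in deg C from the obstacle detection result.

            obstacle_density is the density degree in [0, 1]; 0 means no obstacle.
            """
            base = BASE_TEMP_C[animal_type]
            center = base - ATTENUATION_C.get(obstacle_attr, 0.0) * obstacle_density
            w = 2.0 + 4.0 * obstacle_density       # widen the surveilled band as heat mixing increases
            return center, w

        print(estimate_temperature_parameters("lion", "grass", 0.5))   # (32.0, 4.0)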
  • At t201, the readout of the visible light image D105 from the visible light image memory 107 is started.
  • t201 is a timing at which a predetermined period has elapsed since t200, which is the timing at which storing of the visible light image D103 is started.
  • The time difference between t200 and t201 can be set at the smallest value for which write and read accesses to the same address in the physical memory (e.g., a DRAM) constituting the visible light image memory 107 are still made in the desired order.
  • t202 is also a timing at which the readout of an infrared light RAW image D108, which will be described later, is started.
  • The following describes processing executed by the image capturing apparatus, from development of the infrared light RAW image D104 to generation of the subject region information D111.
  • The infrared light RAW image D104 stored in the infrared light RAW image memory 108 is read out as an infrared light RAW image D108, and input to the infrared light development unit 111.
  • The infrared light development unit 111 generates an infrared light image D109 by executing development processing with respect to the infrared light RAW image D108 based on the temperature parameters D107 input from the temperature parameter setting unit 110.
  • The infrared light image D109 is input to the subject region identification unit 113.
  • The subject region identification unit 113 identifies a region of a subject in the infrared light image D109 based on the estimated temperature (Center) of the subject (a specific animal to be surveilled), and generates subject region information D111 indicating the region of the subject.
  • The subject region identification unit 113 can identify the region of the subject by executing grouping processing with respect to pixels in a predetermined temperature range including the temperatures corresponding to the subject.
  • The grouping processing for pixels corresponding to the specific temperature can be executed using any known technique; one possible technique is sketched below.
  • The infrared light image D109 is different from the infrared light RAW image D108 in that it is a developed image with enhanced contrast in the temperature range of Center ± W (° C.).
  • Both the infrared light image D109 and the infrared light RAW image D108 include information of the temperature range of Center ± W (° C.). Therefore, the subject region identification unit 113 may identify the region of the subject in the infrared light RAW image D108 instead of the infrared light image D109.
  • In either case, the region of the subject is identified in information indicating the temperature distribution detected by the microbolometer 103 (temperature distribution information) as a result of the processing of the subject region identification unit 113.
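  • A minimal sketch of such grouping, assuming a connected-component labeling approach (one known technique among many; the patent does not prescribe it) operating directly on a temperature map in ° C.:

        import numpy as np
        from scipy import ndimage

        def identify_subject_region(temp_map_c, center_c, w_c):
            """Group pixels within Center +/- W and keep the largest connected blob."""
            mask = np.abs(temp_map_c - center_c) <= w_c
            labels, n = ndimage.label(mask)        # 4-connectivity by default
            if n == 0:
                return mask                        # all False: no subject region found
            sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
            return labels == (int(np.argmax(sizes)) + 1)

        temp = np.full((100, 100), 20.0)           # 20 C background
        temp[30:50, 40:70] = 30.2                  # subject seen through grass
        region = identify_subject_region(temp, center_c=30.5, w_c=2.0)
        print(int(region.sum()))                   # 600 pixels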
  • In FIG. 5, 501 denotes an example of a bit range of the infrared light RAW image D108.
  • The infrared light RAW image D108 is a 14-bit RAW image, with each pixel having a value in the range from 0 to 16383; 0 indicates −20° C., and 16383 indicates +300° C.
  • In this example, Center is 30.5° C. Center is also used by the image output unit 114.
  • The infrared light image D109 is a 12-bit luminance image, with each pixel having a value in the range from 0 to 4095.
  • The infrared light development unit 111 generates the infrared light image D109 by tone-mapping Center (° C.) to 2047, and Center ± W (° C.) to Th_btm and Th_top.
  • That is, the temperature range of Center ± W (° C.) in the infrared light RAW image D108 is mapped to the luminance values of 2047 ± 1024 in the infrared light image D109.
  • As a result, the infrared light image D109 becomes an image in which the contrast in a temperature range to be surveilled (a predetermined temperature range including an estimated temperature of a subject) is more enhanced than the contrast outside the temperature range to be surveilled.
  • The tone mapping processing itself can be executed using any known technique; one possibility is sketched below.
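  • To make the numbers concrete, here is a hedged NumPy sketch of one piecewise-linear tone mapping consistent with the values above (14-bit RAW spanning −20° C. to +300° C., Center mapped to 2047, Center ± W mapped to Th_btm = 1023 and Th_top = 3071); the handling of temperatures outside Center ± W is an assumption, since the patent leaves the technique open:

        import numpy as np

        RAW_MIN_C, RAW_MAX_C = -20.0, 300.0        # 14-bit RAW: 0 -> -20 C, 16383 -> +300 C

        def raw_to_celsius(raw14):
            return RAW_MIN_C + raw14.astype(np.float64) * (RAW_MAX_C - RAW_MIN_C) / 16383.0

        def develop_infrared(raw14, center_c, w_c):
            """Map a 14-bit thermal RAW image to a 12-bit luminance image so that
            the band Center +/- W receives most of the output codes (1023..3071)."""
            t = raw_to_celsius(raw14)
            lo, hi = center_c - w_c, center_c + w_c
            th_btm, th_top = 2047 - 1024, 2047 + 1024
            out = np.empty(t.shape, dtype=np.float64)
            mid = (t >= lo) & (t <= hi)
            out[mid] = th_btm + (t[mid] - lo) * (th_top - th_btm) / (hi - lo)
            below, above = t < lo, t > hi
            out[below] = (t[below] - RAW_MIN_C) * th_btm / (lo - RAW_MIN_C)
            out[above] = th_top + (t[above] - hi) * (4095 - th_top) / (RAW_MAX_C - hi)
            return np.clip(np.round(out), 0, 4095).astype(np.uint16)

        raw = np.random.randint(0, 16384, size=(480, 640), dtype=np.uint16)
        img12 = develop_infrared(raw, center_c=30.5, w_c=10.0)   # Center 30.5 C maps to 2047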
  • The readout of the infrared light RAW image D108 from the infrared light RAW image memory 108 is started at t202, which is the timing at which the generation of the temperature parameters D107 is completed. Note that, as shown in FIG. 2, storing of the infrared light RAW image D104 is already completed at t202.
  • t203 is also a timing at which the readout of an A image D112 and a B image D113, which will be described later, is started.
  • The following describes processing in which the image capturing apparatus generates phase difference information D114.
  • The A image D101 and the B image D102 stored in the parallax image memory 105 are read out as an A image D112 and a B image D113, and input to the ranging unit 115. Furthermore, the subject region information D111 is also input to the ranging unit 115.
  • The ranging unit 115 generates the phase difference information D114 by computing the amount of displacement between the A image D112 and the B image D113 with respect to the region indicated by the subject region information D111.
  • The computation of the amount of displacement in the ranging unit 115 can be executed using any known technique; one classical approach is sketched below.
  • The readout of the A image D112 and the B image D113 from the parallax image memory 105 is started at t203, at which the generation of the subject region information D111 is completed.
  • At t203, the generation of the phase difference information D114 is also started; it is completed at t204.
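  • A hedged sketch of such a displacement computation, assuming a simple sum-of-absolute-differences (SAD) search over horizontal shifts inside the subject region (all names here are illustrative, not from the patent):

        import numpy as np

        def phase_difference(a_img, b_img, region_mask, max_shift=16):
            """Estimate the horizontal displacement (pixels) between the A and B
            parallax images inside the subject region by minimizing the SAD."""
            ys, xs = np.nonzero(region_mask)
            y0, y1 = ys.min(), ys.max() + 1        # bounding box of the subject region
            x0, x1 = xs.min(), xs.max() + 1
            a = a_img[y0:y1, x0:x1].astype(np.float64)
            best_shift, best_sad = 0, np.inf
            for s in range(-max_shift, max_shift + 1):
                if x0 + s < 0 or x1 + s > b_img.shape[1]:
                    continue                       # shifted window would leave the frame
                sad = np.abs(a - b_img[y0:y1, x0 + s:x1 + s].astype(np.float64)).sum()
                if sad < best_sad:
                    best_sad, best_shift = sad, s
            return best_shift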
  • t204 is also a timing at which focus adjustment processing (AF control) is started.
  • The phase difference information D114 is input to the visible light focus control unit 116 and the infrared light focus control unit 117.
  • The visible light focus control unit 116 generates visible light lens driving data D115 by calculating the amount of movement of a focusing lens included in the visible light optical system 100 based on the phase difference information D114, and applying an error correction associated with lens aberration to the calculated amount of movement.
  • Similarly, the infrared light focus control unit 117 generates infrared light lens driving data D116 by calculating the amount of movement of a focusing lens included in the infrared light optical system 101 based on the phase difference information D114, and applying an error correction associated with lens aberration to the calculated amount of movement.
  • The visible light lens driving data D115 and the infrared light lens driving data D116 are set in the visible light lens driving unit 118 and the infrared light lens driving unit 119, respectively.
  • The visible light lens driving unit 118 adjusts the focus of the visible light optical system 100 by driving the focusing lens included in the visible light optical system 100 based on the visible light lens driving data D115.
  • Likewise, the infrared light lens driving unit 119 adjusts the focus of the infrared light optical system 101 by driving the focusing lens included in the infrared light optical system 101 based on the infrared light lens driving data D116. A simple model of this conversion is sketched below.
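  • The patent does not spell out how a phase difference is converted into a driving amount; the linearized model below is purely an assumption (a calibrated conversion coefficient plus an aberration-dependent correction), with invented names and values:

        def lens_drive_amount(phase_diff_px, k_conversion, aberration_correction=0.0):
            """Hypothetical linear model: image-plane phase shift (pixels) ->
            focusing-lens movement, plus a lens-aberration error correction."""
            defocus = k_conversion * phase_diff_px    # image-plane defocus estimate
            return defocus + aberration_correction    # corrected driving amount

        # The visible and infrared optics would use separately calibrated values:
        drive_visible = lens_drive_amount(3.2, k_conversion=0.018)                                  # D115 analogue
        drive_infrared = lens_drive_amount(3.2, k_conversion=0.051, aberration_correction=-0.002)   # D116 analogue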
  • In the above-described manner, the image capturing apparatus of the present embodiment performs focus control for the CMOS image sensor 102 and the microbolometer 103 based on the state of focus in the region inside the visible light image that corresponds to the subject region in the infrared light image. Consequently, the subject can be brought into focus more accurately.
  • The generation of the visible light lens driving data D115 and the infrared light lens driving data D116, the operations of the visible light lens driving unit 118 and the infrared light lens driving unit 119, and the movements of the focusing lenses included in the visible light optical system 100 and the infrared light optical system 101 are started at t204, which is the timing at which the generation of the phase difference information D114 is completed, and are completed at t205.
  • With this timing design, the image capturing apparatus can set the frame cycle T200 shown in FIG. 2 at 1/30 sec, which is a standard frame cycle for the microbolometer 103.
  • As a result, the image capturing apparatus can perform shooting in a state where the subject has been brought into focus on a frame-by-frame basis, with respect to both the visible light optical system 100 and the infrared light optical system 101.
  • t200′ is the timing of the beginning of the frame cycle for the next frame.
  • From t200′ onward, processing similar to the processing from t200 to t206 related to the current frame is executed.
  • The following describes processing in which the image capturing apparatus generates and outputs display images.
  • The visible light image D105, the infrared light image D109, the image analysis data D106, the temperature parameters D107, and the subject region information D111 are input to the image output unit 114.
  • The image output unit 114 generates display images based on these input images, data, parameters, and information.
  • The image output unit 114 outputs the generated display images to a display (not shown) of the image capturing apparatus. The image output unit 114 can also distribute the generated display images to an external surveillance system (not shown) via, for example, a communication network.
  • FIG. 6 shows examples of the display images generated by the image output unit 114. The display images include a visible light image, a thermal image, and an image indicating detection information.
  • A frame based on the subject region information D111 (i.e., a frame indicating the region of the object to be surveilled) is superimposed on the visible light image and the thermal image.
  • The detection information includes information indicating the type (a lion in the examples of FIG. 6) of the detected subject (the object to be surveilled), and information indicating the type (grass in the examples of FIG. 6) of the detected obstacle, as information based on the image analysis data D106. Furthermore, the detection information includes the expected body surface temperature and the detected body surface temperature of the object to be surveilled (the specific animal) as information based on the temperature parameters D107.
  • The expected body surface temperature is a body surface temperature that is expected in a case where there is no influence of the obstacle, and is 36.0° C. in the examples of FIG. 6.
  • The detected body surface temperature corresponds to “Center”, which has been described with reference to FIG. 4 and FIG. 5, and is 30.5° C. in the examples of FIG. 6.
  • From this display, a surveillant can tell whether the temperature indicated by the thermal image has fluctuated due to the obstacle and, if so, to what extent.
  • In this way, the image output unit 114 can provide notification of the various types of information included in the display images by generating and outputting them, for example as sketched below.
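  • A hedged OpenCV sketch of such display generation (layout, colors, and field names are all invented; the thermal input is assumed to be an 8-bit grayscale rendering):

        import cv2
        import numpy as np

        def render_display(visible_bgr, thermal_gray8, box, info):
            """Superimpose the subject-region frame and a detection-info panel."""
            vis = visible_bgr.copy()
            thm = cv2.applyColorMap(thermal_gray8, cv2.COLORMAP_JET)   # pseudo-color thermal view
            x, y, w, h = box                                           # from subject region info D111
            for img in (vis, thm):
                cv2.rectangle(img, (x, y), (x + w, y + h), (0, 0, 255), 2)
            lines = ["subject: " + info["subject"], "obstacle: " + info["obstacle"],
                     "expected: %.1f C" % info["expected_c"], "detected: %.1f C" % info["detected_c"]]
            panel = np.zeros_like(vis)
            for i, s in enumerate(lines):
                cv2.putText(panel, s, (10, 30 + 25 * i), cv2.FONT_HERSHEY_SIMPLEX, 0.6, (255, 255, 255), 1)
            return np.hstack([vis, thm, panel])

        display = render_display(np.zeros((240, 320, 3), np.uint8), np.zeros((240, 320), np.uint8),
                                 box=(120, 80, 60, 50),
                                 info={"subject": "lion", "obstacle": "grass",
                                       "expected_c": 36.0, "detected_c": 30.5})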
  • By performing the above-described focus adjustment with respect to the identified subject region, the subject in this region can be brought into focus.
  • As described above, the image capturing apparatus obtains an image (a first image) that has been generated by the CMOS image sensor 102 through photoelectric conversion of incident light. Also, the image capturing apparatus obtains an image (temperature distribution information) indicating the temperature distribution that has been detected by the microbolometer 103 based on incident electromagnetic waves.
  • The image capturing apparatus detects a subject (e.g., a lion) and an obstacle that exists in front of the subject (e.g., grass, tree branches, a cloud of dust, or the like) from the first image. Based on the result of detection of the obstacle, the image capturing apparatus estimates the temperature corresponding to the subject in the temperature distribution information.
  • Based on the estimated temperature, the image capturing apparatus identifies a subject region that includes the subject in the temperature distribution information.
  • The temperature distribution information that is used in identifying the subject region may be an image before the development, such as the infrared light RAW image D108, or may be an image after the development, such as the infrared light image D109.
  • In this manner, the temperature that has been estimated based on the result of detection of an obstacle in an image generated through photoelectric conversion of incident light is used as the temperature of the subject that is referenced in detecting the subject (identifying the subject region) based on the temperature distribution information. This makes it possible to improve the accuracy of subject detection based on temperature distribution information indicating a temperature distribution that has been detected based on incident electromagnetic waves.
  • Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s).
  • The computer may comprise one or more processors (e.g., a central processing unit (CPU), a micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions.
  • The computer executable instructions may be provided to the computer, for example, from a network or the storage medium.
  • The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Radiation Pyrometers (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

There is provided an image processing apparatus. An obtainment unit obtains a first image that has been generated through photoelectric conversion of incident light, and temperature distribution information indicating a temperature distribution that has been detected based on an incident electromagnetic wave. A detection unit detects a subject and an obstacle that exists in front of the subject from the first image. An estimation unit estimates a temperature corresponding to the subject in the temperature distribution information based on a result of detection of the obstacle. An identification unit identifies a subject region that includes the subject in the temperature distribution information based on the estimated temperature.

Description

    BACKGROUND OF THE INVENTION
    Field of the Invention
  • The present invention relates to an image processing apparatus, an image capturing apparatus, an image processing method, and a storage medium.
  • Description of the Related Art
  • A CMOS image sensor that is often used in a common camera can generate an image by applying photoelectric conversion to electromagnetic waves in the range of visible light to near infrared (wavelengths of approximately 400 to 1000 nm), in a state where an infrared cut-off filter is not provided therein. Also, a thermal camera equipped with a thermal image sensor (thermal sensor), which can generate an image of heat on a surface of an object by detecting far infrared (longwave infrared (LWIR), with a wavelength of approximately 7 to 14 μm) radiated by the object, has been productized.
  • A thermal camera performs contactless sensing of the heat of an object with a temperature under a common social environment (e.g., a temperature in the range of approximately −20° C. to 800° C.) as the intensity of radiated infrared, and generates an image of a temperature distribution in a shooting region. This enables a user to recognize the heat of the object as an image. Furthermore, as a thermal sensor detects infrared radiated by an object, it can perform shooting even under an environment where the light amount of visible light is zero, and has been used as, for example, a night-vision camera. In addition, a thermal camera has been widely utilized also as a contactless heat camera in maintaining equipment, making a determination about extinguishment of a fire, measuring a body temperature, and so forth. Moreover, due to its characteristics as both a night-vision camera and a heat camera, a thermal camera has also been utilized as an advanced surveillance camera that especially surveils an object with a specific temperature (a person, an animal, a vehicle, a marine vessel, an aircraft, or the like) in an outdoor field.
  • Thermal sensors used in thermal cameras include quantum-type sensors that make use of the photoelectric effect of infrared, and heat-type sensors that do not make use of the photoelectric effect. A microbolometer is known as a typical example of the heat-type sensors. A microbolometer is a thermal sensor including a two-dimensional array of minute, thin films that are approximately 10 to 20 μm on a side and made of a material with a high thermal sensitivity to far infrared (e.g., vanadium oxide). When the temperature of a minute electrode has been changed by radiated far infrared, the electrical resistance of the electrode changes. A thermal image can be obtained by electrically detecting the change in the resistance.
  • With regard to a thermal sensor, the level of difficulty in manufacturing is high, and it is difficult to reduce the size thereof and provide a large number of pixels therein, compared to a common CMOS image sensor. The number of pixels in a thermal sensor is approximately equivalent to the VGA size in many cases, and is approximately equivalent to the Full-HD size (1920×1080) at most even if its size is larger than the VGA size.
  • Japanese Patent Laid-Open No. 2008-174028 and Japanese Patent Laid-Open No. 2016-85739 disclose techniques to use a visible light camera and a thermal camera in combination. According to the technique of Japanese Patent Laid-Open No. 2008-174028, the thermal camera specifies an image region of an object with a specific temperature, and the object in the specified image region is re-examined on an image of the visible light or near-infrared camera. In a case where the result of the re-examination indicates that the object with the specific temperature is not an expected object (e.g., person), information of the specified image region is deleted. According to the technique of Japanese Patent Laid-Open No. 2016-85739, the thermal camera specifies a region of an object with a specific temperature, and then parameters for visible light image processing are changed in order to improve the visibility of the object with the specific temperature on a visible light image.
  • During the surveillance of an object with a specific temperature (a person, an animal, a vehicle, a marine vessel, an aircraft, or the like) using a thermal camera, an obstacle (plants, tree branches, a cloud of dust, or the like) that does not completely hide the object to be surveilled may appear between the object to be surveilled and the thermal camera. In this case, due to, for example, a relatively small number of pixels in a thermal sensor, the thermal sensor may not be able to detect far infrared from the object to be surveilled in distinction from far infrared from the obstacle. In this case, the heat of the object to be surveilled and the heat of the obstacle are mixed, and a thermal image shows a region of a temperature that is different from (lower or higher than) the expected temperature of the object to be surveilled. However, whether this temperature difference is attributed to the existence of the obstacle cannot be determined solely based on the thermal image. Thus, if an obstacle exists between the object to be surveilled and the thermal camera, the accuracy of detection of the object to be surveilled decreases. The techniques of Japanese Patent Laid-Open No. 2008-174028 and Japanese Patent Laid-Open No. 2016-85739 cannot address such a problem.
  • SUMMARY OF THE INVENTION
  • The present invention has been made in view of the foregoing situation, and provides a technique to improve the accuracy of subject detection based on temperature distribution information that indicates a temperature distribution that has been detected based on an incident electromagnetic wave.
  • According to a first aspect of the present invention, there is provided an image processing apparatus, comprising at least one processor and/or at least one circuit which functions as: an obtainment unit configured to obtain a first image that has been generated through photoelectric conversion of incident light, and temperature distribution information indicating a temperature distribution that has been detected based on an incident electromagnetic wave; a detection unit configured to detect a subject and an obstacle that exists in front of the subject from the first image; an estimation unit configured to estimate a temperature corresponding to the subject in the temperature distribution information based on a result of detection of the obstacle; and an identification unit configured to identify a subject region that includes the subject in the temperature distribution information based on the estimated temperature.
  • According to a second aspect of the present invention, there is provided an image capturing apparatus, comprising: the image processing apparatus according to the first aspect; an image sensor configured to generate the first image through photoelectric conversion of the incident light; and a temperature distribution detector configured to generate the temperature distribution information by detecting the temperature distribution based on the incident electromagnetic wave.
  • According to a third aspect of the present invention, there is provided an image processing method executed by an image processing apparatus, comprising: obtaining a first image that has been generated through photoelectric conversion of incident light, and temperature distribution information indicating a temperature distribution that has been detected based on an incident electromagnetic wave; detecting a subject and an obstacle that exists in front of the subject from the first image; estimating a temperature corresponding to the subject in the temperature distribution information based on a result of detection of the obstacle; and identifying a subject region that includes the subject in the temperature distribution information based on the estimated temperature.
  • According to a fourth aspect of the present invention, there is provided a non-transitory computer-readable storage medium which stores a program for causing a computer to execute an image processing method comprising: obtaining a first image that has been generated through photoelectric conversion of incident light, and temperature distribution information indicating a temperature distribution that has been detected based on an incident electromagnetic wave; detecting a subject and an obstacle that exists in front of the subject from the first image; estimating a temperature corresponding to the subject in the temperature distribution information based on a result of detection of the obstacle; and identifying a subject region that includes the subject in the temperature distribution information based on the estimated temperature.
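  • As a purely illustrative, minimal end-to-end sketch of the above method (not the actual implementation; every function here is a simplified stand-in for the corresponding unit, and all values are invented):

        import numpy as np

        def detect_subject_and_obstacle(first_image):
            # Stand-in detector: returns fixed, fake detection results.
            return {"bbox": (10, 10, 50, 50)}, {"type": "grass", "density": 0.5}

        def estimate_subject_temperature(expected_temp_c, obstacle):
            # Stand-in estimator: attenuate the expected temperature by obstacle density.
            return expected_temp_c - 10.0 * obstacle["density"]

        def identify_subject_region(temp_map_c, center_c, w_c=4.0):
            # Group pixels whose temperature lies within Center +/- W.
            return np.abs(temp_map_c - center_c) <= w_c

        visible = np.zeros((120, 160, 3), dtype=np.uint8)        # dummy first image
        temp_map = np.full((120, 160), 22.0)                     # dummy temperature map (deg C)
        temp_map[40:60, 60:90] = 30.8                            # subject seen behind grass
        subject, obstacle = detect_subject_and_obstacle(visible)
        center = estimate_subject_temperature(36.0, obstacle)    # 31.0 C
        region = identify_subject_region(temp_map, center)
        print(center, int(region.sum()))                         # 31.0 600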
  • Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a system configuration diagram of an image capturing apparatus.
  • FIG. 2 is a timing chart of processing executed by the image capturing apparatus.
  • FIG. 3 is a diagram showing a detailed configuration of a visible light image analysis unit 109.
  • FIG. 4 is a diagram showing a detailed configuration of a temperature parameter setting unit 110.
  • FIG. 5 is a diagram for describing the details of processing executed by an infrared light development unit 111.
  • FIG. 6 is a diagram showing examples of display images generated by an image output unit 114.
  • DESCRIPTION OF THE EMBODIMENTS
  • Hereinafter, embodiments will be described in detail with reference to the attached drawings. Note, the following embodiments are not intended to limit the scope of the claimed invention. Multiple features are described in the embodiments, but limitation is not made to an invention that requires all such features, and multiple such features may be combined as appropriate. Furthermore, in the attached drawings, the same reference numerals are given to the same or similar configurations, and redundant description thereof is omitted.
  • FIRST EMBODIMENT Configuration of Image Capturing Apparatus
  • FIG. 1 is a system configuration diagram of an image capturing apparatus according to a first embodiment. This image capturing apparatus also functions as an image processing apparatus. In FIG. 1, 100 denotes a visible light optical system, 101 denotes an infrared light optical system, 102 denotes a CMOS image sensor, 103 denotes a microbolometer, 104 denotes a parallax image generation unit, 105 denotes a parallax image memory, and 106 denotes a visible light development unit. Also, 107 denotes a visible light image memory, 108 denotes an infrared light RAW image memory, 109 denotes a visible light image analysis unit, 110 denotes a temperature parameter setting unit, 111 denotes an infrared light development unit, and 113 denotes a subject region identification unit. Furthermore, 114 denotes an image output unit, 115 denotes a ranging unit, 116 denotes a visible light focus control unit, 117 denotes an infrared light focus control unit, 118 denotes a visible light lens driving unit, and 119 denotes an infrared light lens driving unit.
  • Each component and each memory in the image capturing apparatus shown in FIG. 1 can be implemented based on any known technique with use of such hardware as a CPU and a memory (a ROM, a RAM, or the like), such software as a control program, or a combination of hardware and software. Note that no particular limitation is intended regarding a specific implementation method.
  • Also, D100 denotes a visible light RAW image, D101 denotes an A image, D102 denotes a B image, D103 denotes a visible light image, D104 denotes an infrared light RAW image, D105 denotes a visible light image, D106 denotes image analysis data, and D107 denotes temperature parameters. Furthermore, D108 denotes an infrared light RAW image, D109 denotes an infrared light image, D111 denotes subject region information, D112 denotes an A image, and D113 denotes a B image. In addition, D114 denotes phase difference information, D115 denotes visible light lens driving data, and D116 denotes infrared light lens driving data.
  • Processing for Generating Parallax Images, Visible Light Image, and Infrared Light RAW Image
  • With reference to FIG. 1, a description is now given of processing in which the image capturing apparatus generates parallax images, a visible light image, and an infrared light RAW image.
  • The CMOS image sensor 102 generates a visible light RAW image D100 by applying photoelectric conversion to visible light that has been incident via the visible light optical system 100 (incident light). The CMOS image sensor 102 includes a control unit and a pixel unit that includes a plurality of divided pixels, and is configured to generate information for image-plane phase detection autofocus (AF). Any known configuration can be used as a specific configuration of the CMOS image sensor 102. The visible light RAW image D100 is transmitted to the visible light development unit 106 and the parallax image generation unit 104.
  • The parallax image generation unit 104 executes division processing with respect to the visible light RAW image D100, thereby generating a plurality of parallax images that exhibit parallax relative to each other. In the example of FIG. 1, the number of parallax images is two, and the two parallax images are denoted by “A image” and “B image”, respectively. The A image and the B image generated by the parallax image generation unit 104 are transferred to the parallax image memory 105 as the A image D101 and the B image D102, respectively, and temporarily stored into the parallax image memory 105.
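  • As an illustrative aside (not from the patent), for a dual-pixel sensor this division processing can amount to de-interleaving the two sub-pixels of each pixel; the column-interleaved layout assumed below is hypothetical:

        import numpy as np

        def split_parallax(raw):
            """Split a dual-pixel RAW frame into the A and B parallax images,
            assuming the two sub-pixels of each pixel are interleaved column-wise."""
            a_img = raw[:, 0::2]      # left sub-pixels  -> A image (D101)
            b_img = raw[:, 1::2]      # right sub-pixels -> B image (D102)
            return a_img, b_img

        raw = np.arange(4 * 8, dtype=np.uint16).reshape(4, 8)    # toy frame: 4 rows x 4 dual pixels
        a, b = split_parallax(raw)
        print(a.shape, b.shape)       # (4, 4) (4, 4)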
  • The visible light development unit 106 generates a visible light image D103 by executing development processing with respect to the visible light RAW image D100. The visible light image D103 is temporarily stored into the visible light image memory 107.
  • The microbolometer 103 generates an infrared light RAW image D104 by performing temperature distribution detection based on electromagnetic waves that have been incident via the infrared light optical system 101. The electromagnetic waves that have been incident via the infrared light optical system 101 include far-infrared light. In the present specification, for the sake of simplicity, the far-infrared light used in the temperature distribution detection may be simply referred to as “infrared light”. The infrared light RAW image D104 is temporarily stored into the infrared light RAW image memory 108.
  • Next, with reference to a timing chart of FIG. 2 , a description is given of individual processing timings in the above-described processing for generating the parallax images, the visible light image, and the infrared light RAW image. In FIG. 2 , T200 denotes a frame cycle (one frame period), and T201 denotes a CMOS accumulation period (a period in which the CMOS image sensor 102 accumulates charges) for one frame. t200 to t207 denote various timings in one frame period.
  • t200 is the timing of the beginning of a frame cycle; at t200, the CMOS image sensor 102 completes the accumulation of charges, and starts reading out pixel signals. Furthermore, at t200, the microbolometer 103 also starts reading out pixel signals. Note that the concept of accumulation does not exist in the microbolometer 103. Along with the start of the readout, processing for generating the visible light RAW image D100, the parallax images (the A image D101 and the B image D102), the visible light image D103, and the infrared light RAW image D104 is started. The parallax images (the A image D101 and the B image D102) are stored into the parallax image memory 105, the visible light image D103 is stored into the visible light image memory 107, and the infrared light RAW image D104 is stored into the infrared light RAW image memory 108.
  • Image Analysis Processing
  • With reference to FIG. 1 , the following describes image analysis processing executed by the image capturing apparatus.
  • The visible light image D103 stored in the visible light image memory 107 is read out as a visible light image D105 and input to the visible light image analysis unit 109. The visible light image analysis unit 109 generates image analysis data D106 by executing the image analysis processing with respect to the visible light image D105. The image analysis data D106 is input to the temperature parameter setting unit 110. The temperature parameter setting unit 110 generates temperature parameters D107 based on the image analysis data D106.
  • The details of processing executed by the visible light image analysis unit 109 and the temperature parameter setting unit 110 are now described with reference to FIG. 3 and FIG. 4 .
  • In FIG. 3, 301 denotes an animal detection neural network (NN), 302 denotes an animal detection database (DB), 303 denotes an obstacle evaluation neural network (NN), 304 denotes an obstacle evaluation database (DB), and 305 denotes a data output unit. Furthermore, D301 denotes animal position information, D302 denotes obstacle attribute information, D303 denotes obstacle intensity information, and D106 denotes image analysis data.
  • In FIG. 4, 401 denotes a data separation unit, 402 denotes a temperature setting neural network (NN), and 403 denotes a temperature setting database (DB).
  • The animal detection NN 301 of FIG. 3 performs computation according to a pre-configured neural network structure using the animal detection DB 302, thereby generating animal position information D301 indicating the position of a specific animal (a subject to be surveilled) included in the visible light image D105. Here, the animal detection DB 302 has been generated in advance using training images of the specific animal and a predetermined training algorithm.
  • Using the obstacle evaluation DB 304 and the animal position information D301, the obstacle evaluation NN 303 analyzes whether an obstacle, such as grass, tree branches and leaves, or a cloud of dust, exists in front of the specific animal in the visible light image D105. As a result of this analysis (a result of detection of the obstacle), the obstacle evaluation NN 303 generates obstacle attribute information D302 indicating the type of the obstacle, as well as obstacle intensity information D303 indicating the density degree of the obstacle. Here, the obstacle evaluation DB 304 has been generated in advance using a predetermined training algorithm and training images of a plurality of types of objects that can act as obstacles to animal detection, such as grass, tree branches and leaves, and clouds of dust.
  • The data output unit 305 groups the obstacle attribute information D302 and the obstacle intensity information D303 in accordance with a predetermined format, and outputs them as image analysis data D106.
  • Next, the data separation unit 401 of FIG. 4 separates the obstacle attribute information D302 and the obstacle intensity information D303 from the image analysis data D106. Using the temperature setting DB 403, the temperature setting NN 402 generates the temperature parameters D107 by estimating how the expected body surface temperature of the specific animal will appear under the obstacle condition indicated by the obstacle attribute information D302 and the obstacle intensity information D303.
  • The temperature parameters D107 include, for example, Center (° C.), W (° C.), and the expected body surface temperature of the specific animal.
  • Among the parameters included in the temperature parameters D107, Center (° C.) and W (° C.) are examples of parameters used by the infrared light development unit 111 and the subject region identification unit 113. Center (° C.) denotes the expected temperature of the object to be surveilled (the specific animal), whereas W (° C.) denotes the range of temperatures to be surveilled, centered at the expected temperature. For example, in a case where the body surface temperature of the specific animal is expected to be 36° C. and there is no influence of an obstacle, Center=36 (° C.). Also, in consideration of fluctuations in the body surface temperature caused by the ambient temperature and the like, W is set to 20 (° C.), for example. The details of processing of the infrared light development unit 111 that uses Center (° C.) and W (° C.) will be described later with reference to FIG. 5.
  • Among the parameters included in the temperature parameters D107, the expected body surface temperature of the specific animal is an example of a parameter used by the image output unit 114. The details of processing of the image output unit 114 that uses the expected body surface temperature of the specific animal will be described later.
  • For example, the temperature setting NN 402 can estimate that “in a case where the grass that has currently been detected exists in front of an object of 36° C., the object appears in a thermal image as an object of 30.5° C.”. Therefore, even in a case where the body surface temperature of the specific animal is expected to be 36° C., a temperature estimated from the result of detection of the obstacle, such as Center=30.5 (° C.), is set in accordance with the condition of the obstacle.
  • Here, the temperature setting DB 403 has been generated in advance using a predetermined training algorithm together with the attributes and density degrees of obstacles such as grass, tree branches and leaves, and clouds of dust, as well as the body surface temperature of the specific animal in a case where no such obstacle exists.
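  • Although this mapping from obstacle condition to apparent temperature is learned by the temperature setting NN 402 in the present embodiment, the following sketch illustrates the underlying idea with a hand-written stand-in: a hypothetical transmittance table that mixes the subject's radiation with the ambient radiation. Every name and numeric value here is an assumption for illustration only, not data from the embodiment.

```python
# A hypothetical transmittance table standing in for the trained
# temperature setting DB 403 / NN 402: the fraction of the subject's
# thermal signal assumed to pass through each obstacle condition.
OBSTACLE_TRANSMITTANCE = {
    ("grass", "dense"): 0.65,
    ("grass", "sparse"): 0.85,
    ("branches", "dense"): 0.55,
    ("dust", "sparse"): 0.90,
}

def estimate_apparent_temperature(body_temp_c: float,
                                  ambient_temp_c: float,
                                  obstacle_type: str,
                                  density: str) -> float:
    """Estimate the apparent temperature of a subject seen through an
    obstacle as a linear mix of subject and ambient radiation weighted
    by the assumed transmittance (1.0 when the condition is unknown)."""
    t = OBSTACLE_TRANSMITTANCE.get((obstacle_type, density), 1.0)
    return ambient_temp_c + t * (body_temp_c - ambient_temp_c)

# Example: a 36 deg C subject behind dense grass at 20 deg C ambient
# appears at 20 + 0.65 * 16 = 30.4 deg C, close to the Center=30.5
# (deg C) figure used in the text.
```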
  • Note that any known structures, training methods, and operations can be used as the structures, training methods, and operations of the above-described animal detection NN 301, obstacle evaluation NN 303, and temperature setting NN 402.
  • Furthermore, the above description has assumed that the animal detection NN 301 generates and outputs the animal position information D301. However, the animal detection NN 301 may additionally output information (animal type information) indicating the type of the detected subject. In this case, the obstacle evaluation NN 303 can transfer the animal type information to the data output unit 305, and the data output unit 305 can include the animal type information in the image analysis data D106. The data separation unit 401 then separates the animal type information from the image analysis data D106 and transfers it to the temperature setting NN 402. In this way, the temperature setting NN 402 can estimate the temperature of the surveillance target based not only on the obstacle attribute information D302 and the obstacle intensity information D303, but also on the animal type information. By utilizing the animal type information in this manner, appropriate temperature estimation can be performed in accordance with the type of animal actually detected, even in a case where neither the type of animal to be detected nor its obstacle-free temperature is known beforehand. This makes it possible to realize an image capturing apparatus that can surveil a plurality of types of animals (subjects).
  • Next, with reference to the timing chart of FIG. 2 , a description is given of individual processing timings in the above-described image analysis processing.
  • At t201, the readout of the visible light image D105 from the visible light image memory 107 is started. t201 is a timing at which a predetermined period has elapsed since t200, the timing at which storing of the visible light image D103 is started. The time difference between t200 and t201 can be set to the smallest value at which write and read accesses to the same address in the physical memory (e.g., DRAM) that implements the visible light image memory 107 still occur in the desired order.
  • In parallel with the readout of the visible light image D105, the generation of the image analysis data D106 and the temperature parameters D107 is executed, which is completed at t202. t202 is also a timing at which the readout of an infrared light RAW image D108, which will be described later, is started.
  • Processing from Development of Infrared Light RAW Image D104 to Generation of Subject Region Information D111
  • With reference to FIG. 1 , the following describes processing executed by the image capturing apparatus, from development of the infrared light RAW image D104 to generation of the subject region information D111.
  • The infrared light RAW image D104 stored in the infrared light RAW image memory 108 is read out as an infrared light RAW image D108, and input to the infrared light development unit 111. The infrared light development unit 111 generates an infrared light image D109 by executing development processing with respect to the infrared light RAW image D108 based on the temperature parameters D107 input from the temperature parameter setting unit 110. The infrared light image D109 is input to the subject region identification unit 113.
  • The subject region identification unit 113 identifies a region of the subject in the infrared light image D109 based on the estimated temperature (Center) of the subject (the specific animal to be surveilled), and generates subject region information D111 indicating the region of the subject. No particular limitation is intended regarding the identification method; for example, the subject region identification unit 113 can identify the region of the subject by executing grouping processing with respect to pixels in a predetermined temperature range that includes the estimated temperature of the subject (see the sketch below). The grouping processing for pixels corresponding to a specific temperature can be executed using any known technique.
  • Note that the infrared light image D109 is different from the infrared light RAW image D108 in that it is a developed image with enhanced contrast in the temperature range of Center±W (° C.). However, as can be understood from FIG. 5 , both of the infrared light image D109 and the infrared light RAW image D108 include information of the temperature range of Center±W (° C.). Therefore, the subject region identification unit 113 may identify the region of the subject in the infrared light RAW image D108 instead of the infrared light image D109. Regardless of which one of the infrared light image D109 and the infrared light RAW image D108 is used, the region of the subject is identified in information indicating the temperature distribution detected by the microbolometer 103 (temperature distribution information) as a result of processing of the subject region identification unit 113.
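  • A minimal sketch of such grouping processing follows, assuming the temperature distribution information has already been converted to degrees Celsius per pixel. The function name, the tolerance value, and the use of connected-component labeling are illustrative assumptions, not the specific method of the subject region identification unit 113.

```python
import numpy as np
from scipy import ndimage

def identify_subject_region(temp_c: np.ndarray, center_c: float,
                            tolerance_c: float = 3.0):
    """Group pixels whose temperature lies near the estimated subject
    temperature (Center) and return the bounding box of the largest
    connected group as (left, top, right, bottom), or None."""
    mask = np.abs(temp_c - center_c) <= tolerance_c
    labels, n = ndimage.label(mask)  # connected-component grouping
    if n == 0:
        return None
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    largest = int(np.argmax(sizes)) + 1  # label of the biggest group
    ys, xs = np.nonzero(labels == largest)
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())
```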
  • With reference to FIG. 5, a description is now given of the details of processing executed by the infrared light development unit 111. In FIG. 5, 501 denotes an example of a bit range of the infrared light RAW image D108. In the example of FIG. 5, the infrared light RAW image D108 is a 14-bit RAW image, with each pixel having a value in the range from 0 to 16383; the value 0 indicates −20° C., and 16383 indicates +300° C. As stated earlier, Center (° C.) denotes the expected temperature of the object to be surveilled, W (° C.) denotes the range of temperatures to be surveilled centered at that temperature, and the settings of Center=36 (° C.) and W=20 (° C.) are configured, for example, in a case where there is no influence of an obstacle. Furthermore, in a case where there is an influence of an obstacle, a temperature estimated from the result of detection of the obstacle, such as Center=30.5 (° C.), is set. Note that Center (° C.) is also used by the image output unit 114.
  • 502 denotes an example of a bit range of the infrared light image D109. In the example of FIG. 5 , the infrared light image D109 is a 12-bit luminance image, with each pixel having a value in the range from 0 to 4095.
  • The infrared light development unit 111 generates the infrared light image D109 by tone-mapping Center (° C.) to 2047, Center−W (° C.) to Th_btm, and Center+W (° C.) to Th_top. Th_top and Th_btm are preset values; for example, Th_top=2047+1024 and Th_btm=2047−1024. In this case, the temperature range of Center±W (° C.) in the infrared light RAW image D108 is mapped to the luminance values of 2047±1024 in the infrared light image D109. Furthermore, pixels in the infrared light RAW image D108 that correspond to temperatures outside the temperature range of Center±W (° C.) are set to a value indicating an invalid pixel (e.g., a luminance value of 0) in the infrared light image D109. As a result, the infrared light image D109 becomes an image in which the contrast in the temperature range to be surveilled (a predetermined temperature range including the estimated temperature of the subject) is more enhanced than the contrast outside that range. Note that the tone mapping processing can be executed using any known technique.
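  • The tone mapping described above can be sketched as follows, under the bit ranges of FIG. 5 (a 14-bit RAW image in which 0 indicates −20° C. and 16383 indicates +300° C., and a 12-bit luminance image). The function name and the rounding and clipping choices are assumptions; as noted above, the embodiment permits any known tone mapping technique.

```python
import numpy as np

def develop_infrared(raw14: np.ndarray, center_c: float, w_c: float,
                     th_mid: int = 2047, half_range: int = 1024) -> np.ndarray:
    """Tone-map a 14-bit infrared RAW image (0 = -20 deg C, 16383 =
    +300 deg C) to a 12-bit luminance image: Center maps to 2047,
    Center +/- W maps to 2047 +/- 1024 (Th_btm/Th_top), and
    temperatures outside Center +/- W become invalid pixels (0)."""
    # Recover per-pixel temperature from the 14-bit code values.
    temp_c = -20.0 + raw14.astype(np.float64) * (320.0 / 16383.0)
    # Linear mapping of Center +/- W onto th_mid +/- half_range.
    lum = np.rint(th_mid + (temp_c - center_c) / w_c * half_range)
    lum[np.abs(temp_c - center_c) > w_c] = 0  # mark invalid pixels
    return np.clip(lum, 0, 4095).astype(np.uint16)
```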
  • Next, with reference to the timing chart of FIG. 2 , a description is given of individual processing timings in the above-described processing from development of the infrared light RAW image D104 to generation of the subject region information D111.
  • The readout of the infrared light RAW image D108 from the infrared light RAW image memory 108 is started at t202, which is a timing at which the generation of the temperature parameters D107 is completed. Note that as shown in FIG. 2 , storing of the infrared light RAW image D104 is already completed at t202.
  • In parallel to the readout of the infrared light RAW image D108, the generation of the infrared light image D109 and the subject region information D111 is executed, which is completed at t203. t203 is also a timing at which the readout of an A image D112 and a B image D113, which will be described later, is started.
  • Processing for Generating Phase Difference Information D114
  • With reference to FIG. 1 , the following describes processing in which the image capturing apparatus generates phase difference information D114.
  • The A image D101 and the B image D102 stored in the parallax image memory 105 are read out as an A image D112 and a B image D113, and input to the ranging unit 115. Furthermore, the subject region information D111 is also input to the ranging unit 115.
  • The ranging unit 115 generates the phase difference information D114 by computing the amount of displacement between the A image D112 and the B image D113 with respect to the region indicated by the subject region information D111. The computation of the amount of displacement in the ranging unit 115 can be executed using any known technique.
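  • As one example of such a known technique, the following sketch estimates a horizontal shift between the A image and the B image by minimizing the sum of absolute differences (SAD) over candidate shifts. The function name, the search range, and the edge handling are illustrative assumptions, not the specific computation of the ranging unit 115.

```python
import numpy as np

def estimate_displacement(a: np.ndarray, b: np.ndarray,
                          max_shift: int = 16) -> int:
    """Estimate the horizontal displacement (in pixels) between the A
    and B parallax images over the subject region by minimizing the
    sum of absolute differences (SAD) across candidate shifts."""
    a = a.astype(np.int64)
    b = b.astype(np.int64)
    valid = slice(max_shift, a.shape[1] - max_shift)  # skip wrapped edges
    best_shift, best_sad = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        sad = np.abs(a[:, valid] - np.roll(b, s, axis=1)[:, valid]).sum()
        if sad < best_sad:
            best_sad, best_shift = sad, s
    return best_shift
```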
  • Next, with reference to the timing chart of FIG. 2 , a description is given of individual processing timings in the above-described processing for generating the phase difference information D114.
  • The readout of the A image D112 and the B image D113 from the parallax image memory 105 is started at t203, at which the generation of the subject region information D111 is completed. In parallel with the readout of the A image D112 and the B image D113, the generation of the phase difference information D114 is also started, which is completed at t204. t204 is also a timing at which focus adjustment processing (AF control) is started.
  • Focus Adjustment Processing (AF Control)
  • With reference to FIG. 1 , the following describes focus adjustment processing (AF control) executed by the image capturing apparatus.
  • The phase difference information D114 is input to the visible light focus control unit 116 and the infrared light focus control unit 117. The visible light focus control unit 116 generates visible light lens driving data D115 by calculating the amount of movement of a focusing lens included in the visible light optical system 100 based on the phase difference information D114, and applying an error correction associated with lens aberration to the calculated amount of movement.
  • The infrared light focus control unit 117 generates infrared light lens driving data D116 by calculating the amount of movement of a focusing lens included in the infrared light optical system 101 based on the phase difference information D114, and applying an error correction associated with lens aberration to the calculated amount of movement.
  • The visible light lens driving data D115 and the infrared light lens driving data D116 are set in the visible light lens driving unit 118 and the infrared light lens driving unit 119, respectively. The visible light lens driving unit 118 adjusts the focus of the visible light optical system 100 by driving the focusing lens included in the visible light optical system 100 based on the visible light lens driving data D115. The infrared light lens driving unit 119 adjusts the focus of the infrared light optical system 101 by driving the focusing lens included in the infrared light optical system 101 based on the infrared light lens driving data D116. As a result, AF control for the subject included in the region indicated by the subject region information D111 (i.e., the object to be surveilled) is completed.
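  • The conversion from phase difference to lens drive amount depends on lens-specific calibration data that the embodiment does not detail. The following sketch only illustrates the overall flow of such a conversion; every constant, as well as the simple additive aberration correction, is a placeholder assumption rather than the method of the focus control units 116 and 117.

```python
def compute_lens_drive(shift_px: float,
                       pixel_pitch_um: float = 4.0,
                       defocus_coeff: float = 10.0,
                       focus_sensitivity: float = 1.0,
                       aberration_offset_um: float = 0.0) -> float:
    """Convert an image-plane phase shift (pixels) into a focusing lens
    drive amount (micrometers). defocus_coeff folds in the aperture-
    dependent conversion from shift to defocus; every constant here is
    a placeholder for lens-specific calibration data."""
    defocus_um = shift_px * pixel_pitch_um * defocus_coeff
    drive_um = defocus_um / focus_sensitivity  # lens movement per defocus
    return drive_um + aberration_offset_um     # simple aberration correction
```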
  • As described above, the image capturing apparatus of the present embodiment performs focus control for the CMOS image sensor 102 and the microbolometer 103 based on the state of focus in the region inside the visible light image that corresponds to the subject region in the infrared light image. Consequently, the subject can be brought into focus more accurately.
  • Next, with reference to the timing chart of FIG. 2 , a description is given of individual processing timings in the above-described focus adjustment processing (AF control). The generation of the visible light lens driving data D115 and the infrared light lens driving data D116, the operations of the visible light lens driving unit 118 and the infrared light lens driving unit 119, and the movements of the focusing lens included in the visible light optical system 100 and the focusing lens included in the infrared light optical system 101 are started at t204, which is a timing at which the generation of the phase difference information D114 is completed, and are completed at t205.
  • The image capturing apparatus can set the frame cycle T200 shown in FIG. 2 at 1/30 sec, which is a standard frame cycle for the microbolometer 103. In this case, the image capturing apparatus can perform shooting in a state where the subject has been brought into focus on a frame-by-frame basis, with respect to both of the visible light optical system 100 and the infrared light optical system 101.
  • Note that the accumulation of charges in the CMOS image sensor 102 for the next frame is started at t206, and completed at t200′. t200′ is the timing of the beginning of a frame cycle for the next frame. With respect to the next frame as well, processing that is similar to processing from t200 to t206 related to the current frame is executed.
  • Processing for Generating and Outputting Display Images
  • With reference to FIG. 1 , the following describes processing in which the image capturing apparatus generates and outputs display images.
  • The visible light image D105, the infrared light image D109, the image analysis data D106, the temperature parameters D107, and the subject region information D111 are input to the image output unit 114. The image output unit 114 generates display images based on these images, data, parameters, and information that have been input. The image output unit 114 outputs the generated display images to a display (not shown) of the image capturing apparatus. Furthermore, the image output unit 114 distributes the generated display images to an external surveillance system (not shown) via, for example, a communication network.
  • FIG. 6 shows examples of the display images generated by the image output unit 114. According to the examples of FIG. 6 , the display images include a visible light image, a thermal image, and an image indicating detection information. Furthermore, a frame based on the subject region information D111 (i.e., a frame indicating the region of the object to be surveilled) is superimposed on the visible light image and the thermal image.
  • The detection information includes information indicating a type (a lion according to the examples of FIG. 6 ) of the detected subject (object to be surveilled), and information indicating a type (grass according to the examples of FIG. 6 ) of the detected obstacle, as information based on the image analysis data D106. Furthermore, the detection information includes the expected body surface temperature and the detected body surface temperature of the object to be surveilled (specific animal) as information based on the temperature parameters D107. The expected body surface temperature is a body surface temperature that is expected in a case where there is no influence of the obstacle, and is 36.0° C. according to the examples of FIG. 6 . The detected body surface temperature corresponds to “Center”, which has been described with reference to FIG. 4 and FIG. 5 , and is 30.5° C. according to the examples of FIG. 6 .
  • Therefore, by checking the displayed detection information, a surveillant can tell whether the temperature indicated by the thermal image has fluctuated due to the obstacle and, if so, to what extent.
  • As described above, the image output unit 114 can provide notification of various types of information included in the display images by generating and outputting the display images.
  • Furthermore, as AF control is performed with respect to a subject in a range of a frame based on the subject region information D111 (the details of AF control are as described above), the subject in this region can be brought into focus.
  • Note that in the examples of FIG. 6, a so-called conflict between far and near objects (far-near conflict) can occur due to the grass existing in front of the lion, which is the object to be surveilled; this conflict can be resolved using any known technique.
  • Summary of First Embodiment
  • As described above, according to the first embodiment, the image capturing apparatus obtains an image (a first image) that has been generated by the CMOS image sensor 102 through photoelectric conversion of incident light. Also, the image capturing apparatus obtains an image (temperature distribution information) indicating the temperature distribution that has been detected by the microbolometer 103 based on incident electromagnetic waves. The image capturing apparatus detects a subject (e.g., a lion) and an obstacle that exists in front of the subject (e.g., grass, tree branches, a cloud of dust, or the like) from the first image. Based on the result of detection of the obstacle, the image capturing apparatus estimates the temperature corresponding to the subject in the temperature distribution information. Then, based on the estimated temperature, the image capturing apparatus identifies a subject region that includes the subject in the temperature distribution information. The temperature distribution information that is used in identifying the subject region may be an image before the development, such as the infrared light RAW image D108, or may be an image after the development, such as the infrared light image D109.
  • As described above, according to the present embodiment, the temperature that has been estimated based on the result of detection of an obstacle in an image generated through photoelectric conversion of incident light is used as the temperature of a subject that is referenced in detecting the subject (identifying the subject region) based on the temperature distribution information. This makes it possible to improve the accuracy of subject detection based on the temperature distribution information indicating the temperature distribution that has been detected based on incident electromagnetic waves.
  • OTHER EMBODIMENTS
  • Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
  • While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
  • This application claims the benefit of Japanese Patent Application No. 2022-118871, filed Jul. 26, 2022, which is hereby incorporated by reference herein in its entirety.

Claims (10)

What is claimed is:
1. An image processing apparatus, comprising at least one processor and/or at least one circuit which functions as:
an obtainment unit configured to obtain a first image that has been generated through photoelectric conversion of incident light, and temperature distribution information indicating a temperature distribution that has been detected based on an incident electromagnetic wave;
a detection unit configured to detect a subject and an obstacle that exists in front of the subject from the first image;
an estimation unit configured to estimate a temperature corresponding to the subject in the temperature distribution information based on a result of detection of the obstacle; and
an identification unit configured to identify a subject region that includes the subject in the temperature distribution information based on the estimated temperature.
2. The image processing apparatus according to claim 1, wherein
the estimation unit estimates the temperature further based on a type of the subject.
3. The image processing apparatus according to claim 1, wherein
the at least one processor and/or the at least one circuit further functions as a notification unit configured to provide notification of at least one of a position of the subject region in the temperature distribution information, a position of a region which is inside the first image and which corresponds to the subject region, the estimated temperature, and the result of detection of the obstacle.
4. The image processing apparatus according to claim 1, wherein
the at least one processor and/or the at least one circuit further functions as a generation unit configured to generate a second image indicating the temperature distribution based on the temperature distribution information so that contrast in a predetermined temperature range including the estimated temperature is more enhanced than contrast outside the predetermined temperature range.
5. An image capturing apparatus, comprising:
the image processing apparatus according to claim 1;
an image sensor configured to generate the first image through photoelectric conversion of the incident light; and
a temperature distribution detector configured to generate the temperature distribution information by detecting the temperature distribution based on the incident electromagnetic wave.
6. The image capturing apparatus according to claim 5, wherein
the at least one processor and/or the at least one circuit further functions as a control unit configured to perform focus control for the image sensor and the temperature distribution detector based on a state of focus in a region which is inside the first image and which corresponds to the subject region.
7. The image capturing apparatus according to claim 5, wherein
the incident light includes visible light.
8. The image capturing apparatus according to claim 5, wherein
the incident electromagnetic wave includes far-infrared light.
9. An image processing method executed by an image processing apparatus, comprising:
obtaining a first image that has been generated through photoelectric conversion of incident light, and temperature distribution information indicating a temperature distribution that has been detected based on an incident electromagnetic wave;
detecting a subject and an obstacle that exists in front of the subject from the first image;
estimating a temperature corresponding to the subject in the temperature distribution information based on a result of detection of the obstacle; and
identifying a subject region that includes the subject in the temperature distribution information based on the estimated temperature.
10. A non-transitory computer-readable storage medium which stores a program for causing a computer to execute an image processing method comprising:
obtaining a first image that has been generated through photoelectric conversion of incident light, and temperature distribution information indicating a temperature distribution that has been detected based on an incident electromagnetic wave;
detecting a subject and an obstacle that exists in front of the subject from the first image;
estimating a temperature corresponding to the subject in the temperature distribution information based on a result of detection of the obstacle; and
identifying a subject region that includes the subject in the temperature distribution information based on the estimated temperature.