CN113068019A - Dual-optical camera calibration apparatus, method, electronic apparatus, and storage medium
- Publication number: CN113068019A
- Application number: CN202110287677.8A
- Authority: CN (China)
- Legal status: Granted
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
Abstract
An embodiment of the present application provides a dual-optical camera calibration apparatus, a method, an electronic apparatus, and a storage medium, relating to the technical field of camera calibration. The dual-optical camera calibration apparatus comprises a calibration plate with a reference object, a first fixing member, a second fixing member, and a controller. The first fixing member and the second fixing member are used for fixing a first sensor and a second sensor of the dual-optical camera, respectively; the controller is communicatively connected with the second fixing member and is also communicatively connected with the first sensor and the second sensor. The controller is used for obtaining a first image and a second image, obtaining offset information between the first sensor and the second sensor according to the positions of the reference object in the first image and the second image, and controlling the second fixing member to move according to the offset information so as to calibrate the dual-optical camera. The dual-optical camera calibration apparatus provided by the embodiment of the present application can thus calibrate the two image sensors in a dual-optical camera.
Description
Technical Field
The present disclosure relates to the field of camera calibration technologies, and in particular, to a dual-optical camera calibration apparatus, a method, an electronic apparatus, and a storage medium.
Background
A dual-optical camera contains two image sensors, for example a visible light image sensor and an infrared light image sensor. Because of production errors, assembly errors and the like, the fields of view of the two image sensors in a dual-optical camera may not overlap effectively. For example, in a dual-optical camera in which the field angle of the visible light image sensor is large and the field angle of the infrared light image sensor is small, the field of view of the visible light image sensor should completely contain the field of view of the infrared light image sensor.
Because of the above situation, in the prior art the images acquired by the two image sensors are usually processed so that the image areas falling in the overlapping portion of the two fields of view are treated as the valid image, while the image areas outside the overlapping portion are discarded. Although a valid image can be obtained in this way, when the overlapping portion of the two fields of view is small, the valid image is small and the discarded image area is large, which wastes resources. Therefore, in order to enlarge the valid image, the two image sensors in the dual-optical camera need to be calibrated.
Disclosure of Invention
An object of the embodiments of the present application is to provide a dual-optical camera calibration apparatus, a dual-optical camera calibration method, an electronic apparatus, and a storage medium, so as to calibrate two image sensors in a dual-optical camera. The specific technical scheme is as follows:
In a first aspect, an embodiment of the present application provides a dual-optical camera calibration apparatus, comprising a calibration plate with a reference object, a first fixing member, a second fixing member, and a controller, wherein:
the first fixing piece and the second fixing piece are used for fixing a first sensor and a second sensor of the dual-optical camera respectively, the controller is in communication connection with the second fixing piece, and the controller is also in communication connection with the first sensor and the second sensor;
the controller is used for obtaining a first image and a second image, obtaining offset information between the first sensor and the second sensor according to the positions of the reference object in the first image and the second image, and controlling the second fixing piece to move according to the offset information so as to calibrate the dual-optical camera;
wherein the first image is the image acquired by the first sensor when the reference object is within the coincident field of view range; the second image is the image acquired by the second sensor when the reference object is within the coincident field of view range; and the coincident field of view range is the range in which the field of view of the first sensor coincides with the field of view of the second sensor.
In an embodiment of the present application, the second fixing member comprises a fixing portion and a movable portion, the fixing portion being used for fixing the second sensor and the movable portion being used for driving the fixing portion to move;
the controller is used for controlling the movable portion to move according to the offset information.
In one embodiment of the present application, the movable portion includes a stepping motor and an angle sliding table; the stepping motor is connected with the angle sliding table; the angle sliding table is connected with the fixing portion; the controller is communicatively connected with the stepping motor,
the controller is specifically configured to: controlling the stepping motor to drive the angle sliding table to rotate according to the offset information;
the angle sliding table is used for driving the fixing part to rotate.
In one embodiment of the present application, the apparatus further comprises: a movable connecting piece;
the movable connecting piece is respectively connected with the first fixing piece and the calibration plate and used for adjusting the distance between the first fixing piece and the calibration plate.
In one embodiment of the present application, the reference object is a heat source object, the heat source object is located on the surface of the calibration plate, and the calibration plate is further used for controlling the temperature of the heat source object.
In an embodiment of the present application, the initial positions of the first fixing member and the second fixing member are positions at which the front faces of the lenses of the fixed first sensor and second sensor are parallel.
In one embodiment of the present application, the device further comprises a display, the display communicatively coupled to the controller;
the controller is further configured to: performing image processing on a first image acquired by the first sensor and a second image acquired by the second sensor, and sending the processed images to the display;
the display is used for: the received image is displayed.
In a second aspect, an embodiment of the present application provides a dual-optical camera calibration method, including:
obtaining a first image acquired by a first sensor of a dual-optical camera while a preset reference object is within a coincident field of view range, and obtaining a second image acquired by a second sensor of the dual-optical camera while the reference object is within the coincident field of view range, wherein the coincident field of view range is the range in which the field of view of the first sensor coincides with the field of view of the second sensor;
obtaining offset information between the first sensor and the second sensor according to the positions of the reference object in the first image and the second image;
calibrating the dual-light camera according to the offset information.
In an embodiment of the application, the obtaining offset information between the first sensor and the second sensor according to the position of the reference object in the first image and the second image includes:
determining the position of the reference object in the first image and the second image;
obtaining a first conversion relation of pixel positions in the first image and the second image according to the determined positions;
converting, according to the first conversion relation, the position of a preset image in the second image into a position in the first image, as a second target position;
calculating offset information between the first sensor and the second sensor according to a deviation between a first target position and the second target position, wherein the first target position is the position of the preset image in the first image.
In one embodiment of the present application, the calculating offset information between the first sensor and the second sensor according to the deviation between the first target position and the second target position includes:
obtaining, according to the second target position, the first target position and a preset second conversion relationship, a deviation between a second actual position and a first actual position as a reference deviation, wherein the second conversion relationship is the conversion relationship between pixel positions in the first image and actual positions in the plane where the reference object is located, the first actual position is the actual position, in the plane of the reference object, corresponding to the first target position, and the second actual position is the actual position, in the plane of the reference object, corresponding to the second target position;
and calculating, according to the vertical distance between the first sensor and the reference object and the reference deviation, an offset angle between the first sensor and the second sensor as the offset information.
In one embodiment of the present application, the offset angle between the first sensor and the second sensor comprises: a horizontal offset angle between the first sensor and the second sensor in a horizontal direction, and a vertical offset angle between the first sensor and the second sensor in a vertical direction;
the calibrating the dual-light camera according to the offset information includes:
horizontally adjusting the second sensor according to the horizontal offset angle when the horizontal offset angle is greater than a horizontal angle margin, wherein the horizontal angle margin is half of the difference, in the horizontal direction, between the field angle of the first sensor and the field angle of the second sensor;
and vertically adjusting the second sensor according to the vertical offset angle when the vertical offset angle is greater than a vertical angle margin, wherein the vertical angle margin is half of the difference, in the vertical direction, between the field angle of the first sensor and the field angle of the second sensor.
In one embodiment of the present application, the method further comprises:
obtaining a third conversion relation of pixel positions in the first image and a third image according to the position of the reference object in the third image, wherein the third image is an image obtained by acquiring the reference object by the calibrated second sensor;
converting, according to the third conversion relation, the position of the preset image in the third image into a position in the first image, as a third target position;
when the third target position is outside a preset calibration area, calculating new offset information between the first sensor and the second sensor according to a deviation between the first target position and the third target position, and recalibrating the dual-optical camera according to the new offset information, wherein the preset calibration area is a region of a preset size containing the first target position.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a communication interface, a memory, and a communication bus, where the processor and the communication interface complete communication between the memory and the processor through the communication bus;
a memory for storing a computer program;
a processor for implementing the method steps of the second aspect when executing the program stored in the memory.
In a fourth aspect, the present application provides a computer-readable storage medium, in which a computer program is stored, and the computer program, when executed by a processor, implements the method steps of any one of the second aspects.
Embodiments of the present application also provide a computer program product comprising instructions which, when run on a computer, cause the computer to perform any of the above-described dual-light camera calibration methods.
The embodiment of the application has the following beneficial effects:
the two optical camera calibration equipment that this application embodiment provided includes: calibration board, first mounting, second mounting and controller with reference object, wherein: the first fixing piece and the second fixing piece are used for fixing a first sensor and a second sensor of the dual-optical camera respectively, the controller is in communication connection with the second fixing piece, and the controller is also in communication connection with the first sensor and the second sensor; the controller is used for obtaining a first image and a second image, obtaining offset information between the first sensor and the second sensor according to the positions of the reference object in the first image and the second image, and controlling the second fixing piece to move according to the offset information so as to calibrate the dual-light camera; wherein the first image is: under the condition that the reference object is in the overlapped field of view range, the image acquired by the first sensor; the second image is: under the condition that the reference object is in the overlapped field of view range, the image acquired by the second sensor; the coincidence field range is: the field of view of the first sensor coincides with the field of view of the second sensor. The method comprises the steps of firstly obtaining the positions of a reference object in images acquired by two image sensors, determining the deviation between a first image and a second image based on the deviation between the positions, reflecting the deviation information between the first sensor and the second sensor according to the deviation information, controlling the second fixing piece to move according to the deviation information, and driving the second sensor to move by the second fixing piece, so that the calibration of the dual-optical camera can be realized. Therefore, the two-optical-camera calibration equipment provided by the embodiment of the application can be used for calibrating two image sensors in the two-optical camera.
Drawings
To explain the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application, and that other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic structural diagram of a dual-optical-camera calibration apparatus according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of obtaining offset information according to an embodiment of the present disclosure;
FIG. 3 is a schematic view of an offset angle according to an embodiment of the present disclosure;
fig. 4a and 4b are schematic diagrams of angles of view provided by an embodiment of the present application;
FIG. 5 is a schematic flow chart of a recalibration method provided herein;
FIG. 6 is a schematic diagram of a predetermined calibration area according to an embodiment of the present application;
fig. 7 is a schematic flowchart of a first dual-optical-camera calibration method according to an embodiment of the present disclosure;
fig. 8 is a schematic flowchart of a second method for calibrating a dual-optical camera according to an embodiment of the present disclosure;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments that can be derived by one of ordinary skill in the art from the description herein are intended to be within the scope of the present disclosure.
In order to calibrate two image sensors in a dual-optical camera, embodiments of the present application provide a dual-optical camera calibration apparatus, a method, an electronic apparatus, and a storage medium, which are respectively described in detail below.
Referring to fig. 1, fig. 1 is a schematic structural diagram of a dual-optical camera calibration apparatus according to an embodiment of the present application. As shown in fig. 1, the dual-optical camera calibration apparatus includes a calibration plate 101 with a reference object, a first fixing member 102, a second fixing member 103 and a controller 104, wherein:
the first fixing piece 102 and the second fixing piece 103 are used for fixing a first sensor and a second sensor of the dual-optical camera respectively, the controller 104 is in communication connection with the second fixing piece 103, and the controller 104 is also used for being in communication connection with the first sensor and the second sensor;
a controller 104 for obtaining the first image and the second image, obtaining offset information between the first sensor and the second sensor according to the position of the reference object in the first image and the second image, and controlling the second fixing member 103 to move according to the offset information to calibrate the dual-optical camera;
wherein the first image is: under the condition that the reference object is in the overlapped field of view range, the image acquired by the first sensor; the second image is: under the condition that the reference object is in the overlapped field of view range, the image acquired by the second sensor; the coincidence field range is: the field of view of the first sensor coincides with the field of view of the second sensor.
Specifically, when the dual-optical camera is calibrated, the first sensor of the dual-optical camera may be fixed to the first fixing member 102, the second sensor may be fixed to the second fixing member 103, and the reference object is kept within the overlapping field of view, so that the first sensor and the second sensor may both perform image acquisition on the reference object to obtain the first image and the second image, respectively.
The first sensor and the second sensor are respectively in communication connection with the controller 104, so that the acquired images can be transmitted to the controller 104.
After obtaining the first image acquired by the first sensor and the second image acquired by the second sensor, the controller 104 can identify the position of the reference object in each image, obtain the deviation between those positions, and determine the deviation between the first image and the second image based on it. Since the deviation between the first image and the second image reflects the offset information between the first sensor and the second sensor, the second fixing member 103 can be controlled to move according to the offset information, and the second fixing member 103 drives the second sensor to move, so that calibration of the dual-optical camera is achieved. Therefore, the dual-optical camera calibration apparatus provided by this embodiment can calibrate the two image sensors in the dual-optical camera.
First, the structure of the dual-optical camera calibration apparatus will be described in detail.
The reference object may be block-shaped, spherical, or the like. The calibration plate 101 may carry one or more reference objects, for example 4 or 5 reference objects.
In one embodiment of the present application, the reference object may be a heat source object, the heat source object may be located on the surface of the calibration plate 101, and the calibration plate 101 is further used for temperature control of the heat source object.
Specifically, the heat source object can generate heat, and the calibration plate 101 can control its temperature, for example controlling the heat source object to heat up, hold a constant temperature, cool down, and the like. When a dual-optical camera containing a thermal imaging sensor is calibrated, the heat source object emits heat so that the thermal imaging sensor can acquire an image of it, and the dual-optical camera containing the thermal imaging sensor can be calibrated based on that image.
In an embodiment of the application, the calibration board 101 may be in communication connection with the controller 104, the controller 104 may generate a temperature control instruction and send the temperature control instruction to the calibration board 101, the temperature control instruction may carry temperature information, and after receiving the instruction, the calibration board 101 performs temperature control on the heat source object according to the instruction.
In an embodiment of the present application, the calibration board 101 may further include a supporting board, where the supporting board is used to mount reference objects, and the reference objects may be mounted on the supporting board according to a preset arrangement manner, or may be randomly mounted on the supporting board.
The first fixing member 102 is used to fix the first sensor in the dual-optical camera. Specifically, the first sensor may be fixed in the first fixing member 102, so that the first sensor does not move during the calibration of the dual-optical camera.
The first sensor is any image sensor in a dual-optical camera, and may be a visible light image sensor or a non-visible light image sensor, such as an infrared light image sensor.
In an embodiment of the present application, the shape of the first fixing element 102 is designed according to the shape of the first sensor, so that the first fixing element 102 and the first sensor are more matched, and the fixing effect is improved.
In addition, the first fixing member 102 may also be a deformable structure, so that it can fix first sensors of different shapes.
In one embodiment of the present application, the dual-optical camera calibration apparatus may further include a movable connecting member, which is connected to the first fixing member 102 and to the calibration plate 101, respectively, and is used for adjusting the distance between the first fixing member 102 and the calibration plate 101.
The movable connecting piece can be a screw rod, such as a trapezoidal screw rod, a ball screw rod and the like, and can also be an elastic connecting piece and the like.
Specifically, the first fixing member 102 and the calibration plate 101 may be respectively connected to the movable connecting member, and the distance between the first fixing member 102 and the calibration plate 101 may be adjusted by the movable connecting member.
In an embodiment of the present application, a vertical distance between the first fixing member 102 and the calibration board 101 may be adjusted to a preset distance, where the preset distance may enable the first sensor connected to the first fixing member 102 to acquire a clear image of the calibration board. The value of the preset distance can be set manually or determined according to the type of the first sensor.
Specifically, the corresponding relationship between different sensors and the distance may be preset, when the dual-optical camera needs to be calibrated, the type of the first sensor may be obtained, then the distance corresponding to the type is queried in the corresponding relationship to serve as the preset distance, and the distance between the first fixing member 102 and the calibration board is adjusted to be the preset distance, so that the first sensor fixed by the first fixing member 102 can acquire a clear image of the reference object.
In an embodiment of the present application, the movable connecting element may be further communicatively connected to the controller 104, so that the controller 104 can adjust the distance between the first fixing element 102 and the calibration board 101 by adjusting the movable connecting element.
The second fixing member 103 is used to fix the second sensor of the dual-optical camera. Specifically, the second sensor is fixed in the second fixing member 103 so that it cannot move relative to the second fixing member 103 during calibration; the second sensor is then adjusted by moving the second fixing member 103, which realizes calibration of the dual-optical camera.
The second sensor is an image sensor of the dual-optical camera except the first sensor, and may be a visible light image sensor or a non-visible light image sensor, such as an infrared light image sensor.
In an embodiment of the present application, the second fixing element 103 includes a fixing portion for fixing the second sensor, and a moving portion for driving the fixing portion to move;
and a controller 104 for controlling the movable portion to move according to the offset information.
Specifically, the movable portion in the second fixing member 103 is connected to the fixing portion, and the second sensor is fixed to the fixing portion, so that the second sensor can be adjusted by moving the fixing portion.
The controller 104 may be directly connected to the movable portion in a communication manner, so that the controller can directly control the movable portion, thereby adjusting the second sensor.
In one embodiment of the present application, the movable portion may include a stepping motor and an angle sliding table; the stepping motor is connected with the angle sliding table; the angle sliding table is connected with the fixing portion; and the controller 104 is communicatively connected with the stepping motor,
the controller 104 is specifically configured to: controlling the stepping motor to drive the angle sliding table to rotate according to the offset information; the angle sliding table is used for driving the fixing part to rotate.
Specifically, the controller 104 can control the stepping motor to rotate according to the offset information, so as to drive the angle sliding table to rotate, and then the angle sliding table can drive the fixing part to rotate, so as to drive the second sensor to rotate, thereby realizing the calibration of the dual-optical camera.
In an embodiment of the application, the fixing portion may be a clamping mechanism, and is configured to clamp the second sensor, so as to fix the second sensor.
In one embodiment of the present application, the initial positions of the first fixing member 102 and the second fixing member 103 are positions at which the front faces of the lenses of the fixed first sensor and second sensor are parallel. Specifically, when the first sensor is fixed on the first fixing member 102 and the second sensor is fixed on the second fixing member 103, the front faces of their lenses can be made parallel, which increases the probability that the fields of view of the first sensor and the second sensor coincide and makes it easier to place the reference object in their common field of view.
In one embodiment of the present application, the dual-light camera calibration device may further comprise a display, the display being communicatively coupled to the controller;
the controller is further configured to: processing a first image acquired by a first sensor and a second image acquired by a second sensor, and sending the processed images to a display;
the display is used for: the received image is displayed.
Specifically, the controller may perform image processing on the first image and the second image, such as rendering, zooming, image enhancement, and the like, and then send the processed images to the display, so that the display may display images collected by the first sensor and the second sensor, and a user may visually determine whether the reference object is in the fields of view of the first sensor and the second sensor through the display, and determine the accuracy of the calibrated dual-optical camera.
Next, the operation of the dual-optical camera calibration apparatus will be described in detail.
Referring to fig. 2, fig. 2 is a schematic diagram of a process for obtaining offset information according to an embodiment of the present disclosure. When the controller 104 obtains the offset information between the first sensor and the second sensor, the process may include the following steps S201 to S204:
S201, determining the positions of the reference object in the first image and the second image.
The position may be a pixel position of the reference object in the image.
Specifically, the first image is the image acquired by the first sensor while the reference object is within the coincident field of view range, i.e., the range of the overlapping portion of the fields of view of the first sensor and the second sensor. The reference object is therefore within the image acquisition range of the first sensor, so the reference object appears in the first image;
similarly, the reference object is also within the image acquisition range of the second sensor, so the reference object also appears in the second image.
In one embodiment of the present application, the position of the reference object in the first image and the position of the reference object in the second image can be detected by using image detection.
In addition, a selection operation of the user may be received: the user selects, through an input device, the positions of the reference object in the first image and in the second image, so that the controller can determine those positions.
In one embodiment of the present application, a first coordinate system may be established based on the image acquired by the first sensor, so that after the position of the reference object in the first image is determined, the coordinates of the position in the first coordinate system may be obtained as first coordinates representing the position of the reference object in the first image;
and, a second coordinate system may be established with reference to the image acquired by the second sensor, so that after the position of the reference object in the second image is determined, a coordinate of the position in the second coordinate system may be obtained as a second coordinate representing the position of the reference object in the second image.
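As an illustration only, the sketch below shows one simple way such positions could be obtained, assuming the reference object appears as a bright (or hot) blob that can be isolated by a fixed intensity threshold; the threshold value, the image sizes and the NumPy-based detection are assumptions made for the example, not the detection method prescribed by this application.

```python
import numpy as np

def reference_centroid(image, threshold=200):
    """Return the (x, y) pixel centroid of the bright reference object.

    `image` is a 2-D grayscale array; pixels above `threshold` are treated
    as belonging to the reference object. Simple thresholding stands in
    for whatever detector is actually used.
    """
    ys, xs = np.nonzero(image > threshold)
    if len(xs) == 0:
        raise ValueError("reference object not found in image")
    return float(xs.mean()), float(ys.mean())

# first_image / second_image stand in for frames from the two sensors
first_image = np.zeros((480, 640))
first_image[200:210, 300:310] = 255
second_image = np.zeros((512, 640))
second_image[240:252, 310:322] = 255

p1 = reference_centroid(first_image)   # position in the first coordinate system
p2 = reference_centroid(second_image)  # position in the second coordinate system
print(p1, p2)
```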
S202, obtaining a first conversion relation of pixel positions in the first image and the second image according to the determined positions.
Specifically, since the number of sensor pixel arrays and the field of view range of the first sensor and the second sensor may not be the same, the pixel positions of the same reference object in the first image and the second image respectively are also different, and based on the pixel positions of the reference object in the first image and the second image respectively, the conversion relationship between the pixel positions in the first image and the second image can be obtained.
In one embodiment of the present application, in the case where the positions of the reference object in the first image and the second image are expressed in the form of coordinates in S201, a coordinate conversion relationship between the first coordinate system and the second coordinate system may be obtained by using a first coordinate of the reference object in the first coordinate system and a second coordinate of the reference object in the second coordinate system, and the coordinate conversion relationship may be used as a first conversion relationship of pixel positions between the first image and the second image.
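The following is a minimal sketch of one way such a first conversion relationship could be estimated, assuming at least three matched reference positions and an affine pixel-to-pixel model fitted by least squares; the affine model and the sample coordinates are assumptions made for illustration.

```python
import numpy as np

def fit_affine(points_second, points_first):
    """Least-squares affine map taking pixel positions in the second image
    to pixel positions in the first image.

    points_second, points_first: (N, 2) arrays of matched reference
    positions (N >= 3). The returned 2x3 matrix is one possible form of
    the "first conversion relationship"; the patent does not fix the model.
    """
    src = np.asarray(points_second, dtype=float)
    dst = np.asarray(points_first, dtype=float)
    ones = np.ones((src.shape[0], 1))
    A = np.hstack([src, ones])                      # (N, 3)
    # Solve A @ M.T ~= dst for M (2x3) in the least-squares sense
    M, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return M.T

def apply_affine(M, point):
    x, y = point
    return tuple(M @ np.array([x, y, 1.0]))

# Matched positions of (at least three) reference objects in each image
pts_second = [(310.0, 240.0), (400.0, 250.0), (320.0, 330.0)]
pts_first  = [(300.0, 205.0), (395.0, 215.0), (312.0, 298.0)]
M = fit_affine(pts_second, pts_first)
print(apply_affine(M, (320, 256)))  # second-image pixel mapped into the first image
```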
S203, converting, according to the first conversion relationship, the position of the preset image in the second image into a position in the first image, as a second target position.
The preset image may be an image region at the center of the image, at the lower-left corner vertex, at the upper-right corner vertex, and so on. Taking the preset image as the image region at the center as an example, the position of the preset image in the second image is the pixel position, in the second image, of the image region at the center of the second image.
Specifically, the pixel position of the preset image in the second image may be obtained first, and then the pixel position is converted into the first image according to the first conversion relationship between the pixel positions in the first image and the pixel positions in the second image, which is obtained in S202, to obtain the second target position.
In an embodiment of the application, coordinates of the preset image in the second coordinate system in the second image may be obtained, and the obtained coordinates are converted into the first coordinate system according to a coordinate conversion relationship between the first coordinate system and the second coordinate system, so that a position of the preset image in the second image is converted into a position in the first image as the second target position.
S204, calculating offset information between the first sensor and the second sensor according to the deviation between the first target position and the second target position.
Here the first target position is the position of the preset image in the first image. Taking as an example the preset image being the image region at the upper-right corner vertex, the first target position may be the pixel position of that image region in the first image.
In particular, after determining the first target position and the second target position, a deviation between the positions may be obtained, which may reflect a deviation between the first image and the second image.
Taking the preset image as the image region at the center as an example: the coordinates, in the second coordinate system, of the image region at the center of the second image are obtained first; the position of those coordinates in the first coordinate system is then obtained from the coordinate conversion relationship as the second target position; in addition, the coordinates, in the first coordinate system, of the image region at the center of the first image are obtained as the first target position. In this way the center positions of the first image and the second image are converted into the same coordinate system, and the deviation between the two positions is the deviation between the second image and the first image.
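A small illustrative sketch of this step is given below, assuming the preset image is the image center and that a 2x3 pixel map M (such as could be produced by the earlier fit) is already available; the matrix values and image sizes are invented for the example.

```python
import numpy as np

# M is an already-estimated 2x3 pixel map from the second image into the
# first image (e.g. from the least-squares fit above); the numbers are made up.
M = np.array([[ 0.98, 0.01, -12.0],
              [-0.01, 0.97, -30.0]])

first_w, first_h = 640, 480     # assumed size of the first image
second_w, second_h = 640, 512   # assumed size of the second image

first_target = np.array([first_w / 2.0, first_h / 2.0])    # center of the first image
second_center = np.array([second_w / 2.0, second_h / 2.0, 1.0])
second_target = M @ second_center   # center of the second image, in first-image pixels

deviation_px = second_target - first_target
print("pixel deviation between the two centers:", deviation_px)
```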
Since the deviation between the first image and the second image can reflect the deviation between the first sensor and the second sensor, the deviation information between the first sensor and the second sensor can be calculated according to the deviation between the first target position and the second target position, so that the two image sensors of the dual-optical camera can be conveniently calibrated based on the deviation information.
In one embodiment of the present application, when obtaining the offset information, a deviation between the second actual position and the first actual position may be obtained according to the second target position and the first target position, and a preset second conversion relationship, and used as a reference deviation, and an offset angle between the first sensor and the second sensor may be calculated according to a vertical distance between the first sensor and the calibration board 101 and the reference deviation, and used as the offset information.
Wherein, the second conversion relation is: the translation between the pixel position in the first image and the actual position in the plane of the calibration plate 101. Specifically, the reference object is located in the calibration board 101, that is, the reference object is located on the plane of the calibration board 101, and according to the position of the reference object in the plane of the calibration board and the position of the reference object in the first image, the conversion relationship between the pixel position in the first image and the actual position in the plane of the calibration board can be obtained.
The first actual position is the actual position, in the plane where the calibration plate 101 is located, corresponding to the first target position, i.e., the position obtained by converting the first target position in the first image, according to the second conversion relationship, onto the plane of the calibration plate.
The second actual position is the actual position, in the plane where the calibration plate 101 is located, corresponding to the second target position, i.e., the position obtained by converting the second target position in the first image, according to the second conversion relationship, onto the plane of the calibration plate.
Specifically, the first actual position corresponding to the first target position and the second actual position corresponding to the second target position are obtained from the second conversion relationship; the deviation between the first actual position and the second actual position is then obtained as the reference deviation; the vertical distance between the first sensor and the calibration plate is obtained; and the angle subtended, in the actual scene, by the interval between the first target position and the second target position is calculated from the vertical distance and the reference deviation. This angle reflects the offset angle between the first sensor and the second sensor, so it can be used as the offset information between the first sensor and the second sensor.
Referring to fig. 3, fig. 3 is a schematic diagram of an offset angle according to an embodiment of the present disclosure. In one embodiment of the present application, when calculating the offset angle γ, the calculation may be performed according to the following formula:
γ = tan⁻¹(r/L)
wherein r represents a reference deviation between the first actual position and the second actual position, and L represents a vertical distance between the first sensor and the calibration plate.
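A minimal sketch of this calculation, assuming r and L are expressed in the same physical unit (millimetres here):

```python
import math

def offset_angle_deg(r_mm, L_mm):
    """Offset angle gamma = arctan(r / L), returned in degrees.

    r_mm: reference deviation between the first and second actual positions
          on the calibration-plate plane.
    L_mm: vertical distance between the first sensor and the calibration plate.
    Both values must use the same physical unit (millimetres here).
    """
    return math.degrees(math.atan2(r_mm, L_mm))

print(offset_angle_deg(r_mm=35.0, L_mm=1000.0))  # ~2.0 degrees for these example values
```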
In an embodiment of the present application, in addition to the offset angle, the orientation of the first actual position relative to the second actual position may also be obtained, and the offset angle and the orientation are used together as the offset information. During calibration, the second fixing member 103 can then be controlled to rotate by the offset angle along that orientation, thereby calibrating the first sensor and the second sensor of the dual-optical camera.
In an embodiment of the present application, when obtaining the reference deviation, an actual coordinate system may be established in advance with a plane where the calibration board 101 is located as a reference, coordinates of the reference object in the actual coordinate system are obtained, and then a coordinate conversion relationship between the first coordinate system and the actual coordinate system is obtained as a second conversion relationship according to the coordinates of the reference object in the first coordinate system.
After the first target position and the second target position are obtained, the coordinates of the first target position and the second target position in the first coordinate system are converted according to the second conversion relation, so that the coordinates of the first actual position and the second actual position in the actual coordinate system are obtained, and the reference deviation between the first actual position and the second actual position is obtained according to the coordinates.
In one embodiment of the present application, when obtaining the reference deviation, the horizontal deviation and the vertical deviation between the second actual position and the first actual position may be obtained according to the second target position and the first target position, and a preset second conversion relationship.
Specifically, after a first actual position corresponding to the first target position and a second actual position corresponding to the second target position are obtained based on the second conversion relationship, respectively, a distance between the first actual position and the second actual position in the horizontal direction may be calculated as a horizontal deviation, and a distance between the first actual position and the second actual position in the vertical direction may be calculated as a vertical deviation.
Correspondingly, the offset angle between the first sensor and the second sensor may include: the horizontal offset angle between the first sensor and the second sensor in the horizontal direction, and the vertical offset angle between the first sensor and the second sensor in the vertical direction.
Thus, when obtaining the offset information, the horizontal offset angle between the first sensor and the second sensor can be calculated according to the vertical distance and the horizontal deviation between the first sensor and the calibration board 101;
and calculates the vertical offset angle between the first sensor and the second sensor according to the vertical distance and the vertical deviation between the first sensor and the calibration plate 101.
Specifically, the horizontal offset angle γx and the vertical offset angle γy may be calculated according to the following formulas:
γx = tan⁻¹(rx/L)
γy = tan⁻¹(ry/L)
wherein rx represents a horizontal deviation between the first actual position and the second actual position, ry represents a vertical deviation between the first actual position and the second actual position, and L represents a vertical distance between the first sensor and the calibration board.
In an embodiment of the present application, when the second fixing element 103 is controlled to move according to the offset information, the second fixing element 103 may be horizontally adjusted according to the horizontal offset angle when the horizontal offset angle is greater than the horizontal angle margin;
in the case where the vertical offset angle is larger than the vertical angle margin, the second fixing member 103 is vertically adjusted according to the vertical offset angle.
The horizontal angle margin is half of the difference, in the horizontal direction, between the field angle of the first sensor and the field angle of the second sensor;
the vertical angle margin is half of the difference, in the vertical direction, between the field angle of the first sensor and the field angle of the second sensor.
Specifically, a first angle of view of the first sensor and a second angle of view of the second sensor may be obtained in advance, and a half of a difference between the first angle of view and the second angle of view in the horizontal direction may be calculated as a horizontal angle margin, and a half of a difference between the first angle of view and the second angle of view in the vertical direction may be calculated as a vertical angle margin.
For example, assuming the field angle of the first sensor is 170° in the horizontal direction and 110° in the vertical direction, and the field angle of the second sensor is 120° in the horizontal direction and 90° in the vertical direction, the horizontal angle margin is (170° - 120°)/2 = 25° and the vertical angle margin is (110° - 90°)/2 = 10°.
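The sketch below combines the two formulas with the angle-margin check described above, using the example field angles from this paragraph; the deviation values, the distance and the function name are assumptions made for illustration.

```python
import math

def plan_adjustment(rx_mm, ry_mm, L_mm, fov1_deg, fov2_deg):
    """Decide how the second sensor should be rotated.

    rx_mm, ry_mm: horizontal / vertical deviation between the first and
                  second actual positions on the calibration-plate plane.
    L_mm:         vertical distance between the first sensor and the plate.
    fov1_deg:     (horizontal, vertical) field angles of the first sensor.
    fov2_deg:     (horizontal, vertical) field angles of the second sensor.
    An axis is adjusted only when its offset angle exceeds the corresponding
    angle margin (half the field-angle difference), as described above.
    """
    gamma_x = math.degrees(math.atan2(rx_mm, L_mm))
    gamma_y = math.degrees(math.atan2(ry_mm, L_mm))
    margin_x = (fov1_deg[0] - fov2_deg[0]) / 2.0
    margin_y = (fov1_deg[1] - fov2_deg[1]) / 2.0
    adjust = {}
    if abs(gamma_x) > margin_x:
        adjust["horizontal_deg"] = gamma_x
    if abs(gamma_y) > margin_y:
        adjust["vertical_deg"] = gamma_y
    return adjust

# Example with the field angles used in the text (170x110 vs 120x90 degrees)
print(plan_adjustment(rx_mm=500.0, ry_mm=120.0, L_mm=1000.0,
                      fov1_deg=(170.0, 110.0), fov2_deg=(120.0, 90.0)))
# rx gives ~26.6 deg > 25 deg margin  -> horizontal adjustment needed;
# ry gives ~6.8 deg  < 10 deg margin  -> no vertical adjustment.
```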
Referring to fig. 4a and 4b, fig. 4a and 4b are schematic views of a field angle provided by an embodiment of the present application. Suppose that the solid-line box represents the field of view of the first sensor, the angle shown by the solid arrow represents the field angle of the first sensor, the cross mark represents the first actual position, the dashed-line box represents the field of view of the second sensor, the angle shown by the dashed arrow represents the field angle of the second sensor, and the small box represents the second actual position. Since the horizontal angle margin is half of the difference, in the horizontal direction, between the field angle of the first sensor and the field angle of the second sensor, the angle formed by the dashed arrow and the solid arrow shown in fig. 4a can be taken as the horizontal angle margin.
As shown in fig. 4b, when the horizontal offset angle between the first actual position and the second actual position is too large, the field of view of the first sensor will exceed the field of view of the second sensor.
In this way, when the second fixing member 103 is controlled to move, it can be determined whether the horizontal offset angle is larger than the horizontal angle margin:
if not, one sensor's field of view remains within the other's in the horizontal direction, so the second sensor need not be adjusted in the horizontal direction;
if yes, one sensor's field of view is no longer within the other's in the horizontal direction, so the second sensor is adjusted in the horizontal direction.
Similarly, it can be determined whether the vertical offset angle is greater than the vertical angle margin:
if not, one sensor's field of view remains within the other's in the vertical direction, so the second sensor need not be adjusted in the vertical direction;
if yes, one sensor's field of view is no longer within the other's in the vertical direction, so the second sensor is adjusted in the vertical direction.
Referring to fig. 5, fig. 5 is a flowchart illustrating a recalibration method provided in the present application, and after calibrating the dual-camera based on the above-mentioned manner, the controller 104 may be further configured to perform the following steps S501 to S503:
and S501, obtaining a third conversion relation of pixel positions in the first image and the third image according to the position of the reference object in the third image.
And the third image is an image obtained by acquiring an image of the reference object by the calibrated second sensor.
Specifically, after calibration, the second sensor may capture an image of the reference object again, and send the captured image to the controller 104, so that the controller 104 obtains a third image. The position of the reference object in the third image can then be obtained. Since the position of the reference object in the first image has been previously obtained and the first sensor has not moved, i.e. the position of the reference object in the first image has not changed, there is no need to obtain the position of the reference object in the first image again. Further, the conversion relationship between the pixel positions in the first image and the third image can be obtained as a third conversion relationship according to the positions of the reference object in the first image and the third image, respectively.
S502, converting, according to the third conversion relationship, the position of the preset image in the third image into a position in the first image, as a third target position.
Specifically, the pixel position of the preset image in the third image may be obtained first, and then the pixel position is converted into the first image according to the third conversion relationship between the pixel positions in the first image and the third image obtained in step S501, so as to obtain the third target position.
And S503, under the condition that the third target position is outside the preset calibration area, calculating new offset information between the first sensor and the second sensor according to the deviation between the first target position and the third target position, and controlling the second fixing member 103 to move according to the new offset information to realize recalibration of the dual-optical camera.
Wherein, the preset calibration area is as follows: a region of a predetermined size containing the first target position.
Specifically, it can be determined whether the third target position is outside the preset calibration area. If not, it can be determined that calibration of the dual-optical camera has succeeded; if so, it indicates that a large deviation still exists between the first sensor and the second sensor after calibration and that the two sensors of the dual-optical camera need to be calibrated again. In this case, new offset information between the first sensor and the second sensor is recalculated according to the deviation between the first target position and the third target position, and the second fixing member 103 is controlled to move again according to the new offset information, thereby recalibrating the dual-optical camera.
Referring to fig. 6, fig. 6 is a schematic diagram of a preset calibration area according to an embodiment of the present disclosure. The solid-line box in fig. 6 may represent the first image and the dashed-line box the third image. As shown in fig. 6, when the third target position is outside the preset calibration area, the third image may not completely contain the first image, which indicates that there is still a large deviation between the calibrated first sensor and second sensor, so the first sensor and the second sensor of the dual-optical camera need to be recalibrated;
when the third target position is within the preset calibration area, the third image can completely contain the first image, which indicates that the deviation between the first sensor and the second sensor after calibration is small, so the first sensor and the second sensor of the dual-optical camera do not need to be recalibrated.
In an embodiment of the present application, the center position of the preset calibration area may be the first target position, and in addition, the first target position may be used as a vertex, an edge point, and the like of the preset calibration area. The preset calibration area can be square, rectangular, circular and the like, and the size of the preset calibration area can be set manually.
In an embodiment of the application, an actual width corresponding to the preset calibration area is less than or equal to a width threshold, and a corresponding actual height is less than or equal to a height threshold.
Here the actual width is the width, on the plane of the calibration plate 101, that corresponds to the pixel width of the preset calibration area in the first image, and the actual height is the height, on the plane of the calibration plate 101, that corresponds to the pixel height of the preset calibration area in the first image.
The width threshold C is:
C = L * tan(αx - βx)
The height threshold G is:
G = L * tan(αy - βy)
where L is the vertical distance between the first fixing member 102 and the calibration plate 101, αx is the field angle of the first sensor in the horizontal direction, αy is the field angle of the first sensor in the vertical direction, βx is the field angle of the second sensor in the horizontal direction, and βy is the field angle of the second sensor in the vertical direction.
Specifically, (αx - βx) can represent the horizontal angle margin of the field angles of the first sensor and the second sensor, and (αy - βy) can represent the vertical angle margin. From the horizontal angle margin, the vertical angle margin, and the vertical distance between the first fixing member 102 and the calibration plate, the actual width and height of the preset calibration area on the plane of the calibration plate can be calculated; then, using the second conversion relationship between pixel positions in the first image and actual positions on the plane of the calibration plate, the region of the preset calibration area in the first image can be obtained, thereby determining the preset calibration area.
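The sketch below shows one possible reading of these thresholds, assuming the preset calibration area is a rectangle of width C and height G centred on the first target position and that the positions have already been converted onto the plane of the calibration plate; the centring and all numeric values are assumptions made for illustration.

```python
import math

def calibration_area_ok(third_actual, first_actual, L_mm,
                        fov1_deg=(170.0, 110.0), fov2_deg=(120.0, 90.0)):
    """Check whether the recalibrated result lies inside the preset
    calibration area, measured on the calibration-plate plane.

    third_actual, first_actual: (x, y) positions in millimetres on the plate
        plane (the third and first target positions after applying the
        second conversion relationship).
    The area is taken as a rectangle centred on the first target position
    with width C = L*tan(ax - bx) and height G = L*tan(ay - by).
    """
    C = L_mm * math.tan(math.radians(fov1_deg[0] - fov2_deg[0]))
    G = L_mm * math.tan(math.radians(fov1_deg[1] - fov2_deg[1]))
    dx = abs(third_actual[0] - first_actual[0])
    dy = abs(third_actual[1] - first_actual[1])
    return dx <= C / 2.0 and dy <= G / 2.0

print(calibration_area_ok(third_actual=(20.0, 5.0),
                          first_actual=(0.0, 0.0),
                          L_mm=1000.0))
```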
Referring to fig. 7, fig. 7 is a schematic flowchart of a first dual-optical-camera calibration method according to an embodiment of the present disclosure, which can be applied to the controller 104, and the method includes the following steps S701 to S708:
S701, when the reference object is within the coincident field of view range, obtaining a first image acquired by the visible light image sensor of the dual-optical camera and a second image acquired by the infrared light image sensor of the dual-optical camera.
Wherein, the coincidence field range is: the range of the overlapping portion in the field of view of the visible light image sensor and the field of view of the infrared light image sensor.
S702, determining a first coordinate of the reference object in a first coordinate system, obtaining a second coordinate of the reference object in a second coordinate system, and obtaining a first conversion relation between the first coordinate system and the second coordinate system based on the first coordinate and the second coordinate.
Here, the first coordinate system is a coordinate system established with reference to images collected by the visible light image sensor, and the second coordinate system is a coordinate system established with reference to images collected by the infrared light image sensor.
S703, obtaining the coordinate of the center position of the second image in the second coordinate system, and converting the obtained coordinate into the first coordinate system according to the first conversion relation between the first coordinate system and the second coordinate system, to obtain a second target position.
S704, obtaining coordinates of the center position of the first image in the first coordinate system as a first target position.
S705, obtaining, according to the second conversion relation, a horizontal offset angle in the horizontal direction and a vertical offset angle in the vertical direction between the first actual position corresponding to the first target position and the second actual position corresponding to the second target position.
Here, the second conversion relation is the conversion relation between pixel positions in the first image and actual positions on the plane where the calibration plate 101 is located.
S706, calibrating the dual-optical camera according to the horizontal offset angle under the condition that the horizontal offset angle is larger than the horizontal angle allowance; and under the condition that the vertical offset angle is larger than the vertical angle allowance, calibrating the dual-optical camera according to the vertical offset angle.
Here, the horizontal angle allowance is half of the difference in the horizontal direction between the field angle of the first sensor and the field angle of the second sensor;
the vertical angle allowance is half of the difference in the vertical direction between the field angle of the first sensor and the field angle of the second sensor.
S707, obtaining a third image collected by the calibrated second sensor imaging the reference object, determining the position of the reference object in the third image, obtaining a third conversion relation between pixel positions in the first image and the third image according to the determined position, and determining, according to the third conversion relation, the position to which the center of the third image is converted in the first image, as a third target position.
S708, when the third target position is outside the preset calibration area, calculating new offset information between the first sensor and the second sensor according to the deviation between the first target position and the third target position, controlling the second fixing member 103 to move according to the new offset information so as to recalibrate the dual-optical camera, and returning to step S707 until, after calibration, the third target position falls within the preset calibration area.
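The loop formed by steps S705 to S708 can be outlined in code. In the hedged Python sketch below, measure_third_target and move_fixture are hypothetical callables standing in for "locate the third target position in the first coordinate system" and "drive the second fixing member"; they are not APIs defined by this disclosure, and the constant mm-per-pixel scale is a simplifying assumption for the second conversion relation:

```python
import math

def offset_angles(p_ref, p_meas, mm_per_px, L):
    """Horizontal/vertical offset angles (degrees) between the plate-plane positions that the
    two pixel positions map to, using an assumed constant mm-per-pixel scale and distance L (mm)."""
    dx_mm = (p_meas[0] - p_ref[0]) * mm_per_px
    dy_mm = (p_meas[1] - p_ref[1]) * mm_per_px
    return math.degrees(math.atan2(dx_mm, L)), math.degrees(math.atan2(dy_mm, L))

def calibrate(first_target, second_target, mm_per_px, L, h_margin, v_margin,
              preset_area, measure_third_target, move_fixture, max_iters=10):
    """preset_area: (x_min, y_min, x_max, y_max) in first-image pixels; margins in degrees."""
    h_off, v_off = offset_angles(first_target, second_target, mm_per_px, L)
    if abs(h_off) > h_margin:                 # S706: horizontal adjustment
        move_fixture(pan=-h_off)
    if abs(v_off) > v_margin:                 # S706: vertical adjustment
        move_fixture(tilt=-v_off)
    for _ in range(max_iters):                # S707/S708: verify and, if needed, recalibrate
        x, y = measure_third_target()
        x0, y0, x1, y1 = preset_area
        if x0 <= x <= x1 and y0 <= y <= y1:
            return True                        # third target position inside the preset calibration area
        h_off, v_off = offset_angles(first_target, (x, y), mm_per_px, L)
        move_fixture(pan=-h_off, tilt=-v_off)
    return False
```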
The above-described embodiment provides a dual-optical camera calibration apparatus comprising a calibration plate with a reference object, a first fixing member, a second fixing member and a controller, wherein: the first fixing member and the second fixing member are used for fixing a first sensor and a second sensor of the dual-optical camera respectively; the controller is in communication connection with the second fixing member and is also in communication connection with the first sensor and the second sensor; and the controller is used for obtaining a first image and a second image, obtaining offset information between the first sensor and the second sensor according to the positions of the reference object in the first image and the second image, and controlling the second fixing member to move according to the offset information so as to calibrate the dual-optical camera. Here, the first image is the image acquired by the first sensor when the reference object is in the coincidence field range, the second image is the image acquired by the second sensor when the reference object is in the coincidence field range, and the coincidence field range is the range in which the field of view of the first sensor coincides with the field of view of the second sensor. In this scheme, the positions of the reference object in the images acquired by the two image sensors are obtained first; the deviation between these positions reflects the offset between the first sensor and the second sensor, and the resulting offset information is used to control the second fixing member, which in turn drives the second sensor to move, so that calibration of the dual-optical camera is achieved. Therefore, the dual-optical camera calibration apparatus provided by this embodiment can be used to calibrate the two image sensors in the dual-optical camera.
Referring to fig. 8, fig. 8 is a schematic flowchart of a second method for calibrating a dual-optical camera according to an embodiment of the present application, where the method includes:
S801, obtaining a first image collected by a first sensor of a dual-optical camera when a preset reference object is in a coincidence field range, and obtaining a second image collected by a second sensor of the dual-optical camera when the reference object is in the coincidence field range, where the coincidence field range is: a range where the field of view of the first sensor coincides with the field of view of the second sensor;
S802, obtaining offset information between the first sensor and the second sensor according to the positions of the reference object in the first image and the second image;
and S803, calibrating the dual-optical camera according to the offset information.
In an embodiment of the present application, the method may be applied to the dual-optical camera calibration device provided in the above embodiment; in addition, it may be applied to electronic devices such as a computer, a mobile phone, a tablet, a video camera, and the like.
For example, the method may be applied to a dual-optical camera: a processor in the dual-optical camera may calibrate the first sensor and the second sensor included in the dual-optical camera by applying the method, so that the dual-optical camera calibrates itself.
In an embodiment of the application, the obtaining offset information between the first sensor and the second sensor according to the position of the reference object in the first image and the second image includes:
determining the position of the reference object in the first image and the second image;
obtaining a first conversion relation of pixel positions in the first image and the second image according to the determined positions;
determining, according to the first conversion relation, the position in the first image to which the position of a preset image in the second image is converted, and taking that position as a second target position;
calculating offset information between the first sensor and the second sensor according to a deviation between a first target position and the second target position, wherein the first target position is: the position of the preset image in the first image.
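For illustration only, one plausible way to realize the first conversion relation is to estimate a 2-D affine transform from matched reference-object points with OpenCV and then map the chosen position of the second image into the first image; the point coordinates below are invented, and this disclosure does not prescribe a particular estimator:

```python
import numpy as np
import cv2

# Positions of the same reference-object features in the two images (assumed values).
pts_second = np.array([[200, 150], [440, 150], [320, 300]], dtype=np.float32)  # infrared (second) image
pts_first = np.array([[610, 320], [890, 325], [748, 505]], dtype=np.float32)   # visible (first) image

# First conversion relation: pixel positions in the second image -> pixel positions in the first image.
M, _ = cv2.estimateAffinePartial2D(pts_second, pts_first)

# Map the centre of a 640x512 second image into the first image to obtain the second target position.
second_center = np.array([[[320.0, 256.0]]], dtype=np.float32)
second_target = cv2.transform(second_center, M)[0, 0]
print(second_target)
```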
In one embodiment of the present application, the calculating offset information between the first sensor and the second sensor according to the deviation between the first target position and the second target position includes:
obtaining, as a reference deviation, a deviation between a second actual position and a first actual position according to the second target position, the first target position and a preset second conversion relation, wherein the second conversion relation is the conversion relation between pixel positions in the first image and actual positions on the plane where the reference object is located, the first actual position is the actual position, on the plane where the reference object is located, corresponding to the first target position, and the second actual position is the actual position, on that plane, corresponding to the second target position;
and calculating an offset angle between the first sensor and the second sensor according to the vertical distance between the first sensor and the reference object and the reference deviation to serve as offset information.
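As a hedged numerical illustration (the values are assumed, not taken from this disclosure): if the reference deviation on the plane of the reference object is 20 mm in the horizontal direction and the vertical distance between the first sensor and that plane is 1000 mm, the corresponding horizontal offset angle is arctan(20/1000) ≈ 1.15°.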
In one embodiment of the present application, the offset angle between the first sensor and the second sensor comprises: a horizontal offset angle between the first sensor and the second sensor in a horizontal direction, and a vertical offset angle between the first sensor and the second sensor in a vertical direction;
the calibrating the dual-light camera according to the offset information includes:
and when the horizontal offset angle is larger than a horizontal angle allowance, horizontally adjusting the second sensor according to the horizontal offset angle, wherein the horizontal angle allowance is as follows: half of a difference in a horizontal direction between a field angle of the first sensor and a field angle of the second sensor;
and under the condition that the vertical offset angle is larger than a vertical angle allowance, vertically adjusting the second sensor according to the vertical offset angle, wherein the vertical angle allowance is: half of a difference in a vertical direction between the field angle of the first sensor and the field angle of the second sensor.
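A minimal sketch of this allowance check, under the assumption that all angles are expressed in degrees (the example values are illustrative only):

```python
def needs_adjustment(offset_deg, fov_first_deg, fov_second_deg):
    """Return True when the measured offset exceeds half the field-angle difference for this direction."""
    margin = (fov_first_deg - fov_second_deg) / 2.0  # angle allowance
    return abs(offset_deg) > margin

# Assumed horizontal field angles: first sensor 60 deg, second sensor 50 deg -> allowance 5 deg.
# A 1.15 deg horizontal offset therefore would not trigger an adjustment.
print(needs_adjustment(1.15, 60.0, 50.0))  # False
```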
In one embodiment of the present application, the method further comprises:
obtaining a third conversion relation of pixel positions in the first image and a third image according to the position of the reference object in the third image, wherein the third image is an image obtained by acquiring the reference object by the calibrated second sensor;
determining, according to the third conversion relation, the position in the first image to which the position of the preset image in the third image is converted, and taking that position as a third target position;
under the condition that the third target position is outside a preset calibration area, calculating new offset information between the first sensor and the second sensor according to the deviation between the first target position and the third target position, and recalibrating the dual-optical camera according to the new offset information, wherein the preset calibration area is as follows: a region of a preset size containing the first target location.
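Purely as a sketch under assumed units (millimetres on the reference-object plane, an assumed constant pixel-per-millimetre scale in the first image), the preset calibration area centred on the first target position and the containment test on the third target position could look like this:

```python
def preset_calibration_area(first_target, width_mm, height_mm, px_per_mm):
    """Rectangle of preset size (in first-image pixels) containing the first target position at its centre."""
    cx, cy = first_target
    w, h = width_mm * px_per_mm, height_mm * px_per_mm
    return (cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2)

def inside(area, point):
    x0, y0, x1, y1 = area
    return x0 <= point[0] <= x1 and y0 <= point[1] <= y1

# Assumed example: area of 100 mm x 80 mm around the image centre at a scale of 2 px/mm.
area = preset_calibration_area((960, 540), 100.0, 80.0, 2.0)
print(inside(area, (1010, 560)))  # True -> no recalibration needed
```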
In the dual-optical camera calibration scheme provided by this embodiment, the positions of the reference object in the images acquired by the two image sensors are obtained first; the deviation between these positions determines the deviation between the first image and the second image, which reflects the offset information between the first sensor and the second sensor; the second sensor is then controlled to move according to this offset information, so that calibration of the dual-optical camera is achieved. Therefore, the dual-optical camera calibration scheme provided by the above embodiment can be applied to calibrate the two image sensors in the dual-optical camera.
An embodiment of the present application further provides an electronic device, as shown in fig. 9, comprising a processor 901, a communication interface 902, a memory 903 and a communication bus 904, where the processor 901, the communication interface 902 and the memory 903 communicate with one another through the communication bus 904;
the memory 903 is configured to store a computer program;
the processor 901 is configured to implement the above dual-optical camera calibration method when executing the program stored in the memory 903.
The communication bus mentioned in the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the electronic equipment and other equipment.
The memory may include a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), such as at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the processor.
The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In yet another embodiment provided herein, there is also provided a computer readable storage medium having a computer program stored therein, the computer program when executed by a processor implementing the steps of any of the above-described dual-light camera calibration methods.
In yet another embodiment provided herein, there is also provided a computer program product comprising instructions which, when run on a computer, cause the computer to perform any of the above-described dual-optical camera calibration methods.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, the implementation may take the form, in whole or in part, of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center via a wired (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, radio, microwave) connection. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
It is noted that, herein, relational terms such as first and second, and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, as for the method embodiments, the electronic device embodiments, the computer-readable storage medium embodiments, and the computer program product embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference may be made to some descriptions of the method embodiments for relevant points.
The above description is only for the preferred embodiment of the present application and is not intended to limit the scope of the present application. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application are included in the protection scope of the present application.
Claims (14)
1. A dual-optical camera calibration apparatus, comprising: a calibration plate with a reference object, a first fixing member, a second fixing member and a controller, wherein:
the first fixing member and the second fixing member are used for fixing a first sensor and a second sensor of the dual-optical camera respectively, the controller is in communication connection with the second fixing member, and the controller is also in communication connection with the first sensor and the second sensor;
the controller is used for obtaining a first image and a second image, obtaining offset information between the first sensor and the second sensor according to the positions of the reference object in the first image and the second image, and controlling the second fixing member to move according to the offset information so as to calibrate the dual-optical camera;
wherein the first image is: the image acquired by the first sensor in the case that the reference object is in a coincidence field range; the second image is: the image acquired by the second sensor in the case that the reference object is in the coincidence field range; and the coincidence field range is: a range in which the field of view of the first sensor coincides with the field of view of the second sensor.
2. The apparatus according to claim 1, wherein the second fixing member comprises a fixing portion and a moving portion, the fixing portion is used for fixing the second sensor, and the moving portion is used for driving the fixing portion to move;
the controller is used for controlling the moving portion to move according to the offset information.
3. The apparatus of claim 2, wherein the moving portion comprises a stepping motor and an angle sliding table; the stepping motor is connected with the angle sliding table; the angle sliding table is connected with the fixing portion; and the controller is in communication connection with the stepping motor;
the controller is specifically configured to: control, according to the offset information, the stepping motor to drive the angle sliding table to rotate;
the angle sliding table is used for driving the fixing portion to rotate.
4. The apparatus of claim 1, further comprising: a movable connecting piece;
the movable connecting piece is connected with the first fixing member and the calibration plate respectively, and is used for adjusting the distance between the first fixing member and the calibration plate.
5. The apparatus of claim 1, wherein the reference object is a heat source object, the heat source object is located on a surface of the calibration plate, and the calibration plate is further used for temperature control of the heat source object.
6. The apparatus of claim 1, wherein the initial positions of the first fixing member and the second fixing member are: positions in which the front faces of the lenses of the fixed first sensor and the fixed second sensor are parallel to each other.
7. The apparatus of any one of claims 1-6, further comprising a display communicatively coupled to the controller;
the controller is further configured to: performing image processing on a first image acquired by the first sensor and a second image acquired by the second sensor, and sending the processed images to the display;
the display is used for displaying the received image.
8. A dual-optical camera calibration method, the method comprising:
obtaining a first image collected by a first sensor of a dual-optical camera under the condition that a preset reference object is in a coincidence field range, and obtaining a second image collected by a second sensor of the dual-optical camera under the condition that the reference object is in the coincidence field range, wherein the coincidence field range is: a range where the field of view of the first sensor coincides with the field of view of the second sensor;
obtaining offset information between the first sensor and the second sensor according to the positions of the reference object in the first image and the second image;
calibrating the dual-light camera according to the offset information.
9. The method of claim 8, wherein obtaining offset information between the first sensor and the second sensor based on the position of the reference object in the first image and the second image comprises:
determining the position of the reference object in the first image and the second image;
obtaining a first conversion relation of pixel positions in the first image and the second image according to the determined positions;
determining the position of a preset image in the second image to be converted into the position in the first image according to the first conversion relation, and taking the position as a second target position;
calculating offset information between the first sensor and the second sensor according to a deviation between a first target position and the second target position, wherein the first target position is: the position of the preset image in the first image.
10. The method of claim 9, wherein calculating offset information between the first sensor and the second sensor based on the deviation between the first target position and the second target position comprises:
obtaining a deviation between a second actual position and a first actual position as a reference deviation according to the second target position, the first target position and a preset second conversion relationship, wherein the second conversion relationship is as follows: the conversion relation between the pixel position in the first image and the actual position in the plane where the reference object is located, wherein the first actual position is as follows: the first target position corresponds to an actual position of the plane where the reference object is located, and the second actual position is as follows: the actual position of the plane where the reference object is located corresponds to the second target position;
and calculating an offset angle between the first sensor and the second sensor according to the vertical distance between the first sensor and the reference object and the reference deviation to serve as offset information.
11. The method of claim 10, wherein the offset angle between the first sensor and the second sensor comprises: a horizontal offset angle between the first sensor and the second sensor in a horizontal direction, and a vertical offset angle between the first sensor and the second sensor in a vertical direction;
the calibrating the dual-light camera according to the offset information includes:
and when the horizontal offset angle is larger than a horizontal angle allowance, horizontally adjusting the second sensor according to the horizontal offset angle, wherein the horizontal angle allowance is as follows: half of a difference in a horizontal direction between a field angle of the first sensor and a field angle of the second sensor;
and under the condition that the vertical offset angle is larger than a vertical angle allowance, vertically adjusting the second sensor according to the vertical offset angle, wherein the vertical angle allowance is: half of a difference in a vertical direction between a field angle of the first sensor and a field angle of the second sensor.
12. The method of claim 9, further comprising:
obtaining a third conversion relation of pixel positions in the first image and a third image according to the position of the reference object in the third image, wherein the third image is an image obtained by acquiring the reference object by the calibrated second sensor;
determining the position of the preset image in the third image to be converted into the position in the first image according to the third conversion relation, and taking the position as a third target position;
under the condition that the third target position is outside a preset calibration area, calculating new offset information between the first sensor and the second sensor according to the deviation between the first target position and the third target position, and recalibrating the dual-optical camera according to the new offset information, wherein the preset calibration area is as follows: a region of a preset size containing the first target location.
13. An electronic device, comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with one another through the communication bus;
a memory for storing a computer program;
a processor for implementing the method steps of any of claims 8 to 12 when executing a program stored in the memory.
14. A computer-readable storage medium, characterized in that a computer program is stored in the computer-readable storage medium, which computer program, when being executed by a processor, carries out the method steps of any of the claims 8-12.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110287677.8A CN113068019B (en) | 2021-03-17 | 2021-03-17 | Dual-optical camera calibration apparatus, method, electronic apparatus, and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110287677.8A CN113068019B (en) | 2021-03-17 | 2021-03-17 | Dual-optical camera calibration apparatus, method, electronic apparatus, and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113068019A true CN113068019A (en) | 2021-07-02 |
CN113068019B CN113068019B (en) | 2023-02-03 |
Family
ID=76561011
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110287677.8A Active CN113068019B (en) | 2021-03-17 | 2021-03-17 | Dual-optical camera calibration apparatus, method, electronic apparatus, and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113068019B (en) |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070104353A1 (en) * | 2003-12-16 | 2007-05-10 | Michael Vogel | Calibration of a surveying instrument |
US20110032399A1 (en) * | 2009-08-04 | 2011-02-10 | Wei-Chao Kao | Double-light sources optical scanning device and method of using the same |
US20120075470A1 (en) * | 2010-09-27 | 2012-03-29 | National Applied Research Laboratories | Time-Sequential Multi-Spectrum Imaging Device |
CN107977924A (en) * | 2016-10-21 | 2018-05-01 | 杭州海康威视数字技术股份有限公司 | A kind of image processing method based on dual sensor imaging, system |
CN207730423U (en) * | 2017-11-22 | 2018-08-14 | 信利光电股份有限公司 | It is a kind of to take the photograph module system for testing optical axis with visible light and the double of infrared light |
CN110166714A (en) * | 2019-04-11 | 2019-08-23 | 深圳市朗驰欣创科技股份有限公司 | Double light fusion methods of adjustment, double light fusion adjustment device and double light fusion devices |
CN110361092A (en) * | 2018-04-11 | 2019-10-22 | 杭州海康威视数字技术股份有限公司 | A kind of method for registering images, device and thermal imaging camera |
CN110519498A (en) * | 2019-08-29 | 2019-11-29 | 深圳市道通智能航空技术有限公司 | A kind of method, apparatus and pair light camera of double light camera imaging calibrations |
CN110519540A (en) * | 2019-08-29 | 2019-11-29 | 深圳市道通智能航空技术有限公司 | A kind of image processing method, device, equipment and storage medium |
CN111279393A (en) * | 2018-11-29 | 2020-06-12 | 深圳市大疆创新科技有限公司 | Camera calibration method, device, equipment and storage medium |
CN111784781A (en) * | 2020-06-28 | 2020-10-16 | 杭州海康微影传感科技有限公司 | Parameter determination method, device, equipment and system |
CN112184784A (en) * | 2020-09-27 | 2021-01-05 | 烟台艾睿光电科技有限公司 | Double-spectrum image alignment method and device, electronic equipment and storage medium |
CN112285940A (en) * | 2020-10-29 | 2021-01-29 | 中国航空工业集团公司洛阳电光设备研究所 | Optical axis consistency assembling and correcting method for double-view-field lens |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113923317A (en) * | 2021-08-30 | 2022-01-11 | 珠海视熙科技有限公司 | Camera frame synchronization test method, device and storage medium |
CN113923317B (en) * | 2021-08-30 | 2022-11-15 | 珠海视熙科技有限公司 | Camera frame synchronization test method, device and storage medium |
CN113884123A (en) * | 2021-09-23 | 2022-01-04 | 广州小鹏汽车科技有限公司 | Sensor calibration method and device, vehicle and storage medium |
CN114205483A (en) * | 2022-02-17 | 2022-03-18 | 杭州思看科技有限公司 | Scanner precision calibration method and device and computer equipment |
CN115824285A (en) * | 2022-12-09 | 2023-03-21 | 合肥御微半导体技术有限公司 | Sensor position calibration method, device, equipment and medium |
Also Published As
Publication number | Publication date |
---|---|
CN113068019B (en) | 2023-02-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113068019B (en) | Dual-optical camera calibration apparatus, method, electronic apparatus, and storage medium | |
CN109855568B (en) | Method and device for detecting automatic driving sensor, electronic equipment and storage medium | |
US10896521B2 (en) | Coordinate calibration between two dimensional coordinate system and three dimensional coordinate system | |
US9781412B2 (en) | Calibration methods for thick lens model | |
US20080007627A1 (en) | Method of distance estimation to be implemented using a digital camera | |
US11218626B2 (en) | Fast focus using dual cameras | |
TWI630377B (en) | Thermal detection device | |
CN106570907B (en) | Camera calibration method and device | |
CN108924544A (en) | Camera distortion measurement method and test device | |
WO2017107534A1 (en) | Method and device for measuring angle, and method and device for adjusting angle | |
CN109727292A (en) | Based on multi-cam-projector interactive projection system and automation scaling method | |
CN109996050A (en) | Control method and control device of projection robot | |
CN111311671B (en) | Workpiece measuring method and device, electronic equipment and storage medium | |
US9654749B2 (en) | Projection methods and projection devices | |
CN111343360B (en) | Correction parameter obtaining method | |
CN115683352A (en) | Target temperature measuring method and device, electronic equipment and storage medium | |
WO2018152710A1 (en) | Image correction method and device | |
WO2020228593A1 (en) | Method and apparatus for determining categories of target objects in picture | |
CN114674276B (en) | Distance measurement method, machine vision system, and storage medium | |
CN110581977B (en) | Video image output method and device and three-eye camera | |
JP2013207745A (en) | Image pickup device, image processing method, and program | |
US20120300058A1 (en) | Control computer and method for regulating mechanical arm using the same | |
US9723189B2 (en) | Portable electronic-devices and methods for image extraction | |
CN114339179A (en) | Projection correction method, projection correction device, storage medium and projection equipment | |
CN114286075B (en) | Correction parameter adjustment method, correction parameter adjustment device, electronic equipment and readable medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |