CN117036474B - Micro-scale-oriented active camera repositioning method - Google Patents

Micro-scale-oriented active camera repositioning method

Info

Publication number: CN117036474B
Application number: CN202310992687.0A
Authority: CN (China)
Prior art keywords: navigation, freedom, scale, camera, frame
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN117036474A
Inventors: 冯伟, 陈豪杰, 张乾, 李楠, 崔晓, 王柏钦, 万亮
Current and original assignee: Tianjin University
Application filed by Tianjin University
Priority and filing date: 2023-08-08 (priority to CN202310992687.0A)
Publication of CN117036474A: 2023-11-10
Application granted; publication of CN117036474B: 2024-10-25


Abstract

The invention discloses a micro-scale-oriented active camera repositioning method comprising the following steps. Step one: construct the indication frame B. Step two: from the acquired current active camera image I^{k,j} and the reference image I_ref^k, generate the navigation frame B̂^{k,j}. Step three: from the navigation frame B̂^{k,j} and the indication frame B, generate the six-degree-of-freedom navigation direction D^{k,j} and navigation step length S^{k,j} of the active camera. Step four: adjust the spatial pose of the active camera according to the navigation direction and navigation step length. Step five: if the navigation frame does not satisfy the convergence condition, set j = j + 1 and jump to step two; otherwise jump to step six. Step six: if the current shooting scale is the minimum shooting scale, i.e. k = K, repositioning is complete; otherwise set j = 1, k = k + 1, enter the next scale and jump to step two. By relaying across scales during active camera repositioning, the invention achieves gradual, progressive repositioning from the macroscopic scale to the microscopic scale without requiring intrinsic calibration of the camera.

Description

Micro-scale-oriented active camera repositioning method
Technical Field
The invention belongs to the field of artificial intelligence and computer vision, relates to an active vision technology, and particularly relates to a micro-scale-oriented active camera repositioning method.
Background
Active camera repositioning aims to physically restore the six-degree-of-freedom pose of a camera to the same pose used in a historical shot, so that two observations of the same target space are obtained under consistent viewing conditions. It plays an important role in fields such as environmental monitoring, preventive conservation of cultural heritage and minute change detection, and is an important application of active vision technology. The active camera repositioning process comprises relative pose estimation of the camera and dynamic adjustment of the camera, the adjustment being performed by a mechanical platform.
At present, the most advanced active camera repositioning algorithms have achieved great success in minute change detection for cultural heritage. It should be noted, however, that all of this monitoring is repositioned at the macroscopic scale, and that these methods require the intrinsic parameters of the camera to be calibrated in advance, the relative pose difference of the camera being solved by a pose estimation algorithm that uses the intrinsic parameters. Many fine monitoring tasks, however, must be carried out at the microscopic scale of the observed target, such as high-precision product flaw detection or fine micro-change detection, where the flaws or changes may be only at the micrometre scale; existing active camera repositioning methods and related camera products cannot meet these requirements. Vision-based accurate positioning requires the current camera and the historical reference image to share a common-view area at the initial moment, a condition that is extremely difficult to guarantee at the microscopic scale (where the observation range is typically about 1 millimetre). The most feasible approach is therefore a scale relay that transitions gradually from the macroscopic scale to camera repositioning at the microscopic scale. In this process the zoom range of the camera is large and continuous, which makes intrinsic calibration of the camera infeasible, so existing active camera repositioning methods cannot be applied at the microscopic scale.
Disclosure of Invention
In view of the problems in the prior art, the invention provides a micro-scale-oriented active camera repositioning method that achieves gradual repositioning from the macroscopic scale to the microscopic scale by scale relay during the active camera repositioning process, without performing intrinsic calibration of the camera.
The micro-scale-oriented active camera repositioning method is realized by the following technical scheme:
A micro-scale-oriented active camera repositioning method, the method comprising the following steps:
step one: initialization: given the historical reference image set I_ref = {I_ref^1, ..., I_ref^K}, adjust the current active camera so that it has a coincident viewing angle with the image I_ref^1, construct the indication frame B = [b_1, b_2, b_3, b_4], and set k = 1 to represent the scale index and j = 1 to represent the index of the number of camera adjustments at a given scale;
wherein: K represents the total number of scales, I_ref^1 is the image captured at the most macroscopic scale, I_ref^K is the image captured at the microscopic scale, and b_1, ..., b_4 are the upper-left, upper-right, lower-left and lower-right vertex coordinates of the indication frame;
step two: from the acquired current active camera image I^{k,j} and the reference image I_ref^k, generate the navigation frame B̂^{k,j}; wherein: I^{k,j} represents the current active camera image at the j-th adjustment under the k-th scale, and b̂_1^{k,j}, ..., b̂_4^{k,j} are its four vertex coordinates;
step three: from the navigation frame B̂^{k,j} and the indication frame B, generate the six-degree-of-freedom navigation direction D^{k,j} = [d_rx, d_ry, d_rz, d_tx, d_ty, d_tz] and navigation step length S^{k,j} = [s_rx, s_ry, s_rz, s_tx, s_ty, s_tz] of the active camera;
wherein: rx, ry, rz respectively denote the three rotational degrees of freedom about the x, y and z axes, and tx, ty, tz respectively denote the three translational degrees of freedom along the x, y and z axes;
step four: adjust the spatial pose of the active camera according to the navigation direction and the navigation step length;
step five: if the navigation frame does not satisfy the convergence condition that the navigation directions of all six degrees of freedom are 0, set j = j + 1 and jump to step two; otherwise jump to step six;
step six: if the current shooting scale is the minimum shooting scale, i.e. k = K, repositioning is complete; otherwise set j = 1, k = k + 1, enter the next scale, and jump to step two.
Further, the navigation frame generation process in step two comprises the following steps:
201: perform feature extraction and feature matching between the current image I^{k,j} and the reference image I_ref^k to obtain a series of matching point pairs, where i is the index of a matching point pair;
202: compute the homography matrix H_{k,j} from all matching point pairs;
203: transform the indication frame B by the homography matrix H_{k,j} to obtain the navigation frame B̂^{k,j}; wherein the indication frame B = [b_1, b_2, b_3, b_4] is fixed to the four image corners determined by the image width w and height h, and the superscript T denotes the transpose operation;
the vertex coordinates of the navigation frame B̂^{k,j} are obtained by transforming the coordinates of the indication frame B with the homography matrix H_{k,j}, i.e. b̂_m^{k,j} = H_{k,j}·b_m (up to the homogeneous scale factor);
wherein 1 ≤ m ≤ 4, and b_m and b̂_m^{k,j} both denote homogeneous coordinates.
Further, the generation of the six-degree-of-freedom navigation direction of the camera in step three comprises the following steps:
301: computation of the three translational navigation directions d_tx, d_ty, d_tz: the centre coordinates of the navigation frame B̂^{k,j} and the indication frame B are b̂_c and b_c respectively; the x-axis and y-axis translational navigation directions d_tx and d_ty are computed from the differences of the x and y coordinates of b̂_c and b_c, and the z-axis translational navigation direction d_tz is computed from the difference between the areas enclosed by B̂^{k,j} and B; specifically,
x-axis translational direction d_tx: if the difference between the x coordinates of b̂_c and b_c exceeds the threshold τ_tx in absolute value, d_tx is the sign of that difference, otherwise d_tx = 0;
y-axis translational direction d_ty: if the difference between the y coordinates of b̂_c and b_c exceeds the threshold τ_ty in absolute value, d_ty is the sign of that difference, otherwise d_ty = 0;
z-axis translational direction d_tz: if the difference between the areas enclosed by B̂^{k,j} and B exceeds the threshold τ_tz in absolute value, d_tz is the sign of that difference, otherwise d_tz = 0;
wherein b_c and b̂_c denote the centre point coordinates of the indication frame B and the navigation frame B̂^{k,j}, and τ_tx, τ_ty, τ_tz denote the thresholds for judging convergence of the corresponding translational degrees of freedom;
302: computation of the three rotational navigation directions d_rx, d_ry, d_rz: they are computed respectively from the length difference between the upper and lower sides of the navigation frame B̂^{k,j}, the length difference between its left and right sides, and the ordinate difference between the midpoints of its left and right sides; specifically,
x-axis rotational direction d_rx: if the length difference between the upper and lower sides of the navigation frame exceeds the threshold τ_rx in absolute value, d_rx is the sign of that difference, otherwise d_rx = 0;
y-axis rotational direction d_ry: if the length difference between the left and right sides of the navigation frame exceeds the threshold τ_ry in absolute value, d_ry is the sign of that difference, otherwise d_ry = 0;
z-axis rotational direction d_rz: if the ordinate difference between the midpoints of the left and right sides of the navigation frame exceeds the threshold τ_rz in absolute value, d_rz is the sign of that difference, otherwise d_rz = 0;
wherein: τ_rx, τ_ry, τ_rz respectively denote the thresholds for judging convergence of the corresponding rotational degrees of freedom.
Further, the generation of the six-degree-of-freedom navigation step length of the camera in step three comprises the following steps:
401: computation of the three translational navigation step lengths s_tx, s_ty, s_tz: if this is the first translational adjustment at the current scale, i.e. j = 1, a given initial translational step length is used; otherwise the translational step lengths for the current camera adjustment are estimated adaptively from the change of the navigation frame after the previous camera adjustment;
wherein: Area(·) denotes the function computing the area of a frame, and the symbols (1) and (2) denote the 1st and 2nd elements of the corresponding vector;
402: computation of the three rotational navigation step lengths s_rx, s_ry, s_rz: if this is the first adjustment at the current scale, i.e. j = 1, a given initial rotation step length is used; otherwise the rotation step lengths for the current camera adjustment are estimated adaptively from the change of the navigation frame after the previous camera adjustment;
wherein: |·| denotes the modulus, i.e. the length, of a vector, and m_left and m_right denote the midpoint coordinates of the left and right sides of the navigation frame; the optimal x-axis rotation step length at the current moment is deduced from the proportional change of the upper and lower side lengths of the navigation frame B̂^{k,j} after the previous adjustment, the optimal y-axis rotation step length from the proportional change of its left and right side lengths, and the optimal z-axis rotation step length from the change of the ordinate ratio of the midpoints of its left and right sides.
Further, in step four, the process of adjusting the spatial pose of the active camera according to the navigation direction and the navigation step length comprises the following steps:
a rotate-first-then-translate camera pose adjustment strategy is adopted, i.e. when the y-axis rotational navigation direction is not 0 the x-axis translation is not adjusted, and when the x-axis rotational navigation direction is not 0 the y-axis translation is not adjusted;
the motion adjustment amounts of the active camera in the six degrees of freedom are then determined by the navigation directions and the navigation step lengths under this strategy.
Advantageous effects
The technical solution provided by the invention has the following beneficial effects:
1. Vision-based camera repositioning requires the current camera to share an initial common-view area with the reference image. However, because the observed area at the microscopic scale is very small, such an initial common-view area is very difficult to guarantee, so existing active camera repositioning methods cannot be applied at the microscopic scale. The invention effectively solves this problem with the scale-relay strategy and makes camera repositioning at the microscopic scale possible.
2. During active camera repositioning, gradual, progressive repositioning from the macroscopic scale to the microscopic scale is achieved by scale relay, no intrinsic calibration of the camera is required, and the problem of calibrating a continuously zooming camera is avoided.
3. The invention provides a micro-scale active camera repositioning method that, using the scale-relay strategy, can automatically restore the current camera, via a mechanical platform, to the same shooting pose as a historical microscopic image, thereby acquiring spatially consistent microscopic image data. The invention achieves, for the first time, accurate camera repositioning at the microscopic scale of a target surface and can effectively support fine-grained perception tasks on high-value targets (such as minute change detection and identity recognition).
Drawings
FIG. 1 is a flow chart of a micro-scale oriented active camera repositioning method
FIG. 2 is a diagram of an example of camera repositioning at a microscopic scale
FIG. 3 shows the comparison of camera repositioning accuracy (AFD values) at the microscopic scale between the present method and the best existing method
FIG. 4 shows the comparison of camera repositioning accuracy (AFD values) at the macroscopic scale between the present method and the best existing method
Detailed Description
The technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings. Based on the technical solutions of the present invention, all other embodiments obtained by a person skilled in the art without making any creative effort fall within the protection scope of the present invention.
The invention provides a micro-scale-oriented active camera repositioning method, which comprises the following steps.
1. Micro-scale-oriented active camera repositioning method
The repositioning method continuously reduces the observation scale through a macroscopic-to-microscopic camera repositioning relay strategy and finally achieves camera repositioning at the microscopic scale. The method comprises the following steps (an illustrative control-loop sketch is given after the step list):
Step one: initialization: given the historical reference image set I_ref = {I_ref^1, ..., I_ref^K}, where K represents the total number of scales, I_ref^1 is the image captured at the most macroscopic scale and I_ref^K is the image captured at the microscopic scale. The camera is mounted on a mechanical platform, and the platform is adjusted manually so that the current camera and the image I_ref^1 have a coincident viewing angle. Construct the indication frame B = [b_1, b_2, b_3, b_4] (b_1, ..., b_4 being the upper-left, upper-right, lower-left and lower-right vertex coordinates). Set k = 1 to represent the scale index and j = 1 to represent the index of the number of camera adjustments at a given scale;
Step two: navigation frame generation: acquire the current camera image and denote it I^{k,j}, representing the current camera image at the j-th adjustment under the k-th scale. From I^{k,j} and the reference image I_ref^k, generate the navigation frame B̂^{k,j} with the four vertex coordinates b̂_1^{k,j}, ..., b̂_4^{k,j};
Step three: navigation direction and step length generation: from the navigation frame B̂^{k,j} and the indication frame B, generate the camera's six-degree-of-freedom navigation direction D^{k,j} = [d_rx, d_ry, d_rz, d_tx, d_ty, d_tz] and navigation step length S^{k,j} = [s_rx, s_ry, s_rz, s_tx, s_ty, s_tz], where rx, ry, rz denote the three rotational degrees of freedom and tx, ty, tz the three translational degrees of freedom;
Step four: camera pose adjustment: drive the mechanical platform to adjust the spatial pose of the camera according to the navigation direction and the navigation step length;
Step five: if the navigation frame does not satisfy the convergence condition, set j = j + 1 and jump to step two; otherwise jump to step six;
Step six: if the current shooting scale is the minimum shooting scale, i.e. k = K, repositioning is complete; otherwise set j = 1, k = k + 1, enter the next scale, and jump to step two.
1) Initialization: given the historical reference image set I_ref = {I_ref^1, ..., I_ref^K}, where K represents the total number of scales, I_ref^1 is the image captured at the most macroscopic scale and I_ref^K is the image captured at the microscopic scale. The camera is mounted on a mechanical platform, and the platform is adjusted manually so that the current camera and the image I_ref^1 have a coincident viewing angle. Construct the indication frame B = [b_1, b_2, b_3, b_4] (b_1, ..., b_4 being the upper-left, upper-right, lower-left and lower-right vertex coordinates). Set k = 1 to represent the scale index and j = 1 to represent the index of the number of camera adjustments at a given scale.
Description 1: acquisition of the reference image set I_ref
The part that needs repositioning is photographed at multiple scales. Shooting starts at scale k = K (the microscopic scale at which camera repositioning finally has to be completed) to obtain I_ref^K; the camera magnification is then reduced step by step (k = K − 1, K − 2, ...) and the corresponding image I_ref^k is captured at each scale, until k = 1, finally forming the reference image set I_ref = {I_ref^1, ..., I_ref^K}. In practical applications, three scales (K = 3) are sufficient for the subsequent scale-relay repositioning.
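Purely as an illustration of how such a multi-scale reference set might be collected, the sketch below assumes a hypothetical camera interface with set_magnification() and capture_image(); the acquisition hardware and its API are not specified by the method.

```python
def acquire_reference_set(camera, magnifications):
    """Capture the reference set I_ref^1..I_ref^K, shooting the microscopic scale first.

    `magnifications` lists the K zoom settings ordered from most macroscopic to
    microscopic (K = 3 in practice); camera.set_magnification() and
    camera.capture_image() are hypothetical placeholders.
    """
    K = len(magnifications)
    references = [None] * K
    for k in reversed(range(K)):                 # k = K-1 .. 0: micro scale first, then zoom out
        camera.set_magnification(magnifications[k])
        references[k] = camera.capture_image()   # stored so references[0] is the most macroscopic image
    return references
```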
2) Navigation frame generation: acquire the current camera image and denote it I^{k,j}, representing the current camera image at the j-th adjustment under the k-th scale. From I^{k,j} and the reference image I_ref^k, generate the navigation frame B̂^{k,j} with the four vertex coordinates b̂_1^{k,j}, ..., b̂_4^{k,j}.
(1) Perform feature extraction and feature matching between the current image I^{k,j} and the reference image I_ref^k to obtain a series of matching point pairs, where i is the index of a matching point pair;
(2) Compute the homography matrix H_{k,j} from all matching point pairs;
(3) Transform the indication frame B by the homography matrix H_{k,j} to obtain the navigation frame B̂^{k,j}.
Description 2: feature extraction and matching method
The invention uses image keypoint (corner) features, such as SIFT features or ORB features, to perform feature extraction and matching between the two images.
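As an illustration only, feature extraction and matching could be implemented with OpenCV's SIFT detector and a ratio-tested brute-force matcher; the patent names SIFT and ORB as examples but does not prescribe a particular detector, matcher, or the thresholds used here.

```python
import cv2
import numpy as np

def match_features(current_gray, reference_gray, ratio=0.75):
    """Return matched point pairs: points in the current image and in the reference image."""
    sift = cv2.SIFT_create()
    kp_cur, des_cur = sift.detectAndCompute(current_gray, None)
    kp_ref, des_ref = sift.detectAndCompute(reference_gray, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    # Lowe's ratio test keeps only distinctive matches.
    good = [m for m, n in matcher.knnMatch(des_cur, des_ref, k=2)
            if m.distance < ratio * n.distance]
    pts_cur = np.float32([kp_cur[m.queryIdx].pt for m in good])
    pts_ref = np.float32([kp_ref[m.trainIdx].pt for m in good])
    return pts_cur, pts_ref
```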
Description 3: computation of the homography matrix H_{k,j}
A homography maps points of one plane to points of another plane and thus describes the mapping relationship between two planes; for points lying in a plane the homography between the two views is unique, and in the two-dimensional case it is a 3×3 matrix. The homography matrix can be computed from a series of matching point pairs obtained from the two images; this computation is existing work and is not within the protection scope of the invention. In the method of the invention, the homography matrix H_{k,j} is computed from the matching point pairs between the current image I^{k,j} and the reference image I_ref^k.
Description 4: generation of the indication frame B and the navigation frame B̂^{k,j}
The indication frame B can be regarded as the pose state at which the reference image was captured, and the navigation frame B̂^{k,j} as the current camera pose state; camera repositioning can then be regarded as gradually bringing the navigation frame B̂^{k,j} into coincidence with the indication frame B by adjusting the current camera pose. The indication frame remains fixed and is therefore set in the invention to the four image corners, B = [b_1, b_2, b_3, b_4] = [(0, 0)^T, (w, 0)^T, (0, h)^T, (w, h)^T], where w and h represent the image width and height respectively and the superscript T denotes the transpose operation.
For the navigation frame B̂^{k,j}, the indices k and j denote the j-th camera pose adjustment at the k-th scale, and b̂_1^{k,j}, ..., b̂_4^{k,j} denote its four vertex coordinates. The navigation frame is obtained by transforming the coordinates of the indication frame B with the homography matrix H_{k,j}.
The transformed coordinates are computed as b̂_m^{k,j} = H_{k,j}·b_m (up to the homogeneous scale factor), where 1 ≤ m ≤ 4. From this, all vertex coordinates of the navigation frame B̂^{k,j} can be computed. Note that b_m and b̂_m^{k,j} in the above formula both denote homogeneous coordinates.
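A sketch of steps (1)–(3), assuming RANSAC-based homography estimation from OpenCV (the description only requires that H_{k,j} be computed from the matching point pairs) and reusing the hypothetical match_features helper above; the mapping direction (reference image to current image) is likewise an assumption of this sketch.

```python
import cv2
import numpy as np

def generate_navigation_frame(current_gray, reference_gray, indication_frame):
    """Warp the indication frame B with H_{k,j} to obtain the navigation frame vertices.

    `indication_frame` is a (4, 2) float array of the corners b_1..b_4.
    """
    pts_cur, pts_ref = match_features(current_gray, reference_gray)
    # Homography mapping reference-image points to current-image points (assumed direction).
    H, _ = cv2.findHomography(pts_ref, pts_cur, cv2.RANSAC, 3.0)
    corners = indication_frame.reshape(-1, 1, 2).astype(np.float32)
    nav = cv2.perspectiveTransform(corners, H)   # homogeneous transform + normalisation
    return nav.reshape(4, 2)

# Indication frame fixed to the image corners (w, h: image width and height):
# B = np.float32([[0, 0], [w, 0], [0, h], [w, h]])
```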
3) Navigation direction and step length generation: from the navigation frame B̂^{k,j} and the indication frame B, generate the camera's six-degree-of-freedom navigation direction D^{k,j} = [d_rx, d_ry, d_rz, d_tx, d_ty, d_tz] and navigation step length S^{k,j} = [s_rx, s_ry, s_rz, s_tx, s_ty, s_tz], where rx, ry, rz denote the three rotational degrees of freedom about the x, y and z axes and tx, ty, tz denote the three translational degrees of freedom along the x, y and z axes.
(1) Computation of the three translational navigation directions d_tx, d_ty, d_tz: let the centre coordinates of the navigation frame B̂^{k,j} and the indication frame B be b̂_c and b_c respectively. The x-axis and y-axis translational navigation directions d_tx and d_ty are computed from the differences of the x and y coordinates of b̂_c and b_c, and the z-axis translational navigation direction d_tz is computed from the difference between the areas enclosed by B̂^{k,j} and B.
(2) Computation of the three rotational navigation directions d_rx, d_ry, d_rz: they are computed respectively from the length difference between the upper and lower sides of the navigation frame B̂^{k,j}, the length difference between its left and right sides, and the ordinate difference between the midpoints of its left and right sides.
(3) Computation of the three translational navigation step lengths s_tx, s_ty, s_tz: for the first translational adjustment at the current scale, i.e. j = 1, a given initial translational step length is used; otherwise the invention adaptively estimates the translational step lengths for the current camera adjustment from the change of the navigation frame after the previous camera adjustment, where Area(·) denotes the function computing the area of a frame and the symbols (1) and (2) denote the first and second elements of the corresponding vector.
(4) Computation of the three rotational navigation step lengths s_rx, s_ry, s_rz: for the first adjustment at the current scale, i.e. j = 1, a given initial rotation step length is used; otherwise the invention adaptively estimates the rotation step lengths for the current camera adjustment from the change of the navigation frame after the previous camera adjustment, where |·| denotes the modulus, i.e. the length, of a vector and m_left, m_right denote the midpoint coordinates of the left and right sides of the navigation frame. The optimal x-axis rotation step length at the current moment is deduced from the proportional change of the upper and lower side lengths of the navigation frame B̂^{k,j} after the previous adjustment, the optimal y-axis rotation step length from the proportional change of its left and right side lengths, and the optimal z-axis rotation step length from the change of the ordinate ratio of the midpoints of its left and right sides.
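The exact adaptive step-length formulas of the original document are not reproduced above. Purely to illustrate the general idea of adapting the step to how much the previous adjustment changed the navigation frame, the following generic secant-style update is a sketch under that assumption and is not the patent's formula.

```python
def adaptive_step(error_prev, error_curr, step_prev, initial_step=1.0, min_step=1e-3):
    """Generic secant-style step update (an assumption, not the patent's rule).

    error_prev / error_curr quantify the remaining frame discrepancy for one degree of
    freedom before and after the previous adjustment (e.g. an area ratio minus 1 for z
    translation, or a top/bottom side-length ratio minus 1 for x rotation); step_prev is
    the step that produced that change.
    """
    if step_prev is None:                  # first adjustment at this scale (j = 1)
        return initial_step
    response = error_prev - error_curr     # error reduction achieved by step_prev
    if abs(response) < 1e-9:               # no measurable effect: keep the previous step
        return max(abs(step_prev), min_step)
    # Scale the previous step so the remaining error would be cancelled under a linear response.
    return max(abs(step_prev) * abs(error_curr / response), min_step)
```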
Description 5: calculation of the navigation directions of the six degrees of freedom
The three translational navigation directions are computed as follows:
x-axis translational direction d_tx: if the difference between the x coordinates of b̂_c and b_c exceeds the threshold τ_tx in absolute value, d_tx is the sign of that difference; otherwise d_tx = 0.
y-axis translational direction d_ty: if the difference between the y coordinates of b̂_c and b_c exceeds the threshold τ_ty in absolute value, d_ty is the sign of that difference; otherwise d_ty = 0.
z-axis translational direction d_tz: if the difference between the areas enclosed by B̂^{k,j} and B exceeds the threshold τ_tz in absolute value, d_tz is the sign of that difference; otherwise d_tz = 0.
Here b_c and b̂_c denote the centre point coordinates of the indication frame B and the navigation frame B̂^{k,j} respectively, and τ_tx, τ_ty, τ_tz denote the thresholds for judging convergence of the corresponding translational degrees of freedom; they were set to 5 in the experiments.
The three rotational navigation directions are computed as follows:
x-axis rotational direction d_rx: if the length difference between the upper and lower sides of the navigation frame exceeds the threshold τ_rx in absolute value, d_rx is the sign of that difference; otherwise d_rx = 0.
y-axis rotational direction d_ry: if the length difference between the left and right sides of the navigation frame exceeds the threshold τ_ry in absolute value, d_ry is the sign of that difference; otherwise d_ry = 0.
z-axis rotational direction d_rz: if the ordinate difference between the midpoints of the left and right sides of the navigation frame exceeds the threshold τ_rz in absolute value, d_rz is the sign of that difference; otherwise d_rz = 0.
Here τ_rx, τ_ry, τ_rz denote the thresholds for judging convergence of the corresponding rotational degrees of freedom; they were set to 5 in the experiments.
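A sketch of the six direction tests follows. The vertex ordering (top-left, top-right, bottom-left, bottom-right) and the sign conventions are assumptions of this sketch; only the thresholded sign-of-difference logic and the threshold value 5 are taken from the description above.

```python
import numpy as np

def _quad_area(frame):
    # Shoelace formula over the quadrilateral re-ordered TL, TR, BR, BL.
    x, y = frame[[0, 1, 3, 2], 0], frame[[0, 1, 3, 2], 1]
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

def navigation_direction(nav_frame, indication_frame, tau=5.0):
    """Six-DoF navigation direction [d_rx, d_ry, d_rz, d_tx, d_ty, d_tz] in {-1, 0, +1}.

    Frames are (4, 2) arrays of corners ordered top-left, top-right, bottom-left,
    bottom-right; `tau` is the convergence threshold (5 in the experiments).
    """
    def thresholded_sign(diff):
        return float(np.sign(diff)) if abs(diff) > tau else 0.0

    b_c, b_hat_c = indication_frame.mean(axis=0), nav_frame.mean(axis=0)
    tl, tr, bl, br = nav_frame
    d_tx = thresholded_sign(b_hat_c[0] - b_c[0])                               # centre x offset
    d_ty = thresholded_sign(b_hat_c[1] - b_c[1])                               # centre y offset
    d_tz = thresholded_sign(_quad_area(nav_frame) - _quad_area(indication_frame))
    d_rx = thresholded_sign(np.linalg.norm(tr - tl) - np.linalg.norm(br - bl)) # top vs bottom length
    d_ry = thresholded_sign(np.linalg.norm(bl - tl) - np.linalg.norm(br - tr)) # left vs right length
    d_rz = thresholded_sign((tl[1] + bl[1]) / 2 - (tr[1] + br[1]) / 2)         # left vs right midpoint y
    return np.array([d_rx, d_ry, d_rz, d_tx, d_ty, d_tz])
```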
4) Camera pose adjustment: drive the mechanical platform to adjust the spatial pose of the camera according to the navigation direction and the navigation step length.
The computed navigation direction D^{k,j} and navigation step length S^{k,j} guide the mechanical platform in adjusting the pose. In practice there is an ambiguity between the navigation of the x-axis translational and y-axis rotational degrees of freedom, and between the y-axis translational and x-axis rotational degrees of freedom. Taking x-axis translation and y-axis rotation as an example: when the camera rotates about the y axis, the lengths of the left and right sides of the navigation frame change and its centre coordinates also shift, so it cannot be determined at that moment whether a pose difference in x-axis translation exists. Similarly, when the camera rotates about the x axis, it cannot be determined whether a y-axis translation exists. To resolve this ambiguity, the invention adopts a rotate-first-then-translate camera pose adjustment strategy: when the y-axis rotational navigation direction is not 0, the x-axis translation is not adjusted, and likewise, when the x-axis rotational navigation direction is not 0, the y-axis translation is not adjusted. The final motion adjustment amounts of the mechanical platform in the six degrees of freedom are therefore determined jointly by the navigation directions and the navigation step lengths under this gating rule, as illustrated by the sketch below.
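A minimal sketch of this gating, assuming that directions and step lengths are ordered [rx, ry, rz, tx, ty, tz] and that the applied motion on each axis is the product of direction and step; both the ordering and the product form are assumptions of this sketch.

```python
import numpy as np

def gated_motion(direction, step):
    """Rotate-first-then-translate gating of the six-DoF adjustment.

    When the y-axis rotational direction is non-zero the x-axis translation is
    suppressed; when the x-axis rotational direction is non-zero the y-axis
    translation is suppressed. Ordering [rx, ry, rz, tx, ty, tz] is assumed.
    """
    d = np.asarray(direction, dtype=float).copy()
    if d[1] != 0:      # y-axis rotation pending -> do not translate along x
        d[3] = 0
    if d[0] != 0:      # x-axis rotation pending -> do not translate along y
        d[4] = 0
    return d * np.asarray(step, dtype=float)
```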
5) If the navigation frame does not satisfy the convergence condition, set j = j + 1 and jump to step two; otherwise jump to step six.
After the camera pose has been adjusted in the previous step, it is checked whether the navigation frame satisfies the convergence condition. When the navigation directions of all six degrees of freedom are 0, i.e. d_rx = d_ry = d_rz = d_tx = d_ty = d_tz = 0, the navigation frame and the indication frame are highly coincident and camera repositioning at the current scale is complete; the convergence condition is then satisfied and the method jumps to step six. Otherwise repositioning at the current scale is not yet complete, j = j + 1, and the method jumps back to step two to continue the iterative adjustment.
6) If the current shooting scale is the minimum shooting scale, i.e. k = K, repositioning is complete; otherwise set j = 1, k = k + 1, enter the next scale, and jump to step two.
If the current scale has reached the microscopic scale required for repositioning, i.e. k = K, the camera has been repositioned with respect to the reference image at the microscopic scale; the whole algorithm terminates and the output is the current camera image I^{K,j}. In practical applications, the invention generally uses 3 scales (K = 3) of relay to meet the micro-scale repositioning requirement.
2. Feasibility verification
The feasibility of the method of the invention is verified below with reference to specific examples, described in detail as follows:
Comparative experiments were performed on a total of 10 scenes involving 4 kinds of targets: electronic cigarette shells, porcelain, bronze ware and wine bottles. For each scene, 3 images at different scales (i.e. K = 3) were acquired to construct the reference image set I_ref, and the scale-relay repositioning experiment was carried out with the method of the invention. The microscopic shooting range is about 1 mm. A Rokae Xmate ER robotic arm is used as the mechanical platform to carry the camera.
The average feature-point distance (AFD) is selected as the index for evaluating the repositioning effect. Specifically, feature matching is performed between the reference image and the repositioned image, and the mean Euclidean distance over all matching point pairs is the AFD value. A smaller AFD value indicates higher repositioning accuracy.
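As an illustration, the AFD metric described above could be computed by reusing the hypothetical match_features helper sketched earlier and averaging the Euclidean distances of the matched pairs.

```python
import numpy as np

def average_feature_distance(reference_gray, relocated_gray):
    """AFD: mean Euclidean distance between matched keypoints of the two images."""
    pts_reloc, pts_ref = match_features(relocated_gray, reference_gray)
    return float(np.mean(np.linalg.norm(pts_reloc - pts_ref, axis=1)))
```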
Fig. 3 shows a comparison between the method of the invention and the best existing camera repositioning method (Active Camera Relocalization from a Single Reference Image without Hand-Eye Calibration) under the above experimental setup. Because the existing method requires accurate intrinsic camera calibration and cannot effectively reposition against a micro-scale reference image, it often fails to converge during repositioning, and the repositioning fails. By virtue of the scale-relay strategy and the fact that no camera calibration is required, the method of the invention successfully achieves accurate camera repositioning for micro-scale reference images.
Because existing methods are primarily designed for macro-scale camera repositioning, the present method is also compared with existing methods at the macroscopic scale for a fairer comparison. The experimental setup is similar to that of the microscopic repositioning experiments, with tests carried out in 10 different scenes (shooting range about 30 cm); the difference is that at the macroscopic scale the method does not need scale relay and only a reference image at a single scale has to be constructed. Fig. 4 shows the experimental results: the repositioning accuracy of the method is comparable to that of the best existing method. The method therefore not only effectively solves the difficult problem of camera repositioning at the microscopic scale, but can also be used for camera repositioning in ordinary macroscopic scenes, showing excellent generality and superiority.
In this specification, each embodiment is described in a progressive manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments. In particular, for system embodiments, since they are substantially similar to method embodiments, the description is relatively simple, with reference to the description of method embodiments in part. The system embodiments described above are merely illustrative, wherein the units and modules illustrated as separate components may or may not be physically separate. In addition, some or all of the units and modules can be selected according to actual needs to achieve the purpose of the embodiment scheme. Those of ordinary skill in the art will understand and implement the present invention without undue burden.
The foregoing disclosure is merely illustrative of the preferred embodiments of the invention and the invention is not limited thereto, since modifications and variations may be made by those skilled in the art without departing from the principles of the invention.

Claims (4)

1. A micro-scale-oriented active camera repositioning method, the method comprising the steps of:
step one: initialization: given the historical reference image set I_ref = {I_ref^1, ..., I_ref^K}, adjusting the current active camera so that it has a coincident viewing angle with the image I_ref^1, constructing the indication frame B, and setting k = 1 to represent the scale index and j = 1 to represent the index of the number of camera adjustments at a given scale;
wherein: K represents the total number of scales, I_ref^1 is the image captured at the most macroscopic scale, I_ref^K is the image captured at the microscopic scale, and b_1, ..., b_4 are the upper-left, upper-right, lower-left and lower-right vertex coordinates of the indication frame;
step two: generating the navigation frame B̂^{k,j} from the acquired current active camera image I^{k,j} and the reference image I_ref^k, wherein the navigation frame generation process in step two comprises the following steps:
201: performing feature extraction and feature matching between the current image I^{k,j} and the reference image I_ref^k to obtain a series of matching point pairs, where i is the index of a matching point pair;
202: computing the homography matrix H_{k,j} from all matching point pairs;
203: transforming the indication frame B by the homography matrix H_{k,j} to obtain the navigation frame B̂^{k,j}; wherein the indication frame B = [b_1, b_2, b_3, b_4] is fixed to the four image corners determined by the image width w and height h, and the superscript T denotes the transpose operation;
the vertex coordinates of the navigation frame B̂^{k,j} are obtained by transforming the coordinates of the indication frame B with the homography matrix H_{k,j}, i.e. b̂_m^{k,j} = H_{k,j}·b_m (up to the homogeneous scale factor);
wherein I^{k,j} represents the current active camera image at the j-th adjustment under the k-th scale, b̂_1^{k,j}, ..., b̂_4^{k,j} are the four vertex coordinates of the navigation frame, 1 ≤ m ≤ 4, and b_m and b̂_m^{k,j} both denote homogeneous coordinates;
step three: generating the six-degree-of-freedom navigation direction D^{k,j} = [d_rx, d_ry, d_rz, d_tx, d_ty, d_tz] and navigation step length S^{k,j} = [s_rx, s_ry, s_rz, s_tx, s_ty, s_tz] of the active camera from the navigation frame B̂^{k,j} and the indication frame B;
wherein: rx, ry, rz respectively denote the three rotational degrees of freedom about the x, y and z axes, and tx, ty, tz respectively denote the three translational degrees of freedom along the x, y and z axes;
step four: adjusting the spatial pose of the active camera according to the navigation direction and the navigation step length;
step five: if the navigation frame does not satisfy the convergence condition that the navigation directions of all six degrees of freedom are 0, setting j = j + 1 and jumping to step two; otherwise jumping to step six;
step six: if the current shooting scale is the minimum shooting scale, i.e. k = K, repositioning is complete; otherwise setting j = 1, k = k + 1, entering the next scale, and jumping to step two.
2. The micro-scale-oriented active camera repositioning method according to claim 1, wherein the generation of the six-degree-of-freedom navigation direction of the camera in step three comprises the following steps:
301: computation of the three translational navigation directions d_tx, d_ty, d_tz: the centre coordinates of the navigation frame B̂^{k,j} and the indication frame B are b̂_c and b_c respectively; the x-axis and y-axis translational navigation directions d_tx and d_ty are computed from the differences of the x and y coordinates of b̂_c and b_c, and the z-axis translational navigation direction d_tz is computed from the difference between the areas enclosed by B̂^{k,j} and B; specifically,
x-axis translational direction d_tx: if the difference between the x coordinates of b̂_c and b_c exceeds the threshold τ_tx in absolute value, d_tx is the sign of that difference, otherwise d_tx = 0;
y-axis translational direction d_ty: if the difference between the y coordinates of b̂_c and b_c exceeds the threshold τ_ty in absolute value, d_ty is the sign of that difference, otherwise d_ty = 0;
z-axis translational direction d_tz: if the difference between the areas enclosed by B̂^{k,j} and B exceeds the threshold τ_tz in absolute value, d_tz is the sign of that difference, otherwise d_tz = 0;
wherein b_c and b̂_c denote the centre point coordinates of the indication frame B and the navigation frame B̂^{k,j}, and τ_tx, τ_ty, τ_tz denote the thresholds for judging convergence of the corresponding translational degrees of freedom;
302: computation of the three rotational navigation directions d_rx, d_ry, d_rz: they are computed respectively from the length difference between the upper and lower sides of the navigation frame B̂^{k,j}, the length difference between its left and right sides, and the ordinate difference between the midpoints of its left and right sides; specifically,
x-axis rotational direction d_rx: if the length difference between the upper and lower sides of the navigation frame exceeds the threshold τ_rx in absolute value, d_rx is the sign of that difference, otherwise d_rx = 0;
y-axis rotational direction d_ry: if the length difference between the left and right sides of the navigation frame exceeds the threshold τ_ry in absolute value, d_ry is the sign of that difference, otherwise d_ry = 0;
z-axis rotational direction d_rz: if the ordinate difference between the midpoints of the left and right sides of the navigation frame exceeds the threshold τ_rz in absolute value, d_rz is the sign of that difference, otherwise d_rz = 0;
wherein: τ_rx, τ_ry, τ_rz respectively denote the thresholds for judging convergence of the corresponding rotational degrees of freedom.
3. The micro-scale-oriented active camera repositioning method according to claim 1, wherein the generation of the six-degree-of-freedom navigation step length of the camera in step three comprises the following steps:
401: computation of the three translational navigation step lengths s_tx, s_ty, s_tz: if this is the first translational adjustment at the current scale, i.e. j = 1, a given initial translational step length is used; otherwise the translational step lengths for the current camera adjustment are estimated adaptively from the change of the navigation frame after the previous camera adjustment;
wherein: Area(·) denotes the function computing the area of a frame, and the symbols (1) and (2) denote the 1st and 2nd elements of the corresponding vector;
402: computation of the three rotational navigation step lengths s_rx, s_ry, s_rz: if this is the first adjustment at the current scale, i.e. j = 1, a given initial rotation step length is used; otherwise the rotation step lengths for the current camera adjustment are estimated adaptively from the change of the navigation frame after the previous camera adjustment;
wherein: |·| denotes the modulus, i.e. the length, of a vector, and m_left and m_right denote the midpoint coordinates of the left and right sides of the navigation frame; the optimal x-axis rotation step length at the current moment is deduced from the proportional change of the upper and lower side lengths of the navigation frame B̂^{k,j} after the previous adjustment, the optimal y-axis rotation step length from the proportional change of its left and right side lengths, and the optimal z-axis rotation step length from the change of the ordinate ratio of the midpoints of its left and right sides.
4. The micro-scale-oriented active camera repositioning method according to claim 1, wherein in step four the process of adjusting the spatial pose of the active camera according to the navigation direction and the navigation step length comprises the following steps:
a rotate-first-then-translate camera pose adjustment strategy is adopted, i.e. when the y-axis rotational navigation direction is not 0 the x-axis translation is not adjusted, and when the x-axis rotational navigation direction is not 0 the y-axis translation is not adjusted;
the motion adjustment amounts of the active camera in the six degrees of freedom are then determined by the navigation directions and the navigation step lengths under this strategy.

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202310992687.0A | 2023-08-08 | 2023-08-08 | Micro-scale-oriented active camera repositioning method

Publications (2)

Publication Number | Publication Date
CN117036474A | 2023-11-10
CN117036474B | 2024-10-25

Family

ID=88629384

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202310992687.0A | Micro-scale-oriented active camera repositioning method | 2023-08-08 | 2023-08-08

Country Status (1)

Country | Link
CN | CN117036474B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105144196A (en) * 2013-02-22 2015-12-09 微软技术许可有限责任公司 Method and device for calculating a camera or object pose
CN106996769A (en) * 2017-03-22 2017-08-01 天津大学 A kind of active pose fast relocation method without camera calibration

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107564061B (en) * 2017-08-11 2020-11-20 浙江大学 Binocular vision mileage calculation method based on image gradient joint optimization
CN108596976B (en) * 2018-04-27 2022-02-22 腾讯科技(深圳)有限公司 Method, device and equipment for relocating camera attitude tracking process and storage medium
CN110501017A (en) * 2019-08-12 2019-11-26 华南理工大学 A kind of Mobile Robotics Navigation based on ORB_SLAM2 ground drawing generating method
CN112070831B (en) * 2020-08-06 2022-09-06 天津大学 Active camera repositioning method based on multi-plane joint pose estimation
CN112509027B (en) * 2020-11-11 2023-11-21 深圳市优必选科技股份有限公司 Repositioning method, robot, and computer-readable storage medium
CN113418527B (en) * 2021-06-15 2022-11-29 西安微电子技术研究所 Strong real-time double-structure continuous scene fusion matching navigation positioning method and system



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant