CN115290121A - Unmanned aerial vehicle positioning precision testing method based on dynamic camera calibration - Google Patents

Unmanned aerial vehicle positioning precision testing method based on dynamic camera calibration

Info

Publication number
CN115290121A
CN115290121A
Authority
CN
China
Prior art keywords
positioning
unmanned aerial
camera
coordinate system
self
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202210916477.9A
Other languages
Chinese (zh)
Inventor
刘天豪
蔚保国
易卿武
何成龙
郝菁
熊华捷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CETC 54 Research Institute
Original Assignee
CETC 54 Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CETC 54 Research Institute filed Critical CETC 54 Research Institute
Priority to CN202210916477.9A priority Critical patent/CN115290121A/en
Publication of CN115290121A publication Critical patent/CN115290121A/en
Withdrawn legal-status Critical Current

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00: Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005: Initial alignment, calibration or starting-up of inertial devices
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00: Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02: Position-fixing by co-ordinating two or more direction or position line determinations or two or more distance determinations using radio waves
    • G01S5/0257: Hybrid positioning
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Manufacturing & Machinery (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides an unmanned aerial vehicle positioning accuracy testing method based on motion-capture camera calibration, belonging to the technical field of navigation and positioning performance evaluation. The method comprises the following steps: initializing the motion-capture cameras; each node under test performs its test task within the camera coverage area and returns self-positioning information in real time; acquiring the true position of each node under test in the motion-capture camera frame through a network switch and the camera system software; computing the coordinate transformation; and computing the positioning error in the motion-capture camera coordinate system. The method is simple to operate, highly general, and practical, and has high application value for multi-node positioning test and evaluation under indoor conditions.

Description

Unmanned aerial vehicle positioning accuracy testing method based on motion-capture camera calibration
Technical Field
The invention belongs to the technical field of navigation and positioning performance evaluation, and particularly relates to an unmanned aerial vehicle positioning accuracy testing method based on motion-capture camera calibration.
Background
Test and evaluation of positioning accuracy is an indispensable link in both the testing and the practical application of navigation and positioning systems. The traditional method for evaluating satellite-navigation positioning accuracy generally places the measurement antenna on a reference point with known coordinates and takes the coordinates of that point as the ground truth. This method requires good satellite signal reception in the test environment, with no occlusion and no electromagnetic interference. However, in environments where satellite navigation signals are weak or unavailable, such as near buildings and indoors, it is difficult to accurately evaluate the positioning accuracy of multiple nodes equipped with positioning devices such as UWB, inertial navigation, and cameras. The main difficulties are: 1) the output results of the various positioning modes are heterogeneous and not expressed in the same coordinate system, which makes normalized evaluation of positioning accuracy difficult; 2) the accuracy of the evaluation itself cannot be guaranteed; for example, if anchor nodes are laid out indoors by manual calibration, the inherent error is relatively large, and the precision required to evaluate centimeter-level positioning cannot be achieved.
Disclosure of Invention
In view of the above, the invention provides an unmanned aerial vehicle positioning accuracy testing method based on motion-capture camera calibration. The method is applicable to indoor scenarios that have space to erect motion-capture cameras but where satellite navigation is unavailable and no precise surveying and mapping capability exists, and it enables multi-node positioning accuracy testing and evaluation.
In order to achieve this purpose, the invention adopts the following technical scheme:
An unmanned aerial vehicle positioning accuracy testing method based on motion-capture camera calibration comprises the following steps:
(1) Holding a calibration wand by hand, walk around the coverage area of the indoor motion-capture cameras to initialize the motion-capture system and establish the origin of its coordinate system;
(2) Place at least four unmanned aerial vehicles under test within the coverage area of the motion-capture cameras, and mount on each of them a marker base that can be sensed by the motion-capture cameras; the geometric center of the base coincides with the geometric center of the transmitting antenna of the positioning measurement equipment of the unmanned aerial vehicle under test;
(3) Start the motion-capture cameras; the cameras sense the position of each base and transmit it to the data processing computer, which converts it into coordinates in the motion-capture camera coordinate system;
(4) Start the unmanned aerial vehicles under test; each one self-positions in real time from its observations and transmits the positioning result to the data processing computer, which converts it into coordinates in the self-positioning coordinate system;
(5) Perform least-squares fitting between the coordinates in the motion-capture camera coordinate system and the coordinates in the self-positioning coordinate system to obtain the transformation between the two; apply this transformation to the self-positioning coordinates of each unmanned aerial vehicle under test to obtain its self-positioning result expressed in the motion-capture camera coordinate system;
(6) In the motion-capture camera coordinate system, compute the Euclidean distance between the transformed self-positioning coordinates and the position coordinates measured by the motion-capture cameras, obtaining the real-time positioning error of each unmanned aerial vehicle under test.
Further, step (5) is specifically as follows:
(501) Take l non-coplanar unmanned aerial vehicles under test, with l ≥ 4, as common nodes of the self-positioning coordinate system and the motion-capture camera coordinate system, and record the coordinates of the common nodes in the motion-capture camera coordinate system and the self-positioning coordinate system respectively as:

$$X' = \begin{bmatrix} x'_1 & y'_1 & z'_1 \\ x'_2 & y'_2 & z'_2 \\ \vdots & \vdots & \vdots \\ x'_l & y'_l & z'_l \end{bmatrix}, \qquad X = \begin{bmatrix} x_1 & y_1 & z_1 \\ x_2 & y_2 & z_2 \\ \vdots & \vdots & \vdots \\ x_l & y_l & z_l \end{bmatrix}$$
(502) Subtract the first row of X' from its 2nd through l-th rows to obtain:

$$b = \begin{bmatrix} x'_2 - x'_1 & y'_2 - y'_1 & z'_2 - z'_1 \\ \vdots & \vdots & \vdots \\ x'_l - x'_1 & y'_l - y'_1 & z'_l - z'_1 \end{bmatrix}$$
(503) Subtract the first row of X from its 2nd through l-th rows to obtain:

$$A = \begin{bmatrix} x_2 - x_1 & y_2 - y_1 & z_2 - z_1 \\ \vdots & \vdots & \vdots \\ x_l - x_1 & y_l - y_1 & z_l - z_1 \end{bmatrix}$$
(504) Let the rotation matrix between A and b be Q, and solve for Q by the least-squares method;
(505) Transform the self-positioning coordinates of the unmanned aerial vehicles under test by the rotation matrix to obtain their self-positioning results expressed in the motion-capture camera coordinate system:

$$\hat{x}'_i = (x_i - x_1)\,Q + x'_1, \qquad i = 1, 2, \ldots, l$$

where $x_i$ denotes the $i$-th row of $X$ and $\hat{x}'_i$ is the corresponding coordinate in the motion-capture camera coordinate system.
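The coordinate-fitting procedure above amounts to an ordinary least-squares fit between two differenced coordinate sets. Below is a minimal sketch in Python with NumPy; the function and variable names are illustrative, not part of the patent:

```python
import numpy as np

def fit_q(x_mocap, x_self):
    """Fit the matrix Q mapping differenced self-positioning coordinates
    onto differenced motion-capture coordinates.

    x_mocap: (l, 3) coordinates of the l common nodes in the
             motion-capture camera frame (l >= 4, non-coplanar).
    x_self:  (l, 3) coordinates of the same nodes in the
             self-positioning frame.
    """
    b = x_mocap[1:] - x_mocap[0]   # differenced motion-capture coordinates
    a = x_self[1:] - x_self[0]     # differenced self-positioning coordinates
    # Solve a @ Q ~= b in the least-squares sense, i.e. Q = (a^T a)^{-1} a^T b.
    q, *_ = np.linalg.lstsq(a, b, rcond=None)
    return q

def to_mocap(p_self, x_self_ref, x_mocap_ref, q):
    """Express a self-positioning coordinate in the motion-capture frame,
    using the first common node as the reference point."""
    return (np.asarray(p_self) - x_self_ref) @ q + x_mocap_ref
```

Note that because Q comes from an unconstrained least-squares problem, it is a general 3-by-3 matrix rather than a strict rotation; it coincides with a rotation only when the two coordinate sets differ by a noise-free rigid motion.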
Compared with the prior art, the invention has the following beneficial effects:
(1) The invention solves the problem of normalized evaluation of heterogeneous data from different positioning modes: whatever the positioning mode (UWB, inertial navigation, vision, or others), as long as the relative distance relationships of the nodes under test can be obtained, accuracy evaluation can be carried out by converting the observation coordinate system into the motion-capture camera coordinate system through a fixed coordinate conversion framework;
(2) The invention enables real-time positioning accuracy evaluation while the nodes operate dynamically; data need not be collected first and processed in batch afterwards, and the positioning performance of each node can be displayed in real time during the test.
Drawings
FIG. 1 is a flow chart of a method of an embodiment of the present invention.
Detailed Description
The invention is further described below with reference to the drawings and examples.
As shown in fig. 1, a method for testing the positioning accuracy of an unmanned aerial vehicle based on motion-capture camera calibration comprises the following steps:
(1) Holding a calibration wand by hand, walk around the coverage area of the indoor motion-capture cameras to initialize the motion-capture system and establish the origin of its coordinate system;
(2) Place several unmanned aerial vehicles under test (at least four) within the coverage area of the motion-capture cameras, and mount on each of them a marker base that can be sensed by the cameras; the base must be concentric with the transmitting antenna of the measurement equipment of the unmanned aerial vehicle;
(3) Start the motion-capture cameras, which sense the dynamic information of each base; the data are transmitted through a network switch to the motion-capture system software on the data processing computer and converted into a recognizable coordinate structure;
(4) Start the unmanned aerial vehicles and carry out an autonomous or manually controlled flight task; during the task, each vehicle self-positions in real time from its observations and transmits the positioning result to the data processing computer over a wireless link, where the format is standardized;
(5) Assuming the unmanned aerial vehicles are not coplanar, take them as common nodes of the self-positioning coordinate system and the motion-capture camera coordinate system; compute the rotation matrix from the self-positioning coordinate system to the motion-capture camera coordinate system by least-squares fitting, and transform the self-positioning coordinates by this rotation matrix to obtain the coordinates of the vehicle group in the motion-capture camera coordinate system;
(6) Compute the Euclidean distance between the transformed self-positioning coordinates and the true coordinates output by the motion-capture cameras to obtain the real-time positioning error of each unmanned aerial vehicle.
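With both trajectories expressed in the motion-capture camera frame, the error computation in step (6) reduces to a per-vehicle Euclidean distance at each epoch. A minimal sketch with illustrative names:

```python
import numpy as np

def positioning_errors(p_transformed, p_truth):
    """Euclidean distance between each vehicle's transformed
    self-positioning result and the motion-capture ground truth.

    Both arguments are (n, 3) arrays for n vehicles at one epoch;
    returns one error per vehicle, in the same units as the inputs.
    """
    diff = np.asarray(p_transformed, dtype=float) - np.asarray(p_truth, dtype=float)
    return np.linalg.norm(diff, axis=1)
```

Evaluating this at every epoch as positioning results arrive gives the real-time error curve that the method displays on screen.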
Further, the error can be displayed on a screen, so that the positioning performance of each unmanned aerial vehicle at any moment during the flight is shown in real time.
Further, step (5) is specifically as follows:
(501) Record the coordinates of the common nodes in the motion-capture camera coordinate system and the self-positioning coordinate system respectively as:

$$X' = \begin{bmatrix} x'_1 & y'_1 & z'_1 \\ x'_2 & y'_2 & z'_2 \\ \vdots & \vdots & \vdots \\ x'_l & y'_l & z'_l \end{bmatrix}, \qquad X = \begin{bmatrix} x_1 & y_1 & z_1 \\ x_2 & y_2 & z_2 \\ \vdots & \vdots & \vdots \\ x_l & y_l & z_l \end{bmatrix}$$
(502) Subtract the first row of X' from its 2nd through l-th rows to obtain:

$$b = \begin{bmatrix} x'_2 - x'_1 & y'_2 - y'_1 & z'_2 - z'_1 \\ \vdots & \vdots & \vdots \\ x'_l - x'_1 & y'_l - y'_1 & z'_l - z'_1 \end{bmatrix}$$
(503) Subtract the first row of X from its 2nd through l-th rows to obtain:

$$A = \begin{bmatrix} x_2 - x_1 & y_2 - y_1 & z_2 - z_1 \\ \vdots & \vdots & \vdots \\ x_l - x_1 & y_l - y_1 & z_l - z_1 \end{bmatrix}$$
(504) Let the rotation matrix be Q, with AQ = b, so that $Q = (A^{T}A)^{-1}A^{T}b$. Since the self-positioning coordinates contain errors, A cannot coincide exactly with b after being rotated by Q; therefore Q is solved by the least-squares method;
(505) Rotate A by the rotation matrix Q, then restore the rotated A into l groups of coordinates, completing the coordinate conversion.
In summary, the method offers a heterogeneous-data test and evaluation capability that is not constrained by the positioning mode: once the initial application process is designed, various platforms can be evaluated simply by replacing the marker base and updating the data format. The operating process is simple, real-time performance is strong, generality and practicality are high, and evaluation accuracy is high.
The method can serve application platforms with multi-node positioning capability such as unmanned aerial vehicles, robots, and handheld terminals. Through the test process of motion-capture camera calibration, setting or selecting common nodes, coordinate conversion, and error analysis, it achieves a general multi-node positioning accuracy evaluation capability in indoor environments, and has high application value for multi-node positioning test and evaluation under indoor conditions.

Claims (2)

1. An unmanned aerial vehicle positioning accuracy testing method based on motion-capture camera calibration, characterized by comprising the following steps:
(1) Holding a calibration wand by hand, walk around the coverage area of the indoor motion-capture cameras to initialize the motion-capture system and establish the origin of its coordinate system;
(2) Place at least four unmanned aerial vehicles under test within the coverage area of the motion-capture cameras, and mount on each of them a marker base that can be sensed by the motion-capture cameras; the geometric center of the base coincides with the geometric center of the transmitting antenna of the positioning measurement equipment of the unmanned aerial vehicle under test;
(3) Start the motion-capture cameras; the cameras sense the position of each base and transmit it to the data processing computer, which converts it into coordinates in the motion-capture camera coordinate system;
(4) Start the unmanned aerial vehicles under test; each one self-positions in real time from its observations and transmits the positioning result to the data processing computer, which converts it into coordinates in the self-positioning coordinate system;
(5) Perform least-squares fitting between the coordinates in the motion-capture camera coordinate system and the coordinates in the self-positioning coordinate system to obtain the transformation between the two; apply this transformation to the self-positioning coordinates of each unmanned aerial vehicle under test to obtain its self-positioning result expressed in the motion-capture camera coordinate system;
(6) In the motion-capture camera coordinate system, compute the Euclidean distance between the transformed self-positioning coordinates and the position coordinates measured by the motion-capture cameras, obtaining the real-time positioning error of each unmanned aerial vehicle under test.
2. The unmanned aerial vehicle positioning accuracy testing method based on motion-capture camera calibration according to claim 1, wherein step (5) is specifically as follows:
(501) Take l non-coplanar unmanned aerial vehicles under test, with l ≥ 4, as common nodes of the self-positioning coordinate system and the motion-capture camera coordinate system, and record the coordinates of the common nodes in the motion-capture camera coordinate system and the self-positioning coordinate system respectively as:

$$X' = \begin{bmatrix} x'_1 & y'_1 & z'_1 \\ x'_2 & y'_2 & z'_2 \\ \vdots & \vdots & \vdots \\ x'_l & y'_l & z'_l \end{bmatrix}, \qquad X = \begin{bmatrix} x_1 & y_1 & z_1 \\ x_2 & y_2 & z_2 \\ \vdots & \vdots & \vdots \\ x_l & y_l & z_l \end{bmatrix}$$
(502) Subtract the first row of X' from its 2nd through l-th rows to obtain:

$$b = \begin{bmatrix} x'_2 - x'_1 & y'_2 - y'_1 & z'_2 - z'_1 \\ \vdots & \vdots & \vdots \\ x'_l - x'_1 & y'_l - y'_1 & z'_l - z'_1 \end{bmatrix}$$
(503) Subtract the first row of X from its 2nd through l-th rows to obtain:

$$A = \begin{bmatrix} x_2 - x_1 & y_2 - y_1 & z_2 - z_1 \\ \vdots & \vdots & \vdots \\ x_l - x_1 & y_l - y_1 & z_l - z_1 \end{bmatrix}$$
(504) Let the rotation matrix between A and b be Q, and solve for Q by the least-squares method;
(505) Transform the self-positioning coordinates of the unmanned aerial vehicles under test by the rotation matrix to obtain their self-positioning results expressed in the motion-capture camera coordinate system:

$$\hat{x}'_i = (x_i - x_1)\,Q + x'_1, \qquad i = 1, 2, \ldots, l$$
CN202210916477.9A 2022-08-01 2022-08-01 Unmanned aerial vehicle positioning precision testing method based on dynamic camera calibration Withdrawn CN115290121A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210916477.9A CN115290121A (en) 2022-08-01 2022-08-01 Unmanned aerial vehicle positioning precision testing method based on dynamic camera calibration


Publications (1)

Publication Number Publication Date
CN115290121A true CN115290121A (en) 2022-11-04

Family

ID=83825739

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210916477.9A Withdrawn CN115290121A (en) 2022-08-01 2022-08-01 Unmanned aerial vehicle positioning precision testing method based on dynamic camera calibration

Country Status (1)

Country Link
CN (1) CN115290121A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118376248A (en) * 2024-06-24 2024-07-23 成都铂升科技有限公司 Single-mark positioning technology for clustered robots under motion capture system


Similar Documents

Publication Publication Date Title
US10801843B2 (en) Indoor mobile robot position and posture measurement system based on photoelectric scanning and measurement method
US9182435B2 (en) Method and software for spatial pattern analysis
US9316486B2 (en) Method and apparatus for determining and storing the position and orientation of antenna structures
US20090284425A1 (en) Antenna test system
CN113074727A (en) Indoor positioning navigation device and method based on Bluetooth and SLAM
CN108490473A (en) A kind of the unmanned plane enhancing localization method and system of fusion GNSS and UWB
CN108775901B (en) Real-time SLAM scene map construction system, navigation system and method
CN112596048B (en) Method for accurately detecting position of low-speed unmanned aerial vehicle through radar photoelectric cooperation
CN111999730A (en) Black-flying unmanned aerial vehicle flyer positioning method and system
EP2005363A2 (en) Systems and methods for determining a global or local position of a point of interest within a scene using a three-dimensional model of the scene
CN108413966A (en) Localization method based on a variety of sensing ranging technology indoor locating systems
CN106370160A (en) Robot indoor positioning system and method
CN110806560B (en) Object positioning method and system, electronic equipment and readable storage medium
CN110926479A (en) Method and system for automatically generating indoor three-dimensional navigation map model
CN115290121A (en) Unmanned aerial vehicle positioning precision testing method based on dynamic camera calibration
CN111381267A (en) Positioning system and method based on RTK and WiFi combination
CN115876197A (en) Mooring lifting photoelectric imaging target positioning method
CN109146936B (en) Image matching method, device, positioning method and system
CN110274600B (en) Method, device and system for acquiring GPS (global positioning system) information of robot
CN117310627A (en) Combined calibration method applied to vehicle-road collaborative road side sensing system
CN115598660A (en) Space stereo easy-to-construct grid array Bluetooth position service device and method
CN110531397B (en) Outdoor inspection robot positioning system and method based on GPS and microwave
CN111913204B (en) Mechanical arm guiding method based on RTK positioning
CN113985906A (en) Vehicle-mounted mobile type calibration system and method based on unmanned aerial vehicle platform
KR101217855B1 (en) System for maintaining proper topography measurement information by measuring position level and distance between positions

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20221104