CN113566846B - Navigation calibration method and device, electronic equipment and computer readable medium - Google Patents
- Publication number: CN113566846B
- Application number: CN202110830066.3A
- Authority
- CN
- China
- Prior art keywords
- road
- terminal
- navigation
- compass
- navigation image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
Abstract
The disclosure provides a navigation calibration method and a navigation calibration device, and relates to the technical fields of computer vision, image processing, augmented reality, and the like. The specific implementation scheme is as follows: detecting, based on positioning information of a terminal, whether the road section where the terminal is currently located is a straight road section; acquiring a navigation image to be augmented in response to detecting that the road section is a straight road section; identifying a road facility in the navigation image and collecting position information of the road facility in the navigation image; calibrating the indication direction of a compass in the terminal based on the position information and the indication direction of the compass; and adding an augmented reality navigation mark in the navigation image based on the calibrated indication direction. This embodiment improves the accuracy of augmented reality navigation.
Description
Technical Field
The present disclosure relates to the field of computer technologies, specifically to the fields of computer vision, image processing, augmented reality, and the like, and more specifically to a navigation calibration method and apparatus, an electronic device, a computer-readable medium, and a computer program product.
Background
A virtual 3D AR (Augmented Reality) marker anchored in three-dimensional space is superimposed on the live camera view, so that navigation can be provided to the user intuitively.
To realize AR navigation, the terminal needs to be tracked by an electronic compass, and the conversion relation between the geographic coordinate system and the coordinate system of the terminal's camera device must be calculated; this conversion relation determines whether the virtual 3D navigation indicator can be attached to the real route. However, the electronic compass exhibits a certain error during operation (usually up to ±15 degrees) and is easily disturbed by surrounding magnetic fields into still larger errors, which corrupts the conversion relation between the geographic coordinate system and the terminal coordinate system, so that the virtual 3D navigation indicator no longer fits the real route.
Disclosure of Invention
A navigation calibration method and apparatus, an electronic device, a computer readable medium, and a computer program product are provided.
According to a first aspect, there is provided a navigation calibration method, the method comprising: detecting whether the road section where the terminal is currently located is a straight road section based on the positioning information of the terminal; acquiring a navigation image to be augmented in response to detecting that the road section is a straight road section; identifying a road facility in the navigation image and collecting position information of the road facility in the navigation image; calibrating the indication direction of a compass in the terminal based on the position information and the indication direction of the compass; and adding an augmented reality navigation mark in the navigation image based on the calibrated indication direction.
According to a second aspect, there is provided a navigation calibration device, the device comprising: a detection unit configured to detect, based on the positioning information of the terminal, whether the road section where the terminal is currently located is a straight road section; an acquisition unit configured to acquire a navigation image to be augmented in response to detecting that the road section is a straight road section; an identification unit configured to identify the road facility in the navigation image and collect the position information of the road facility in the navigation image; a calibration unit configured to calibrate the indication direction of a compass in the terminal based on the position information and the indication direction of the compass; and an adding unit configured to add an augmented reality navigation mark in the navigation image based on the calibrated indication direction.
According to a third aspect, there is provided an electronic device comprising: at least one processor; and a memory communicatively connected to the at least one processor, wherein the memory stores instructions executable by the at least one processor, the instructions being executable by the at least one processor to enable the at least one processor to perform the method as described in any one of the implementations of the first aspect.
According to a fourth aspect, there is provided a non-transitory computer readable storage medium having stored thereon computer instructions for causing a computer to perform a method as described in any one of the implementations of the first aspect.
According to a fifth aspect, there is provided a computer program product comprising a computer program which, when executed by a processor, implements a method as described in any of the implementations of the first aspect.
The navigation calibration method and device provided by the embodiments of the disclosure proceed as follows: first, whether the road section where the terminal is currently located is a straight road section is detected based on the positioning information of the terminal; second, a navigation image to be augmented is acquired in response to detecting that the road section is a straight road section; third, the road facility in the navigation image is identified and its position information in the navigation image is collected; fourth, the indication direction of the compass in the terminal is calibrated based on the position information and that indication direction; finally, an augmented reality navigation mark is added to the navigation image based on the calibrated indication direction. By determining the position information of road facilities on a straight road section, a reliable calibration reference can thus be provided for the indication direction of the compass, ensuring the reliability of the compass reading, avoiding erroneous route indications, and improving guidance accuracy.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
FIG. 1 is a flowchart of one embodiment of a navigation calibration method according to the present disclosure;
FIG. 2 is a schematic illustration of a road facility in a navigation image according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of an error representation of an AR coordinate system and a geographic coordinate system in an embodiment of the present disclosure;
FIG. 4 is a schematic view of the constituent elements of an asset of an embodiment of the present disclosure;
FIG. 5 is a schematic block diagram of an embodiment of a navigation calibration device according to the present disclosure;
FIG. 6 is a schematic diagram of a structure of an identification unit in an embodiment of the present disclosure;
FIG. 7 is a schematic view of another structure of the recognition unit in the embodiment of the present disclosure;
FIG. 8 is a schematic diagram of a structure of a calibration unit in an embodiment of the present disclosure;
FIG. 9 is a block diagram of an electronic device for implementing a navigation calibration method of an embodiment of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings, in which various details of the embodiments of the disclosure are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
FIG. 1 shows a flow 100 of one embodiment of a navigation calibration method according to the present disclosure, the navigation calibration method comprising the steps of:
Step 101, detecting whether the road section where the terminal is currently located is a straight road section based on the positioning information of the terminal.
In this embodiment, the terminal may be a mobile terminal held by an object. The mobile terminal is provided with a camera device, through which scenes around the terminal or the object can be shot in real time. When the object needs augmented reality navigation, it opens an application on the terminal that supports augmented reality display, and a navigation image superimposed with an augmented reality indication identifier can be viewed in real time on the interface of that application.
In this embodiment, the execution subject of the navigation calibration method may be a processor in the terminal, a server communicating with the terminal, or another electronic device communicating with the terminal. The execution subject may further implement the function of the application and provide an augmented reality navigation function for the object by superimposing, in the navigation image to be augmented, an augmented reality indicator corresponding to the user's travel route, so that the user experiences a display effect of 3D virtual reality during navigation.
In this embodiment, a terminal may be provided with a GPS (Global Positioning System), and when detecting the road section where the terminal is currently located, the positioning information may be obtained through the GPS on the terminal, where the positioning information includes: the operating position of the object holding the terminal, the duration of the terminal at each operating position, the specific time point of the terminal at each operating position, related information of the terminal at each operating position, and the like.
In this embodiment, detecting whether the road section where the terminal is currently located is a straight road section based on the positioning information of the terminal includes: determining the running track of the terminal based on the positioning information, and, when the shape of the running track on the current road section is a straight line, determining that the road section where the terminal is currently located is a straight road section.
Optionally, detecting whether the road section where the terminal is currently located is a straight road section based on the positioning information of the terminal may further include: acquiring road section information (road section shape, road section direction, and the like) in advance, and detecting whether the road section where the terminal is located is a straight road section based on the positioning information and the road section information acquired in advance.
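The track-shape check described above can be sketched as follows. This is a minimal illustration only, not the patent's implementation: the function name, the planar small-area approximation of lat/lon deltas, and the 5-degree tolerance are all assumptions.

```python
import math

def is_straight_segment(track, max_dev_deg=5.0):
    """Heuristic check that a sequence of (lat, lon) fixes lies on a straight
    road section: the bearing between consecutive fixes must stay within
    max_dev_deg of the overall start-to-end bearing."""
    def bearing(p, q):
        # Small-area approximation: treat lat/lon deltas as planar offsets,
        # scaling the longitude delta by cos(latitude).
        dy = q[0] - p[0]
        dx = (q[1] - p[1]) * math.cos(math.radians(p[0]))
        return math.degrees(math.atan2(dx, dy)) % 360.0
    if len(track) < 3:
        return False  # too few fixes to judge the track shape
    ref = bearing(track[0], track[-1])
    def deviation(b):
        d = abs(b - ref) % 360.0
        return min(d, 360.0 - d)
    return all(deviation(bearing(p, q)) <= max_dev_deg
               for p, q in zip(track, track[1:]))
```

A production version would instead consult the map-matched road shape, as the optional implementation above notes.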
Step 102, acquiring a navigation image to be augmented in response to detecting that the road section is a straight road section.
In this embodiment, the navigation image to be augmented is an image captured in real time by the camera device after the object opens the augmented-reality-capable application on the terminal for navigation. The navigation image may be an image of the scene around the terminal or the object, and the scene around the object may include road facilities along the road, greenery, people, and the like, as shown in fig. 2.
Step 103, identifying the road facility in the navigation image and collecting the position information of the road facility in the navigation image.
In this embodiment, the road facility may be a road traffic facility, such as a roadway, a sidewalk, an isolation guardrail, a flat stone, a curbstone, a blind road (tactile paving), isolation piles, various utility-access wells, signal lamps, and the like. Each road traffic facility has its own specification requirements. When the road section where the terminal is located is a straight road section, the road facilities on it are arranged along the directions of the straight section, such as the parallel, horizontal, and vertical directions. After a road traffic facility is captured in a navigation image of the straight section, the actual direction of the road facility in the geographic coordinate system can be determined from these specification requirements, and this actual direction provides a reliable calibration basis for the indication direction of the compass.
According to the preset characteristics of different road facilities, images with those characteristics can be identified in the navigation image in real time through image recognition, so as to determine whether a road facility is present in the navigation image. In this embodiment, identifying the road facility in the navigation image through image recognition includes: analyzing and judging, based on a specific image recognition algorithm, whether the navigation image contains the road facility. The image recognition algorithm includes, but is not limited to, deep-learning target detectors such as Faster R-CNN (Regions with CNN features), SSD (Single Shot MultiBox Detector), YOLO (You Only Look Once), and other image target recognition algorithms.
In this embodiment, the position information is all location-related information of the road facility in the navigation image. Specifically, the position information may include: the direction, position, size, and shape of the road facility, the coordinate values of its pixels, and the like. The relationship between the road facility and the straight road section may be determined from this position information, for example, arranged in parallel along the straight section, arranged perpendicular to it, or arranged intermittently along it.
National regulations specify the shape, style, size, installation location, orientation, etc. of road facilities, so the actual orientation of a facility in the geographic coordinate system may be determined from the directional cues of the facility itself. For example, when the road facility is a blind road a, as shown in fig. 2, the pattern of the blind road a is highly recognizable: 1) straight strip-shaped raised bricks indicate a straight path, with the strips pointing along the direction of the blind road; 2) dot-shaped raised bricks pave corners or detours around impassable road facilities such as well covers or trees; 3) besides its distinctive raised 3D pattern, the blind-road brick is bright yellow, in sharp contrast with the color of the other sidewalk bricks.
Assuming that, while the object holding the terminal travels on a straight road section, the camera in the terminal captures the blind road a shown in fig. 2, and the road section where blind road a lies runs due north-south, the actual direction of blind road a in the geographic coordinate system will be due south or due north.
Step 104, calibrating the indication direction of the compass based on the position information and the indication direction of the compass in the terminal.
In this embodiment, during AR navigation the terminal needs to be tracked by hardware such as a three-axis attitude (gyroscope) and acceleration sensor, which outputs data such as the six-degree-of-freedom displacement and attitude of the terminal in the camera-device coordinate system.
By comparing, at the same moment, the position and attitude of the terminal in the camera-device coordinate system with the terminal's geographic coordinates and the indication direction of the compass, the transformation relation between the terminal's camera-device coordinate system and the geographic coordinate system can be obtained. Aligning the directions of the two coordinate systems relies mainly on the terminal's compass.
As an example, during AR navigation an object travels on a road running due north-south, and the geographic true-north direction is aligned with the y-axis of the terminal's camera coordinate system; the augmented reality indicator should then be placed along that y-axis, fitting the actual north-south route. When the compass has an error, the y-axis of the terminal camera coordinate system cannot be correctly aligned with true north in the geographic coordinate system.
In the AR navigation process, the direction drawn by the augmented reality navigation mark is the direction t1 of the AR coordinate system, the direction pointed by the straight road segment or the road facility is the actual direction t2 of the road facility in the geographic coordinate system, and the included angle formed by the extension and intersection between t1 and t2 is the error declination of the compass at this moment, as shown in fig. 3.
In this embodiment, after obtaining the navigation image from the camera device, image contour identification may first be used to obtain the road facility and its position information in the navigation image and to determine the facility's actual direction, for example, pointing true north. Second, the direction of the AR coordinate system is calculated from the real-time indication direction of the compass, for example 20 degrees east of true north. Third, the angle between the direction of the AR coordinate system and that of the geographic coordinate system is calculated, for example +20 degrees. The compass is then known to have a current error of 20 degrees, and its indication direction can be calibrated.
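The declination computation just described can be sketched numerically. This is a minimal sketch under assumed conventions (headings in degrees clockwise from true north); the function names are hypothetical, not from the patent.

```python
def signed_angle_deg(a, b):
    """Smallest signed angle in degrees from heading a to heading b;
    positive means b lies clockwise (eastward) of a."""
    return (b - a + 180.0) % 360.0 - 180.0

def calibrate_compass(raw_heading, ar_direction_t1, facility_direction_t2):
    """The angle between direction t1 drawn in the AR coordinate system and
    the facility's actual geographic direction t2 is taken as the compass's
    current error declination; subtracting it from the raw reading yields
    the calibrated heading.  Returns (calibrated_heading, error)."""
    error = signed_angle_deg(facility_direction_t2, ar_direction_t1)
    return (raw_heading - error) % 360.0, error
```

With the example from the text (facility pointing true north, AR direction drawn 20 degrees east of north), the error comes out as +20 degrees and is removed from the raw compass reading.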
Step 105, adding an augmented reality navigation mark in the navigation image based on the calibrated indication direction.
In this embodiment, after the indication direction of the compass is calibrated, the calibrated indication direction is obtained. On the basis of the calibrated direction, the augmented reality indicator is used to indicate the straight road section; the direction of the indicator is then consistent with the compass's calibrated indication direction, which ensures the accuracy of the indicator's direction. Superimposing the indicator in the navigation image therefore presents a more accurate direction indication to the object.
In this embodiment, adding the augmented reality navigation identifier in the navigation image based on the calibrated indication direction includes the following. First, the terminal is tracked by hardware such as an Inertial Measurement Unit (IMU) to obtain the position and attitude of the terminal in a relative coordinate system; this relative coordinate system is typically tied to the terminal's initial attitude when tracking starts. Second, by comparing, at the same moment, the terminal's direction in the relative coordinate system with the currently calibrated indication direction of the compass, the relative coordinate system is aligned to the geographic coordinate system, and the conversion relation between the two coordinate systems is obtained. Third, the current travel direction is judged by combining the positioning information of the object or terminal. Fourth, the travel direction is expressed in the terminal's relative coordinate system according to the conversion relation. Finally, the augmented reality navigation identifier is drawn along the travel direction in the relative coordinate system and added to the navigation image, realizing AR navigation.
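The last conversion in the steps above (mapping the geographic travel direction into the terminal's relative coordinate system) reduces, in the planar case, to a rotation by the calibrated heading. A sketch under that simplification; the function name and axis convention (relative y-axis taken as the terminal's forward direction) are assumptions.

```python
import math

def travel_dir_in_relative_frame(travel_bearing_deg, calibrated_heading_deg):
    """Rotate a geographic travel bearing (degrees clockwise from true north)
    into the terminal's relative frame.  Returns a unit (x, y) vector where
    y points along the terminal's calibrated heading."""
    rel = math.radians(travel_bearing_deg - calibrated_heading_deg)
    return (math.sin(rel), math.cos(rel))
```

For example, a due-east travel direction on a terminal already heading due east maps to straight ahead (the relative y-axis).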
In some optional implementations of this embodiment, adding an augmented reality navigation identifier in the navigation image based on the calibrated pointing direction includes: generating an augmented reality navigation mark with the direction consistent with the calibrated indication direction; and overlaying the augmented reality navigation mark in the navigation image based on the positioning information.
In this embodiment, the calibrated indication direction may be a direction obtained by adding a difference between the indication direction of the compass and the actual direction of the road facility in the geographic coordinate system to the indication direction of the compass. The direction of the augmented reality navigation mark is the direction of a coordinate system of the camera device, and the calibrated indication direction can be obtained by converting the direction of the augmented reality navigation mark through coordinates, so that the consistency of the augmented reality navigation mark and the calibrated indication direction is ensured.
In this embodiment, by generating an augmented reality indicator whose direction is consistent with the calibrated indication direction and superimposing it in the navigation image, the augmented indication effect can be presented in real time, improving the user experience.
The navigation calibration method provided by the embodiments of the disclosure proceeds as follows: first, whether the road section where the terminal is currently located is a straight road section is detected based on the positioning information of the terminal; second, a navigation image to be augmented is acquired in response to detecting that the road section is a straight road section; third, the road facility in the navigation image is identified and its position information in the navigation image is collected; fourth, the indication direction of the compass in the terminal is calibrated based on the position information and that indication direction; finally, an augmented reality navigation mark is added to the navigation image based on the calibrated indication direction. By determining the position information of road facilities on a straight road section, a reliable calibration reference can thus be provided for the indication direction of the compass, ensuring the reliability of the compass reading, avoiding erroneous route indications, and improving guidance accuracy.
In some optional implementation manners of this embodiment, detecting whether the road segment where the terminal is currently located is a straight road segment based on the positioning information of the terminal includes: acquiring the shape of a road section to which the positioning information belongs in a preset navigation map; and detecting whether the road section where the terminal is currently located is a straight road section or not based on the positioning information and the shape of the road section.
Specifically, as shown in fig. 3, the positioning information of the terminal held by the object in the navigation map includes: position coordinates located on road section b, where the shape of road section b is a straight line; when the shape of road section b remains a straight line as the position coordinate values change, the road section where the terminal is currently located is determined to be a straight road section.
In the optional implementation mode, the shape of the road section to which the positioning information belongs is determined through the navigation map, so that whether the road section where the terminal is located is a straight road section or not is determined, and a reliable basis is provided for determining the condition of the road section where the terminal is located.
Optionally, the detecting, based on the positioning information of the terminal, whether the road section where the terminal is currently located is a straight road section includes: acquiring the position coordinates in the positioning information, and determining that the road section where the terminal is currently located is a straight road section in response to the change in the abscissa (for example, on an east-west road section) or the ordinate (for example, on a north-south road section) of the position coordinates within a preset time being within a preset value range.
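One plausible reading of that optional check is a cross-axis tolerance test: on an east-west section the ordinate of the position coordinates should barely change over the preset time, and on a north-south section the abscissa. A toy sketch of that interpretation, with hypothetical names.

```python
def is_straight_by_axis(coords, fixed_axis, tol):
    """coords: (x, y) fixes sampled over the preset time period.
    fixed_axis: 1 to test the ordinate (east-west section),
                0 to test the abscissa (north-south section).
    Returns True when that coordinate's total change stays within
    the preset value range tol."""
    vals = [c[fixed_axis] for c in coords]
    return max(vals) - min(vals) <= tol
```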
In some optional implementation manners of this embodiment, the detecting, based on the positioning information and the shape of the road segment, whether the road segment where the terminal is currently located is a straight road segment includes: in response to determining that the shape of the road section is a straight line, detecting whether the terminal is always located on the road section through the positioning information in a preset time period; and in response to determining that the terminal is always located on the road section, determining that the road section where the terminal is currently located is a straight road section.
In this embodiment, the preset time period may be set based on the object running speed and the display requirement of the navigation image, for example, the preset time is 2s.
In this optional implementation manner, when the road section is straight in shape and the terminal continuously travels on the current road section for the preset time period, it may be determined that the road section on which the terminal travels is a straight road section; whether the road section where the terminal is currently located is straight can thus be determined effectively from both the road shape and the movement of the object holding the terminal.
Optionally, the detecting, based on the positioning information and the shape of the road segment, whether the road segment where the terminal is currently located is a straight road segment further includes: in response to determining that the shape of the road segment is a non-straight line, such as a curved line, the road segment on which the terminal is currently located is determined to be a non-straight road segment by detecting that the terminal is located on the road segment by the positioning information.
Optionally, the detecting, based on the positioning information and the shape of the road segment, whether the road segment where the terminal is currently located is a straight road segment further includes: detecting whether the positioning information is consistent with position information of the road section in response to determining that the shape of the road section is a straight line; and determining that the road section where the terminal is currently located is a straight road section in response to the positioning information being consistent with the position information of the road section.
In some optional implementations of this embodiment, the identifying the road facility in the navigation image and collecting the position information of the road facility in the navigation image includes: determining the constituent units of the road facility in the navigation image; determining the contour of the road facility from its constituent units; and identifying, from the contour of the road facility, the position information of the road facility in the navigation image.
In this optional implementation, different road facilities have different structures. For example, in fig. 4, the constituent unit of the blind road a is a square brick a1, the constituent unit of the sidewalk is a line segment, and the constituent unit of the isolation guardrail c is a pillar c1. For each road facility, its constituent units may first be determined in the navigation image; after all constituent units have been identified, the contour of the road facility is outlined. The extension state of the road facility can then be determined from its contour, so that the position information of the road facility in the navigation image can be located accurately and reliably.
In this optional implementation, the constituent units of the road facility are determined, the contour of the road facility is derived from those units, and the position information of the road facility is determined from the contour, which improves the reliability and accuracy of position-information identification.
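The contour-and-extension step can be approximated by fitting a principal axis through the centroids of the identified constituent units (brick centers, pillar bases); this least-squares fit is an illustrative stand-in for the outline construction, not the patent's own algorithm:

```python
import math

def facility_direction(unit_centers):
    """Fit the extension direction of a road facility from the centroids of
    its constituent units in the image plane. Returns the axis angle in
    degrees (0 = image x-axis). Principal-axis fit via the covariance
    matrix; an assumed stand-in for the contour step."""
    n = len(unit_centers)
    mx = sum(x for x, _ in unit_centers) / n
    my = sum(y for _, y in unit_centers) / n
    sxx = sum((x - mx) ** 2 for x, _ in unit_centers)
    sxy = sum((x - mx) * (y - my) for x, y in unit_centers)
    syy = sum((y - my) ** 2 for _, y in unit_centers)
    # Principal axis of the covariance matrix gives the extension direction.
    angle = 0.5 * math.atan2(2 * sxy, sxx - syy)
    return math.degrees(angle)
```

A row of brick centroids along a diagonal yields a 45° axis, which is then read as the facility's extension state in the image.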
In some optional implementations of this embodiment, the identifying the road facility in the navigation image and collecting the position information of the road facility in the navigation image includes: determining a characteristic part of the road facility and a characteristic color of the characteristic part; determining an area corresponding to the characteristic color in the navigation image; identifying the characteristic part in the area; and positioning the characteristic part in the navigation image to obtain the position information of the characteristic part, which is used as the position information of the road facility.
In this optional implementation, for road facilities having distinct characteristic parts, the position information of the road facility may be identified from the characteristic part of the road facility and the characteristic color of that part.
Taking the blind road as an example, according to its characteristics, the characteristic part of the blind road may be determined to be straight-bar-shaped or dot-shaped raised bricks, whose color is yellow. In this embodiment, the region of the navigation image having the yellow characteristic color may be determined first, and the straight-bar-shaped or dot-shaped raised bricks identified within that region, yielding their positions. The positions of the straight-bar-shaped raised bricks directly reflect the position of the blind road; likewise, a plurality of dot-shaped raised bricks can directly reflect the position of the blind road and determine whether, on the straight road segment, the blind road runs north-south or east-west.
In this optional implementation, the characteristic part of the road facility and its characteristic color are determined, and the position information of the road facility is derived from them, which improves the reliability and accuracy of position-information identification.
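A minimal sketch of the color-first pipeline, assuming an RGB pixel grid and an illustrative yellow threshold (both are assumptions of the sketch, not patent specifics):

```python
def locate_by_color(image, is_feature_color):
    """Locate a facility's characteristic part by its characteristic color.
    image: 2-D list of (r, g, b) pixels; is_feature_color: color predicate.
    Returns the bounding box (min_row, min_col, max_row, max_col) of the
    matching pixels, used here as the facility's position information."""
    hits = [(r, c) for r, row in enumerate(image)
            for c, px in enumerate(row) if is_feature_color(px)]
    if not hits:
        return None
    rows = [r for r, _ in hits]
    cols = [c for _, c in hits]
    return (min(rows), min(cols), max(rows), max(cols))

def is_yellow(px):
    # Illustrative threshold for the yellow raised bricks of a blind road.
    r, g, b = px
    return r > 180 and g > 150 and b < 100
```

The bounding box of the yellow region stands in for the positioned characteristic part; a real pipeline would then verify the brick pattern inside the box.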
Optionally, the identifying the road facility in the navigation image and acquiring the position information of the road facility in the navigation image includes: and determining road facilities in the navigation image, dividing the areas of the road facilities in the navigation image, and determining the position information of the road facilities according to the direction of the areas where the road facilities are located in the navigation image.
In some optional implementations of this embodiment, the calibrating the indication direction of the compass based on the position information and the indication direction of the compass in the terminal includes: determining the actual direction of the road facility in the geographic coordinate system based on the position information; determining the indicated direction of the road facility based on the indication direction of the compass and the actual direction of the road facility; and calibrating the indication direction of the compass based on the indicated direction of the road facility to obtain the calibrated indication direction.
In this optional implementation, once the position information of the road facility in the navigation image is obtained, the actual direction of the road facility in the geographic coordinate system may be obtained through coordinate transformation or the like. This actual direction reflects the true terminal direction: when the indication direction of the compass is inconsistent with it, the indication direction of the compass is determined to have drifted, and the compass needs to be calibrated.
In this embodiment, the indicated direction of the road facility indicates the travel direction of the object holding the terminal. Since a road facility generally extends along the road, it generally has two actual directions that are opposite to each other, for example north-south or east-west, and the compass indication is offset toward one of them. To determine the indicated direction of the road facility more accurately, after the actual directions of the road facility are determined, the one closer to the indication direction of the compass may be selected as the indicated direction of the road facility.
In this optional implementation, the actual deviation of the compass can be determined from the position information of the road facility in the navigation image and the indication direction of the compass in the terminal, improving the accuracy and reliability of the compass calibration.
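The selection between the two opposite actual directions can be sketched as follows; headings are in degrees clockwise from north, and the assumption that the compass error is below 90° (which makes the nearest candidate the correct one) is the hedge this sketch relies on:

```python
def angle_diff(a, b):
    """Smallest absolute difference between two headings, in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def facility_indicated_direction(actual_direction, compass_direction):
    """A facility extending along a road has two opposite actual headings
    (e.g. 0/180 for north-south). Pick the one closer to the compass
    reading as the facility's indicated direction."""
    candidates = (actual_direction % 360.0, (actual_direction + 180.0) % 360.0)
    return min(candidates, key=lambda c: angle_diff(c, compass_direction))
```

For a north-south blind road and a compass reading of 170°, the southbound heading 180° is selected rather than 0°.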
Optionally, the calibrating the pointing direction of the compass based on the location information and the pointing direction of the compass in the terminal includes: determining a virtual direction of the compass in a coordinate system of the navigation image based on the indicated direction of the compass; determining an offset angle of the compass based on the virtual direction and the location information; and calibrating the indication direction of the compass based on the offset angle to obtain the calibrated indication direction.
In this embodiment, the offset angle may be converted to an angle in a geographic coordinate system, and the indication direction of the compass is calibrated based on the angle in the geographic coordinate system, so as to obtain a calibrated indication direction.
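The offset-angle route can be sketched as below; treating the difference measured in the image frame as directly transferable to the geographic frame is a simplifying assumption of this illustration, not a claim about the patent's conversion:

```python
def calibrate_compass(compass_heading, virtual_direction, facility_direction_img):
    """Hedged sketch of the offset-angle calibration. virtual_direction is
    the compass heading projected into the navigation-image coordinate
    system; facility_direction_img is the facility's direction measured in
    that same frame. Their difference is the compass offset, which is
    subtracted from the raw heading. All angles in degrees."""
    offset = (virtual_direction - facility_direction_img) % 360.0
    if offset > 180.0:
        offset -= 360.0  # signed offset in (-180, 180]
    return (compass_heading - offset) % 360.0
```

A 10° discrepancy in the image frame is thus removed from the raw compass heading to produce the calibrated indication direction.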
With further reference to fig. 5, as an implementation of the methods illustrated in the above figures, the present disclosure provides an embodiment of a navigation calibration apparatus, which corresponds to the embodiment of the method illustrated in fig. 1, and which is particularly applicable to various electronic devices.
As shown in fig. 5, the navigation calibration apparatus 500 provided in the present embodiment includes: the device comprises a detection unit 501, an acquisition unit 502, a recognition unit 503, a calibration unit 504 and an adding unit 505. The detecting unit 501 may be configured to detect whether the road segment where the terminal is currently located is a straight road segment based on the positioning information of the terminal. The acquiring unit 502 may be configured to acquire a navigation image to be augmented reality in response to detecting that the road segment is a straight road segment. The above-mentioned identifying unit 503 may be configured to identify the road facility in the navigation image and collect the position information of the road facility in the navigation image. The calibration unit 504 may be configured to calibrate the pointing direction of the compass based on the position information and the pointing direction of the compass in the terminal. The adding unit 505 may be configured to add an augmented reality navigation identifier in the navigation image based on the calibrated indication direction.
In the present embodiment, in the navigation calibration apparatus 500: the detailed processing of the detecting unit 501, the obtaining unit 502, the identifying unit 503, the calibrating unit 504, and the adding unit 505 and the technical effects thereof can refer to the related descriptions of step 101, step 102, step 103, step 104, and step 105 in the corresponding embodiment of fig. 1, which are not described herein again.
In some optional implementations of this embodiment, the detecting unit 501 includes: an acquisition module (not shown in the figure), and a detection module (not shown in the figure). The acquiring module may be configured to acquire a shape of a road segment to which the positioning information belongs in a preset navigation map. The detection module may be configured to detect whether the road segment where the terminal is currently located is a straight road segment based on the positioning information and the shape of the road segment.
In some optional implementation manners of this embodiment, the detection module includes: a detection sub-module (not shown), a determination sub-module (not shown). The detection submodule may be configured to detect whether the terminal is located on the road segment at all times through the positioning information in a preset time period in response to determining that the shape of the road segment is a straight line. The determining submodule may be configured to determine that the road segment where the terminal is currently located is the straight road segment in response to determining that the terminal is always located on the road segment.
In some optional implementations of this embodiment, as shown in fig. 6, the identification unit 600 includes: a location determination module 601, an area determination module 602, a location identification module 603, and a location positioning module 604. The location determining module 601 may be configured to determine a characteristic location of the road facility and a characteristic color of the characteristic location. The region determining module 602 may be configured to determine a region in the navigation image corresponding to the characteristic color. The above-mentioned part identification module 603 may be configured to identify a characteristic part in the region. The location positioning module 604 may be configured to position the feature in the navigation image, obtain the location information of the feature, and use the location information of the feature as the location information of the road facility.
In some optional implementations of this embodiment, as shown in fig. 7, the identification unit 700 includes: a unit determination module 701, a contour determination module 702, and a facility identification module 703. The unit determination module 701 may be configured to determine the constituent units of the road facility in the navigation image. The contour determination module 702 may be configured to determine the contour of the road facility from its constituent units. The facility identification module 703 may be configured to identify the position information of the road facility in the navigation image from the contour of the road facility.
In some optional implementations of this embodiment, as shown in fig. 8, the calibration unit 800 includes: an actual determination module 801, an indication determination module 802, and a calibration module 803. The actual determination module 801 may be configured to determine the actual direction of the road facility in the geographic coordinate system based on the position information. The indication determination module 802 may be configured to determine the indicated direction of the road facility based on the indication direction of the compass and the actual direction of the road facility. The calibration module 803 may be configured to calibrate the indication direction of the compass based on the indicated direction of the road facility to obtain the calibrated indication direction.
In some optional implementations of this embodiment, the adding unit 505 includes: a generating module (not shown in the figure), and a superposing module (not shown in the figure). The generating module may be configured to generate an augmented reality navigation identifier whose direction is consistent with the calibrated indication direction. The overlay module may be configured to overlay an augmented reality navigation identifier in the navigation image based on the positioning information.
In the navigation calibration device provided by this embodiment of the disclosure, first, the detection unit 501 detects whether the road segment where the terminal is currently located is a straight road segment based on the positioning information of the terminal; second, the obtaining unit 502 obtains a navigation image to be augmented reality in response to detecting that the road segment is a straight road segment; third, the identifying unit 503 identifies the road facility in the navigation image and collects the position information of the road facility in the navigation image; fourth, the calibration unit 504 calibrates the indication direction of the compass based on the position information and the indication direction of the compass in the terminal; finally, the adding unit 505 adds an augmented reality navigation mark in the navigation image based on the calibrated indication direction. By determining the position information of road facilities on a straight road segment, a reliable calibration reference can thus be provided for the indication direction of the compass, ensuring the reliability of the compass indication, avoiding wrong route indications, and improving guidance accuracy.
In the technical solution of the present disclosure, the acquisition, storage, and application of the personal information of the users involved all comply with the relevant laws and regulations, and do not violate public order and good morals.
The present disclosure also provides an electronic device, a readable storage medium, and a computer program product according to embodiments of the present disclosure.
FIG. 9 illustrates a schematic block diagram of an example electronic device 900 that can be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be examples only and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 9, the device 900 includes a computing unit 901, which can perform various appropriate actions and processes according to a computer program stored in a read-only memory (ROM) 902 or a computer program loaded from a storage unit 908 into a random access memory (RAM) 903. The RAM 903 may also store various programs and data required for the operation of the device 900. The computing unit 901, the ROM 902, and the RAM 903 are connected to one another via a bus 904. An input/output (I/O) interface 905 is also connected to the bus 904.
A number of components in the device 900 are connected to the I/O interface 905, including: an input unit 906 such as a keyboard, a mouse, and the like; an output unit 907 such as various types of displays, speakers, and the like; a storage unit 908 such as a magnetic disk, optical disk, or the like; and a communication unit 909 such as a network card, a modem, a wireless communication transceiver, and the like. The communication unit 909 allows the device 900 to exchange information/data with other devices through a computer network such as the internet and/or various telecommunication networks.
The computing unit 901 may be any of various general and/or special purpose processing components having processing and computing capabilities. Some examples of the computing unit 901 include, but are not limited to, a central processing unit (CPU), a graphics processing unit (GPU), various dedicated artificial intelligence (AI) computing chips, various computing units running machine learning model algorithms, a digital signal processor (DSP), and any suitable processor, controller, or microcontroller. The computing unit 901 performs the methods and processes described above, such as the navigation calibration method. For example, in some embodiments, the navigation calibration method may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 908. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 900 via the ROM 902 and/or the communication unit 909. When the computer program is loaded into the RAM 903 and executed by the computing unit 901, one or more steps of the navigation calibration method described above may be performed. Alternatively, in other embodiments, the computing unit 901 may be configured to perform the navigation calibration method by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on a chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various implementations may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. The program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable navigation calibration device, such that the program code, when executed by the processor or controller, causes the functions/acts specified in the flowcharts and/or block diagrams to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine, or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server with a combined blockchain.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present disclosure may be executed in parallel or sequentially or in different orders, and are not limited herein as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved.
The above detailed description should not be construed as limiting the scope of the disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present disclosure should be included in the scope of protection of the present disclosure.
Claims (17)
1. A navigation calibration method, the method comprising:
detecting whether a road section where the terminal is located is a straight road section or not based on the positioning information of the terminal;
acquiring a navigation image to be augmented reality in response to the fact that the road section is detected to be a straight road section;
identifying road facilities in the navigation image and collecting position information of the road facilities in the navigation image;
calibrating the indication direction of a compass in the terminal based on the position information and the indication direction of the compass; wherein the calibrating the indication direction of the compass based on the position information and the indication direction of the compass in the terminal comprises: determining an actual direction of the road facility in a geographic coordinate system based on the position information; calculating the direction of an AR coordinate system based on the indication direction of the compass, calculating an included angle between the orientation of the AR coordinate system and the orientation of the geographic coordinate system based on the actual direction of the road facility, determining an error of the compass according to the included angle between the orientations, and calibrating the indication direction of the compass based on the error;
and adding an augmented reality navigation mark in the navigation image based on the calibrated indication direction.
2. The method of claim 1, wherein the detecting whether the road section where the terminal is currently located is a straight road section based on the positioning information of the terminal comprises:
acquiring the shape of a road section to which the positioning information belongs in a preset navigation map;
and detecting whether the road section where the terminal is located is a straight road section or not based on the positioning information and the shape of the road section.
3. The method of claim 2, wherein the detecting whether the road segment where the terminal is currently located is a straight road segment based on the positioning information and the shape of the road segment comprises:
in response to determining that the shape of the road section is a straight line, detecting whether the terminal is always located on the road section through the positioning information in a preset time period;
and in response to determining that the terminal is located on the road section all the time, determining that the road section where the terminal is located currently is a straight road section.
4. The method of any of claims 1-3, wherein the identifying road facilities in the navigation image and collecting position information of the road facilities in the navigation image comprises:
determining a characteristic part of the road facility and a characteristic color of the characteristic part;
determining an area corresponding to the characteristic color in the navigation image;
identifying the feature in the region;
and positioning the characteristic part in the navigation image to obtain the position information of the characteristic part, and taking the position information of the characteristic part as the position information of the road facility.
5. The method of any of claims 1-3, wherein the identifying road facilities in the navigation image and collecting position information of the road facilities in the navigation image comprises:
determining constituent units of the road facility in the navigation image;
determining an outline of the road facility from the constituent units of the road facility;
and identifying the position information of the road facilities in the navigation image according to the outline of the road facilities.
6. The method of claim 5, wherein the calibrating the indicated direction of a compass in the terminal based on the location information and the indicated direction of the compass comprises:
determining an actual direction of the road facility in a geographic coordinate system based on the position information;
determining an indicated direction of the road facility based on the indication direction of the compass and the actual direction of the road facility;
and calibrating the indication direction of the compass based on the indication direction of the road facility to obtain the calibrated indication direction.
7. The method of claim 6, wherein the adding an augmented reality navigation mark in the navigation image based on the calibrated indication direction comprises:
generating an augmented reality navigation mark with the direction consistent with the calibrated indication direction;
and based on the positioning information, overlaying the augmented reality navigation mark in the navigation image.
8. A navigation calibration device, the device comprising:
the detection unit is configured to detect whether a road section where the terminal is currently located is a straight road section or not based on the positioning information of the terminal;
an acquisition unit configured to acquire a navigation image to be augmented reality in response to detection that the road segment is a straight road segment;
an identification unit configured to identify a road facility in the navigation image and acquire position information of the road facility in the navigation image;
a calibration unit configured to calibrate an indication direction of a compass in the terminal based on the position information and the indication direction of the compass; wherein the calibrating the indication direction of the compass based on the position information and the indication direction of the compass in the terminal includes: determining an actual direction of the road facility in a geographic coordinate system based on the position information; calculating the direction of an AR coordinate system based on the indication direction of the compass, calculating an included angle between the orientation of the AR coordinate system and the orientation of the geographic coordinate system based on the actual direction of the road facility, determining an error of the compass according to the included angle between the orientations, and calibrating the indication direction of the compass based on the error;
an adding unit configured to add an augmented reality navigation mark in the navigation image based on the calibrated indication direction.
9. The apparatus of claim 8, wherein the detection unit comprises:
the acquisition module is configured to acquire the shape of a road section to which the positioning information belongs in a preset navigation map;
a detection module configured to detect whether the road segment where the terminal is currently located is a straight road segment based on the positioning information and the shape of the road segment.
10. The apparatus of claim 9, wherein the detection module comprises:
a detection sub-module configured to detect whether the terminal is always located on the road segment through the positioning information for a preset time period in response to determining that the shape of the road segment is a straight line;
a determination submodule configured to determine that the road segment on which the terminal is currently located is a straight road segment in response to determining that the terminal is located on the road segment all the time.
11. The apparatus according to one of claims 8-10, wherein the identification unit comprises:
a part determination module configured to determine a characteristic part of the road facility and a characteristic color of the characteristic part;
a region determination module configured to determine a region in the navigation image corresponding to the characteristic color;
a part identification module configured to identify the characteristic part in the region;
and a part positioning module configured to position the characteristic part in the navigation image, obtain position information of the characteristic part, and use the position information of the characteristic part as the position information of the road facility.
12. The apparatus according to any one of claims 8-10, wherein the identification unit comprises:
a unit determination module configured to determine constituent units of the asset in the navigation image;
a contour determination module configured to determine a contour of the asset from constituent units of the asset;
a facility identification module configured to identify location information of the asset in the navigation image from an outline of the asset.
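The units-to-contour route of claim 12 can be sketched like this. The representation is an assumption: each constituent unit (e.g., one brick of an isolation belt, which the description uses as its example) is a detected bounding box, and the facility contour is simply the union box; function names are hypothetical:

```python
def facility_outline(unit_boxes):
    """Claim 12 sketch: merge the bounding boxes (x0, y0, x1, y1) of
    the facility's constituent units into one outline box covering
    the whole facility."""
    xs0, ys0, xs1, ys1 = zip(*unit_boxes)
    return (min(xs0), min(ys0), max(xs1), max(ys1))

def facility_position(unit_boxes):
    """Use the outline's center as the facility's position in the image."""
    x0, y0, x1, y1 = facility_outline(unit_boxes)
    return ((x0 + x1) / 2, (y0 + y1) / 2)
```

A box union is the simplest contour one can build from unit detections; a deployed system might instead trace a polygon around the unit masks, but the claim's unit-then-contour-then-position ordering is the same.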
13. The apparatus of claim 12, wherein the calibration unit comprises:
an actual-direction determination module configured to determine the actual direction of the road facility in a geographic coordinate system based on the position information;
an indicated-direction determination module configured to determine the indicated direction of the road facility based on the indicated direction of the compass and the actual direction of the road facility;
a calibration module configured to calibrate the indicated direction of the compass based on the indicated direction of the road facility, resulting in a calibrated indicated direction.
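The calibration step of claim 13 amounts to comparing two bearings for the same facility: the true bearing from the map's geographic coordinate system, and the bearing implied by the compass plus the facility's direction within the image. The sketch below is an assumed formulation (all angles in degrees, clockwise-from-north; function name hypothetical), not the patented computation:

```python
def calibrate_compass(compass_heading_deg, facility_bearing_in_image_deg,
                      facility_true_bearing_deg):
    """Claim 13 sketch: the compass-implied bearing of the facility is
    the compass heading plus the facility's bearing within the image;
    its difference from the facility's true (map) bearing is the
    compass error, which is subtracted out of the heading."""
    indicated = (compass_heading_deg + facility_bearing_in_image_deg) % 360
    # wrap the error into (-180, 180] so 359° vs 1° reads as a 2° error
    error = (indicated - facility_true_bearing_deg + 180) % 360 - 180
    return (compass_heading_deg - error) % 360
```

For example, a compass reading 90° while a facility seen dead ahead actually bears 80° on the map implies a +10° error, so the calibrated heading is 80°.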
14. The apparatus of claim 13, wherein the adding unit comprises:
a generation module configured to generate an augmented reality navigation marker whose direction is consistent with the calibrated indicated direction;
an overlay module configured to overlay the augmented reality navigation marker on the navigation image based on the positioning information.
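The generate-then-overlay split of claim 14 can be sketched as two small functions. The marker representation (a plain dict) and both function names are assumptions made for illustration:

```python
def make_ar_marker(calibrated_heading_deg, route_bearing_deg):
    """Claim 14 sketch, generation: the AR arrow's on-screen rotation is
    the route bearing relative to the calibrated device heading, so it
    points along the route rather than along the uncalibrated compass."""
    rotation = (route_bearing_deg - calibrated_heading_deg) % 360
    return {"type": "arrow", "rotation_deg": rotation}

def overlay_marker(frame_markers, marker, screen_xy):
    """Claim 14 sketch, overlay: anchor the marker at the screen point
    derived from the positioning information and append it to the
    frame's overlay list."""
    placed = dict(marker, x=screen_xy[0], y=screen_xy[1])
    frame_markers.append(placed)
    return frame_markers
```

When the calibrated heading equals the route bearing the rotation is zero, i.e., the arrow points straight ahead, which is the intended effect of calibrating before rendering.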
15. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-7.
16. A non-transitory computer readable storage medium having stored thereon computer instructions for causing a computer to perform the method of any one of claims 1-7.
17. A computer program product comprising a computer program which, when executed by a processor, implements the method of any one of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110830066.3A CN113566846B (en) | 2021-07-22 | 2021-07-22 | Navigation calibration method and device, electronic equipment and computer readable medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113566846A (en) | 2021-10-29 |
CN113566846B (en) | 2022-11-04 |
Family
ID=78166236
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110830066.3A Active CN113566846B (en) | 2021-07-22 | 2021-07-22 | Navigation calibration method and device, electronic equipment and computer readable medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113566846B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104884895A (en) * | 2013-11-18 | 2015-09-02 | 宇龙计算机通信科技(深圳)有限公司 | Electronic compass calibrating method and terminal |
CN106774313A (en) * | 2016-12-06 | 2017-05-31 | 广州大学 | A kind of outdoor automatic obstacle-avoiding AGV air navigation aids based on multisensor |
CN110567475A (en) * | 2019-09-19 | 2019-12-13 | 北京地平线机器人技术研发有限公司 | Navigation method, navigation device, computer readable storage medium and electronic equipment |
CN112729327A (en) * | 2020-12-24 | 2021-04-30 | 浙江商汤科技开发有限公司 | Navigation method, navigation device, computer equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN113566846A (en) | 2021-10-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107328410B (en) | Method for locating an autonomous vehicle and vehicle computer | |
CN111833717B (en) | Method, device, equipment and storage medium for positioning vehicle | |
US8264537B2 (en) | Photogrammetric networks for positional accuracy | |
CN108318043A (en) | Method, apparatus for updating electronic map and computer readable storage medium | |
US9366765B2 (en) | Handheld GIS data collection device target augmentation | |
EP3667236B1 (en) | A method of determining position data | |
CN107328420A (en) | Localization method and device | |
US10997785B2 (en) | System and method for collecting geospatial object data with mediated reality | |
CN110515110B (en) | Method, device, equipment and computer readable storage medium for data evaluation | |
CN113570664A (en) | Augmented reality navigation display method and device, electronic equipment and computer medium | |
KR20210022343A (en) | Method and system for providing mixed reality contents related to underground facilities | |
CN111932611B (en) | Object position acquisition method and device | |
CN110018503B (en) | Vehicle positioning method and positioning system | |
EP3736610B1 (en) | Augmented reality system for electromagnetic buried asset location | |
EP4030143A2 (en) | Method and apparatus for navigating route, electronic device, computer readable medium | |
CN113592951A (en) | Method and device for calibrating external parameters of vehicle-road cooperative middle-road side camera and electronic equipment | |
CN111612851B (en) | Method, apparatus, device and storage medium for calibrating camera | |
CN113566846B (en) | Navigation calibration method and device, electronic equipment and computer readable medium | |
CN113566847B (en) | Navigation calibration method and device, electronic equipment and computer readable medium | |
CN107864510A (en) | A kind of indoor orientation method, terminal device and storage medium suitable for nuclear island of nuclear power station | |
KR101988278B1 (en) | Indication Objects Augmenting Apparatus using Base Point of 3D Object Recognition of Facilities and Buildings with Relative Coordinates of Indication Objects and Method thereof, and Computer readable storage medium | |
CN113218380B (en) | Electronic compass correction method and device, electronic equipment and storage medium | |
CN117606506A (en) | Vehicle positioning method, device, electronic equipment and medium | |
CN116229028A (en) | AR-based construction indication method and device, electronic equipment and storage medium | |
CN118015088B (en) | Object positioning method, device, equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||