WO2019068175A1 - Pose determination system and method
- Publication number: WO2019068175A1
- Application: PCT/CA2018/051229
- Authority: WIPO (PCT)
Classifications
- G05D1/0261—Control of position or course in two dimensions specially adapted to land vehicles using magnetic or electromagnetic means using magnetic plots
- G01S13/765—Systems using reradiation of radio waves wherein pulse-type signals are transmitted with exchange of information between interrogator and responder
- G01S13/876—Combination of several spaced transponders or reflectors of known location for determining the position of a receiver
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S5/0247—Determining attitude
- G01S5/14—Determining absolute distances from a plurality of spaced points of known location
- G01S3/30—Systems for determining direction or deviation from predetermined direction using amplitude comparison of signals derived simultaneously from receiving antennas or antenna systems having differently-oriented directivity characteristics derived directly from separate directional systems
Definitions
- the vehicle can then be moved to a new location where tag 1, tag 2, and tag 3 can all be seen by the hub sensors, and the position of tag 3 relative to the previously computed tag 2 and tag 1 positions can also be computed, and so on, where the position of tag i+1 is computed relative to the previously computed positions of tag i and tag i−1, and then the vehicle is re-positioned anew.
- these measurements could be stored and post-processed as a batch job to obtain an optimal set of positions in the least-squares sense.
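As an illustrative sketch only (not part of the original disclosure), the batch post-processing mentioned above might be set up as a nonlinear least-squares problem over the unknown tag and vehicle-stop positions. The function below assumes 2D coordinates and a hypothetical `ranges` list of (stop index, tag index, measured range) tuples, and it fixes tag 0 at the origin and tag 1 on the +x axis to remove the free choice of coordinate frame.

```python
# Illustrative batch least-squares refinement of 2D tag positions from
# range measurements gathered at several vehicle stops.  A sketch only:
# names, the measurement format, and the crude initial guess are assumptions.
import numpy as np
from scipy.optimize import least_squares

def refine_positions(ranges, n_tags, n_stops):
    """ranges: list of (stop_index, tag_index, measured_range_m)."""

    def unpack(x):
        # Gauge fixing: tag 0 at the origin, tag 1 on the +x axis
        # (the solution is therefore defined up to a reflection).
        tags = np.zeros((n_tags, 2))
        tags[1, 0] = x[0]
        tags[2:] = x[1:1 + 2 * (n_tags - 2)].reshape(-1, 2)
        stops = x[1 + 2 * (n_tags - 2):].reshape(n_stops, 2)
        return tags, stops

    def residuals(x):
        tags, stops = unpack(x)
        return [np.linalg.norm(stops[j] - tags[i]) - r
                for j, i, r in ranges]

    x0 = np.ones(1 + 2 * (n_tags - 2) + 2 * n_stops)  # crude initial guess
    sol = least_squares(residuals, x0)
    return unpack(sol.x)
```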
- the tags need not be, and ordinarily are not, a part of the autonomous vehicle itself.
- an object located at an angle of zero degrees would be straight ahead, and an object located at an angle of 90 degrees (π/2 radians) would be directly to the right (assuming the convention that 90 degrees is to the right and 270 degrees is to the left; the opposite convention also may be applied), and so on.
- the zero heading can be referenced to a feature of the vehicle (e.g. a direction perpendicular to its steering axle) or, without loss of generality, could be the normal to the plane of the hub's 3 sensor antennae.
- detecting the distances and relative angles to two or more tags is a matter of prudence; and as a practical matter, three or more tags are often useful. Detection of two or more tags (or three or more tags for 3D implementations) not only may improve accuracy and precision, it avoids situations in which a single-tag system will fail. For example, an autonomous vehicle, traveling in a circular path with the tag located at the center of the circle, will read that the distance to the tag is constant, and that the angle to the tag is constant (assuming the tag does not convey any angle information to the autonomous vehicle).
- FIG. 1 is a schematic diagram of a typical autonomous vehicle pose-detecting system 10.
- the system 10 includes an autonomous vehicle 12, which is a machine that typically includes mechanical and electronic components.
- the system also includes two tags 14A and 14B that serve as reference nodes. Although two tags are shown in FIG. 1, any number of tags may be employed. (A generic tag may be identified by reference numeral 14.)
- the autonomous vehicle 12 may be a vehicle of any kind.
- a typical autonomous vehicle 12 will be described as a mower that includes apparatus to mow an outdoor field.
- the outdoor field may be thought of as the defined area of operation of the mower. It may be undesirable for the autonomous vehicle 12 to operate autonomously outside the boundaries of the outdoor field.
- Other examples of an autonomous vehicle 12, by no means the only examples, include agricultural equipment, irrigation equipment, cleaning equipment, moving/conveying equipment, and delivery equipment.
- ground-based autonomous vehicles will be discussed, alternative embodiments include water-based autonomous vehicles (including those that float or submerge) and air-based autonomous vehicles (such as low-flying drones).
- the autonomous vehicle 12 may employ any form of propulsion (such as petroleum-powered, electric, wind-propelled) and may be of any configuration or size.
- the autonomous vehicle 12 may be configured to convey one or more human beings, or not.
- the autonomous vehicle 12 may include one or more pieces of functional apparatus 32 according to its general purposes; in the case of a mower, for example, the functional apparatus 32 may include specialized equipment for mowing.
- the functional apparatus 32 includes one or more kinds of mobility apparatus 34, which convey the autonomous vehicle 12 from place to place (often within the defined area of operation, but the mobility apparatus 34 may convey the autonomous vehicle 12 from place to place outside the defined area of operation as well).
- Mobility apparatus 34 may also include apparatus that steers the autonomous vehicle 12, that governs the speed of the autonomous vehicle 12, that brakes the autonomous vehicle 12, or other components that make the autonomous vehicle 12 function as a vehicle.
- Mobility apparatus 34 may include various things such as one or more wheels, propellers, motors, fuel supplies, batteries or other power-related components, rudders, and so forth.
- the tags 14 may be deployed at any known positions inside the defined area of operation, or on the perimeter or border of the defined area of operation, or proximate to the defined area of operation.
- the tags 14 may be mounted upon dedicated pedestals, i.e., supporting structures that hold the tags 14 in fixed positions relative to the defined area of operation (and that may have other functionality); the tags 14 may be mounted upon already-existing structures in fixed positions relative to the defined area of operation (such as fence posts, streetlights, buildings, trees, and so on); or any combination thereof.
- the tags 14 may be deployed at any height above the ground; for some installations, for example, one meter above the ground might be a typical height for all tags 14, while for another installation, some tags 14 may be positioned higher above the ground while others are positioned lower.
- the autonomous vehicle 12 includes a processor 22 (or more generally a "controller” which may include plural processors with dedicated functions) that receives as input the return signals 20A, 20B or signals from the antenna hub 16 that are functions of the received return signals 20A, 20B. As a function of this input, the processor computes, infers, calculates, measures, or otherwise determines the pose of the autonomous vehicle 12 with respect to the tags 14, and with respect to the defined area of operation. As used herein, a first thing (such as an output) is computed or otherwise determined "as a function of" a second thing (such as an input), when the first thing is directly or indirectly dependent upon the second thing; the first thing may be, but need not be, dependent exclusively upon the second thing.
- the antenna hub 16 may be in any of several configurations, and can comprise a plurality of hub antennas.
- the plurality of hub antennas can be operatively connected to any suitable receiver/transceiver radio, including an ultra-wideband (UWB) receiver/transceiver radio, such as an integrated UWB radio system like the DW1000 available from DecaWave of Dublin, Ireland, for example.
- One illustrative configuration includes three omnidirectional antennas deployed on the vertices of a triangle, such as an equilateral triangle. The distance from one antenna to another may be known to a good degree of precision.
- When a signal is received from a tag 14, the signal may be received by a first hub antenna in the antenna hub 16 first, and by a second hub antenna in the antenna hub 16 later, after a tiny but measurable delay.
- From this delay, an angle of the tag 14 with respect to the orientation of the autonomous vehicle 12 or the antenna hub 16 can be computed. (Other parameters of interest may be computed or otherwise determined as well.)
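A minimal sketch of that angle computation, assuming a far-field plane wave, a single pair of hub antennas with a known baseline, and illustrative function and variable names (none taken from the patent):

```python
# Sketch: bearing of a tag from the arrival-time difference at two hub
# antennas separated by a known baseline (far-field approximation).
import math

C = 299_792_458.0  # speed of light, m/s

def bearing_from_tdoa(delta_t_s, baseline_m):
    """Angle (radians) off the baseline's broadside direction.

    delta_t_s : arrival time at antenna A minus arrival time at antenna B.
    baseline_m: distance between the two hub antennas.
    """
    s = C * delta_t_s / baseline_m          # path-length difference ratio
    s = max(-1.0, min(1.0, s))              # clamp against measurement noise
    return math.asin(s)

# Example: a 0.5 ns delay over a 0.30 m baseline -> about 30 degrees.
print(math.degrees(bearing_from_tdoa(0.5e-9, 0.30)))
```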
- The distance of the tag 14 with respect to the autonomous vehicle 12 may be computed on the basis of the received signals in a number of ways.
- One way involves each tag 14 in the system 10 transmitting its response 20 in a manner to reduce interference among responses from several tags.
- Tag 14B shows illustrative components of a tag 14.
- An antenna 36 may detect or receive electromagnetic signals 18 from the autonomous vehicle 12, and transmit electromagnetic signals 20 to the autonomous vehicle 12.
- a processor 38 may process the electromagnetic signals 18 detected or received by the antenna 36, and may record the time that the signals 18 were received according to an on-board clock 40. (Various tags 14 in the system 10 may synchronize their clocks 40 with one another, but this is not necessary.) Any data, such as the time a signal 18 was received or information about the tag 14B itself, may be stored in memory 42.
- Although the tag processor 38 may be of any type, there may be practical advantages for the processor 38 to have limited capability or low functionality, as mentioned previously.
- the tag 14B may have a power supply 44, which may include one or more power sources such as a battery, a solar power array, connection to an electrical grid, and so forth.
- the tag 14B may be configured to operate automatically in a variety of power modes, such as operating in a low-power operating state for much of the time, and automatically switching to a high-power operating state after detecting a signal from an autonomous vehicle 12, and automatically switching back to a low-power operating state after responding to the signal from the autonomous vehicle 12. Operating much of the time in a low-power operating state conserves power for times when more power is useful.
- the distance of the antenna hub 16 to a tag 14 is a function of the time it takes for an electromagnetic signal 20 (traveling at the speed of light) to travel from the antenna 36 of a tag 14 to the antenna hub 16. There are numerous ways in which this travel time can be measured.
- signal 18 transmitted by the antenna hub 16 may be a polling signal. This same polling signal may be broadcast to all tags 14 in range.
- a tag 14 may (for example) change from a low-power operating state to a high-power operating state, and record the time at which the signal 18 was received.
- each tag 14 may wait until its assigned time window to transmit its response signal 20.
- the response signal 20 may include an identification of the tag 14 sending the response signal 20, the time at which the signal 18 from the antenna hub 16 was received, as well as the time at which the response signal 20 was sent (both times according to the on-board clock 40 used by the tag 14). This response signal may be received by the antenna hub 16.
- one or more error-correction techniques may be applied to measure the time in which it took for the signal from the tag 14 to reach the antenna hub 16.
- If the three hub antennae are synced to the same local oscillator, then all single-difference common-mode range errors cancel out.
- If the time for the signal to travel from the tag 14 to the antenna hub 16 is known, then the distance from the tag 14 to the antenna hub 16 is also known (the linear distance traveled by an electromagnetic signal is the travel time multiplied by the speed of light).
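The polling exchange described above amounts to single-sided two-way ranging. The sketch below, with assumed names and timestamp conventions, shows how the hub-side and tag-side timestamps combine into a distance; clock drift between the two devices is ignored here, which is one reason the error-correction techniques mentioned above matter in practice.

```python
# Sketch of single-sided two-way ranging: the hub timestamps its poll and
# the tag's reply with its own clock, and the tag reports how long it held
# the poll before replying.  Names and timestamp formats are assumptions.
C = 299_792_458.0  # speed of light, m/s

def range_from_exchange(t_poll_tx, t_reply_rx, tag_rx, tag_tx):
    """Distance (m) from hub to tag.

    t_poll_tx, t_reply_rx : poll transmit / reply receive times (hub clock, s).
    tag_rx, tag_tx        : poll receive / reply transmit times (tag clock, s).
    """
    round_trip = t_reply_rx - t_poll_tx          # measured at the hub
    turnaround = tag_tx - tag_rx                 # reported by the tag
    time_of_flight = (round_trip - turnaround) / 2.0
    return C * time_of_flight

# Example: 100 m separation (~333.6 ns each way) with a 2 ms tag turnaround.
tof = 100.0 / C
print(range_from_exchange(0.0, 2e-3 + 2 * tof, 0.010, 0.012))  # ~100.0
```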
- a defined area of operation should be sized and shaped so as not to have any locations in which the autonomous vehicle will be out of range of all of the tags. It may be a criterion for layout of a defined area of operation, in one example, that all locations in the defined area of operation be no more than 70 meters from at least two tags. In another example, it may be specified that all locations in the defined area of operation be no more than 50 meters from at least three tags.
- the autonomous vehicle 12 may compute the distance to one or more tags 14, as well as the angle of each tag relative to the autonomous vehicle 12. With information about distance and angle, the autonomous vehicle 12 may compute the pose of the autonomous vehicle 12 relative to the tags 14, and relative to the defined area of operation. Alternatively, as clarified above, pose may be determined using distance but without detecting angle.
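As a hedged illustration of combining distance and angle into a 2D pose (one possible formulation, not the patent's own algorithm), the sketch below intersects the two range circles and uses the measured bearings to select the correct intersection and recover the heading; all names and the angle convention (radians, counter-clockwise) are assumptions.

```python
# Sketch: 2D pose (x, y, heading) from measured ranges and vehicle-relative
# bearings to two tags at known positions.  The two range circles intersect
# at two candidate positions; the bearing to the second tag picks the right
# one, and the bearing to the first tag then fixes the heading.
import math

def circle_intersections(p1, r1, p2, r2):
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    d = math.hypot(dx, dy)
    a = (r1**2 - r2**2 + d**2) / (2 * d)
    h = math.sqrt(max(r1**2 - a**2, 0.0))
    mx, my = p1[0] + a * dx / d, p1[1] + a * dy / d
    return [(mx + h * dy / d, my - h * dx / d),
            (mx - h * dy / d, my + h * dx / d)]

def pose_2d(p1, r1, b1, p2, r2, b2):
    """p1, p2: known tag positions; r*: ranges; b*: bearings in vehicle frame."""
    best = None
    for (x, y) in circle_intersections(p1, r1, p2, r2):
        heading = math.atan2(p1[1] - y, p1[0] - x) - b1
        # Predicted bearing of tag 2 for this candidate pose.
        pred_b2 = math.atan2(p2[1] - y, p2[0] - x) - heading
        err = abs(math.atan2(math.sin(pred_b2 - b2), math.cos(pred_b2 - b2)))
        if best is None or err < best[0]:
            best = (err, x, y, heading)
    return best[1], best[2], best[3]
```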
- the processor 22 can control the operation of the autonomous vehicle 12 as a function of the pose (or as a function of any parameters related to or derived from the location and orientation). Examples of controlling the operation include turning a corner, avoiding an obstacle, increasing/decreasing speed, or activating/deactivating some of the functional apparatus 32.
- the autonomous vehicle may include an on-board clock 26.
- On-board clock 26 may keep time internally or in reference to external time signals (such as wireless network signals or global positioning system (GPS) signals), or both.
- the processor 22 in FIG. 1 may be, but need not be, a single discrete component of the autonomous vehicle 12.
- the processor 22 may be a general-purpose processor (configured to perform one or more operations by executable instructions), or a specialized processor, or any combination of processing elements.
- the operations of the processor 22 may be distributed among multiple components.
- various processing functions may be divided among multiple elements (for example, some processing may be performed by supplemental location apparatus 28, as discussed below).
- components such as the clock 26 may be included in the processor 22, and need not be embodied as discrete components.
- memory 24 need not be embodied as a single discrete component.
- memory may include one or more memory elements that are physically separated from the autonomous vehicle 12, with data and instructions conveyed wirelessly (for example) to the autonomous vehicle 12.
- the autonomous vehicle 12 may determine its pose with respect to the defined area of operation by measuring the linear distance from the antenna hub 16 to any number of tags 14, and measuring the angular displacement of the tags.
- pose may be determined using distance but without detecting angle.
- the linear distance from the autonomous vehicle 12 to a tag 14 is a function of the time it takes a signal to travel from the autonomous vehicle 12 to the tag 14; it is also possible to think of the linear distance between the autonomous vehicle 12 and the tag 14 as being a function of the time it takes for a signal to travel from the autonomous vehicle 12 to the tag 14 and the time it takes for a reply signal to travel from the tag 14 to the autonomous vehicle 12.
- Electromagnetic signals travel at the speed of light, so the time it takes for a signal to go from one site to another is a function of the distance between the sites.
- time computations are performed by the autonomous vehicle 12.
- the autonomous vehicle 12 in effect transmits a signal and measures the time it takes to receive a reply from a tag 14. This measured time is a function of the distance from the autonomous vehicle 12 to the tag 14.
- Determination of the angle of a tag 14 relative to the autonomous vehicle 12 may be accomplished by any of several techniques.
- a comparatively uncomplicated technique may involve the antenna hub 16 having two or more antennas, disposed apart from one another by a known or fixed distance.
- a reply signal from a tag 14 may be received by the two antennas at two times, and the difference between the two times is the time difference.
- the relative angle is a function of the time difference.
- the method may additionally comprise measuring a phase and time of arrival of a signal, such as an ultra-wideband signal, transmitted by the tag, for each of the plurality of hub antennae; and determining the differential phase of arrival, differential time of arrival, time angle of arrival and phase angle of arrival for each of the plurality of hub antennae; and determining a location of the tag relative to the plurality of hub antennae using the phase angle of arrival and range of the tag for each of the respective hub antennae.
- determining the location of the tag may comprise determining a three dimensional (or 3D) location of the tag relative to each of the plurality of hub antennae, using the phase angle of arrival and range of the tag for each of the hub antennae.
- three hub antennas may be used in combination as two or three pairs of antenna elements, to determine a 3D location of the tag using the phase angle of arrival and range of the tag for each of the two or three pairs of antenna elements.
- determining the location of the tag may comprise determining an aggregation or average of a plurality of determined locations using the phase angle of arrival and range for each of the two or more respective pairs of hub antennae.
- system 10 may desirably provide for location of each of the tags 14 in the plurality of tags by means of determining an angle of arrival of the inbound signal with respect to the antenna hub 16, which may be combined with a range of tags 14 from the antenna hub 16 to calculate a relative position of each of the plurality of tags 14 with respect to antenna hub 16, such as recited according to aspects of the presently disclosed methods described in further detail below.
- system 10 may be adapted for implementation of embodiments of the present inventive methods according to the disclosure, which provide for using a differential time of arrival of an inbound signal between the first and second hub antennas to determine a differential time angle of arrival, which may desirably be used in combination with a multi-lobe differential phase angle of arrival beam pattern calculated for the phase difference of arrival of the inbound signal between the hub antennas, such as to disambiguate the multi-lobe phase angle of arrival beam pattern, and provide for a desirably more precise disambiguated phase angle of arrival of the inbound signal relative to the first and second hub antennas.
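A sketch of that disambiguation idea, under the usual two-element interferometer model and with illustrative names only: the coarse but unambiguous time-difference angle selects which lobe of the fine but 2π-ambiguous phase-difference angle is the true one. Antenna spacing wider than half a carrier wavelength is assumed.

```python
# Sketch: disambiguating a phase angle of arrival with a time angle of arrival.
import math

C = 299_792_458.0

def disambiguated_aoa(delta_phi, delta_t, baseline_m, wavelength_m):
    """Angle (radians) off broadside of a two-antenna baseline.

    delta_phi : measured carrier phase difference, radians (ambiguous mod 2*pi).
    delta_t   : measured arrival-time difference, seconds (coarse, unambiguous).
    """
    sin_coarse = max(-1.0, min(1.0, C * delta_t / baseline_m))
    # Enumerate every phase lobe consistent with |sin(theta)| <= 1.
    candidates = []
    k_max = int(baseline_m / wavelength_m) + 1
    for k in range(-k_max, k_max + 1):
        s = wavelength_m * (delta_phi / (2 * math.pi) + k) / baseline_m
        if -1.0 <= s <= 1.0:
            candidates.append(s)
    # Keep the fine-phase solution closest to the coarse time solution.
    sin_fine = min(candidates, key=lambda s: abs(s - sin_coarse))
    return math.asin(sin_fine)
```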
- system 10 may desirably provide for improved accuracy and precision for locating the position of tags 14 relative to the first and second hub antennas, than may be provided using time of arrival methods alone.
- system 10 may desirably provide for use of an antenna hub 16 having a plurality of sparsely spaced hub antennas which may be widely spaced relative to the wavelength of the UWB carrier wave signal such as to provide for greater position determination accuracy for a particular precision of time and/or phase differential measurement at the first and second hub antennas.
- the antenna hub 16 may optionally also be configured to transmit an outbound signal for reception by the tags 14.
- outbound signal may be used as a polling signal such as to initiate a response by tags 14 by transmission of inbound signal, for example.
- the outbound signal may be used in connection with the inbound signal to provide for a round trip time of flight measurement for determining a range of tags 14 relative to the antenna hub 16, for example.
- the outbound signal may be used in conjunction with the inbound signal and/or optionally also with calibration signal 30 to allow for synchronization of time measurements or to account for clock drift between tags 14 and the antenna hub 16, or to measure and/or calculate error or calibration data such as interference, reflection, multipath, distortion, attenuation or other factors involving the transmission of UWB signals by system 10.
- the system can be employed to passively track a movable object within a defined area of operation.
- the antenna hub can be affixed to the movable object, such as a person or animal, and the processor can be simply employed to determine the pose of the movable object relative to the plurality of tags.
- Supplemental location apparatus 28 may include any of several kinds of location apparatus that may be used in the event that the antenna hub-tag system (or the distance-and-angle technique) may be inadequate for brief or extended periods of time.
- An example of a supplemental location apparatus 28 may be, for example, a GPS receiver, such as a conventional GPS receiver or a real-time kinematic (RTK) receiver.
- supplemental location apparatus 28 may include one or more inertial sensors, or an echolocation apparatus (such as radar or sonar), or a compass, or an odometer, or a gyroscope, or a wheel sensor/encoder, or a visual sighting system, or a remote-operator-assisted piloting system, or an altimeter. Some kinds of supplemental location apparatus 28 may be useful for determining location but not orientation, some may be useful for determining orientation but not location, and some may be useful for determining both. In particular, if using an accelerometer to determine pitch and roll, it is possible to use only two tags to obtain a 3D pose.
- the supplemental location apparatus 28 may generate one or more signals as a function of the thing being detected, which in turn is a function of the pose of the autonomous vehicle 12.
- the processor 22 may determine the pose of the autonomous vehicle 12 (in or outside the defined area of operation) as a function of the signal generated by the supplemental location apparatus 28.
- the antenna hub 16 may lose contact with one or more tags 14 or may fail to receive signals 20 from one or more tags 14. Loss of contact may be due to any number of reasons, such as an object that happens to be interposed between the antenna hub 16 and one or more tags 14 (interfering with line-of-sight or interfering with signals between the antenna hub 16 and one or more tags 14), or damage to a tag 14, or interference from a weather condition, or breakdown or malfunction of the antenna hub 16. Conditions such as any of these may result in outages of the distance-and-angle system. The outages may be momentary, or brief, or extended.
- Supplemental location apparatus 28 may be used during an outage for purposes of pose correction, or for emergency operation, or for bringing the autonomous vehicle 12 to a safe stop, or returning the autonomous vehicle 12 to a home location, or guiding the autonomous vehicle 12 away from a hazard, or changing the operating mode of the vehicle from autonomous to user-controlled, for example. Supplemental location apparatus 28 may also be used when there is no outage.
- the autonomous vehicle 12 may use previous data and computations to move about when (for example) contact with all tags (or all but one tag) is lost.
- the processor 22, having previously computed the position and heading and having information from devices such as a compass, a wheel sensor/encoder, or a vertical gyro, may extrapolate position and heading. If contact with the tags is re-established within a reasonable time, the processor 22 may correct for errors (if any) and the autonomous vehicle 12 may proceed as before. If contact with the tags is not re-established within a reasonable time, the autonomous vehicle 12 may take some other action, such as shutting down or issuing a distress call. The autonomous vehicle 12 may also call upon supplemental location apparatus 28 for assistance with navigating.
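A minimal dead-reckoning sketch along these lines (illustrative only; units, update rate, and names are assumptions, not taken from the patent):

```python
# Sketch: while tag contact is lost, propagate the last known pose from
# wheel-odometer distance and compass (or gyro) heading.
import math

def dead_reckon(x, y, heading, distance_travelled, new_heading):
    """Advance a 2D pose by a short odometry step.

    distance_travelled : metres since the last update (wheel encoder).
    new_heading        : radians, from a compass or integrated gyro.
    """
    # Use the midpoint heading over the step (wrapped difference).
    mean_heading = heading + 0.5 * math.atan2(
        math.sin(new_heading - heading), math.cos(new_heading - heading))
    x += distance_travelled * math.cos(mean_heading)
    y += distance_travelled * math.sin(mean_heading)
    return x, y, new_heading
```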
- the supplemental location apparatus 28 may have deficiencies of its own. Some supplemental location apparatus 28 may be too costly to operate at all times, or may be susceptible to becoming unreliable in certain environments or bad weather, for example. Even so, if the distance-and-angle techniques develop trouble, the supplemental location apparatus may under some circumstances be able to keep the trouble from becoming worse.
- the autonomous vehicle 12 may include input/output (I/O) devices 30 other than those on the antenna hub 16 or the supplemental location apparatus 28.
- I/O devices 30 may be of any kind; input may be received and output transmitted wirelessly, audibly, visually, haptically, or in any combination thereof, or in other fashions.
- Examples of other I/O devices 30 include a radio receiver, an alarm, a warning light, a keypad, user controls, and an emergency stop switch.
- a defined area of operation may be defined or otherwise established.
- One illustrative technique involves having the tags 14 deployed proximate to the expected defined area of operation, and manually positioning or guiding the autonomous vehicle 12 around the perimeter of the defined area of operation. As the autonomous vehicle 12 moves around the perimeter, the autonomous vehicle 12 notes the position of the tags 14. Once the perimeter is closed, the autonomous vehicle 12 has information about the boundaries of the defined area of operation, and the positions of the tags 14 with respect to the boundaries. From this information, the autonomous vehicle 12 may create a map of the defined area of operation.
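Once the perimeter has been recorded, one simple way to use the resulting map is a point-in-polygon test; the sketch below (illustrative names, 2D coordinates assumed) uses ray casting to decide whether a candidate position lies inside the defined area of operation.

```python
# Sketch: the recorded boundary points form a polygon; a ray-casting test
# then tells the vehicle whether a candidate position lies inside it.
def inside_area(point, perimeter):
    """perimeter: list of (x, y) vertices recorded while driving the boundary."""
    x, y = point
    inside = False
    n = len(perimeter)
    for i in range(n):
        x1, y1 = perimeter[i]
        x2, y2 = perimeter[(i + 1) % n]
        # Does the horizontal ray from (x, y) cross edge (x1, y1)-(x2, y2)?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

print(inside_area((1.0, 1.0), [(0, 0), (4, 0), (4, 3), (0, 3)]))  # True
```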
- Another technique may include manually positioning or guiding the autonomous vehicle 12 to vertices of the defined area of operation.
- a further technique may involve moving the autonomous vehicle 12 proximate to a hazard, and instructing the autonomous vehicle 12 to avoid the hazard. Still a further technique may entail the autonomous vehicle automatically following physical perimeter markers, such as a fence, and regarding the area inside the perimeter markers as the defined area of operation.
- the autonomous vehicle 12 may be instructed to mow this defined area of operation.
- the mowing operations need not be uniform throughout the entire playing area.
- the autonomous vehicle 12 may be instructed to avoid the green and the hazards entirely, for example, and do no mowing operations there.
- the autonomous vehicle 12 may be instructed to mow the grass in the fairway to a shorter length than the grass in the rough, etc.
- FIG. 2 is an illustrative map showing overlapping defined areas of operation on a golf course.
- FIG. 2 shows one (first) playing area 50 on an illustrative golf course, and a neighboring (second) playing area 52 on the same golf course.
- Each playing area may have its own tee area, fairway, rough, green, and hazards; and the layout of these features will be different for every playing area on the golf course.
- An autonomous vehicle 12 that functions as a mower may be used to mow the grass in the various playing areas, while avoiding hazards and cutting the grass to desired lengths at various sites.
- Deployed on the golf course are several tags 14; in FIG. 2, illustrative tags 54, 56, 58, 60, 62, 64, and 66 are shown. Some of the tags may be deployed in trees proximate to the playing areas, others may be deployed on dedicated pedestals, and others may be deployed in other fashions.
- tags 54, 56, 58, 60, 62, and 66 define the first defined area of operation 68, and the first defined area of operation 68 is related to the first playing area 50.
- tags 60, 62, 64, 66, and other tags define the second defined area of operation 70, and the second defined area of operation 70 is related to the second playing area 52.
- the first and second defined areas of operation 68, 70 do not overlap geographically, though the first and second defined areas of operation 68, 70 may share a tag 62.
- an autonomous vehicle 12 that performs functions on the first defined area of operation 68 may be called upon to perform similar functions on the second defined area of operation 70.
- an autonomous vehicle 12 may move autonomously from the first defined area of operation 68 to the second defined area of operation 70, such movement may be aided by creation of a third defined area of operation 72, which geographically overlaps the first defined area of operation 68 and the second defined area of operation 70.
- the boundaries of the third defined area of operation 72 essentially correspond to tags 58, 60, 64, and 66, and tag 62 is positioned well within and away from the perimeter of the third defined area of operation 72.
- the autonomous vehicle 12 may autonomously terminate its functions in the first defined area of operation 68 and begin its functions in the third defined area of operation 72.
- the autonomous vehicle 12 may finish its work in the first defined area of operation 68, and then move to that part of the first defined area of operation 68 that overlaps the third defined area of operation 72.
- the autonomous vehicle 12 may then begin its mowing operations in the area between the first defined area of operation 68 and the second defined area of operation 70, such as mowing the grass in the region 74 between the playing areas 50, 52.
- the autonomous vehicle 12 may move directly to the second defined area of operation 70, by moving to that part of the third defined area of operation 72 that overlaps the second defined area of operation 70.
- the autonomous vehicle 12 may finish its work in the third defined area of operation 72 (which may comprise mowing or simply moving from one defined area of operation to another), moving to that part of the third defined area of operation 72 that overlaps the second defined area of operation 70.
- the autonomous vehicle 12 may then begin its mowing operations in the second defined area of operation 70.
- the pose determination system and method can be employed in an indoor environment.
- accuracy and precision of the pose determination system can be increased by synchronizing the plurality of tags by employing a single high stability oscillator.
- Using a single oscillator can also improve navigation performance, since different tags normally would have different oscillator drifts. It would also be possible to use open standards such as IEEE 1588 PTPv2 or Synchronous Ethernet to disseminate phase and frequency.
- the operation of the autonomous vehicle in a defined area of operation may simplify the programming of the autonomous vehicle, in that the autonomous vehicle can be programmed to deal with conditions, hazards, obstacles, and other various eventualities that affect the defined area of operation, rather than the much wider range of eventualities that may affect a broader geographical area.
- the tags described herein can serve as reference nodes that require little or no external power, and may include little or no infrastructure to communicate with other tags or with any other network.
- a system that uses UWB can be expected to be reliable at low power and is adaptable to a range of terrains and environments and weather conditions. For some terrains and environments, UWB coupled with GPS may provide additional reliability and adaptability.
- embodiments may include vehicles that are not autonomous.
- embodiments may include objects that are not conventionally thought of as vehicles, such as living things.
- embodiments may be made applicable to a virtual world as well as to a real world.
- a pose of an object in a virtual world may be determined with respect to a virtual defined area of operation or virtual reference nodes.
- Virtual world applications may include various forms of gaming in a virtual world.
- processing power can be concentrated in the autonomous vehicle, which may have fewer power constraints than the tags.
- Security against theft and other mischief can be concentrated in the autonomous vehicle.
- the autonomous vehicle may be locked up when not in use, but it may not be a practical necessity to lock up the tags, which may remain deployed near the defined area of operation.
- Embodiments may be applied to run-of-the-mill activities as well as unusual activities. Applications may be civilian as well as military. Uses may be practical as well as artistic (for example, in the illustration in which the autonomous vehicle is a mower, the autonomous vehicle may be programmed to cut grass to produce a pleasing design).
- various functions and components may be implemented in hardware, software, firmware, middleware or a combination thereof and utilized in systems, subsystems, components or subcomponents thereof.
- elements thereof may be instructions and/or code segments to perform the necessary tasks.
- the program or code segments may be stored in a machine-readable medium, such as a processor-readable medium or a computer program product, or transmitted by a computer data signal embodied in a carrier wave, or a signal modulated by a carrier, over a transmission medium or communication link.
- the machine readable medium or processor readable medium may include any medium that can store or transfer information in a form readable and executable by a machine, for example a processor, computer, etc.
- An embodiment may relate to a computer storage product with a computer-readable medium having computer code thereon for performing various computer-implemented operations.
- the computer-readable media and computer code may be those specially designed and constructed for the purposes of the disclosed embodiments, or they may be of the kind well known and available to those having skill in the computer software arts.
- Examples of computer-readable media include, but are not limited to: ROM and RAM devices, including Flash RAM memory storage cards, sticks and chips; magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and holographic devices; magneto-optical media such as floptical disks; and hardware devices that are specially configured to store and execute program code, such as application specific integrated circuits ("ASICs") and programmable logic devices ("PLDs"), for example.
- Examples of computer code include machine code, such as produced by a compiler, and files containing higher-level code that are executed by a computer using an interpreter. For example, an embodiment may be implemented using any suitable scripting, markup and/or programming languages and development tools. Another embodiment may be implemented in hardwired circuitry in place of, or in combination with, machine- executable software instructions.
Abstract
Systems and methods for determination of pose of an autonomous vehicle with respect to a defined area of operation are described. Reference nodes or tags are deployed at known positions proximate to the defined area of operation. The autonomous vehicle can detect the relative position of typically two or more reference nodes, and can determine the pose of the autonomous vehicle with respect to the defined area of operation.
Description
POSE DETERMINATION SYSTEM AND METHOD
RELATED APPLICATION INFORMATION
The present application claims priority to US provisional application serial number 62/567,523 filed October 3, 2017, the disclosure of which is incorporated herein by reference in its entirety.
FIELD
Embodiments described herein relate to location and orientation monitoring systems and methods. Embodiments described herein further relate to autonomous vehicles and control systems.
BACKGROUND
Determining location and orientation of an object may be important in a number of circumstances. One circumstance where the determinations are important, but not the only one, is with an autonomous vehicle.
Autonomous vehicles are being developed and marketed for a variety of civilian and military applications. One of the difficulties faced by designers of autonomous vehicles is the challenge of making an autonomous vehicle aware of its location, orientation, and surroundings, such that the autonomous vehicle may operate safely and effectively.
SUMMARY OF EMBODIMENTS OF THE INVENTION
Generally, embodiments of the present invention relate to a system and method for determining a pose for a movable object. In an embodiment, the movable object can be an autonomous vehicle system.
In a broad aspect, an autonomous vehicle system can have a vehicle with a functional apparatus comprising a mobility apparatus to move the vehicle within an
area of operation, a pose-detecting apparatus having a plurality of antennas to wirelessly send an outbound signal and receive inbound signals from a plurality of tags being fixed relative to the area of operation, a supplemental location apparatus to generate a location-related signal; and a controller for determining a pose of the vehicle relative to the tags based on information about the outbound signal and the inbound signals, or the location-related signal, and for controlling the mobility apparatus based on the pose to keep the vehicle within the area of operation.
In another broad aspect of the invention, a method of autonomously operating a vehicle can comprise wirelessly sending an outbound signal from at least one of a plurality of antennas of the vehicle, wirelessly receiving at the plurality of antennas inbound signals emitted by a plurality of tags fixed relative to an area of operation, the inbound signals emitted in response to the plurality of tags receiving the outbound signal, generating at a controller of the vehicle an indication of whether a pose of the vehicle relative to the tags can be determined based on information about the outbound signal and the inbound signals, and determining at the controller a pose of the vehicle relative to the tags: if the indication is positive, determining the pose based on the information about the outbound signal and the inbound signals, and if the indication is negative, obtaining a location-related signal at a supplemental location apparatus of the vehicle and determining the pose based on the location-related signal; and controlling the vehicle based on the pose to keep the vehicle within the area of operation.
In another broad aspect of the invention, a system for determining three dimensional location and orientation of a movable object in an area of operation can comprise at least first, second and third tags configured at respective predetermined non-collinear locations in or near the area of operation to define directly or indirectly a reference coordinate system, each tag having a tag antenna providing respective tag signals, and a pose determination system adapted to be integrated with or coupled to the object so as to be movable therewith, the pose determination system comprising an antenna hub having at least first, second and third hub antennas arranged in a non-collinear configuration to define an object coordinate system and to receive the tag signals, and a controller coupled to the antenna hub, the controller having one or more
processors which execute stored instructions to determine the distance of each of the first, second and third hub antennas to each of the first, second and third tags employing timing of receipt of the tag signals, determine the location of each of the first, second and third hub antennas relative to the reference coordinate system using the antenna to tag distances and predetermined tag locations, and determine the orientation of the object coordinate system relative to the reference coordinate system using the antenna locations.
In another broad aspect of the invention, a pose determination system adapted to be integrated with or coupled to an object so as to be movable therewith comprises an antenna hub having at least first, second and third hub antennas arranged in a non-collinear configuration to define an object coordinate system and to receive tag signals from first, second and third tags, and a controller coupled to the antenna hub. The controller can have one or more processors which execute stored instructions to determine the distance of each of the first, second and third hub antennas to each of the first, second and third tags employing timing of receipt of the tag signals, determine the location of each of the first, second and third hub antennas relative to the reference coordinate system using the antenna to tag distances and predetermined tag locations, and determine the orientation of the object coordinate system relative to the reference coordinate system using the antenna locations.
In another broad aspect of the invention, a method for determining three dimensional location and orientation of a movable object in an area of operation can comprise providing first, second and third tag signals respectively from each of first, second and third tags configured at predetermined non-collinear locations in or near the area of operation to define directly or indirectly a reference coordinate system, receiving, at an antenna hub having at least first, second and third hub antennas arranged in a non-collinear configuration to define an object coordinate system, the tag signals from the first, second and third tags, determining at one or more programmed processors the distance of each of the first, second and third hub antennas to each of the first, second and third tags employing timing of receipt of the tag signals, determining at one or more programmed processors the location of each of the first, second and third hub antennas relative to the reference coordinate system using the
antenna to tag distances and predetermined tag locations, and determining at one or more programmed processors the orientation of the object coordinate system relative to the reference coordinate system using the antenna locations.
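As a hedged sketch of this three-step recipe (one possible realization, not the patent's own implementation), the code below trilaterates each hub antenna in the reference frame from its three measured ranges and then fits the rigid transform from the object frame to the reference frame. The numpy/scipy usage and all names are assumptions; note that with only three tags the trilateration has a mirror ambiguity about the tag plane, which the initial guess must resolve.

```python
# Sketch: (1) trilaterate each hub antenna in the tag (reference) frame
# from its three measured ranges; (2) fit the rotation and translation
# that map the antennas' known object-frame positions onto those points.
import numpy as np
from scipy.optimize import least_squares

def trilaterate(tag_positions, ranges, guess):
    """tag_positions: (3, 3) array of tag coordinates; ranges: (3,) distances."""
    res = lambda p: np.linalg.norm(tag_positions - p, axis=1) - ranges
    return least_squares(res, guess).x

def fit_pose(antennas_object, antennas_reference):
    """Rigid transform (R, t) with antennas_reference ~= R @ antennas_object + t."""
    a0 = antennas_object - antennas_object.mean(axis=0)
    b0 = antennas_reference - antennas_reference.mean(axis=0)
    u, _, vt = np.linalg.svd(a0.T @ b0)            # Kabsch-style fit
    d = np.sign(np.linalg.det(u @ vt))             # guard against reflections
    r = (u @ np.diag([1, 1, d]) @ vt).T
    t = antennas_reference.mean(axis=0) - r @ antennas_object.mean(axis=0)
    return r, t

# Usage (illustrative): trilaterate each of the three hub antennas, then
#   R, t = fit_pose(antennas_in_vehicle_frame, np.array(trilaterated_points))
```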
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic diagram of an illustrative autonomous vehicle location system; and
FIG. 2 is an illustrative map showing overlapping defined areas of operation.
DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION
As discussed herein, "pose" is used in the sense of a particular way of a thing being positioned and oriented. Generally speaking, a "pose" of a thing is determined with respect to one or more references, such as the Earth or a landmark or a marker or a tag. A pose may be determined in a particular coordinate system defined usually relative to one of these landmarks or objects or other references. A "pose" of an object includes the location of the object with respect to one or more references, and also includes a heading (in two dimensions, 2D) or orientation (more generally) with respect to one or more references. In aerospace, a representative 3D application, the full orientation comprises heading, attitude, and bank angles. In other applications the terminology yaw, pitch and roll, and/or Euler angles, may be employed for 3D orientation. A detailed discussion of 3D orientation, including terminology and mathematical descriptions, may be found in the following publications, the disclosures of which are incorporated herein by reference in their entirety: Representing Attitude: Euler Angles, Unit Quaternions, and Rotation Vectors, James Diebel, Stanford University, 20 October 2006, https://www.astro.rug.nl/software/kapteyn/downloads/attitude.pdf; Determination of a Position in Three Dimensions Using Trilateration and Approximate Distances, Willy Hereman and William S. Murphy, Jr., Colorado School of Mines, October 1995, https://inside.mines.edu/~whereman/papers/Murphy-Hereman-Trilateration-MCS-07-1995.pdf.
The pose of an object may be determined by direct measurement or detection of location and orientation, but this is not the only way to determine an object's pose. Other quantities may be measured or detected that may be indicative of pose, or location or orientation separately. Preferred but non-limiting examples of pose determination are discussed in detail below. As alternative examples, an object's velocity may indicate its direction of movement and its position and its orientation. Similarly, acceleration and change in direction of an object may indicate the pose of the object. As discussed herein, a pose of an object may be determined directly from location and orientation or indirectly from one or more related or derivative parameters. Typically, specific quantities indicative of location and orientation may be computed or otherwise determined, though this is not necessary in all cases.
As will be discussed below, pose is of considerable significance for the operation of an autonomous vehicle. The various embodiments described below include an autonomous vehicle but are not, however, limited to vehicles, nor to an autonomous vehicle.
An autonomous vehicle is a machine that moves from place to place without human control or intervention. In some cases, an autonomous vehicle may convey a human from place to place, while in other cases, the autonomous vehicle may be unable to convey a human. In some cases, an autonomous vehicle may be under human control for part of the journey, and in other cases, the autonomous vehicle goes from place to place independent of human control. Autonomous vehicles may be of any size: ships at sea, vessels in space, motorized ground transportation, drones, some weapons or military hardware, carts, trains, automobiles, robotic home vacuum cleaners, and other robotic conveyances can be vehicles that may be, to a degree, autonomous.
Not all autonomous vehicles operate in the same way or under the same conditions. Some autonomous vehicles work in a defined area of operation, and some do not. In general, a defined area of operation bounds or limits the geographical region in which the autonomous vehicle operates. The boundary of the geographical region constrains where the autonomous vehicle can go. Outside the boundary, the
autonomous vehicle may not move as it should, or may not move at all; or it may be undesirable for the autonomous vehicle to operate anywhere except for the defined area of operation. In a typical case, the boundaries of the defined area of operation are in close geographic proximity such that an autonomous vehicle operating therein would have line-of-sight view of markers designating the boundaries.
Many of the autonomous vehicles mentioned previously do not operate in defined areas of operation.
The embodiments described herein apply to objects, such as autonomous vehicles, that operate within a defined area of operation. The defined area of operation may be, for example, an outdoor plot of land such as a field, or a factory floor, or a parking lot, or a swimming pool, or a small lake, or a short section of roadway. The defined area of operation may itself be stationary with respect to the planet, but this is not necessary in all implementations. For example, a defined area of operation may exist on the deck of a ship at sea, such that the defined area of operation may be stationary with respect to the ship but moving with respect to the planet. In embodiments, the system and method can be employed to passively track a moving object, such as a person.
To operate within a defined area of operation, an autonomous vehicle ordinarily has to be able to sense or otherwise determine its pose, which may be thought of generally as where the autonomous vehicle is located with respect to the defined area of operation or a landmark proximate to (in, on the border of, or near) the defined area of operation, and the direction in which the autonomous vehicle is oriented or aimed or going with respect to the defined area of operation. It may also be helpful to sense or calculate related or derivative parameters, such as speed of travel, acceleration, change in direction, and future position.
The embodiments discussed below include positioning of at least two reference nodes, having wireless transmission capability, which may be called "tags," at some known positions proximate to the defined area of operation. If the vehicle operates on a flat surface, and only 'heading' is required, then two tags are sufficient; we will refer to this as a 2D pose. In the general case with non-flat topography, where 3D position, as well as yaw, pitch, and roll angles are sought, a plurality, such as three
or more tags will be required. We shall refer to this as the 3D pose. These reference nodes define the area of operation for the autonomous vehicle, and the autonomous vehicle typically stores data relating which nodes are related to which defined area of operation and how the tags are deployed with respect to the defined area of operation. The tags may be, for example, on the perimeter or border of the defined area of operation, or in the defined area of operation, or near to the defined area of operation, or any combination thereof. Whether a tag is "near" a defined area of operation may be a function of the communication range of the tag. The tags will be essentially stationary with respect to the defined area of operation. In a typical embodiment, the tags will also be comparatively low in functionality, that is, having specialized functions and less versatility than (for example) a general-purpose processor or a cellular telephone. Low functionality may have a number of potential benefits. For example, a tag with low functionality may require little or no external power to operate. As will be mentioned below in connection with an illustrative embodiment, power to the tags may be supplied by solar cells on the tags. There may be embodiments in which a tag is entirely passive, operating on the power received from the autonomous vehicle.
A typical embodiment, however, may include a tag that is active (partially or fully active), that is, powered by some source other than or in addition to the power from the autonomous vehicle. A tag that is active may have an effective range of transmission that exceeds that of a passive tag. An active tag may also support more kinds of signals (in terms of modulation, frequency, information, power, timing, among other things) than a passive tag. A tag with low functionality may also be less costly to produce, which may be advantageous in that some implementations may entail multiple tags. Low functionality may also affect the value of the tags, making them less attractive to thieves or other mischief-makers. Low functionality may also imply that there are fewer ways in which the tags can malfunction or go wrong (and it may be economical to replace a malfunctioning tag).
As previously mentioned, the tags' locations do need to be known with respect to a uniform reference frame. They can either be surveyed (e.g., with respect to the World Geodetic System 1984 or WGS84 datum), or the vehicle could survey them in recursively as follows. Since the vectors from tag0, tag1 and tag2 to the vehicle frame can be computed, we can without loss of generality set the position of tag0 to be (0, 0, 0); then the positions of tag1 and tag2 relative to tag0 can be recorded. The vehicle can then be moved to a new location where tag1, tag2 and tag3 can all be seen by the hub sensors, and the position of tag3 relative to the previously computed tag1 and tag2 positions can also be computed, and so on, where the position of tag(i+1) is computed relative to the previously computed positions of tag(i) and tag(i-1), and then the vehicle is re-positioned anew. Ideally, these measurements could be stored and post-processed as a batch job to obtain an optimal set of positions in the least-squares sense.
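By way of a non-limiting illustration, the recursive self-survey described above might be sketched as follows in Python; the `align_stop` helper (which recovers the rigid transform between a new vehicle stop and the already-surveyed tags) and the data layout are assumptions introduced purely for illustration, and a batch least-squares refinement could be run over all stops afterwards, as noted in the text.

```python
import numpy as np

def self_survey(stops, align_stop):
    """Recursive tag self-survey (a sketch of the procedure described above).

    stops: list of dicts {tag_id: vector-from-vehicle-to-tag, in the vehicle
           frame} collected at successive vehicle positions; the first stop
           must observe tags 0, 1 and 2.
    align_stop: hypothetical helper that, given already-surveyed tag positions
           and one stop's vehicle-frame vectors, returns the rigid transform
           (R, t) mapping that stop's vehicle frame into the survey frame.
    Returns a dict tag_id -> surveyed position, with tag 0 at the origin.
    """
    positions = {0: np.zeros(3)}
    # First stop: express tags 1 and 2 relative to tag 0.
    first = stops[0]
    for tid in (1, 2):
        positions[tid] = first[tid] - first[0]
    # Later stops: align against tags already surveyed, then add new tags.
    for stop in stops[1:]:
        R, t = align_stop(positions, stop)
        for tid, vec in stop.items():
            if tid not in positions:
                positions[tid] = R @ vec + t
    return positions
```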
The tags need not be, and ordinarily are not, a part of the autonomous vehicle itself.
Rather, the autonomous vehicle is equipped with a pose-detecting apparatus that can measure, estimate, sense, ascertain, or otherwise detect the relative position of a tag, typically the position of the tag in relation to the vehicle's coordinate system. In a typical embodiment, the pose-detecting apparatus can detect the distance to two or more tags (or three or more tags for 3D pose), and the relative angle (measured with respect to a reference) of each tag. Details of an implementation of range and angle detection using tags are provided in International Patent Application PCT/CA2016/051309, filed November 10, 2016, publication no. WO2017/079839, the disclosure of which is incorporated herein by reference in its entirety. A typical implementation may treat the heading of the autonomous vehicle as a reference, with angles measured relative to the heading. So, for example, an object located at an angle of zero degrees (zero radians) would be straight ahead, and an object located at an angle of 90 degrees (π/2 radians) would be directly to the right (assuming the convention that 90 degrees is to the right and 270 degrees is to the left; the opposite convention also may be applied), and so on. The zero heading can be referenced to a feature of the vehicle (e.g. a direction perpendicular to its steering axle) or, without loss of generality, could be the normal to the hub's three sensor antennae. In one embodiment, by detecting (and tracking over time) the distances and angles, the pose-detecting apparatus can determine the location and orientation of the pose-detecting apparatus (and hence the pose of the autonomous vehicle) with respect to the tags, as well as
information related to or derivable therefrom (such as the direction in which the pose- detecting apparatus may be traveling, speed of travel, acceleration, and change in direction). Alternatively, in place of angle detection and as discussed in detail below, three tags may be employed along with tag range information to derive full 3D orientation (yaw, pitch and roll) at one time. Therefore, to clarify, given three or more non-collinear tags, it is possible to determine the 3D pose of the vehicle without it moving.
Other techniques for determining the pose of the autonomous vehicle with respect to the defined area of operation (including more sophisticated forms of triangulation) may also be employed. As noted above, range information to three tags may be employed to derive 3D pose in a preferred embodiment. For purposes of discussion, however, first the use of distance and angle will be described. Distance-and-angle-based operation is often accurate enough for typical implementations, and may have benefits of low cost/overhead, simplicity, easy set-up, easy maintenance, easy troubleshooting/repair, and ready adaptability, among other things.
For distance-and-angle-based operation, detecting the distances and relative angles to two or more tags (rather than a single tag) is a matter of prudence; and as a practical matter, three or more tags are often useful. Detection of two or more tags (or three or more tags for 3D implementations) not only may improve accuracy and precision, it avoids situations in which a single-tag system will fail. For example, an autonomous vehicle, traveling in a circular path with the tag located at the center of the circle, will read that the distance to the tag is constant, and that the angle to the tag is constant (assuming the tag does not convey any angle information to the autonomous vehicle). In other words, even though the autonomous vehicle is changing position relative to the tag and the defined area of operation, the distance and angle measurements give incomplete information as to how the autonomous vehicle is moving, or in which direction it is moving, or how fast it is going, or where it is within the defined area of operation. When the autonomous vehicle detects distance and angle to a second tag at a different location, however, this failure can be avoided; the autonomous vehicle mathematically can determine its own location and heading. Other circumstances may exist in which detection of a single tag will result in an inaccurate or
ambiguous determination of location or orientation or both. Generally speaking, detecting distance and angle to two (or more) tags can reduce or eliminate error or ambiguity in pose and information related to or derivable therefrom. Also generally speaking, three tags are more useful than two, and four are more useful than three. In some cases (such as where the defined area of operation has obstacles or hills or places where tags can be blocked, or where the autonomous vehicle may change altitude), more tags may be helpful.
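As a non-limiting sketch of why a second tag resolves this ambiguity, the following Python fragment recovers a 2D pose (position and heading) from the distances and relative angles to two tags at known positions; the frame conventions (x forward, relative angles positive to the right) are assumptions chosen to match the discussion above, and all names are illustrative.

```python
import numpy as np

def pose_2d_from_two_tags(p1, p2, d1, a1, d2, a2):
    """2D pose (position, heading) from distance and relative angle to two tags.

    p1, p2: known world positions of the two tags (arrays of shape (2,)).
    d1, d2: measured distances to the tags.
    a1, a2: measured relative angles in radians, 0 = straight ahead,
            positive to the right (the convention noted above).
    Returns (position, heading), heading measured counter-clockwise from the
    world x-axis.
    """
    # Tag positions expressed in the vehicle frame (x forward, y left).
    t1 = np.array([d1 * np.cos(a1), -d1 * np.sin(a1)])
    t2 = np.array([d2 * np.cos(a2), -d2 * np.sin(a2)])
    # Heading aligns the vehicle-frame baseline with the world-frame baseline.
    u, w = t2 - t1, p2 - p1
    heading = np.arctan2(w[1], w[0]) - np.arctan2(u[1], u[0])
    c, s = np.cos(heading), np.sin(heading)
    R = np.array([[c, -s], [s, c]])
    position = p1 - R @ t1          # world position of the vehicle
    return position, heading
```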
FIG. 1 is a schematic diagram of a typical autonomous vehicle pose-detecting system 10. The system 10 includes an autonomous vehicle 12, which is a machine that typically includes mechanical and electronic components. The system also includes two tags 14A and 14B that serve as reference nodes. Although two tags are shown in FIG. 1, any number of tags may be employed. (A generic tag may be identified by reference numeral 14.)
The autonomous vehicle 12 may be a vehicle of any kind. For purposes of illustration, a typical autonomous vehicle 12 will be described as a mower that includes apparatus to mow an outdoor field. The outdoor field may be thought of as the defined area of operation of the mower. It may be undesirable for the autonomous vehicle 12 to operate autonomously outside the boundaries of the outdoor field. Other examples of an autonomous vehicle 12, by no means the only examples, include agricultural equipment, irrigation equipment, cleaning equipment, moving/conveying equipment, and delivery equipment. Although ground-based autonomous vehicles will be discussed, alternative embodiments include water-based autonomous vehicles (including those that float or submerge) and air-based autonomous vehicles (such as low-flying drones).
The autonomous vehicle 12 may employ any form of propulsion (such as petroleum-powered, electric, wind-propelled) and may be of any configuration or size. The autonomous vehicle 12 may be configured to convey one or more human beings, or not. The autonomous vehicle 12 may include one or more pieces of functional apparatus 32 according to its general purposes; in the case of a mower, for example, the functional apparatus 32 may include specialized equipment for mowing. The functional apparatus 32 includes one or more kinds of mobility apparatus 34, which
convey the autonomous vehicle 12 from place to place (often within the defined area of operation, but the mobility apparatus 34 may convey the autonomous vehicle 12 from place to place outside the defined area of operation as well). Mobility apparatus 34 may also include apparatus that steers the autonomous vehicle 12, that governs the speed of the autonomous vehicle 12, that brakes the autonomous vehicle 12, or other components that make the autonomous vehicle 12 function as a vehicle. Mobility apparatus 34 may include various things such as one or more wheels, propellers, motors, fuel supplies, batteries or other power-related components, rudders, and so forth.
The functional apparatus 32 may also include any of various safety systems, such as systems to suspend operations or shut down in case of malfunction or when any of several hazardous conditions may be present.
The tags 14 may be deployed at any known positions inside the defined area of operation, or on the perimeter or border of the defined area of operation, or proximate to the defined area of operation. The tags 14 may be mounted upon dedicated pedestals, i.e. , supporting structures that hold the tags 14 in fixed positions relative to the defined area of operation (and that may have other functionality); the tags 14 may be mounted upon already-existing structures in fixed positions relative to the defined area of operation (such as fence posts, streetlights, buildings, trees, and so on); or any combination thereof. The tags 14 may be deployed at any height above the ground; for some installations, for example, one meter above the ground might be a typical height for all tags 14, while for another installation, some tags 14 may be positioned higher above the ground while others are positioned lower.
The autonomous vehicle 12 includes a pose-detecting apparatus, which includes an antenna hub 16. The antenna hub 16, described in more detail below, wirelessly transmits a signal 18 (the signal being an electromagnetic signal transmitted wirelessly), which is received by the tags 14A, 14B; and the tags 14A, 14B generate a return signal in response to the signal 18. In FIG. 1 , tags 14A, 14B receive the same single signal from the antenna hub 16 and respond to this single signal 18; in some embodiments, the antenna hub 16 may generate multiple signals of different kinds, and the tags 14A, 14B may respond to different signals.
The antenna hub 16 receives and detects the return signals 20A, 20B from the tags 14A, 14B. The autonomous vehicle 12 includes a processor 22 (or more generally a "controller" which may include plural processors with dedicated functions) that receives as input the return signals 20A, 20B or signals from the antenna hub 16 that are functions of the received return signals 20A, 20B. As a function of this input, the processor computes, infers, calculates, measures, or otherwise determines the pose of the autonomous vehicle 12 with respect to the tags 14, and with respect to the defined area of operation. As used herein, a first thing (such as an output) is computed or otherwise determined "as a function of" a second thing (such as an input), when the first thing is directly or indirectly dependent upon the second thing; the first thing may be, but need not be, dependent exclusively upon the second thing.
The antenna hub 16 may be in any of several configurations, and can comprise a plurality of hub antennas. The plurality of hub antennas can be operatively connected to any suitable receiver/transceiver radio, including an ultra-wideband (UWB) receiver/transceiver radio, such as an integrated UWB radio system like the DW1000 available from DecaWave of Dublin, Ireland, for example. One illustrative configuration includes three omnidirectional antennas deployed on the vertices of a triangle, such as an equilateral triangle. The distance from one antenna to another may be known to a good degree of precision. When a signal is received from a tag 14, the signal may be received by a first hub antenna in the antenna hub 16 first, and by a second hub antenna in the antenna hub 16 later, after a tiny but measurable delay. By applying principles of geometry and trigonometry, an angle of the tag 14, with respect to the orientation of the autonomous vehicle 12 or the antenna hub 16, can be computed. (Other parameters of interest may be computed or otherwise determined as well.)
Distance of the tag 14, with respect to the orientation of the autonomous vehicle 12 may be computed on the basis of the received signals in a number of ways. One way involves each tag 14 in the system 10 transmitting its response 20 in a manner to reduce interference among responses from several tags.
Tag 14B shows illustrative components of a tag 14. An antenna 36 may detect or receive electromagnetic signals 18 from the autonomous vehicle 12, and transmit electromagnetic signals 20 to the autonomous vehicle 12. A processor 38 may
process the electromagnetic signals 18 detected or received by the antenna 36, and may record the time that the signals 18 were received according to an on-board clock 40. (Various tags 14 in the system 10 may synchronize their clocks 40 with one another, but this is not necessary.) Any data, such as the time a signal 18 was received or information about the tag 14B itself, may be stored in memory 42. Although the tag processor 38 may be of any type, there may be practical advantages for the processor 38 to have limited capability or low functionality, as mentioned previously.
In some embodiments, the tag 14B may have a power supply 44, which may include one or more power sources such as a battery, a solar power array, connection to an electrical grid, and so forth. The tag 14B may be configured to operate automatically in a variety of power modes, such as operating in a low-power operating state for much of the time, and automatically switching to a high-power operating state after detecting a signal from an autonomous vehicle 12, and automatically switching back to a low-power operating state after responding to the signal from the autonomous vehicle 12. Operating much of the time in a low-power operating state conserves power for times when more power is useful.
The distance of the antenna hub 16 to a tag 14 is a function of the time it takes for an electromagnetic signal 20 (traveling at the speed of light) to travel from the antenna 36 of a tag 14 to the antenna hub 16. There are numerous ways in which this travel time can be measured. In one embodiment, signal 18 transmitted by the antenna hub 16 may be a polling signal. This same polling signal may be broadcast to all tags 14 in range. In response to the polling signal, a tag 14 may (for example) change from a low-power operating state to a high-power operating state, and record the time at which the signal 18 was received. Since, in some embodiments, various tags 14 may have assigned time windows in which to broadcast their signals 20 (the tags 14 possibly operating on the presumption that the antenna hub 16 is somewhere inside the defined area of operation, but the tags 14 not necessarily having information about where the antenna hub 16 is located), each tag 14 may wait until its assigned time window to transmit its response signal 20. The response signal 20 may include an identification of the tag 14 sending the response signal 20, the time at which the signal 18 from the antenna hub 16 was received, as well as the time at which the response signal 20 was
sent (both times according to the on-board clock 40 used by the tag 14). This response signal may be received by the antenna hub 16. Even if the on-board clock 26 of the autonomous vehicle 12 is not perfectly synchronized with the on-board clock 40 of the tag 14, one or more error-correction techniques may be applied to measure the time it took for the signal from the tag 14 to reach the antenna hub 16. In particular, if the hub's three antennae are synced to the same local oscillator, then all single-difference common-mode range errors cancel out. When the time for the signal to travel from the tag 14 to the antenna hub 16 is known, then the distance from the tag 14 to the antenna hub 16 is also known (linear distance traveled by an electromagnetic signal is the travel time multiplied by the speed of light).
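A minimal, non-limiting sketch of the ranging arithmetic described above (single-sided two-way ranging, with variable names introduced purely for illustration) is as follows; because the tag reports its own turnaround time, the two clocks need not be synchronized, although residual clock-drift error can be reduced with the correction techniques mentioned in the text.

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def two_way_range(hub_tx, hub_rx, tag_rx, tag_tx):
    """Estimate tag-to-hub distance from a poll/response exchange.

    hub_tx, hub_rx: poll transmit and response receive times on the hub clock.
    tag_rx, tag_tx: poll receive and response transmit times on the tag clock
                    (reported back to the hub inside the response signal).
    """
    round_trip = hub_rx - hub_tx      # measured on the hub clock
    turnaround = tag_tx - tag_rx      # measured on the tag clock
    time_of_flight = (round_trip - turnaround) / 2.0
    return time_of_flight * SPEED_OF_LIGHT
```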
The electromagnetic signals 18, 20 may be in any format, with any characteristics (frequency, band, modulation, etc.). In a typical implementation, ultra-wideband (UWB) signals may be employed. UWB may have a number of advantages, including a wide spectrum of frequency bands, with some frequencies having less risk of being blocked. UWB signals may carry digital information of almost any kind (including timing information and information identifying the responding tag), they may operate at very low power, and they may be less prone to interfere with other electromagnetic signals in the area. UWB can function in a variety of lighting and weather conditions. One concern about UWB may be its range, which may be less than the range of other electromagnetic communication technologies. Although there is no sharply-defined range for UWB, ordinarily a defined area of operation should be sized and shaped so as not to have any locations in which the autonomous vehicle will be out of range of all of the tags. It may be a criterion for layout of a defined area of operation, in one example, that all locations in the defined area of operation be within 70 meters of at least two tags. In another example, it may be specified that all locations in the defined area of operation be within 50 meters of at least three tags.
Using techniques such as UWB, the autonomous vehicle 12 may compute the distance to one or more tags 14, as well as the angle of each tag relative to the autonomous vehicle 12. With information about distance and angle, the autonomous vehicle 12 may compute the pose of the autonomous vehicle 12 relative to the tags 14, and relative to the defined area of operation. Alternatively, as clarified above, pose may
be determined using distance but without detecting angle.
With the computed pose of the autonomous vehicle 12, the processor 22 can control the operation of the autonomous vehicle 12 as a function of the pose (or as a function of any parameters related to or derived from the location and orientation). Examples of controlling the operation include turning a corner, avoiding an obstacle, increasing/decreasing speed, or activating/deactivating some of the functional apparatus 32.
The autonomous vehicle 12 may include one or more memory elements 24, which may be of any kind. Memory 24 may be volatile or non-volatile or any combination thereof. Memory 24 may store software or instructions that pertain to determining the pose of the autonomous vehicle 12 or how the autonomous vehicle 12 is to carry out its functions (e.g., which particular tags 14 are proximate to or define the defined area of operation, or what path to pursue as the autonomous vehicle 12 moves about the defined area of operation, or what to do at a particular site in the defined area of operation, or what hazards may exist in the defined area of operation). Memory 24 may also store data of any kind, such as a map of the defined area of operation, or a record of places where the autonomous vehicle 12 has been. In a typical implementation, the processor 22 cooperates with the memory 24 to perform any of many kinds of computational and decision-making functions.
The autonomous vehicle may include an on-board clock 26. On-board clock 26 may keep time internally or in reference to external time signals (such as wireless network signals or global positioning system (GPS) signals), or both.
The processor 22 in FIG. 1 may be, but need not be, a single discrete component of the autonomous vehicle 12. In some embodiments, the processor 22 may be a general-purpose processor (configured to perform one or more operations by executable instructions), or a specialized processor, or any combination of processing elements. The operations of the processor 22 may be distributed among multiple components. In some embodiments, various processing functions may be divided among multiple elements (for example, some processing may be performed by supplemental location apparatus 28, as discussed below). In some embodiments, components such as the clock 26 may be included in the processor 22, and need not be
embodied as discrete components.
Similarly, the memory 24 need not be embodied as a single discrete component. In some embodiments, memory may include one or more memory elements that are physically separated from the autonomous vehicle 12, with data and instructions conveyed wirelessly (for example) to the autonomous vehicle 12.
As noted above, the autonomous vehicle 12 may determine its pose with respect to the defined area of operation by measuring the linear distance from the antenna hub 16 to any number of tags 14, and measuring the angular displacement of the tags. Alternatively, as clarified above, pose may be determined using distance but without detecting angle. The linear distance from the autonomous vehicle 12 to a tag 14 is a function of the time it takes a signal to travel from the autonomous vehicle 12 to the tag 14; it is also possible to think of the linear distance between the autonomous vehicle 12 and the tag 14 as being a function of the time it takes for a signal to travel from the autonomous vehicle 12 to the tag 14 and the time it takes for a reply signal to travel from the tag 14 to the autonomous vehicle 12. Electromagnetic signals travel at the speed of light, so the time it takes for a signal to go from one site to another is a function of the distance between the sites.
As a practical matter, and for purposes of illustrative discussion, it will be assumed that time computations are performed by the autonomous vehicle 12. The autonomous vehicle 12 in effect transmits a signal and measures the time it takes to receive a reply from a tag 14. This measured time is a function of the distance from the autonomous vehicle 12 to the tag 14.
Determination of the angle of a tag 14 relative to the autonomous vehicle 12 may be accomplished by any of several techniques. A comparatively uncomplicated technique may involve the antenna hub 16 having two or more antennas, disposed apart from one another by a known or fixed distance. A reply signal from a tag 14 may be received by the two antennas at two times, and the difference between the two times is the time difference. The relative angle is a function of the time difference.
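A non-limiting sketch of this time-difference-of-arrival angle computation, under a far-field assumption and with illustrative parameter names, is:

```python
import numpy as np

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def angle_from_time_difference(delta_t, baseline):
    """Relative angle of a tag from the arrival-time difference at two antennas.

    delta_t: arrival time at antenna B minus arrival time at antenna A (s).
    baseline: known antenna separation (m).
    Uses the far-field approximation (tag range much larger than the baseline).
    The returned angle is measured from the perpendicular to the baseline and
    is ambiguous between the two sides of the baseline, which is one reason a
    third antenna (or the phase techniques discussed below) is useful.
    """
    path_difference = SPEED_OF_LIGHT * delta_t
    return np.arcsin(np.clip(path_difference / baseline, -1.0, 1.0))
```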
In a particular embodiment of the invention, the method may additionally comprise measuring a phase and time of arrival of a signal, such as an ultra-wideband signal, transmitted by the tag for each of the plurality of hub antennae; and determining the
differential phase of arrival, differential time of arrival, time angle of arrival and phase angle of arrival for each of the plurality of hub antennae; and determining a location of the tag relative to the plurality of hub antennae using the phase angle of arrival and range of the tag for each of the respective hub antennae. In one such embodiment, determining the location of the tag may comprise determining a three dimensional (or 3D) location of the tag relative to each of the plurality of hub antennae, using the phase angle of arrival and range of the tag for each of the hub antennae. In an exemplary such embodiment, three hub antennas may be used in combination as two or three pairs of antenna elements, to determine a 3D location of the tag using the phase angle of arrival and range of the tag for each of the two or three pairs of antenna elements. In another embodiment, determining the location of the tag may comprise determining an aggregation or average of a plurality of determined locations using the phase angle of arrival and range for each of the two or more respective pairs of hub antennae.
In one embodiment, system 10 may desirably provide for location of each of the tags 14 in the plurality of tags by means of determining an angle of arrival of the inbound signal with respect to the antenna hub 16, which may be combined with a range of tags 14 from the antenna hub 16 to calculate a relative position of each of the plurality of tags 14 with respect to the antenna hub 16, such as recited according to aspects of the presently disclosed methods described in further detail below. In a particular embodiment, system 10 may be adapted for implementation of embodiments of the present inventive methods according to the disclosure which provide for using a differential time of arrival of an inbound signal between the first and second hub antennas to determine a differential time angle of arrival, which may desirably be used in combination with a multi-lobe differential phase angle of arrival beam pattern calculated for the phase difference of arrival of the inbound signal between the hub antennas, such as to disambiguate the multi-lobe phase angle of arrival beam pattern, and provide for a desirably more precise disambiguated phase angle of arrival of the inbound signal relative to the first and second hub antennas. Accordingly, in such an embodiment, system 10 may desirably provide for improved accuracy and precision for locating the position of tags 14 relative to the first and second hub antennas than may be provided using time of arrival methods alone. In another embodiment, system 10 may desirably
provide for use of an antenna hub 16 having a plurality of sparsely spaced hub antennas which may be widely spaced relative to the wavelength of the UWB carrier wave signal such as to provide for greater position determination accuracy for a particular precision of time and/or phase differential measurement at the first and second hub antennas.
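By way of a non-limiting illustration of the disambiguation just described, the following sketch enumerates the lobes of the phase-derived angle and keeps the one nearest the coarse time-derived angle; the formula and names are assumptions for illustration rather than a definitive implementation.

```python
import numpy as np

def disambiguated_phase_angle(delta_phase, time_angle, baseline, wavelength):
    """Pick the phase-derived angle lobe closest to the coarse time-based angle.

    delta_phase: measured carrier phase difference between two hub antennas,
                 in radians (known only modulo 2*pi).
    time_angle:  coarse angle of arrival from the time difference of arrival.
    baseline:    antenna separation (m); may be several wavelengths ("sparse").
    wavelength:  carrier wavelength (m).
    """
    candidates = []
    # Each integer cycle ambiguity k produces one lobe of the beam pattern.
    max_k = int(np.ceil(baseline / wavelength)) + 1
    for k in range(-max_k, max_k + 1):
        s = wavelength * (delta_phase / (2 * np.pi) + k) / baseline
        if -1.0 <= s <= 1.0:
            candidates.append(np.arcsin(s))
    # The time angle of arrival disambiguates: keep the nearest candidate.
    return min(candidates, key=lambda ang: abs(ang - time_angle))
```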
In a further embodiment, the antenna hub 16, may optionally also be configured to transmit an outbound signal for reception by the tags 14. In one such embodiment, outbound signal may be used as a polling signal such as to initiate a response by tags 14 by transmission of inbound signal, for example. In another aspect, the outbound signal may be used in connection with the inbound signal to provide for a round trip time of flight measurement for determining a range of tags 14 relative to the antenna hub 16, for example. In yet another aspect, the outbound signal may be used in conjunction with the inbound signal and/or optionally also with calibration signal 30 to allow for synchronization of time measurements or to account for clock drift between tags 14 and the antenna hub 16, or to measure and/or calculate error or calibration data such as interference, reflection, multipath, distortion, attenuation or other factors involving the transmission of UWB signals by system 10.
As alluded to earlier, in embodiments, the system can be employed to passively track a movable object within a defined area of operation. In such embodiments, the antenna hub can be affixed to the movable object, such as a person or animal, and the processor can be simply employed to determine the pose of the movable object relative to the plurality of tags.
Next, a preferred embodiment for determining vehicle orientation will be described.
Having previously surveyed the tags' positions via vehicle self-survey or via other survey techniques, one can then perform a translation and rotation of this coordinate system so as to align it with local north (or local magnetic north), such that the coordinates (x, y, z)_ENU align with an 'East', 'North', 'Up' (ENU) frame. That is, (x_i, y_i, z_i)_ENU = X + R (x_i, y_i, z_i)_survey for some translation vector X and rotation R. The three axes 'East', 'North' and 'Up' need not align with any earth spin or magnetic axis, or even be level to the ground, but they must be orthonormal.
Again, without loss of generality, take the normal to the plane of the three hub antennae to be the vehicle orientation vector.
Having our vehicle orientation vector and a reference frame in an ENU frame, suppose the vehicle were driven to some random place in the area of operation. Then, for each hub antenna A_j, one can use the tag locations (x_i, y_i, z_i)_ENU and ranges r_ij (from the i-th tag to the j-th antenna) to perform trilateration calculations and to compute the ENU position of the j-th antenna, (x_aj, y_aj, z_aj)_ENU. This can be done as described in the Hereman et al. publication referenced above.
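A minimal, non-limiting sketch of this trilateration step is given below; it uses an iterative Gauss-Newton variant rather than the closed-form solution of the referenced publication, and all names and the choice of starting point are illustrative assumptions. With exactly three tags it converges to whichever of the two mirror-image solutions lies nearest the initial guess.

```python
import numpy as np

def trilaterate(tag_positions, ranges, initial_guess, iterations=20):
    """Estimate one hub antenna's ENU position from ranges to surveyed tags.

    tag_positions: (n, 3) array of surveyed ENU tag locations.
    ranges:        n measured distances from the tags to this antenna.
    initial_guess: rough starting position (e.g. the previous pose estimate
                   or the centroid of the tags).
    """
    p = np.asarray(initial_guess, dtype=float)
    tags = np.asarray(tag_positions, dtype=float)
    r = np.asarray(ranges, dtype=float)
    for _ in range(iterations):
        diff = p - tags                       # (n, 3)
        dist = np.linalg.norm(diff, axis=1)   # predicted ranges
        J = diff / dist[:, None]              # Jacobian of range w.r.t. p
        residual = dist - r
        step, *_ = np.linalg.lstsq(J, residual, rcond=None)
        p = p - step                          # Gauss-Newton update
        if np.linalg.norm(step) < 1e-9:
            break
    return p
```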
It then remains to recover the orientation vector in ENU coordinates. This can be done by calculating the cross-product of antenna position differences, (A_2 - A_1)_ENU × (A_3 - A_2)_ENU = D_ENU, i.e. the vector normal to the hub sensors (but now in the ENU frame). The angle of the rotation (relative to the 'north' axis) is φ = cos⁻¹(D̂_ENU · (0, 1, 0)), where D̂_ENU is D_ENU normalized to unit length, and the axis is V = (0, 1, 0) × D̂_ENU (V normalized to a unit vector).
From this we can then create a unit quaternion (a.k.a. versor) Q(r, x, y, z):
r = cos(φ/2),
x = V_x sin(φ/2),
y = V_y sin(φ/2),
z = V_z sin(φ/2).
The above-referenced Diebel publication shows how to turn a unit quaternion into a rotation matrix (cf. equation (125)), and equation (72) describes how to recover the Euler angles for yaw, pitch and roll using the (1, 2, 3) Euler sequence. This provides the 3D orientation derived from signals from three tags received at three antennas in the antenna hub. The above computations may be implemented in processor(s) 22.
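A non-limiting sketch of this orientation recovery is given below. The final Euler extraction uses a common Z-Y-X (yaw-pitch-roll) convention for illustration only, whereas the referenced Diebel equations (72) and (125) fix the exact convention intended, so signs and ordering may need adjustment; the function also assumes the hub normal is not exactly aligned with 'north'.

```python
import numpy as np

def orientation_from_antenna_positions(a1, a2, a3):
    """Yaw, pitch, roll of the hub from the ENU positions of its three antennas.

    a1, a2, a3: trilaterated ENU positions of the three hub antennas.
    Follows the steps in the text: normal vector, axis-angle relative to
    'north', unit quaternion, rotation matrix, Euler angles.
    """
    north = np.array([0.0, 1.0, 0.0])
    d = np.cross(a2 - a1, a3 - a2)
    d /= np.linalg.norm(d)                       # hub normal in the ENU frame
    phi = np.arccos(np.clip(np.dot(d, north), -1.0, 1.0))
    v = np.cross(north, d)
    v /= np.linalg.norm(v)                       # rotation axis (unit vector)
    # Unit quaternion (versor) Q = (r, x, y, z).
    r = np.cos(phi / 2.0)
    x, y, z = np.sin(phi / 2.0) * v
    # Quaternion to rotation matrix.
    R = np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - r * z), 2 * (x * z + r * y)],
        [2 * (x * y + r * z), 1 - 2 * (x * x + z * z), 2 * (y * z - r * x)],
        [2 * (x * z - r * y), 2 * (y * z + r * x), 1 - 2 * (x * x + y * y)],
    ])
    # Euler angles in a Z-Y-X (yaw-pitch-roll) convention.
    yaw = np.arctan2(R[1, 0], R[0, 0])
    pitch = -np.arcsin(np.clip(R[2, 0], -1.0, 1.0))
    roll = np.arctan2(R[2, 1], R[2, 2])
    return yaw, pitch, roll
```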
Some embodiments of the autonomous vehicle 12 include supplemental location apparatus 28. Supplemental location apparatus 28 may include any of several kinds of location apparatus that may be used in the event that the antenna hub-tag system (or the distance-and-angle technique) may be inadequate for brief or extended periods of time. An example of a supplemental location apparatus 28 may be, for example, a GPS receiver, such as a conventional GPS receiver or a real-time kinematic (RTK) receiver. Other examples of supplemental location apparatus 28 may include
one or more inertial sensors, or an echolocation apparatus (such as radar or sonar), or a compass, or an odometer, or a gyroscope, or a wheel sensor/encoder, or a visual sighting system, or a remote-operator-assisted piloting system, or an altimeter. Some kinds of supplemental location apparatus 28 may be useful for determining location but not orientation, some may be useful for determining orientation but not location, and some may be useful for determining both. In particular, if using an accelerometer to determine pitch and roll, it is possible to use only two tags to obtain a 3D pose. Similarly, by employing a barometer, and calibrating at a point of known height, or using previous 3D pose measurements to calculate differential height, it is also possible to obtain a 3D pose while using only two tags (current low-cost barometers have a relative accuracy of ~10 cm). Merely using a single-antenna RTK system will, however, yield only position. On the other hand, using a properly calibrated MEMS accelerometer, gyro and magnetometer alone can yield 3D orientation (but not position).
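As a non-limiting illustration of how an accelerometer can supply pitch and roll (allowing a 3D pose with only two tags, as noted above), a static tilt estimate might be sketched as follows; the axis and sign conventions are assumptions that depend on how the sensor is mounted.

```python
import numpy as np

def pitch_roll_from_accelerometer(ax, ay, az):
    """Estimate pitch and roll from a static accelerometer reading.

    ax, ay, az: specific-force components reported by a body-mounted
    accelerometer while the vehicle is (nearly) static, so that only gravity
    is sensed. The formulas assume x forward, y to the side, z up, with the
    sensor reading roughly (0, 0, +g) when the vehicle is level; other
    mountings require different signs.
    """
    roll = np.arctan2(ay, az)
    pitch = np.arctan2(-ax, np.hypot(ay, az))
    return pitch, roll
```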
The supplemental location apparatus 28 may generate one or more signals as a function of the thing being detected, which in turn is a function of the pose of the autonomous vehicle 12. The processor 22 may determine the pose of the autonomous vehicle 12 (in or outside the defined area of operation) as a function of the signal generated by the supplemental location apparatus 28.
In the course of operation, the antenna hub 16 may lose contact with one or more tags 14 or may fail to receive signals 20 from one or more tags 14. Loss of contact may be due to any number of reasons, such as an object that happens to be interposed between the antenna hub 16 and one or more tags 14 (interfering with line-of-sight or interfering with signals between the antenna hub 16 and one or more tags 14), or damage to a tag 14, or interference from a weather condition, or breakdown or malfunction of the antenna hub 16. Conditions such as any of these may result in outages of the distance-and-angle system. The outages may be momentary, or brief, or extended. Supplemental location apparatus 28 may be used during an outage for purposes of pose correction, or for emergency operation, or for bringing the autonomous vehicle 12 to a safe stop, or returning the autonomous vehicle 12 to a home location, or guiding the autonomous vehicle 12 away from a hazard, or changing the operating mode of the vehicle from autonomous to user-controlled, for example.
Supplemental location apparatus 28 may also be used when there is no outage.
As an alternative to, or in addition to, utilizing supplemental location apparatus 28, the autonomous vehicle 12 may use previous data and computations to move about when (for example) contact with all tags (or all but one tag) is lost. The processor 22, having previously computed the position and heading and having information from devices such as a compass, a wheel sensor/encoder, or a vertical gyro, may extrapolate position and heading. If contact with the tags is re-established within a reasonable time, the processor 22 may correct for errors (if any) and the autonomous vehicle 12 may proceed as before. If contact with the tags is not re-established within a reasonable time, the autonomous vehicle 12 may take some other action, such as shutting down or issuing a distress call. The autonomous vehicle 12 may also call upon supplemental location apparatus 28 for assistance with navigating.
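A minimal, non-limiting sketch of such dead-reckoning extrapolation (constant speed and turn rate assumed, names illustrative) is:

```python
import numpy as np

def dead_reckon(position, heading, speed, heading_rate, dt):
    """Extrapolate a 2D pose during a brief tag outage.

    position, heading: last pose computed from the tags.
    speed:        forward speed (e.g. from a wheel sensor/encoder).
    heading_rate: rate of turn (e.g. from a gyro), radians per second.
    dt:           time elapsed since the last tag-based fix.
    Errors grow with time, so the extrapolated pose should be corrected as
    soon as contact with the tags is re-established.
    """
    new_heading = heading + heading_rate * dt
    mean_heading = heading + 0.5 * heading_rate * dt
    displacement = speed * dt * np.array([np.cos(mean_heading),
                                          np.sin(mean_heading)])
    return position + displacement, new_heading
```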
The supplemental location apparatus 28 may have deficiencies of its own. Some supplemental location apparatus 28 may be too costly to operate at all times, or may be susceptible to becoming unreliable in certain environments or bad weather, for example. Even so, if the distance-and-angle techniques develop trouble, the supplemental location apparatus may under some circumstances be able to keep the trouble from becoming worse.
As shown in FIG. 1, the autonomous vehicle 12 may include input/output (I/O) devices 30 other than those on the antenna hub 16 or the supplemental location apparatus 28. Such I/O devices 30 may be of any kind; input may be received and output transmitted wirelessly, audibly, visually, haptically, or in any combination thereof, or in other fashions. Examples of other I/O devices 30 include a radio receiver, an alarm, a warning light, a keypad, user controls, and an emergency stop switch.
There are many ways in which a defined area of operation may be defined or otherwise established. One illustrative technique involves having the tags 14 deployed proximate to the expected defined area of operation, and manually positioning or guiding the autonomous vehicle 12 around the perimeter of the defined area of operation. As the autonomous vehicle 12 moves around the perimeter, the autonomous vehicle 12 notes the position of the tags 14. Once the perimeter is closed, the autonomous vehicle 12 has information about the boundaries of the defined area of
operation, and the positions of the tags 14 with respect to the boundaries. From this information, the autonomous vehicle 12 may create a map of the defined area of operation. Another technique may include manually positioning or guiding the autonomous vehicle 12 to vertices of the defined area of operation. A further technique may involve moving the autonomous vehicle 12 proximate to a hazard, and instructing the autonomous vehicle 12 to avoid the hazard. Still a further technique may entail the autonomous vehicle automatically following physical perimeter markers, such as a fence, and regarding the area inside the perimeter markers as the defined area of operation.
Whatever functions the autonomous vehicle 12 performs, they need not be performed in the same way at all places in a defined area of operation, but may be set or changed or suspended depending upon where the autonomous vehicle 12 is located and/or oriented within the defined area of operation. For purposes of illustration, consider an example that will be discussed in more detail in relation to FIG. 2: the autonomous vehicle is a mower and the defined area of operation is a playing area on a golf course (e.g., tee area, fairway, rough, green, cart paths, and hazards for a single hole). The autonomous vehicle 12 may be instructed to mow this defined area of operation. The mowing operations, however, need not be uniform throughout the entire playing area. The autonomous vehicle 12 may be instructed to avoid the green and the hazards entirely, for example, and do no mowing operations there. The autonomous vehicle 12 may be instructed to mow the grass in the fairway to a shorter length than the grass in the rough, etc.
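By way of a non-limiting illustration, such location-dependent behaviour might be expressed as a simple lookup keyed by the region of the stored map in which the current pose falls; the region names, settings and the `region_lookup` helper below are assumptions introduced only for the mowing example.

```python
# Illustrative per-region settings for the golf-course mowing example;
# names and numeric values are assumptions, not part of the disclosure.
REGION_SETTINGS = {
    "green":   {"enter": False, "mow": False},
    "hazard":  {"enter": False, "mow": False},
    "fairway": {"enter": True,  "mow": True, "cut_height_mm": 15},
    "rough":   {"enter": True,  "mow": True, "cut_height_mm": 50},
}

def behaviour_for_pose(pose, region_lookup):
    """Choose functional-apparatus settings from the vehicle's current pose.

    region_lookup is a hypothetical function mapping a pose to a named region
    of the stored map (e.g. "fairway"); the controller can then set, change or
    suspend its functions according to the returned settings.
    """
    region = region_lookup(pose)
    return REGION_SETTINGS.get(region, {"enter": True, "mow": False})
```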
FIG. 2 is an illustrative map showing overlapping defined areas of operation on a golf course. FIG. 2 shows one (first) playing area 50 on an illustrative golf course, and a neighboring (second) playing area 52 on the same golf course. Each playing area may have its own tee area, fairway, rough, green, and hazards; and the layout of these features will be different for every playing area on the golf course. An autonomous vehicle 12 that functions as a mower may be used to mow the grass in the various playing areas, while avoiding hazards and cutting the grass to desired lengths at various sites.
Deployed on the golf course are several tags 14; in FIG. 2, illustrative tags 54, 56, 58, 60, 62, 64, and 66 are shown. Some of the tags may be deployed in trees proximate to the playing areas, others may be deployed on dedicated pedestals, and others may be deployed in other fashions.
Generally speaking, tags 54, 56, 58, 60, 62, and 66 define the first defined area of operation 68, and the first defined area of operation 68 is related to the first playing area 50. Tags 60, 62, 64, 66, and other tags (not shown in FIG. 2) define the second defined area of operation 70, and the second defined area of operation 70 is related to the second playing area 52.
As depicted in FIG. 2, the first and second defined areas of operation 68, 70 do not overlap geographically, though the first and second defined areas of operation 68, 70 may share a tag 62. There may be instances in which an autonomous vehicle 12 that performs functions on the first defined area of operation 68 may be called upon to perform similar functions on the second defined area of operation 70. Although it may be possible for an autonomous vehicle 12 to move autonomously from the first defined area of operation 68 to the second defined area of operation 70, such movement may be aided by creation of a third defined area of operation 72, which geographically overlaps the first defined area of operation 68 and the second defined area of operation 70. In FIG. 2, the boundaries of the third defined area of operation 72 essentially correspond to tags 58, 60, 64, and 66, and tag 62 is positioned well within and away from the perimeter of the third defined area of operation 72.
When the autonomous vehicle 12 is in the area of overlap of the first defined area of operation 68 and the third defined area of operation 72, the autonomous vehicle 12 may autonomously terminate its functions in the first defined area of operation 68 and begin its functions in the third defined area of operation 72. In the example of an autonomous vehicle 12 that functions as a mower, the autonomous vehicle 12 may finish its work in the first defined area of operation 68, and then move to that part of the first defined area of operation 68 that overlaps the third defined area of operation 72. The autonomous vehicle 12 may then begin its mowing operations in the area between the first defined area of operation 68 and the second defined area of operation 70, such as mowing the grass in the region 74 between the playing areas 50, 52. In some cases, the autonomous vehicle 12 may move directly to the second defined
area of operation 70, by moving to that part of the third defined area of operation 72 that overlaps the second defined area of operation 70. The autonomous vehicle 12 may finish its work in the third defined area of operation 72 (which may comprise mowing or simply moving from one defined area of operation to another), moving to that part of the third defined area of operation 72 that overlaps the second defined area of operation 70. The autonomous vehicle 12 may then begin its mowing operations in the second defined area of operation 70.
In some cases, it may be advantageous for a large defined area of operation to be broken up into smaller defined areas of operation. In other cases, it may be desirable for a small defined area of operation to be made larger, or to have another defined area of operation positioned nearby or overlapping. The embodiments described herein encompass these possibilities.
In certain embodiments, the pose determination system and method can be employed in an indoor environment. In such instances, accuracy and precision of the pose determination system can be increased by synchronizing the plurality of tags by employing a single high stability oscillator. Using a single oscillator can also improve navigation performance, since different tags normally would have different oscillator drifts. It would also be possible to use open standards IEEE 1588 PTP-V2, or synchronous Ethernet to disseminate phase and frequency.
Implementation of one or more embodiments may realize one or more advantages, many of which have been mentioned already. The operation of the autonomous vehicle in a defined area of operation may simplify the programming of the autonomous vehicle, in that the autonomous vehicle can be programmed to deal with conditions, hazards, obstacles, and other various eventualities that affect the defined area of operation, rather than the much wider range of eventualities that may affect a broader geographical area. The tags described herein can serve as reference nodes that require little or no external power, and may include little or no infrastructure to communicate with other tags or with any other network. A system that uses UWB can be expected to be reliable at low power and is adaptable to a range of terrains and environments and weather conditions. For some terrains and environments, UWB coupled with GPS may provide additional reliability and adaptability.
Although some of the embodiments have been described in connection with autonomous vehicles, they need not be limited to autonomous vehicles. For example, embodiments may include vehicles that are not autonomous. Further, embodiments may include objects that are not conventionally thought of as vehicles, such as living things. Further, embodiments may be made applicable to a virtual world as well as to a real world. A pose of an object in a virtual world may be determined with respect to a virtual defined area of operation or virtual reference nodes. Virtual world applications may include various forms of gaming in a virtual world.
In the case in which the tags operate on low power and/or have low functionality, processing power can be concentrated in the autonomous vehicle, which may have fewer power constraints than the tags. Security against theft and other mischief can be concentrated in the autonomous vehicle. In an illustrative case, the autonomous vehicle may be locked up when not in use, but it may not be a practical necessity to lock up the tags, which may remain deployed near the defined area of operation.
Embodiments may be applied to run-of-the-mill activities as well as unusual activities. Applications may be civilian as well as military. Uses may be practical as well as artistic (for example, in the illustration in which the autonomous vehicle is a mower, the autonomous vehicle may be programmed to cut grass to produce a pleasing design).
While particular exemplary embodiments have been described along with their various functional components and operational functions, many variations are possible. For example, various functions and components may be implemented in hardware, software, firmware, middleware or a combination thereof and utilized in systems, subsystems, components or subcomponents thereof. In particular, for embodiments implemented in software, elements thereof may be instructions and/or code segments to perform the necessary tasks. The program or code segments may be stored in a machine readable medium, such as a processor-readable medium or a computer program product, or transmitted by a computer data signal embodied in a carrier wave, or a signal modulated by a carrier, over a transmission medium or communication link. The machine readable medium or
processor readable medium may include any medium that can store or transfer information in a form readable and executable by a machine, for example a processor, computer, etc.
An embodiment may relate to a computer storage product with a computer-readable medium having computer code thereon for performing various computer-implemented operations. The computer-readable media and computer code may be those specially designed and constructed for the purposes of the disclosed embodiments, or they may be of the kind well known and available to those having skill in the computer software arts. Examples of computer-readable media include, but are not limited to: ROM and RAM devices including Flash RAM memory storage cards, sticks and chips, magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and holographic devices; magneto-optical media such as floptical disks; and hardware devices that are specially configured to store and execute program code, such as application specific integrated circuits ("ASICs"), programmable logic devices ("PLDs") and ROM and RAM devices including Flash RAM memory storage cards, sticks and chips, for example. Examples of computer code include machine code, such as produced by a compiler, and files containing higher-level code that are executed by a computer using an interpreter. For example, an embodiment may be implemented using any suitable scripting, markup and/or programming languages and development tools. Another embodiment may be implemented in hardwired circuitry in place of, or in combination with, machine- executable software instructions.
The illustrative embodiments herein described are not intended to be exhaustive or limiting in nature. Many alterations and modifications are possible in the practice of this invention without departing from the scope thereof.
Claims
1. An autonomous vehicle system, comprising:
a vehicle, having:
a functional apparatus comprising a mobility apparatus to move the vehicle within an area of operation;
a pose-detecting apparatus having a plurality of antennas to wirelessly send an outbound signal to and receive inbound signals from a plurality of tags being fixed relative to the area of operation;
a supplemental location apparatus to generate a location-related signal; and
a controller to:
determine a pose of the vehicle relative to the tags based on one or more of:
information about the outbound signal and the inbound signals; and
the location-related signal; and
control the mobility apparatus based on the pose to keep the vehicle within the area of operation.
2. The autonomous vehicle system of claim 1, wherein the controller determines 3D pose of the vehicle relative to a predetermined coordinate system defined by the plurality of tags.
3. The autonomous vehicle system of claim 1 or 2, wherein the supplemental location apparatus comprises one or more of a GPS receiver, a real time kinematic receiver, an inertial sensor, radar, an echolocation apparatus, a compass, an odometer, an accelerometer, a magnetometer, a gyroscope, a wheel sensor/encoder, a visual sighting system, a remote-operator-assisted piloting system, or an altimeter.
4. The autonomous vehicle system of claim 1, 2, or 3, wherein when inbound signals are lost or information derived from the outbound signal and the inbound signals is determined to be insufficient to determine pose of the vehicle, the controller selectively determines pose of the vehicle based on the location-related signal generated by the supplemental location apparatus, and controls the mobility apparatus based thereon.
5. The autonomous vehicle system of any one of claims 1 to 4, wherein the controller determines pose from the outbound signal and the inbound signals by detecting the distance to two or more tags employing time of flight.
6. The autonomous vehicle system of any one of claims 1 to 5, wherein the controller determines pose from the outbound signal and the inbound signals by detecting the angle of the tag relative to the vehicle employing the time difference of receipt of inbound signals at different ones of the plural antennas.
7. The autonomous vehicle system of any one of claims 1 to 6, wherein the controller determines pose from the outbound signal and the inbound signal by detecting a phase difference of arrival of the inbound signals between each of the plurality of antennas.
8. The autonomous vehicle system of any one of claims 1 to 7, wherein the outbound and inbound signals are ultra-wideband signals.
9. The autonomous vehicle system of any one of claims 1 to 8, wherein the plurality of antennas are operatively connected to receiver/transceiver radios.
10. A method of autonomously operating a vehicle, the method comprising:
wirelessly sending an outbound signal from at least one of a plurality of antennas of the vehicle;
wirelessly receiving at the plurality of antennas inbound signals emitted by a plurality of tags being fixed relative to an area of operation, the inbound signals emitted in response to the plurality of tags receiving the outbound signal;
generating at a controller of the vehicle an indication of whether a pose of the vehicle relative to the tags can be determined based on information about the outbound signal and the inbound signals;
determining at the controller a pose of the vehicle relative to the tags, comprising:
if the indication is positive, determining the pose based on the information about the outbound signal and the inbound signals; and
if the indication is negative:
obtaining a location-related signal at a supplemental location apparatus of the vehicle; and
determining the pose based on the location-related signal; and controlling the vehicle based on the pose to keep the vehicle within the area of operation.
11. The method of claim 10, wherein determining pose relative to the tags comprises determining at the controller the 3D pose of the vehicle relative to a predetermined coordinate system defined by the tags.
12. The method of claim 10 or 11, wherein the supplemental location apparatus comprises one or more of a GPS receiver, a real time kinematic receiver, an inertial sensor, radar, an echolocation apparatus, a compass, an odometer, an accelerometer, a magnetometer, a gyroscope, a wheel sensor/encoder, a visual sighting system, a remote-operator-assisted piloting system, or an altimeter.
13. The method of claim 10, 11, or 12, wherein the controller determines pose from the outbound signal and the inbound signals by detecting the distance to two or more tags employing time of flight.
14. The method of any one of claims 10 to 13, wherein the controller determines pose from the outbound signal and the inbound signals by detecting the angle of the tags relative to the vehicle employing the time difference of receipt of incoming signals at different ones of the plural antennas.
15. The method of any one of claims 10 to 14, wherein the controller determines pose from the outbound signals and the inbound signals by detecting a phase difference of arrival of the inbound signals between each of the plurality of antennas.
16. The method of any one of claims 10 to 15, wherein the outbound and inbound signals are ultra-wideband signals.
17. The method of any one of claims 10 to 16, further comprising synchronizing the plurality of tags using a single oscillator.
18. The method of any one of claims 10 to 17, further comprising synchronizing the plurality of hub antennas with a local oscillator.
19. A system for determining three dimensional location and orientation of a movable object in an area of operation, comprising:
at least first, second and third tags configured at respective predetermined non-collinear locations in or near the area of operation to define directly or indirectly a reference coordinate system, each tag having a tag antenna providing respective tag signals; and
a pose determination system adapted to be integrated with or coupled to the object so as to be movable therewith, the pose determination system comprising:
an antenna hub having at least first, second and third hub antennas
arranged in a non-collinear configuration to define an object coordinate system and to receive the tag signals; and
a controller coupled to the antenna hub, the controller having one or more processors which execute stored instructions to:
determine the distance of each of the first, second and third hub antennas to each of the first, second and third tags employing timing of receipt of the tag signals;
determine the location of each of the first, second and third hub antennas relative to the reference coordinate system using the antenna to tag distances and predetermined tag locations; and
determine the orientation of the object coordinate system relative to the reference coordinate system using the antenna locations.
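One non-limiting way to carry out the three steps recited in claim 19 is sketched below using NumPy: each hub antenna is trilaterated from its ranges to three tags at known positions, and the rotation between the object and reference frames is then recovered with a Kabsch/Procrustes fit. The tag layout, the L-shaped hub geometry, and the choice to resolve the mirror ambiguity above the tag plane are assumptions, not limitations from the claims.

```python
import numpy as np

def trilaterate(p1, p2, p3, r1, r2, r3, above_plane=True):
    """Position of one hub antenna from its distances to three non-collinear
    tags at known positions; the mirror ambiguity across the tag plane is
    resolved by the above_plane flag (an assumption made here)."""
    ex = (p2 - p1) / np.linalg.norm(p2 - p1)
    i = ex @ (p3 - p1)
    ey = (p3 - p1) - i * ex
    ey = ey / np.linalg.norm(ey)
    ez = np.cross(ex, ey)
    d = np.linalg.norm(p2 - p1)
    j = ey @ (p3 - p1)
    x = (r1**2 - r2**2 + d**2) / (2.0 * d)
    y = (r1**2 - r3**2 + i**2 + j**2 - 2.0 * i * x) / (2.0 * j)
    z = np.sqrt(max(r1**2 - x**2 - y**2, 0.0))
    return p1 + x * ex + y * ey + (z if above_plane else -z) * ez

def fit_orientation(body_layout, world_positions):
    """Proper rotation aligning the hub's own (body-frame) antenna layout with
    the trilaterated antenna positions (Kabsch/Procrustes fit)."""
    P = body_layout - body_layout.mean(axis=0)
    Q = world_positions - world_positions.mean(axis=0)
    U, _, Vt = np.linalg.svd(P.T @ Q)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    return Vt.T @ D @ U.T  # maps body-frame vectors into the reference frame

# Assumed geometry: three tags on the ground, an L-shaped hub with 0.3 m arms
tags = [np.array(p) for p in ([0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0])]
hub_body = np.array([[0.0, 0.0, 0.0], [0.3, 0.0, 0.0], [0.0, 0.3, 0.0]])
true_R = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])  # 90 deg yaw
hub_world = hub_body @ true_R.T + np.array([4.0, 3.0, 0.2])

antenna_positions = np.array([
    trilaterate(*tags, *[np.linalg.norm(a - t) for t in tags]) for a in hub_world
])
print(fit_orientation(hub_body, antenna_positions))  # recovers true_R (approximately)
```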
20. A pose determination system adapted to be integrated with or coupled to an object so as to be movable therewith, the pose determination system comprising:
an antenna hub having at least first, second and third hub antennas arranged in a non-collinear configuration to define an object coordinate system and to receive tag signals from first, second and third tags; and
a controller coupled to the antenna hub, the controller having one or more processors which execute stored instructions to:
determine the distance of each of the first, second and third hub antennas to each of the first, second and third tags employing timing of receipt of the tag signals;
determine the location of each of the first, second and third hub antennas relative to the reference coordinate system using the antenna to tag distances and predetermined tag locations; and
determine the orientation of the object coordinate system relative to the reference coordinate system using the antenna locations.
21. The system of claim 19 or 20, wherein the first, second and third hub antennas are arranged to form an equilateral triangle.
22. The system of claim 19 or 20, wherein the first, second and third hub antennas are arranged to form an L shape.
23. The system of any one of claims 19 to 22, wherein the pose determination system further comprises a source of a common timing signal provided to the first, second and third hub antennas.
24. The system of any one of claims 19 to 23, wherein the hub antennas provide one or more outgoing signals to the tags.
25. The system of any one of claims 19 to 24, wherein the tag signals and outgoing signals are ultra-wideband signals.
26. The system of any one of claims 19 to 25, wherein the controller determines the object orientation relative to the reference coordinate system in terms of Euler angles for yaw, pitch and roll.
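As a non-limiting illustration of claim 26, the sketch below reduces a rotation matrix between the object and reference frames to yaw, pitch and roll; the Z-Y-X (aerospace) Euler convention is an assumption, since the claim does not fix one.

```python
import math

def euler_zyx_from_rotation(R):
    """Yaw, pitch, roll (Z-Y-X convention) from a 3x3 rotation matrix that
    maps object-frame vectors into the reference frame."""
    pitch = math.asin(max(-1.0, min(1.0, -R[2][0])))
    if abs(math.cos(pitch)) > 1e-9:          # usual case, no gimbal lock
        yaw = math.atan2(R[1][0], R[0][0])
        roll = math.atan2(R[2][1], R[2][2])
    else:                                    # gimbal lock: split arbitrarily
        yaw = math.atan2(-R[0][1], R[1][1])
        roll = 0.0
    return yaw, pitch, roll

# Illustrative 90-degree yaw, zero pitch and roll
R = [[0.0, -1.0, 0.0],
     [1.0,  0.0, 0.0],
     [0.0,  0.0, 1.0]]
print([round(math.degrees(a), 1) for a in euler_zyx_from_rotation(R)])  # roughly [90.0, 0.0, 0.0]
```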
27. The system of any one of claims 19 to 26, wherein the hub antennas are operatively connected to receiver/transceiver radios.
28. A method for determining three dimensional location and orientation of a movable object in an area of operation, comprising:
providing first, second and third tag signals respectively from each of first, second and third tags configured at predetermined non-collinear locations in or near the area of operation to define directly or indirectly a reference coordinate system;
receiving, at an antenna hub having at least first, second and third hub antennas arranged in a non-collinear configuration to define an object coordinate system, the tag signals from the first, second and third tags;
determining at one or more programmed processors the distance of each of the first, second and third hub antennas to each of the first, second and third tags employing timing of receipt of the tag signals;
determining at one or more programmed processors the location of each of the first, second and third hub antennas relative to the reference coordinate system using the antenna to tag distances and predetermined tag locations; and
determining at one or more programmed processors the orientation of the object coordinate system relative to the reference coordinate system using the antenna locations.
29. The method of claim 28, further comprising, at an initial stage, surveying the locations of the first, second and third tags to define a tag coordinate system.
30. The method of claim 29, wherein the surveying comprises recursively defining tag coordinates relative to each other.
31. The method of claim 29 or 30, wherein the surveying comprises moving a pose detection system to a first location, identifying the first tag with a three dimensional coordinate value, and defining second and third tags with respective coordinate values defined relative to the first tag.
32. The method of claim 31, wherein the surveying further comprises moving the pose detection system to a new location and identifying an additional tag with coordinate values defined relative to the first, second or third tags.
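One non-limiting reading of the recursive survey of claims 29 to 32 is an incremental build-up of tag coordinates from measured ranges: the first tag seeds the origin, the second and third tags fix the axes, and each further tag is located relative to those three. The range values and the seeding convention below are assumptions made only for illustration.

```python
import math

def seed_frame(d12, d13, d23):
    """Place the first three tags from their mutual distances: tag 1 at the
    origin, tag 2 on the +x axis, tag 3 in the x-y plane."""
    x3 = (d12**2 + d13**2 - d23**2) / (2.0 * d12)
    y3 = math.sqrt(max(d13**2 - x3**2, 0.0))
    return [(0.0, 0.0, 0.0), (d12, 0.0, 0.0), (x3, y3, 0.0)]

def add_tag(tags, ranges):
    """Locate one more tag from its ranges to the three seed tags; assumes the
    seed_frame layout and keeps the +z (above-plane) solution."""
    x2 = tags[1][0]
    x3, y3 = tags[2][0], tags[2][1]
    r1, r2, r3 = ranges
    x = (r1**2 - r2**2 + x2**2) / (2.0 * x2)
    y = (r1**2 - r3**2 + x3**2 + y3**2 - 2.0 * x3 * x) / (2.0 * y3)
    z = math.sqrt(max(r1**2 - x**2 - y**2, 0.0))
    return (x, y, z)

tags = seed_frame(8.0, 6.0, 10.0)                        # a 6-8-10 tag triangle
tags.append(add_tag(tags, (math.sqrt(41.0), 5.0, math.sqrt(29.0))))
print(tags)  # fourth tag recovered near (5.0, 4.0, 0.0)
```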
33. The method of claim 29, further comprising converting the tag coordinate system to the reference coordinate system.
34. The method of claim 33, wherein the reference coordinate system is an East, North, Up frame determined by the local magnetic North.
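As a non-limiting illustration of claims 33 and 34, a planar tag coordinate system can be rotated into an East, North, Up frame once the compass heading of the tag frame's x axis relative to magnetic north is known; the heading value and axis conventions below are assumptions.

```python
import math

def tag_to_enu(point, x_axis_heading_deg):
    """Rotate a point from the tag frame into an East-North-Up frame, given the
    compass heading (degrees clockwise from magnetic north) of the tag frame's
    +x axis; both frames are assumed to share the same up axis."""
    h = math.radians(x_axis_heading_deg)
    x, y, z = point
    east = x * math.sin(h) - y * math.cos(h)
    north = x * math.cos(h) + y * math.sin(h)
    return (east, north, z)

# A tag 10 m along the tag frame's x axis, with that axis pointing 30 degrees
# east of magnetic north, lands at roughly (5.0 E, 8.66 N, 0.0 U).
print(tag_to_enu((10.0, 0.0, 0.0), 30.0))
```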
35. The method of any one of claims 28 to 34, wherein the object is an autonomous vehicle.
36. The method of any one of claims 28 to 35, further comprising controlling the movable object based on the location and orientation to keep the movable object within an area of operation.
37. A system for tracking a moving object comprising:
a pose-detecting apparatus having a plurality of antennas to wirelessly send an outbound signal to and receive inbound signals from a plurality of tags being fixed relative to an area of operation;
a supplemental location apparatus to generate a location-related signal; and
a processor to determine a pose of the moving object relative to the tags based on one or more of:
information about the outbound signal and the inbound signals; and
the location-related signal.
38. The system of claim 37, wherein the processor determines 3D pose of the moving object relative to a predetermined coordinate system defined by the plurality of tags.
39. The system of claim 37 or 38, wherein the supplemental location apparatus comprises one or more of a GPS receiver, a real time kinematic receiver, an inertial sensor, radar, an echolocation apparatus, a compass, an odometer, an accelerometer, a magnetometer, a gyroscope, a wheel sensor/encoder, a visual sighting system, a remote-operator-assisted piloting system, or an altimeter.
40. The system of claim 37, 38 or 39, wherein when inbound signals are lost or information derived from the outbound signal and the inbound signals is determined to be insufficient to determine pose of the moving object, the processor selectively determines pose of the moving object based on the location-related signal generated by the supplemental location apparatus.
41. The system of any one of claims 37 to 40, wherein the processor determines pose from the outbound signal and the inbound signals by detecting the distance to two or more tags employing time of flight.
42. The system of any one of claims 37 to 41, wherein the processor determines pose from the outbound signal and the inbound signals by detecting the angle of the tags relative to the moving object employing the time difference of receipt of inbound signals at different ones of the plural antennas.
43. The system of any one of claims 37 to 42, wherein the processor determines pose from the outbound signal and the inbound signals by detecting a phase difference of arrival of the inbound signals between each of the plurality of antennas.
44. The system of any one of claims 37 to 43, wherein the outbound and inbound signals are ultra-wideband signals.
45. The system of any one of claims 37 to 44, wherein the plurality of antennas are operatively connected to receiver/transceiver radios.
46. A method for determining pose of a movable object comprising:
wirelessly sending an outbound signal from at least one of a plurality of antennas affixed on the movable object;
wirelessly receiving at the plurality of antennas inbound signals emitted by a plurality of tags being fixed relative to an area of operation, the inbound signals emitted in response to the plurality of tags receiving the outbound signal;
generating at a processor an indication of whether a pose of the movable object relative to the tags can be determined based on information about the outbound signal and the inbound signals;
determining at the processor a pose of the movable object relative to the tags, comprising:
if the indication is positive, determining the pose based on the information about the outbound signal and the inbound signals; and
if the indication is negative:
obtaining a location-related signal at a supplemental location apparatus affixed to the movable object; and
determining the pose based on the location-related signal.
47. The method of claim 46, wherein determining pose relative to the tags comprises determining at the processor the 3D pose of the movable object relative to a predetermined coordinate system defined by the tags.
48. The method of claim 46 or 47, wherein the supplemental location apparatus comprises one or more of a GPS receiver, a real time kinematic receiver, an inertial sensor, radar, an echolocation apparatus, a compass, an odometer, an accelerometer, a magnetometer, a gyroscope, a wheel sensor/encoder, a visual sighting system, a remote-operator-assisted piloting system, or an altimeter.
49. The method of claim 46, 47 or 48, wherein the processor determines pose from the outbound signal and the inbound signals by detecting the distance to two or more tags employing time of flight.
50. The method of any one of claims 46 to 49, wherein the processor determines pose from the outbound signal and the inbound signals by detecting the angle of the tags relative to the movable object employing the time difference of receipt of incoming signals at different ones of the plural antennas.
51. The method of any one of claims 46 to 50, wherein the processor determines pose from the outbound signal and the inbound signals by detecting a phase difference of arrival of the inbound signals between each of the plurality of antennas.
52. The method of any one of claims 46 to 51, wherein the outbound and inbound signals are ultra-wideband signals.
53. The method of any one of claims 46 to 52, further comprising synchronizing the plurality of tags using a single oscillator.
54. The method of any one of claims 46 to 53, further comprising synchronizing the plurality of antennas with a local oscillator.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201762567523P | 2017-10-03 | 2017-10-03 | |
US62/567,523 | 2017-10-03 | | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019068175A1 (en) | 2019-04-11 |
Family
ID=65994146
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CA2018/051229 WO2019068175A1 (en) | 2017-10-03 | 2018-10-01 | Pose determination system and method |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2019068175A1 (en) |
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5408411A (en) * | 1991-01-18 | 1995-04-18 | Hitachi, Ltd. | System for predicting behavior of automotive vehicle and for controlling vehicular behavior based thereon |
US7026992B1 (en) * | 2005-03-31 | 2006-04-11 | Deere & Company | Method for configuring a local positioning system |
US20060224308A1 (en) * | 2005-03-31 | 2006-10-05 | Deere & Company, A Delaware Corporation | System and method for determining a position of a vehicle |
EP2278357A2 (en) * | 2008-03-26 | 2011-01-26 | Genova Robot SRL | A method and a device for determining the position of a vehicle for the autonomous driving of a vehicle, in particular a robotized vehicle |
US20100106356A1 (en) * | 2008-10-24 | 2010-04-29 | The Gray Insurance Company | Control and systems for autonomously driven vehicles |
US20170023659A1 (en) * | 2015-05-08 | 2017-01-26 | 5D Robotics, Inc. | Adaptive positioning system |
WO2017079839A1 (en) * | 2015-11-10 | 2017-05-18 | Xco Tech Inc. | System and method for ultrawideband position location |
US9915947B1 (en) * | 2016-02-26 | 2018-03-13 | Waymo Llc | System and method for determining pose data for a vehicle |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114072687A (en) * | 2019-06-28 | 2022-02-18 | 贝斯普恩公司 | Positioning system for detecting the position and orientation of a movable unit |
WO2020260606A1 (en) * | 2019-06-28 | 2020-12-30 | Bespoon Sas | Location system for detecting position and orientation of movable units |
US12123963B2 (en) | 2019-06-28 | 2024-10-22 | Be Spoon | Location system for detecting position and orientation of movable units |
EP3757596A1 (en) * | 2019-06-28 | 2020-12-30 | BeSpoon SAS | Location system for detecting position and orientation of movable units |
EP3999928A4 (en) * | 2019-08-28 | 2022-08-31 | Samsung Electronics Co., Ltd. | Sensor fusion for localization and path planning |
WO2021040471A1 (en) | 2019-08-28 | 2021-03-04 | Samsung Electronics Co., Ltd. | Sensor fusion for localization and path planning |
US11937539B2 (en) | 2019-08-28 | 2024-03-26 | Samsung Electronics Co., Ltd. | Sensor fusion for localization and path planning |
US20210237765A1 (en) * | 2020-01-31 | 2021-08-05 | Toyota Jidosha Kabushiki Kaisha | Vehicle, vehicle control interface box, and autonomous driving vehicle |
US20210396836A1 (en) * | 2020-06-18 | 2021-12-23 | Commissariat A L'energie Atomique Et Aux Energies Alternatives | Method and apparatus for performing grid-based localization of a mobile body |
US11550020B2 (en) * | 2020-06-18 | 2023-01-10 | Commissariat A L'energie Atomique Et Aux Energies Alternatives | Method and apparatus for performing grid-based locailization of a mobile body |
WO2022010684A1 (en) * | 2020-07-09 | 2022-01-13 | The Toro Company | Autonomous machine navigation using reflections from subsurface objects |
EP4386508A3 (en) * | 2020-07-09 | 2024-09-04 | The Toro Company | Autonomous machine navigation using reflections from subsurface objects |
CN112346473A (en) * | 2020-11-25 | 2021-02-09 | 成都云鼎智控科技有限公司 | Unmanned aerial vehicle attitude control system, flight control system and attitude control method |
CN114430603A (en) * | 2022-01-25 | 2022-05-03 | 广州小鹏汽车科技有限公司 | Welcome control method and device and vehicle |
CN114430603B (en) * | 2022-01-25 | 2024-03-12 | 广州小鹏汽车科技有限公司 | Welcome control method and device and vehicle |
WO2023233270A1 (en) * | 2022-05-30 | 2023-12-07 | Claudio Salvador | Adaptive system based on ultra-wide band for the dynamic detection of possible collisions |
IT202200011396A1 (en) * | 2022-05-30 | 2023-11-30 | Claudio Salvador | ADAPTIVE SYSTEM BASED ON ULTRA WIDE BAND FOR DYNAMIC DETECTION OF POSSIBLE COLLISIONS |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2019068175A1 (en) | Pose determination system and method | |
US9423509B2 (en) | Moving platform INS range corrector (MPIRC) | |
US11460864B2 (en) | Moving body guidance system, moving body, guidance device, and computer program | |
US10852738B2 (en) | Self-propelled robotic tool navigation | |
US6859729B2 (en) | Navigation of remote controlled vehicles | |
US10136576B2 (en) | Navigation for a robotic working tool | |
EP3069203B1 (en) | Improved navigation for a robotic working tool | |
US20110156957A1 (en) | Precise positioning using a distributed sensor network | |
US9063211B2 (en) | Navigation system and method | |
CN114018273A (en) | Accurate positioning system and method for automatic driving vehicle in underground coal mine | |
US20160238714A1 (en) | Dead reckoning-augmented gps for tracked vehicles | |
KR101504063B1 (en) | Moving bag | |
Park et al. | Multilateration under flip ambiguity for UAV positioning using ultrawide-band | |
CN103293511A (en) | Method and system for point-to-point positioning of unmanned aerial vehicle | |
Nebot | Sensors used for autonomous navigation | |
US9250078B2 (en) | Method for determining the position of moving objects | |
SE2050629A1 (en) | Method of providing a position estimate of a robotic tool, a robotic tool, and a robotic tool system | |
CN114167467A (en) | Robot positioning method and robot | |
Ruan et al. | Accurate 2D localization for mobile robot by multi-sensor fusion | |
Karthikeyan et al. | UWB based novel approach for orientation estimation and path planning for field robots without IMU | |
CN203241533U (en) | Unmanned plane point-to-point positioning system | |
Kiriy | A Localization System for Autonomous Golf Course Mowers. | |
JPH07104846A (en) | Traveling controller for autonomously traveling vehicle | |
GB2550108B (en) | Radio locator system | |
Asgari | Drone navigation in GNSS-denied environments using custom, low-cost radio beacon systems |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18865165; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 18865165; Country of ref document: EP; Kind code of ref document: A1 |