WO2017215024A1 - Pedestrian navigation device and method based on novel multi-sensor fusion technology - Google Patents
- Publication number
- WO2017215024A1 · PCT/CN2016/087281
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- module
- imu
- processing unit
- raw data
- heading angle
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01D—MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
- G01D21/00—Measuring or testing not otherwise provided for
- G01D21/02—Measuring two or more variables by means not covered by a single other subclass
Definitions
- the invention relates to the field of multi-sensor fusion and pedestrian navigation, in particular to a pedestrian navigation device and method based on a novel multi-sensor fusion technology.
- Pedestrian navigation technologies based on wireless systems such as WiFi and Bluetooth suffer from several defects: wireless signal strength fluctuates heavily in harsh environments; complete navigation information (three-dimensional position, velocity and attitude) cannot be provided; system performance depends strongly on the distribution and number of transmitting devices; and the resulting location information is neither continuous nor smooth.
- Pedestrian navigation based on a micro inertial unit is accurate in the short term, but its navigation error accumulates quickly. Camera-based visual positioning suffers from slow calibration of the visual sensors, a high error rate in feature extraction, and slow computation of navigation information in complex environments. Multi-sensor fusion has therefore become the mainstream solution for pedestrian navigation.
- Existing multi-sensor fusion technology generally includes the following steps: (1) compute the position, velocity and attitude of the tracked object from the measurements of the inertial unit (three-axis accelerometer and three-axis gyroscope) with the inertial mechanization algorithm; (2) establish the error model corresponding to the inertial mechanization algorithm and use it as the system model of the fusion filter; (3) establish other auxiliary systems (GPS, WiFi, Bluetooth, RFID, GNSS, etc.) as the observation model of the fusion filter; (4) estimate the system state error through the prediction and update steps of the fusion filter; and (5) compensate the inertial unit errors and the position, velocity and attitude computed by the inertial mechanization algorithm with the estimated state error to obtain the final position, velocity and attitude information.
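As a hedged illustration of steps (1)-(5), the following minimal Python sketch runs a one-axis inertial mechanization loop and corrects it with an aiding position fix through a Kalman update. All matrices, noise values, the 100 Hz rate and the aiding fix are assumptions for illustration, not values from the patent.

```python
import numpy as np

# Illustrative sketch (not the patent's code): a loosely coupled
# INS + aiding-position Kalman loop in one horizontal axis.
# State x = [position, velocity]; the mechanization integrates the
# accelerometer output, and an aiding fix (e.g. GNSS/WiFi) corrects it.

dt = 0.01                                # IMU sample period, s (assumed)
F = np.array([[1.0, dt], [0.0, 1.0]])    # state transition
H = np.array([[1.0, 0.0]])               # aiding system observes position
Q = np.diag([1e-4, 1e-3])                # process noise (assumed)
R = np.array([[4.0]])                    # aiding position variance, m^2

x = np.zeros(2)                          # INS-computed state
P = np.eye(2)                            # error covariance

def ins_predict(x, accel):
    """Step (1): dead-reckon position/velocity from the accelerometer."""
    pos = x[0] + x[1] * dt + 0.5 * accel * dt**2
    vel = x[1] + accel * dt
    return np.array([pos, vel])

def kf_update(x, P, z_pos):
    """Steps (2)-(5): estimate the error and compensate the INS solution."""
    P_pred = F @ P @ F.T + Q
    innov = z_pos - H @ x                # aiding fix minus INS position
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_corr = x + K @ innov               # feed the error estimate back
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_corr, P_new

for _ in range(100):                     # 1 s of IMU data at 100 Hz
    x = ins_predict(x, accel=0.5)
x, P = kf_update(x, P, z_pos=np.array([0.3]))
print(np.round(x, 3))
```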
- Existing multi-sensor fusion technology has two fatal disadvantages: (1) in the absence of other auxiliary systems, the navigation error accumulates rapidly; (2) when the inertial unit is not rigidly fixed to the carrier (for example, a mobile phone carried by a pedestrian), traditional multi-sensor fusion cannot correctly estimate the carrier's state. Existing multi-sensor fusion technology therefore cannot provide accurate pedestrian navigation information in many scenarios.
- the technical problem to be solved by the present invention is to provide a pedestrian navigation device and method based on a novel multi-sensor fusion technology, which can improve the navigation accuracy and usability of pedestrians.
- the present invention provides a pedestrian navigation device and method based on a novel multi-sensor fusion technology, including: a handheld smart device platform, an observation processing unit, and a fusion filter; the handheld smart device uses its own hardware to acquire raw data from the inertial measurement unit (IMU), magnetometer, pressure gauge, WiFi, Bluetooth Low Energy (BLE) and Global Navigation Satellite System (GNSS) receiver.
- the observation processing unit processes the raw data provided by the handheld smart device and provides position or velocity observations to the fusion filter;
- the fusion filter uses a kinematic model as its system model and the results of the observation processing unit as its observation model; after processing by the fusion filter, the pedestrian navigation result is finally obtained.
- the handheld smart device platform comprises an IMU, a magnetometer, a pressure gauge, a WiFi, a low energy Bluetooth and a GNSS, which are common to existing smart devices;
- the IMU provides raw data of acceleration and angular velocity;
- the magnetometer provides raw data of geomagnetism
- the pressure gauge provides raw data of atmospheric pressure;
- WiFi provides raw data of Received Signal Strength (RSS);
- BLE provides raw data of BLE RSS;
- the GNSS receiver provides raw GNSS data; any other sensor of the smart device that can provide observation information can be included in the proposed multi-sensor fusion algorithm.
- the observation processing unit comprises an IMU processing unit, a magnetometer processing unit, a pressure gauge processing unit, a WiFi processing unit, a BLE processing unit, a GNSS processing unit, and the like; the IMU processing unit processes the raw acceleration and angular velocity provided by the IMU to obtain IMU position information and transmits it to the fusion filter;
- the magnetometer processing unit processes the geomagnetic raw data provided by the magnetometer to obtain geomagnetic position information and transmits it to the fusion filter;
- the pressure gauge processing unit processes the raw atmospheric pressure data provided by the pressure gauge to obtain elevation information and transmits it to the fusion filter;
- the WiFi processing unit processes the RSS raw data provided by the WiFi to obtain WiFi location information and transmits it to the fusion filter;
- the BLE processing unit processes the RSS raw data provided by the BLE to obtain BLE location information and transmits it to the fusion filter;
- the GNSS processing unit processes the position and velocity information provided by the GNSS receiver and transmits it to the fusion filter;
- the observation processing unit may also include other processing units that process other sensors of the smart device platform to obtain position or velocity information and transmit it to the fusion filter.
- the fusion filter comprises a system model and an observation model; the system model uses the kinematic model to predict the position and velocity of the object to be tracked and transmits this prediction to the observation model; the observation model combines the predicted position and velocity from the system model with the position and velocity information of the IMU, magnetometer, pressure gauge, WiFi, BLE and GNSS provided by the observation processing unit to update the final position and velocity of the tracked target.
- the IMU processing unit comprises a user motion mode and device usage pattern recognition module, a heading angle deviation estimation module, and an improved dead reckoning algorithm module; the user motion mode and device usage pattern recognition module identifies user motion modes such as stationary, walking and running, and device usage modes such as handheld, text messaging, telephone, navigation, pocket and backpack, from the raw data provided by the IMU of the handheld smart device platform and other optional hardware.
- the heading angle deviation estimation module estimates the heading angle deviation from the user motion mode and device usage mode output by the pattern recognition module and the raw data provided by the IMU of the handheld smart device platform and other optional hardware (e.g., a magnetometer); the improved dead reckoning algorithm module obtains the IMU position information from the heading angle deviation output by the heading angle deviation estimation module and the raw data provided by the IMU and other optional hardware (e.g., a magnetometer), and transmits it to the fusion filter.
- the improved dead reckoning algorithm module comprises an attitude measuring system module, a heading angle deviation compensation module, a step detection module, a step length estimation module and a dead reckoning algorithm module; the attitude measuring system module recognizes the attitude of the handheld smart device from the raw data provided by the IMU of the handheld smart device platform and the optional magnetometer; the heading angle deviation compensation module reads the heading angle deviation output by the heading angle deviation estimation module, compensates the pedestrian heading angle, and outputs it to the dead reckoning algorithm module; the step detection module detects the pedestrian's steps from the raw IMU data and feeds the result to the step length estimation module; the step length estimation module estimates the pedestrian's step length from the step detection result and the raw IMU data and feeds it to the dead reckoning module; the dead reckoning module computes the IMU position observation from the step length information output by the step length estimation module and the pedestrian heading angle output by the heading angle deviation compensation module, and feeds it to the fusion filter.
- the present invention also provides a pedestrian navigation method based on a novel multi-sensor fusion technology, comprising the following steps:
- (1) the handheld smart device uses its own hardware to obtain raw data from the IMU, magnetometer, pressure gauge, WiFi, BLE and GNSS; (2) the observation processing unit processes the raw data provided by the handheld smart device and provides position or velocity observations to the fusion filter; (3) the fusion filter uses the kinematic model as the system model and the results of the observation processing unit to establish the observation model; after processing by the fusion filter, the pedestrian navigation result is finally obtained.
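Step (3) can be sketched as follows: a constant-velocity kinematic model serves as the Kalman filter system model, and each processing unit contributes a position observation with its own variance. All numeric values, including the per-source variances, are illustrative assumptions.

```python
import numpy as np

# Hedged sketch of step (3): the kinematic model is the system model,
# and each processing unit (IMU PDR, WiFi, BLE, ...) supplies a
# position observation. State x = [east, north, v_east, v_north].

dt = 1.0
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)   # constant-velocity model
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)   # all aids observe position
Q = np.eye(4) * 0.01                        # process noise (assumed)
x = np.zeros(4)
P = np.eye(4)

def fuse(x, P, z, r):
    """One predict/update cycle with a position fix of variance r."""
    x, P = F @ x, F @ P @ F.T + Q
    S = H @ P @ H.T + np.eye(2) * r
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P

# Observations from different processing units (hypothetical fixes):
x, P = fuse(x, P, np.array([1.0, 0.5]), r=0.5)   # IMU PDR position
x, P = fuse(x, P, np.array([1.2, 0.6]), r=9.0)   # WiFi position
print(np.round(x[:2], 2))
```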
- the observation processing unit processes the raw data provided by the handheld smart device to provide a position or velocity observation to the fusion filter, including the following steps:
- the IMU processing unit processes the raw acceleration and angular velocity data provided by the IMU to obtain IMU position information and transmits it to the fusion filter;
- the magnetometer processing unit processes the geomagnetic raw data provided by the magnetometer to obtain geomagnetic position information and transmits it to the fusion filter;
- the pressure gauge processing unit processes the raw atmospheric pressure data provided by the pressure gauge to obtain elevation information and transmits it to the fusion filter;
- the WiFi processing unit processes the RSS raw data provided by the WiFi to obtain WiFi location information and transmits it to the fusion filter;
- the BLE processing unit processes the RSS raw data provided by the BLE to obtain BLE location information and transmits it to the fusion filter;
- the GNSS processing unit processes the raw GNSS data provided by the GNSS chip to obtain GNSS position and velocity information and transmits it to the fusion filter.
- the IMU processing unit processes the raw data of the acceleration and angular velocity provided by the IMU to obtain the IMU position information and transmits the information to the fusion filter, including the following steps:
- the user motion mode and device usage pattern recognition module identifies user motion modes such as stationary, walking and running, and device usage modes such as handheld, text messaging, telephone, navigation, pocket and backpack, from the raw data provided by the IMU of the handheld smart device platform and other optional hardware (e.g., a magnetometer);
- the heading angle deviation estimation module estimates the heading angle deviation from the user motion mode and device usage mode output by the pattern recognition module and the raw data provided by the IMU of the handheld smart device platform and other optional hardware (e.g., a magnetometer);
- the improved dead reckoning algorithm module obtains IMU position information from the heading angle deviation output by the heading angle deviation estimation module and the raw data provided by the IMU of the handheld smart device platform and other optional hardware (e.g., a magnetometer), and transmits it to the fusion filter.
- the improved dead reckoning algorithm module comprises the following steps:
- the attitude measuring system module identifies the attitude of the handheld smart device from the raw data provided by the IMU of the handheld smart device platform and the optional magnetometer;
- the heading angle deviation compensation module reads the heading angle deviation output by the heading angle deviation estimation module, compensates the pedestrian heading angle, and outputs it to the dead reckoning algorithm module;
- the step detection module detects the pedestrian's steps from the raw IMU data of the handheld smart device platform and feeds the result to the step length estimation module;
- the step length estimation module estimates the pedestrian's step length from the step detection result and the raw IMU data of the handheld smart device platform, and feeds it to the dead reckoning module;
- the dead reckoning module computes the IMU position observation from the step length information output by the step length estimation module and the pedestrian heading angle output by the heading angle deviation compensation module, and feeds it to the fusion filter.
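The per-step flow of the improved dead reckoning modules might be sketched as follows. The function names, the per-step data, and the zero heading deviation are hypothetical; the east/north advance uses the common sin/cos convention for heading measured from north.

```python
import math

# Minimal sketch (assumed structure, not the patent's code): each
# detected step combines an estimated step length with the compensated
# pedestrian heading to advance the east/north position.

def compensate_heading(psi_device, psi_offset):
    """Pedestrian heading = device heading + estimated deviation."""
    return (psi_device + psi_offset) % (2 * math.pi)

def dead_reckon(pos_e, pos_n, step_len, psi_p):
    """Advance the east/north position along the pedestrian heading."""
    return (pos_e + step_len * math.sin(psi_p),
            pos_n + step_len * math.cos(psi_p))

pos = (0.0, 0.0)
# hypothetical per-step outputs of the step-length and attitude modules:
steps = [(0.7, math.radians(90)),
         (0.7, math.radians(90)),
         (0.75, math.radians(45))]
for step_len, psi_dev in steps:
    psi_p = compensate_heading(psi_dev, psi_offset=0.0)
    pos = dead_reckon(pos[0], pos[1], step_len, psi_p)
print(tuple(round(c, 2) for c in pos))  # two steps east, one north-east
```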
- the beneficial effect of the invention is that it changes how the IMU is used in traditional pedestrian navigation, moving it from the system model of the fusion filter into the observation model, and thereby overcomes the disadvantage of traditional multi-sensor fusion that navigation errors accumulate quickly in the absence of other auxiliary systems.
- the IMU processing module in the present invention considers the various usage modes of handheld smart devices in daily life and breaks through the limitation of traditional multi-sensor fusion that the IMU must be rigidly fixed to the carrier. The present invention therefore greatly improves the accuracy and usability of pedestrian navigation.
- FIG. 1 is a schematic structural diagram of a pedestrian navigation device based on a novel multi-sensor fusion according to the present invention.
- FIG. 2 is a schematic structural view of an inertial unit processing module according to the present invention.
- FIG. 3 is a schematic diagram of a Gaussian kernel support vector machine nonlinear classifier according to the present invention.
- FIG. 4 is a schematic diagram of a user motion mode and a smart device usage pattern recognition support vector machine in the present invention.
- FIG. 5 is a schematic diagram of an improved pedestrian dead reckoning algorithm in the present invention.
- a pedestrian navigation device based on a novel multi-sensor fusion includes: a handheld smart device platform 1, an observation processing unit 2, and a fusion filter 3.
- the handheld smart device 1 uses its own hardware to acquire raw data from an Inertial Measurement Unit (IMU) 11, a magnetometer 12, a pressure gauge 13, a WiFi 14, a Bluetooth Low Energy (BLE) 15 and a Global Navigation Satellite System (GNSS) 16.
- the observation processing unit 2 processes the raw data provided by the handheld smart device 1 to provide position or velocity observations to the fusion filter 3; the fusion filter 3 uses the kinematic model as the system model 31 and the results of the observation processing unit to establish the observation model 32; processing by the fusion filter 3 finally yields the pedestrian navigation result.
- the above novel multi-sensor fusion pedestrian navigation device can be used for various handheld smart devices (including smart phones, tablet computers, smart watches, etc.), and the handheld smart device can be held by hand or fixed on a pedestrian.
- the novel multi-sensor fusion pedestrian navigation system shown in Figure 1 changes the use of the IMU in traditional pedestrian navigation, moving it from the system model 31 of the fusion filter into the observation model 32, and overcomes the disadvantage of traditional multi-sensor pedestrian navigation devices that navigation errors accumulate rapidly without other auxiliary systems.
- the above-mentioned handheld smart device platform 1 includes an IMU 11, a magnetometer 12, a pressure gauge 13, a WiFi 14, a BLE 15 and a GNSS 16, which are common to existing smart devices; the IMU 11 provides raw acceleration and angular velocity data; the magnetometer 12 provides raw geomagnetic data;
- the GNSS 16 provides raw GNSS speed and position data. Any other sensor of the handheld smart device 1 that can provide observation information can be included in the proposed multi-sensor fusion algorithm.
- the above-described observation processing unit 2 includes an IMU processing unit 21, a magnetometer processing unit 22, a pressure gauge processing unit 23, a WiFi processing unit 24, a BLE processing unit 25, a GNSS processing unit 26, and the like.
- the IMU processing unit 21 processes the raw data of the acceleration and angular velocity provided by the IMU 11 to obtain IMU position information and transmits it to the fusion filter 3;
- the magnetometer processing unit 22 processes the raw geomagnetic data provided by the magnetometer 12 to obtain geomagnetic position information and transmits it to the fusion filter 3;
- the pressure gauge processing unit 23 processes the raw atmospheric pressure data provided by the pressure gauge 13 to obtain elevation information and transmits it to the fusion filter 3;
- the WiFi processing unit 24 processes the RSS raw data provided by the WiFi 14 to obtain WiFi location information and transmits it to the fusion filter 3;
- the BLE processing unit 25 processes the RSS raw data provided by the BLE 15 to obtain BLE location information and transmits it to the fusion filter 3;
- the GNSS processing unit 26 processes the raw data of the GNSS 16 to obtain GNSS position and velocity information and transmits it to the fusion filter 3.
- the observation processing unit 2 also includes other processing units to process other sensors of the handheld smart device platform 1 to obtain position or velocity information and transmit to the fusion filter 3.
- the above-described fusion filter 3 includes a system model 31 and an observation model 32.
- the system model 31 predicts the position and velocity of the tracked object using the kinematic model and transmits the prediction to the observation model 32; the observation model 32 combines the predicted position and velocity from the system model 31 with the position and velocity information of the IMU, magnetometer 12, pressure gauge 13, WiFi 14, BLE 15 and GNSS 16 provided by the observation processing unit to update the final position and velocity of the tracked target.
- the IMU processing unit 21 includes a user motion mode and device usage pattern recognition module 211, a heading angle deviation estimation module 212, and an improved dead reckoning algorithm module 213; the user motion mode and device usage pattern recognition module 211 recognizes user motion modes such as stationary, walking and running, and device usage modes such as handheld, text messaging, telephone, navigation, pocket and backpack, from the raw data provided by the IMU 11 of the handheld smart device platform and other optional hardware (such as the magnetometer 12).
- the heading angle deviation estimation module 212 estimates the heading angle deviation from the user motion mode and device usage mode output by the recognition module 211 and the raw data provided by the IMU 11 of the handheld smart device platform and other optional hardware (e.g., magnetometer 12).
- the improved dead reckoning algorithm module 213 obtains IMU position information from the heading angle deviation output by the heading angle deviation estimation module 212 and the raw data provided by the IMU 11 of the handheld smart device platform 1 and other optional hardware (e.g., magnetometer 12), and transmits it to the fusion filter 3.
- the IMU processing unit in Figure 2 takes into account multiple pedestrian motion modes and multiple smart device usage modes, and its IMU data processing methods are designed for a variety of usage scenarios; this breaks through the limitation of traditional algorithms that the IMU must be fixed to the carrier and improves the usability of the pedestrian navigation system.
- the user motion mode and device usage pattern recognition module 211 uses the sensor outputs available on existing handheld smart devices 1: IMU 11, magnetometer 12, ranging sensor (optional), and light sensor (optional).
- the IMU 11 and magnetometer 12 update at 50-200 Hz; the latter two sensors output scalars and update when triggered by user behavior.
- the user motion mode and device usage pattern recognition algorithm extracts sensor statistics over 1-3 seconds to make a classification decision.
- User motion patterns and device usage pattern recognition algorithms can be implemented in a variety of ways.
- the present invention uses a Gaussian kernel dual-form support vector machine as an example implementation.
- the Gaussian kernel support vector machine implicitly maps feature vectors into an infinite-dimensional linear space, matching or exceeding the performance of nonlinear classifiers such as traditional KNN.
- the primal form of the 1-norm soft-margin support vector machine is:
- min_{w,b,ξ} (1/2)‖w‖² + C Σ_i ξ_i, subject to y_i(wᵀφ(x_i) + b) ≥ 1 − ξ_i, ξ_i ≥ 0
- where w ∈ R^d is the weight vector, C is a controllable regularization constant that balances the training error against the margin, and φ(·) is the feature vector mapping function.
- only the support vectors participate in online classification calculations (for example, to implement the nonlinear classifier on the randomly generated data shown in Figure 3, KNN needs to store and use all 1000 original training feature vectors, while the support vector machine needs only 142 support vectors); the demands on processor power consumption and system memory are therefore greatly reduced, making the approach suitable for the handheld smart device platform.
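The online-classification economy described above can be illustrated with the dual-form decision function of a Gaussian kernel SVM: only the stored support vectors are evaluated, not the full training set. The support vectors, dual weights, bias and gamma below are hypothetical, as if produced by offline training.

```python
import numpy as np

# Illustrative sketch of the dual decision function
# f(x) = sum_i alpha_i * y_i * k(x, s_i) + b with a Gaussian kernel.

def rbf(x, sv, gamma=0.5):
    """Gaussian kernel k(x, s) = exp(-gamma * ||x - s||^2) per support vector."""
    return np.exp(-gamma * np.sum((sv - x) ** 2, axis=1))

# hypothetical trained parameters: a few support vectors suffice online
support_vectors = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0]])
alpha_y = np.array([1.2, -1.5, 0.9])   # alpha_i * y_i from the dual problem
b = 0.1

def classify(x):
    """Sign of the dual decision function."""
    score = alpha_y @ rbf(np.asarray(x, dtype=float), support_vectors) + b
    return 1 if score > 0 else -1

print(classify([0.1, 0.0]), classify([1.0, 0.9]))  # -> 1 -1
```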
- the user motion mode and device usage pattern recognition module 211 classifies the pedestrian behavior patterns into five categories: 1. stationary; 2. walking; 3. running; 4. bicycle; 5. driving a car.
- the identification of pedestrian behavior patterns can be used to apply zero-speed correction, to adjust the variance of the tracking filter process noise, and to adjust the correlation time of the dynamic system Markov process.
- the user motion mode and device usage mode recognition module 211 classifies the device usage modes into four categories: 1. held flat in front; 2. held vertically at the ear; 3. in a backpack; 4. in an armband.
- the identification of the device usage mode can be used to determine the heading direction (coordinate transformation) and to adjust the variance of the tracking filter process noise.
- FIG. 4 is a schematic diagram of the support vector machine for user motion mode and device usage pattern identification using a two-stage classifier, including an acceleration statistic 2111, an angular velocity statistic 2112, a rotation angle and tilt angle statistic 2113, a light and distance statistic 2114, a speed feedback statistic 2115, a feature normalization module 2116, a principal component analysis module 2117, a support vector machine module 2118, a user motion mode primary classifier 2119, and a device usage mode secondary classifier 2110.
- the specific implementation steps include: offline, collect representative data sets, perform feature vector normalization and principal component analysis, apply formulas (5) and (6) for training, and extract and store the support vectors; online, compute the sensor output statistics, apply the same feature vector normalization and principal component extraction as in training, and apply the stored support vectors with formulas (5) and (7) for two-stage classification to determine the user motion mode and device usage mode.
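A rough sketch of this offline/online pipeline, using synthetic data and a stubbed classifier standing in for the trained SVM stages (the threshold and labels are placeholders, not the patent's classifiers):

```python
import numpy as np

# Sketch: offline, learn normalization parameters and principal
# components; online, apply the stored transforms to each new sample
# before handing it to the two-stage classifier.

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 6))          # synthetic sensor statistics

# offline phase: store normalization parameters and principal components
mu, sigma = X.mean(axis=0), X.std(axis=0)
Xn = (X - mu) / sigma
_, _, Vt = np.linalg.svd(Xn, full_matrices=False)
components = Vt[:3]                    # keep 3 principal components

def reduce_online(x):
    """Apply the stored training normalization and PCA to one sample."""
    return components @ ((x - mu) / sigma)

def classify_two_stage(feat):
    """Placeholder for the SVM primary/secondary classifiers."""
    motion = "walking" if np.linalg.norm(feat) > 1.0 else "stationary"
    usage = "handheld"                 # stubbed secondary decision
    return motion, usage

feat = reduce_online(X[0])
print(feat.shape, classify_two_stage(feat))
```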
- the heading angle deviation estimation module 212 in the present invention can use a number of different methods.
- Principal Component Analysis (PCA): one characteristic of pedestrian movement is that acceleration and deceleration occur mainly along the direction of travel; the accelerometer data can therefore be analyzed by PCA to obtain the pedestrian's direction of travel.
- GNSS 16: the pedestrian's direction of travel can be calculated from the GNSS velocity.
- magnetometer 12: the pedestrian's direction of travel can also be calculated from the magnetometer 12.
- the heading angle of the handheld smart device is derived from a nine-axis or six-axis fusion. The heading angle deviation is therefore obtained by subtracting the heading angle of the handheld smart device from the pedestrian's direction of travel obtained by the above methods, and is output to the heading angle deviation compensation module 2132.
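The PCA method above can be illustrated as follows: with synthetic horizontal accelerations that vary mainly along an assumed 30° walking direction, the first principal component of the (east, north) acceleration recovers the travel axis (ambiguous by 180°). The signal model and noise levels are assumptions for the demonstration.

```python
import numpy as np

# Sketch of PCA heading estimation: fore/aft acceleration dominates
# along the walking direction, so the largest-eigenvalue eigenvector of
# the horizontal acceleration covariance points along the travel axis.

rng = np.random.default_rng(1)
true_heading = np.radians(30)          # assumed walking direction
amp = rng.normal(0.0, 1.0, 500)        # fore/aft acceleration profile
acc_e = amp * np.sin(true_heading) + rng.normal(0, 0.1, 500)
acc_n = amp * np.cos(true_heading) + rng.normal(0, 0.1, 500)

A = np.column_stack([acc_e, acc_n])
cov = np.cov(A.T)
eigvals, eigvecs = np.linalg.eigh(cov)
v = eigvecs[:, -1]                     # principal component (travel axis)
heading = np.arctan2(v[0], v[1]) % np.pi   # axis only: 180-deg ambiguous
print(round(np.degrees(heading), 1))   # close to 30 degrees
```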
- FIG. 5 is a schematic diagram of the improved dead reckoning algorithm module 213, including an attitude measuring system module 2131, a heading angle deviation compensation module 2132, a step detection module 2133, a step length estimation module 2134, and a dead reckoning algorithm module 2135.
- the attitude measuring system module 2131 identifies the attitude of the handheld smart device 1 from the raw data provided by the IMU 11 of the handheld smart device platform 1 and the optional magnetometer 12.
- the heading angle deviation compensation module 2132 reads the heading angle deviation output by the heading angle deviation estimation module 212, compensates the pedestrian heading angle, and outputs it to the dead reckoning algorithm module 2135.
- the step detection module 2133 detects the step number of the pedestrian from the raw data of the IMU 11 of the handheld smart device platform and feeds back to the step estimation module 2134.
- the step estimation module 2134 estimates the step size of the pedestrian based on the result of the step detection module and the original data of the IMU 11 of the handheld smart device platform 1 and feeds back to the dead reckoning module 2135.
- the dead reckoning module 2135 calculates the IMU position observation based on the step information output by the step estimating module 2134 and the heading information output by the heading angle offset compensation module 2132, and outputs the IMU position observation to the fusion filter 3.
- the attitude measurement system module 2131 identifies the attitude information of the handheld smart device 1 based on the raw data provided by the IMU 11 of the handheld smart device platform 1 and other optional magnetometers 12.
- the attitude measuring system module 2131 selects a nine-axis or a six-axis attitude determination algorithm depending on whether geomagnetic information is available.
- the attitude measuring system module 2131 outputs the heading angle of the smart device 1 to the heading angle deviation compensation module 2132.
- the heading angle deviation compensation module 2132 reads the heading angle deviation output by the heading angle deviation estimation module 212, compensates the pedestrian heading angle, and outputs it to the dead reckoning algorithm module 2135.
- the specific calculation formula is as follows:
- ⁇ p ⁇ d + ⁇ offset . (1)
- ⁇ p the pedestrian heading angle
- ⁇ d the heading angle of the equipment
- ⁇ offset the heading angle deviation
- the step detection module 2133 detects the step number of the pedestrian from the raw data of the IMU 11 of the handheld smart device platform 1 and feeds back to the step estimation module 2134.
- step detection can be performed by peak detection, zero-crossing detection, correlation detection or power spectrum detection.
- the present invention contemplates a variety of user motion modes and device usage modes.
- the step detection algorithm uses peak detection and simultaneously examines the acceleration and gyroscope data of the IMU 11.
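A hedged sketch of peak-detection step counting on a synthetic acceleration-magnitude signal; the sampling rate, threshold and minimum peak spacing are assumed values, and the same scheme can be run on the gyroscope magnitude as the text suggests.

```python
import numpy as np

# Sketch: count local maxima of the acceleration magnitude that exceed
# a threshold and are separated by a minimum interval (debouncing).

fs = 50                                           # IMU rate, Hz (assumed)
t = np.arange(0, 5, 1 / fs)
acc_mag = 9.8 + 2.0 * np.sin(2 * np.pi * 2 * t)   # synthetic ~2 steps/s

def count_steps(a, thresh=10.5, min_gap=int(0.3 * 50)):
    """Peak detection with amplitude threshold and minimum peak spacing."""
    steps, last = 0, -min_gap
    for i in range(1, len(a) - 1):
        is_peak = a[i] > thresh and a[i] >= a[i - 1] and a[i] > a[i + 1]
        if is_peak and i - last >= min_gap:
            steps, last = steps + 1, i
    return steps

print(count_steps(acc_mag))  # 10 peaks in 5 s at 2 steps/s
```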
- the step length estimation module 2134 estimates the pedestrian's step length from the result of the step detection module 2133 and the raw data of the IMU 11 of the handheld smart device platform 1, and outputs it to the dead reckoning module 2135.
- the step length can be estimated by different methods such as acceleration integration, the pendulum model, the linear model, and the empirical model.
- the present invention contemplates a variety of user motion patterns and device usage patterns, and the step length estimation uses a linear model of the step frequency and the accelerometer variance, in which:
- A, B and C are constants
- f k-1 and f k are the step frequencies at times k-1 and k
- σ acc,k-1 and σ acc,k are the accelerometer variances at times k-1 and k
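The exact printed form of the linear model is not reproduced in the text; a common form consistent with the listed quantities (step frequency and accelerometer variance, with constants A, B, C) is sketched below. The constant values are illustrative assumptions and would be calibrated per user and motion mode in practice:

```python
def step_length(f_k, var_acc_k, A=0.37, B=0.23, C=0.1):
    """Linear step-length model built from the quantities the text lists:
    step frequency f_k (Hz) and accelerometer variance var_acc_k.

        s = A * f_k + B * var_acc_k + C

    The exact form and the constants A, B, C are assumptions for
    illustration, not values from the source.
    """
    return A * f_k + B * var_acc_k + C
```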
- the dead reckoning module 2135 derives the position [r e,k r n,k ] T at time k from the position [r e,k-1 r n,k-1 ] T at time k-1, the step length s k-1,k output by the step estimation module 2134, and the heading angle ψ k-1 output by the heading angle offset compensation module 2132.
- the corresponding calculation formula is as follows:
- r e,k = r e,k-1 + s k-1,k sin(ψ k-1 ), r n,k = r n,k-1 + s k-1,k cos(ψ k-1 )
- the dead reckoning module 2135 outputs an IMU position measurement to the fusion filter 3.
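Under the usual step-and-heading convention (heading measured clockwise from north, an assumption here), one dead-reckoning position update can be sketched as:

```python
import math

def dr_update(r_e, r_n, s, psi):
    """One dead-reckoning step: advance the east/north position by step
    length s (metres) along heading psi (radians, clockwise from north,
    an assumed convention):

        r_e,k = r_e,k-1 + s * sin(psi)
        r_n,k = r_n,k-1 + s * cos(psi)
    """
    return r_e + s * math.sin(psi), r_n + s * math.cos(psi)
```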
- the fusion filter 3 includes a system model 31 and an observation model 32.
- the traditional multi-sensor fusion structure generally processes the IMU measurement data through an inertial mechanization algorithm and builds the fusion filter system model around it. Because inertial mechanization involves many integration operations, the positioning error of the conventional multi-sensor fusion structure accumulates rapidly when no external auxiliary system is available.
- the present invention overcomes these shortcomings of the conventional multi-sensor fusion structure: the pedestrian motion model serves as the system model, while the IMU-derived data feeds an observation model, just like the other systems.
- the fusion filter 3 can adopt a Kalman Filter (KF), an Adaptive Kalman Filter (AKF), an Unscented Kalman Filter (UKF), or a Particle Filter (PF).
- the present invention gives a design example for the KF; other filters can follow the KF design.
- the state vector of the fusion filter 3 implemented as a KF is defined as x = [r e r n r u v e v n v u ] T , where r e , r n and r u are the three-dimensional position components (East-North-Up coordinate system) and v e , v n and v u are the corresponding three-dimensional velocity components.
- the KF system model 31 uses a classical kinematic model and is defined as x k+1|k = Φ k,k+1 x k , in which:
- x k+1|k is the predicted state vector
- x k is the state vector at time k
- Φ k,k+1 is the 6×6 transition matrix coupling position to velocity through Δt
- Δt is the time difference between the two epochs
- Σ k is the covariance matrix
- the process noise is defined as follows:
- n e , n n and n u are Gaussian white noise, and Δt is the time difference between the two epochs.
- the measurement model 32 implemented by the fusion filter 3 as a KF is defined as z k = H k x k + η k , in which:
- z k is the measurement vector and H k is the design matrix
- η k is the measurement noise, modeled as Gaussian white noise with covariance matrix R k ; z k and H k vary depending on the measurement.
- the typical z k and H k are defined as follows:
- the KF process has two phases: prediction and update.
- in the prediction phase, the state vector and the covariance matrix are predicted from the system model.
- in the update phase, the state vector and covariance matrix are updated according to the measurement model:
- K k is called the Kalman gain
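The KF prediction and update phases above can be sketched with a per-axis constant-velocity filter. The 6-state ENU filter decouples into three such 2-state filters when the noise is axis-independent, so a single axis is shown for brevity; the noise parameters are illustrative assumptions:

```python
class AxisKF:
    """Per-axis constant-velocity Kalman filter (state: [position, velocity]).

    The 6-state ENU filter in the text decouples into three such 2-state
    filters when process and measurement noise are axis-independent; the
    noise levels q and r below are assumed values for illustration.
    """

    def __init__(self, q=0.1, r=1.0):
        self.x = [0.0, 0.0]                    # state [position, velocity]
        self.P = [[10.0, 0.0], [0.0, 10.0]]    # state covariance
        self.q, self.r = q, r                  # process / measurement noise

    def predict(self, dt):
        # x = Phi x, with Phi = [[1, dt], [0, 1]]
        r, v = self.x
        self.x = [r + dt * v, v]
        p00, p01 = self.P[0][0], self.P[0][1]
        p10, p11 = self.P[1][0], self.P[1][1]
        # P = Phi P Phi^T + Q, with white-noise-acceleration Q scaled by q
        n00 = p00 + dt * (p10 + p01) + dt * dt * p11 + self.q * dt ** 3 / 3
        n01 = p01 + dt * p11 + self.q * dt ** 2 / 2
        n10 = p10 + dt * p11 + self.q * dt ** 2 / 2
        n11 = p11 + self.q * dt
        self.P = [[n00, n01], [n10, n11]]

    def update(self, z):
        # position-only measurement: z = H x + eta, H = [1, 0]
        s = self.P[0][0] + self.r                    # innovation covariance
        k0, k1 = self.P[0][0] / s, self.P[1][0] / s  # Kalman gain K
        innov = z - self.x[0]
        self.x = [self.x[0] + k0 * innov, self.x[1] + k1 * innov]
        p00, p01 = self.P[0][0], self.P[0][1]
        # P = (I - K H) P
        self.P = [[(1 - k0) * p00, (1 - k0) * p01],
                  [self.P[1][0] - k1 * p00, self.P[1][1] - k1 * p01]]
```

Feeding the filter position observations from dead reckoning, WiFi, BLE or GNSS then alternates `predict` and `update`, exactly as the two phases above describe.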
Abstract
A pedestrian navigation apparatus and method based on a novel multi-sensor fusion technology. The apparatus comprises a handheld intelligent device (1), an observation processing unit (2) and a fusion filter (3). The method comprises: the handheld intelligent device (1) acquiring, with its own hardware, raw data from an IMU (11), a magnetometer (12), a pressure gauge (13), WiFi (14), BLE (15) and GNSS (16); the observation processing unit (2) processing the raw data provided by the handheld intelligent device (1) to supply position and velocity observations to the fusion filter (3); and the fusion filter (3) obtaining the pedestrian navigation result by using a kinematic model as the system model and an observation model established from the processing results of the observation processing unit (2). The method overcomes the rapid accumulation of navigation errors when no other auxiliary system is available. The IMU processing unit (21) takes into account the multiple usage modes of the handheld intelligent device (1) and removes the limitation of conventional multi-sensor fusion that the IMU must be fixed to the carrier, improving the accuracy of pedestrian navigation.
Description
The invention relates to the field of multi-sensor fusion and pedestrian navigation, and in particular to a pedestrian navigation device and method based on a novel multi-sensor fusion technology.
With the development of the mobile Internet, indoor and outdoor pedestrian navigation applications are booming, for example indoor navigation in large shopping malls, hospital patient tracking, and supermarket foot-traffic analysis. Market analysis reports at home and abroad agree that pedestrian navigation is a research direction with a huge market. At the same time, portable smart devices such as smartphones, tablets and smart watches have developed at an extremely rapid pace over the past decade and have become an indispensable part of people's lives. Most of these portable devices have powerful processors, wireless transceivers, cameras, Global Navigation Satellite System (GNSS) receivers and numerous sensors. As a result, portable smart devices have become an ideal platform for multi-sensor fusion and pedestrian-navigation applications.
At present, every single pedestrian navigation technology has defects to some degree. Pedestrian navigation based on wireless systems such as WiFi and Bluetooth typically suffers from large signal-strength fluctuations in harsh environments, cannot provide complete navigation information such as three-dimensional position, velocity and attitude, depends heavily on the distribution and number of transmitting devices, and yields position estimates that are neither continuous nor smooth. Pedestrian navigation based on micro inertial units is accurate in the short term, but its navigation error accumulates quickly. Camera-based visual positioning suffers, in complex environments, from slow visual-sensor calibration, high feature-extraction error rates, and slow computation of navigation information. Therefore, multi-sensor fusion has become the mainstream solution for pedestrian navigation.
At present, existing multi-sensor fusion technology generally includes the following steps: (1) compute the position, velocity and attitude of the tracked object from the measurements of the inertial unit (three-axis accelerometer and three-axis gyroscope) through an inertial mechanization algorithm; (2) establish the error model corresponding to the inertial mechanization algorithm and use it as the system model of the fusion filter; (3) model the other auxiliary systems (GPS, WiFi, Bluetooth, RFID, GNSS, etc.) as the observation model of the fusion filter; (4) estimate the system state errors through the prediction and update process of the fusion filter; and (5) compensate the inertial-unit errors and the mechanization-based position, velocity and attitude with the estimated state errors to obtain the final position, velocity and attitude information.
Existing multi-sensor fusion technology has two fatal disadvantages: (1) in the absence of other auxiliary systems, the navigation error accumulates rapidly; and (2) when the inertial unit is not fixed to the carrier, for example a mobile phone carried by a pedestrian, traditional multi-sensor fusion cannot correctly estimate the carrier's state. Therefore, existing multi-sensor fusion technology cannot provide accurate pedestrian navigation information in many scenarios.
Summary of the invention
The technical problem to be solved by the present invention is to provide a pedestrian navigation device and method based on a novel multi-sensor fusion technology, so as to improve the accuracy and usability of pedestrian navigation.
To solve the above technical problem, the present invention provides a pedestrian navigation device and method based on a novel multi-sensor fusion technology, including: a handheld smart device platform, an observation processing unit, and a fusion filter. The handheld smart device uses its own hardware to acquire raw data from the Inertial Measurement Unit (IMU), magnetometer, pressure gauge, WiFi, Bluetooth Low Energy (BLE) and Global Navigation Satellite System (GNSS); the observation processing unit processes the raw data provided by the handheld smart device to supply observations such as position or velocity to the fusion filter; the fusion filter uses a kinematic model as the system model, builds the observation model from the results of the observation processing unit, and finally produces the pedestrian navigation result.
Preferably, the handheld smart device platform includes the IMU, magnetometer, pressure gauge, WiFi, Bluetooth Low Energy and GNSS hardware common to existing smart devices. The IMU provides raw acceleration and angular-velocity data; the magnetometer provides raw geomagnetic data; the pressure gauge provides raw atmospheric-pressure data; WiFi provides raw WiFi Received Signal Strength (RSS) data; BLE provides raw BLE RSS data; the GNSS receiver provides raw GNSS data. Any other sensor of the smart device platform that can provide observation information can be included in the proposed multi-sensor fusion algorithm.
Preferably, the observation processing unit includes an IMU processing unit, a magnetometer processing unit, a pressure gauge processing unit, a WiFi processing unit, a BLE processing unit and a GNSS processing unit. The IMU processing unit processes the raw acceleration and angular-velocity data provided by the IMU to obtain IMU position information and transmits it to the fusion filter; the magnetometer processing unit processes the raw geomagnetic data provided by the magnetometer to obtain geomagnetic position information and transmits it to the fusion filter; the pressure gauge processing unit processes the raw atmospheric-pressure data provided by the pressure gauge to obtain elevation information and transmits it to the fusion filter; the WiFi processing unit processes the raw RSS data provided by WiFi to obtain WiFi position information and transmits it to the fusion filter; the BLE processing unit processes the raw RSS data provided by BLE to obtain BLE position information and transmits it to the fusion filter; the GNSS processing unit processes the position and velocity information provided by the GNSS receiver and transmits it to the fusion filter. The observation processing unit may also include further processing units that process other sensors of the smart device platform to obtain position or velocity information and transmit it to the fusion filter.
Preferably, the fusion filter includes a system model and an observation model. The system model uses a kinematic model to predict the position and velocity of the target and passes the prediction to the observation model; the observation model combines the predicted position and velocity with the position and velocity information based on the IMU, magnetometer, pressure gauge, WiFi, BLE and GNSS provided by the observation processing unit, and updates the final position and velocity of the target.
Preferably, the IMU processing unit includes a user motion mode and device usage mode recognition module, a heading angle deviation estimation module, and an improved dead reckoning algorithm module. The user motion mode and device usage mode recognition module identifies user motion modes such as stationary, walking and running, and device usage modes such as handheld, texting, calling, navigation, pocket and backpack, from the raw data provided by the IMU of the handheld smart device platform and other optional hardware (for example, a magnetometer). The heading angle deviation estimation module estimates the heading angle deviation from the recognized user motion mode and device usage mode together with the raw data provided by the IMU and other optional hardware (for example, a magnetometer). The improved dead reckoning algorithm module obtains IMU position information from the heading angle deviation output by the heading angle deviation estimation module and the raw data provided by the IMU and other optional hardware (for example, a magnetometer), and transmits it to the fusion filter.
Preferably, the improved dead reckoning algorithm module includes an attitude measurement system module, a heading angle deviation compensation module, a step detection module, a step length estimation module, and a dead reckoning algorithm module. The attitude measurement system module identifies the attitude of the handheld smart device from the raw data provided by the IMU of the handheld smart device platform and the optional magnetometer; the heading angle deviation compensation module reads the heading angle deviation output by the heading angle deviation estimation module, compensates the pedestrian heading angle with it, and outputs the result to the dead reckoning algorithm; the step detection module detects the pedestrian's steps from the raw IMU data and feeds them back to the step length estimation module; the step length estimation module estimates the pedestrian's step length from the result of the step detection module and the raw IMU data and feeds it back to the dead reckoning module; the dead reckoning module computes the IMU position observation from the step length information output by the step length estimation module and the pedestrian heading angle output by the heading angle deviation compensation module, and feeds it back to the fusion filter.
Correspondingly, the present invention also provides a pedestrian navigation method based on a novel multi-sensor fusion technology, comprising the following steps:
(1) the handheld smart device uses its own hardware to acquire raw data from the IMU, magnetometer, pressure gauge, WiFi, BLE and GNSS; (2) the observation processing unit processes the raw data provided by the handheld smart device to supply observations such as position or velocity to the fusion filter; (3) the fusion filter uses a kinematic model as the system model, builds the observation model from the results of the observation processing unit, and finally produces the pedestrian navigation result.
Preferably, the observation processing unit processing the raw data provided by the handheld smart device to supply observations such as position or velocity to the fusion filter includes the following steps:
(1) the IMU processing unit processes the raw acceleration and angular-velocity data provided by the IMU to obtain IMU position information and transmits it to the fusion filter; (2) the magnetometer processing unit processes the raw geomagnetic data provided by the magnetometer to obtain geomagnetic position information and transmits it to the fusion filter; (3) the pressure gauge processing unit processes the raw atmospheric-pressure data provided by the pressure gauge to obtain elevation information and transmits it to the fusion filter; (4) the WiFi processing unit processes the raw RSS data provided by WiFi to obtain WiFi position information and transmits it to the fusion filter; (5) the BLE processing unit processes the raw RSS data provided by BLE to obtain BLE position information and transmits it to the fusion filter; (6) the GNSS processing unit processes the raw GNSS data provided by the GNSS chip to obtain GNSS position and velocity information and transmits it to the fusion filter.
Preferably, the IMU processing unit processing the raw acceleration and angular-velocity data provided by the IMU to obtain IMU position information and transmitting it to the fusion filter includes the following steps:
(1) the user motion mode and device usage mode recognition module identifies user motion modes such as stationary, walking and running, and device usage modes such as handheld, texting, calling, navigation, pocket and backpack, from the raw data provided by the IMU of the handheld smart device platform and other optional hardware (for example, a magnetometer);
(2) the heading angle deviation estimation module estimates the heading angle deviation from the user motion mode and device usage mode output by the recognition module together with the raw data provided by the IMU of the handheld smart device platform and other optional hardware (for example, a magnetometer);
(3) the improved dead reckoning algorithm module obtains IMU position information from the heading angle deviation output by the heading angle deviation estimation module and the raw data provided by the IMU of the handheld smart device platform and other optional hardware (for example, a magnetometer), and transmits it to the fusion filter.
Preferably, the improved dead reckoning algorithm module operates with the following steps:
(1) the attitude measurement system module identifies the attitude of the handheld smart device from the raw data provided by the IMU of the handheld smart device platform and the optional magnetometer;
(2) the heading angle deviation compensation module reads the heading angle deviation output by the heading angle deviation estimation module, compensates the pedestrian heading angle with it, and outputs the result to the dead reckoning algorithm;
(3) the step detection module detects the pedestrian's steps from the raw data of the IMU of the handheld smart device platform and feeds them back to the step length estimation module;
(4) the step length estimation module estimates the pedestrian's step length from the result of the step detection module and the raw data of the IMU of the handheld smart device platform, and feeds it back to the dead reckoning module;
(5) the dead reckoning module computes the IMU position observation from the step length information output by the step length estimation module and the pedestrian heading angle information output by the heading angle deviation compensation module, and feeds it back to the fusion filter.
The beneficial effects of the present invention are as follows: the invention optimizes the way the IMU is used in traditional pedestrian navigation by liberating it from the system model of the fusion filter and turning it into an observation model, overcoming the disadvantage of traditional multi-sensor fusion that, without other auxiliary systems, navigation errors accumulate rapidly. The IMU processing module of the present invention considers the many modes in which handheld smart devices are used in daily life, breaking through the limitation of traditional multi-sensor fusion that the IMU must be fixed to the carrier. The invention therefore greatly improves the accuracy and usability of pedestrian navigation.
FIG. 1 is a schematic structural diagram of the pedestrian navigation device based on the novel multi-sensor fusion of the present invention.
FIG. 2 is a schematic structural diagram of the inertial unit processing module of the present invention.
FIG. 3 is a schematic diagram of the Gaussian-kernel support vector machine nonlinear classifier of the present invention.
FIG. 4 is a schematic diagram of the support vector machine for user motion mode and smart device usage mode recognition of the present invention.
FIG. 5 is a schematic diagram of the improved pedestrian dead reckoning algorithm of the present invention.
As shown in FIG. 1, a pedestrian navigation device based on a novel multi-sensor fusion includes: a handheld smart device platform 1, an observation processing unit 2 and a fusion filter 3. The handheld smart device 1 uses its own hardware to acquire raw data from an Inertial Measurement Unit (IMU) 11, a magnetometer 12, a pressure gauge 13, WiFi 14, Bluetooth Low Energy (BLE) 15 and a Global Navigation Satellite System (GNSS) 16; the observation processing unit 2 processes the raw data provided by the handheld smart device 1 to supply observations such as position or velocity to the fusion filter 3; the fusion filter 3 uses a kinematic model as the system model 31, builds the observation model 32 from the results of the observation processing unit, and finally produces the pedestrian navigation result. The above novel multi-sensor fusion pedestrian navigation device can be used with various handheld smart devices (including smartphones, tablets, smart watches, etc.), which may be held in the hand or fixed on the pedestrian. The novel multi-sensor fusion pedestrian navigation system shown in FIG. 1 overturns the way the IMU is used in traditional pedestrian navigation, liberating it from the system model 31 of the fusion filter into the observation model 32, and thereby overcomes the disadvantage of traditional multi-sensor fusion pedestrian navigation devices that, without other auxiliary systems, navigation errors accumulate rapidly.
The above handheld smart device platform 1 includes the IMU 11, magnetometer 12, pressure gauge 13, WiFi 14, BLE 15 and GNSS 16 common to existing smart devices. The IMU 11 provides raw acceleration and angular-velocity data; the magnetometer 12 provides raw geomagnetic data; the pressure gauge 13 provides raw atmospheric-pressure data; WiFi 14 provides raw WiFi Received Signal Strength (RSS) data; BLE 15 provides raw BLE RSS data; GNSS 16 provides raw GNSS velocity and position data. Any other sensor of the handheld smart device 1 that can provide observation information can be included in the proposed multi-sensor fusion algorithm.
The above observation processing unit 2 includes: an IMU processing unit 21, a magnetometer processing unit 22, a pressure gauge processing unit 23, a WiFi processing unit 24, a BLE processing unit 25 and a GNSS processing unit 26. The IMU processing unit 21 processes the raw acceleration and angular-velocity data provided by the IMU 11 to obtain IMU position information and transmits it to the fusion filter 3; the magnetometer processing unit 22 processes the raw geomagnetic data provided by the magnetometer 12 to obtain geomagnetic position information and transmits it to the fusion filter 3; the pressure gauge processing unit 23 processes the raw atmospheric-pressure data provided by the pressure gauge 13 to obtain elevation information and transmits it to the fusion filter 3; the WiFi processing unit 24 processes the raw RSS data provided by the WiFi 14 to obtain WiFi position information and transmits it to the fusion filter 3; the BLE processing unit 25 processes the raw RSS data provided by the BLE 15 to obtain BLE position information and transmits it to the fusion filter 3; the GNSS processing unit 26 processes the raw data of the GNSS 16 to obtain GNSS position information and transmits it to the fusion filter 3. The observation processing unit 2 may also include further processing units that process other sensors of the handheld smart device platform 1 to obtain position or velocity information and transmit it to the fusion filter 3.
The above fusion filter 3 includes a system model 31 and an observation model 32. The system model 31 uses a kinematic model to predict the position and velocity of the target and passes the prediction to the observation model 32; the observation model 32 combines the predicted position and velocity with the position and velocity information based on the IMU 11, magnetometer 12, pressure gauge 13, WiFi 14, BLE 15 and GNSS 16 provided by the observation processing unit, and updates the final position and velocity of the target.
As shown in FIG. 2, the IMU processing unit 21 includes a user motion mode and device usage mode recognition module 211, a heading angle deviation estimation module 212, and an improved dead reckoning algorithm module 213. The user motion mode and device usage mode recognition module 211 identifies user motion modes such as stationary, walking and running, and device usage modes such as handheld, texting, calling, navigation, pocket and backpack, from the raw data provided by the IMU 11 of the handheld smart device platform and other optional hardware (for example, the magnetometer 12). The heading angle deviation estimation module 212 estimates the heading angle deviation from the recognized user motion mode and device usage mode together with the raw data provided by the IMU 11 and other optional hardware (for example, the magnetometer 12). The improved dead reckoning algorithm module 213 obtains IMU position information from the heading angle deviation output by the heading angle deviation estimation module 212 and the raw data provided by the IMU 11 of the handheld smart device platform 1 and other optional hardware (for example, the magnetometer 12), and transmits it to the fusion filter 3. The IMU processing unit in FIG. 2 considers the many motion modes of pedestrians and the many usage modes of smart devices, and designs IMU data processing methods for a variety of usage scenarios, breaking through the limitation of traditional algorithms that the IMU must be fixed to the carrier and improving the usability of the pedestrian navigation system.
The user-motion-mode and device-usage-mode recognition module 211 uses the sensor outputs already present on the handheld smart device 1: the IMU 11, the magnetometer 12, a ranging sensor (optional), and a light sensor (optional). The IMU 11 and magnetometer 12 update at 50-200 Hz; the latter two sensors output scalars and update on user-behavior triggers. The recognition algorithm extracts sensor statistics over 1-3 second windows and makes a classification decision. The recognition algorithm can be implemented in many ways; the present invention uses a support vector machine in dual form with a Gaussian kernel as an example implementation.
As shown in FIG. 3, a support vector machine with a Gaussian kernel implicitly maps feature vectors into an infinite-dimensional linear space, matching or exceeding the performance of nonlinear classifiers such as the traditional KNN. The prototype of the l1-norm soft-margin support vector machine is as follows:
Training formula (1):

$$\min_{w,b,\xi}\ \frac{1}{2}\|w\|^2+C\sum_{i=1}^{N}\xi_i$$
$$\text{s.t.}\quad y_i\left(w^T\phi(x_i)+b\right)\ge 1-\xi_i,\qquad \xi_i\ge 0,\quad i=1,\dots,N.$$

where $x_i\in\mathbb{R}^d$ and $y_i$, $i=1,2,\dots,N$, are the feature vectors and class labels of the training set, $w\in\mathbb{R}^d$ is the weight vector, $C$ is a tunable regularization constant balancing over- and under-fitting on the training set, and $\phi(\cdot)$ is the feature mapping.
Classification formula (2):

$$f(x)=w^T\phi(x)+b.$$
Since the KKT conditions are satisfied, the dual form of the l1-norm soft-margin support vector machine is as follows:

Training formula (3):

$$\max_{a}\ \sum_{i=1}^{N}a_i-\frac{1}{2}\sum_{i=1}^{N}\sum_{j=1}^{N}a_i a_j y_i y_j\,\phi(x_i)^T\phi(x_j)$$
$$\text{s.t.}\quad 0\le a_i\le C,\qquad \sum_{i=1}^{N}a_i y_i=0.$$
Classification formula (4):

$$f(x)=\sum_{i=1}^{N}a_i y_i\,\phi(x_i)^T\phi(x)+b.$$
The Gaussian kernel is introduced as an efficient way to compute the mapping and the inner product, as in (5):

$$k(x,x')=\phi(x)^T\phi(x')=\exp\left(-\|x-x'\|^2/2\sigma^2\right),\qquad \sigma>0.$$
The dual form can then be rewritten as follows:

Training formula (6):

$$\max_{a}\ \sum_{i=1}^{N}a_i-\frac{1}{2}\sum_{i=1}^{N}\sum_{j=1}^{N}a_i a_j y_i y_j\,k(x_i,x_j)$$
$$\text{s.t.}\quad 0\le a_i\le C,\qquad \sum_{i=1}^{N}a_i y_i=0.$$

Classification formula (7):

$$f(x)=\sum_{i=1}^{N}a_i y_i\,k(x_i,x)+b.$$
Unlike the traditional KNN classifier, the optimal solution $a_i$, $i=1,2,\dots,N$, of the dual support vector machine has only a small number of non-zero entries, so only a small subset of the training feature vectors (the support vectors) needs to be retained for online classification. For example, to implement the nonlinear classifier on randomly generated data shown in FIG. 3, KNN must store and use all 1000 original training feature vectors, whereas the support vector machine needs only 142 support vectors. This greatly reduces the demands on processor, battery, and memory, making the approach well suited to handheld smart device platforms.
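As a minimal sketch of why only the support vectors are needed online, the decision function of classification formula (7) can be evaluated from a stored support set alone. The support vectors, coefficients $a_iy_i$, and bias below are made-up placeholders, not values from the patent:

```python
import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    # k(x, x') = exp(-||x - x'||^2 / (2 sigma^2)), evaluated pairwise
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def svm_decision(x, support_vectors, coeffs, b, sigma=1.0):
    # Formula (7): f(x) = sum_i (a_i y_i) k(x_i, x) + b, over support vectors only
    k = gaussian_kernel(support_vectors, x[None, :], sigma)[:, 0]
    return float(coeffs @ k + b)

# Illustrative (made-up) support set: two support vectors per class
sv = np.array([[0.0, 0.0], [0.2, 0.1], [2.0, 2.0], [2.1, 1.9]])
coeffs = np.array([1.0, 0.8, -1.0, -0.8])   # a_i * y_i for each support vector
b = 0.0
print(svm_decision(np.array([0.1, 0.1]), sv, coeffs, b))   # positive: class +1
print(svm_decision(np.array([2.0, 2.0]), sv, coeffs, b))   # negative: class -1
```

Only the four stored support vectors participate in the sum, regardless of how many training vectors originally produced them.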
The user-motion-mode and device-usage-mode recognition module 211 divides pedestrian motion into five classes: 1. stationary; 2. walking; 3. running; 4. cycling; 5. driving. The recognized motion mode can be used to apply zero-velocity updates, to tune the variance of the tracking filter's process noise, and to tune the correlation time of the dynamic system's Markov process. The module 211 divides device usage into four classes: 1. held flat in front; 2. upright at the ear; 3. backpack; 4. armband. The recognized device pose can be used to determine the direction of travel (via a coordinate transformation) and to tune the variance of the tracking filter's process noise.
FIG. 4 is a schematic of the two-stage support vector machine for user-motion-mode and device-usage-mode recognition. It comprises acceleration statistics 2111, angular velocity statistics 2112, rotation and tilt angle statistics 2113, light and distance statistics 2114, velocity feedback statistics 2115, a feature normalization module 2116, a principal component analysis module 2117, a support vector machine module 2118, a first-stage user-motion-mode classifier 2119, and a second-stage device-usage-mode classifier 2110. The implementation steps are: offline, collect a representative data set, perform feature normalization and principal component analysis, train with formulas (5) and (6), and extract and store the support vectors; online, compute the sensor output statistics, perform feature normalization and principal component extraction (with the same coefficients as in training), and apply the stored support vectors and formulas (5) and (7) in a two-stage classification to decide the user motion mode and the device usage mode.
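The offline normalization and principal-component steps can be sketched as follows. This is a minimal numpy illustration, not the patent's implementation; the feature matrix is a random placeholder standing in for the sensor statistics:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))          # placeholder feature vectors (sensor statistics)

# Feature normalization: zero mean, unit variance per feature (store mu, sd for online use)
mu, sd = X.mean(axis=0), X.std(axis=0)
Xn = (X - mu) / sd

# Principal component analysis: keep components explaining ~95% of the variance
cov = np.cov(Xn, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)            # eigh returns ascending order
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]
k = int(np.searchsorted(np.cumsum(eigval) / eigval.sum(), 0.95)) + 1
W = eigvec[:, :k]                               # projection matrix, reused online
Z = Xn @ W                                      # reduced features fed to the SVM
print(Z.shape)
```

Online, the same `mu`, `sd`, and `W` must be applied to each new feature vector before classification, matching the "same coefficients as in training" requirement above.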
The heading angle deviation estimation module 212 of the present invention encompasses several methods. When only the IMU 11 is available, we use a method based on principal component analysis (PCA): a characteristic of pedestrian motion is that acceleration and deceleration both occur along the direction of travel, so applying PCA to the accelerometer data yields the pedestrian's direction of travel. When the GNSS 16 is available, the direction of travel can be computed from the GNSS velocity. When the magnetometer 12 is available, the direction of travel can also be computed from the magnetometer 12. The heading angle of the handheld smart device itself is obtained by nine-axis or six-axis sensor fusion. The heading angle deviation is therefore the difference between the fused estimate of the pedestrian's direction of travel and the device heading angle, and is output to the heading angle deviation compensation module 2132.
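A minimal sketch of the PCA-based direction-of-travel estimate. It assumes the accelerometer samples have already been rotated into the horizontal east/north plane (an assumption for illustration; the patent does not spell out this step), and it leaves the inherent 180-degree ambiguity of PCA unresolved:

```python
import numpy as np

def travel_direction_pca(acc_en):
    """Estimate the direction of travel from horizontal accelerations.

    acc_en: (N, 2) array of [east, north] acceleration samples over ~1-3 s.
    Returns a heading angle in radians, clockwise from north, folded into
    [0, pi) because PCA cannot distinguish forward from backward.
    """
    a = acc_en - acc_en.mean(axis=0)
    cov = np.cov(a, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    e, n = eigvec[:, np.argmax(eigval)]       # dominant acceleration direction
    return np.arctan2(e, n) % np.pi           # fold the 180-degree ambiguity

# Synthetic example: fore-aft oscillation along a 45-degree course plus side noise
rng = np.random.default_rng(1)
t = np.linspace(0, 2, 200)
along = np.sin(2 * np.pi * 2 * t)             # strong acceleration/deceleration
heading = np.pi / 4
acc = np.stack([along * np.sin(heading), along * np.cos(heading)], axis=1)
acc += 0.05 * rng.normal(size=acc.shape)      # weak lateral noise
print(np.degrees(travel_direction_pca(acc)))  # close to 45
```

In practice the ambiguity would be resolved by other cues (e.g., GNSS velocity or step timing), as the fusion of methods in module 212 suggests.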
FIG. 5 is a schematic of the improved dead reckoning algorithm module 213, comprising an attitude determination module 2131, a heading angle deviation compensation module 2132, a step detection module 2133, a step length estimation module 2134, and a dead reckoning module 2135. The attitude determination module 2131 derives the attitude of the handheld smart device 1 from the raw data provided by the IMU 11 of the handheld smart device platform 1 and the optional magnetometer 12. The heading angle deviation compensation module 2132 reads the heading angle deviation output by the estimation module 212, applies it to obtain the pedestrian heading angle, and outputs the result to the dead reckoning module 2135. The step detection module 2133 detects the pedestrian's steps from the raw data of the IMU 11 and passes the step count to the step length estimation module 2134. The step length estimation module 2134 estimates the pedestrian's step length from the step detection results and the raw data of the IMU 11 and passes it to the dead reckoning module 2135. The dead reckoning module 2135 computes the IMU position observation from the step length output by module 2134 and the pedestrian heading output by module 2132, and outputs it to the fusion filter 3.
The attitude determination module 2131 derives the attitude of the handheld smart device 1 from the raw data provided by the IMU 11 and the optional magnetometer 12. Its algorithm selects a nine-axis attitude determination algorithm when geomagnetic information is available and a six-axis algorithm otherwise. Finally, module 2131 outputs the heading angle of the smart device 1 to the heading angle deviation compensation module 2132.
The heading angle deviation compensation module 2132 reads the heading angle deviation output by the estimation module 212, applies it to obtain the pedestrian heading angle, and outputs the result to the dead reckoning module 2135. The compensation is computed as:

$$\theta_p=\theta_d+\theta_{\text{offset}} \qquad (1)$$

where $\theta_p$ is the pedestrian heading angle, $\theta_d$ is the device heading angle, and $\theta_{\text{offset}}$ is the heading angle deviation.
The step detection module 2133 detects the pedestrian's steps from the raw data of the IMU 11 and passes the step count to the step length estimation module 2134. Steps can be detected by peak detection, zero-crossing detection, correlation detection, power spectrum detection, and similar methods. To cover the variety of user motion modes and device usage modes, the step detection algorithm of the present invention applies peak detection simultaneously to the accelerometer and gyroscope data of the IMU 11.
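A minimal peak-detection sketch over the acceleration magnitude. The threshold and minimum inter-peak interval are illustrative assumptions, not values from the patent, and the input is a synthetic signal:

```python
import numpy as np

def count_steps(acc_norm, fs, min_height=10.5, min_interval=0.3):
    """Count steps as local maxima of |acc| exceeding min_height (m/s^2)
    and separated by at least min_interval seconds."""
    gap = max(1, int(min_interval * fs))
    count, last = 0, -gap
    for i in range(1, len(acc_norm) - 1):
        if (acc_norm[i] >= acc_norm[i - 1] and acc_norm[i] >= acc_norm[i + 1]
                and acc_norm[i] > min_height and i - last >= gap):
            count += 1
            last = i
    return count

# Synthetic walk: 2 steps per second for 5 s, sampled at 100 Hz, riding on gravity
fs = 100
t = np.arange(0, 5, 1.0 / fs)
acc_norm = 9.81 + 2.0 * np.sin(2 * np.pi * 2.0 * t)
print(count_steps(acc_norm, fs))   # 10 steps
```

The patent's method additionally runs the same kind of detector on the gyroscope data, which helps in usage modes (e.g., pocket) where the acceleration peaks are weak.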
The step length estimation module 2134 estimates the pedestrian's step length from the results of the step detection module 2133 and the raw data of the IMU 11, and outputs it to the dead reckoning module 2135. Step length can be estimated by acceleration integration, a pendulum model, a linear model, an empirical model, and other methods. To cover the variety of user motion modes and device usage modes, the present invention uses the following linear model:
$$s_{k-1,k}=A\cdot(f_{k-1}+f_k)+B\cdot(\sigma_{acc,k-1}+\sigma_{acc,k})+C \qquad (2)$$

where $A$, $B$, and $C$ are constants, $f_{k-1}$ and $f_k$ are the step frequencies at times $k-1$ and $k$, and $\sigma_{acc,k-1}$ and $\sigma_{acc,k}$ are the accelerometer variances at times $k-1$ and $k$.
The dead reckoning module 2135 propagates the position $[r_{e,k-1}\ r_{n,k-1}]^T$ at time $k-1$ to the position $[r_{e,k}\ r_{n,k}]^T$ at time $k$ using the step length $s_{k-1,k}$ output by the step length estimation module 2134 and the heading angle $\theta_{k-1}$ output by the heading angle deviation compensation module 2132:

$$\begin{bmatrix} r_{e,k}\\ r_{n,k}\end{bmatrix}=\begin{bmatrix} r_{e,k-1}\\ r_{n,k-1}\end{bmatrix}+s_{k-1,k}\begin{bmatrix}\sin\theta_{k-1}\\ \cos\theta_{k-1}\end{bmatrix} \qquad (3)$$
Finally, the dead reckoning module 2135 outputs the IMU position observation to the fusion filter 3.
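The heading compensation (1), the linear step-length model (2), and the dead-reckoning position update combine into one per-step function. The coefficients A, B, C below are made-up placeholders, since the patent leaves them unspecified:

```python
import math

def pdr_step(r_e, r_n, theta_d, theta_offset, f_prev, f_cur, var_prev, var_cur,
             A=0.3, B=0.05, C=0.2):                      # illustrative constants
    theta_p = theta_d + theta_offset                     # formula (1)
    s = A * (f_prev + f_cur) + B * (var_prev + var_cur)  # formula (2)
    s += C
    return (r_e + s * math.sin(theta_p),                 # east/north position update
            r_n + s * math.cos(theta_p))

# One step heading due east (pedestrian heading = pi/2) at 2 Hz cadence
r_e, r_n = pdr_step(0.0, 0.0, math.pi / 2, 0.0, 2.0, 2.0, 0.5, 0.5)
print(r_e, r_n)   # all displacement along east, essentially none along north
```

Each detected step advances the position once; the resulting east/north coordinates are what module 2135 hands to the fusion filter as the IMU position observation.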
The fusion filter 3 comprises a system model 31 and an observation model 32. A traditional multi-sensor fusion architecture processes the IMU measurements through an inertial mechanization algorithm and builds the fusion filter's system model around it. Because mechanization involves many integration operations, the positioning error of such an architecture accumulates rapidly whenever no external aiding system is available. The present invention overcomes this defect of the traditional architecture by using the pedestrian's kinematic model as the system model, while the IMU-derived data enter the observation model like every other system.
The fusion filter 3 may be implemented as a Kalman filter (KF), an adaptive Kalman filter (AKF), an unscented Kalman filter (UKF), or a particle filter (PF). The present invention gives a KF design example; the other filters can follow the KF design. With the fusion filter 3 implemented as a KF, the state vector is defined as follows:
$$x=[r_e\ r_n\ r_u\ v_e\ v_n\ v_u]^T \qquad (4)$$

where $r_e$, $r_n$, and $r_u$ are the three-dimensional position (in the east-north-up frame) and $v_e$, $v_n$, and $v_u$ are the corresponding velocity components. The KF system model 31 uses a classical kinematic model, defined as follows:
$$x_{k+1|k}=\Phi_{k,k+1}x_{k|k}+\omega_k \qquad (5)$$

where $x_{k+1|k}$ is the predicted state vector, $x_{k|k}$ is the state vector at time $k$, and $\Phi_{k,k+1}$ is a $6\times 6$ transition matrix:

$$\Phi_{k,k+1}=\begin{bmatrix}I_{3\times 3} & \Delta t\,I_{3\times 3}\\ 0_{3\times 3} & I_{3\times 3}\end{bmatrix} \qquad (6)$$

where $\Delta t$ is the time difference between the two epochs. $\omega_k$ is the process noise, with covariance matrix $Q_k$, defined as follows:

$$\omega_k=[0\ 0\ 0\ \omega_{v_e,k}\ \omega_{v_n,k}\ \omega_{v_u,k}]^T \qquad (7)$$

where $\omega_{v_e,k}$, $\omega_{v_n,k}$, and $\omega_{v_u,k}$ are the velocity noises in the east-north-up frame at time $k$, modeled as random walks:

$$\omega_{v,k}=\omega_{v,k-1}+n\,\Delta t,\qquad n\in\{n_e,n_n,n_u\} \qquad (8)$$

where $\omega_{v,k-1}$ is the velocity noise at time $k-1$, $n_e$, $n_n$, and $n_u$ are Gaussian white noises, and $\Delta t$ is the time difference between the two epochs.
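The constant-velocity transition matrix and a velocity-only process-noise covariance can be built in a few lines of numpy. The noise magnitudes are illustrative assumptions, not values from the patent:

```python
import numpy as np

def transition_matrix(dt):
    # Position integrates velocity over dt; velocity persists (constant-velocity model)
    Phi = np.eye(6)
    Phi[0:3, 3:6] = dt * np.eye(3)
    return Phi

def process_noise(dt, q_e=0.1, q_n=0.1, q_u=0.05):   # illustrative noise densities
    # Velocity random walk only; the position rows/columns stay zero
    Q = np.zeros((6, 6))
    Q[3:6, 3:6] = np.diag([q_e, q_n, q_u]) * dt
    return Q

dt = 0.02                                      # e.g., a 50 Hz IMU epoch
Phi, Q = transition_matrix(dt), process_noise(dt)
x = np.array([0.0, 0.0, 0.0, 1.0, 0.0, 0.0])   # moving east at 1 m/s
print(Phi @ x)                                 # east position advances by dt
```

Because the model is driven by the pedestrian's kinematics rather than inertial mechanization, no IMU integration enters this prediction step.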
With the fusion filter 3 implemented as a KF, the measurement model 32 is defined as follows:

$$z_k=H_kx_{k|k}+\upsilon_k \qquad (9)$$

where $z_k$ is the measurement vector and $H_k$ is the design matrix. $\upsilon_k$ is the measurement noise, modeled as Gaussian white noise with covariance matrix $R_k$. Both $z_k$ and $H_k$ vary with the observation source. When the observation comes from the IMU 11, typical $z_k$ and $H_k$ are defined as follows:

$$z_k=[r_e\ r_n]^T,\qquad H_k=\begin{bmatrix}1&0&0&0&0&0\\0&1&0&0&0&0\end{bmatrix} \qquad (10)$$
When the observation comes from the magnetometer 12, typical $z_k$ and $H_k$ are defined as follows:

$$z_k=[r_e\ r_n\ r_u]^T,\qquad H_k=\begin{bmatrix}1&0&0&0&0&0\\0&1&0&0&0&0\\0&0&1&0&0&0\end{bmatrix} \qquad (11)$$
When the observation comes from the pressure gauge 13, typical $z_k$ and $H_k$ are defined as follows:

$$z_k=r_u,\qquad H_k=[0\ 0\ 1\ 0\ 0\ 0] \qquad (12)$$

When the observation comes from the WiFi 14, typical $z_k$ and $H_k$ are defined as follows:
$$z_k=[r_e\ r_n\ r_u]^T,\qquad H_k=\begin{bmatrix}1&0&0&0&0&0\\0&1&0&0&0&0\\0&0&1&0&0&0\end{bmatrix} \qquad (13)$$
When the observation comes from the BLE 15, typical $z_k$ and $H_k$ are defined as follows:

$$z_k=[r_e\ r_n\ r_u]^T,\qquad H_k=\begin{bmatrix}1&0&0&0&0&0\\0&1&0&0&0&0\\0&0&1&0&0&0\end{bmatrix} \qquad (14)$$
When the observation comes from the GNSS 16, typical $z_k$ and $H_k$ are defined as follows:

$$z_k=[r_e\ r_n\ r_u\ v_e\ v_n\ v_u]^T,\qquad H_k=I_{6\times 6} \qquad (15)$$
The KF runs in two phases: prediction and update. In the prediction phase, the state vector and covariance matrix are predicted from the system model:

$$x_{k+1|k}=\Phi_{k,k+1}x_{k|k},\qquad P_{k+1|k}=\Phi_{k,k+1}P_{k|k}\Phi_{k,k+1}^T+Q_k \qquad (16)$$

In the update phase, the state vector and covariance matrix are updated from the measurement model:

$$K_k=P_{k|k-1}H_k^T\left(H_kP_{k|k-1}H_k^T+R_k\right)^{-1}$$
$$x_{k|k}=x_{k|k-1}+K_k\left(z_k-H_kx_{k|k-1}\right) \qquad (17)$$
$$P_{k|k}=\left(I-K_kH_k\right)P_{k|k-1}$$
where $K_k$ is the Kalman gain.
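The prediction/update cycle can be sketched in numpy as follows. The noise levels and the WiFi-style 3-D position observation are illustrative assumptions:

```python
import numpy as np

def kf_predict(x, P, Phi, Q):
    # State and covariance prediction from the kinematic system model
    return Phi @ x, Phi @ P @ Phi.T + Q

def kf_update(x, P, z, H, R):
    # Standard Kalman update for one observation source
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

dt = 1.0
Phi = np.eye(6); Phi[0:3, 3:6] = dt * np.eye(3)
Q = np.zeros((6, 6)); Q[3:6, 3:6] = 0.1 * dt * np.eye(3)
H = np.hstack([np.eye(3), np.zeros((3, 3))])      # 3-D position observation
R = 4.0 * np.eye(3)                               # e.g., WiFi position noise (made up)

x = np.zeros(6)
P = np.eye(6)
x, P = kf_predict(x, P, Phi, Q)
x, P = kf_update(x, P, np.array([1.0, 0.0, 0.0]), H, R)
print(x[:3])   # position pulled partway toward the measurement
```

Each observation source in formulas (10)-(15) simply supplies its own `z`, `H`, and `R` to the same `kf_update`, which is what makes the fusion structure modular.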
While the invention has been shown and described with respect to its preferred embodiments, those skilled in the art will understand that various changes and modifications may be made without departing from the scope defined by the claims of the present invention.
Claims (12)
- A pedestrian navigation device based on a novel multi-sensor fusion technology, characterized by comprising: a handheld smart device (1), an observation processing unit (2), and a fusion filter (3); the handheld smart device (1) acquires raw observation data using its own hardware; the observation processing unit (2) processes the raw data provided by the handheld smart device (1) to supply observations such as position or velocity to the fusion filter (3); the fusion filter uses a kinematic model as its system model (32) and builds its observation model (33) from the results of the observation processing unit; the pedestrian navigation result is finally obtained through the processing of the fusion filter (3).
- The pedestrian navigation device based on a novel multi-sensor fusion technology according to claim 1, characterized in that the hardware of the handheld smart device 1 comprises: an inertial measurement unit IMU (11), a magnetometer (12), a pressure gauge (13), WiFi (14), Bluetooth Low Energy BLE (15), and a Global Navigation Satellite System GNSS receiver (16); the IMU (11) provides raw acceleration and angular velocity data; the magnetometer (12) provides raw geomagnetic data; the pressure gauge (13) provides raw atmospheric pressure data; the WiFi (14) provides raw WiFi received signal strength (RSS) data; the BLE (15) provides raw BLE RSS data; the GNSS receiver (16) provides raw GNSS data; the handheld smart device (1) may also include other sensors that provide observation information.
- The pedestrian navigation device based on a novel multi-sensor fusion technology according to claim 1, characterized in that the observation processing unit module (2) comprises: an IMU processing unit (21), a magnetometer processing unit (22), a pressure gauge processing unit (23), a WiFi processing unit (24), a BLE processing unit (25), and a GNSS processing unit (26); the IMU processing unit (21) processes the raw acceleration and angular velocity data provided by the IMU (11) to obtain the IMU position and transmits it to the fusion filter (3); the magnetometer processing unit (22) processes the raw geomagnetic data provided by the magnetometer (12) to obtain the geomagnetic position and transmits it to the fusion filter (3); the pressure gauge processing unit (23) processes the raw atmospheric pressure data provided by the pressure gauge (13) to obtain the elevation and transmits it to the fusion filter (3); the WiFi processing unit (24) processes the raw RSS data provided by the WiFi (14) to obtain the WiFi position and transmits it to the fusion filter (3); the BLE processing unit (25) processes the raw RSS data provided by the BLE (15) to obtain the BLE position and transmits it to the fusion filter (3); the GNSS processing unit (26) processes the position and velocity information provided by the GNSS receiver (16) and transmits it to the fusion filter (3); the observation processing unit module (2) further includes other processing units that process other sensors of the smart device platform to obtain position or velocity information and transmit it to the fusion filter (3).
- The pedestrian navigation device based on a novel multi-sensor fusion technology according to claim 1, characterized in that the IMU processing unit (21) comprises a user-motion-mode and device-usage-mode recognition module, a heading angle deviation estimation module, and an improved dead reckoning algorithm module; the recognition module identifies user motion modes such as stationary, walking, and running, and device usage modes such as handheld, texting, phoning, navigating, pocket, and backpack, from the raw data provided by the IMU of the handheld smart device platform and other optional hardware; the heading angle deviation estimation module estimates the heading angle deviation from the recognized user motion mode and device usage mode and the same raw data; the improved dead reckoning algorithm module obtains the IMU position from the heading angle deviation output by the estimation module and the raw data provided by the IMU and other optional hardware, and transmits it to the fusion filter.
- The pedestrian navigation device based on a novel multi-sensor fusion technology according to claim 4, characterized in that the improved dead reckoning algorithm module comprises an attitude determination module, a heading angle deviation compensation module, a step detection module, a step length estimation module, and a dead reckoning algorithm module; the attitude determination module identifies the attitude of the handheld smart device from the raw data provided by the IMU of the handheld smart device platform and an optional magnetometer; the heading angle deviation compensation module reads the heading angle deviation output by the estimation module, compensates the pedestrian heading angle, and outputs it to the dead reckoning algorithm; the step detection module detects the pedestrian's steps from the raw IMU data and feeds the step count to the step length estimation module; the step length estimation module estimates the pedestrian's step length from the step detection results and the raw IMU data and feeds it to the dead reckoning module; the dead reckoning module computes the IMU position observation from the step length output by the step length estimation module and the pedestrian heading angle output by the compensation module and feeds it to the fusion filter.
- The pedestrian navigation device based on a novel multi-sensor fusion technology according to claim 4, characterized in that the user-motion-mode and device-usage-mode recognition module employs a two-stage classifier based on a dual-form support vector machine with a Gaussian kernel, extracts the sensor outputs of the handheld smart device at periodic intervals, and uses the support vectors and the Gaussian kernel to recognize and classify the user motion mode and the device usage mode.
- The pedestrian navigation device based on a novel multi-sensor fusion technology according to claim 1, characterized in that: output statistics of the IMU, the magnetometer, the optional ranging sensor, and the optional light sensor, together with derived statistics, are computed and taken as feature vectors for normalization and principal component analysis; the feature vectors corresponding to the eigenvalues that retain the vast majority of the data variance are extracted to participate in the offline training of the support vector machine.
- The pedestrian navigation device based on a novel multi-sensor fusion technology according to claim 1, characterized in that: the online classification on the handheld smart device platform needs to store and use only the few dimension-reduced support vectors to perform the two-stage classification of user motion mode and device usage mode.
- A pedestrian navigation method based on a novel multi-sensor fusion technology, characterized by comprising the following steps: (1) the handheld smart device acquires raw IMU, magnetometer, pressure gauge, WiFi, BLE, and GNSS data using its own hardware; (2) the observation processing unit processes the raw data provided by the handheld smart device to supply observations such as position or velocity to the fusion filter; (3) the fusion filter uses a kinematic model as the system model and builds an observation model from the results of the observation processing unit; the pedestrian navigation result is finally obtained through the processing of the fusion filter.
- The pedestrian navigation method based on a novel multi-sensor fusion technology according to claim 9, characterized in that processing the raw data provided by the handheld smart device to supply observations such as position or velocity to the fusion filter comprises the following steps: the IMU processing unit processes the raw acceleration and angular velocity data provided by the IMU to obtain the IMU position and transmits it to the fusion filter; the magnetometer processing unit processes the raw geomagnetic data provided by the magnetometer to obtain the geomagnetic position and transmits it to the fusion filter; the pressure gauge processing unit processes the raw atmospheric pressure data provided by the pressure gauge to obtain the elevation and transmits it to the fusion filter; the WiFi processing unit processes the raw RSS data provided by the WiFi to obtain the WiFi position and transmits it to the fusion filter; the BLE processing unit processes the raw RSS data provided by the BLE to obtain the BLE position and transmits it to the fusion filter; the GNSS processing unit processes the raw GNSS data provided by the GNSS chip to obtain the GNSS position and velocity and transmits them to the fusion filter.
- The pedestrian navigation method based on a novel multi-sensor fusion technology according to claim 10, characterized in that processing the raw acceleration and angular velocity data provided by the IMU to obtain the IMU position and transmitting it to the fusion filter comprises the following steps: the user-motion-mode and device-usage-mode recognition module identifies user motion modes such as stationary, walking, and running, and device usage modes such as handheld, texting, phoning, navigating, pocket, and backpack, from the raw data provided by the IMU of the handheld smart device platform and other optional hardware; the heading angle deviation estimation module estimates the heading angle deviation from the recognized user motion mode and device usage mode and the same raw data; the improved dead reckoning algorithm module obtains the IMU position from the heading angle deviation output by the estimation module and the raw data provided by the IMU and other optional hardware, and transmits it to the fusion filter.
- The pedestrian navigation method based on a novel multi-sensor fusion technology according to claim 11, characterized in that the improved dead reckoning algorithm module comprises the following steps: the attitude determination module identifies the attitude of the handheld smart device from the raw data provided by the IMU of the handheld smart device platform and an optional magnetometer; the heading angle deviation compensation module reads the heading angle deviation output by the estimation module, compensates the pedestrian heading angle, and outputs it to the dead reckoning algorithm; the step detection module detects the pedestrian's steps from the raw IMU data and feeds the step count to the step length estimation module; the step length estimation module estimates the pedestrian's step length from the step detection results and the raw IMU data and feeds it to the dead reckoning module; the dead reckoning module computes the IMU position observation from the step length output by the step length estimation module and the pedestrian heading angle output by the compensation module and feeds it to the fusion filter.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610431107.0A CN106017454B (en) | 2016-06-16 | 2016-06-16 | A kind of pedestrian navigation device and method based on multi-sensor fusion technology |
CN201610431107.0 | 2016-06-16 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017215024A1 true WO2017215024A1 (en) | 2017-12-21 |
Family
ID=57089015
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2016/087281 WO2017215024A1 (en) | 2016-06-16 | 2016-06-27 | Pedestrian navigation device and method based on novel multi-sensor fusion technology |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN106017454B (en) |
WO (1) | WO2017215024A1 (en) |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109682372A (en) * | 2018-12-17 | 2019-04-26 | 重庆邮电大学 | A kind of modified PDR method of combination fabric structure information and RFID calibration |
CN110132257A (en) * | 2019-05-15 | 2019-08-16 | 吉林大学 | Human body behavior prediction method based on Fusion |
CN110427046A (en) * | 2019-07-26 | 2019-11-08 | 沈阳航空航天大学 | A kind of three-dimensional smooth random walk unmanned aerial vehicle group mobility model |
CN110764506A (en) * | 2019-11-05 | 2020-02-07 | 广东博智林机器人有限公司 | Course angle fusion method and device of mobile robot and mobile robot |
CN110954109A (en) * | 2019-12-12 | 2020-04-03 | 杭州十域科技有限公司 | Indoor space positioning equipment and positioning method based on fingerprint positioning and inertial navigation system |
CN111174780A (en) * | 2019-12-31 | 2020-05-19 | 同济大学 | Road inertial navigation positioning system for blind people |
CN111811502A (en) * | 2020-07-10 | 2020-10-23 | 北京航空航天大学 | Motion carrier multi-source information fusion navigation method and system |
CN112268557A (en) * | 2020-09-22 | 2021-01-26 | 宽凳(北京)科技有限公司 | Real-time high-precision positioning method for urban scene |
CN112556696A (en) * | 2020-12-03 | 2021-03-26 | 腾讯科技(深圳)有限公司 | Object positioning method and device, computer equipment and storage medium |
CN112747754A (en) * | 2019-10-30 | 2021-05-04 | 北京初速度科技有限公司 | Fusion method, device and system of multi-sensor data |
CN113008224A (en) * | 2021-03-04 | 2021-06-22 | 国电瑞源(西安)智能研究院有限公司 | Indoor and outdoor self-adaptive navigation system and method integrating multiple sensors |
CN113029153A (en) * | 2021-03-29 | 2021-06-25 | 浙江大学 | Multi-scene PDR positioning method based on smart phone multi-sensor fusion and SVM classification |
CN113229804A (en) * | 2021-05-07 | 2021-08-10 | 陕西福音假肢有限责任公司 | Magnetic field data fusion circuit and method for joint mobility |
CN113655439A (en) * | 2021-08-31 | 2021-11-16 | 上海第二工业大学 | Indoor positioning method for improving particle filtering |
CN113790722A (en) * | 2021-08-20 | 2021-12-14 | 北京自动化控制设备研究所 | Pedestrian step size modeling method based on inertial data time-frequency domain feature extraction |
CN115200575A (en) * | 2022-07-13 | 2022-10-18 | 中国电子科技集团公司第五十四研究所 | Indoor positioning method for assisting pedestrian dead reckoning based on human activity recognition |
WO2024082214A1 (en) * | 2022-10-20 | 2024-04-25 | Telefonaktiebolaget Lm Ericsson (Publ) | Improved target positioning by using multiple terminal devices |
CN117942068A (en) * | 2024-01-29 | 2024-04-30 | 连云港市第二人民医院(连云港市临床肿瘤研究所) | Lower limb pose fuzzy guiding control method and system based on micro inertial navigation |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106767784B (en) * | 2016-12-21 | 2019-11-08 | 上海网罗电子科技有限公司 | A kind of fire-fighting precision indoor localization method of bluetooth training inertial navigation |
CN107328406B (en) * | 2017-06-28 | 2020-10-16 | 中国矿业大学(北京) | Method and system for positioning mine moving target based on multi-source sensor |
CN107990901B (en) * | 2017-11-28 | 2021-03-12 | 元力云网络有限公司 | User direction positioning method based on sensor |
CN107943042B (en) * | 2017-12-06 | 2021-04-27 | 东南大学 | Automatic construction method and device for geomagnetic fingerprint database |
CN110118549B (en) * | 2018-02-06 | 2021-05-11 | 刘禹岐 | Multi-source information fusion positioning method and device |
CN108413968B (en) * | 2018-07-10 | 2018-10-09 | 上海奥孛睿斯科技有限公司 | A kind of method and system of movement identification |
CN111984853B (en) * | 2019-05-22 | 2024-03-22 | 北京车和家信息技术有限公司 | Test driving report generation method and cloud server |
CN110849392A (en) * | 2019-11-15 | 2020-02-28 | 上海有个机器人有限公司 | Robot mileage counting data correction method and robot |
CN110986941B (en) * | 2019-11-29 | 2021-09-24 | 武汉大学 | Method for estimating installation angle of mobile phone |
CN111174781B (en) * | 2019-12-31 | 2022-03-04 | 同济大学 | Inertial navigation positioning method based on wearable device combined target detection |
CN111256709B (en) * | 2020-02-18 | 2021-11-02 | 北京九曜智能科技有限公司 | Vehicle dead reckoning positioning method and device based on encoder and gyroscope |
CN112379395B (en) * | 2020-11-24 | 2023-09-05 | 中国人民解放军海军工程大学 | Positioning navigation time service system |
CN115597602A (en) * | 2022-10-20 | 2023-01-13 | OPPO Guangdong Mobile Communications Co., Ltd. (CN) | Robot navigation positioning method and device, readable medium and electronic equipment |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103968827A (en) * | 2014-04-09 | 2014-08-06 | 北京信息科技大学 | Wearable human body gait detection self-localization method |
CN104931049A (en) * | 2015-06-05 | 2015-09-23 | 北京信息科技大学 | Movement classification-based pedestrian self-positioning method |
WO2016042296A2 (en) * | 2014-09-15 | 2016-03-24 | Isis Innovation Limited | Determining the position of a mobile device in a geographical area |
CN105433949A (en) * | 2014-09-23 | 2016-03-30 | 飞比特公司 | Hybrid angular motion sensor |
CN105588566A (en) * | 2016-01-08 | 2016-05-18 | 重庆邮电大学 | Indoor positioning system and method based on Bluetooth and MEMS (Micro-Electro-Mechanical Systems) fusion |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102175463B (en) * | 2011-02-12 | 2012-08-22 | 东南大学 | Method for detecting braking property of vehicle in road test based on improved Kalman filtering |
CN103759730B (en) * | 2014-01-16 | 2016-06-29 | 南京师范大学 | The collaborative navigation system of a kind of pedestrian based on navigation information two-way fusion and intelligent mobile carrier and air navigation aid thereof |
CN104613963B (en) * | 2015-01-23 | 2017-10-10 | 南京师范大学 | Pedestrian navigation system and navigation locating method based on human cinology's model |
- 2016
- 2016-06-16 CN CN201610431107.0A patent/CN106017454B/en active Active
- 2016-06-27 WO PCT/CN2016/087281 patent/WO2017215024A1/en active Application Filing
Non-Patent Citations (1)
Title |
---|
HUANG, CHENGKAI: "Application of Multiple Sensor Information Fusion Algorithms in Indoor Positioning", ELECTRONIC TECHNOLOGY & INFORMATION SCIENCE, CHINA MASTER'S THESES FULL-TEXT DATABASE, 15 August 2015 (2015-08-15), ISSN: 1674-0246 *
Also Published As
Publication number | Publication date |
---|---|
CN106017454A (en) | 2016-10-12 |
CN106017454B (en) | 2018-12-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017215024A1 (en) | Pedestrian navigation device and method based on novel multi-sensor fusion technology | |
EP2946167B1 (en) | Method and apparatus for determination of misalignment between device and pedestrian | |
US10184797B2 (en) | Apparatus and methods for ultrasonic sensor navigation | |
Ban et al. | Indoor positioning method integrating pedestrian Dead Reckoning with magnetic field and WiFi fingerprints | |
US10429196B2 (en) | Method and apparatus for cart navigation | |
US11041725B2 (en) | Systems and methods for estimating the motion of an object | |
US10652696B2 (en) | Method and apparatus for categorizing device use case for on foot motion using motion sensor data | |
EP3077992B1 (en) | Process and system for determining the location of an object by fusing motion features and iamges of the object | |
US8775128B2 (en) | Selecting feature types to extract based on pre-classification of sensor measurements | |
US8473241B2 (en) | Navigation trajectory matching | |
US8332180B2 (en) | Determining user compass orientation from a portable device | |
WO2020023534A1 (en) | Systems and methods for autonomous machine tracking and localization of mobile objects | |
US10302434B2 (en) | Method and apparatus for determining walking direction for a pedestrian dead reckoning process | |
US10837794B2 (en) | Method and system for characterization of on foot motion with multiple sensor assemblies | |
WO2016040166A1 (en) | Method and apparatus for using map information aided enhanced portable navigation | |
US9818037B2 (en) | Estimating heading misalignment between a device and a person using optical sensor | |
Filardo et al. | C-IPS: A smartphone based indoor positioning system | |
KR20160004084A (en) | Pedestrian Tracking apparatus and method at indoors | |
Saadatzadeh et al. | Pedestrian dead reckoning using smartphones sensors: an efficient indoor positioning system in complex buildings of smart cities | |
Woyano et al. | A survey of pedestrian dead reckoning technology using multi-sensor fusion for indoor positioning systems | |
WO2024182277A1 (en) | Image-based delocalization recovery | |
Aljeroudi et al. | MOBILITY DETERMINATION AND ESTIMATION BASED ON SMARTPHONES-REVIEW OF SENSING AND SYSTEMS | |
Zeng et al. | Method of Smartphone Navigation Heading Compensation Based on Gravimeter | |
CN116724213A (en) | Footstep-based positioning |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16905138 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16905138 Country of ref document: EP Kind code of ref document: A1 |