CN109144052A - Navigation system and its method for automatic driving vehicle - Google Patents

Navigation system and its method for automatic driving vehicle

Info

Publication number
CN109144052A
CN109144052A (application number CN201810745627.8A)
Authority
CN
China
Prior art keywords
driving vehicle
automatic driving
scene
module
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810745627.8A
Other languages
Chinese (zh)
Other versions
CN109144052B (en)
Inventor
肖建雄 (Xiao Jianxiong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Autex Co Ltd
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 16/027,777 (US10942519B2)
Application filed by Individual filed Critical Individual
Publication of CN109144052A
Application granted granted Critical
Publication of CN109144052B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles, including the following sub-groups:
    • G05D1/0212 - with means for defining a desired trajectory
    • G05D1/0214 - a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0221 - a desired trajectory involving a learning process
    • G05D1/0231 - using optical position detecting means
    • G05D1/0234 - using optical markers or beacons
    • G05D1/0236 - using optical markers or beacons in combination with a laser
    • G05D1/0238 - using obstacle or wall sensors
    • G05D1/024 - using obstacle or wall sensors in combination with a laser
    • G05D1/0246 - using a video camera in combination with image processing means
    • G05D1/0251 - using a video camera, extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
    • G05D1/0255 - using acoustic signals, e.g. ultrasonic signals
    • G05D1/0257 - using a radar
    • G05D1/0276 - using signals provided by a source external to the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Acoustics & Sound (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention provides a system and method for providing navigation for an autonomous driving vehicle (ADV) by capturing and analyzing information about the global scene and local objects around the ADV. The system comprises a sensor assembly integrated on the ADV and a computing device in communication with the sensor assembly. The sensor assembly is configured to collect environmental data around the ADV. The computing device includes a processor and a memory unit for storing a predefined scene module and the environmental data. The computing device is configured to process the environmental data to identify moving objects and stationary objects, and to observe the environment scene around the ADV. The observed environment scene is compared with the predefined scene module. In addition, the predefined scene module is adjusted using the processed environmental data. The computing device provides instructions for controlling the vehicle based on the adjusted scene module.

Description

Navigation system and its method for automatic driving vehicle
Cross reference to related applications
This application claims the benefit of U.S. Provisional Patent Application No. 62/529,941, "Systems and Methods for Navigation Maps Based on Autonomous Driving", filed on July 7, 2017, the contents of which are incorporated herein by reference.
Technical field
The present invention relates to a navigation system and method for an autonomous driving vehicle. More specifically, the present invention relates to a system and method for providing navigation for an autonomous driving vehicle (Autonomous Driving Vehicle, ADV) by capturing and analyzing information about the global scene and local objects around the vehicle.
Background art
Traditional autonomous driving (Autonomous Driving Vehicle, ADV) methods and systems depend heavily on conventional computer-readable 3D maps previously recorded using external systems. These driving approaches include semi-automated driving systems, highly automated driving systems and fully automated driving systems. Semi-automated driving systems require the driver to monitor the system continuously, while the driver handles lane keeping and lane changing on his or her own in specific use cases. Highly automated driving systems require that the driver be able to take over when needed, even though the driver is not required to monitor the system continuously. Fully automated driving systems do not require a driver in specific applications, but they usually still require pre-recorded high-definition (High Definition, HD) 3D maps and point clouds created with laser radar (LiDAR) systems.
One disadvantage of these traditional automated driving systems is that, because they rely so heavily on HD 3D maps, they are limited by predefined physical parameters and by the data previously stored on the surveyed maps. These physical parameters and data include nearby signs, traffic lights, lanes and landmark details. Due to construction, accidents or changes in the landscape, a map may be outdated or inaccurate compared with reality. Therefore, because they depend on HD 3D maps of the real world, the traditional automated driving systems described above usually require connectivity, cloud services and crowdsourced content. Given the existing solutions, preparing an HD 3D map usually requires one or more cars with ultra-precise mapping capability to record a perfect centimeter-level map. To achieve fully automated driving, these data must be integrated before the actual autonomous vehicle is sent onto the road. In addition to the high cost, the limitations of this approach and of such autonomous vehicles include the inability to change course, because the car cannot travel on a route for which no centimeter-level map has been pre-rendered. The vehicle also cannot recognize temporary traffic signals or drive through a parking lot.
US7831433B1 to Robert Belvin et al. discloses a system and method for using context in navigation dialog. The navigation system includes a route planning module and a route guidance module. The route guidance module is configured to receive a route and, based on the route, the user's current location, the conversation history, and geographic and map knowledge, the system provides location-specific instructions to the user. The location-specific instructions include references to specific visible objects near the user. However, the route planning module of this system still relies on a map to provide location-specific instructions to the user.
US9286520B1 to Wan-Yen Lo et al. discloses real-time road glare detection using templates and an appropriate color space. A computing device of a vehicle receives an image of the vehicle's environment. The computing device can be configured to identify a given pixel among multiple pixels. The computing device then compares one or more features of the shape of the object represented by the given pixel in the image with one or more corresponding features of a predetermined shape of road glare, and determines the likelihood that the object represents road glare. The computing device modifies the control strategy for the driving behavior of the vehicle accordingly. However, this device is limited in use: it only determines road glare on the road and modifies the route accordingly.
Therefore, a navigation system and method for an autonomous driving vehicle are highly desirable that provide a fully automated vehicle which can travel on any road without any problems and which preferably does not depend on pre-recorded high-definition 3D maps.
Summary of the invention
In view of the defects in the prior art, the object of the present invention is to provide a navigation system and method for an autonomous driving vehicle.
According to the present invention, there is provided a system and method for providing navigation for an autonomous driving vehicle (ADV) by capturing and analyzing information about the global scene and local objects around the ADV, without using any existing high-definition 360/3D map or previously recorded physical map.
The navigation system for an autonomous driving vehicle includes a sensor assembly integrated on the ADV and a computing device in communication with the sensor assembly. The sensor assembly includes one or more sensors configured to collect environmental data around the autonomous driving vehicle. The computing device includes a processor and a memory unit. The processor is configured to process the environmental data to identify moving objects and stationary objects around the autonomous driving vehicle, and the memory unit is configured to store a predefined scene module and the environmental data. The computing device is configured to observe the environment scene around the ADV. The observed environment scene is compared with the predefined scene module. In addition, the predefined scene module is adjusted using the processed environmental data to create an adjusted scene module. The computing device then provides instructions for controlling the ADV based on the adjusted scene module.
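For illustration only, the following is a minimal sketch of the observe, compare, adjust and control loop described above; the class and function names (SceneModule, compare_and_adjust, control_instruction) and the simple slow-down rule are assumptions made for this example and are not part of the patent.

```python
from dataclasses import dataclass, field

@dataclass
class DetectedObject:
    label: str        # e.g. "car", "pedestrian", "traffic_light"
    position: tuple   # (x, y, z) in vehicle coordinates, meters
    moving: bool      # True for moving objects, False for stationary ones

@dataclass
class SceneModule:
    """Predefined scene template plus the objects currently believed to be present."""
    road_type: str
    road_width_m: float
    objects: list = field(default_factory=list)

def compare_and_adjust(predefined: SceneModule, observed: list) -> SceneModule:
    """Compare the observed environment scene with the predefined scene module and
    return an adjusted scene module that reflects what the sensors actually saw."""
    return SceneModule(predefined.road_type, predefined.road_width_m,
                       objects=list(observed))

def control_instruction(adjusted: SceneModule) -> str:
    """Derive a (very simplified) driving instruction from the adjusted scene module."""
    hazards = [o for o in adjusted.objects if o.moving and o.position[0] < 10.0]
    return "slow_down" if hazards else "keep_lane"

# One iteration of the navigation loop.
observed = [DetectedObject("pedestrian", (8.0, 1.5, 0.0), moving=True),
            DetectedObject("sign", (15.0, -3.0, 0.0), moving=False)]
predefined = SceneModule(road_type="two_lane_urban", road_width_m=7.0)
adjusted = compare_and_adjust(predefined, observed)
print(control_instruction(adjusted))   # -> slow_down
```

The key point illustrated here is that the predefined scene module acts only as a template: in each cycle it is overwritten with what the sensors actually observed before any driving instruction is derived.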
In one embodiment, the sensors include at least one of an ultrasonic sensor, a LiDAR sensor, a radar unit, an accelerometer sensor, a gyroscope sensor, a compass sensor, a camera and a stereo optical sensor. In one embodiment, the adjusted scene module includes data related to the drivable region, the pavement markings and the path for maneuvering the ADV. In one embodiment, the environmental data is processed using computer vision and scene segmentation algorithms. In one embodiment, the environment scene is a 3D scene of the environment around the ADV. In one embodiment, the predefined scene module is a 3D scene module.
In one embodiment, the present invention provides a navigation method for an autonomous driving vehicle. In one step, a sensor assembly integrated on the ADV and a computing device in communication with the sensor assembly are provided. The sensor assembly is configured to collect environmental data around the autonomous driving vehicle. The computing device includes a processor configured to process the environmental data to identify moving objects and stationary objects around the autonomous driving vehicle, and a memory unit configured to store a predefined scene module and the environmental data. In a further step, the environment scene of the ADV is observed. In a further step, the observed environment scene is compared with the predefined scene module. In a further step, the predefined scene module is adjusted using the processed environmental data to create an adjusted scene module. In a further step, the computing device provides instructions for controlling the ADV based on the adjusted scene module.
Other features and advantages will become apparent from the following description of preferred embodiments taken in conjunction with the accompanying drawings.
Compared with the prior art, the present invention has the following beneficial effects:
1. The navigation system and method for an autonomous driving vehicle provided by the present invention provide navigation for the autonomous driving vehicle by capturing and analyzing information about the global scene and local objects around the autonomous driving vehicle (ADV);
2. The navigation system and method for an autonomous driving vehicle provided by the present invention do not use any existing high-definition 360/3D map or previously recorded physical map.
Description of the drawings
Other features, objects and advantages of the present invention will become more apparent upon reading the following detailed description of non-limiting embodiments with reference to the accompanying drawings:
Fig. 1 is a schematic view of the environment of a navigation system for an autonomous driving vehicle (ADV) according to an embodiment of the present invention.
Fig. 2 is a block diagram of a navigation process for an autonomous driving vehicle according to an embodiment of the present invention.
Fig. 3 is a screen interface view of the system distinguishing local objects around the ADV using 3D bounding boxes according to an embodiment of the present invention.
Fig. 4 is a screen interface view of the system identifying the location and position of landmarks in a scene according to an embodiment of the present invention.
Detailed description of the embodiments
The present invention is described in detail below in conjunction with specific embodiments. The following embodiments will help those skilled in the art to further understand the present invention, but do not limit the present invention in any way. It should be noted that several changes and improvements can be made by those of ordinary skill in the art without departing from the inventive concept, and these all belong to the scope of protection of the present invention.
A description of embodiments of the present invention is now given in conjunction with the accompanying drawings. It is contemplated that the present invention may be implemented in other specific forms without departing from its spirit or essential characteristics. The described embodiments should be considered in all respects as merely illustrative and not restrictive. The scope of the present invention is therefore indicated by the appended claims rather than by the foregoing description. All changes that fall within the meaning and range of equivalency of the claims are embraced within their scope.
The present invention provides a system and method for providing navigation for an autonomous driving vehicle by analyzing information about the global scene and local objects around the autonomous driving vehicle ADV. The system of the present invention is configured to overcome the limitation of depending on a perfect centimeter-level HD 3D map, which must be constantly updated to match the real world. The navigation system for an autonomous driving vehicle provided by the present invention is configured to compare the 3D scene of the environment around the ADV, observed in real time, with a predefined 3D scene module, and then to infer the presence and position of each object in the 3D scene module. This enables the system to capture global scene and local object information simultaneously. The system is configured to rely on a standard navigation map only for providing route instructions.
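As one way to picture the inference step described above, the sketch below matches the objects expected by a predefined 3D scene module against the objects observed in the real-time 3D scene with a simple nearest-neighbour test; the 2 m threshold and all identifiers are assumptions, since the patent does not specify a matching rule.

```python
import math

def associate(expected: dict, observed: dict, max_dist_m: float = 2.0) -> dict:
    """For each expected object (name -> (x, y, z) position in the scene module),
    report whether a matching observed object exists and where it actually is."""
    result = {}
    for name, exp_pos in expected.items():
        best, best_d = None, max_dist_m
        for obs_name, obs_pos in observed.items():
            d = math.dist(exp_pos, obs_pos)
            if d <= best_d:
                best, best_d = obs_name, d
        result[name] = {"present": best is not None,
                        "observed_as": best,
                        "observed_position": observed.get(best)}
    return result

expected = {"stop_sign": (20.0, 3.0, 2.0), "crosswalk": (25.0, 0.0, 0.0)}
observed = {"sign_detection_7": (20.6, 2.8, 2.1)}   # only the sign was seen this frame
print(associate(expected, observed))
# stop_sign -> present, near (20.6, 2.8, 2.1); crosswalk -> not observed this frame
```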
As shown in Fig. 1, the navigation system and method for an autonomous driving vehicle provided by the present invention, hereinafter referred to as the system, is illustrated in an environment 100 of a navigation system for an autonomous driving vehicle ADV according to an embodiment of the present invention. The system includes a sensor assembly 102 integrated into the ADV and a computing device 106 in communication with the sensor assembly 102. In one embodiment, the sensor assembly 102 includes one or more sensors configured to collect environmental data around the autonomous driving vehicle. In one embodiment, the sensors include at least one of an ultrasonic sensor, a LiDAR sensor, a radar unit, an accelerometer sensor, a gyroscope sensor, a compass sensor, a camera and a stereo optical sensor. In one embodiment, the environmental data includes information about obstacles, moving objects, stationary objects and the like.
In one embodiment, the computing device 106 includes a processor 104 and a memory unit 108. The processor 104 is configured to process the environmental data to identify moving objects and stationary objects around the autonomous driving vehicle. The memory unit 108 is configured to store a predefined scene module, or standard navigation map, and the environmental data. The present invention uses computer vision algorithms and scene segmentation algorithms together with the environmental data to identify vehicles, pedestrians, cyclists, animals, objects, signs, pavement markings, traffic lights and other obstacles. In addition, the system is configured to simultaneously feed the data model generated from the analysis to the autonomous driving vehicle for autonomous driving.
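As a purely illustrative example of turning a scene-segmentation result into the object classes listed above, the sketch below maps per-pixel class IDs to labels and separates the classes the system would treat as potential movers from the stationary ones; the class table and function name are assumptions, not part of the patent.

```python
import numpy as np

# Hypothetical class table for a semantic-segmentation network output.
CLASS_TABLE = {
    0: ("road", False), 1: ("pavement_marking", False), 2: ("vehicle", True),
    3: ("pedestrian", True), 4: ("cyclist", True), 5: ("animal", True),
    6: ("traffic_light", False), 7: ("sign", False), 8: ("other_obstacle", False),
}

def summarize_segmentation(class_map: np.ndarray) -> dict:
    """Count pixels per class and split them into moving vs. stationary categories."""
    summary = {"moving": {}, "stationary": {}}
    for class_id, (label, movable) in CLASS_TABLE.items():
        count = int((class_map == class_id).sum())
        if count:
            summary["moving" if movable else "stationary"][label] = count
    return summary

# Example with a toy 4x6 segmentation result (values are class IDs).
fake_map = np.array([[0, 0, 0, 2, 2, 0],
                     [0, 1, 0, 2, 2, 0],
                     [0, 1, 0, 0, 3, 0],
                     [0, 1, 0, 0, 0, 7]])
print(summarize_segmentation(fake_map))
```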
As shown in Fig. 2, a block diagram 200 of navigation for an autonomous driving vehicle according to an embodiment of the present invention is shown. The computing device 106 is configured to observe the environment scene around the ADV. The observed environment scene is compared with the predefined scene module. In addition, the predefined scene module is adjusted using the processed environmental data to create an adjusted scene module. The computing device 106 then provides instructions for controlling the ADV based on the adjusted scene module. The adjusted scene module includes data related to the drivable region 204 and the pavement markings 206. The computing device 106 uses semantic segmentation 202 to obtain real-time data related to the drivable region 204 and the pavement markings 206, which is further used for path generation 208 to allow the ADV to drive autonomously.
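The patent does not specify how path generation 208 consumes the drivable region 204, so the following sketch shows one simple possibility under that assumption: take the midpoint of the drivable pixels in each image row of the segmentation mask as a waypoint of the path.

```python
import numpy as np

def centerline_from_drivable_mask(drivable: np.ndarray) -> list:
    """Return (row, column) waypoints down the middle of the drivable region.

    `drivable` is a boolean image mask where True marks pixels that semantic
    segmentation classified as drivable road surface."""
    waypoints = []
    for row in range(drivable.shape[0]):
        cols = np.flatnonzero(drivable[row])
        if cols.size:                      # skip rows with no drivable pixels
            waypoints.append((row, int(cols.mean())))
    return waypoints

# Example: a toy 5x8 mask where the drivable corridor drifts to the right.
mask = np.zeros((5, 8), dtype=bool)
mask[0, 2:5] = mask[1, 2:6] = mask[2, 3:6] = mask[3, 3:7] = mask[4, 4:7] = True
print(centerline_from_drivable_mask(mask))
# -> [(0, 3), (1, 3), (2, 4), (3, 4), (4, 5)]
```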
Fig. 3 schematically illustrates a screenshot 300 in which local objects in the scene are marked differently, and thereby distinguished, using real-time 3D object recognition. Cars and pedestrians are labeled as potentially moving local objects using 3D bounding boxes 302. Other objects in the global scene, such as trees and houses 304, are localized in real time by the present invention. The information of the global scene and the local objects is captured simultaneously. Images captured by the camera system of the ADV are immediately examined by the processor 104 of the computing device 106. Fig. 4 schematically shows a screenshot 400 in which the location and position of landmarks in the scene are identified, and the lanes 402 are enhanced using real-time analysis. For example, signs are marked in magenta, and the lines separating the lanes are enhanced. The current route of the vehicle is shown with a red overlay, which illustrates the estimated travel path of the vehicle.
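A minimal sketch of the kind of 3D bounding box used in Fig. 3 to mark local objects, together with a point-containment test, is given below; the field names and the movable flag are illustrative assumptions.

```python
from dataclasses import dataclass
import math

@dataclass
class BoundingBox3D:
    """Yaw-oriented 3D box around a detected local object (Fig. 3 style)."""
    label: str      # e.g. "car", "pedestrian", "tree"
    center: tuple   # (x, y, z) in vehicle coordinates, meters
    size: tuple     # (length, width, height), meters
    yaw: float      # heading of the box around the vertical axis, radians
    movable: bool   # cars/pedestrians -> True, trees/houses -> False

    def contains(self, point: tuple) -> bool:
        """Check whether a 3D point lies inside the box (rotate into box frame first)."""
        dx, dy, dz = (point[i] - self.center[i] for i in range(3))
        c, s = math.cos(-self.yaw), math.sin(-self.yaw)
        lx, ly = c * dx - s * dy, s * dx + c * dy
        length, width, height = self.size
        return abs(lx) <= length / 2 and abs(ly) <= width / 2 and abs(dz) <= height / 2

car = BoundingBox3D("car", center=(12.0, -1.0, 0.8), size=(4.5, 1.9, 1.5),
                    yaw=0.1, movable=True)
print(car.contains((12.5, -0.8, 1.0)))   # -> True: the point is inside the box
```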
In one embodiment, a navigation method for an autonomous driving vehicle is disclosed. In one step, a sensor assembly 102 integrated on the ADV and a computing device 106 in communication with the sensor assembly 102 are provided. The sensor assembly 102 is configured to collect environmental data around the autonomous driving vehicle. The computing device 106 includes a processor 104 and a memory unit 108; the processor 104 is configured to process the environmental data to identify moving objects and stationary objects around the autonomous driving vehicle, and the memory unit 108 is configured to store a predefined scene module and the environmental data. In a further step, the environment scene of the ADV is observed. In a further step, the observed environment scene is compared with the predefined scene module. In a further step, the predefined scene module is adjusted using the processed environmental data to create an adjusted scene module. In a further step, the computing device 106 provides instructions for controlling the ADV based on the adjusted scene module.
Advantageously, the present invention no longer requires maintaining a high-definition, inch-accurate map of the region in which the ADV is expected to be used. In one aspect, this system provides a fully automated driving vehicle that performs all safety-critical functions, such as recognizing temporary signs and taking the corresponding driving actions, and detecting and avoiding obstacles.
In addition, the system no longer requires the driver to control the ADV at all times, and does not require the use of HD 3D maps. The system can identify and detect pavement markings, such as lanes, road boundaries, curbs and obstacles, without relying on ultra-precisely recorded maps, and can understand traffic signs and traffic lights to help achieve truly autonomous driving. The sensor input and the real-time scene understanding technology imitate human understanding of a scene. For example, similar to a human, as long as instructions are provided from a standard navigation map, the present invention can provide navigation in an unfamiliar environment without using a previously stored HD 3D map. Road conditions are automatically analyzed and classified into a set of predefined template road conditions. The parameters of the road, including but not limited to road type and road width, are estimated in real time based on the sensor input in order to adjust the predefined templates to match the natural environment.
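To illustrate the template idea in the last two sentences, the sketch below classifies an estimated road into one of a few predefined template road conditions and then adjusts the matched template to the measured width; the template set, the distance score and the field names are assumptions made for this example.

```python
from dataclasses import dataclass, replace

@dataclass
class RoadTemplate:
    """Predefined template road condition; the templates and fields here are assumed."""
    name: str
    lanes: int
    width_m: float   # nominal road width in meters

TEMPLATES = [
    RoadTemplate("residential_street", lanes=2, width_m=6.0),
    RoadTemplate("urban_arterial",     lanes=4, width_m=14.0),
    RoadTemplate("highway",            lanes=6, width_m=22.0),
]

def classify_and_adjust(est_lanes: int, est_width_m: float) -> RoadTemplate:
    """Pick the closest predefined template for the estimated road parameters,
    then adjust it to the measured width so it matches the natural environment."""
    best = min(TEMPLATES,
               key=lambda t: abs(t.lanes - est_lanes) + abs(t.width_m - est_width_m) / 5.0)
    return replace(best, width_m=est_width_m)   # keep the template, override the width

# Example: sensors estimate a 4-lane road that is 13.2 m wide.
print(classify_and_adjust(est_lanes=4, est_width_m=13.2))
# -> RoadTemplate(name='urban_arterial', lanes=4, width_m=13.2)
```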
Those skilled in the art will appreciate that, in addition to implementing the system provided by the present invention and its devices, modules and units purely as computer-readable program code, the method steps can be programmed in logic so that the system and its devices, modules and units realize the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers and the like. Therefore, the system provided by the present invention and its devices, modules and units can be regarded as a kind of hardware component, and the devices, modules and units included in it for realizing various functions can also be regarded as structures within the hardware component; the devices, modules and units for realizing various functions can likewise be regarded either as software modules implementing the method or as structures within the hardware component.
Specific embodiments of the present invention have been described above. It is to be understood that the present invention is not limited to the particular embodiments described above, and those skilled in the art can make various changes or modifications within the scope of the claims without affecting the substance of the present invention. In the absence of conflict, the embodiments of the present application and the features in the embodiments can be combined with each other in any manner.

Claims (14)

1. A navigation system for an autonomous driving vehicle, comprising:
a sensor assembly integrated on the autonomous driving vehicle, comprising one or more sensors configured to collect environmental data around the autonomous driving vehicle; and
a computing device in communication with the sensor assembly, comprising a processor and a memory unit,
wherein the processor is configured to process the environmental data to identify moving and stationary objects around the autonomous driving vehicle, and the memory unit is configured to store a predefined scene module and the environmental data, and
wherein the computing device is configured to:
observe the environment scene around the autonomous driving vehicle,
compare the captured environment scene with the predefined scene module,
adjust the predefined scene module using the processed environmental data, and
provide navigation for the autonomous driving vehicle based on the adjusted scene module.
2. The navigation system for an autonomous driving vehicle according to claim 1, wherein the sensors include at least one of an ultrasonic sensor, a LiDAR sensor, a radar unit, an accelerometer sensor, a gyroscope sensor, a compass sensor, a camera and a stereo optical sensor.
3. The navigation system for an autonomous driving vehicle according to claim 1, wherein the adjusted scene module includes data related to the drivable region, the pavement markings and the path for maneuvering the vehicle.
4. The navigation system for an autonomous driving vehicle according to claim 1, wherein the environmental data is processed using computer vision and scene segmentation algorithms.
5. The navigation system for an autonomous driving vehicle according to claim 1, wherein the environment scene is a 3D scene of the environment around the autonomous driving vehicle.
6. The navigation system for an autonomous driving vehicle according to claim 1, wherein the predefined scene module is a 3D scene module.
7. The navigation system for an autonomous driving vehicle according to claim 1, wherein the predefined scene module is a standard navigation map.
8. A navigation method for an autonomous driving vehicle, comprising:
providing an autonomous driving vehicle, comprising:
a sensor assembly integrated on the autonomous driving vehicle, comprising one or more sensors configured to collect environmental data around the autonomous driving vehicle; and
a computing device in communication with the sensor assembly, comprising a processor and a memory unit,
wherein the processor is configured to process the environmental data to identify moving and stationary objects around the autonomous driving vehicle, and the memory unit is configured to store a predefined scene module and the environmental data;
observing the environment scene of the vehicle;
comparing the captured environment scene with the predefined scene module;
adjusting the predefined scene module using the processed environmental data; and
providing navigation for the vehicle based on the adjusted scene module.
9. The navigation method for an autonomous driving vehicle according to claim 8, wherein the sensors include at least one of an ultrasonic sensor, a LiDAR sensor, a radar unit, an accelerometer sensor, a gyroscope sensor, a compass sensor, a camera and a stereo optical sensor.
10. The navigation method for an autonomous driving vehicle according to claim 8, wherein the adjusted scene module includes data related to the drivable region, the pavement markings and the path for maneuvering the vehicle.
11. The navigation method for an autonomous driving vehicle according to claim 8, wherein the environmental data is processed using computer vision and scene segmentation.
12. The navigation method for an autonomous driving vehicle according to claim 8, wherein the environment scene is a 3D scene of the environment around the autonomous driving vehicle.
13. The navigation method for an autonomous driving vehicle according to claim 8, wherein the predefined scene module is a 3D scene module.
14. The navigation method for an autonomous driving vehicle according to claim 8, wherein the predefined scene module is a standard navigation map.
CN201810745627.8A 2017-07-07 2018-07-09 Navigation system for autonomous vehicle and method thereof Active CN109144052B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201762529941P 2017-07-07 2017-07-07
US62/529,941 2017-07-07
US16/027,777 US10942519B2 (en) 2017-07-07 2018-07-05 System and method for navigating an autonomous driving vehicle
US16/027,777 2018-07-05

Publications (2)

Publication Number Publication Date
CN109144052A true CN109144052A (en) 2019-01-04
CN109144052B CN109144052B (en) 2021-12-28

Family

ID=64800070

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810745627.8A Active CN109144052B (en) 2017-07-07 2018-07-09 Navigation system for autonomous vehicle and method thereof

Country Status (1)

Country Link
CN (1) CN109144052B (en)

Patent Citations (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6016116A (en) * 1986-09-13 2000-01-18 Gec Avionics Limited Navigation apparatus
US20020147544A1 (en) * 1994-05-31 2002-10-10 Winged Systems Corporation High resolution autonomous precision positioning system
US20130063600A1 (en) * 2002-05-03 2013-03-14 Donnelly Corporation Vision system for vehicle
GR1004873B (en) * 2004-04-01 2005-04-26 Κιμων Βαλαβανης CONSTRUCTION OF AN AUTONOMOUS NAVIGATION SYSTEM FOR UNMANNED AERIAL VEHICLES (UAVs)
DE102005027655A1 (en) * 2005-06-15 2006-12-21 Robert Bosch Gmbh Driver assistance system e.g. adaptive cruise control system, for motor vehicle, has mechanism predicting elevation profile of roadway based on navigation system data, where system implements assistance functions based on predicted profile
EP1788643A2 (en) * 2005-09-30 2007-05-23 Fujinon Corporation Driving mechanism
US20160092725A1 (en) * 2007-01-12 2016-03-31 International Business Machines Corporation Warning a user about adverse behaviors of others within an environment based on a 3d captured image stream
US20090306881A1 (en) * 2008-06-06 2009-12-10 Toyota Motor Engineering & Manufacturing North America, Inc. Detecting principal directions of unknown environments
CN101944176A (en) * 2009-05-08 2011-01-12 通用汽车环球科技运作公司 Exist the more excellent clear path of means of transportation sign to detect
CN101900562A (en) * 2009-05-29 2010-12-01 通用汽车环球科技运作公司 Clear path detection using divide approach
CN102156481A (en) * 2011-01-24 2011-08-17 广州嘉崎智能科技有限公司 Intelligent tracking control method and system for unmanned aircraft
WO2013098486A1 (en) * 2011-12-30 2013-07-04 Rdnet Oy Method and arrangement for determining location and/or speed of a moving object and use of the arrangement
CN104145172A (en) * 2011-12-30 2014-11-12 通力股份公司 Method and arrangement for determining location and/or speed of a moving object and use of the arrangement
CN105393083A (en) * 2013-07-09 2016-03-09 齐诺马蒂赛股份有限公司 Communication controller, communication control method, terminal device, and information processing device
US20150134180A1 (en) * 2013-11-08 2015-05-14 Electronics And Telecommunications Research Institute Autonomous driving control apparatus and method using navigation technology
CN104794899A (en) * 2014-09-20 2015-07-22 徐彬 Road section traffic index estimation system based on unmanned aerial vehicle measurement
US20160180171A1 (en) * 2014-12-17 2016-06-23 Toyota Motor Engineering & Manufacturing North America, Inc. Background map format for autonomous driving
CN104679000A (en) * 2015-01-09 2015-06-03 中国科学院合肥物质科学研究院 Indoor simulation testing device and testing method for target object sensing capability of mobile robot
US20170010105A1 (en) * 2015-02-10 2017-01-12 Mobileye Vision Technologies Ltd. Navigation based on expected landmark location
US20160379486A1 (en) * 2015-03-24 2016-12-29 Donald Warren Taylor Apparatus and system to manage monitored vehicular flow rate
CN105136120A (en) * 2015-08-24 2015-12-09 陈建武 Object displacement image detection system and method
CN106899626A (en) * 2015-12-18 2017-06-27 北京奇虎科技有限公司 A kind of vehicle data processing system and method based on car-mounted terminal
CN105607637A (en) * 2016-01-25 2016-05-25 重庆德新机器人检测中心有限公司 Unmanned vehicle autopilot system
CN205451514U (en) * 2016-01-27 2016-08-10 王德龙 Car real -time road conditions over --horizon radar of navigation and network alarm system
US20170221359A1 (en) * 2016-01-28 2017-08-03 Toyota Motor Engineering & Manufacturing North America, Inc. Sensor blind spot indication for vehicles
CN105741595A (en) * 2016-04-27 2016-07-06 常州加美科技有限公司 Unmanned vehicle navigation driving method based on cloud database
DE102016207463A1 (en) * 2016-04-29 2017-11-02 Robert Bosch Gmbh Method and device for operating at least one vehicle with respect to at least one passable object in the vicinity of the at least one vehicle
CN105844964A (en) * 2016-05-05 2016-08-10 深圳市元征科技股份有限公司 Vehicle safe driving early warning method and device
CN107444264A (en) * 2016-05-31 2017-12-08 法拉第未来公司 Use the object of camera calibration du vehicule
CN106291736A (en) * 2016-08-16 2017-01-04 张家港长安大学汽车工程研究院 Pilotless automobile track dynamic disorder object detecting method
CN106155065A (en) * 2016-09-28 2016-11-23 上海仙知机器人科技有限公司 A kind of robot follower method and the equipment followed for robot
CN106842231A (en) * 2016-11-08 2017-06-13 长安大学 A kind of road edge identification and tracking
CN106909148A (en) * 2017-03-10 2017-06-30 南京沃杨机械科技有限公司 Based on the unmanned air navigation aid of agricultural machinery that farm environment is perceived

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
J. Y. Zheng: "Panoramic representation of scenes for route understanding", [1990] Proceedings, 10th International Conference on Pattern Recognition *
M. Pawan Kumar: "Efficiently selecting regions for scene understanding", 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition *
宋振伟 (Song Zhenwei): "Research and Simulation Design of an FPGA-Based Vehicle Autonomous Driving System", China Master's Theses Full-text Database, Information Science and Technology *
汪明磊 (Wang Minglei): "Research on Obstacle-Avoidance Path Planning and Tracking Control in Autonomous Navigation of Intelligent Vehicles", China Doctoral Dissertations Full-text Database, Engineering Science and Technology II *
王强 (Wang Qiang): "Research on Road Detection Technology in Vision-Aided Navigation of Intelligent Vehicles", China Master's Theses Full-text Database, Engineering Science and Technology II *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111638536A (en) * 2019-03-01 2020-09-08 通用汽车环球科技运作有限责任公司 Method and apparatus for context-aware crowd-sourced sparse high definition maps
CN111638536B (en) * 2019-03-01 2023-12-08 通用汽车环球科技运作有限责任公司 Method and apparatus for context aware crowdsourcing sparse high definition maps

Also Published As

Publication number Publication date
CN109144052B (en) 2021-12-28

Similar Documents

Publication Publication Date Title
US10817731B2 (en) Image-based pedestrian detection
EP3759562B1 (en) Camera based localization for autonomous vehicles
US11670087B2 (en) Training data generating method for image processing, image processing method, and devices thereof
US9921585B2 (en) Detailed map format for autonomous driving
WO2018133851A1 (en) Point cloud data processing method and apparatus, and computer storage medium
Li et al. Springrobot: A prototype autonomous vehicle and its algorithms for lane detection
US10203210B1 (en) Systems and methods for road scene change detection using semantic segmentation
US10942519B2 (en) System and method for navigating an autonomous driving vehicle
US9576200B2 (en) Background map format for autonomous driving
CN110155053A (en) Method and apparatus for driving the information of vehicle is provided
WO2020165650A2 (en) Systems and methods for vehicle navigation
US11280630B2 (en) Updating map data
US20160252905A1 (en) Real-time active emergency vehicle detection
CN108362295A (en) Vehicle route guides device and method
CN109583415A (en) A kind of traffic lights detection and recognition methods merged based on laser radar with video camera
CN112418081B (en) Method and system for quickly surveying traffic accidents by air-ground combination
JP2014203465A (en) Lane-based localization
US20210191397A1 (en) Autonomous vehicle semantic map establishment system and establishment method
CN117576652B (en) Road object identification method and device, storage medium and electronic equipment
CN117685954B (en) Multi-mode semantic map construction system and method for mining area
US20240144594A1 (en) Method for create map using aviation lidar and computer program recorded on record-medium for executing method therefor
CN109144052A (en) Navigation system and its method for automatic driving vehicle
CN114341939A (en) Real world image road curvature generation as a data enhancement method
Khan et al. Real-time traffic light detection from videos with inertial sensor fusion
KR102540629B1 (en) Method for generate training data for transportation facility and computer program recorded on record-medium for executing method therefor

Legal Events

Date Code Title Description
PB01 Publication
TA01 Transfer of patent application right

Effective date of registration: 20181225

Address after: P.O. Box 309, Ugland House, Maples Corporate Services Limited, Grand Cayman, Cayman Islands

Applicant after: Autex Co., Ltd.

Address before: San Jose Venturich, California, USA

Applicant before: Xiao Jianxiong

SE01 Entry into force of request for substantive examination
GR01 Patent grant