CN112172809B - Vehicle control device, vehicle control method, and storage medium - Google Patents
- Publication number
- CN112172809B (application CN202010616306.5A)
- Authority
- CN
- China
- Prior art keywords
- vehicle
- road
- factor
- narrowing
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/095—Predicting travel path or likelihood of collision
- B60W30/0956—Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/18—Propelling the vehicle
- B60W30/18009—Propelling the vehicle related to particular drive situations
- B60W30/18163—Lane change; Overtaking manoeuvres
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/40—Dynamic objects, e.g. animals, windblown objects
- B60W2554/402—Type
- B60W2554/4029—Pedestrians
Abstract
Provided are a vehicle control device, a vehicle control method, and a storage medium capable of predicting interference between a road-narrowing factor and a vehicle. The vehicle control device includes: a surrounding recognition unit that recognizes the surrounding situation of the vehicle; and a driving control unit that controls acceleration, deceleration, and steering of the vehicle based on the surrounding situation recognized by the surrounding recognition unit. The surrounding recognition unit determines whether a road-narrowing factor is present on the road on which the vehicle is traveling and, when it determines that one is present, recognizes the length of the road-narrowing factor in the traveling direction of the vehicle. The driving control unit generates an avoidance track corresponding to the road-narrowing factor based on the recognized length of the road-narrowing factor in the traveling direction of the vehicle.
Description
Technical Field
The invention relates to a vehicle control device, a vehicle control method, and a storage medium.
Background
Conventionally, there is known a technique of estimating whether a pedestrian will step out into a lane based on the relationship between the floor surface supporting the pedestrian and the position of the pedestrian's body center of gravity, and of detecting that the pedestrian is about to take avoidance action (see, for example, Patent Document 1: Japanese Patent Application Laid-Open No. 2017-210118).
Disclosure of Invention
Problems to be solved by the invention
However, the conventional technique has not sufficiently studied predicting interference between a vehicle and a road-narrowing factor (a traffic participant and a stationary object) based on the relationship between a stationary object and a traffic participant, such as a pedestrian, avoiding it.
An object of the present invention is to provide a vehicle control device, a vehicle control method, and a storage medium, which are capable of predicting interference between a road narrowing factor and a vehicle.
Means for solving the problems
The vehicle control device, the vehicle control method, and the storage medium of the present invention adopt the following configurations.
(1): A vehicle control device according to an aspect of the present invention includes: a surrounding recognition unit that recognizes the surrounding situation of a vehicle; and a driving control unit that controls acceleration, deceleration, and steering of the vehicle based on the surrounding situation recognized by the surrounding recognition unit. The surrounding recognition unit determines whether a road-narrowing factor is present on the road on which the vehicle is traveling and, when it determines that one is present, recognizes the length of the road-narrowing factor in the traveling direction of the vehicle. The driving control unit generates an avoidance track corresponding to the road-narrowing factor based on the recognized length of the road-narrowing factor in the traveling direction of the vehicle.
(2): In the aspect of (1) above, the road-narrowing factor includes a road-shoulder stationary object and one or more traffic participants whose movement to avoid the road-shoulder stationary object is predicted to interfere with the predetermined travel track of the vehicle.
(3): In the aspect of (2) above, the vehicle control device further includes a prediction unit that predicts the time-series position of the traffic participant based on the result of the surrounding recognition unit recognizing the length of the road-narrowing factor in the traveling direction of the vehicle.
(4): In the aspect of (3) above, the prediction unit predicts a period during which the traffic participant interferes with the predetermined travel track of the vehicle based on the length of the road-narrowing factor recognized by the surrounding recognition unit.
(5): In the aspect of (4) above, when the passable width on the path on which the vehicle is scheduled to travel is smaller than a predetermined width, the driving control unit causes the vehicle to travel slowly or stop before the road-narrowing factor and wait until the period during which the road-narrowing factor is determined to interfere with the predetermined travel track of the vehicle has elapsed.
(6): In the aspect of (4) or (5) above, when the passable width on the path on which the vehicle is scheduled to travel is equal to or greater than the predetermined width, the driving control unit changes the lateral avoidance control based on the period during which the road-narrowing factor interferes with the predetermined travel track of the vehicle.
(7): In any one of the aspects (3) to (6) above, the driving control unit changes the lateral avoidance control according to the time-series position of the road-narrowing factor on the travel path on which the vehicle is scheduled to travel.
(8): A vehicle control method according to an aspect of the present invention causes a computer to execute: recognizing the surrounding situation of a vehicle; controlling acceleration, deceleration, and steering of the vehicle based on the surrounding situation; determining whether a road-narrowing factor is present on the road on which the vehicle is traveling and, when it is determined that a road-narrowing factor is present, recognizing the length of the road-narrowing factor in the traveling direction of the vehicle; and generating an avoidance track corresponding to the road-narrowing factor based on the recognized length of the road-narrowing factor in the traveling direction of the vehicle.
(9): A storage medium according to an aspect of the present invention stores a program that causes a computer to execute: recognizing the surrounding situation of a vehicle; controlling acceleration, deceleration, and steering of the vehicle based on the surrounding situation; determining whether a road-narrowing factor is present on the road on which the vehicle is traveling and, when it is determined that a road-narrowing factor is present, recognizing the length of the road-narrowing factor in the traveling direction of the vehicle; and generating an avoidance track corresponding to the road-narrowing factor based on the recognized length of the road-narrowing factor in the traveling direction of the vehicle.
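The branch between waiting and lateral avoidance described in aspects (5) and (6) can be sketched compactly. The patent specifies no code, so the function name, the parameters, and the return convention below are all illustrative assumptions:

```python
def choose_avoidance_action(passable_width_m, min_width_m, interference_period_s):
    """Pick an action for passing a road-narrowing factor.

    passable_width_m: width left for the host vehicle beside the factor.
    min_width_m: the 'predetermined width' required to pass.
    interference_period_s: (start, end) seconds during which the factor is
        predicted to interfere with the planned travel track, or None.
    """
    if passable_width_m < min_width_m:
        # Aspect (5): too narrow to pass -- creep or stop before the factor
        # and wait until the predicted interference period has elapsed.
        return ("wait", interference_period_s)
    # Aspect (6): wide enough -- continue, but adjust the lateral avoidance
    # control according to the predicted interference period.
    return ("lateral_avoid", interference_period_s)
```

The key design point carried over from the claims is that both branches are driven by the same predicted interference period, not by instantaneous positions alone.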
Effects of the invention
According to the aspects (1) to (9), interference between a road-narrowing factor and the vehicle can be predicted.
Drawings
Fig. 1 is a block diagram of a vehicle system 1 using a vehicle control device 100 according to a first embodiment.
Fig. 2 is a functional configuration diagram of the first control unit 120 and the second control unit 160.
Fig. 3 is a diagram schematically showing a road narrowing factor on the travel path LR on which the host vehicle M travels.
Fig. 4 is a diagram for explaining the position of the time series of the pedestrian P.
Fig. 5 is a plan view for explaining a case where the pedestrian P moves while avoiding the other vehicle mA.
Fig. 6 is a diagram for explaining an example of a rule of the vehicle speed of the host vehicle M when the second control unit 160 travels on the avoidance track.
Fig. 7 is a diagram showing an example of the avoidance track generated by the avoidance track generation unit 142.
Fig. 8 is a diagram showing another example of the avoidance track generated by the avoidance track generation unit 142.
Fig. 9 is a diagram for explaining a scenario in which a road shoulder stationary object on the travel path LR is a large vehicle mB.
Fig. 10 is a diagram for explaining a scene in which a road shoulder stationary object on the travel path LR is a large vehicle mB.
Fig. 11 is a flowchart showing an example of the flow of the road-narrowing-factor avoidance processing in the vehicle system 1.
Fig. 12 is a flowchart showing an example of the flow of the avoidance track generation processing by the avoidance track generation unit 142 based on the prediction result of the prediction unit 136.
Fig. 13 is a diagram showing an example of a hardware configuration of the various control devices according to the embodiment.
Reference numerals:
1 … vehicle system; 10 … camera; 12 … radar device; 14 … detector; 16 … object recognition device; 20 … communication device; 40 … vehicle sensor; 50 … navigation device; 51 … GNSS receiver; 53 … route determination unit; 61 … recommended lane determination unit; 80 … driving operation element; 100 … vehicle control device; 120 … first control unit; 130 … recognition unit; 132 … surrounding recognition unit; 134 … road-narrowing-factor recognition unit; 136 … prediction unit; 140 … action plan generation unit; 142 … avoidance track generation unit; 160 … second control unit; 162 … acquisition unit; 164 … speed control unit; 166 … steering control unit; 200 … running driving force output device; 210 … brake device; 220 … steering device; M … host vehicle; mA … other vehicle; mB … large vehicle; P … pedestrian.
Detailed Description
Embodiments of a vehicle control device, a vehicle control method, and a storage medium according to the present invention are described below with reference to the drawings.
[ Integral Structure ]
Fig. 1 is a block diagram of a vehicle system 1 using a vehicle control device 100 according to a first embodiment. The vehicle on which the vehicle system 1 is mounted is, for example, a two-wheeled, three-wheeled, or four-wheeled vehicle, and its driving source is an internal combustion engine such as a diesel or gasoline engine, an electric motor, or a combination thereof. The electric motor operates using power generated by a generator connected to the internal combustion engine, or the discharge power of a secondary battery or a fuel cell.
The vehicle system 1 includes, for example, a camera 10, a radar device 12, a detector 14, an object recognition device 16, a communication device 20, an HMI 30, a vehicle sensor 40, a navigation device 50, an MPU 60, a driving operation element 80, a vehicle control device 100, a running driving force output device 200, a brake device 210, and a steering device 220. These devices are connected to one another by a multiplex communication line such as a CAN (Controller Area Network) communication line, a serial communication line, a wireless communication network, or the like. The configuration shown in Fig. 1 is merely an example; a part of the configuration may be omitted, and other configurations may be added.
The camera 10 is, for example, a digital camera using a solid-state imaging element such as a CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide-Semiconductor) sensor. The camera 10 is attached to an arbitrary portion of the vehicle on which the vehicle system 1 is mounted (hereinafter referred to as the host vehicle M). When imaging the area ahead, the camera 10 is attached to the upper part of the front windshield, the back of the rear-view mirror in the vehicle interior, or the like. The camera 10, for example, periodically and repeatedly captures images of the surroundings of the host vehicle M. The camera 10 may also be a stereo camera.
The radar device 12 radiates radio waves such as millimeter waves around the host vehicle M and detects radio waves (reflected waves) reflected by an object to detect at least the position (distance and azimuth) of the object. The radar device 12 is attached to an arbitrary portion of the host vehicle M. The radar device 12 may detect the position and speed of an object by an FM-CW (Frequency-Modulated Continuous-Wave) method.
The detector 14 is a LIDAR (Light Detection and Ranging) sensor. The detector 14 irradiates light around the host vehicle M and measures the scattered light. The detector 14 detects the distance to an object based on the time from light emission to light reception. The irradiated light is, for example, pulsed laser light. The detector 14 is attached to an arbitrary portion of the host vehicle M.
The object recognition device 16 performs sensor-fusion processing on the detection results of some or all of the camera 10, the radar device 12, and the detector 14 to recognize the position, type, speed, and the like of an object. The object recognition device 16 outputs the recognition result to the vehicle control device 100. The object recognition device 16 may also output the detection results of the camera 10, the radar device 12, and the detector 14 directly to the vehicle control device 100, and the object recognition device 16 may be omitted from the vehicle system 1.
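The patent does not disclose the fusion algorithm, but as a rough illustration of what such sensor-fusion processing might look like, camera and radar detections could be merged by nearest-neighbour association. All names, data shapes, and the association threshold below are assumptions:

```python
import math

def fuse_detections(camera_objs, radar_objs, max_dist=2.0):
    """Merge camera detections (position + type) with radar detections
    (position + speed) by nearest-neighbour association.

    camera_objs: list of {'pos': (x, y), 'type': str}
    radar_objs:  list of {'pos': (x, y), 'speed': float}
    """
    fused = []
    for cam in camera_objs:
        best, best_d = None, max_dist
        for rad in radar_objs:
            d = math.dist(cam['pos'], rad['pos'])
            if d < best_d:
                best, best_d = rad, d
        fused.append({'pos': cam['pos'],
                      'type': cam['type'],
                      # Radar supplies the speed when a match exists.
                      'speed': best['speed'] if best else None})
    return fused
```

A production system would use probabilistic association and tracking over time; this sketch only shows the position/type/speed merge named in the paragraph above.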
The communication device 20 communicates with other vehicles present in the vicinity of the autonomous vehicle using, for example, a cellular network, a Wi-Fi network, Bluetooth (registered trademark), DSRC (Dedicated Short-Range Communication), or the like, or communicates with various server devices via a wireless base station.
The HMI 30 presents various information to an occupant of the autonomous vehicle and accepts input operations by the occupant. The HMI 30 includes various display devices, speakers, buzzers, touch panels, switches, keys, and the like.
The vehicle sensor 40 includes a vehicle speed sensor that detects the speed of the autonomous vehicle, an acceleration sensor that detects acceleration, a yaw rate sensor that detects the angular velocity about a vertical axis, an orientation sensor that detects the orientation of the autonomous vehicle, and the like.
The navigation device 50 includes, for example, a GNSS receiver 51, a navigation HMI 52, and a route determination unit 53. The navigation device 50 holds first map information 54 in a storage device such as an HDD or flash memory. The GNSS receiver 51 determines the position of the autonomous vehicle based on signals received from GNSS satellites. The position of the autonomous vehicle may also be determined or complemented by an INS (Inertial Navigation System) using the output of the vehicle sensor 40. The navigation HMI 52 includes a display device, speakers, a touch panel, keys, and the like. The navigation HMI 52 may be partly or entirely shared with the HMI 30 described above. The route determination unit 53 determines, with reference to the first map information 54, a route (hereinafter, on-map route) from the position of the autonomous vehicle determined by the GNSS receiver 51 (or an arbitrary input position) to a destination input by the occupant using the navigation HMI 52. The first map information 54 is, for example, information in which road shapes are represented by links indicating roads and nodes connected by the links. The first map information 54 may include the curvature of roads, POI (Point of Interest) information, and the like. The on-map route is output to the MPU 60. The navigation device 50 may perform route guidance using the navigation HMI 52 based on the on-map route. The navigation device 50 may be realized by the functions of a terminal device such as a smartphone or tablet terminal carried by the occupant. The navigation device 50 may also transmit the current position and the destination to a navigation server via the communication device 20 and acquire a route equivalent to the on-map route from the navigation server.
The MPU 60 includes, for example, a recommended lane determination unit 61, and holds second map information 62 in a storage device such as an HDD or flash memory. The recommended lane determination unit 61 divides the on-map route provided from the navigation device 50 into a plurality of blocks (for example, every 100 m in the vehicle traveling direction) and determines a recommended lane for each block with reference to the second map information 62. The recommended lane determination unit 61 determines, for example, in which lane from the left the vehicle should travel. When a branch point exists on the on-map route, the recommended lane determination unit 61 determines the recommended lane so that the autonomous vehicle can travel on a reasonable route for proceeding to the branch destination.
The second map information 62 is map information with higher accuracy than the first map information 54. The second map information 62 includes, for example, information on the centers of lanes, information on lane boundaries, and the like. The second map information 62 may also include road information, traffic regulation information, address information (address/zip code), facility information, telephone number information, and the like. The second map information 62 may be updated at any time by communicating with other devices via the communication device 20.
The driving operation element 80 includes, for example, an accelerator pedal, a brake pedal, a shift lever, a steering wheel, an irregularly shaped steering member, a joystick, and other operation elements. A sensor that detects the amount of operation or the presence or absence of operation is attached to the driving operation element 80, and the detection result is output to the vehicle control device 100, or to some or all of the running driving force output device 200, the brake device 210, and the steering device 220.
The vehicle control device 100 includes, for example, a first control unit 120 and a second control unit 160. The first control unit 120 and the second control unit 160 are each realized by a hardware processor such as a CPU executing a program (software). Some or all of these components may be realized by hardware (circuitry) such as an LSI, ASIC, FPGA, or GPU, or may be realized by cooperation between software and hardware. The program may be stored in advance in a storage device (a storage device having a non-transitory storage medium) such as an HDD or flash memory of the vehicle control device 100, or may be stored in a removable storage medium such as a DVD or CD-ROM and installed in the HDD or flash memory of the vehicle control device 100 when the storage medium (non-transitory storage medium) is mounted in a drive device.
Fig. 2 is a functional configuration diagram of the first control unit 120 and the second control unit 160. The first control unit 120 includes, for example, a recognition unit 130 and an action plan generation unit 140. The first control unit 120 realizes, for example, a function based on AI (artificial intelligence) and a function based on a model given in advance in parallel. For example, the function of "recognizing an intersection" may be realized by executing recognition of an intersection by deep learning or the like and recognition based on predetermined conditions (the presence of a signal, a road sign, or the like that allows pattern matching) in parallel, scoring both, and evaluating them comprehensively. This ensures the reliability of automated driving.
The recognition unit 130 recognizes the surroundings of the host vehicle M and estimates the behavior of recognized objects. The recognition unit 130 includes, for example, a surrounding recognition unit 132, a road-narrowing-factor recognition unit 134, and a prediction unit 136.
The surrounding recognition unit 132 recognizes states such as the position, speed, and acceleration of objects in the vicinity of the autonomous vehicle (including a preceding vehicle and an oncoming vehicle, described later) based on information input from the camera 10, the radar device 12, and the detector 14 via the object recognition device 16. The position of an object is recognized, for example, as a position on absolute coordinates whose origin is a representative point of the autonomous vehicle (the center of gravity, the center of the drive shaft, or the like) and is used for control. The position of an object may be represented by a representative point such as the center of gravity or a corner of the object, or may be represented by an expressed region. The "state" of an object may include its acceleration or jerk, or its "behavioral state" (for example, whether it is changing lanes or about to change lanes).
The surrounding recognition unit 132 recognizes, for example, the lane in which the autonomous vehicle is traveling (the travel lane). For example, the surrounding recognition unit 132 recognizes the travel lane by comparing the pattern of road division lines (for example, the arrangement of solid and broken lines) obtained from the second map information 62 with the pattern of road division lines around the autonomous vehicle recognized from images captured by the camera 10. The surrounding recognition unit 132 is not limited to road division lines and may recognize the travel lane by recognizing travel-road boundaries (road boundaries) including road division lines, road shoulders, curbs, median strips, guardrails, and the like. In this recognition, the position of the autonomous vehicle acquired from the navigation device 50 and the processing result of the INS may be taken into account. In addition, the surrounding recognition unit 132 recognizes temporary stop lines, obstacles, red lights, tollgates, and other road events.
When recognizing the travel lane, the surrounding recognition unit 132 recognizes the position and posture of the autonomous vehicle relative to the travel lane. For example, the surrounding recognition unit 132 may recognize, as the relative position and posture of the autonomous vehicle with respect to the travel lane, the deviation of a reference point of the autonomous vehicle from the lane center and the angle between the traveling direction of the autonomous vehicle and a line connecting the lane centers. Alternatively, the surrounding recognition unit 132 may recognize the position of the reference point of the autonomous vehicle relative to either side end of the travel lane (a road division line or road boundary) as the relative position of the autonomous vehicle with respect to the travel lane.
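The relative position and posture described above reduce to two quantities: a signed lateral offset from the lane-center line and a heading error against its direction. The patent gives no formulas, so the sketch below (function and parameter names are illustrative) computes both for a straight lane-center segment:

```python
import math

def pose_relative_to_lane(ref_point, heading_rad, center_a, center_b):
    """Relative pose of a vehicle reference point w.r.t. a lane-center
    segment running from center_a to center_b.

    Returns (lateral_offset_m, yaw_error_rad); the offset is positive
    when the reference point lies to the left of the segment direction.
    """
    ax, ay = center_a
    bx, by = center_b
    px, py = ref_point
    dx, dy = bx - ax, by - ay          # lane-center direction vector
    seg_len = math.hypot(dx, dy)
    # Signed lateral offset: cross product of the direction vector with
    # the vector from center_a to the reference point, normalized.
    lateral = (dx * (py - ay) - dy * (px - ax)) / seg_len
    # Heading error relative to the lane-center direction.
    yaw_err = heading_rad - math.atan2(dy, dx)
    return lateral, yaw_err
```

With a curved lane-center polyline, the same computation would be applied against the nearest segment.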
The surrounding recognition unit 132 recognizes information on the surroundings of the host vehicle M, in particular on the lane on which the host vehicle M is scheduled to travel, based on the surroundings of the host vehicle M recognized from images captured by the camera 10, congestion information around the host vehicle M acquired by the navigation device 50, or position information obtained from the second map information 62. The information on the lane of the scheduled travel includes, for example, the width of the lane on which the host vehicle M is scheduled to travel. The surrounding recognition unit 132 outputs the recognition result to the road-narrowing-factor recognition unit 134.
The road-narrowing-factor recognition unit 134 recognizes road-narrowing factors using the recognition result of the surrounding recognition unit 132. Road-narrowing factors include, for example, road-shoulder stationary objects, such as another vehicle parked on the road shoulder or a safety barrier (guard barrier) installed for construction work that temporarily narrows the road, and traffic participants who move while avoiding those road-shoulder stationary objects. Traffic participants are, for example, pedestrians, bicycles, motorcycles, and other vehicles. When it recognizes that a road-narrowing factor is present, the road-narrowing-factor recognition unit 134 obtains a result of further recognizing the road-narrowing factor as three-dimensional information (hereinafter simply referred to as "three-dimensional information").
The road-narrowing-factor recognition unit 134 outputs, for example, the result of recognizing the road-narrowing factor as three-dimensional information to the prediction unit 136.
The prediction unit 136 predicts the time-series position of the road-narrowing factor, in particular the time-series position of the traffic participant, based on the recognition results of the surrounding recognition unit 132 and the road-narrowing-factor recognition unit 134. The prediction unit 136 sets arbitrary points on the outline of the road-narrowing factor as candidate points and predicts the time-series position of the traffic participant by predicting the time-series movement of the candidate points.
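The candidate-point idea above can be illustrated with the simplest possible motion model. The patent does not state how candidate points are propagated; the constant-velocity assumption, function name, and time step below are illustrative only:

```python
def predict_candidate_points(points, velocity, horizon_s, dt=0.5):
    """Propagate outline candidate points under a constant-velocity model.

    points: list of (x, y) candidate points on the factor's outline.
    velocity: (vx, vy) estimated velocity of the traffic participant, m/s.
    horizon_s: prediction horizon in seconds.
    Returns a list of point sets, one per time step (the time series).
    """
    vx, vy = velocity
    series = []
    t = dt
    while t <= horizon_s + 1e-9:
        series.append([(x + vx * t, y + vy * t) for x, y in points])
        t += dt
    return series
```

Each entry of the returned series is a predicted snapshot of the outline; checking these snapshots against the host vehicle's planned track yields the interference period used by aspects (4) to (6).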
The action plan generation unit 140 generates a target track on which the host vehicle M will travel in the future so that, in principle, the host vehicle M travels in the recommended lane determined by the recommended lane determination unit 61 and performs automated driving adapted to the surroundings of the host vehicle M. The target track includes, for example, a speed element. For example, the target track is expressed as a sequence of points (track points) that the host vehicle M should reach. A track point is a point that the host vehicle M should reach at every predetermined travel distance (for example, on the order of several meters) measured along the road; in addition, a target speed for every predetermined sampling time (for example, on the order of several tenths of a second) is generated as part of the target track.
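A target track of the kind just described is essentially a list of (arc-length, speed) pairs. As a minimal sketch (the spacing and speed values are illustrative, not taken from the patent):

```python
def make_target_track(total_dist_m, spacing_m=2.0, speed_mps=8.0):
    """Build a target track as track points spaced a fixed distance apart
    along the road, each paired with a speed element.

    's' is the distance along the road to the track point in metres.
    """
    n = int(total_dist_m // spacing_m)
    return [{"s": (i + 1) * spacing_m, "speed": speed_mps} for i in range(n)]
```

A real action plan generation unit would vary the speed element point by point (e.g. lowering it near a road-narrowing factor); here a constant speed keeps the data shape visible.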
The action plan generation unit 140 includes, for example, an avoidance track generation unit 142. The avoidance track generation unit 142 generates a target track along which the host vehicle M moves while avoiding the road-narrowing factor, based on the prediction result of the prediction unit 136. The avoidance track generation unit 142 generates an avoidance track corresponding to the road-narrowing factor based on the length of the road-narrowing factor in the traveling direction of the vehicle recognized by the road-narrowing-factor recognition unit 134.
The second control unit 160 controls the running driving force output device 200, the brake device 210, and the steering device 220 so that the autonomous vehicle passes along the target track generated by the action plan generation unit 140 at the scheduled times. The combination of the action plan generation unit 140 and the second control unit 160 is an example of a "driving control unit".
Returning to Fig. 1, the second control unit 160 includes, for example, an acquisition unit 162, a speed control unit 164, and a steering control unit 166. The acquisition unit 162 acquires information on the target track (track points) generated by the action plan generation unit 140 and stores it in a memory (not shown). The speed control unit 164 controls the running driving force output device 200 or the brake device 210 based on the speed element accompanying the target track stored in the memory. The steering control unit 166 controls the steering device 220 according to the degree of curvature of the target track stored in the memory. The processing of the speed control unit 164 and the steering control unit 166 is realized by, for example, a combination of feedforward control and feedback control. As one example, the steering control unit 166 combines feedforward control according to the curvature of the road ahead of the autonomous vehicle with feedback control based on the deviation from the target track.
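The feedforward-plus-feedback combination used by the steering control unit 166 can be sketched with a kinematic bicycle model for the feedforward term and a proportional term for the feedback. The gain, wheelbase value, and function name are assumptions for illustration, not the patent's controller:

```python
import math

def steering_command(curvature_1pm, lateral_error_m, wheelbase_m=2.7, k_fb=0.3):
    """Steering angle (rad) = feedforward from road curvature ahead
    + proportional feedback on the lateral deviation from the target track."""
    # Feedforward: wheel angle that tracks the curvature (bicycle model).
    feedforward = math.atan(wheelbase_m * curvature_1pm)
    # Feedback: proportional correction of the lateral deviation
    # (positive error = vehicle left of track, so steer right).
    feedback = -k_fb * lateral_error_m
    return feedforward + feedback
```

The feedforward term does most of the work on curves, while the feedback term removes the residual deviation; production controllers add derivative/integral terms and actuator limits.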
The running driving force output device 200 outputs a running driving force (torque) for running the vehicle to the driving wheels. The running driving force output device 200 includes, for example, a combination of an internal combustion engine, an electric motor, a transmission, and the like, and an ECU that controls them. The ECU controls the above configuration in accordance with information input from the second control unit 160 or information input from the driving operation element 80.
The brake device 210 includes, for example, a brake caliper, a hydraulic cylinder that transmits hydraulic pressure to the caliper, an electric motor that generates hydraulic pressure in the cylinder, and a brake ECU. The brake ECU controls the electric motor in accordance with information input from the second control unit 160 or information input from the driving operation element 80 so that a braking torque corresponding to the braking operation is output to each wheel. The brake device 210 may include, as a backup, a mechanism for transmitting hydraulic pressure generated by operation of the brake pedal included in the driving operation element 80 to the cylinder via a master cylinder. The brake device 210 is not limited to the above configuration, and may be an electronically controlled hydraulic brake device that controls an actuator in accordance with information input from the second control unit 160 to transmit the hydraulic pressure of the master cylinder to the cylinder.
The steering device 220 includes, for example, a steering ECU and an electric motor. The electric motor applies a force to, for example, a rack-and-pinion mechanism to change the direction of the steered wheels. The steering ECU drives the electric motor in accordance with information input from the second control unit 160 or information input from the driving operation element 80 to change the direction of the steered wheels.
Fig. 3 is a diagram schematically showing a road narrowing factor on the travel path LR on which the host vehicle M travels. The road width of the travel path LR is WR. The travel path LR may be a single-lane road, or may have other adjacent lanes, which are not shown. The X-axis in the drawing is the axis in the longitudinal direction of the travel path LR, i.e., in the scheduled traveling direction of the host vehicle M. The Y-axis in the drawing is the axis in the width direction of the travel path LR relative to the traveling direction of the host vehicle M. The Z-axis in the drawing is the axis in the height direction of the host vehicle M. The road narrowing factor recognition unit 134 recognizes, as three-dimensional information, the length of the road shoulder stationary object in the X-axis direction (the traveling direction of the host vehicle M), the length (width) of the road shoulder stationary object in the Y-axis direction, the height of the road shoulder stationary object in the Z-axis direction, and information on the traffic participants.
In the following description, it is assumed that the road shoulder stationary object constituting the road narrowing factor is another vehicle mA and the traffic participant is a pedestrian P.
The road narrowing factor recognition unit 134 recognizes the other vehicle mA parked ahead of the host vehicle M in the traveling direction (X-axis direction) and the pedestrian P ahead of the other vehicle mA in the traveling direction.
The prediction unit 136 predicts the time-series position of the pedestrian P based on the recognition result of the surrounding recognition unit 132 and the recognition result of the road narrowing factor recognition unit 134.
The prediction unit 136 outputs a prediction result using one or more indices. For each of a plurality of candidate points on the traveling-direction side of the host vehicle M, the prediction unit 136 derives a first index R (risk potential) whose value becomes larger as the candidate point approaches the road narrowing factor identified by the road narrowing factor recognition unit 134, and associates the derived value with the candidate point. Here, "associates" means, for example, that the mutually corresponding pieces of information are stored in a memory. In the present embodiment, a large value is treated as "negative (undesirable)" and a value close to zero as "affirmative (preferable)"; the score values described later are thus more preferable the closer they are to zero, but this relationship may be reversed. The prediction unit 136 derives the first index RmA for each candidate point, centered on the representative point of the other vehicle mA, which is the road shoulder stationary object constituting the road narrowing factor, such that the value becomes larger as the distance from the representative point becomes smaller.
The distribution of the first index RmA of the other vehicle mA is derived such that, for example, a contour line of equal values forms an ellipse elongated in the X-axis direction. The ratio of the major axis to the minor axis of the ellipse varies, for example, according to the longitudinal length of the road narrowing factor. The prediction unit 136 sets the major axis of the ellipse representing the distribution of the first index RmA based on the vehicle length LmA of the other vehicle mA, and sets the minor axis based on the vehicle width WmA of the other vehicle mA. In fig. 3, the outer edge line of the ellipse at which the first index RmA of the other vehicle mA becomes zero is indicated by a broken line.
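The elliptical distribution described above can be sketched as follows. This is an illustrative model only: the function name, the clearance margin, and the linear falloff from the representative point are assumptions, not the patent's actual computation; only the elliptical geometry tied to the object's length and width comes from the text.

```python
def first_index(px, py, cx, cy, length, width, peak=1.0, margin=1.0):
    """Illustrative risk-potential R for one road narrowing factor:
    an elliptical field centered on the object's representative point
    (cx, cy), elongated along the travel (X) axis.  The semi-axes come
    from the object's length and width plus an assumed clearance
    margin; the value peaks at the center and is zero on and outside
    the outer-edge contour (the broken-line ellipse of fig. 3)."""
    a = length / 2 + margin   # semi-major axis (X direction)
    b = width / 2 + margin    # semi-minor axis (Y direction)
    # Normalized squared elliptical distance: 1.0 on the zero contour.
    d2 = ((px - cx) / a) ** 2 + ((py - cy) / b) ** 2
    return max(0.0, peak * (1.0 - d2))

# Candidate points around a parked vehicle mA (length 4.5 m, width 1.8 m).
r_center = first_index(0.0, 0.0, 0.0, 0.0, 4.5, 1.8)   # at the representative point
r_far = first_index(10.0, 5.0, 0.0, 0.0, 4.5, 1.8)     # well outside the ellipse
```

The value grows as a candidate point nears the representative point, matching the "larger when closer" relationship in the text.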
Similarly, with the representative point of the pedestrian P, who is also part of the road narrowing factor, as the center, the prediction unit 136 derives the first index RP for each candidate point such that the value becomes larger as the distance from the representative point becomes smaller. In fig. 3, the outer edge line of the ellipse at which the first index RP of the pedestrian P becomes zero is indicated by a broken line.
By predicting the time-series position of the pedestrian P, the prediction unit 136 predicts the passable width D through which the host vehicle M can pass, and outputs the predicted passable width D to the avoidance trajectory generation unit 142.
Fig. 4 is a diagram for explaining the time-series position of the pedestrian P. Fig. 4 shows the following scene: after a predetermined time has elapsed from the state shown in fig. 3, the pedestrian P moves while avoiding the other vehicle mA and interferes with the predetermined travel track of the host vehicle M. While the pedestrian P is moving around the other vehicle mA, the road width through which the host vehicle M can pass becomes minimum (D shown in the figure).
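The relation between road width and passable width can be sketched as follows; the per-time-step representation and the sample protrusion values are assumptions for illustration, not figures from the patent.

```python
def passable_width_series(WR, protrusion_by_time):
    """For each predicted time step, the passable width D is the road
    width WR minus the lateral (Y-axis) extent occupied by the road
    narrowing factor at that time.  Returns the per-step widths and
    the minimum, which is the D that governs the avoidance decision."""
    widths = {t: WR - p for t, p in protrusion_by_time.items()}
    return widths, min(widths.values())

# Road 6.0 m wide; the factor protrudes most (3.4 m) while the
# pedestrian walks beside the parked vehicle (time step 2).
widths, D_min = passable_width_series(
    6.0, {0: 2.0, 1: 2.6, 2: 3.4, 3: 2.0})
```

The minimum over the time series corresponds to the moment in fig. 4 at which the passable road width is smallest.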
The prediction unit 136 derives a first index R of the road narrowing factor by combining the first index RmA of the other vehicle mA, which is the road shoulder stationary object, with the first index RP of the pedestrian P, who is the traffic participant. For example, when the pedestrian P is close to the other vehicle mA, the prediction unit 136 may derive the first index R by regarding the other vehicle mA and the pedestrian P as one integral road narrowing factor, or may derive the first index RmA of the other vehicle mA and the first index RP of the pedestrian P separately.
The prediction unit 136 predicts the period during which the host vehicle M interferes with the road narrowing factor based on the vehicle length LmA of the other vehicle mA. The interference period is the period during which the first index RP of the pedestrian P avoiding the other vehicle mA lies alongside the first index RmA of the other vehicle mA, causing interference in the lateral position.
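A minimal sketch of such an interference-period test is shown below; representing the host vehicle's path as a lateral interval and the pedestrian as a point track is an assumed simplification of the index-based geometry described above.

```python
def interference_period(ped_track, corridor):
    """Illustrative interference-period test: the times at which the
    pedestrian's predicted lateral (Y) position falls inside the host
    vehicle's planned corridor while rounding the parked vehicle.
    Returns the (start, end) of that period, or None."""
    y_min, y_max = corridor
    times = [t for t, y in ped_track if y_min <= y <= y_max]
    return (min(times), max(times)) if times else None

# The pedestrian swings out to y = 2.0-2.5 m while rounding the
# vehicle; the host vehicle's corridor spans y = 1.5 to 3.5 m.
period = interference_period(
    [(0, 0.5), (1, 2.0), (2, 2.5), (3, 0.5)], (1.5, 3.5))
```

A longer road shoulder stationary object keeps the pedestrian in the corridor longer, which is why the text ties the interference period to the vehicle length LmA.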
In addition, when it is predicted in the state shown in fig. 3 that the pedestrian P will wait for the host vehicle M to pass before walking beside the other vehicle mA, the prediction unit 136 may determine that the host vehicle M and the road narrowing factor do not interfere with each other.
The avoidance trajectory generation unit 142 generates the avoidance trajectory by changing the lateral avoidance control based on the time-series position of the pedestrian P predicted by the prediction unit 136, the relation between the passable width D and the vehicle width WM of the host vehicle M, and the period during which the host vehicle M interferes with the road narrowing factor. In particular, the avoidance trajectory generation unit 142 generates the avoidance behavior of the host vehicle M taking into consideration the time-series position of the pedestrian P at which the passable width D becomes smallest. The avoidance behavior includes temporarily stopping the host vehicle M before the other vehicle mA, or slowing down before the other vehicle mA. The second control unit 160 may change the speed at which the host vehicle M travels along the avoidance trajectory generated by the avoidance trajectory generation unit 142 according to whether the host vehicle M is perceived by the pedestrian P. When the pedestrian P has not noticed the host vehicle M, the second control unit 160 controls the speed of the host vehicle M so that it travels more slowly than when the pedestrian P has noticed it.
Whether or not the pedestrian P is aware of the host vehicle M may be estimated by the prediction unit 136 based on, for example, the face orientation of the pedestrian P captured by the camera 10 or a change in the moving speed of the pedestrian P, or by another image analysis method.
Fig. 5 is a plan view for explaining how the pedestrian P moves while avoiding the other vehicle mA. The upper left diagram of fig. 5 illustrates the position prediction performed by the prediction unit 136 when the pedestrian P starts to avoid the other vehicle mA at time T. The prediction unit 136 predicts that the pedestrian P starts to avoid the other vehicle mA at time T, moves to the vicinity of the front of the other vehicle mA at time T+ta after a predetermined time ta has elapsed (upper right diagram of fig. 5), moves to the vicinity of the rear of the other vehicle mA at time T+2ta after the predetermined time ta has elapsed again (lower left diagram of fig. 5), and finishes avoiding the other vehicle mA at time T+3ta (lower right diagram of fig. 5). The prediction unit 136 derives the first index R of the road narrowing factor based on the time-series positions from time T to time T+3ta over which the pedestrian P is predicted to move while avoiding the other vehicle mA, as shown in fig. 5. In addition, the prediction unit 136 predicts the passable width D at the portion where the road narrowing factor protrudes most in the Y-axis direction at each time shown in fig. 5.
The prediction unit 136 determines whether or not the avoidance period during which the host vehicle M avoids the other vehicle mA overlaps the avoidance period during which the pedestrian P avoids the other vehicle mA. If it determines that they do not overlap, the pedestrian P is unlikely to affect the traveling of the host vehicle M, and the prediction unit 136 therefore instructs the action plan generation unit 140 to execute control in which the host vehicle M avoids only the other vehicle mA.
When the passable width D is sufficiently larger than the vehicle width WM of the host vehicle M (for example, when the passable width D is equal to or greater than a threshold Th1), the prediction unit 136 determines that the host vehicle M can be made to travel while avoiding the road narrowing factor, that is, the other vehicle mA and the pedestrian P, even while the pedestrian P is passing beside the other vehicle mA. Here, the threshold Th1 is a value that can be defined by, for example, the sum of the vehicle width WM and a predetermined margin (about 50 to 80 cm). When the passable width D is smaller than the threshold Th1 and equal to or greater than a threshold Th2, the prediction unit 136 determines that the host vehicle M should slow down and travel while avoiding the road narrowing factor while the pedestrian P is passing beside the other vehicle mA. Here, the threshold Th2 defines a condition stricter than the threshold Th1 (one under which it is harder to determine that the host vehicle M can avoid the factor), and is a value that can be defined by, for example, the sum of the vehicle width WM and a predetermined margin (about 20 to 50 cm). When the passable width D is smaller than the threshold Th2, the prediction unit 136 determines that the host vehicle M should stop temporarily before the other vehicle mA while the pedestrian P is passing beside it, wait until the period determined to be the interference period has passed, and then restart. The thresholds Th1 and Th2 are examples of the "predetermined width".
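The three-way decision above can be sketched as follows. The 0.8 m and 0.5 m default margins are assumptions chosen from within the 50-80 cm and 20-50 cm ranges given in the text; the return labels are illustrative names, not terms from the patent.

```python
def decide_avoidance(D, WM, margin_pass=0.8, margin_slow=0.5):
    """Sketch of the three-way decision on the passable width D:
    Th1 = WM + margin_pass and Th2 = WM + margin_slow, with the
    margins assumed inside the ranges stated in the text."""
    th1 = WM + margin_pass
    th2 = WM + margin_slow
    if D >= th1:
        return "avoid_while_passing"   # D >= Th1: pass even beside the pedestrian
    if D >= th2:
        return "slow_down_and_avoid"   # Th2 <= D < Th1: slow down while avoiding
    return "stop_and_wait"             # D < Th2: stop before mA, wait, restart

# Host vehicle 1.8 m wide, so Th1 = 2.6 m and Th2 = 2.3 m.
a = decide_avoidance(3.0, 1.8)   # well above Th1
b = decide_avoidance(2.4, 1.8)   # between Th2 and Th1
c = decide_avoidance(2.0, 1.8)   # below Th2
```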
The avoidance trajectory generation unit 142 generates the avoidance trajectory based on the prediction result of the prediction unit 136 shown in fig. 5.
When the prediction unit 136 outputs a result predicting that the pedestrian P will avoid the other vehicle mA, the avoidance trajectory generation unit 142 may set the avoidance trajectory of the host vehicle M in consideration of the temporal change of the region of the first index R corresponding to the pedestrian P, as shown in fig. 5. More specifically, at time T the pedestrian P is at a position distant from the host vehicle M, so the avoidance trajectory can be set so as to pass laterally close to the other vehicle mA; at time T+3ta, the trajectory can be set so as to pass laterally close to the other vehicle mA after passing beside the pedestrian P.
That is, based on the overall prediction results at times T to T+3ta, the avoidance trajectory generation unit 142 may generate not a large avoidance trajectory that assumes the pedestrian P is always present, but an avoidance trajectory that realizes the minimum necessary avoidance in accordance with the first index R.
Fig. 6 is a diagram for explaining an example of a rule for the vehicle speed of the host vehicle M when the second control unit 160 causes it to travel along the avoidance trajectory. The second control unit 160 controls the vehicle speed during the period in which the host vehicle M avoids the road narrowing factor with reference to the legal speed VL, based on the passable width D and the result of estimating whether the host vehicle M is perceived by the pedestrian P.
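A hypothetical numeric rule in the spirit of fig. 6 is sketched below. The patent gives no concrete speed values; the halving factors and the width margin are assumptions, and only the three inputs (legal speed VL, passable width D, perception result) come from the text.

```python
def avoidance_speed(VL, D, WM, perceived):
    """Hypothetical fig. 6-style rule: start from the legal speed VL,
    reduce it when the passable width D is tight, and reduce it again
    when the pedestrian has not perceived the host vehicle (the text
    requires traveling more slowly in that case).  The 0.5 factors
    and the 0.8 m margin are illustrative assumptions."""
    speed = VL
    if D < WM + 0.8:      # tight passable width: slow down
        speed *= 0.5
    if not perceived:     # unaware pedestrian: travel even slower
        speed *= 0.5
    return speed

v1 = avoidance_speed(40.0, 3.0, 1.8, perceived=True)    # wide gap, aware pedestrian
v2 = avoidance_speed(40.0, 2.4, 1.8, perceived=False)   # tight gap, unaware pedestrian
```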
Fig. 7 is a diagram showing an example of the avoidance trajectory generated by the avoidance trajectory generation unit 142. The avoidance trajectory generation unit 142 generates the avoidance trajectory based on the vehicle length LmA of the other vehicle mA recognized by the road narrowing factor recognition unit 134. When the passable width D is smaller than the threshold Th2, for example, the avoidance trajectory generation unit 142 generates an avoidance trajectory K1 (left diagram of fig. 7) that temporarily stops the host vehicle M before the other vehicle mA and waits until the pedestrian P finishes moving beside the other vehicle mA. The avoidance trajectory generation unit 142 may then generate an avoidance trajectory K2 (right diagram of fig. 7) that causes the host vehicle M to travel while avoiding the other vehicle mA after the pedestrian P has finished moving beside it. The second control unit 160 causes the host vehicle M to travel in accordance with the vehicle speed rule shown in fig. 6.
Fig. 8 is a diagram showing another example of the avoidance trajectory generated by the avoidance trajectory generation unit 142. When the passable width D is equal to or greater than the threshold Th1, for example, the avoidance trajectory generation unit 142 generates an avoidance trajectory K3 that allows the host vehicle M to travel beside the pedestrian P even while the pedestrian P is moving beside the other vehicle mA. The second control unit 160 causes the host vehicle M to travel in accordance with the vehicle speed rule shown in fig. 6.
Figs. 9 and 10 are diagrams for explaining a scene in which the road shoulder stationary object on the travel path LR is a large vehicle mB. The large vehicle mB has a longer vehicle length than the other vehicle mA. The prediction unit 136 predicts the period during which the large vehicle mB and the pedestrian P, which constitute the road narrowing factor, interfere with the host vehicle M, based on the recognition result of the vehicle width WmB and the vehicle length LmB of the large vehicle mB by the surrounding recognition unit 132. The avoidance trajectory generation unit 142 generates the avoidance trajectory K4 shown in fig. 10 based on the prediction result of the prediction unit 136.
[Process Flow]
Fig. 11 is a flowchart showing an example of the flow of the road narrowing factor avoidance processing in the vehicle system 1.
First, the surrounding recognition unit 132 recognizes the surroundings of the host vehicle M (step S100). Next, the road narrowing factor recognition unit 134 determines whether or not a road narrowing factor, that is, a road shoulder stationary object and a traffic participant, exists in the vicinity of the host vehicle M (step S102). When it determines that no road narrowing factor exists, the road narrowing factor recognition unit 134 ends the processing of this flowchart. When it determines that a road narrowing factor exists, the road narrowing factor recognition unit 134 recognizes the length of the road shoulder stationary object (step S104). Next, the road narrowing factor recognition unit 134 recognizes the moving speed of the traffic participant (step S106).
Next, the prediction unit 136 predicts the position of the traffic participant based on the recognition result of the road narrowing factor recognition unit 134 (step S108), and determines whether or not the traffic participant will move onto the travel path on which the host vehicle M is scheduled to travel (step S110). When it determines that the traffic participant will not move onto the travel path, the prediction unit 136 ends the processing of this flowchart. When it determines that the traffic participant will move onto the travel path, the prediction unit 136 predicts the period during which the traffic participant moves onto and stays on the travel path (step S112). Next, the avoidance trajectory generation unit 142 generates the avoidance trajectory of the host vehicle M based on the prediction result of the prediction unit 136 (step S114), and the processing of this flowchart ends.
Fig. 12 is a flowchart showing an example of the flow of the avoidance trajectory generation processing performed by the avoidance trajectory generation unit 142 based on the prediction result of the prediction unit 136. The flowchart shown in fig. 12 illustrates the processing of steps S108 to S114 of the flowchart of fig. 11 in more detail.
First, the prediction unit 136 sets an index related to the road narrowing factor based on the result of recognizing the length of the road shoulder stationary object by the road narrowing factor recognition unit 134 (step S200). Next, the prediction unit 136 predicts the time-series position of the traffic participant on the travel path, and predicts the period during which the traffic participant interferes (step S202). Next, the prediction unit 136 determines whether or not the avoidance period during which the host vehicle M avoids the other vehicle mA, which constitutes the road narrowing factor, overlaps the avoidance period during which the traffic participant avoids the other vehicle mA (step S204). If it determines that they do not overlap, the prediction unit 136 instructs the action plan generation unit 140 to regard only the other vehicle mA as the road narrowing factor and generate an avoidance trajectory in which the host vehicle M avoids the other vehicle mA (step S206). If it determines that they overlap, the prediction unit 136 regards both the other vehicle mA and the traffic participant as the road narrowing factor; it therefore estimates whether the traffic participant perceives the host vehicle M, and determines the degree of deceleration of the host vehicle M when traveling while avoiding the vicinity of the road narrowing factor based on the estimation result (step S208). Next, the prediction unit 136 determines whether or not the passable width D has become smaller than a predetermined value due to the road narrowing factor (step S210). When the passable width D is equal to or greater than the predetermined value, the avoidance trajectory generation unit 142 generates an avoidance trajectory in which the lateral avoidance control is changed based on the recognition result of the road narrowing factor recognition unit 134 (step S212).
When the passable width D is smaller than the predetermined value, the avoidance trajectory generation unit 142 generates a trajectory for making the host vehicle M slow down or stop and wait (step S214). After the processing of step S212 or step S214, the second control unit 160 performs acceleration/deceleration and steering control along the trajectory generated in step S212 or step S214 (step S216). The processing of this flowchart then ends.
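The branching of figs. 11 and 12 can be condensed into the sketch below, assuming the recognition and prediction results have already been reduced to the booleans and widths shown; the return labels are illustrative names keyed to the step numbers.

```python
def road_narrowing_avoidance(factor_exists, periods_overlap, D, predetermined):
    """Condensed sketch of the decision flow of figs. 11-12.
    factor_exists:   step S102 result (road narrowing factor present?)
    periods_overlap: step S204 result (host and pedestrian avoidance
                     periods overlap?)
    D, predetermined: step S210 comparison of the passable width."""
    if not factor_exists:                  # S102: nothing to avoid
        return "end"
    if not periods_overlap:                # S204: periods disjoint
        return "avoid_vehicle_only"        # S206
    if D >= predetermined:                 # S210: width still sufficient
        return "change_lateral_avoidance"  # S212
    return "slow_or_stop_and_wait"         # S214
```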
As described above, according to the present embodiment, the surrounding recognition unit 132 recognizes the surroundings of the host vehicle M, and the road narrowing factor recognition unit 134 determines whether or not a road narrowing factor exists on the road on which the host vehicle M is traveling and, if it determines that one exists, recognizes the length of the road narrowing factor in the traveling direction of the host vehicle M. The avoidance trajectory generation unit 142 then generates an avoidance trajectory corresponding to the recognition result of the road narrowing factor recognition unit 134, whereby interference between the road narrowing factor, that is, the road shoulder stationary object and the traffic participant, and the host vehicle M can be predicted.
[Hardware Structure]
Fig. 13 is a diagram showing an example of the hardware configuration of the various control devices according to the embodiment. As shown in the figure, each control device is configured such that a communication controller 100-1, a CPU 100-2, a RAM 100-3 used as a working memory, a ROM 100-4 storing a boot program and the like, a storage device 100-5 such as a flash memory or an HDD, a drive device 100-6, and the like are connected to each other via an internal bus or a dedicated communication line. The communication controller 100-1 communicates with components other than the vehicle control device 100. The storage device 100-5 stores a program 100-5a executed by the CPU 100-2. This program is loaded into the RAM 100-3 by a DMA (Direct Memory Access) controller (not shown) or the like and executed by the CPU 100-2, whereby some or all of the first control unit 120 and the second control unit 160 are realized.
The embodiments described above can be described as follows.
A vehicle control device is provided with:
a storage device storing a program; and
a hardware processor,
wherein the hardware processor executes the following processing by executing the program stored in the storage device:
recognizing the surrounding conditions of a vehicle;
controlling acceleration/deceleration and steering of the vehicle based on the surrounding conditions; and
determining whether a road narrowing factor exists on a road on which the vehicle is traveling and, if it is determined that the road narrowing factor exists, recognizing the length of the road narrowing factor in the traveling direction of the vehicle and generating an avoidance trajectory corresponding to the road narrowing factor based on the recognized length.
Specific embodiments of the present invention have been described above, but the present invention is not limited to such embodiments, and various modifications and substitutions can be made without departing from the scope of the present invention.
Claims (8)
1. A vehicle control apparatus, wherein
the vehicle control apparatus includes:
a surrounding recognition unit that recognizes a surrounding situation of a vehicle; and
a driving control unit that controls acceleration/deceleration and steering of the vehicle based on the surrounding situation recognized by the surrounding recognition unit,
the surrounding recognition unit determines whether or not a road narrowing factor exists on a road on which the vehicle is traveling and, if it determines that the road narrowing factor exists, recognizes a length of the road narrowing factor in a traveling direction of the vehicle,
the driving control unit generates an avoidance trajectory corresponding to the road narrowing factor based on the recognized length of the road narrowing factor in the traveling direction of the vehicle,
the road narrowing factor includes a road shoulder stationary object and one or more traffic participants predicted to interfere with a predetermined travel track of the vehicle by moving while avoiding the road shoulder stationary object,
the vehicle control apparatus further includes a prediction unit that predicts a time-series position of the traffic participant based on the result of recognizing, by the surrounding recognition unit, the length of the road narrowing factor in the traveling direction of the vehicle,
the prediction unit estimates whether the traffic participant perceives the vehicle, and
the driving control unit determines a degree of deceleration of the vehicle when traveling while avoiding the road narrowing factor, based on a passable width of a road on which the vehicle is scheduled to travel and a result of the estimation by the prediction unit.
2. The vehicle control apparatus according to claim 1, wherein
the prediction unit predicts a period during which the traffic participant interferes with the predetermined travel track of the vehicle based on the length of the road narrowing factor recognized by the surrounding recognition unit.
3. The vehicle control apparatus according to claim 2, wherein,
when the passable width of the road on which the vehicle is scheduled to travel is smaller than a predetermined width, the driving control unit causes the vehicle to travel slowly or stop before the road narrowing factor and wait until the period during which the road narrowing factor is determined to interfere with the predetermined travel track of the vehicle has passed.
4. The vehicle control apparatus according to claim 2, wherein,
when the passable width of the road on which the vehicle is scheduled to travel is equal to or greater than a predetermined width, the driving control unit changes lateral avoidance control based on the period during which the road narrowing factor interferes with the predetermined travel track of the vehicle.
5. The vehicle control apparatus according to claim 3, wherein,
when the passable width of the road on which the vehicle is scheduled to travel is equal to or greater than a predetermined width, the driving control unit changes lateral avoidance control based on the period during which the road narrowing factor interferes with the predetermined travel track of the vehicle.
6. The vehicle control apparatus according to any one of claims 1 to 5, wherein,
The driving control unit changes lateral avoidance control according to a time-series position of the road narrowing factor on a road on which the vehicle is scheduled to travel.
7. A vehicle control method, wherein,
The vehicle control method causes a computer to execute:
recognizing a surrounding situation of a vehicle;
controlling acceleration/deceleration and steering of the vehicle based on the surrounding situation;
determining whether a road narrowing factor exists on a road on which the vehicle is traveling and, if it is determined that the road narrowing factor exists, recognizing a length of the road narrowing factor in a traveling direction of the vehicle, wherein the road narrowing factor includes a road shoulder stationary object and one or more traffic participants predicted to interfere with a predetermined travel track of the vehicle by moving while avoiding the road shoulder stationary object;
generating an avoidance trajectory corresponding to the road narrowing factor based on the recognized length of the road narrowing factor in the traveling direction of the vehicle;
predicting a time-series position of the traffic participant based on the result of recognizing the length of the road narrowing factor in the traveling direction of the vehicle;
estimating whether the traffic participant perceives the vehicle; and
determining a degree of deceleration of the vehicle when traveling while avoiding the road narrowing factor, based on a passable width of a road on which the vehicle is scheduled to travel and a result of the estimation.
8. A storage medium storing a program, wherein,
The program causes a computer to execute:
recognizing a surrounding situation of a vehicle;
controlling acceleration/deceleration and steering of the vehicle based on the surrounding situation;
determining whether a road narrowing factor exists on a road on which the vehicle is traveling and, if it is determined that the road narrowing factor exists, recognizing a length of the road narrowing factor in a traveling direction of the vehicle, wherein the road narrowing factor includes a road shoulder stationary object and one or more traffic participants predicted to interfere with a predetermined travel track of the vehicle by moving while avoiding the road shoulder stationary object;
generating an avoidance trajectory corresponding to the road narrowing factor based on the recognized length of the road narrowing factor in the traveling direction of the vehicle;
predicting a time-series position of the traffic participant based on the result of recognizing the length of the road narrowing factor in the traveling direction of the vehicle;
estimating whether the traffic participant perceives the vehicle; and
determining a degree of deceleration of the vehicle when traveling while avoiding the road narrowing factor, based on a passable width of a road on which the vehicle is scheduled to travel and a result of the estimation.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019-124392 | 2019-07-03 | ||
JP2019124392A JP2021009653A (en) | 2019-07-03 | 2019-07-03 | Vehicle control device, vehicle control method, and program |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112172809A CN112172809A (en) | 2021-01-05 |
CN112172809B true CN112172809B (en) | 2024-06-25 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107458373A (en) * | 2016-06-03 | 2017-12-12 | 本田技研工业株式会社 | Travel controlling system |
JP2019098965A (en) * | 2017-12-04 | 2019-06-24 | スズキ株式会社 | Travel support device |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010020371A (en) * | 2008-07-08 | 2010-01-28 | Yazaki Corp | Vehicle control system |
JP4614005B2 (en) * | 2009-02-27 | 2011-01-19 | トヨタ自動車株式会社 | Moving locus generator |
JP6294342B2 (en) * | 2013-11-05 | 2018-03-14 | 株式会社日立製作所 | Autonomous mobile system |
WO2016024314A1 (en) * | 2014-08-11 | 2016-02-18 | 日産自動車株式会社 | Travel control device and method for vehicle |
JP6375770B2 (en) * | 2014-08-11 | 2018-08-22 | 日産自動車株式会社 | Travel control device and travel control method |
JP6457826B2 (en) * | 2015-01-30 | 2019-01-23 | 株式会社Subaru | Vehicle driving support device |
JP6532786B2 (en) * | 2015-08-07 | 2019-06-19 | 株式会社日立製作所 | Vehicle travel control device and speed control method |
JP6340037B2 (en) * | 2016-06-07 | 2018-06-06 | 株式会社Subaru | Vehicle travel control device |
JP6478415B2 (en) * | 2016-12-20 | 2019-03-06 | 本田技研工業株式会社 | Vehicle control system, vehicle control method, and vehicle control program |
RU2720226C1 (en) * | 2017-04-19 | 2020-04-28 | Ниссан Мотор Ко., Лтд. | Method of assisting in movement and device for assisting in movement |
JP6525401B2 (en) * | 2017-08-30 | 2019-06-05 | マツダ株式会社 | Vehicle control device |
2019
- 2019-07-03 JP JP2019124392A patent/JP2021009653A/en active Pending

2020
- 2020-06-30 CN CN202010616306.5A patent/CN112172809B/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107458373A (en) * | 2016-06-03 | 2017-12-12 | 本田技研工业株式会社 | Travel controlling system |
JP2019098965A (en) * | 2017-12-04 | 2019-06-24 | スズキ株式会社 | Travel support device |
Also Published As
Publication number | Publication date |
---|---|
CN112172809A (en) | 2021-01-05 |
JP2021009653A (en) | 2021-01-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111731321B (en) | Vehicle control device, vehicle control method, and storage medium | |
CN110239547B (en) | Vehicle control device, vehicle control method, and storage medium | |
CN109484404B (en) | Vehicle control device, vehicle control method, and storage medium | |
CN110060467B (en) | Vehicle control device | |
CN111201170B (en) | Vehicle control device and vehicle control method | |
CN110281920B (en) | Vehicle control device, vehicle control method, and storage medium | |
CN110053617B (en) | Vehicle control device, vehicle control method, and storage medium | |
CN111133489B (en) | Vehicle control device, vehicle control method, and storage medium | |
CN110341704B (en) | Vehicle control device, vehicle control method, and storage medium | |
CN110271542B (en) | Vehicle control device, vehicle control method, and storage medium | |
CN110271547B (en) | Vehicle control device, vehicle control method, and storage medium | |
CN112486161B (en) | Vehicle control device, vehicle control method, and storage medium | |
CN112677966B (en) | Vehicle control device, vehicle control method, and storage medium | |
CN113525409B (en) | Moving object control device, moving object control method, and storage medium | |
CN111183082A (en) | Vehicle control device, vehicle control method, and program | |
CN111301415B (en) | Vehicle control device, vehicle control method, and storage medium | |
CN110341703B (en) | Vehicle control device, vehicle control method, and storage medium | |
CN109559540B (en) | Periphery monitoring device, periphery monitoring method, and storage medium | |
CN113492845B (en) | Vehicle control device, vehicle control method, and storage medium | |
CN112677978B (en) | Prediction device, vehicle system, prediction method, and storage medium | |
CN112172805B (en) | Vehicle control device, vehicle control method, and storage medium | |
CN112141097B (en) | Vehicle control device, vehicle control method, and storage medium | |
CN112141108B (en) | Vehicle control device, vehicle control method, and storage medium | |
CN111688693B (en) | Vehicle control device, vehicle control method, and storage medium | |
CN112298171B (en) | Vehicle control device, vehicle control method, and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||