CN116424328A - Method for automatically starting a parallel assistance system of a vehicle based on the behavior of the driver - Google Patents
- Publication number
- CN116424328A (application CN202310513471.1A)
- Authority
- CN
- China
- Prior art keywords
- driver
- vehicle
- distance
- lane
- behavior
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/18—Propelling the vehicle
- B60W30/18009—Propelling the vehicle related to particular drive situations
- B60W30/18163—Lane change; Overtaking manoeuvres
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R21/00—Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
- B60R21/01—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
- B60R21/015—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting the presence or position of passengers, passenger seats or child seats, and the related safety parameters therefor, e.g. speed or timing of airbag inflation in relation to occupant position or seat belt use
- B60R21/01512—Passenger detection systems
- B60R21/01552—Passenger detection systems detecting position of specific human body parts, e.g. face, eyes or hands
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/143—Alarm means
Landscapes
- Engineering & Computer Science (AREA)
- Automation & Control Theory (AREA)
- Mechanical Engineering (AREA)
- Transportation (AREA)
- Human Computer Interaction (AREA)
- Traffic Control Systems (AREA)
Abstract
A method of automatically activating a merging assistance system of a vehicle based on driver behavior is disclosed. The method comprises: capturing a facial image of the driver in real time with a camera installed inside the vehicle; analyzing the captured facial image to obtain information related to the driver's facial features; determining, based on the obtained information, whether the driver exhibits a preset behavior; and, when the driver is determined to exhibit the preset behavior, automatically activating the merging assistance system of the vehicle to issue a warning notification.
Description
Technical Field
The present disclosure relates to methods, systems, devices, storage media, and program products for automatically starting a merging assistance system of a vehicle based on driver behavior.
Background
To improve driving safety, vehicles are commonly equipped with a merging assistance system. Such a system typically uses sensors mounted on, for example, the outside rear-view mirrors or the rear bumper to detect whether another vehicle or an obstacle is present within a certain range (for example, 50 meters) behind and to the side of the vehicle. When such a vehicle or obstacle is detected, the merging assistance system issues a warning, for example by lighting or blinking an LED indicator in the outside rear-view mirror, alerting the driver to traffic approaching from behind on the side and thereby improving driving safety. Currently, however, the merging assistance system is activated only when the vehicle's speed exceeds a certain threshold (for example, 60 km/h).
Disclosure of Invention
One aspect of the present disclosure provides a method for automatically activating a merging assistance system of a vehicle based on driver behavior, the method comprising: capturing a facial image of the driver in real time with a camera installed inside the vehicle; analyzing the captured facial image to obtain information related to the driver's facial features; determining, based on the obtained information, whether the driver exhibits a preset behavior; and, when the driver is determined to exhibit the preset behavior, automatically activating the merging assistance system of the vehicle to issue a warning notification.
Other features and advantages of the present disclosure will become apparent from the following description with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description, serve to explain, without limitation, the principles of the disclosure. Like reference numerals are used to denote like items throughout the various figures.
FIG. 1 is an exemplary block diagram of a system for automatically starting a merging assistance system of a vehicle based on driver behavior, according to an embodiment of the present disclosure.
Fig. 2 is an exemplary flowchart illustrating a method for automatically starting a merging assistance system of a vehicle based on a driver's behavior according to an embodiment of the present disclosure.
FIG. 3 is an exemplary block diagram of a system for automatically starting a merging assistance system of a vehicle based on driver behavior, according to another embodiment of the present disclosure.
Fig. 4 is an exemplary flowchart illustrating a method for automatically starting a merging assistance system of a vehicle based on a driver's behavior according to another embodiment of the present disclosure.
Fig. 5 illustrates a general hardware environment in which the present disclosure may be applied, according to an embodiment of the present disclosure.
Detailed Description
In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the described exemplary embodiments. It will be apparent, however, to one skilled in the art that the described embodiments may be practiced without some or all of these specific details. In the described exemplary embodiments, well known structures or processing steps have not been described in detail in order to avoid unnecessarily obscuring the concepts of the present disclosure.
The blocks within each block diagram shown below may be implemented by hardware, software, firmware, or any combination thereof to implement the principles of the present disclosure. It will be appreciated by those skilled in the art that the blocks described in each block diagram may be combined or divided into sub-blocks to implement the principles of the present disclosure.
The steps of the methods presented in this disclosure are intended to be illustrative. In some embodiments, the method may be accomplished with one or more additional steps not described and/or without one or more of the steps discussed. Furthermore, the order in which the steps of a method are illustrated and described is not intended to be limiting.
A merging assistance system is generally installed on a vehicle to improve driving safety. Such a system may use sensors mounted on, for example, the outside rear-view mirrors or the rear bumper to detect whether another vehicle or an obstacle is present within a certain range (for example, 50 meters) behind and to the side of the vehicle. When such a vehicle or obstacle is detected, the merging assistance system may issue a warning, for example by lighting or blinking an LED indicator in the outside rear-view mirror, to alert the driver to traffic approaching from behind on the side and thereby improve driving safety. Currently, the merging assistance system is activated only when the vehicle's speed exceeds a preset speed (for example, 60 km/h); when the vehicle travels below that preset speed, the system is not activated. This preset speed generally cannot be changed.
However, this speed-based activation scheme has drawbacks. For example, when the vehicle has not yet exceeded the preset speed, the merging assistance system is not active even though the driver may need to change lanes. Changing lanes without the merging assistance system active means the driver cannot know in real time whether another vehicle or obstacle is behind and to the side of the vehicle, so the driver may change lanes without noticing a vehicle there, increasing the risk of a collision and reducing driving safety.
In addition, when the merging assistance system is active, it may issue unnecessary warnings. For example, on a busy urban road, the system may frequently detect other vehicles within a certain range (for example, 30 meters) behind and to the side of the vehicle and therefore warn constantly, for example by continually blinking the LED indicator mounted in the outside rear-view mirror. This can disturb and distract the driver, interfering with normal driving and in turn reducing driving safety.
Accordingly, the present disclosure provides a new way to activate the merging assistance system of a vehicle. According to the disclosed method, the merging assistance system may be activated automatically based on the driver's behavior, regardless of the vehicle's speed; for example, it may be activated according to the driver's actual lane-change intent, thereby improving driving safety.
Next, a system for automatically starting a merging assist system of a vehicle based on a driver's behavior according to an embodiment of the present disclosure will be described first with reference to fig. 1.
Fig. 1 is an exemplary block diagram of a system 100 for automatically activating a merging assistance system of a vehicle based on driver behavior according to an embodiment of the present disclosure. As shown in fig. 1, in some embodiments the system 100 may include a facial image capturing unit 110, a facial image analysis unit 120, a determination unit 130, and a system activation unit 140. The facial image capturing unit 110 may capture a facial image of the driver in real time through a camera mounted inside the vehicle. The facial image analysis unit 120 may analyze the captured facial image to obtain information related to the driver's facial features. The determination unit 130 may determine, based on the obtained information, whether the driver exhibits a preset behavior. When the determination unit 130 determines that the driver exhibits the preset behavior, the system activation unit 140 may automatically activate the merging assistance system of the vehicle to issue a warning notification.
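For illustration only, the following is a minimal Python sketch of how the four units of the system 100 could be wired together as a single processing loop. The class name, the injected unit objects, and their method names (capture_frame, analyze, preset_behavior_present, activate_merging_assist) are assumptions made for this example and do not form part of the disclosure.

```python
class MergingAssistAutoActivator:
    """Non-limiting sketch of how units 110-140 of Fig. 1 could cooperate."""

    def __init__(self, capture_unit, analysis_unit, determination_unit, activation_unit):
        self.capture_unit = capture_unit              # facial image capturing unit 110
        self.analysis_unit = analysis_unit            # facial image analysis unit 120
        self.determination_unit = determination_unit  # determination unit 130
        self.activation_unit = activation_unit        # system activation unit 140

    def step(self):
        """One iteration: capture (S210), analyze (S220), determine (S230), activate (S240)."""
        frame = self.capture_unit.capture_frame()
        features = self.analysis_unit.analyze(frame)
        if self.determination_unit.preset_behavior_present(features):
            self.activation_unit.activate_merging_assist()
```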
The operation of the respective units as shown in fig. 1 will be described in further detail below.
Fig. 2 is an exemplary flowchart illustrating a method 200 for automatically starting a merging assistance system of a vehicle based on driver behavior, according to an embodiment of the present disclosure.
As shown in fig. 2, the method 200 begins at step S210. At step S210, the facial image capturing unit 110 may capture a facial image of the driver in real time through a camera mounted inside the vehicle. Specifically, the facial image capturing unit 110 may capture the facial image through a camera mounted on the interior rear-view mirror, a dashcam of the vehicle, a camera mounted on the steering wheel, or the like, and then transmit the captured image to the facial image analysis unit 120 described below. The camera on the interior rear-view mirror and the camera mounted on the steering wheel may be of any known type.
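As a non-limiting sketch of step S210, the snippet below reads frames from an in-cabin camera using OpenCV; it assumes OpenCV (cv2) is installed and that the camera is exposed to the operating system as video device 0. Any of the cameras mentioned above (interior rear-view mirror camera, dashcam, steering-wheel camera) could be substituted.

```python
import cv2  # OpenCV, assumed to be installed

def capture_driver_frames(device_index: int = 0):
    """Yield frames from the in-cabin camera in real time (step S210)."""
    cap = cv2.VideoCapture(device_index)
    if not cap.isOpened():
        raise RuntimeError(f"Cannot open camera {device_index}")
    try:
        while True:
            ok, frame = cap.read()  # one BGR image per iteration
            if not ok:
                break               # camera disconnected or stream ended
            yield frame
    finally:
        cap.release()
```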
Next, the method 200 proceeds to step S220. At step S220, the facial image analysis unit 120 may analyze the captured facial image received from the facial image capturing unit 110 to obtain information related to the driver's facial features, for example by applying known face recognition techniques.
In some embodiments, the information related to the driver's facial features obtained by the facial image analysis unit 120 may include one or more of the following: the position and/or orientation of the driver's head, the position and/or orientation of the driver's eyes, the rotational speed and/or rotational frequency of the driver's head, the movement speed and/or movement frequency of the driver's eyes, and the position and/or size of the driver's pupils. For example, the facial image analysis unit 120 may obtain information about the driver's head to determine its orientation (toward the right, toward the left, or facing forward), its rotational speed and frequency, and so on. Likewise, it may obtain information about the driver's eyes to determine their orientation (toward the right, toward the left, or facing forward), their movement speed and frequency, and so on. It should be understood that the above information is merely exemplary and not limiting; further facial-feature information may be obtained as needed.
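The disclosure does not prescribe a particular face-analysis algorithm. Purely as an illustration, the function below maps a head yaw angle, as it might be returned by some face-analysis library, to the coarse orientations used in the determination step (left, right, front). The 20-degree boundary and the sign convention are assumptions made for this example.

```python
def classify_head_orientation(head_yaw_deg: float, threshold_deg: float = 20.0) -> str:
    """Map a head yaw angle to the coarse orientation used by the determination unit.

    Assumption: negative yaw means the head is turned to the left,
    positive yaw means it is turned to the right.
    """
    if head_yaw_deg <= -threshold_deg:
        return "left"
    if head_yaw_deg >= threshold_deg:
        return "right"
    return "front"
```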
Next, the method 200 proceeds to step S230. At step S230, the determination unit 130 may determine, based on the facial-feature information obtained by the facial image analysis unit 120, whether the driver exhibits a preset behavior such as those described below. It should be understood that the preset behaviors described below are merely exemplary and not limiting.
In some embodiments, for example, the determination unit 130 may determine, based on the facial-feature information obtained by the facial image analysis unit 120, whether the duration for which the driver looks at an outside rear-view mirror reaches a predetermined threshold.
In some embodiments, for example, as described above, the facial image analysis unit 120 may obtain information about the orientation of the driver's head. The determination unit 130 may then determine, based on this information, whether the duration for which the driver looks at the outside rear-view mirror reaches a predetermined threshold. For example, if the head-orientation information indicates that the driver's head has been turned toward the front-left or front-right for a predetermined period (for example, 3 seconds), the determination unit 130 may determine that the duration reaches the predetermined threshold. The predetermined threshold may be set to 3 seconds as in this example, or to a different value as needed.
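A minimal sketch of this duration check is given below. The 3-second threshold and the use of head orientation alone follow the example above; the class structure and the reset-on-return-to-front behavior are assumptions made for this illustration.

```python
class GazeDurationMonitor:
    """Track how long the driver's head has been turned toward a side mirror."""

    def __init__(self, duration_threshold_s: float = 3.0):
        self.duration_threshold_s = duration_threshold_s
        self._turned_since_s = None  # time at which the head last turned away from "front"

    def update(self, orientation: str, now_s: float) -> bool:
        """Return True once the head has stayed turned left or right for the threshold."""
        if orientation in ("left", "right"):
            if self._turned_since_s is None:
                self._turned_since_s = now_s
            return (now_s - self._turned_since_s) >= self.duration_threshold_s
        self._turned_since_s = None  # head back to front: reset the timer
        return False
```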
In addition to the head-orientation information described above, information about the driver's eyes may also be obtained. In some embodiments, for example, the facial image analysis unit 120 may obtain information about the orientation of the driver's eyes, and the determination unit 130 may determine, based on that information, whether the duration for which the driver looks at the outside rear-view mirror reaches the predetermined threshold. For example, if the eye-orientation (specifically, eyeball-orientation) information indicates that the driver's eyes have been directed toward the front-left or front-right for a predetermined period (for example, 3 seconds), the determination unit 130 may determine that the duration reaches the predetermined threshold.
It is to be understood that the above examples are illustrative only and not limiting; whether the duration for which the driver looks at the outside rear-view mirror reaches the predetermined threshold may also be determined based on other facial features of the driver.
In some embodiments, the determination unit 130 may determine, based on the facial-feature information obtained by the facial image analysis unit 120, whether the following situation exists: the duration for which the driver looks at the outside rear-view mirror does not reach the above predetermined threshold, but the frequency with which the driver looks at the outside rear-view mirror reaches another predetermined threshold within a certain period of time.
In some embodiments, for example, as described above, the facial image analysis unit 120 may obtain information about the orientation and rotational frequency of the driver's head. The determination unit 130 may first determine, based on the head-orientation information, whether the duration for which the driver looks at the outside rear-view mirror reaches the predetermined threshold. If it does not, the determination unit 130 may further determine, based on the head rotational-frequency information, whether the frequency with which the driver looks at the outside rear-view mirror reaches another predetermined threshold within a certain period of time.
For example, if the head-orientation information indicates that the driver's head has not been turned toward the front-left or front-right for the predetermined period (for example, 3 seconds), the determination unit 130 may determine that the duration does not reach the predetermined threshold. If, in that case, the head rotational-frequency information further indicates that the driver has turned the head toward the front-left or front-right a predetermined number of times (for example, 3 times) within a certain period (for example, 5 seconds), the determination unit 130 may determine that the frequency with which the driver looks at the outside rear-view mirror reaches the other predetermined threshold within that period. Based on these two determinations, the determination unit 130 may conclude that the following situation exists: the duration for which the driver looks at the outside rear-view mirror does not reach the predetermined threshold, but the frequency with which the driver looks at the outside rear-view mirror reaches the other predetermined threshold within a certain period of time. The other predetermined threshold may be set to 3 times as in this example, or to a different value as needed.
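A minimal sketch of this frequency check is given below. The 5-second window and the threshold of 3 glances follow the example above; counting a glance as a transition from front to left or right is an assumption made for this illustration.

```python
from collections import deque

class GlanceFrequencyMonitor:
    """Count short glances toward a side mirror within a sliding time window."""

    def __init__(self, window_s: float = 5.0, count_threshold: int = 3):
        self.window_s = window_s
        self.count_threshold = count_threshold
        self._glance_times = deque()
        self._previous = "front"

    def update(self, orientation: str, now_s: float) -> bool:
        """Return True when the number of glances in the window reaches the threshold."""
        # A glance starts when the head turns away from "front".
        if orientation in ("left", "right") and self._previous == "front":
            self._glance_times.append(now_s)
        self._previous = orientation
        # Drop glances that have fallen out of the sliding window.
        while self._glance_times and now_s - self._glance_times[0] > self.window_s:
            self._glance_times.popleft()
        return len(self._glance_times) >= self.count_threshold
```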
Similarly, whether this situation exists may also be determined based on information about the orientation and movement frequency of the driver's eyes; the determination process is analogous to the head-based process described above and is not repeated here. It should be understood that these examples are illustrative only and not limiting; whether the situation exists may also be determined based on other facial features of the driver.
The method 200 then proceeds to step S240. At step S240, when the determination unit 130 determines that the preset behavior described above exists, the system activation unit 140 may automatically activate the merging assistance system of the vehicle to issue a warning notification.
In some embodiments, for example, when the determination unit 130 determines that the duration for which the driver looks at the outside rear-view mirror reaches the above predetermined threshold, the merging assistance system of the vehicle may be activated automatically so that a warning notification is issued through it, for example by lighting or blinking an LED indicator mounted in the outside rear-view mirror. Note that this warning notification issued by the merging assistance system is distinct from, and issued separately from, the warning performed by the warning unit 330 described later.
In some embodiments, for example, the merging assistance system of the vehicle may be activated automatically, so that a warning notification is issued through it, when the determination unit 130 determines that the following situation exists: the duration for which the driver looks at the outside rear-view mirror does not reach the predetermined threshold, but the frequency with which the driver looks at the outside rear-view mirror reaches the other predetermined threshold within a certain period of time.
With the method 200 described above, the merging assistance system of the vehicle can be activated automatically based on the driver's behavior (for example, the duration for which the driver looks at the outside rear-view mirror and/or the frequency with which the driver looks at it within a period of time), regardless of the vehicle's speed. In this way, even if the vehicle has not exceeded the preset speed, the merging assistance system is activated automatically whenever the determination unit 130 determines that the driver exhibits the preset behavior, so that the driver can be alerted in time to vehicles approaching from behind on the side, collisions are avoided as far as possible, and driving safety is improved.
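Combining the two checks sketched above, steps S230 and S240 could be reduced to a single decision per captured frame, as in the non-limiting sketch below; how the merging assistance system is actually switched on is vehicle-specific and is therefore left to the caller. In use, such a function would be called once per frame with the current time taken from a monotonic clock, and the system would be activated the first time it returns True.

```python
def preset_behavior_present(orientation, now_s, duration_monitor, frequency_monitor):
    """Sketch of step S230: either preset behavior counts.

    duration_monitor and frequency_monitor are instances of the
    GazeDurationMonitor and GlanceFrequencyMonitor sketched above.
    """
    sustained_look = duration_monitor.update(orientation, now_s)
    repeated_glances = frequency_monitor.update(orientation, now_s)
    return sustained_look or repeated_glances
```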
An exemplary block diagram of a system for automatically activating a merging assistance system of a vehicle based on driver behavior according to another embodiment of the present disclosure is described below with reference to fig. 3. This system corresponds to the system 100 in fig. 1 and may include, in addition to the facial image capturing unit 110, the facial image analysis unit 120, the determination unit 130, and the system activation unit 140 shown in fig. 1, a distance information acquisition unit 310, a lane-change operation monitoring unit 320, and a warning unit 330. In fig. 3, parts identical to those in fig. 1 are denoted by the same reference numerals, and duplicate descriptions are omitted.
As shown in fig. 3, in some embodiments the system 100 may further include a distance information acquisition unit 310, a lane-change operation monitoring unit 320, and a warning unit 330. The distance information acquisition unit 310 may acquire information about the distance to vehicles in the surrounding lanes. The lane-change operation monitoring unit 320 may monitor whether the driver performs a lane-change operation. When the lane-change operation monitoring unit 320 detects a lane-change operation by the driver, the warning unit 330 may issue warnings of different levels based on the distance information acquired by the distance information acquisition unit 310.
The operation of the respective units as shown in fig. 3 will be described in further detail below.
Fig. 4 is an exemplary flowchart illustrating a method 400 for automatically starting a merging assistance system of a vehicle based on a driver's behavior according to another embodiment of the present disclosure.
As shown in fig. 4, the method 400 begins at step S410. At step S410, the distance information acquisition unit 310 may acquire information about the distance to vehicles in the surrounding lanes, for example by means of cameras mounted on the vehicle body, distance sensors, or a combination thereof. It should be understood that these ways of acquiring the distance information are merely exemplary and not limiting.
Next, the method 400 proceeds to step S420. At step S420, the lane-change operation monitoring unit 320 may monitor whether the driver performs a lane-change operation. In some embodiments, the lane-change operation monitoring unit 320 may monitor whether the driver switches on the left or right turn signal of the vehicle; if so, it may determine that a lane-change operation by the driver is present.
In addition, in some embodiments, the lane-change operation monitoring unit 320 may also monitor, through a microphone or the like in the vehicle, whether a voice command indicating a turn is given. For example, if it detects a voice command such as "turn left" or "turn right", it may determine that a lane-change operation is present. Such a voice command may be issued by an occupant, for example. It should be understood that the lane-change operations described above are merely exemplary and not limiting.
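For illustration only, the check performed in step S420 could look like the sketch below, assuming the turn-signal state is read from the vehicle bus and the recognized phrase comes from an in-cabin speech recognizer; both signal sources and the phrase list are assumptions made for this example.

```python
from typing import Optional

def lane_change_operation_present(turn_signal_on: bool,
                                  recognized_phrase: Optional[str] = None) -> bool:
    """Non-limiting sketch of step S420: detect a lane-change operation."""
    turn_phrases = {"turn left", "turn right", "merge left", "merge right"}
    if turn_signal_on:
        return True
    return recognized_phrase is not None and recognized_phrase.strip().lower() in turn_phrases
```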
Next, the method 400 proceeds to step S430. At step S430, when the lane-change operation monitoring unit 320 detects a lane-change operation by the driver, the warning unit 330 may issue warnings of different levels based on the distance information acquired by the distance information acquisition unit 310.
In some embodiments, for example, the warning unit 330 may issue a first-level warning when the lane-change operation monitoring unit 320 detects a lane-change operation as described above and the acquired distance information indicates that the distance to a vehicle behind and to the side in a surrounding lane is smaller than a first distance and larger than a second distance, where the first distance is greater than the second distance. For example, the warning unit 330 may issue the first-level warning when the driver switches on a turn signal and the distance to the vehicle behind and to the side in the surrounding lane is, for example, less than 100 meters and greater than 50 meters. It should be understood that these specific values of the first and second distances are merely exemplary and not limiting and may be changed according to actual requirements.
In some embodiments, the first-level warning may include, for example, broadcasting a preset prompt through the vehicle's voice system, such as "There is a vehicle in the surrounding lane; please merge with caution", to alert the driver audibly to the vehicle approaching from behind on the side. In some embodiments, the first-level warning may also include displaying a warning image on a display inside the vehicle, for example an exclamation-mark symbol or a phrase such as "merge with caution", to alert the driver visually. It should be understood that the broadcast and display manners and contents described above are merely exemplary and not limiting; they may be changed as needed, as long as the driver is alerted in time to the vehicle approaching from behind on the side.
In some embodiments, for example, the warning unit 330 may issue a second-level warning when the lane-change operation monitoring unit 320 detects a lane-change operation as described above and the acquired distance information indicates that the distance to the vehicle behind and to the side in the surrounding lane is smaller than the second distance, for example less than 50 meters.
In some embodiments, the second-level warning may include, for example, increasing the rotational resistance of the steering wheel, which makes the steering wheel harder to turn and thereby alerts the driver through touch to the vehicle approaching from behind on the side. The second-level warning may also include, for example, applying vibration to the seat belt or the steering wheel, likewise alerting the driver through touch. It should be understood that these examples of the second-level warning are merely illustrative and not limiting. Because the second-level warning alerts the driver through touch rather than hearing and/or sight, it is more direct and more quickly noticed than the first-level warning, and is therefore better suited to situations where the vehicle behind and to the side in the surrounding lane is relatively close.
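The selection between the two warning levels can be summarized by the non-limiting sketch below, which uses the 100-meter and 50-meter example values from above as defaults and assumes that a lane-change operation has already been detected.

```python
from enum import Enum

class WarningLevel(Enum):
    NONE = 0
    FIRST = 1   # e.g. voice prompt and/or warning image
    SECOND = 2  # e.g. steering-wheel resistance and/or vibration

def select_warning_level(distance_m: float,
                         first_distance_m: float = 100.0,
                         second_distance_m: float = 50.0) -> WarningLevel:
    """Sketch of step S430: choose a warning level from the side-rear distance."""
    if distance_m < second_distance_m:
        return WarningLevel.SECOND
    if distance_m < first_distance_m:
        return WarningLevel.FIRST
    return WarningLevel.NONE
```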
In some embodiments, the first-level warning may be issued at the same time as the second-level warning. For example, while the rotational resistance of the steering wheel is being increased, a preset prompt may be broadcast through the vehicle's voice system, reminding the driver both audibly and through touch of the vehicle approaching from behind on the side.
In some embodiments, the first and second distances described above may be adjusted automatically based on the vehicle's current speed. For example, when the vehicle is traveling fast (for example, 100 km/h), the following distance should be kept greater than, for example, 100 meters, so the first distance may be adjusted to 100 meters and the second distance to 50 meters. When the vehicle is traveling more slowly (for example, 60 km/h), the following distance should be kept greater than, for example, 60 meters, so the first distance may be adjusted to 60 meters and the second distance to 30 meters. It should be understood that these examples are illustrative only and not limiting; different adjustments may be made according to the actual vehicle speed and/or actual requirements, so that the driver can be alerted in time to vehicles approaching from behind on the side.
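The speed-based adjustment can be sketched as follows. Only the two example operating points above (100 km/h giving 100 m and 50 m, 60 km/h giving 60 m and 30 m) come from the description; extending them into a simple "one meter per km/h, halved for the second distance" rule, and the 30-meter floor, are assumptions made for this illustration.

```python
def adjust_distance_thresholds(speed_kmh: float):
    """Return (first_distance_m, second_distance_m) for the current vehicle speed."""
    first_distance_m = max(speed_kmh, 30.0)     # roughly one meter of headway per km/h
    second_distance_m = first_distance_m / 2.0  # matches both example operating points
    return first_distance_m, second_distance_m
```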
With the method 400 described above, warnings of different levels can be issued based on the distance to vehicles in the surrounding lanes whenever a lane-change operation by the driver is detected. In this way the driver can be alerted in time, and in a manner matched to the distance: when the distance is large, the driver may be alerted gently, for example by a voice broadcast and/or the display of a warning image; when the distance is small, the driver may be alerted more forcefully, for example by increasing the rotational resistance of the steering wheel and/or applying vibration to the seat belt or steering wheel. The method 400 thus prompts the driver to change lanes carefully, avoids collisions as far as possible, and improves driving safety.
In addition to the methods 200 and 400 described above, a method according to the present disclosure may further include determining, when the determination unit 130 determines that the driver does not exhibit the preset behavior described above, whether a voice command indicating a turn is recognized, for example through the vehicle's voice system. If such a voice command is recognized, the system activation unit 140 automatically activates the merging assistance system of the vehicle. In some embodiments, this recognition may be performed by the determination unit 130 or by another processing unit. The voice command indicating a turn may be, for example, an utterance by an occupant such as "merge left" or "merge right".
As described above, the present disclosure provides a system and method for automatically activating a merging assistance system of a vehicle based on driver behavior. With the disclosed system and method, the merging assistance system can be activated automatically based on the driver's behavior, that is, according to the driver's actual lane-change intent, regardless of the vehicle's speed. Unlike the conventional activation scheme, the merging assistance system is therefore activated automatically whenever the driver exhibits the preset behavior, even if the vehicle has not exceeded the preset speed, so that the driver is alerted in time to vehicles approaching from behind on the side and can change lanes carefully, avoiding collisions as far as possible and improving driving safety. Furthermore, with the disclosed system and method, warnings of different levels can be issued, based on the distance to vehicles in the surrounding lanes, whenever a lane-change operation by the driver is detected, so that the driver is alerted in a manner matched to that distance, further improving driving safety.
Hardware implementation
Fig. 5 illustrates a general hardware environment 500 in which the present disclosure may be applied, according to an exemplary embodiment of the present disclosure.
With reference to fig. 5, a computing device 500 will now be described as an example of a hardware device applicable to aspects of the present disclosure. Computing device 500 may be any machine configured to perform processes and/or calculations and may be, but is not limited to, a workstation, a server, a desktop computer, a laptop computer, a tablet computer, a personal digital assistant, a smart phone, a portable camera, or any combination thereof. The system 100 described above may be implemented in whole or at least in part by a computing device 500 or similar device or system.
Software elements may reside in the working memory 514, including but not limited to an operating system 516, one or more application programs 518, drivers, and/or other data and code. Instructions for performing the methods and steps described above may be included in the one or more application programs 518, and the system 100 described above may be implemented by one or more processors 504 reading and executing the instructions of the one or more application programs 518. More specifically, the facial image capturing unit 110 may be implemented, for example, by the processor 504 executing an application 518 having instructions to perform step S210. Likewise, the facial image analysis unit 120 may be implemented by the processor 504 executing an application 518 having instructions to perform step S220; the determination unit 130, step S230; the system activation unit 140, step S240; the distance information acquisition unit 310, step S410; the lane-change operation monitoring unit 320, step S420; and the warning unit 330, step S430. Executable code or source code of the instructions of the software elements may be stored in a non-transitory computer-readable storage medium, such as the storage device(s) 510 described above, and may be read into the working memory 514 and possibly compiled and/or installed. Executable code or source code of the instructions of the software elements may also be downloaded from a remote location.
From the above embodiments, it will be apparent to those skilled in the art that the present disclosure may be implemented by software together with the necessary hardware, or by hardware, firmware, and so on. Based on this understanding, embodiments of the present disclosure may be implemented, at least in part, in software. The computer software may be stored in a computer program product and/or a computer-readable storage medium, such as a floppy disk, hard disk, optical disc, or flash memory. The computer software comprises a series of instructions that cause a computer (for example, a personal computer, a server, or a network terminal) to perform a method, or part of a method, according to the various embodiments of the present disclosure.
Having thus described the present disclosure, it is clear that the present disclosure can be varied in a number of ways. Such variations are not to be regarded as a departure from the spirit and scope of the present disclosure, and all such modifications as would be obvious to one skilled in the art are intended to be included within the scope of the following claims.
Claims (14)
1. A method for automatically activating a merging assistance system of a vehicle based on driver behavior, the method comprising:
capturing a facial image of the driver in real time with a camera installed inside the vehicle;
analyzing the captured facial image to obtain information related to the driver's facial features;
determining, based on the obtained information related to the driver's facial features, whether the driver exhibits a preset behavior; and
automatically activating the merging assistance system of the vehicle to issue a warning notification when it is determined that the driver exhibits the preset behavior.
2. The method according to claim 1, wherein determining whether the driver exhibits a preset behavior based on the obtained information related to the driver's facial features comprises:
determining, based on the obtained information related to the driver's facial features, whether the duration for which the driver looks at an outside rear-view mirror reaches a predetermined threshold, and
wherein the merging assistance system of the vehicle is automatically activated when it is determined that the duration for which the driver looks at the outside rear-view mirror reaches the predetermined threshold.
3. The method according to claim 2, wherein determining whether the driver exhibits a preset behavior based on the obtained information related to the driver's facial features further comprises:
determining, based on the obtained information related to the driver's facial features, whether the following situation exists: the duration for which the driver looks at the outside rear-view mirror does not reach the predetermined threshold, and the frequency with which the driver looks at the outside rear-view mirror reaches another predetermined threshold within a certain period of time, and
wherein the merging assistance system of the vehicle is automatically activated when it is determined that the situation exists.
4. The method according to any one of claims 1 to 3, wherein the obtained information related to the driver's facial features comprises one or more of: the position and/or orientation of the driver's head, the position and/or orientation of the driver's eyes, the rotational speed and/or rotational frequency of the driver's head, the movement speed and/or movement frequency of the driver's eyes, and the position and/or size of the driver's pupils.
5. The method according to any one of claims 1 to 3, further comprising:
acquiring information about a distance to a vehicle in a surrounding lane;
monitoring whether the driver performs a lane-change operation; and
when a lane-change operation by the driver is detected, issuing warnings of different levels based on the acquired information about the distance to the vehicle in the surrounding lane.
6. The method of claim 5, wherein issuing warnings of different levels based on the acquired information about the distance to the vehicle in the surrounding lane when a lane-change operation by the driver is detected comprises:
issuing a first-level warning when a lane-change operation by the driver is detected and the acquired information indicates that the distance to a vehicle behind and to the side in the surrounding lane is smaller than a first distance and larger than a second distance, and
issuing a second-level warning when a lane-change operation by the driver is detected and the acquired information indicates that the distance to the vehicle behind and to the side in the surrounding lane is smaller than the second distance,
wherein the first distance is greater than the second distance.
7. The method of claim 6, wherein
the first-level warning comprises broadcasting a preset prompt through a voice system inside the vehicle and/or displaying a warning image on a display inside the vehicle, and
the second-level warning comprises increasing the rotational resistance of the steering wheel and/or applying vibration to the seat belt or the steering wheel.
8. The method of claim 5, wherein the lane-change operation comprises switching on a turn signal of the vehicle and/or issuing a voice command indicating a turn.
9. The method of claim 6, wherein,
the first distance and the second distance are automatically adjusted based on a current travel speed of the vehicle.
10. The method of claim 1, further comprising:
determining, when the driver does not exhibit the preset behavior, whether a voice command indicating a turn is recognized through a voice system inside the vehicle; and
automatically activating the merging assistance system of the vehicle if it is determined that a voice command indicating a turn is recognized through the voice system inside the vehicle.
11. A system for automatically activating a merging assistance system of a vehicle based on driver behavior, comprising:
means for performing the method according to any one of claims 1-10.
12. An apparatus for automatically activating a merging assistance system of a vehicle based on driver behavior, comprising:
at least one processor; and
at least one storage device storing instructions that, when executed by the at least one processor, cause the at least one processor to perform the method of any one of claims 1-10.
13. A non-transitory computer-readable storage medium storing instructions which, when executed by a processor, cause performance of the method recited in any one of claims 1-10.
14. A program product storing a program which when executed by a processor causes the method according to any one of claims 1-10 to be performed.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310513471.1A CN116424328A (en) | 2023-05-09 | 2023-05-09 | Method for automatically starting a parallel assistance system of a vehicle based on the behavior of the driver |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310513471.1A CN116424328A (en) | 2023-05-09 | 2023-05-09 | Method for automatically starting a parallel assistance system of a vehicle based on the behavior of the driver |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116424328A true CN116424328A (en) | 2023-07-14 |
Family
ID=87087360
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310513471.1A Pending CN116424328A (en) | 2023-05-09 | 2023-05-09 | Method for automatically starting a parallel assistance system of a vehicle based on the behavior of the driver |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116424328A (en) |
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009020570A (en) * | 2007-07-10 | 2009-01-29 | Denso Corp | Vehicle traveling support apparatus |
CN102991504A (en) * | 2012-12-07 | 2013-03-27 | 长安大学 | Device and method for determining and prewarning lane change safety of drivers |
WO2017139916A1 (en) * | 2016-02-15 | 2017-08-24 | 吴伟民 | Vehicle lane-change assistant driving method and system |
CN205737520U (en) * | 2016-05-23 | 2016-11-30 | 杭州谱地新能源科技有限公司 | A kind of doubling aid system for automobile |
CN106935077A (en) * | 2017-03-17 | 2017-07-07 | 合肥工业大学 | One safety pre-warning system with auxiliary of parking is detected based on vehicle blind spot |
CN110533958A (en) * | 2018-05-24 | 2019-12-03 | 上海博泰悦臻电子设备制造有限公司 | Vehicle lane change based reminding method and system |
CN210116437U (en) * | 2019-04-19 | 2020-02-28 | 北京汽车股份有限公司 | Doubling auxiliary device for vehicle |
CN112428954A (en) * | 2020-11-09 | 2021-03-02 | 江苏大学 | Adaptive-adjustment doubling auxiliary device and control method |
CN114312785A (en) * | 2021-12-15 | 2022-04-12 | 上汽大众汽车有限公司 | Lane changing auxiliary system and method for vehicle |
Non-Patent Citations (2)
Title |
---|
Cao Jiangwei, "Introduction to Intelligent and Connected Vehicles" (智能网联汽车概论), Jilin University Press, 31 August 2021, page 108 *
Liu Xianchu, "Introduction to Automotive Engineering" (汽车工程学引论), Tongji University Press, 31 January 2017, page 212 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2523173B1 (en) | Driver assisting system and method for a motor vehicle | |
JP5082834B2 (en) | Aside look detection device and method, and program | |
JP6516298B2 (en) | Information display device | |
JP5880740B1 (en) | Collision avoidance system and collision avoidance method | |
US10752172B2 (en) | System and method to control a vehicle interface for human perception optimization | |
CN108528333A (en) | A kind of lane change auxiliary alarm system | |
US10632924B1 (en) | Blind-spot monitoring using machine vision and precise FOV information | |
CN112119438A (en) | Vehicle rear-lateral-direction warning device and vehicle rear-lateral-direction warning method | |
KR20180107512A (en) | System and method for preventing collision of door of vehicle | |
KR101241861B1 (en) | System and Method for intelligent monitoring using proximity sensor and in vehicle camera | |
JP7180421B2 (en) | vehicle controller | |
CN108099906B (en) | Collision early warning system installed on automobile and automobile | |
CN116424328A (en) | Method for automatically starting a parallel assistance system of a vehicle based on the behavior of the driver | |
KR20120062215A (en) | Vehicle rear side waring apparatus for driver | |
CN116674463A (en) | Prompting method and device for rear-end collision early warning, storage medium and electronic device | |
KR20170069645A (en) | System of preventing pedestrian accident and method performing thereof | |
JP5120893B2 (en) | Safe driving support device | |
CN113276765B (en) | Driving warning method and system and computer program product | |
CN114572232A (en) | Attention assistance for dynamic blind zones accompanying the state of the driver during driving | |
KR20110131899A (en) | Apparatus for warning collision risk at vehicle turn by using camera and control method thereof | |
KR20210038791A (en) | Back warning apparatus for older driver and method thereof | |
KR20180039838A (en) | Alarm controlling device of vehicle and method thereof | |
KR20240049006A (en) | Apparatus for warining driver distraction and method thereof | |
KR100566272B1 (en) | System for warning drowsy driving | |
CN220809278U (en) | Vehicle door closing warning system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||