CN104620298B - Systems and methods for coordinating sensor operation for collision detection - Google Patents

Systems and methods for coordinating sensor operation for collision detection

Info

Publication number
CN104620298B
Authority
CN
China
Prior art keywords
land vehicle
collision
sensing
vehicle
clause
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201380046869.3A
Other languages
Chinese (zh)
Other versions
CN104620298A
Inventor
Jeffrey A. Bowers
Geoffrey F. Deane
Roderick A. Hyde
Nathan Kundtz
Nathan P. Myhrvold
David R. Smith
Clarence T. Tegreene
Lowell L. Wood, Jr.
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Elwha LLC
Original Assignee
Elwha LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 13/544,770 (granted as US 9,165,469 B2)
Priority claimed from US 13/544,757 (granted as US 9,558,667 B2)
Priority claimed from US 13/544,799 (granted as US 9,000,903 B2)
Application filed by Elwha LLC filed Critical Elwha LLC
Publication of CN104620298A
Application granted granted Critical
Publication of CN104620298B
Status: Expired - Fee Related

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/16: Anti-collision systems
    • G08G 1/161: Decentralised systems, e.g. inter-vehicle communication
    • G08G 1/163: Decentralised systems, e.g. inter-vehicle communication, involving continuous checking
    • G08G 1/165: Anti-collision systems for passive traffic, e.g. including static obstacles, trees
    • G08G 1/166: Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

A collision detection system of a land vehicle may be configured to coordinate sensor operation with one or more other sensing systems of one or more other land vehicles. Coordination may comprise configuring the other sensing systems. In some embodiments, the coordination comprises forming a multistatic sensor that includes one or more transmitters and/or one or more receivers. The collision detection system may be configured to receive detection signals transmitted by the one or more other sensing systems. Coordination may further comprise directing the detection signals of the multistatic sensor. The collision detection system may use sensor data acquired by the coordinated sensing systems to generate a collision detection model.

Description

Systems and methods for coordinating sensor operation for collision detection
Cross-reference to related applications
This application is related to and/or claims the benefit of the earliest available effective filing date(s) of the application(s) listed below (the "Priority Applications"), if any (e.g., it claims the earliest available priority dates for other than provisional patent applications, or claims benefit under 35 USC § 119(e) for provisional patent applications, for any and all parent, grandparent, great-grandparent, etc. applications of the Priority Applications). In addition, the present application is related to the Related Applications, if any.
Priority applications
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. Patent Application No. 13/544,757, entitled "Systems and Methods for Cooperative Collision Detection," naming Jeffrey A. Bowers, Geoffrey F. Deane, Roderick A. Hyde, Nathan Kundtz, Nathan P. Myhrvold, David R. Smith, Clarence T. Tegreene, and Lowell L. Wood, Jr. as inventors, attorney docket no. 0510-035-001-000000, filed July 9, 2012, which is currently co-pending or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. Patent Application No. 13/544,799, entitled "Systems and Methods for Vehicle Monitoring," naming Jeffrey A. Bowers, Geoffrey F. Deane, Roderick A. Hyde, Nathan Kundtz, Nathan P. Myhrvold, David R. Smith, Clarence T. Tegreene, and Lowell L. Wood, Jr. as inventors, attorney docket no. 0510-035-003-000000, filed July 9, 2012, which is currently co-pending or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
The United States Patent and Trademark Office (USPTO) has published a notice to the effect that the USPTO's computer programs require patent applicants to reference a serial number and to indicate whether an application is a continuation, continuation-in-part, or divisional of a parent application. Stephen G. Kunin, Benefit of Prior-Filed Application, USPTO Official Gazette, March 18, 2003. The USPTO further provides Application Data Sheet forms that allow automatic loading of bibliographic data but require that each application be identified as a continuation, continuation-in-part, or divisional of a parent application. The present applicant entity (hereinafter "Applicant") has provided above a specific reference to the application(s) from which priority is being claimed, as required by statute. Applicant understands that the statute is unambiguous in its specific reference language and does not require either a serial number or any characterization, such as "continuation" or "continuation-in-part," for claiming priority to U.S. patent applications. Notwithstanding the foregoing, Applicant understands that the USPTO's computer programs have certain data-entry requirements, and hence Applicant has provided designation(s) of the relationship between the present application and its parent application(s) as set forth above and in any ADS filed in this application, but expressly points out that such designation(s) are not to be construed in any way as any type of commentary and/or admission as to whether or not the present application contains any new matter in addition to the matter of its parent application(s).
All subject matter of the Priority Applications and the Related Applications, and of any and all parent, grandparent, great-grandparent, etc. applications of the Priority Applications and the Related Applications, including any priority claims, is incorporated herein by reference to the extent such subject matter is not inconsistent herewith.
Technical field
This disclosure relates to systems and methods for coordinating sensor operation for collision detection.
Summary
A vehicle may comprise a collision detection system configured to detect potential collisions involving the vehicle and/or other objects in the vicinity of the vehicle. Objects may include, but are not limited to: pedestrians, animals, vehicles, road obstructions, roadway features (e.g., barriers, bridge abutments), and the like. The collision detection system may be configured to acquire sensor data from a sensing system of the vehicle and/or sensing systems of one or more other vehicles. The collision detection system may use the acquired sensor data to detect potential collisions. Detecting a potential collision may comprise accessing a collision detection model generated using the acquired sensor data. As used herein, a "collision detection model" refers to a model of the motion of objects in the vicinity of the vehicle. The collision detection model may further include object position, orientation, size, and so on. In some embodiments, the collision detection model further comprises estimates of object weight, maneuverability, and the like. The collision detection model may include the kinematics of objects relative to a particular frame of reference, such as relative position, velocity, acceleration, closing rate, direction, and so on. The collision detection model may be translated between the frames of reference used by the collision detection systems of different vehicles. The collision detection model may be generated, in part, by the collision detection system of the vehicle. Alternatively, the collision detection model (and/or portions thereof) may be generated by other vehicles.
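For illustration only (this sketch is not part of the patent disclosure, and every class and field name below is an assumption), a collision detection model of the kind just described could be represented roughly as follows:

```python
from dataclasses import dataclass, field

@dataclass
class TrackedObject:
    """One object in the collision detection model (illustrative fields only)."""
    object_id: str
    position: tuple      # (x, y) in the host vehicle's frame, meters
    velocity: tuple      # (vx, vy) relative velocity, m/s
    acceleration: tuple  # (ax, ay), m/s^2
    heading: float       # radians, relative to the host heading
    size: tuple          # (length, width), meters
    weight_estimate: float = 0.0    # kg, optional estimate
    maneuverability: float = 0.0    # 0..1, optional estimate

@dataclass
class CollisionDetectionModel:
    """Collection of tracked objects expressed in one frame of reference."""
    frame_of_reference: str                        # e.g., "host-vehicle" or "GPS"
    objects: dict = field(default_factory=dict)    # object_id -> TrackedObject

    def update(self, obj: TrackedObject) -> None:
        self.objects[obj.object_id] = obj

model = CollisionDetectionModel(frame_of_reference="host-vehicle")
model.update(TrackedObject("car-103", (24.0, 3.5), (-6.0, 0.0), (0.0, 0.0),
                           heading=0.0, size=(4.5, 1.8)))
```

Each cooperating vehicle would maintain such a model in its own frame of reference and translate entries when exchanging them with other vehicles.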
The collision detection system may be configured to acquire sensor data from one or more sources, including, but not limited to: the sensing system of the collision detection system, the sensing systems of other vehicles, and/or other external sources. In some embodiments, the collision detection system uses sensor data acquired from the one or more sources to determine the motion characteristics of objects. The collision detection system may combine sensor data to refine the motion characteristics of objects and to determine object position, orientation, size, and so on. The collision detection system may use the acquired sensor data to generate a collision detection model. The collision detection system may cooperate with other vehicles to share collision detection data (e.g., sensor data), to share collision detection models, and so on.
The collision detection system may be further configured to acquire auxiliary data from one or more other vehicles. The auxiliary data may comprise "self-knowledge," such as vehicle size, orientation, position, and velocity. The auxiliary data may comprise processed sensor data, such as speedometer readings, positioning system information, timing information, and so on. In some embodiments, the collision detection system may use the auxiliary data to combine sensor data and/or to generate the collision detection model.
In some embodiments, the collision detection system may not comprise a sensing system and may rely on sensor data acquired by other vehicles to detect potential collisions. Alternatively, or in addition, the collision detection system may combine sensor data acquired using an internal sensing system with sensor data acquired from one or more external sources (e.g., other vehicles). Combining sensor data may comprise: transforming the sensor data into a suitable coordinate system and/or frame of reference; correcting the sensor data; weighting the sensor data; and so on, as described above.
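As a loose sketch of the combining step described above, assuming a planar (2-D) world and simple inverse-variance weighting (neither of which the disclosure mandates, and none of the function names come from the patent), transforming and merging a remote measurement might look like this:

```python
import math

def to_host_frame(remote_xy, remote_pose):
    """Transform a point reported by another vehicle into the host frame.
    remote_pose = (x, y, heading) of the reporting vehicle in the host frame."""
    rx, ry, heading = remote_pose
    px, py = remote_xy
    cos_h, sin_h = math.cos(heading), math.sin(heading)
    return (rx + cos_h * px - sin_h * py,
            ry + sin_h * px + cos_h * py)

def fuse(measurements):
    """Inverse-variance weighted average of (value, variance) pairs."""
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    return sum(w * val for w, (val, _) in zip(weights, measurements)) / total

# Example: combine a local and a remote range estimate to the same object.
local = (24.8, 0.5 ** 2)      # 24.8 m, sigma = 0.5 m
remote = (25.3, 1.0 ** 2)     # 25.3 m, sigma = 1.0 m
print(fuse([local, remote]))  # weighted toward the more precise local reading
```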
The collision detection system may be further configured to coordinate sensor operation. In some embodiments, the collision detection system may coordinate sensor operation with other sensing systems to form a combined sensing system. The combined sensing system may comprise sensors of two or more vehicles. The combined sensing system may comprise one or more of: a multistatic sensor, a bistatic sensor, a monostatic sensor, and the like. The collision detection system may configure the sensing system to operate as a passive sensor (e.g., receiving detection signals transmitted from other vehicles), an active sensor (e.g., transmitting detection signals that are received at other vehicles), and/or a combination of active and passive operation.
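A toy illustration of one possible role assignment for such a combined sensing system; the policy shown (one vehicle illuminates, the remaining vehicles listen) is an assumption for illustration, not the patent's method:

```python
from enum import Enum

class SensorRole(Enum):
    TRANSMIT = "transmit"
    RECEIVE = "receive"
    TRANSMIT_AND_RECEIVE = "transmit_and_receive"

def assign_roles(vehicle_ids):
    """Hypothetical policy: the first vehicle acts as the active illuminator
    while the others operate passively, together forming a multistatic sensor."""
    if not vehicle_ids:
        return {}
    roles = {vehicle_ids[0]: SensorRole.TRANSMIT_AND_RECEIVE}
    for vid in vehicle_ids[1:]:
        roles[vid] = SensorRole.RECEIVE
    return roles

print(assign_roles(["vehicle-102", "vehicle-103", "vehicle-104"]))
```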
The collision detection system may be configured to store monitoring data on a persistent storage device. Alternatively, or in addition, the collision detection system may transmit the monitoring data to one or more network-accessible servers. The monitoring data may comprise data pertaining to the motion of the vehicle (and/or the operation of the vehicle) before, during, and after a collision. The monitoring data may comprise sensor data, collision detection model data, and so on. The monitoring data may comprise time and/or location references, auxiliary data, vehicle identification information, and so on. The monitoring data may be secured so that the authenticity and/or source of the monitoring data can be verified.
A network-accessible server may be configured to aggregate monitoring data from a plurality of vehicles. The network-accessible server may index and/or arrange the monitoring data by time, location, vehicle identity, and so on. The network-accessible server may provide access to the monitoring data to one or more requesters via a network. Access to the monitoring data may be predicated on an exchange, such as a payment, a bid, access to the requester's own monitoring data, or the like.
Description of the drawings
Fig. 1 depicts one embodiment of a collision detection system;
Fig. 2A depicts another embodiment of a cooperative collision detection system;
Fig. 2B depicts another embodiment of a cooperative collision detection system;
Fig. 2C depicts another embodiment of a cooperative collision detection system;
Fig. 3 is a flow diagram of one embodiment of a method for coordinating collision detection;
Fig. 4 is a flow diagram of another embodiment of a method for coordinating collision detection;
Fig. 5A depicts one embodiment of a collision detection system configured to coordinate sensor operation;
Fig. 5B depicts another embodiment of a collision detection system configured to coordinate sensor operation;
Fig. 6 depicts one embodiment of a collision detection system configured to coordinate sensor operation and/or share sensor data;
Fig. 7 depicts another embodiment of a collision detection system configured to coordinate sensor operation and/or share sensor data;
Fig. 8 is a flow diagram of one embodiment of a method for coordinating the operation of a sensing system;
Fig. 9 is a flow diagram of another embodiment of a method for coordinating the operation of a sensing system;
Fig. 10 is a block diagram of one embodiment of a monitoring service;
Fig. 11 is a flow diagram of one embodiment of a method for providing a monitoring service; and
Fig. 12 is a flow diagram of another embodiment of a method for providing a monitoring service.
Detailed description
Some of the infrastructure that can be used with embodiments disclosed herein is already available, such as: general-purpose computers, RFID tags, radio-frequency antennas and associated readers, cameras and associated image processing components, microphones and associated audio processing components, computer programming tools and techniques, digital storage media, and communication networks. A computing device may include a processor, such as a microprocessor, microcontroller, logic circuitry, or the like. The processor may include a special-purpose processing device, such as an application-specific integrated circuit (ASIC), programmable array logic (PAL), programmable logic array (PLA), programmable logic device (PLD), field-programmable gate array (FPGA), or other customizable and/or programmable device. The computing device may also include a machine-readable storage device, such as non-volatile memory, static RAM, dynamic RAM, ROM, CD-ROM, disk, tape, magnetic memory, optical memory, flash memory, or another machine-readable storage medium.
Aspects of certain embodiments may be implemented using hardware, software, firmware, or a combination thereof. As used herein, a software module or component may include any type of computer instruction or computer-executable code located within or on a machine-readable storage medium. A software module may, for example, comprise one or more physical or logical blocks of computer instructions, which may be organized as a routine, program, object, component, data structure, or the like, that performs one or more tasks or implements particular abstract data types.
In some embodiments, a particular software module may comprise disparate instructions stored in different locations of a machine-readable storage medium, which together implement the functionality of the module. Indeed, a module may comprise a single instruction or many instructions, and may be distributed over several different code segments, among different programs, and across several machine-readable storage media. Some embodiments may be practiced in a distributed computing environment, where tasks are performed by remote processing devices linked through a communications network.
In the exemplary embodiments depicted in the drawings, the size, shape, orientation, placement, configuration, and/or other characteristics of tags, computing devices, advertisements, cameras, antennas, microphones, and mobile devices are merely illustrative. Specifically, mobile devices, computing devices, tags, and associated components may be manufactured at very small sizes and need not be as prominent as depicted in the drawings. Moreover, image, audio, and RF tags may be much smaller than illustrated and may be placed less intrusively and/or configured differently from those depicted in the drawings.
The embodiments of the disclosure will be best understood by reference to the drawings, wherein like parts are designated by like numerals throughout. The components of the disclosed embodiments, as generally described and illustrated in the figures herein, could be arranged and designed in a wide variety of different configurations. Furthermore, the features, structures, and operations associated with one embodiment may be applied to, or combined with, the features, structures, or operations described in conjunction with another embodiment. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of this disclosure.
Thus, the following detailed description of the embodiments of the systems and methods of the disclosure is not intended to limit the scope of the disclosure, as claimed, but is merely representative of possible embodiments. In addition, the steps of a method do not necessarily need to be executed in any specific order, or even sequentially, nor do the steps need to be executed only once.
A vehicle may comprise a collision detection system configured to detect potential collisions involving the vehicle and/or other objects in the vicinity of the vehicle. Objects may include, but are not limited to: pedestrians, animals, vehicles, road obstructions, roadway features, and the like. The collision detection system may be configured to acquire sensor data from a sensing system of the vehicle and/or the sensing systems of one or more other vehicles. The collision detection system may use the acquired sensor data to detect potential collisions. Detecting a potential collision may comprise accessing a collision detection model generated using the acquired sensor data. As used herein, a "collision detection model" refers to a model of the motion of objects in the vicinity of the vehicle. The collision detection model may further include object position, orientation, size, and so on. In some embodiments, the collision detection model further comprises estimates of object weight, maneuverability, and the like. The collision detection model may include the kinematics of objects relative to a particular frame of reference, such as relative position, velocity, acceleration, closing rate, and direction. The collision detection model may be translated between the frames of reference used by the collision detection systems of different vehicles. The collision detection model may be generated, in part, by the collision detection system of the vehicle. Alternatively, the collision detection model (and/or portions thereof) may be generated by other vehicles.
The collision detection system may be configured to acquire sensor data from one or more sources, including, but not limited to: the sensing system of the collision detection system, the sensing systems of other vehicles, and/or other external sources. In some embodiments, the collision detection system uses sensor data acquired from the one or more sources to determine the motion characteristics of objects. The collision detection system may combine sensor data to refine and/or determine motion information pertaining to objects, such as object acceleration, velocity, position, orientation, size, and so on. The collision detection system may use the acquired sensor data to generate a collision detection model. The collision detection system may cooperate with other vehicles to share collision detection data (e.g., sensor data), to share collision detection models, and so on.
The collision detection system may be further configured to acquire auxiliary data from one or more other vehicles. The auxiliary data may comprise "self-knowledge," such as vehicle size, orientation, position, and velocity. The auxiliary data may comprise processed sensor data, such as speedometer readings, positioning system information, timing information, and so on. In some embodiments, the collision detection system may use the auxiliary data to combine sensor data and/or to generate the collision detection model.
In some embodiments, the collision detection system may not comprise a sensing system and may rely on sensor data acquired by other vehicles to detect potential collisions. Alternatively, or in addition, the collision detection system may combine sensor data acquired using an internal sensing system with sensor data acquired from one or more external sources (e.g., other vehicles). Combining sensor data may comprise: transforming the sensor data into a suitable coordinate system and/or frame of reference; correcting the sensor data; weighting the sensor data; and so on, as described above.
The collision detection system may be further configured to coordinate sensor operation. In some embodiments, the collision detection system may coordinate sensor operation with other sensing systems to form a combined sensing system. The combined sensing system may comprise sensors of two or more vehicles. The combined sensing system may comprise one or more of: a multistatic sensor, a bistatic sensor, a monostatic sensor, and the like. The collision detection system may configure the sensing system to operate as a passive sensor (e.g., receiving detection signals transmitted from other vehicles), an active sensor (e.g., transmitting detection signals that are received at other vehicles), and/or a combination of active and passive operation.
The collision detection system may be configured to store monitoring data on a persistent storage device. Alternatively, or in addition, the collision detection system may transmit the monitoring data to one or more network-accessible servers. The monitoring data may comprise data pertaining to the motion of the vehicle (and/or the operation of the vehicle) before, during, and after a collision. The monitoring data may comprise sensor data, collision detection model data, and so on. The monitoring data may comprise time and/or location references, auxiliary data, vehicle identification information, and so on. The monitoring data may be secured so that the authenticity and/or source of the monitoring data can be verified.
A network-accessible server may be configured to aggregate monitoring data from a plurality of vehicles. The network-accessible server may index and/or arrange the monitoring data by time, location, vehicle identity, and so on. The network-accessible server may provide access to the monitoring data to one or more requesters via a network. Access to the monitoring data may be predicated on an exchange, such as a payment, a bid, access to the requester's own monitoring data, or the like.
Fig. 1 is a block diagram 100 depicting one embodiment of a collision detection system 101. The collision detection system 101 may be deployed within a land vehicle 102, such as a car, truck, bus, or the like. The collision detection system 101 may comprise a sensing system 110, a processing module 120, a communication module 130, a vehicle interface module 140, a storage module 150, and a coordination module 160. The sensing system 110 may be configured to acquire information pertaining to objects within a detection range 112 of the vehicle 102. The processing module 120 may use the information acquired by the sensing system 110 (and/or other sources of sensor data) to detect potential collisions. Detecting a potential collision may comprise: identifying the objects involved in the potential collision, determining a time frame of the collision (e.g., a time to collision), and so on. The communication module 130 may be used to communicate with other vehicles (e.g., vehicle 103 and/or vehicle 104), emergency services entities, a network 132, a network-accessible server 154, and so on. The storage module 150 may be used to store the configuration of the collision detection system 101, the operating conditions of the vehicle 102, and/or information surrounding a collision. The coordination module 160 may be configured to coordinate the operation of the collision detection system 101 and/or the sensing system 110 with other vehicles 103, 104.
The sensing system 110 may be configured to acquire information pertaining to objects that may pose a collision risk to the vehicle 102 (and/or other vehicles 103, 104). The sensing system 110 may be further configured to acquire information pertaining to the operation of the vehicle 102, such as orientation, position, velocity, and acceleration. In some embodiments, the sensing system 110 is configured to acquire kinematic information. As used herein, kinematics refers to the motion characteristics of an object; kinematic information may include, but is not limited to: velocity, acceleration, direction, and so on. Kinematic information may be expressed in any suitable coordinate system and/or frame of reference. Accordingly, kinematic information may be represented as component values, vectors, or the like in a Cartesian coordinate system, a polar coordinate system, or the like. In addition, kinematic information may be relative to a particular frame of reference; for example, kinematic information may comprise the direction, position, velocity, acceleration (e.g., closing rate), and so on of an object relative to the direction, position, velocity, and/or acceleration of a particular vehicle 102, 103, and/or 104.
The sensing system 110 may comprise one or more active and/or passive sensors, which may include, but are not limited to: one or more electromagnetic sensing systems (e.g., radar sensing systems, capacitive sensing systems, etc.), electro-optical sensing systems (e.g., laser sensing systems, light detection and ranging (LIDAR) systems, etc.), acoustic sensing systems, ultrasonic sensing systems, magnetic sensing systems, imaging systems (e.g., cameras, image processing systems, stereo cameras, etc.), and the like. The collision detection system 101 may further comprise sensors for determining the kinematics of the vehicle 102 (e.g., "self-knowledge"). Accordingly, the sensing system 110 may comprise one or more speedometers, accelerometers, gyroscopes, positioning receivers (e.g., Global Positioning System (GPS) receivers), wireless network interfaces, and the like. Alternatively, or in addition, the collision detection system 101 may comprise (or be communicatively coupled to) a control system 105 of the vehicle 102. As used herein, a vehicle "control system" refers to a system for providing control inputs to a vehicle, such as steering, braking, acceleration, and so on. The collision detection system 101 may comprise portions of the vehicle control system 105, such as sensors for determining velocity, acceleration, braking performance (e.g., an anti-lock braking system), and so on. The collision detection system 101 may be further configured to monitor inputs to the control system 105 in order to predict changes to the kinematics of the vehicle (e.g., predicting changes in acceleration based on operator-controlled throttle and/or braking inputs). Although particular examples of sensing systems are provided herein, the disclosure is not limited in this regard and could incorporate any sensing system 110 comprising any type and/or number of sensors.
The sensing system 110 may be configured to provide sensor data to, and/or receive sensor data from, other vehicles 103, 104. In some embodiments, the sensing system 110 may coordinate sensor operation with other vehicles; for example, the sensing system 110 may act as a transmitter for one or more other sensing systems (not shown), and/or vice versa.
The sensing system 110 may acquire information pertaining to objects within a detection range 112 of the vehicle 102. As used herein, the "detection range" of the sensing system 110 refers to the range within which the sensing system 110 can acquire (and/or is configured to acquire) object information. The detection range 112 of the sensing system 110 may refer to the detection envelope of the sensing system 110. In some embodiments, the detection range 112 is more restricted than the maximum detection range of the sensing system 110 (the maximum range at which the sensing system 110 can reliably acquire object information). The detection range 112 may be set by user configuration and/or may be determined automatically according to the operating conditions of the vehicle 102, such as vehicle speed and/or direction, the speed of other objects, weather conditions, and so on. For example, the detection range 112 may be reduced in response to the vehicle 102 traveling at lower speeds and may be expanded in response to the vehicle 102 traveling at higher speeds. Similarly, the detection range 112 may be based on the motion characteristics of other objects in the vicinity of the vehicle 102. For example, the detection range 112 may be expanded in response to detecting another vehicle 103 traveling at a high speed relative to the vehicle 102, even if the vehicle 102 itself is traveling at a low speed.
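One hypothetical way to derive such an adaptive detection range from host speed and the fastest observed closing speed is sketched below; the constants and function name are invented for illustration and are not taken from the disclosure:

```python
def detection_range(host_speed_mps, max_relative_speed_mps=0.0,
                    base_range_m=30.0, seconds_of_headway=3.0,
                    max_range_m=200.0):
    """Scale the detection range with the host speed and with the fastest
    closing speed observed nearby (all numbers are assumed)."""
    dominant_speed = max(host_speed_mps, max_relative_speed_mps)
    desired = base_range_m + dominant_speed * seconds_of_headway
    return min(desired, max_range_m)

print(detection_range(13.4))                                # ~70 m at city speed
print(detection_range(13.4, max_relative_speed_mps=40.0))   # expanded for a fast-closing vehicle
```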
In some embodiments, the sensing system 110 may comprise directional sensors (e.g., beam-forming radar, phased arrays, etc.). The collision detection system 101 may shape and/or steer the detection range 112 of the sensing system 110 in response to operating conditions. For example, when the vehicle 102 is moving forward at high speed, the detection range 112 may be oriented toward the front of the vehicle 102; when the vehicle 102 is turning, the detection range 112 may be adjusted in the direction of the turn; and so on.
The collision detection system 101 may coordinate with other vehicles by use of the communication module 130. The communication module 130 may include, but is not limited to, one or more of: a wireless network interface, a cellular data interface, a satellite communication interface, an electro-optical network interface (e.g., an infrared communication interface), and the like. The communication module 130 may be configured to communicate in point-to-point, "ad hoc" networks and/or infrastructure networks 132, and to communicate over IP networks (e.g., the Internet, local area networks, wide area networks, etc.).
In some embodiments, the collision detection system 101 may be configured to coordinate with other vehicles (e.g., with other sensing systems and/or other collision detection systems). Coordination may comprise acquiring sensor data from other entities (e.g., other vehicles 103, 104) and/or providing sensor data acquired by the sensing system 110 to other entities. Coordination may further comprise sharing collision detection data, portions of the collision detection model 122, collision detection results and/or alerts, and so on.
Coordination may allow the collision detection system 101 to acquire sensor data pertaining to regions outside the detection range 112 of the sensing system 110 (e.g., to extend the detection range 112 of the collision detection system). Similarly, the collision detection system 101 may acquire sensor data pertaining to regions that the sensing system 110 cannot access (e.g., regions occluded by other objects). For example, as depicted in Fig. 1, the position of vehicle 103 may prevent the sensing system 110 from reliably acquiring sensor data pertaining to region 125. The collision detection system 101 may acquire sensor data pertaining to region 125 from another source, such as the sensing system 113 of vehicle 103 and/or the sensing system 114 of vehicle 104. As described below, coordinating sensor data may further comprise determining and/or refining kinematic information (e.g., vector components), and determining and/or refining object position (e.g., by triangulating sensor data), size, angular extent, angular dependence, orientation, and so on.
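As an example of the triangulation mentioned above, two cooperating sensors that each report only a bearing to the same object can fix its position. The following 2-D sketch assumes both bearings are already expressed in a shared frame of reference; it is an illustration, not the patent's algorithm:

```python
import math

def triangulate(p1, bearing1, p2, bearing2):
    """Locate an object from two bearing measurements taken at known sensor
    positions p1 and p2 (2-D, bearings in radians from the x-axis).
    Returns None if the bearings are (nearly) parallel."""
    d1 = (math.cos(bearing1), math.sin(bearing1))
    d2 = (math.cos(bearing2), math.sin(bearing2))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        return None
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t = (dx * d2[1] - dy * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Two vehicles at known positions each report a bearing to the same object.
print(triangulate((0.0, 0.0), math.radians(45), (10.0, 0.0), math.radians(135)))
# -> approximately (5.0, 5.0)
```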
The collision detection system 101 may be further configured to provide sensor data acquired by the sensing system 110 to other entities, such as the vehicles 103, 104. The collision detection system 101 may make the sensor data available via the communication module 130 (e.g., the sensor data may be broadcast). Alternatively, or in addition, the collision detection system 101 may provide the sensor data (and/or other information pertaining to the collision detection system 101) in response to requests from other entities (e.g., via a point-to-point communication mechanism).
In some embodiments, the collision detection system 101, and in particular the coordination module 160, may be configured to coordinate operation with other entities. For example, the sensing system 110 may be capable of acquiring reliable, accurate information pertaining to objects within a particular region 127, but may be incapable of reliably acquiring information pertaining to objects within other regions (e.g., region 125). The collision detection system 101 may coordinate with the other sensing systems 113 and/or 114 to provide those sensing systems 113, 114 with sensor data pertaining to objects within region 127. In exchange, the other sensing systems 113, 114 may provide the collision detection system 101 with sensor data pertaining to objects within other regions, such as region 125. Such coordination may comprise the collision detection system 101 configuring the detection range 112 of the sensing system 110 (e.g., by beam forming, steering, or the like) to acquire information pertaining to region 127, excluding other regions for which information will be provided by the sensing systems 113, 114.
In some embodiments, the collision detection system 101 may coordinate sensor operation and/or configuration with the other sensing systems 113, 114. As described in further detail below, the coordination module 160 may configure the sensing system 110 to: act as a transmitter for the other sensing systems 113, 114 (e.g., in a bistatic and/or multistatic sensor configuration); act as a receiver to detect sensor signals transmitted by one or more of the other sensing systems 113, 114; act as a combined transmitter/receiver in conjunction with the other sensing systems 113, 114; and so on.
The collision detection system 101 may further comprise a processing module 120, which may use the information acquired by the sensing system 110 (and/or acquired from other sources) to detect potential collisions. The processing module 120 may comprise one or more processors, including, but not limited to: general-purpose microprocessors, microcontrollers, logic circuitry, ASICs, FPGAs, PALs, PLDs, PLAs, and the like. The processing module 120 may further comprise volatile memory, a persistent machine-readable storage medium 152, and so on. The persistent machine-readable storage medium 152 may comprise machine-readable storage configured to cause the processing module 120 to: operate and/or configure the sensing system 110, coordinate with other collision detection systems (e.g., by way of the communication and/or coordination modules 130, 160), detect potential collisions, and so on, as described herein.
The processing module 120 may be configured to detect potential collisions. The processing module 120 may detect potential collisions using information acquired from any number of sources, including, but not limited to: sensor data acquired by the sensing system 110; sensor data acquired from, and/or in coordination with, other sensing systems (e.g., the sensing systems 113, 114); collision detection data acquired from other collision detection systems; information received via the communication module 130 (e.g., from public safety entities, weather services, or the like); and so on.
The processing module 120 may detect potential collisions using any suitable detection technique. In some embodiments, the processing module 120 detects potential collisions using a collision detection model 122. As used herein, a "collision detection model" refers to a model of object kinematics. The collision detection model may include, but is not limited to: object size, position, orientation, velocity, acceleration (e.g., closing rate), angular extent, angular dependence, and so on. The kinematics of the collision detection model may be relative to the vehicle 102 (e.g., relative velocity, acceleration, etc.). Alternatively, the collision detection model may incorporate the kinematics of the vehicle 102 and/or may be defined within another frame of reference (e.g., a GPS location, the frame of reference of another vehicle 103, 104). The processing module 120 may use the collision detection model 122 to extrapolate and/or predict object kinematics, which may be indicative of a potential collision between objects (e.g., an intersection of objects within the collision detection model), the time of the potential collision, the impact velocity of the potential collision, the forces involved in the potential collision, the likely outcome of the collision, and so on.
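A minimal sketch of how such a model could be extrapolated to flag a potential collision, assuming constant relative velocity over the prediction horizon (a simplification the disclosure does not require; the function name is invented):

```python
def time_to_closest_approach(rel_pos, rel_vel):
    """Time (s) at which two objects with constant relative velocity are
    closest, and the miss distance (m) at that time. 2-D, host-relative."""
    px, py = rel_pos
    vx, vy = rel_vel
    speed_sq = vx * vx + vy * vy
    if speed_sq == 0.0:
        return 0.0, (px * px + py * py) ** 0.5
    t = -(px * vx + py * vy) / speed_sq
    t = max(t, 0.0)                      # a past closest approach is not a threat
    cx, cy = px + vx * t, py + vy * t
    return t, (cx * cx + cy * cy) ** 0.5

# Object 40 m ahead, closing at 10 m/s with a 1 m lateral offset:
t, miss = time_to_closest_approach((40.0, 1.0), (-10.0, 0.0))
print(t, miss)   # ~4.0 s to closest approach, ~1.0 m miss distance
```

A potential collision could then be flagged whenever the predicted miss distance falls below the combined extent of the two objects within some time horizon.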
The collision detection model 122 may further comprise information pertaining to current operating conditions, such as road conditions, visibility, and so on. For example, the collision detection model 122 may comprise information pertaining to the condition of the operating surface (e.g., the road), such as whether the road is muddy, wet, icy, snow-covered, and so on. The processing module 120 may use the current operating condition information to estimate the probability (and/or capacity) of an object maneuvering to avoid a potential collision (e.g., by turning, decelerating, etc.).
In some embodiments, the collision detection model 122 may further comprise estimated information. For example, the collision detection model 122 may comprise estimates of object size, weight, and so on. The estimated information may be used to determine object momentum and other characteristics, which may be used to determine the potential outcome of a collision (e.g., object kinematics after the potential collision has occurred). For example, in the Fig. 1 example, the collision detection system 101 may determine the potential outcome of a collision between the vehicles 103 and 104, which may include the estimated kinematics of the vehicles 103, 104 after the potential collision has occurred.
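For example, treating the potential collision as perfectly inelastic gives a crude momentum-based estimate of the post-collision motion; the masses and speeds below are invented, and this simplification is only one of many possible outcome models:

```python
def post_collision_velocity(m1, v1, m2, v2):
    """Common velocity after a perfectly inelastic (plastic) collision,
    from conservation of momentum; a deliberately crude outcome estimate."""
    vx = (m1 * v1[0] + m2 * v2[0]) / (m1 + m2)
    vy = (m1 * v1[1] + m2 * v2[1]) / (m1 + m2)
    return (vx, vy)

# Estimated 1500 kg car at 20 m/s striking an estimated 2500 kg truck at rest:
print(post_collision_velocity(1500.0, (20.0, 0.0), 2500.0, (0.0, 0.0)))
# -> (7.5, 0.0): both vehicles continue at ~7.5 m/s in the original direction
```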
The collision detection model 122 may further comprise collision avoidance information, which may include instructions on how to avoid potential collisions detected by the processing module 120. The collision avoidance information may pertain to the vehicle 102 and/or other vehicles 103, 104. For example, the collision avoidance information may comprise information for avoiding a potential collision between the vehicles 103 and 104. The collision avoidance information may further comprise information for preventing the vehicle 102 from becoming involved in the collision (e.g., for avoiding the potential outcome of the collision).
The collision detection system 101 may be configured to take one or more actions in response to detecting a potential collision. Such actions may include, but are not limited to: issuing an alert of the potential collision to the operator of the vehicle 102; determining collision avoidance actions; determining the potential outcome of the collision (e.g., estimating object kinematics after the collision); determining actions to avoid the potential outcome; automatically taking one or more collision avoidance actions; transmitting the collision detection model 122 (and/or portions thereof) to other vehicles; coordinating a response to the potential collision with other vehicles; contacting emergency services entities; and so on.
The coordination module 160 may make portions of the collision detection model 122 available to other vehicles 103, 104 (via the communication module 130). Alternatively, or in addition, the coordination module 160 may be configured to receive collision detection data from the other vehicles 103, 104. The collision detection data may comprise sensor data, collision detection models (and/or portions thereof), vehicle kinematics, collision detection results, avoidance information, and so on.
The collision detection system 101 may comprise, and/or be communicatively coupled to, human-machine interface components 107 of the vehicle 102. The human-machine interface components 107 may include, but are not limited to: visual display components (e.g., display screens, heads-up displays, or the like), audio components (e.g., a vehicle audio system, speakers, or the like), haptic components (e.g., power steering controls, force feedback systems, or the like), and so on.
The collision detection system 101 may use the human-machine interface components 107 to alert the operator of the vehicle 102 to a potential collision. The alert may comprise one or more of: an audible alert (e.g., an alarm), a visual alert, a haptic alert, or the like. In some embodiments, the alert may comprise collision avoidance instructions to assist the operator in avoiding the potential collision (and/or the potential outcome of a collision involving other vehicles). The avoidance instructions may be provided as one or more audible instructions, visual cues (e.g., displayed on a heads-up display), haptic stimuli, or the like. For example, collision avoidance instructions may be conveyed audibly through the vehicle speaker system (e.g., an instruction to "turn left"), conveyed visually by icons on a display interface (e.g., a turn icon, a braking icon, a release-brake icon, etc.), and/or conveyed by haptic feedback (e.g., vibrating a surface, actuating a control input, etc.). Although particular examples of alerts are described herein, the disclosure is not limited in this regard and could be adapted to incorporate any suitable human-machine interface components 107.
As discussed above, the collision detection system 101 may be configured to take one or more automatic collision avoidance actions in response to detecting a potential collision. Collision avoidance actions may include, but are not limited to: accelerating, decelerating, turning, activating vehicle systems (e.g., lighting systems, the horn, etc.), and so on. Accordingly, the collision detection system 101 may be communicatively coupled to the control system 105 of the vehicle 102 and may provide control inputs thereto. The automatic collision avoidance actions may be configured to prevent a potential collision, to avoid the outcome of a potential collision (e.g., a collision involving other vehicles), and so on. Automatic collision avoidance actions may be determined in cooperation with other vehicles. For example, the collision detection system 101 may cooperate with the vehicle 103 to determine collision avoidance actions (or instructions) such that both vehicles 102, 103 avoid the potential collision while also avoiding a potential collision with each other.
The collision detection system 101 may be configured to perform automatic collision avoidance actions without the permission and/or intervention of the vehicle operator. Alternatively, or in addition, the collision detection system 101 may request the operator's permission before taking automatic collision avoidance actions. The human-machine interface components 107 may comprise one or more inputs configured to allow the vehicle operator to indicate permission, such as a button on a control surface (e.g., the steering wheel), an audio input, a visual input, or the like. Permission may be requested when a potential collision is detected and/or may be requested in advance, before a potential collision is detected. Pre-authorized permission may expire after a predetermined time and/or in response to certain conditions (e.g., after the potential collision has been avoided, after the vehicle 102 has been shut down, etc.). Accordingly, the collision detection system 101 may be configured to periodically re-request permission from the vehicle operator. For example, the collision detection system 101 may request permission to perform automatic collision avoidance actions each time the vehicle 102 is started.
The collision detection system 101 may be configured such that automatic collision avoidance actions cannot be overridden by the vehicle operator. Accordingly, the collision detection system 101 may be configured to "lock out" the vehicle operator from portions of the control system 105. Access to the vehicle control system 105 may be restored after the automatic collision avoidance actions are complete and/or the collision detection system 101 determines that the potential collision has been avoided. The collision detection system 101 may be configured to lock the vehicle operator out of all vehicle control operations. Alternatively, the vehicle operator may be allowed limited access to the control system 105. For example, the control system 105 may accept operator inputs that do not interfere and/or conflict with the automatic collision avoidance actions (e.g., the vehicle operator may be allowed to provide limited steering input, but may not be allowed to accelerate or decelerate).
Alternatively, the collision detection system 101 may be configured such that the vehicle operator can override one or more of the automatic collision avoidance actions. In response to an override, the collision detection system 101 may stop performing the automatic collision avoidance actions and may return control to the vehicle operator. An override may comprise the vehicle operator providing an input directly to the control system 105 (or other human-machine interface components 107). In another example, the collision detection system 101 may perform automatic collision avoidance actions by actuating controls of the vehicle 102 (e.g., turning the steering wheel), and an override may comprise the vehicle operator resisting or counteracting the automatic control actuation.
In some embodiments, the collision detection system 101 may preemptively deploy, and/or be configured to preemptively deploy, safety systems of the vehicle 102. For example, the collision detection system 101 may be configured to deploy one or more airbags before the impact of a collision occurs. The collision detection system 101 may be further configured to adapt the deployment of the safety systems to the impending collision (e.g., deploying safety systems according to the location on the vehicle 102 where the impact of the collision will occur).
After the collision detection system 101 detects a potential collision and takes any of the actions described above, it may continue to monitor the kinematics of the objects. The collision detection system 101 may continue to modify and/or update the actions described above in response to changing kinematics (e.g., the outcome of one or more collisions, the actions of other vehicles 103, 104, etc.).
The collision detection system 101 may further comprise a storage module 150 configured to store information pertaining to the functioning, configuration, and/or operating state of the collision detection system 101 (and/or the vehicle 102). The storage module 150 may comprise a persistent machine-readable storage medium 152, such as a hard disk, solid-state storage, optical storage media, or the like. Alternatively, or in addition, the storage module 150 may be configured to store data on a network-accessible server 154 (via the communication module 130), such as a cloud storage service or the like.
The storage module 150 may be configured to store any information pertaining to the vehicle 102, which may include, but is not limited to: the kinematics of the vehicle 102, operator control inputs (e.g., steering, braking, etc.), the collision detection model 122 (e.g., the kinematics of other vehicles, collision detections, etc.), actions taken in response to detecting potential collisions, operator overrides of automatic collision avoidance actions, communications with other vehicles, and so on. Accordingly, the storage module 150 may act as a "black box" detailing the operating conditions of the vehicle 102 and/or other circumstances surrounding a collision.
The storage module 150 may be configured to prevent unauthorized access to, and/or modification of, the stored information. Accordingly, the storage module 150 may be configured to encrypt the information for storage. The storage module 150 may be further configured to provide for verifying the authenticity of the stored information; for example, the storage module 150 may be configured to cryptographically sign the stored information.
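A simple illustration of such tamper-evident storage, using an HMAC over each monitoring record; the key handling shown is an assumption and is far simpler than a production design would be:

```python
import hashlib, hmac, json, time

SECRET_KEY = b"assumed-device-key"   # hypothetical key held by the storage module

def seal_record(record: dict) -> dict:
    """Attach a timestamp and an HMAC so tampering with the stored
    monitoring data can be detected later (illustrative only)."""
    record = dict(record, stored_at=time.time())
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify_record(record: dict) -> bool:
    claimed = record.get("signature", "")
    body = {k: v for k, v in record.items() if k != "signature"}
    payload = json.dumps(body, sort_keys=True).encode()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(claimed, expected)

entry = seal_record({"vehicle": "102", "speed_mps": 17.2, "event": "brake"})
print(verify_record(entry))   # True until any field is altered
```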
The coordination module 160 may be configured to coordinate collision detection operations with other entities, such as the vehicles 103, 104. Coordination may comprise cooperative sensor configuration, sharing sensor data, sharing processed information, and so on. Coordination may be established on an ad hoc basis (e.g., one or more of the vehicles 102, 103, and/or 104 may broadcast portions of the collision detection model 122 and/or other collision detection data), may be established in response to a request (e.g., vehicle-to-vehicle coordination), or the like. In some embodiments, coordination between collision detection systems may be predicated on a payment, a reciprocal exchange, or other consideration.
Fig. 2A is a block diagram 200 depicting another embodiment of a collision detection system 101. The sensing system 110 of the collision detection system 101 may be unable to access region 225. In the Fig. 2A example, region 225 is inaccessible due to the positions of the vehicles 103 and 144. Accordingly, the coordination module 160 may be configured to transmit a request 223 for sensor data pertaining to region 225 (via the communication module 130).
In some embodiments, the request 223 may be transmitted in response to other conditions. For example, the collision detection system 101 may not comprise a sensing system 110, and/or the sensing system 110 may be inactive (e.g., may be inoperative). Accordingly, the collision detection system 101 may rely on sensor data from other sources, such as the vehicle 103, to detect potential collisions. Alternatively, the collision detection system 101 may request sensor data from all available sources, including sensor data pertaining to regions from which the sensing system 110 can itself acquire sensor data. The collision detection system 101 may use the redundant sensor data to verify and/or refine the sensor data acquired by the sensing system 110.
The request 223 may comprise a request for sensor data pertaining to a particular region 225 and/or may comprise a request for all available sensor data. The request 223 may be directed to a particular entity (e.g., the vehicle 103) and/or may be broadcast to any source capable of satisfying the request 223. Accordingly, in some embodiments, the request 223 may comprise establishing a communication link with the vehicle 103 (e.g., discovering the vehicle 103 via one or more network discovery broadcast messages, performing a handshake protocol, etc.).
The request 223 may comprise an offer of compensation in exchange for access to the requested sensor data. Accordingly, the request 223 may comprise a negotiation to establish an acceptable exchange (e.g., an acceptable payment, reciprocal data sharing, or the like). The negotiation may be performed automatically according to predetermined policies, rules, and/or thresholds stored on the persistent machine-readable storage medium 152. Alternatively, the negotiation may comprise interacting with the owners of the vehicles 102, 103 and/or other entities (e.g., via the network 132). For example, the vehicles 102, 103 may be associated with an organization whose members have agreed to share collision detection data (e.g., an automobile association, an insurance company, or a similar organization). In some embodiments, the sensing system 113 of the vehicle 103 may be configured to automatically broadcast its sensor data, such that an explicit request 223 for sensor data is unnecessary.
The vehicle 103 may provide sensor data 227, which may be received via the communication module 130. The sensor data 227 may comprise sensor data acquired by the sensing system 113 of the vehicle 103 (or acquired by one or more other vehicles or sources (not shown)). The collision detection system 101 may use the sensor data 227 to detect potential collisions, as described above. For example, the processing module 120 may generate a collision detection model comprising the sensor data 227. In some embodiments, the vehicle 103 may provide auxiliary data 229 in addition to (and/or in place of) the sensor data 227. The auxiliary data 229 may comprise processed sensor data, such as "self-knowledge" pertaining to the vehicle 103, which may include, but is not limited to: identification, vehicle size, vehicle orientation, vehicle weight, position (absolute position or position relative to the vehicle 102), velocity (e.g., speedometer readings), acceleration (e.g., accelerometer readings), a time reference (e.g., a time synchronization signal), and so on. The processing module 120 may use the auxiliary data 229 to transform the sensor data 227 into the frame of reference of the vehicle 102 or another suitable frame of reference, as described above. Transforming the sensor data 227 may further comprise correcting the sensor data (e.g., aligning the sensor data acquired by the sensing system 110 with the sensor data 227). Correction may comprise time-shifting and/or time-aligning the sensor data 227 relative to other sensor data samples. As such, corrected sensor data 227 may comprise time-stamp-corrected sensor data, extrapolated sensor data (e.g., position extrapolated from velocity and/or direction, velocity extrapolated from acceleration, etc.), time-shifted sensor data, and so on.
In some embodiments, Coordination module 160 can be configured to provide collision detection data 222 and arrive vehicle 103.It touches It hits detection data 222 may include, but is not limited to:Collision detection model 122 (and/or its part) is obtained by sensing system 110 It is sensing data, information related with the potentially collision detected by collision detecting system 101, related with vehicle 102 auxiliary Help data etc..
Accordingly, in some embodiments, the collision detection system 101 may be configured to aggregate sensor data from multiple sources (e.g., the sensing system 110, the vehicle 103, etc.), to generate the collision detection model 122 using the sensor data (and/or auxiliary data, if any), and to provide the collision detection model 122 (by transmitting collision detection data 222) to other vehicles 103, 144. Vehicles within the communication range of the vehicle 102 (the communication range of the communication module 130) may therefore make use of the collision detection model 122. In some embodiments, one or more vehicles may be configured to retransmit and/or rebroadcast the collision detection data 222 to other vehicles, which may extend the effective communication range of the collision detection system 101 (e.g., as in an ad hoc wireless network configuration).
In some embodiments, the collision detection system 101 may be configured to provide and/or store monitoring data 272 on one or more persistent storage systems, such as a network-accessible server 154, a persistent machine-readable storage medium 152, or the like. The monitoring data 272 may include, but is not limited to: collision detection data 222, sensor data used by the collision detection system 101 (sensor data acquired using the sensing system 110, sensor data acquired from other sources such as the vehicle 103, and so on), the collision detection model 122, information pertaining to potential collisions detected by the collision detection system 101, collision warnings generated by the collision detection system 101, diagnostic information pertaining to the vehicle 102 and/or other vehicles 103, 144, operating conditions, location (e.g., GPS coordinates), timing information, and so on. The diagnostic information may include, but is not limited to: indications of whether other vehicles 103, 144 include a collision detection system and/or are configured to coordinate collision detection with the collision detection system 101, indications of whether other vehicles 103, 144 are capable of communicating with the collision detection system 101 (e.g., capable of receiving collision detection data), actions taken in response to detecting a potential collision and/or in response to alerting other vehicles to a potential collision, and so on.
The monitoring data 272 may be used to reconstruct the conditions surrounding a collision, such as the kinematic properties of the vehicles 102, 103, and/or 144 before, during, and/or after the collision. The monitoring data 272 may also include information pertaining to actions (if any) taken by the vehicles 102, 103, and/or 144 in response to detecting a potential collision (e.g., operator control inputs, automatic collision avoidance actions, etc.), and so on. In some embodiments, the monitoring data 272 may include time stamps and/or other auxiliary data, such that the location and/or time of the monitoring data 272 can be determined.
The monitoring data 272 may also include vehicle identification information (e.g., information identifying the vehicles 102, 103, and/or 144), such as a vehicle identification number (VIN), license plate information, registration information, vehicle make, model, and color, and so on. Vehicle identifiers may be derived from sensor data acquired by the sensing system 110 (or other vehicles 103) and/or may be received as data from one or more other vehicles; for example, the vehicles 102, 103, and/or 144 may be configured to provide identification information to other vehicles (e.g., by broadcasting identification information via a network, near-field communication, or the like). In other examples, one or more of the vehicles 102, 103, and/or 144 may include a radio-frequency identifier (RFID) that can be interrogated by an RFID reader of the sensing system 110. Other objects may also carry identification information, such as pedestrians, buildings, road features (e.g., street signs, traffic signals, etc.), and so on. These objects may be configured to provide identification information to one or more of the vehicles 102, 103, and/or 144, and the identification information may be incorporated into the collision detection model 122 and/or the monitoring data 272. For example, a person may carry an article configured to broadcast and/or provide identification information (e.g., via RFID), such as the person's name, address, allergies, emergency contact information, insurer, or license number. Similarly, road features may be configured to provide identification information. For example, a traffic signal may be configured to broadcast location information (e.g., the location of the signal), status information (e.g., red light, green light, etc.), and so on.
As described above, in some embodiments, the monitoring data 272 may be secured against modification; for example, the monitoring data 272 may include a digital signature and/or may be encrypted. Securing the monitoring data 272 may allow the authenticity and/or origin of the monitoring data 272 to be verified.
In some embodiments, the network-accessible server 154 may be configured to store monitoring data 272 from multiple different vehicles. The monitoring data 272 may be received via the network 132 and/or extracted from the persistent machine-readable storage medium 152 of a vehicle (e.g., the vehicle 102). The network-accessible server may index and/or organize the monitoring data 272 by time, location, vehicle identity, and so on. The network-accessible server 154 may provide monitoring data 272 to requesters based on selection criteria (e.g., time, location, identity, etc.). In some embodiments, the network-accessible server 154 may provide the monitoring data 272 in exchange for consideration (e.g., payment, reciprocal access, etc.).
In some embodiments, the collision detection data 222 may be provided to an emergency services entity in response to detecting a collision. The collision detection data 222 may be used to determine and/or estimate the kinematics of the collision (e.g., impact velocity, impact vector, etc.), which may be used to estimate the forces involved in the collision, the likely injury status, the final resting location of the vehicles (or vehicle occupants) involved in the collision, and so on.
The collision detection system 101 may be configured to respond to requests for collision detection data 222. In some embodiments, as described above, the collision detection system 101 may provide sensor data acquired by the sensing system to one or more other vehicles (e.g., the vehicle 103) in response to a request. In another example, the collision detection system 101 may provide the collision detection model 122 (and/or portions thereof) to other vehicles and/or entities. The collision detection system 101 may be configured to store collision detection data (such as the collision detection model 122 and/or acquired sensor data) via the network 132 on the network-accessible server 154, an emergency services entity, a traffic control entity, or a similar entity.
Fig. 2B is a block diagram 201 depicting another embodiment of the collision detection system 101. In some embodiments, the collision detection system 101 may be configured to combine sensor data to determine different components of an object's kinematic properties (e.g., different components of velocity, acceleration, etc.). As described above, motion information may be represented as a vector in a particular coordinate system and/or frame of reference (e.g., a Cartesian coordinate system, a polar coordinate system, or a similar coordinate system). The vector may be relative to a particular frame of reference (e.g., the vehicle 102, 103, etc.). A vector may be decomposed into one or more components; in a Cartesian coordinate system, a vector may include x, y, and/or z components; in a polar coordinate system, a vector may include r, θ (range and angle), and/or z components; and so on. In some embodiments, the ability of a sensing system to determine particular components of an object's kinematic properties may depend, inter alia, on the position and/or orientation of the sensing system relative to the object. For example, a Doppler radar may acquire data about certain components of an object's motion but not others, depending on the orientation and/or position of the Doppler radar relative to the object.
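The decomposition described above can be illustrated with a short sketch (2-D geometry assumed; the names are invented for the example). The radial_component function reflects the observation that a Doppler-type sensor measures only the component of motion along its line of sight.

```python
# Illustrative only: decompose a velocity vector into Cartesian and polar components,
# and project it onto a sensor's line of sight (the component a Doppler radar observes).
import math

def decompose(vx, vy):
    """Return Cartesian components and the equivalent polar components."""
    r = math.hypot(vx, vy)       # magnitude
    theta = math.atan2(vy, vx)   # direction relative to the x-axis
    return {"x": vx, "y": vy, "r": r, "theta": theta}

def radial_component(vx, vy, line_of_sight_rad):
    """Component of velocity along a sensor's line of sight."""
    return vx * math.cos(line_of_sight_rad) + vy * math.sin(line_of_sight_rad)
```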
As shown in Fig. 2B, the sensing system 110 of the collision detection system 101 may be positioned and/or oriented relative to the vehicle 204 such that the sensing system 110 can acquire object motion properties for the component 260 (e.g., an "x-axis" component corresponding to side-to-side range, velocity, etc.). However, the sensing system 110 may be unable to determine the component 261 (e.g., a "y-axis" component corresponding to "forward" range, velocity, etc.). For example, the sensing system 110 may include a Doppler radar that is effective at determining the component 260 but not effective at determining the component 261. Another sensing system 213 of the vehicle 203 may be capable of acquiring object motion properties for the component 261, but incapable of acquiring object motion properties for the component 260.
The coordination module 160 of the collision detection system 101 may be configured to share sensor data 221 with the vehicle 203, which may include providing sensor data acquired by the sensing system 110 (pertaining to the component 260) and/or receiving sensor data acquired by the sensing system 213 of the vehicle 203 (pertaining to the component 261). The coordination module 160 may be configured to request access to the sensor data acquired by the vehicle 203, as described above. The coordination module 160 may be further configured to provide access to the sensor data acquired by the sensing system 110, as described above (e.g., in exchange for access to the sensor data acquired by the vehicle 203, a payment, or the like). The sensor data 221 may be shared via the communication module 130, as described above.
The processing module 120 of the collision detection system 101 may "fuse" the sensor data acquired by the sensing system 110 (pertaining to the component 260) with the sensor data acquired from the vehicle 203 (pertaining to the component 261) to form a more complete and accurate model of the kinematic properties of the vehicle 204. Fusing the sensor data may include transforming the sensor data into a common coordinate system and/or frame of reference, weighting the sensor data, and so on. The sensor data may be combined using component analysis or other suitable processing techniques to determine the kinematic properties of the object and/or to refine other sensor data. In the example of Fig. 2B, fusing the sensor data may include using the sensor data acquired by the sensing system 110 to determine one component of the object's motion (the component 260, e.g., side-to-side motion) and using the sensor data acquired by the vehicle 203 to determine the object's motion in the component 261 (e.g., forward motion). Fusing may further include combining range and/or angle information in the sensor data 221 to determine and/or refine the position of the vehicle 204 relative to the vehicles 102 and/or 203, which may include triangulating the range and/or angle information of the sensor data. Similarly, fusing the sensor data may include determining the object's size, orientation, angular extent, angle-dependent range, and so on. For example, range information from different sensors may be used to determine position and/or angular orientation (e.g., by analysis using crossing range radii).
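A hypothetical sketch of this fusion step follows: one sensor contributes the component it measures well, the other contributes the orthogonal component, and two range measurements are intersected ("crossing range radii") to estimate position. The geometry and function names are assumptions for illustration only.

```python
# Illustrative fusion sketch: combine per-component velocity estimates and
# intersect two range circles to localize an object in a common frame.
import math

def fuse_velocity(cross_track_from_110, along_track_from_203):
    """Combine the component each sensor measures well into one velocity vector."""
    return (cross_track_from_110, along_track_from_203)

def intersect_ranges(p1, r1, p2, r2):
    """Intersect two range circles; returns up to two candidate object positions."""
    d = math.dist(p1, p2)
    if d == 0 or d > r1 + r2 or d < abs(r1 - r2):
        return []  # no usable intersection
    a = (r1**2 - r2**2 + d**2) / (2 * d)
    h = math.sqrt(max(r1**2 - a**2, 0.0))
    mx = p1[0] + a * (p2[0] - p1[0]) / d
    my = p1[1] + a * (p2[1] - p1[1]) / d
    ox = h * (p2[1] - p1[1]) / d
    oy = h * (p2[0] - p1[0]) / d
    return [(mx + ox, my - oy), (mx - ox, my + oy)]
```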
Combining sensor data may also include weighting the sensor data. Sensor data may be weighted according to the precision of the data (e.g., signal-to-noise ratio), the orientation and/or position of the sensor data relative to a particular object, and so on.
The manner in which sensor data are combined may depend, inter alia, on the relative positions and/or orientations of the sensing system 110 and/or the vehicle 203, as described above. Those of skill in the art will recognize that other sensor orientations may result in different combinations of sensor data. Fig. 2C is a block diagram of another embodiment of a collision detection system. In the example of Fig. 2C, the sensing system 110 and the vehicle 203 are oriented differently relative to the vehicle 204. As a result, the sensor data may be fused in a different manner. For example, the component 260 may be determined by combining the sensor data acquired by the sensing system 110 with the sensor data acquired by the vehicle 203 (as opposed to being determined primarily from the sensor data acquired by the sensing system 110, as in the example of Fig. 2B). The relative contributions of the different sensor data may depend, inter alia, on the relative orientations of the vehicles 102 and 203 (e.g., the angles 262, 263). The combination may be updated dynamically in response to changes in the relative positions and/or orientations of the vehicles 102, 203, and/or 204 (e.g., changes in the angles 262 and/or 263).
In some embodiments, fusing the sensor data may also include weighting the sensor data. The relative weight of sensor data may correspond to the signal-to-noise ratio of the sensor data, the position and/or orientation of the sensor data relative to a particular object, and so on. Accordingly, weights may be applied on a per-object basis. Referring back to the example of Fig. 2B, for the component 260 the weight of the sensor data acquired by the sensing system 110 may be relatively high (because the sensing system 110 is well positioned to measure the component 260), whereas for the component 261 the weight of that sensor data may be low (because the sensing system 110 is poorly positioned to measure the component 261).
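The per-object, per-component weighting described above might be sketched as follows; the weights, SNR values, and geometry factors are placeholders, not values from the embodiments.

```python
# Illustrative per-component weighted fusion: each measurement is weighted by
# signal quality (SNR) and by how well the sensor is positioned for that component.
def weighted_estimate(measurements):
    """measurements: list of (value, snr, geometry_weight) tuples for one component."""
    num = sum(v * snr * g for v, snr, g in measurements)
    den = sum(snr * g for _, snr, g in measurements)
    return num / den if den else None

# e.g., sensing system 110 is well placed for component 260; the other sensor is not:
component_260 = weighted_estimate([(4.1, 20.0, 0.9), (3.6, 8.0, 0.2)])
```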
Fig. 3 is a flow diagram of one embodiment of a method 300 for coordinated collision detection. The method 300 may be implemented by a collision detection system, as described herein. In some embodiments, the method 300 may be embodied as instructions stored on a persistent machine-readable storage medium (e.g., the persistent machine-readable storage medium 152). The instructions may be configured to cause a processor to perform one or more of the steps of the method 300.
At step 310, the method 300 starts and is initialized, which may include loading machine-readable instructions from a persistent machine-readable storage medium and accessing and/or initializing resources, such as the sensing system 110, the processing module 120, the communication module 130, the coordination module 160, and so on.
Step 320 may include acquiring sensor data at the vehicle 102. The sensor data of step 320 may be acquired from a source external to the vehicle 102, such as another vehicle (e.g., sensor data acquired by the sensing system 113 of the vehicle 103). The sensor data may be acquired in response to a request and/or negotiation, as described above. Alternatively, the sensor data may be acquired without a request (e.g., the sensor data acquired at step 320 may originate from a broadcast, as described above). In some embodiments, step 320 may further include receiving auxiliary data from the source of the sensor data. The auxiliary data may include "self-knowledge" about the source of the sensor data, such as its size, weight, orientation, position, kinematic properties, and so on.
In some embodiments, step 320 may include fusing the sensor data acquired at step 320 with sensor data acquired from other sources (e.g., the sensing system 110 of the collision detection system 101). Accordingly, step 330 may include transforming the sensor data into a suitable coordinate system and/or frame of reference (e.g., using auxiliary data of the vehicle 102 and/or the source of the sensor data). Fusing the sensor data may also include weighting and/or calibrating the sensor data, which may include time-shifting the sensor data, extrapolating the sensor data, and so on, as described above.
Step 330 may include generating a collision detection model using the sensor data acquired at step 320. Generating the collision detection model may include using the sensor data to determine the kinematic properties of objects, such as object position, velocity, acceleration, orientation, and so on. Generating the collision detection model may further include determining and/or estimating object size, weight, and so on. Step 330 may include combining sensor data to determine and/or refine one or more component quantities. For example, step 330 may include: triangulating range and/or angle information in the sensor data to determine object positions, analyzing crossing range radii to determine angular orientations, fusing sensor data to determine different components of an object's kinematic properties, and so on.
Step 330 may also include transforming the collision detection model into a suitable coordinate system and/or frame of reference. For example, step 330 may include generating the collision detection model in a particular frame of reference (e.g., relative to the vehicle 102). Step 330 may also include transforming the collision detection model into other coordinate systems and/or frames of reference. For example, step 330 may include transforming the collision detection model into the frame of reference of another vehicle (e.g., the vehicle 103). The transformation of step 330 (and/or step 320) may be based on the position, velocity, acceleration, and/or orientation of the source of the sensor data acquired at step 320 and/or the position, velocity, acceleration, and/or orientation of the particular frame of reference.
In some embodiments, step 330 may also include detecting potential collisions using the collision detection model and/or taking one or more actions in response to detecting a potential collision, as described above. The method 300 ends at step 340, until additional sensor data is acquired at step 320.
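Purely as an illustration of the detection step, the following sketch estimates the time and distance of closest approach from a relative position and relative velocity taken from a collision detection model; the threshold values are invented for the example.

```python
# Illustrative closest-approach test: flag a potential collision when the predicted
# miss distance falls below a threshold within a short time horizon.
def closest_approach(rel_pos, rel_vel):
    """Return (time_of_closest_approach, miss_distance) assuming constant relative velocity."""
    px, py = rel_pos
    vx, vy = rel_vel
    v2 = vx * vx + vy * vy
    t = 0.0 if v2 == 0 else max(0.0, -(px * vx + py * vy) / v2)
    dx, dy = px + vx * t, py + vy * t
    return t, (dx * dx + dy * dy) ** 0.5

def potential_collision(rel_pos, rel_vel, miss_threshold_m=2.0, horizon_s=5.0):
    t, miss = closest_approach(rel_pos, rel_vel)
    return t <= horizon_s and miss <= miss_threshold_m
```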
Fig. 4 is a flow diagram of another embodiment of a method 400 for coordinated collision detection. At step 410, the method 400 starts and is initialized, as described above.
Step 412 may include acquiring sensor data using the vehicle sensing system 110, as described above. The sensor data of step 412 may be acquired using a sensing system that includes any number of different sensors of one or more different types.
Step 414 may include requesting sensor data from an external entity (e.g., another vehicle 103). The request of step 414 may be issued in response to determining that the sensor data of step 412 fails to cover a particular region (e.g., the region 125, 225), fails to capture certain motion components of an object (e.g., certain components 261 of an object's kinematic properties), and so on. Alternatively, the request of step 414 may be issued without regard to the nature of the sensor data acquired at step 412. The requested sensor data may be used to augment and/or refine the sensor data acquired at step 412 and/or sensor data acquired from other sources.
In some embodiments, the request of step 414 may be transmitted to a particular entity (e.g., a particular vehicle 103). Accordingly, step 414 may include establishing communication with the entity, which may include discovering the entity (e.g., via one or more broadcast messages), establishing a communication link with the entity, and so on. Alternatively, the request of step 414 may not be directed to any particular entity, but may instead be broadcast to any entity capable of providing sensor data.
The request may identify the particular region of interest (e.g., the region 125, 225). The region of interest may be specified relative to the vehicle 102 (the requester) and/or another frame of reference. Accordingly, step 414 may include transforming information associated with the request into another coordinate system and/or frame of reference, as described above. Alternatively, or in addition, the request may identify an object of interest and/or request data pertaining to an object acquired from a particular orientation and/or position. The requested data may be used to determine and/or refine motion components that are not obtainable by the sensing system 110 of the vehicle 102, as described above.
The request may include an offer in exchange for access to the sensor data. The offer may include a payment, a bid, reciprocal access, collision detection data, or other consideration. Accordingly, in some embodiments, step 414 may include agreeing on an acceptable exchange using one or more of: predetermined policies, rules, thresholds, and so on. Step 414 may also include receiving consideration from the requester, the source of the sensor data, and/or another entity (e.g., an association, an insurance company, or the like), as described above.
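One possible (hypothetical) form of the automated negotiation against predetermined policies is sketched below; the offer types, field names, and thresholds are invented for illustration and are not part of the embodiments.

```python
# Illustrative policy check: accept, counter, or decline an offer for sensor-data access.
def evaluate_offer(offer, policy):
    """offer/policy are simple dicts, e.g. {"type": "payment", "amount": 0.05}."""
    if offer["type"] == "reciprocal_data" and policy.get("accept_reciprocal", True):
        return "accept"
    if offer["type"] == "payment":
        if offer["amount"] >= policy["min_payment"]:
            return "accept"
        return {"counter": {"type": "payment", "amount": policy["min_payment"]}}
    return "decline"
```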
Step 422 may include acquiring the requested sensor data using the communication module 130, as described above. Although the method 400 depicts a request step 414, in some embodiments the request step 414 may be unnecessary. For example, in some embodiments, sensor data may be provided freely (e.g., broadcast), such that sensor data can be acquired at step 422 without an explicit request. Step 422 may include transforming the acquired sensor data, as described above.
Step 432 may include generating a collision detection model using the sensor data acquired by the vehicle sensing system 110 and/or the sensor data acquired from other vehicles at step 422. Generating the collision detection model may include fusing the sensor data (e.g., combining the sensor data), using the fused sensor data to determine object motion properties, and so on. Generating the collision detection model may further include transforming the collision detection model into one or more suitable coordinate systems and/or frames of reference. Step 432 may also include detecting potential collisions using the collision detection model, which may include: identifying the objects involved in a potential collision, determining the time of the potential collision, determining collision avoidance actions and/or instructions, issuing one or more alerts and/or notifications, and so on.
Step 434 may include providing access to collision detection data to one or more other entities (e.g., the source of the sensor data acquired at step 422). Step 434 may include providing a portion of the collision detection model generated at step 432 to one or more other vehicles, providing one or more collision detection alerts to other vehicles, providing sensor data to one or more other vehicles, and so on. Step 434 may include transmitting the collision detection data to a particular vehicle and/or broadcasting the collision detection data. The collision detection data may include auxiliary information, such as the position and/or kinematic properties of the vehicle 102, timing information, and so on, which may enable recipients to transform the collision detection data into other coordinate systems and/or frames of reference. In some embodiments, step 434 may include providing monitoring data 272 to the network-accessible server 154, storing monitoring data 272 on the persistent machine-readable storage medium 152, and so on.
The method 400 ends at step 440, until additional sensor data is acquired.
Although Fig. 4 depicts steps in a particular sequence, the disclosure is not limited in this regard; for example, the vehicle 102 may receive sensor data from another entity at step 422, generate the collision detection model at step 432, and/or provide access to collision detection data at step 434 concurrently with acquiring sensor data using the sensing system 110.
In some embodiments, the collision detection system 101 may be further configured to operate the sensing system 110 cooperatively with the sensing systems of other vehicles. Cooperative operation may include forming a multi-static sensor comprising the sensing system 110 and one or more sensing systems of other land vehicles. As used herein, a "multi-static sensor" refers to a sensing system comprising two or more spatially separated sensors configured to operate cooperatively. For example, one or more of the sensing systems may be configured to emit respective detection signals, which may be received by receivers of one or more of the sensing systems. Cooperative sensor operation may include coordinating one or more detection signals emitted by one or more of the sensing systems (e.g., beamforming, forming a phased array, etc.).
Fig. 5A depicts one embodiment 500 of a collision detection system 101 configured to coordinate sensor operation with other sensing systems. In the example 500, the sensing system 110 includes a detection signal emitter 512 and a receiver 514. The emitter 512 may include a radar emitter, an EO emitter, an acoustic emitter, an ultrasonic emitter, or the like. The receiver 514 may be configured to detect one or more returned detection signals. Accordingly, the receiver 514 may include one or more antennas, EO detectors, acoustic receivers, ultrasonic receivers, or the like.
The collision detection system 101 may be configured to coordinate the operation of the sensing system 110 with the sensing systems of other vehicles (e.g., the sensing systems 570 and/or 580). Coordination may include forming a multi-static sensor comprising the sensing system 110 and one or more of the sensing systems 570 and/or 580.
In some embodiments, the collision detection system 101 may coordinate with another sensing system to acquire information pertaining to objects outside the detection range of the sensing system 110 and/or to extend the sensor data acquired by the sensing system 110. As used herein, an object "outside the detection range of the sensing system 110" refers to any object about which the sensing system 110 cannot reliably acquire information, which may include, but is not limited to: objects beyond the detection range of the sensing system 110, objects occluded or blocked by other objects, objects at positions and/or orientations that prevent the sensing system 110 from determining one or more of their motion characteristics (e.g., as shown in Fig. 2B), and so on. Accordingly, objects for which the sensor data is insufficiently reliable, and/or from which one or more motion characteristics cannot be reliably derived, are considered to be outside the detection range of the sensing system 110. As used herein, "sufficiently reliable" sensor data refers to sensor data that satisfies one or more reliability criteria, which may include, but is not limited to: a noise threshold, a signal strength threshold, a resolution (e.g., accuracy) threshold, or the like.
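A minimal sketch of such a reliability test is shown below; the criterion names and threshold values are assumptions for the example only.

```python
# Illustrative "sufficiently reliable" check against placeholder thresholds.
def sufficiently_reliable(sample, min_snr_db=10.0, min_strength=0.2, max_resolution_m=0.5):
    return (sample["snr_db"] >= min_snr_db
            and sample["signal_strength"] >= min_strength
            and sample["resolution_m"] <= max_resolution_m)
```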
In the example depicted in Fig. 5A, the vehicle 522 may be outside the detection range of the sensing system 110; the vehicle 520 may "block" the detection signal of the emitter 512, such that the receiver 514 cannot reliably acquire data pertaining to the vehicle 522. In response to determining that the vehicle 522 is outside the detection range of the sensing system 110, the collision detection system 101 may be configured to request sensor data pertaining to the vehicle 522 from one or more other vehicles (e.g., the vehicle 505), as described above. The request may be generated in response to determining that the vehicle 522 (or another region) is within the detection range and/or envelope of the sensing system of one or more of the other vehicles. Alternatively, or in addition, the coordination module 160 of the collision detection system 101 may be configured to request access to the sensing system 580 of the vehicle 505. Requesting access may include requesting that the sensing system 580 operate cooperatively with the sensing system 110. In the example of Fig. 5A, the coordination module 160 may be configured to form a multi-static sensor comprising the sensing system 110 of the first land vehicle 102 and the sensing system 580 of the land vehicle 505. The multi-static sensor may include the detection signal emitter 582 of the sensing system 580 and the detection signal receiver 514 of the sensing system 110. In response to the request, the emitter 582 may be configured to emit a detection signal 587 configured to be received by the receiver 514 of the sensing system 110. The detection signal 587 may be received in place of a detection signal emitted by the emitter 512 of the sensing system 110, or in addition to a detection signal emitted by the emitter 512 (the detection signal emitted by the emitter 512 is not shown in Fig. 5A to avoid obscuring details of the embodiment). In addition, the collision detection system 101 may acquire auxiliary data from the vehicle 505, which may include, but is not limited to: the orientation, position, velocity, acceleration, and so on of the vehicle 505 relative to the vehicle 102; a time synchronization signal; and so on. The processing module 120 may use the auxiliary data to interpret the received detection signal 587, which may include transforming the detection signal 587 into the frame of reference of the vehicle 102, and so on, as described above.
As described above, coordinating sensor operation may further include the sensing system 110 generating one or more detection signals configured to be received by one or more other sensing systems 570 and/or 580. For example, the emitter 512 may be configured to transmit a detection signal (not shown) toward the vehicle 522; the detection signal may be received by the receiver 584 of the sensing system 580 and may provide information pertaining to the vehicle 522. The sensing system 580 may fuse the sensor data received in response to its own emitted detection signals with the sensor data received in response to the detection signals emitted by the vehicle 102, as described above. Accordingly, the multi-static sensor may include the emitters 512, 582 and the receivers 514, 584 of both the vehicles 102 and 505.
As described above, coordinating sensor operation may include forming a multi-static sensor and/or generating one or more detection signals configured to acquire information pertaining to one or more objects outside the detection range of one or more of the sensing systems. Accordingly, coordinating sensor operation may include directing one or more detection signals along predetermined directions and/or coordinating two or more detection signals, which may include, but is not limited to: beamforming, forming and/or configuring a phased array, and so on.
The coordination module 160 may be configured to coordinate the operation of the sensors to enhance and/or improve data acquisition for one or more objects. For example, the coordination module 160 may request that the sensing system 570 generate a detection signal 575, which may be used to acquire more accurate sensor data pertaining to the vehicle 520; in the example of Fig. 5A, the detection signal (not shown) emitted toward the vehicle 520 by the sensing system 110 may be partially occluded by another vehicle 521. In response to the request, the sensing system 570 may configure the emitter 572 to transmit the detection signal 575, which may be configured to acquire information pertaining to the vehicle 520 and may be detected by the receiver 514 of the sensing system 110. As described above, coordination may further include acquiring auxiliary data from the vehicle 504, which may enable the collision detection system 101 to process the detection signal 575, as described above.
The coordination module 160 may also be configured to adapt the detection signals generated by the emitter 512 in cooperation with the other sensing systems 570 and/or 580. In some embodiments, the coordination module 160 may configure the emitter 512 to respond to requests from one or more other sensing systems (e.g., requests to direct a detection signal at a particular object and/or region). Fig. 5B depicts another embodiment 501 of a collision detection system 101 configured to coordinate sensor operation with other sensing systems.
In the example of Fig. 5B, the sensing system 110 may have a relatively unobstructed view of the vehicles 530 and 531. However, the sensing system 580 may be blocked by the vehicles 532 and/or 520. The collision detection system 101 may receive, via the communication module 130, a request to coordinate sensor operation. The collision detection system 101 may configure the sensing system 110 in accordance with the request, which may include emitting one or more detection signals 515 and 517; the signals 515 and 517 may be configured to acquire motion data pertaining to the vehicles 530 and/or 531 and may be configured to be detected by the receiver 584 of the sensing system 580. Emitting the detection signals 515 and/or 517 may include emitting a plurality of separate detection signals, beamforming one or more detection signals of the emitter 512, and so on. The coordination module 160 may be configured to transmit auxiliary data to the sensing system 580 via the communication module 130, which may enable the sensing system 580 to transform the received detection signals 515 and/or 517 into the frame of reference of the sensing system 580, as described above.
Although Figs. 5A and 5B depict the detection signals 575, 585, 587, 515, and 517 as "point sources," the disclosure is not limited in this regard. The detection signals disclosed herein may include multiple detection signals and/or detection signal coverage patterns. In addition, although Figs. 5A and 5B depict the sensing system 110 as including both a detection signal emitter 512 and a receiver 514, the disclosure is not limited in this regard. For example, in some embodiments the sensing system 110 may be passive and, as such, may include the receiver 514 but not the emitter 512 (and/or the emitter 512 may be disabled). Accordingly, the sensing system 110 may acquire sensor data passively and/or in response to detection signals transmitted by other sensing systems, such as the sensing systems 570 and 580 described above. Alternatively, the sensing system 110 may be active and, as such, may include the detection signal emitter 512 but not the receiver 514 (and/or the receiver 514 may be disabled). Accordingly, the sensing system 110 may acquire, from other sensing systems (e.g., the sensing systems 570 and/or 580), sensor data obtained in response to the emitted detection signals.
Fig. 6 illustrates another embodiment 600 of a collision detection system 101 configured to coordinate sensor operation and/or share sensor data. As shown in Fig. 6, the sensing system 110 may acquire sensor data pertaining to the vehicles 620 and 630, and, to a limited extent, sensor data pertaining to the vehicle 631; however, the vehicle 632 may be outside the detection range of the sensing system 110, due, inter alia, to the vehicle 620. Another vehicle 604 may include a sensing system 570, which may acquire sensor data pertaining to the vehicles 620 and 632, and, to a limited extent, sensor data pertaining to the vehicle 631. The vehicle 630 may be outside the detection range of the sensing system 570.
The coordination module 160 may be configured to coordinate the operation of the sensing systems 110 and 570. Coordination may include configuring the sensing systems 110 and 570 to acquire sensor data pertaining to regions (and/or objects) within their respective detection ranges, and to acquire, via the other sensing system 110 or 570, sensor data pertaining to objects and/or regions outside their respective detection ranges.
For example, in the example of Fig. 6, the coordination module 160 may configure the sensing system 110 to acquire sensor data pertaining to the region 619, which may include configuring the emitter 512 to emit a detection signal suitable for acquiring information pertaining to objects within the region 619. The configuration may include beamforming, forming a phased array, directing and/or focusing one or more detection beams, and so on, as described above. Accordingly, coordination may include configuring the sensing system 110 to acquire sensor data pertaining to regions and/or objects (e.g., the vehicle 630) outside the detection range of the sensing system 570. As a result, the detection signals of the sensing system 110 may be directed away from other areas and/or regions (e.g., the region 679).
The coordination module 160 may also be configured to request that the sensing system 570 acquire sensor data pertaining to the region 679 (e.g., the vehicle 632). The request may identify the region 679 in the frame of reference of the vehicle 604, as described above. In response, the sensing system 570 may configure the emitter 572 to acquire sensor data pertaining to the region 679, as described above (e.g., by directing and/or focusing a detection signal toward the region 679).
The coordination module 160 may also be configured to provide sensor data pertaining to the region 619 (and/or the object 630) to the vehicle 604 and/or to receive sensor data pertaining to the region 679 (and/or the object 632) from the vehicle 604 using the communication module 130. Coordination may further include transmitting auxiliary data pertaining to the vehicles 102 and 604, such as position, velocity, acceleration, orientation, and so on, as described above.
In some embodiments, coordination may further include forming a multi-static sensor comprising the sensing system 110 and the sensing system 570. Forming the multi-static sensor may include configuring the emitters 512 and/or 572 to direct detection signals toward particular objects and/or regions of interest. In the example of Fig. 6, the multi-static sensor may be configured to direct detection signals toward the vehicle 631. As described above, the sensing systems 110 and 570 may be unable to acquire high-quality data pertaining to the vehicle 631 (e.g., due to vehicle obstructions). Forming the multi-static sensor may enable the sensing systems 570 and/or 110 to acquire higher-quality data. For example, the emitters 572 and 512 may configure the phase and/or amplitude of the detection signals they emit such that the detection signal pertaining to the vehicle 631 emitted by the emitter 572 is detected by the receiver 514, and the detection signal pertaining to the vehicle 631 emitted by the emitter 512 is detected by the receiver 574. The sensor data acquired by the receivers 574 and 514 may be fused to determine a more accurate and/or more complete model of the kinematic properties of the vehicle 631. As described above, fusing the sensor data may include transforming the sensor data between the frames of reference of the vehicles 102 and/or 604. Accordingly, coordination may include exchanging auxiliary data, as described above.
The coordination module 160 may be configured to request configuration changes in response to detecting that the sensing system 570 is within the communication range of the communication module 130. Once communication is established, the coordination module 160 may be configured to coordinate the operation of the sensing system 110 and the sensing system 570, as described above. In addition, as additional vehicle sensing systems are discovered, they may be included in the coordination (e.g., to form a multi-static sensor comprising three or more sensing systems). Alternatively, the coordination module 160 may be configured to request coordinated operation on an as-needed basis. For example, the coordination module 160 may be configured to coordinate the operation of the sensing systems in response to determining that one or more regions and/or objects are outside the detection range of the sensing system 110 (e.g., occluded by other objects).
In some embodiments, the coordination module 160 may be configured to coordinate with other sensing systems in response to a request (e.g., a request from the sensing system 570). For example, the sensing system 570 may initiate a request to coordinate sensor operation, and in response, the coordination module 160 may configure the sensing system 110 in accordance with the request. As described above, a request to coordinate sensor operation may include one or more offers, such as a payment, a bid, an offer of reciprocal data access, access to collision detection data, and so on.
Fig. 7 illustrates another example 700 in which a collision detection system 101 is configured to coordinate sensor operation and/or share sensor data. As described above, the coordination module 160 may be configured to coordinate sensor operation in response to detecting other sensing systems within the communication range of the communication module 130. In response to detecting one or more other sensing systems, the coordination module 160 may be configured to coordinate sensor operation, which may include forming a multi-static sensor, configuring the detection signals of the other sensing systems, exchanging sensor data, exchanging auxiliary data, and so on.
Fig. 7 depicts one example of an ad hoc multi-static sensor comprising the sensing systems 110, 570, and 580. As other vehicles comprising other sensing systems (not shown) are detected, the coordination module 160 may coordinate with those sensing systems to expand the multi-static sensor. The multi-static sensor may include a plurality of emitters 512, 572, and/or 582 and/or a plurality of receivers 514, 574, and/or 584. The coordination module 160 may configure the emitters 512, 572, and/or 582 to direct the detection signals they emit toward particular regions and/or objects of interest, as described above. Coordination may include coordinating the phase, amplitude, and/or timing of the detection signals emitted by the emitters 512, 572, and/or 582 (e.g., using beamforming and/or phased array techniques). Coordination may further include coordinating the receivers 514, 574, and/or 584 to detect particular detection signals (e.g., forming a phased array of receivers and/or antennas). Accordingly, the multi-static sensor formed by the sensing systems 110, 570, and/or 580 may include any number of emitters and any number of receivers (e.g., N emitters and M receivers).
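For illustration, a simple uniform-linear-array approximation of the phase coordination described above is sketched below; the element spacing, wavelength (roughly that of an automotive radar band), and steering angle are assumed values, not parameters of the embodiments.

```python
# Illustrative phased-array steering: assign each emitter a phase offset so the
# emitted detection signals add coherently toward a chosen steering angle.
import math

def steering_phases(element_positions_m, steer_angle_rad, wavelength_m):
    """Phase (radians) per element for a uniform linear array steered to steer_angle_rad."""
    k = 2 * math.pi / wavelength_m
    return [-k * x * math.sin(steer_angle_rad) for x in element_positions_m]

# e.g., three cooperating emitters along a baseline, steered 20 degrees off boresight:
phases = steering_phases([0.0, 0.5, 1.0], math.radians(20), wavelength_m=0.039)
```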
The coordination module 160 may be configured to form a multi-static radar configured to acquire sensor data pertaining to one or more objects from a plurality of different viewing angles and/or orientations. For example, each of the sensing systems 110, 570, and 580 may be configured to acquire sensor data pertaining to the vehicle 721. The detection signals emitted by the emitters 512, 572, and/or 582 may be detected by one or more of the receivers 514, 574, and/or 584. The collision detection system 101 may fuse the sensor data acquired by the receiver 514 with the sensor data acquired by the receivers 574 and/or 584 of the other sensing systems 570 and/or 580, as discussed above, to model the kinematic properties of the vehicle 721. Fusing sensor data acquired in response to different detection signals transmitted from different positions and/or orientations relative to the vehicle 721 may enable the collision detection system 101 to obtain a more complete and/or more accurate model of the vehicle 721.
In some embodiments, the communication module 130 may be configured to extend the communication range of the collision detection system 101 using ad hoc networking mechanisms (e.g., ad hoc routing mechanisms). For example, the sensing system 580 may be outside the direct communication range of the communication module 130. As used herein, "direct communication range" refers to the range over which the communication module 130 can communicate directly with another entity (e.g., entity-to-entity communication). The communication module 130 may be configured to route communication through one or more entities located within its direct communication range. For example, the collision detection system 101 may be configured to route communication to/from the sensing system 580 through the sensing system 570.
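A toy sketch of hop-limited relaying of this kind is shown below; it is not the routing protocol of the embodiments, and the message fields are invented for the example.

```python
# Illustrative flood relay: forward a payload with a hop limit so vehicles outside
# direct communication range can still receive collision detection data.
def relay(message, node_id, neighbors, deliver, seen=None):
    """Each node forwards the message at most once; ttl bounds the number of hops."""
    seen = set() if seen is None else seen
    if node_id in seen or message["ttl"] <= 0:
        return
    seen.add(node_id)
    deliver(node_id, message["payload"])
    next_msg = {"payload": message["payload"], "ttl": message["ttl"] - 1}
    for n in neighbors(node_id):
        relay(next_msg, n, neighbors, deliver, seen)
```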
Fig. 8 is a flow diagram of one embodiment of a method 800 for coordinating the operation of sensing systems. At step 810, the method 800 may start and be initialized, as described above.
Step 820 may include generating a request to configure a sensing system of a second land vehicle. The request may be generated by and/or transmitted from the collision detection system 101 of a first land vehicle 102 (e.g., the coordination module 160 of the collision detection system 101). The request may be generated and/or transmitted in response to the collision detection system 101 detecting that the second land vehicle is within communication range (directly or indirectly, as described above), in response to the collision detection system 101 determining that a region and/or object is outside the detection range of the sensing system 110, and/or in response to determining that the object and/or region is within the detection range or envelope of the sensing system of the second land vehicle. Accordingly, the request to configure the sensing system of the second land vehicle may be issued on an as-needed basis. The request may include a compensation offer in exchange for configuring the sensing system. The offer may include, but is not limited to: a payment, a bid, reciprocal data access, and so on. Step 820 may also include receiving an offer (or counteroffer), accepting an offer, and so on, as described above.
In some embodiments, configuring the sensing system at step 820 may include directing the sensing system toward one or more specified regions and/or objects. Directing the sensing system at step 820 may include directing the detection signal of the sensing system toward one or more regions and/or objects, which may include adjusting the phase, amplitude, timing, focus, or other characteristics of the detection signal emitted by the sensing system.
Step 820 may further include configuring the sensing system of the second land vehicle to operate cooperatively with one or more other sensing systems, which may include forming a multi-static sensor comprising at least a portion of the sensing system of the second land vehicle and at least a portion of one or more sensing systems of other land vehicles. Accordingly, the configuration of step 820 may include a multi-static sensor configuration, which may include, but is not limited to: beamforming, forming a phased array, and so on.
Step 820 may further include configuring the sensing system of the second land vehicle to transmit sensor data to one or more other sensing systems and/or collision detection systems, such as the collision detection system 101 of the first land vehicle 102. Transmitting the sensor data may include exchanging sensor data acquired by the sensing system of the second land vehicle, transmitting auxiliary data pertaining to the second vehicle, transmitting collision detection data (e.g., a portion of the collision detection model 122, collision detection alerts, etc.), and so on, as described above.
Step 830 may include generating a collision detection model using sensor data acquired by the sensing system of the second land vehicle (as configured at step 820). Step 830 may include receiving sensor data acquired by a receiver of the second sensing system and transmitted to the collision detection system 101 via the communication module 130. Alternatively, or in addition, step 830 may include the receiver 514 of the sensing system 110 detecting sensor data in response to one or more detection signals emitted by the sensing system of the second land vehicle. Step 830 may also include receiving and/or determining auxiliary data pertaining to the second vehicle. Step 830 may also include transforming the sensor data into one or more other frames of reference and/or coordinate systems, providing collision detection data 222 to other sensing systems and/or vehicles, storing and/or transmitting monitoring data 272, and so on, as described above. Step 830 may also include detecting potential collisions using the collision detection model, generating and/or transmitting one or more alerts in response to detecting a potential collision, taking one or more collision avoidance actions, and so on. Step 830 may further include providing portions of the collision detection model to one or more other vehicles, as described above. The method 800 ends at step 840.
Fig. 9 is a flow diagram of one embodiment of a method 900 for coordinating the operation of sensing systems. At step 910, the method 900 may start and be initialized, as described above.
Step 920 may include configuring the sensing system 110 of the collision detection system 101 in response to a request. The request may include a request to coordinate the operation of the sensing system 110 with one or more sensing systems of other land vehicles, and may be received via the communication module 130. The request may include an offer of consideration in exchange for configuring the sensing system 110. Step 920 may include accepting the offer, generating a counteroffer, and so on, as described above.
Step 920 may include configuring the sensing system 110 to operate cooperatively with the other sensing systems, which may include, but is not limited to: directing the sensing system 110 toward a particular region and/or object, providing sensor data acquired by the sensing system 110 to one or more other vehicles, providing auxiliary data pertaining to the vehicle 102 to one or more other vehicles, forming a multi-static sensor comprising the sensing system 110, and so on. Accordingly, step 920 may include configuring the detection signal generated by the emitter 512 of the sensing system 110 in cooperation with the other sensing systems, which may include, but is not limited to: adjusting the phase, amplitude, timing, focus, or other characteristics of the detection signal, as described above. Step 920 may also include configuring the receiver 514 of the sensing system 110 to receive detection signals generated by the other sensing systems (e.g., to form a phased antenna array).
Step 930 may include generating a collision detection model using sensor data acquired by the sensing system configured at step 920. Accordingly, step 930 may include generating the collision detection model using sensor data acquired by two or more sensing systems operating cooperatively per step 920. Step 930 may include acquiring sensor data in response to one or more detection signals emitted by one or more other sensing systems, receiving sensor data acquired by one or more other sensing systems, receiving auxiliary data from one or more other sensing systems, and so on. Step 930 may also include detecting potential collisions using the collision detection model, generating and/or transmitting one or more alerts in response to detecting a potential collision, taking one or more collision avoidance actions, and so on. Step 930 may also include transforming the sensor data into one or more other frames of reference and/or coordinate systems, providing collision detection data 222 to other sensing systems and/or vehicles, storing and/or transmitting monitoring data 272, and so on, as described above. The method 900 ends at step 940.
In some embodiments, the collision detection system 101 may be configured to store and/or transmit monitoring data 272, which, as described above, may include data for reconstructing and/or modeling the conditions surrounding a collision before, during, and/or after the collision. The monitoring data 272 may include, but is not limited to: the collision detection model 122 and/or portions thereof (e.g., object motion information), sensor data acquired using the sensing system 110, sensor data acquired from other sources (e.g., other sensing systems), auxiliary data of the vehicle 102 and/or other vehicles (e.g., orientation, position, velocity, acceleration, etc.), potential collisions detected by the collision detection system 101, avoidance actions (if any) taken in response to detecting a potential collision, collision kinematics, post-collision kinematic properties, and so on.
Fig. 10 is a block diagram 1000 of one embodiment of a monitoring server 1040. The monitoring server 1040 may operate on a computing device 1030, which may include a processor 1032, memory 1034, a communication module 1036, and persistent storage 1038, as described above. The monitoring server 1040 may be implemented as machine-readable instructions stored on one or more persistent storage media (e.g., the persistent storage 1038). The instructions comprising the monitoring server 1040 may be configured for execution on the computing device 1030 (e.g., configured for execution on the processor 1032 of the computing device 1030). Alternatively, or in addition, portions of the monitoring server 1040 (and the other modules and systems disclosed herein) may be implemented using hardware components, such as application-specific processors, ASICs, FPGAs, PALs, programmable logic devices, PLAs, or similar devices.
An intake module 1042 may be configured to request and/or receive monitoring data 272 from the collision detection systems 101A-N of land vehicles 102A-N. As described above, the monitoring data 272 may include, but is not limited to: collision detection data 222, sensor data used by the collision detection systems 101A-N (sensor data acquired by the collision detection systems 101A-N, sensor data acquired from other sources, etc.), the collision detection model 122 (and/or portions thereof), information pertaining to potential collisions detected by the collision detection systems 101A-N, collision warnings generated by the collision detection systems 101A-N, diagnostic information pertaining to the vehicles 102A-N, collision reconstruction data, object motion characteristics, vehicle operating conditions, auxiliary data (e.g., location, timing information, etc.), and so on.
In some embodiments, the monitoring data 272 may be received via the network 132 (through the communication module 1036 of the computing device 1030). For example, and as described above, one or more of the collision detection systems 101A-N (e.g., the collision detection systems 101A-C) may be configured to maintain and/or transmit the monitoring data 272 during vehicle operation (e.g., in "real time"). Alternatively, one or more of the collision detection systems 101A-N may be configured to transmit the monitoring data 272 periodically, intermittently, and/or in response to detecting particular events or operating conditions. For example, the collision detection systems 101A-N may be configured to transmit the monitoring data 272 in response to detecting that the vehicle is being operated in a particular manner (e.g., speeding, driving erratically, or the like), detecting a particular vehicle, detecting a potential collision, detecting an actual collision, or the like. Alternatively, or in addition, one or more of the collision detection systems 101A-N may be configured to transmit the monitoring data 272 in response to a request from the monitoring server 1040. Accordingly, the collision detection systems 101A-N may be configured to "push" monitoring data 272 to the monitoring server 1040, and/or the monitoring server 1040 may be configured to "pull" monitoring data 272 from one or more of the collision detection systems 101A-N.
As described above, the collision detection systems 101A-N may be configured to transmit the monitoring data 272 intermittently. For example, the collision detection system 101N may be configured to store monitoring data 272 on a memory module 150N, and the monitoring data 272 may be uploaded to the monitoring server 1040 intermittently. For example, the monitoring data 272 may be uploaded when the communication module 130N is activated, when the communication module 130N is in communication with the network 132 (e.g., within communication range of a wireless access point), or under similar circumstances. In another example, the stored monitoring data 272 may be accessed from the memory module 150N by a computing device 1037, which may be configured to transmit the monitoring data 272 to the monitoring server 1040. The stored monitoring data 272 may be accessed when the vehicle 102N is being serviced, when the vehicle 102N is within communication range of the computing device 1037, as part of a post-collision diagnosis, or under similar circumstances. In some embodiments, the computing device 1037 may include a mobile communication device (e.g., a cellular phone), which may access the stored monitoring data 272 via a wireless communication interface (e.g., near-field communication (NFC), or the like).
The monitoring server 1040 may be configured to offer consideration in exchange for the monitoring data 272. The consideration may include one or more of a payment, a bid, reciprocal data access (e.g., access to the stored monitoring data 1072A-N, described below), and the like. The consideration may also include access to features of the monitoring server 1040, such as access to collision alerts 1047 (described below), and so on.
The monitoring data 272 received at the monitoring server 1040 may be processed by the intake module 1042. The intake module 1042 may be configured to process the monitoring data into monitoring data entries 1072A-N and/or to store the monitoring data entries 1072A-N on a non-volatile storage 1054. The intake module 1042 may be further configured to index the monitoring data 1072A-N by one or more indexing criteria, which may include, but are not limited to: time, location, vehicle identifier, detected collisions, and/or other suitable criteria. The indexing criteria may be stored in respective index entries 1073A-N. Alternatively, the indexing criteria may be stored together with the monitoring data entries 1072A-N.
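As a non-limiting illustration of the indexing described above, the Python sketch below keys stored entries by time bucket, coarse location cell, and vehicle identifier; the bucket sizes and field names are illustrative assumptions, not part of the disclosure.

```python
from collections import defaultdict

class MonitoringIndex:
    """Toy in-memory stand-in for the index entries 1073A-N: each stored monitoring
    entry is keyed by time bucket, coarse location cell, and vehicle identifier."""

    def __init__(self, time_bucket_s=60, cell_deg=0.01):
        self.time_bucket_s = time_bucket_s
        self.cell_deg = cell_deg
        self.by_time = defaultdict(set)
        self.by_cell = defaultdict(set)
        self.by_vehicle = defaultdict(set)
        self.entries = {}

    def add(self, entry_id, timestamp, lat, lon, vehicle_id):
        # Store the raw entry and register it under each indexing criterion.
        self.entries[entry_id] = dict(t=timestamp, lat=lat, lon=lon, vehicle=vehicle_id)
        self.by_time[int(timestamp // self.time_bucket_s)].add(entry_id)
        cell = (round(lat / self.cell_deg), round(lon / self.cell_deg))
        self.by_cell[cell].add(entry_id)
        self.by_vehicle[vehicle_id].add(entry_id)

# Example: index one received monitoring entry by its extracted criteria.
index = MonitoringIndex()
index.add("1072A", timestamp=1_700_000_000, lat=47.61, lon=-122.33, vehicle_id="VIN123")
```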
The intake module 1042 may be configured to extract and/or derive indexing criteria from the received monitoring data 272. For example, the monitoring data 272 may comprise time synchronization signals, timestamps, or other time-series data from which time indexing criteria may be determined. Similarly, the monitoring data 272 may comprise auxiliary data (e.g., GPS coordinates) from which location indexing information may be determined. Accordingly, extracting indexing criteria may comprise extracting one or more data streams and/or data fields from the monitoring data 272 (e.g., extracting timestamps and/or time synchronization signals, extracting location coordinates, and so on).
The monitoring data 272 may further comprise information from which indexing criteria can be derived. Deriving indexing criteria may comprise determining the indexing criteria using the monitoring data 272. For example, vehicle identifiers may be derived from the received monitoring data 272, such as a VIN, license plate information, a vehicle RFID, image data (e.g., an image of a license plate), and so on. Deriving indexing criteria may comprise determining a vehicle identifier from sensor data (e.g., imagery in the monitoring data 272), determining the location of a vehicle from vehicle kinematic characteristics, and the like.
In some embodiments, the intake module 1042 may be configured to convert and/or normalize the monitoring data 272 (and/or the index data extracted and/or derived therefrom). For example, the intake module 1042 may be configured to translate timing information into an appropriate time zone, convert and/or translate location information (e.g., from GPS coordinates into another location reference and/or coordinate system), transform collision detection data (e.g., the collision detection model 122 and/or vehicle kinematics information) into a different frame of reference and/or coordinate system, and so on, as described above.
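A worked example (illustrative only) of one such transformation: converting an object position reported in a vehicle's local frame of reference into planar absolute coordinates using the vehicle's own position and heading. The flat-plane approximation and the axis convention are assumptions made for the sketch.

```python
import math

def vehicle_to_absolute(obj_xy, vehicle_xy, heading_rad):
    """Rotate a point from the vehicle frame (x forward, y left) by the vehicle heading
    and translate it by the vehicle position, yielding absolute planar coordinates."""
    x, y = obj_xy
    c, s = math.cos(heading_rad), math.sin(heading_rad)
    xr = c * x - s * y
    yr = s * x + c * y
    return (vehicle_xy[0] + xr, vehicle_xy[1] + yr)

# Example: an object 20 m ahead of a vehicle heading 90 degrees (due "north" in this frame).
print(vehicle_to_absolute((20.0, 0.0), vehicle_xy=(100.0, 50.0), heading_rad=math.pi / 2))
# -> approximately (100.0, 70.0)
```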
In some embodiments, the intake module 1042 may be configured to augment the monitoring data 272. For example, the intake module 1042 may be configured to combine monitoring data 272 pertaining to the same time and/or location (e.g., overlapping times and/or locations). The intake module 1042 may be configured to aggregate "overlapping" monitoring data 272, which may comprise modifying and/or refining portions of the monitoring data 272.
The intake module 1042 may also be configured to authenticate the monitoring data 272, which may include, but is not limited to: verifying a credential of the monitoring data 272, verifying a signature on the monitoring data 272, decrypting the monitoring data 272, and so on. In some embodiments, monitoring data 272 that cannot be authenticated may be rejected (e.g., not admitted to the non-volatile storage 1054 and/or not indexed as described above).
As described above, the intake module 1042 may be configured to request monitoring data from one or more of the vehicles 101A-N via the network 132. A request may specify a time, a location, and/or an identifier of interest. For example, the intake module 1042 may issue a request for monitoring data pertaining to a collision involving one or more of the vehicles 101A-N. The request may specify the time and/or location of the collision and may identify the vehicles involved in the collision. The time and/or location may be specified as ranges, such as a time range before, during, and after the collision, collision locations within a threshold distance of the specified location, and so on. The request may also include information identifying the vehicles involved in the collision. In response to the request, the collision detection systems 101A-N may determine whether any stored monitoring data satisfies the request and, if so, may transmit the monitoring data 272 to the monitoring server 1040, as described above. Alternatively, or in addition, the collision detection systems 101A-N may be configured to store the request and to transmit monitoring data 272 in response to acquiring monitoring data 272 that satisfies the request.
In some embodiments, the monitoring server 1040 may comprise a notification module 1044 configured to determine whether received monitoring data 272 indicates that a collision has occurred (or is predicted to occur). The notification module 1044 may be configured to transmit one or more collision notifications 1045 and/or collision alerts 1047. The notification module 1044 may be configured to coordinate with an emergency response entity 1060 in response to receiving monitoring data 272 indicative of a collision; the monitoring server 1040 may transmit a collision notification 1045 to the emergency response entity 1060 or to another entity (e.g., a public safety entity, a traffic control entity, or the like). Transmitting the collision notification 1045 may comprise extracting collision information from the monitoring data 272, which, as described above, may include, but is not limited to: the collision detection model, sensor data, kinematic information pertaining to the collision (e.g., for determining collision velocities, estimating the forces involved in the collision, etc.), estimated resting locations of the vehicles (and/or vehicle occupants) involved in the collision, the location of the collision, the time of the collision, the number of vehicles involved in the collision, an estimated severity of the collision, and so on. Transmitting the collision notification 1045 may comprise identifying the emergency response entity 1060 based on the location of the collision, transforming and/or converting the monitoring data 272 into a format suitable for the emergency response entity 1060, and the like.
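The following Python sketch (illustrative, not the disclosed implementation) shows how a notification payload of the kind described above might be assembled from received monitoring data; every field name is an assumption made for the example.

```python
def build_collision_notification(monitoring_data):
    """Assemble a collision notification 1045 payload from fields that may be present
    in received monitoring data; field names here are illustrative, not the patent's."""
    if not monitoring_data.get("collision_detected") and not monitoring_data.get("collision_predicted"):
        return None  # nothing to notify
    return {
        "location": monitoring_data.get("location"),            # e.g., (lat, lon)
        "time": monitoring_data.get("time"),
        "vehicles_involved": monitoring_data.get("vehicle_ids", []),
        "estimated_severity": monitoring_data.get("severity"),  # e.g., derived from impact forces
        "kinematics": monitoring_data.get("kinematics"),        # velocities, estimated forces
    }

# Example: a record flagged as an actual collision produces a notification dict.
note = build_collision_notification({
    "collision_detected": True,
    "location": (47.61, -122.33),
    "time": 1_700_000_100,
    "vehicle_ids": ["VIN123", "VIN456"],
    "severity": "high",
})
print(note is not None)
```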
The notification module 1044 may also be configured to provide collision alerts 1047 to one or more of the collision detection systems 101A-N. Collision alerts 1047 may be transmitted to vehicles 102A-N in the vicinity of the collision and/or vehicles 102A-N that may be traveling toward the collision. A collision alert 1047 may comprise information pertaining to the location and/or time of the collision, the estimated severity of the collision, and so on, as described above. The collision detection systems 101A-N may alert vehicle operators to the collision and/or, in response to receiving the collision alert 1047, recommend an alternative route to the navigation system of the vehicle 102A-N.
The notification module 1044 may be configured to transmit collision notifications 1045 and/or collision alerts 1047 to other objects and/or entities, such as pedestrians, mobile communication devices, and so on. For example, in some embodiments, the notification module 1044 may be configured to broadcast a collision alert 1047 to mobile communication devices (of one or more pedestrians and/or vehicle operators) via one or more wireless transmitters of the network 132 (e.g., cellular data transceivers). The collision alert 1047 may indicate that a collision has occurred and/or is predicted to occur, as described above.
In another example, the monitoring server 1040 may respond to requests from the emergency service entity 1060. For example, the emergency service entity 1060 may request data pertaining to a particular vehicle, such as a vehicle that is the subject of an AMBER Alert™. The monitoring server 1040 may request data pertaining to the vehicle from the vehicles 101A-N. In response to receiving relevant monitoring data 272, the monitoring server 1040 may transmit the monitoring data 272 to the emergency service entity 1060. Transmitting the monitoring data 272 may comprise transforming and/or converting the monitoring data 272 into a format suitable for the emergency service entity 1060, as described above. The monitoring server 1040 may provide the monitoring data 272 as it is received (e.g., in "real time") and/or may provide monitoring data stored on the non-volatile storage 1054.
As described above, the intake module 1042 may be configured to store and/or index the monitoring data 1072A-N on the non-volatile storage 1054. The monitoring data 1072A-N may be retained on the non-volatile storage 1054 for a predetermined period of time. In some embodiments, monitoring data 1072A-N pertaining to a collision (and/or a potential collision) may be retained, whereas other monitoring data 1072A-N may be removed after the predetermined period of time (and/or moved to long-term storage, such as a tape backup or the like).
The monitoring server 1040 may be configured to respond to requests 1081 for monitoring data from one or more requesting entities 1080A-N. The requesting entities 1080A-N may include, but are not limited to: individuals, businesses (e.g., insurance companies), investigative entities (e.g., law enforcement agencies), adjudicative entities (e.g., courts, dispute resolvers), and so on. A request 1081 for monitoring data may be generated by a computing device, such as a laptop, a notebook computer, a tablet computer, a smartphone, or the like, and may include one or more request criteria, such as a time, a location, a vehicle identifier, and so on.
The monitoring server 1040 may comprise a query module 1046 configured to respond to the requests 1081 for monitoring data. The query module 1046 may extract the request criteria from a request and may determine whether the non-volatile storage comprises monitoring data 1072A-N corresponding to the request (e.g., monitoring data pertaining to the time and/or location specified in the request 1081). The determination may be made by comparing the criteria of the request 1081 to the entries 1072A-N and/or the index entries 1073A-N. The query module 1046 may generate a response 1083, which may comprise the portions of the monitoring data 1072A-N that satisfy the request. Generating the response 1083 may comprise transforming and/or converting the monitoring data 1072A-N (and/or portions thereof), as described above. For example, a requesting entity 1080A-N may be the owner of a vehicle involved in a collision, and the request 1081 may request monitoring data 1072A-N pertaining to the time and/or location of the collision. The monitoring data 1072A-N may be used to reconstruct the circumstances surrounding the collision, for example to determine fault and/or insurance coverage for the collision.
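As a non-limiting sketch of the comparison described above, the Python below filters stored entries by a time range, a location radius, and an optional vehicle identifier; the thresholds and field names are assumptions for illustration.

```python
import math

def matches(entry, criteria):
    """Return True if a stored monitoring entry satisfies every criterion in the request."""
    t0, t1 = criteria.get("time_range", (float("-inf"), float("inf")))
    if not (t0 <= entry["t"] <= t1):
        return False
    if "vehicle" in criteria and entry["vehicle"] != criteria["vehicle"]:
        return False
    if "near" in criteria:
        (lat, lon), radius_deg = criteria["near"]
        if math.hypot(entry["lat"] - lat, entry["lon"] - lon) > radius_deg:
            return False
    return True

def query(entries, criteria):
    """Select the entry identifiers whose indexed fields satisfy the request 1081."""
    return [eid for eid, entry in entries.items() if matches(entry, criteria)]

# Example: find entries within 30 seconds of a collision and within ~0.005 degrees of it.
entries = {"1072A": dict(t=1_700_000_000, lat=47.61, lon=-122.33, vehicle="VIN123")}
print(query(entries, {"time_range": (1_699_999_970, 1_700_000_030),
                      "near": ((47.61, -122.33), 0.005)}))
```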
In some embodiments, the monitoring server 1040 may provide access to the monitoring entries 1072A-N in exchange for consideration, such as a payment, a bid, reciprocal data access (e.g., access to the monitoring data 272 of one or more vehicles of the requesting entity 1080A-N), and so on. Accordingly, a request 1081 may include an offer and/or a payment. The query module 1046 may determine whether the offer of the request 1081 is sufficient (e.g., satisfies one or more policy rules). The query module 1046 may reject the request, which may include transmitting an indication that the request was not satisfied, transmitting a counteroffer to the requesting entity 1080A-N, and so on. Accepting the request may include transferring the payment (or other exchange) and transmitting the response 1083 to the requesting entity 1080A-N, as described above. Alternatively, or in addition, the query module 1046 may be configured to generate a bill and/or invoice in response to providing access to one or more of the monitoring entries 1072A-N. The bill and/or invoice may be generated based on a predetermined price list, which may be provided to the requesting entities 1080A-N. The bill and/or invoice may be transmitted to the requesting entity 1080A-N via the network 132.
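A minimal, purely illustrative sketch of evaluating an offer against a predetermined price list follows; the 80% counteroffer policy is an invented example, not a rule taken from the disclosure.

```python
def evaluate_offer(offer, price_list):
    """Compare an offer against a predetermined price list and return
    ('accept' | 'counter' | 'reject', amount); the policy is an illustrative assumption."""
    asking = price_list.get(offer.get("product"))
    if asking is None:
        return ("reject", None)
    amount = offer.get("amount", 0.0)
    if amount >= asking:
        return ("accept", amount)
    if amount >= 0.8 * asking:          # close enough to counter at the list price
        return ("counter", asking)
    return ("reject", None)

# Example: a low offer for collision-window monitoring data draws a counteroffer.
print(evaluate_offer({"product": "collision_window", "amount": 45.0},
                     price_list={"collision_window": 50.0}))
```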
In some embodiments, the query module 1046 is configured to determine whether a requesting entity 1080A-N is authorized to access the stored monitoring data (monitoring entries 1072A-N), which may comprise authenticating the requesting entity 1080A-N identified in the request 1081, verifying a credential provided by the requesting entity 1080A-N, and so on. Authorization to access the stored monitoring entries 1072A-N may be based on one or more access control data structures 1074 maintained by the monitoring server 1040. The access control data structures 1074 may comprise any suitable data structure for determining access rights, such as an access control list (ACL), role-based access controls, group permissions, and so on. For example, a requesting entity 1080A may subscribe to the monitoring server 1040 and, accordingly, may be identified as an "authorized entity" in one or more of the access control data structures 1074. The monitoring server 1040 may allow the requesting entity 1080A to access the monitoring entries 1072A-N in response to authenticating the identity of the requesting entity 1080A and/or verifying that the requesting entity 1080A is included in one or more of the access control data structures 1074.
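For illustration only, a toy access-control-list check of the kind described above; the entity identifiers and permission names are assumptions.

```python
# Minimal access-control-list check; entity identifiers and permissions are illustrative.
ACL_1074 = {
    "1080A": {"read_entries"},          # subscriber authorized to read monitoring entries
    "1080B": set(),                     # authenticated but not authorized
}

def is_authorized(entity_id, permission, acl=ACL_1074):
    """Return True if the (already authenticated) requesting entity holds the permission."""
    return permission in acl.get(entity_id, set())

print(is_authorized("1080A", "read_entries"))   # True
print(is_authorized("1080C", "read_entries"))   # False: unknown entity
```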
FIG. 11 is a flow diagram of one embodiment of a method 1100 for providing a monitoring service. At step 1110, the method 1100 starts and is initialized, as described above.
Step 1120 may comprise receiving monitoring data 272 from one or more of the collision detection systems 101A-N. The monitoring data 272 may be received in response to a request from the monitoring server 1040; in response to the collision detection systems 101A-N transmitting monitoring data 272 during operation, at particular time intervals, and/or in response to a particular event (e.g., a collision, the collision detection systems 101A-N establishing a communication connection with the network 132, etc.); and/or in response to a computing device 1037 accessing stored monitoring data 272, as described above.
Step 1120 may further comprise offering and/or providing consideration in exchange for the monitoring data 272. The exchange may comprise providing a payment for the monitoring data 272, bidding for access to the monitoring data 272, providing reciprocal access, and so on, as described above.
Step 1130 may comprise storing the monitoring data on a non-volatile storage 1054. Step 1130 may further comprise indexing the monitoring data by one or more indexing criteria, which may include, but are not limited to: time, location, vehicle identifier, and so on. Accordingly, step 1130 may comprise extracting and/or deriving the indexing criteria from the monitoring data 272 received at step 1120, as described above. In some embodiments, step 1130 further comprises transforming and/or converting the monitoring data 272 (e.g., transforming the monitoring data 272 from the frame of reference of a particular vehicle 102A-N into an absolute frame of reference, or the like).
The monitoring data 272 received at step 1120 may indicate that a collision has occurred and/or is predicted to occur. Accordingly, step 1130 may further comprise generating and/or transmitting a collision notification 1045 to an emergency service entity 1060. As described above, the collision notification 1045 may identify the location and/or time of the collision, may include estimated collision forces (and the vehicle kinematic characteristics that produced the collision forces), and so on. Step 1130 may further comprise generating and/or transmitting one or more collision alerts to one or more of the vehicles 102A-N, mobile communication devices, pedestrians, emergency service entities, and so on, as described above. The method 1100 ends at step 1140.
FIG. 12 is a flow diagram of another embodiment of a method 1200 for providing a monitoring service. At step 1210, the method 1200 starts and is initialized, as described above.
Step 1220 may comprise receiving a request for monitoring data (e.g., data of one or more of the monitoring entries 1072A-N). The request of step 1220 may be received from a requesting entity 1080A-N via the network 132. The request may include request criteria, such as a time, a location, a vehicle identifier, and so on, as described above. The request may also include an offer of consideration in exchange for satisfying the request. The offer may include, but is not limited to: a payment, a bid, reciprocal data access, and so on. Step 1220 may comprise determining whether the offer is acceptable and, if not, rejecting the offer and/or generating and/or transmitting a counteroffer to the requesting entity 1080A-N. Step 1220 may further comprise authenticating the requesting entity and/or determining whether the requesting entity is authorized to access the stored monitoring entries 1072A-N, as described above (e.g., based on one or more access control data structures 1074).
Step 1230 may comprise identifying monitoring data that satisfies the request (e.g., monitoring data associated with the time, location, and/or vehicle identifier specified in the request). Accordingly, step 1230 may comprise identifying one or more monitoring entries 1072A-N that satisfy the request criteria, which may comprise comparing the criteria of the request to the entries 1072A-N and/or the index entries 1073A-N, as described above. For example, step 1230 may comprise identifying monitoring entries 1072A-N associated with the time specified in the request, associated with the location specified in the request, associated with the vehicle identifier specified in the request, and so on.
Step 1240 may comprise generating and/or transmitting a response 1083 to the requesting entity 1080A-N. Step 1240 may comprise transforming and/or converting the data of the monitoring entries 1072A-N identified at step 1230, as described above. The method 1200 ends at step 1250.
The disclosure has been described above with reference to various illustrative embodiments. However, those skilled in the art will recognize that changes and modifications may be made to the illustrative embodiments without departing from the scope of the disclosure. For example, the various operational steps, as well as the components for carrying out the operational steps, may be implemented in alternative ways depending upon the particular application or in consideration of any number of cost functions associated with the operation of the system (e.g., one or more of the steps may be deleted, modified, or combined with other steps). Therefore, this disclosure is to be regarded as illustrative rather than restrictive, and all such modifications are intended to be included within its scope. Likewise, benefits, other advantages, and solutions to problems have been described above with regard to various embodiments. However, benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as critical, required, or essential features or elements. As used herein, the terms "comprises," "comprising," and any other variations thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Also, as used herein, the terms "coupled," "coupling," and any other variations thereof are intended to cover a physical connection, an electrical connection, a magnetic connection, an optical connection, a communicative connection, a functional connection, and/or any other connection.
Additionally, as will be appreciated by one of ordinary skill in the art, principles of the present disclosure may be embodied in a computer program product on a machine-readable storage medium having machine-readable program code means embodied in the storage medium. Any tangible, non-transitory computer-readable storage medium may be utilized, including magnetic storage devices (hard disks, floppy disks, and the like), optical storage devices (CD-ROMs, DVDs, Blu-ray discs, and the like), flash memory, and/or the like. These computer program instructions may be loaded onto a general-purpose computer, a special-purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions that execute on the computer or other programmable data processing apparatus create means for implementing the specified functions. These computer program instructions may also be stored in a machine-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture, including implementing means that implement the specified functions. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process, such that the instructions that execute on the computer or other programmable apparatus provide steps for implementing the specified functions.
While the principles of this disclosure have been shown in various embodiments, many modifications of structure, arrangement, proportion, elements, materials, and components, which are particularly adapted for a specific environment and operating requirements, may be used without departing from the principles and scope of this disclosure. These and other changes or modifications are intended to be included within the scope of the present disclosure.
Several aspects of the subject matter described herein are set forth in the following numbered clauses:
1. A method, comprising:
generating, at a first land vehicle, a request to configure a sensing system of a second land vehicle; and
generating a collision detection model using sensor data acquired by use of the sensing system of the second land vehicle.
2. The method of clause 1, further comprising transmitting the request to the second land vehicle.
3. The method of clause 1, further comprising receiving, in response to the request, sensor data acquired by use of the sensing system of the second land vehicle.
4. The method of clause 1, further comprising:
acquiring sensor data by use of a sensing system of the first land vehicle; and
generating the collision detection model using the sensor data acquired by use of the sensing system of the second land vehicle and the sensor data acquired by use of the sensing system of the first land vehicle.
5. The method of clause 1, further comprising:
selecting the second land vehicle based at least in part on at least one of: a position of the second land vehicle, an orientation of the second land vehicle, a sensing capability of the second land vehicle, a position of the second land vehicle relative to a specified region, a position of the second land vehicle relative to a specified object, an orientation of the second land vehicle relative to the specified region, and an orientation of the second land vehicle relative to the specified object.
6. The method of clause 1, wherein the sensor data acquired by use of the sensing system of the second land vehicle comprises at least one of a range, a velocity, an angle, an angle-dependent range, and an angular extent of an object relative to the second land vehicle.
7. The method of clause 1, wherein the request identifies a region, and wherein the sensing system of the second land vehicle directs a detection signal toward the identified region in response to the request.
8. The method of clause 7, wherein the detection signal is configured to be detected by a receiver of a sensing system of the first land vehicle.
9. The method of clause 1, wherein the request identifies an object, and wherein the sensing system of the second land vehicle directs a detection signal toward the identified object in response to the request.
10. The method of clause 9, wherein the detection signal is configured to be detected by a receiver of a sensing system of the first land vehicle.
11. The method of clause 1, further comprising providing sensor data acquired by use of a sensing system of the first land vehicle to another land vehicle.
12. The method of clause 1, further comprising forming a multistatic sensor comprising at least a portion of a sensing system of the first land vehicle and at least a portion of the sensing system of the second land vehicle.
13. The method of clause 12, wherein the multistatic sensor comprises one or more detection signal emitters, the one or more detection signal emitters including a detection signal emitter of the first land vehicle.
14. The method of clause 12, wherein the multistatic sensor comprises one or more receivers, the one or more receivers including a receiver of the first land vehicle.
15. The method of clause 12, wherein forming the multistatic sensor comprises configuring the sensing system of the first land vehicle to receive a detection signal emitted by the sensing system of the second land vehicle.
16. The method of clause 12, wherein forming the multistatic sensor comprises configuring the sensing system of the first land vehicle to emit a sensing signal configured to be detected by a receiver of the second land vehicle.
17. The method of clause 16, further comprising generating the collision detection model using sensor data acquired by use of the sensing system of the second land vehicle in response to the sensing signal emitted by the sensing system of the first land vehicle.
18. The method of clause 12, wherein forming the multistatic sensor comprises steering a sensing signal toward a predetermined region.
19. The method of clause 12, wherein forming the multistatic sensor comprises beamforming the sensing system of the first land vehicle and the sensing system of the second land vehicle.
20. The method of clause 19, wherein beamforming comprises directing a detection signal of the multistatic radar in a predetermined direction.
21. The method of clause 19, wherein beamforming comprises modifying a phase of a detection signal emitted by one of the sensing system of the first land vehicle and the sensing system of the second land vehicle.
22. The method of clause 19, wherein beamforming comprises modifying an amplitude of a detection signal emitted by one of the sensing system of the first land vehicle and the sensing system of the second land vehicle.
23. The method of clause 12, wherein forming the multistatic sensor comprises forming a multistatic radar comprising at least a portion of a radar sensing system of the first land vehicle and at least a portion of a radar sensing system of the second land vehicle.
24. The method of clause 12, wherein forming the multistatic sensor comprises forming a multistatic radar comprising a plurality of radar receivers, the plurality of radar receivers including a radar receiver of the first land vehicle.
25. The method of clause 12, wherein forming the multistatic sensor comprises forming a multistatic radar comprising a plurality of radar transmitters, the plurality of radar transmitters including a radar transmitter of the first land vehicle.
26. The method of clause 12, wherein forming the multistatic sensor comprises forming a bistatic radar comprising a radar transmitter of the first land vehicle and a radar receiver of the second land vehicle.
27. The method of clause 12, wherein forming the multistatic sensor comprises forming a bistatic radar comprising a radar receiver of the first land vehicle and a radar transmitter of the second land vehicle.
28. The method of clause 12, wherein forming the multistatic sensor comprises forming a phased array comprising at least a portion of the sensing system of the first land vehicle and at least a portion of the sensing system of the second land vehicle.
29. The method of clause 28, further comprising steering the phased array toward a predetermined direction.
30. The method of clause 12, further comprising:
acquiring auxiliary data pertaining to the second land vehicle; and
forming the multistatic sensor using the acquired auxiliary data.
31. The method of clause 30, wherein the acquired auxiliary data comprises one of an acceleration, a velocity, a position, and an orientation of the second land vehicle.
32. The method of clause 30, further comprising requesting the auxiliary data from the second land vehicle.
33. The method of clause 1, wherein the sensing system of the second land vehicle comprises one of an electro-optical sensor, a three-dimensional sensor, a light detection and ranging (LIDAR) sensor, an acoustic sensor, an ultrasonic sensor, an imaging sensor, a radar, an electromagnetic sensor, a magnetic sensor, and a capacitive sensor.
34. The method of clause 1, wherein a sensing system of the first land vehicle comprises one of an electro-optical sensor, a three-dimensional sensor, a light detection and ranging (LIDAR) sensor, an acoustic sensor, an ultrasonic sensor, an imaging sensor, a radar, an electromagnetic sensor, a magnetic sensor, and a capacitive sensor.
35. The method of clause 1, further comprising generating the request to configure the sensing system of the second land vehicle in response to determining that kinematic information pertaining to an object fails to satisfy a threshold.
36. The method of clause 35, further comprising determining that a signal-to-noise ratio of sensor data pertaining to the object fails to satisfy the threshold.
37. The method of clause 35, further comprising determining that an orientation of a sensing system of the first land vehicle prevents determining one or more motion characteristics of the object.
38. The method of clause 1, further comprising generating the request in response to determining that an object is outside of a detection envelope of a sensing system of the first land vehicle.
39. The method of clause 1, further comprising generating the request in response to determining that an object is within a detection envelope of the sensing system of the second land vehicle.
40. The method of clause 38, further comprising determining that the object is obscured by another object.
41. The method of clause 38, further comprising determining that a dead-reckoned position of the object is outside of a detection range of the sensing system of the first land vehicle.
42. The method of clause 1, wherein the request comprises a payment offer.
43. The method of clause 1, wherein the request comprises a bid.
44. The method of clause 1, wherein the request comprises an offer of access to the collision detection model.
45. The method of clause 1, wherein the request comprises an offer of access to sensor data acquired by use of a sensing system of the first land vehicle.
46. The method of clause 1, wherein the request comprises an offer of access to stored monitoring data.
47. The method of clause 1, wherein the collision detection model is generated at least in part at the first land vehicle.
48. The method of clause 1, further comprising transmitting at least a portion of the collision detection model to the second land vehicle.
49. The method of clause 1, wherein the collision detection model is generated at least in part at the second land vehicle.
50. The method of clause 1, wherein the sensing system of the second land vehicle is configured to acquire sensor data pertaining to an object.
51. The method of clause 50, wherein the object is one of a vehicle, a pedestrian, a road obstruction, and an animal.
52. The method of clause 50, wherein the collision detection model comprises an orientation of the object.
53. The method of clause 50, wherein the collision detection model comprises a size of the object.
54. The method of clause 50, wherein the collision detection model comprises a position of the object.
55. The method of clause 50, wherein the collision detection model comprises a velocity of the object.
56. The method of clause 50, wherein the collision detection model comprises an acceleration of the object.
57. The method of clause 50, wherein the collision detection model comprises an identity of the object.
58. The method of clause 1, further comprising translating the sensor data acquired by use of the sensing system of the second land vehicle into a frame of reference of the first land vehicle.
59. The method of clause 1, further comprising translating the collision detection model into an absolute frame of reference.
60. The method of clause 1, further comprising translating the collision detection model into a frame of reference of the first land vehicle.
61. The method of clause 1, wherein the collision detection model comprises a position of an object relative to the first land vehicle.
62. The method of clause 1, wherein the collision detection model comprises an orientation of an object relative to the first land vehicle.
63. The method of clause 1, wherein the collision detection model comprises a velocity of an object relative to the first land vehicle.
64. The method of clause 1, wherein the collision detection model comprises an acceleration of an object relative to the first land vehicle.
65. The method of clause 1, further comprising translating sensor data acquired by use of a sensing system of the first land vehicle into a frame of reference of the second land vehicle.
66. The method of clause 1, further comprising translating the collision detection model into a frame of reference of the second land vehicle.
67. The method of clause 1, wherein the collision detection model comprises a position of an object relative to the second land vehicle.
68. The method of clause 1, wherein the collision detection model comprises an orientation of an object relative to the second land vehicle.
69. The method of clause 1, wherein the collision detection model comprises a velocity of an object relative to the second land vehicle.
70. The method of clause 1, wherein the collision detection model comprises an acceleration of an object relative to the second land vehicle.
71. The method of clause 1, further comprising translating the sensor data acquired by use of the sensing system of the second land vehicle into a frame of reference of a third land vehicle.
72. The method of clause 1, further comprising translating the collision detection model into a frame of reference of the third land vehicle.
73. The method of clause 1, wherein the collision detection model comprises a position of an object relative to a third land vehicle.
74. The method of clause 1, wherein the collision detection model comprises an orientation of an object relative to a third land vehicle.
75. The method of clause 1, wherein the collision detection model comprises a velocity of an object relative to a third land vehicle.
76. The method of clause 1, wherein the collision detection model comprises an acceleration of an object relative to a third land vehicle.
77. The method of clause 1, further comprising detecting a potential collision between objects in the collision detection model.
78. The method of clause 1, further comprising detecting a potential collision between an object and the first land vehicle by use of the collision detection model.
79. The method of clause 1, further comprising detecting a potential collision between an object and the second land vehicle by use of the collision detection model.
80. The method of clause 1, further comprising detecting a potential collision between an object and a third land vehicle by use of the collision detection model.
81. The method of clause 77, further comprising determining a time of occurrence of the potential collision.
82. The method of clause 77, further comprising determining a location of the potential collision.
83. The method of clause 77, further comprising determining a collision velocity of the potential collision.
84. The method of clause 77, further comprising estimating one or more collision forces of the potential collision.
85. The method of clause 77, further comprising estimating one or more post-collision kinematic characteristics of objects in the potential collision.
86. The method of clause 77, further comprising generating an alert in response to detecting the potential collision.
87. The method of clause 77, further comprising providing the alert to objects predicted to be involved in the potential collision.
88. The method of clause 77, further comprising activating a collision avoidance system of the first land vehicle in response to detecting the potential collision.
89. The method of clause 77, further comprising activating a collision warning system of the first land vehicle in response to detecting the potential collision.
90. The method of clause 89, wherein activating the collision warning system comprises activating an audible warning system of the first land vehicle.
91. The method of clause 89, wherein activating the collision warning system comprises activating an electro-optical emitter of the first land vehicle.
92. The method of clause 77, further comprising displaying an alert in response to detecting the potential collision.
93. The method of clause 77, further comprising generating an audible alert within the first land vehicle in response to detecting the potential collision.
94. The method of clause 77, further comprising displaying a visual indication of the potential collision on a user display of the first land vehicle.
95. The method of clause 77, further comprising displaying a visual indication of the potential collision on a heads-up display of the first land vehicle.
96. The method of clause 95, further comprising identifying, on the heads-up display of the first land vehicle, an object of the potential collision.
97. The method of clause 77, further comprising generating a collision avoidance instruction based on the collision detection model in response to detecting the potential collision.
98. The method of clause 97, wherein the collision avoidance instruction comprises one of a deceleration instruction, an acceleration instruction, and a steering instruction.
99. The method of clause 97, further comprising generating an alert comprising the collision avoidance instruction.
100. The method of clause 77, further comprising:
predicting an outcome of the potential collision by use of the collision detection model; and
generating a collision avoidance instruction using the predicted outcome.
101. The method of clause 77, further comprising transmitting an alert to the second land vehicle in response to detecting the potential collision.
102. The method of clause 101, wherein the alert comprises a position of an object of the potential collision relative to the second land vehicle.
103. The method of clause 101, wherein the alert comprises a velocity of an object of the potential collision relative to the second land vehicle.
104. The method of clause 101, further comprising generating a collision avoidance instruction for the second land vehicle.
105. The method of clause 101, further comprising:
predicting an outcome of the potential collision using the collision detection model; and
generating a collision avoidance instruction for the second land vehicle using the predicted outcome.
106. The method of clause 77, further comprising transmitting an alert to a third land vehicle in response to detecting the potential collision.
107. The method of clause 106, wherein transmitting the alert comprises broadcasting the alert.
108. The method of clause 106, wherein the third land vehicle is involved in the potential collision.
109. The method of clause 106, wherein the alert comprises a position of an object of the potential collision relative to the third land vehicle.
110. The method of clause 106, wherein the alert comprises a velocity of an object of the potential collision relative to the third land vehicle.
111. The method of clause 106, further comprising:
predicting an outcome of the potential collision using the collision detection model; and
generating an avoidance instruction for the third land vehicle based on the predicted outcome.
112. The method of clause 111, further comprising generating a collision avoidance instruction for the third land vehicle based on the predicted outcome.
113. The method of clause 1, further comprising requesting auxiliary data from the second land vehicle.
114. The method of clause 1, further comprising receiving auxiliary data from the second land vehicle.
115. The method of clause 114, further comprising using the auxiliary data to translate the collision detection model into a frame of reference of the second land vehicle.
116. The method of clause 114, further comprising using the auxiliary data to translate sensor data acquired by use of the sensing system of the second land vehicle into a frame of reference of the first land vehicle.
117. The method of clause 114, further comprising using the auxiliary data to transform sensor data received at a sensing system of the first land vehicle in response to a detection signal emitted by the sensing system of the second land vehicle.
118. The method of clause 114, wherein the auxiliary data comprises one of an acceleration, a velocity, a position, and an orientation of the second land vehicle.
119. The method of clause 114, wherein the auxiliary data comprises information pertaining to a capability of the sensing system of the second land vehicle.
120. The method of clause 114, wherein the auxiliary data comprises Global Positioning System (GPS) coordinates.
121. The method of clause 114, wherein the auxiliary data comprises a speedometer measurement.
122. The method of clause 114, wherein the auxiliary data comprises a time synchronization signal.
123. The method of clause 114, further comprising receiving auxiliary data from a third land vehicle.
124. The method of clause 1, further comprising:
generating the collision detection model at the first land vehicle; and
providing at least a portion of the collision detection model to another land vehicle.
125. The method of clause 1, further comprising receiving at least a portion of a second collision detection model from another land vehicle.
126. The method of clause 125, further comprising combining the collision detection model and the second collision detection model.
127. The method of clause 125, further comprising refining the collision detection model using the second collision detection model.
128. The method of clause 1, further comprising receiving a request to access sensor data acquired by use of the sensing system of the second land vehicle and a sensing system of the first land vehicle.
129. The method of clause 128, wherein the request to access the sensor data comprises a payment offer, the method further comprising providing access to the requested sensor data in response to the payment offer satisfying a predetermined payment threshold.
130. The method of clause 128, wherein the request to access the sensor data comprises a bid, the method further comprising providing access to the requested sensor data in response to the bid satisfying a predetermined bid threshold.
131. The method of clause 128, wherein the request to access the sensor data comprises an offer of access to collision detection data of a designated vehicle.
132. The method of clause 131, the method further comprising:
providing access to the requested sensor data in response to determining that the collision detection data of the designated vehicle is usable to generate the collision detection model.
133. The method of clause 1, the method further comprising:
requesting access to a collision detection model generated by a designated vehicle; and
generating the collision detection model using the collision detection model of the designated vehicle.
134. The method of clause 133, wherein requesting the collision detection model comprises providing a payment in exchange for access to the requested collision detection model.
135. The method of clause 133, wherein requesting the collision detection model comprises bidding for access to the requested collision detection model.
136. The method of clause 133, wherein requesting the collision detection model comprises offering access to sensor data in exchange for access to the requested collision detection model.
137. The method of clause 1, further comprising receiving a request to access the collision detection model.
138. The method of clause 137, wherein the request to access the collision detection model comprises a payment, the method further comprising providing access to the collision detection model in response to the payment satisfying a predetermined payment threshold.
139. The method of clause 137, wherein the request to access the collision detection model comprises a bid, the method further comprising providing access to the collision detection model in response to the bid satisfying a predetermined bid threshold.
140. The method of clause 137, wherein the request to access the collision detection model comprises an offer of access to collision detection data of a designated vehicle.
141. The method of clause 140, further comprising providing access to the collision detection model in response to determining that the collision detection data of the designated vehicle is usable to generate the collision detection model.
142. The method of clause 1, further comprising establishing a communication link between the first land vehicle and the second land vehicle.
143. The method of clause 142, wherein the communication link comprises one of a peer-to-peer network, an ad hoc network, an infrastructure network, a wireless network, a cellular data network, and an electro-optical network.
144. The method of clause 142, further comprising establishing the communication link in response to detecting one of the first land vehicle and the second land vehicle within communication range.
145. The method of clause 142, further comprising establishing the communication link in response to detecting that the first land vehicle is proximate to the second land vehicle.
146. The method of clause 145, wherein detecting that the first land vehicle is proximate to the second land vehicle comprises one of broadcasting a communication discovery signal and detecting a communication discovery signal broadcast.
147. The method of clause 145, wherein detecting that the first land vehicle is proximate to the second land vehicle comprises determining a location of one of the first land vehicle and the second land vehicle.
148. The method of clause 147, further comprising registering the location of the one of the first land vehicle and the second land vehicle.
149. The method of clause 1, further comprising transmitting, via a network, monitoring data comprising at least a portion of the collision detection model.
150. The method of clause 149, further comprising transmitting the monitoring data to one of a traffic control system, a network-accessible service, an insurance company, and a law enforcement agency.
151. The method of clause 149, further comprising securing the monitoring data transmitted via the network.
152. The method of clause 151, wherein securing the monitoring data comprises signing the monitoring data.
153. The method of clause 151, wherein securing the monitoring data comprises encrypting the monitoring data.
154. The method of clause 149, further comprising applying a timestamp to the monitoring data.
155. The method of clause 149, further comprising including a location identifier in the monitoring data.
156. The method of clause 1, further comprising storing monitoring data comprising at least a portion of the collision detection model on a persistent storage medium.
157. The method of clause 156, further comprising securing the stored monitoring data.
158. The method of clause 156, further comprising transmitting the stored monitoring data on a network.
159. A collision detection system, comprising:
a coordination module of a first land vehicle configured to generate a request to configure a sensing system of a second land vehicle; and
a processing module configured to generate a collision detection model using sensor data acquired by use of the sensing system of the second land vehicle.
160. The collision detection system of clause 159, further comprising a communication module configured to transmit the request to the second land vehicle.
161. The collision detection system of clause 159, further comprising a communication module configured to receive, in response to the request, sensor data acquired by use of the sensing system of the second land vehicle.
162. The collision detection system of clause 159, further comprising a communication module configured to receive sensor data acquired by use of the sensing system of the second land vehicle, wherein the processing module is configured to generate the collision detection model using the sensor data acquired by use of the sensing system of the second land vehicle and sensor data acquired by use of a sensing system of the first land vehicle.
163. The collision detection system of clause 159, wherein the coordination module is configured to select the second land vehicle based at least in part on at least one of: a position of the second land vehicle, an orientation of the second land vehicle, a sensing capability of the second land vehicle, a position of the second land vehicle relative to a specified region, a position of the second land vehicle relative to a specified object, an orientation of the second land vehicle relative to the specified region, and an orientation of the second land vehicle relative to the specified object.
164. The collision detection system of clause 159, wherein the sensor data acquired by use of the sensing system of the second land vehicle comprises at least one of a range, a velocity, an angle, an angle-dependent range, and an angular extent of an object relative to the second land vehicle.
165. The collision detection system of clause 159, wherein the request identifies a region, and wherein the sensing system of the second land vehicle directs a detection signal toward the identified region in response to the request.
166. The collision detection system of clause 165, wherein the detection signal is configured to be detected by a receiver of a sensing system of the first land vehicle.
167. The collision detection system of clause 159, wherein the request identifies an object, and wherein the sensing system of the second land vehicle directs a detection signal toward the identified object in response to the request.
168. The collision detection system of clause 167, wherein the detection signal is configured to be detected by a receiver of a sensing system of the first land vehicle.
169. The collision detection system of clause 159, further comprising a communication module configured to provide sensor data acquired by use of a sensing system of the first land vehicle to another land vehicle.
170. collision detecting system according to clause 159, wherein configuring the sensing system of second land vehicle Including forming more base sensors, which includes at least part of the sensing system of first land vehicle With at least part of the sensing system of second land vehicle.
171. collision detecting system according to clause 170, wherein the more bases formed by the Coordination module Ground sensor includes one or more detection signal projectors, and one or more of detection signal projectors include described first The transmitter of land vehicle.
172. collision detecting system according to clause 170, wherein the more bases formed by the Coordination module Ground sensor includes one or more receivers, and one or more of receivers include the reception of first land vehicle Device.
173. collision detecting system according to clause 170, wherein the Coordination module is configured to configure described The sensing system of one land vehicle is to receive by the detection signal of the sensing system transmitting of second land vehicle.
174. collision detecting system according to clause 170, wherein the Coordination module is configured to configure described The sensing system of one land vehicle is to emit the sensing signal for being configured to be detected by the receiver of second land vehicle.
175. collision detecting system according to clause 174, wherein the processing module is configured in response to by institute It states the sensing signal that the sensing system of the first land vehicle is emitted and uses sense by using second land vehicle The sensing data that examining system obtains generates the collision detection model.
176. collision detecting system according to clause 170, wherein the Coordination module is configured to more bases The sensing signal of ground sensor turns to the region judged in advance.
177. collision detecting system according to clause 170, wherein the Coordination module is configured to beam forming institute State one or more detection signals of more base sensor emissions.
178. collision detecting system according to clause 177, wherein the Coordination module is configured to along advance judgement Direction guide one or more of detection signals.
179. collision detecting system according to clause 177, wherein the Coordination module is configured to change by described The phase of the detection signal of a transmitting in the sensing system of the sensing system of first land vehicle and second land vehicle Position.
180. collision detecting system according to clause 177, wherein the Coordination module is configured to change by described The width of the detection signal of a transmitting in the sensing system of the sensing system of first land vehicle and second land vehicle Value.
181. collision detecting system according to clause 170, wherein the Coordination module is configured to form more bases Radar, the multistatic radar include at least part and described second in the radar sensing system of first land vehicle At least part in the radar sensing system of land vehicle.
182. collision detecting system according to clause 170, wherein the Coordination module is configured to be formed comprising more The multistatic radar of a radar receiver, the multiple radar receiver include the radar receiver of first land vehicle.
183. collision detecting system according to clause 170, wherein the Coordination module is configured to be formed comprising more The multistatic radar of a radar transmitter, the multiple radar transmitter include the transmitter of the first land vehicle.
184. collision detecting system according to clause 170, wherein the Coordination module be configured to be formed it is bistatic Radar, the bistatic radar include that the radar transmitter of first land vehicle and the radar of second land vehicle connect Receive device.
185. collision detecting system according to clause 170, wherein the Coordination module be configured to be formed it is bistatic Radar, the bistatic radar include the radar receiver of first land vehicle and the radar hair of second land vehicle Emitter.
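By way of non-limiting illustration of the bistatic pairs of clauses 184-185, a target's range from the receiving vehicle can be recovered from the echo delay, the inter-vehicle baseline, and the angle of arrival at the receiver using the standard bistatic range relation. The sketch below assumes the two vehicles share a common time base (cf. the time synchronization signal of clause 278); the function and parameter names are illustrative only.

```python
from math import cos

C = 299_792_458.0  # speed of light, m/s


def bistatic_receiver_range(delay_s, baseline_m, rx_angle_rad):
    """Range from the receiving vehicle to the target for a bistatic pair.

    delay_s      -- propagation delay of the echo (transmit -> target -> receive)
    baseline_m   -- distance between the transmitting and receiving vehicles
    rx_angle_rad -- angle at the receiver between the target and the transmitter
    """
    r_sum = C * delay_s  # R_tx + R_rx, the bistatic range sum
    return (r_sum ** 2 - baseline_m ** 2) / (
        2.0 * (r_sum - baseline_m * cos(rx_angle_rad)))
```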
186. The collision detection system of clause 170, wherein the coordination module is configured to form a phased array, the phased array comprising at least a portion of the sensing system of the first land vehicle and at least a portion of the sensing system of the second land vehicle.
187. The collision detection system of clause 186, wherein the coordination module is configured to steer the phased array toward a predetermined direction.
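As a non-limiting illustration of the phase control recited in clauses 177-180 and the steering of clauses 186-187, a uniform linear array can be steered by applying a per-element phase shift proportional to element position. A minimal sketch, assuming idealized, co-calibrated elements; the names are illustrative only.

```python
import math


def steering_phases(num_elements, spacing_m, wavelength_m, steer_angle_rad):
    """Per-element phase shifts (radians) that steer a uniform linear array
    of emitters toward steer_angle_rad measured from broadside."""
    k = 2.0 * math.pi / wavelength_m  # wavenumber
    return [-k * n * spacing_m * math.sin(steer_angle_rad)
            for n in range(num_elements)]
```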
188. The collision detection system of clause 170, further comprising a communication module to acquire auxiliary data pertaining to the second land vehicle, and wherein the coordination module is configured to use the acquired auxiliary data to form the multistatic sensor.
189. The collision detection system of clause 188, wherein the acquired auxiliary data comprises one of an acceleration, a velocity, a position, and an orientation of the second land vehicle.
190. The collision detection system of clause 188, wherein the coordination module is configured to request auxiliary data from the second land vehicle.
191. The collision detection system of clause 159, wherein the sensing system of the second land vehicle comprises one of an electro-optical sensor, a three-dimensional sensor, a LIDAR, an acoustic sensor, an ultrasonic sensor, an imaging sensor, a radar, an electromagnetic sensor, a magnetic sensor, and a capacitance sensor.
192. The collision detection system of clause 159, wherein the sensing system of the first land vehicle comprises one of an electro-optical sensor, a three-dimensional sensor, a LIDAR, an acoustic sensor, an ultrasonic sensor, an imaging sensor, a radar, an electromagnetic sensor, a magnetic sensor, and a capacitance sensor.
193. The collision detection system of clause 159, wherein the coordination module is configured to generate the request in response to determining that motion information pertaining to an object fails to satisfy a threshold.
194. The collision detection system of clause 193, wherein determining that the motion information pertaining to the object fails to satisfy the threshold comprises determining that a signal-to-noise ratio of sensor data pertaining to the object fails to satisfy a threshold.
195. The collision detection system of clause 193, wherein determining that the motion information pertaining to the object fails to satisfy the threshold comprises determining that an orientation of the sensing system of the first land vehicle precludes determining one or more motion characteristics of the object.
196. The collision detection system of clause 159, wherein the coordination module is configured to generate the request in response to determining that an object is outside of a detection envelope of the sensing system of the first land vehicle.
197. The collision detection system of clause 159, wherein the coordination module is configured to generate the request in response to determining that an object is within a detection envelope of the sensing system of the second land vehicle.
198. The collision detection system of clause 196, wherein determining that the object is outside of the detection range of the sensing system comprises determining that the object is obscured by another object.
199. The collision detection system of clause 196, wherein determining that the object is outside of the detection range of the sensing system comprises determining that a dead-reckoned position of the object is outside of the detection range of the sensing system of the first land vehicle.
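A non-limiting sketch of the triggering logic of clauses 193-199: the coordination request may be generated when local sensing is too noisy, when the object's dead-reckoned position falls outside the local detection range, or when the object is obscured. The thresholds, parameter names, and data layout below are assumptions for illustration only.

```python
import math


def should_request_remote_sensing(snr_db, snr_threshold_db,
                                  dr_position_xy, detection_range_m,
                                  occluded):
    """Decide whether to request that another vehicle task its sensing system.

    Triggers when local data fail an SNR threshold (clause 194), when the
    object's dead-reckoned position is beyond local detection range
    (clause 199), or when the object is obscured by another object (clause 198).
    """
    out_of_range = math.hypot(*dr_position_xy) > detection_range_m
    return snr_db < snr_threshold_db or out_of_range or occluded
```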
200. The collision detection system of clause 159, wherein the request comprises a payment offer.
201. The collision detection system of clause 159, wherein the request comprises a bid.
202. The collision detection system of clause 159, wherein the request comprises an offer of access to the collision detection model.
203. The collision detection system of clause 159, wherein the request comprises an offer of access to sensor data acquired by use of the sensing system of the first land vehicle.
204. The collision detection system of clause 159, wherein the request comprises an offer of access to stored monitoring data.
205. The collision detection system of clause 159, further comprising a communication module configured to transmit at least a portion of the collision detection model to the second land vehicle.
206. The collision detection system of clause 159, wherein the collision detection model is generated at least in part at the second land vehicle.
207. The collision detection system of clause 159, wherein the sensing system of the second land vehicle is configured to acquire sensor data pertaining to an object.
208. The collision detection system of clause 207, wherein the object is one of a vehicle, a pedestrian, a roadblock, and an animal.
209. The collision detection system of clause 207, wherein the collision detection model comprises an orientation of the object.
210. The collision detection system of clause 207, wherein the collision detection model comprises a size of the object.
211. The collision detection system of clause 207, wherein the collision detection model comprises a position of the object.
212. The collision detection system of clause 207, wherein the collision detection model comprises a velocity of the object.
213. The collision detection system of clause 207, wherein the collision detection model comprises an acceleration of the object.
214. The collision detection system of clause 207, wherein the collision detection model comprises an identity of the object.
215. The collision detection system of clause 159, wherein the processing module is configured to translate sensor data acquired by use of the sensing system of the second land vehicle into a frame of reference of the first land vehicle.
216. The collision detection system of clause 159, wherein the processing module is configured to translate the collision detection model into an absolute frame of reference.
217. The collision detection system of clause 159, wherein the processing module is configured to translate the collision detection model into a frame of reference of the first land vehicle.
218. The collision detection system of clause 159, wherein the collision detection model comprises a position of an object relative to the first land vehicle.
219. The collision detection system of clause 159, wherein the collision detection model comprises an orientation of an object relative to the first land vehicle.
220. The collision detection system of clause 159, wherein the collision detection model comprises a velocity of an object relative to the first land vehicle.
221. The collision detection system of clause 159, wherein the collision detection model comprises an acceleration of an object relative to the first land vehicle.
222. The collision detection system of clause 159, wherein the processing module is configured to translate sensor data acquired by use of the sensing system of the first land vehicle into a frame of reference of the second land vehicle.
223. The collision detection system of clause 159, wherein the processing module is configured to translate the collision detection model into a frame of reference of the second land vehicle.
224. The collision detection system of clause 159, wherein the collision detection model comprises a position of an object relative to the second land vehicle.
225. The collision detection system of clause 159, wherein the collision detection model comprises an orientation of an object relative to the second land vehicle.
226. The collision detection system of clause 159, wherein the collision detection model comprises a velocity of an object relative to the second land vehicle.
227. The collision detection system of clause 159, wherein the collision detection model comprises an acceleration of an object relative to the second land vehicle.
228. The collision detection system of clause 159, wherein the conversion module is configured to translate sensor data acquired by use of the sensing system of the second land vehicle into a frame of reference of a third land vehicle.
229. The collision detection system of clause 159, wherein the conversion module is configured to translate the collision detection model into a frame of reference of the third land vehicle.
230. The collision detection system of clause 159, wherein the collision detection model comprises a position of an object relative to a third land vehicle.
231. The collision detection system of clause 159, wherein the collision detection model comprises an orientation of an object relative to a third land vehicle.
232. The collision detection system of clause 159, wherein the collision detection model comprises a velocity of an object relative to a third land vehicle.
233. The collision detection system of clause 159, wherein the collision detection model comprises an acceleration of an object relative to a third land vehicle.
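As a non-limiting illustration of the frame-of-reference translation recited in clauses 215-233, a planar rigid transform suffices when the pose of the second vehicle is known in the first vehicle's frame (for example, from the auxiliary data of clauses 270-278). A minimal two-dimensional sketch; names are illustrative only.

```python
import math


def to_first_vehicle_frame(obj_xy_in_second, second_pose_in_first):
    """Translate an object position from the second land vehicle's frame of
    reference into the first land vehicle's frame (planar approximation).

    second_pose_in_first -- (x, y, heading_rad) of the second vehicle,
                            expressed in the first vehicle's frame.
    """
    x2, y2, heading = second_pose_in_first
    ox, oy = obj_xy_in_second
    # Rotate by the second vehicle's heading, then translate by its position.
    xa = x2 + ox * math.cos(heading) - oy * math.sin(heading)
    ya = y2 + ox * math.sin(heading) + oy * math.cos(heading)
    return xa, ya
```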
234. The collision detection system of clause 159, wherein the processing module is configured to detect a potential collision between objects in the collision detection model.
235. The collision detection system of clause 159, wherein the processing module is configured to detect a potential collision between an object and the first land vehicle by use of the collision detection model.
236. The collision detection system of clause 159, wherein the processing module is configured to detect a potential collision between an object and the second land vehicle by use of the collision detection model.
237. The collision detection system of clause 159, wherein the processing module is configured to detect a potential collision between an object and a third land vehicle by use of the collision detection model.
238. The collision detection system of clause 234, wherein the processing module is configured to determine a time of occurrence of the potential collision.
239. The collision detection system of clause 234, wherein the processing module is configured to determine a position of the potential collision.
240. The collision detection system of clause 234, wherein the processing module is configured to determine an impact velocity of the potential collision.
241. The collision detection system of clause 234, wherein the processing module is configured to estimate one or more impact forces of the potential collision.
242. The collision detection system of clause 234, wherein the processing module is configured to estimate post-collision kinematics of one or more objects of the potential collision.
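A non-limiting sketch of one way to screen for the potential collisions of clauses 234-240 from the positions and velocities in the collision detection model: a constant-velocity closest-approach test yields a time, a miss distance, and a closing speed. The collision-radius abstraction and the returned fields are assumptions for illustration, not the patented method.

```python
def predict_potential_collision(p1, v1, p2, v2, combined_radius_m):
    """Constant-velocity screen for a potential collision between two tracked
    objects: time of closest approach, miss distance, and closing speed."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    dvx, dvy = v2[0] - v1[0], v2[1] - v1[1]
    rel_speed_sq = dvx * dvx + dvy * dvy
    if rel_speed_sq == 0.0:
        return None  # no relative motion, so no closing geometry
    # Time at which the separation is minimized (clamped to the future).
    t_star = max(0.0, -(dx * dvx + dy * dvy) / rel_speed_sq)
    mx, my = dx + dvx * t_star, dy + dvy * t_star
    miss = (mx * mx + my * my) ** 0.5
    if miss > combined_radius_m:
        return None
    return {"time_s": t_star,
            "miss_distance_m": miss,
            "impact_speed_mps": rel_speed_sq ** 0.5}
```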
243. The collision detection system of clause 234, wherein the processing module is configured to generate an alert in response to detecting the potential collision.
244. The collision detection system of clause 234, wherein the coordination module is configured to provide the alert to an object predicted to be involved in the potential collision.
245. The collision detection system of clause 234, further comprising a collision avoidance system of the first land vehicle that is activated in response to detecting the potential collision.
246. The collision detection system of clause 234, further comprising a vehicle interface module to activate a collision warning system of the first land vehicle in response to detecting the potential collision.
247. The collision detection system of clause 246, wherein the vehicle interface is configured to activate an audible warning system of the first land vehicle.
248. The collision detection system of clause 246, wherein the vehicle interface is configured to activate an electro-optical emitter of the first land vehicle in response to detecting the potential collision.
249. The collision detection system of clause 246, wherein the vehicle interface is configured to display an alert in response to detecting the potential collision.
250. The collision detection system of clause 246, wherein the vehicle interface is configured to generate an audible alert in response to detecting the potential collision.
251. The collision detection system of clause 246, wherein the vehicle interface is configured to display a visual indication of the potential collision on a user display of the first land vehicle in response to detecting the potential collision.
252. The collision detection system of clause 246, wherein the vehicle interface is configured to display a visual indication of the potential collision on a head-up display of the first land vehicle in response to detecting the potential collision.
253. The collision detection system of clause 252, wherein the vehicle interface is configured to identify an object of the potential collision within the head-up display of the first land vehicle.
254. The collision detection system of clause 234, wherein the processing module is configured to generate a collision avoidance instruction based on the collision detection model in response to detecting the potential collision.
255. The collision detection system of clause 254, wherein the collision avoidance instruction comprises one of an instruction to decelerate, an instruction to accelerate, and an instruction to steer.
256. The collision detection system of clause 254, further comprising a vehicle interface module to generate an alert comprising the collision avoidance instruction.
257. The collision detection system of clause 234, wherein the processing module is configured to predict a result of the potential collision by use of the collision detection model and to generate a collision avoidance instruction by use of the predicted result.
258. The collision detection system of clause 234, further comprising a communication module to transmit an alert to the second land vehicle in response to detecting the potential collision.
259. The collision detection system of clause 258, wherein the alert comprises a position of an object of the potential collision relative to the second land vehicle.
260. The collision detection system of clause 258, wherein the alert comprises a velocity of an object of the potential collision relative to the second land vehicle.
261. The collision detection system of clause 258, wherein the processing module is configured to generate a collision avoidance instruction for the second land vehicle in response to detecting the potential collision.
262. The collision detection system of clause 234, wherein the processing module is configured to predict a result of the potential collision by use of the collision detection model and to generate a collision avoidance instruction for the second land vehicle by use of the predicted result.
263. The collision detection system of clause 234, further comprising broadcasting an alert in response to detecting the potential collision.
264. The collision detection system of clause 234, further comprising a communication module to transmit an alert to a third land vehicle in response to detecting the potential collision.
265. The collision detection system of clause 264, wherein the third land vehicle is involved in the potential collision.
266. The collision detection system of clause 264, wherein the alert comprises a position of an object of the potential collision relative to the third land vehicle.
267. The collision detection system of clause 264, wherein the alert comprises a velocity of an object of the potential collision relative to the third land vehicle.
268. The collision detection system of clause 234, wherein the processing module is configured to predict a result of the potential collision by use of the collision detection model and to generate a collision avoidance instruction for the third land vehicle by use of the predicted result.
269. The collision detection system of clause 159, wherein the coordination module is configured to request auxiliary data from the second land vehicle.
270. The collision detection system of clause 159, further comprising a communication module to acquire auxiliary data pertaining to the second land vehicle.
271. The collision detection system of clause 270, wherein the processing module is configured to use the acquired auxiliary data to translate the collision detection model into a frame of reference of the second land vehicle.
272. The collision detection system of clause 270, wherein the processing module is configured to use the acquired auxiliary data to translate sensor data acquired by use of the sensing system of the second land vehicle into a frame of reference of the first land vehicle.
273. The collision detection system of clause 270, wherein the processing module is configured to use the acquired auxiliary data to translate sensor data received at the sensing system of the first land vehicle in response to a detection signal emitted by the sensing system of the second land vehicle.
274. The collision detection system of clause 270, wherein the auxiliary data comprises one of an acceleration, a velocity, a position, and an orientation of the second land vehicle.
275. The collision detection system of clause 270, wherein the auxiliary data comprises information pertaining to a capability of the sensing system of the second land vehicle.
276. The collision detection system of clause 270, wherein the auxiliary data comprises global positioning system coordinates.
277. The collision detection system of clause 270, wherein the auxiliary data comprises a speedometer measurement.
278. The collision detection system of clause 270, wherein the auxiliary data comprises a time synchronization signal.
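Clause 278 lists a time synchronization signal among the auxiliary data. One conventional way (not specified by the clauses) to derive a clock offset between two vehicles from such an exchange is the symmetric round-trip estimate shown below; the variable names are illustrative only.

```python
def clock_offset(t1, t2, t3, t4):
    """NTP-style clock-offset estimate from a time-synchronization exchange.

    t1 -- local send time of the request
    t2 -- remote receive time of the request
    t3 -- remote send time of the reply
    t4 -- local receive time of the reply
    Returns the estimated offset of the remote clock relative to the local clock.
    """
    return ((t2 - t1) + (t3 - t4)) / 2.0
```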
279. The collision detection system of clause 270, further comprising a communication module to receive auxiliary data pertaining to the second land vehicle from a third land vehicle.
280. The collision detection system of clause 159, further comprising a communication module to provide at least a portion of the collision detection model to another land vehicle.
281. The collision detection system of clause 159, further comprising a communication module to receive at least a portion of a second collision detection model from another land vehicle.
282. The collision detection system of clause 281, wherein the processing module is configured to combine the generated collision detection model with the second collision detection model.
283. The collision detection system of clause 281, wherein the processing module is configured to generate the collision detection model by use of the second collision detection model.
284. The collision detection system of clause 159, wherein the coordination module is configured to receive a request to access sensor data acquired by use of the sensing system of the second land vehicle and the sensing system of the first land vehicle.
285. The collision detection system of clause 284, wherein the request to access the sensor data comprises a payment offer, and wherein the coordination module is configured to provide access to the requested sensor data in response to the payment offer satisfying a predetermined payment threshold.
286. The collision detection system of clause 284, wherein the request to access the sensor data comprises a bid, and wherein the coordination module is configured to provide access to the requested sensor data in response to the bid satisfying a predetermined bid threshold.
287. The collision detection system of clause 284, wherein the request to access the sensor data comprises an offer to provide access to collision detection data of a designated vehicle.
288. The collision detection system of clause 287, wherein the coordination module is configured to provide access to the requested sensor data in response to determining that the collision detection data of the designated vehicle is usable to generate the collision detection model.
289. The collision detection system of clause 159, wherein the coordination module is configured to receive a request to access the collision detection model.
290. The collision detection system of clause 289, wherein the request to access the collision detection model comprises a payment offer, and wherein the coordination module is configured to provide access to the collision detection model in response to the payment offer satisfying a predetermined payment threshold.
291. The collision detection system of clause 289, wherein the request to access the collision detection model comprises a bid, and wherein the coordination module is configured to provide access to the collision detection model in response to the bid satisfying a predetermined bid threshold.
292. The collision detection system of clause 289, wherein the request to access the collision detection model comprises an offer to provide access to collision detection data of a designated vehicle.
293. The collision detection system of clause 292, wherein the coordination module is configured to provide access to the collision detection model in response to determining that the collision detection data of the designated vehicle is usable to generate the collision detection model.
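A non-limiting sketch of the access-grant logic of clauses 284-293: a request may carry a payment offer, a bid, or an offer of reciprocal collision detection data, and access is granted when the offer satisfies a predetermined threshold or the offered data are usable for the model. The thresholds, request layout, and helper are assumptions for illustration only.

```python
PAYMENT_THRESHOLD = 0.50  # hypothetical predetermined thresholds
BID_THRESHOLD = 0.25


def offered_data_is_usable(offered_data):
    """Placeholder check that offered collision detection data can contribute
    to generating the collision detection model."""
    return bool(offered_data)


def grant_access(request):
    """Evaluate a request to access sensor data or the collision detection model.

    request -- dict such as {"payment": 1.0}, {"bid": 0.4}, or
               {"offered_data": <collision detection data of a designated vehicle>}
    """
    if request.get("payment", 0.0) >= PAYMENT_THRESHOLD:   # clauses 285, 290
        return True
    if request.get("bid", 0.0) >= BID_THRESHOLD:           # clauses 286, 291
        return True
    return offered_data_is_usable(request.get("offered_data"))  # 287-288, 292-293
```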
294. The collision detection system of clause 159, further comprising a communication module configured to establish a communication link between the first land vehicle and the second land vehicle.
295. The collision detection system of clause 294, wherein the communication link comprises one of a peer-to-peer network, an ad hoc network, an infrastructure network, a wireless network, a cellular data network, and an electro-optical network.
296. The collision detection system of clause 294, wherein the communication module is configured to establish the communication link in response to detecting one of the first land vehicle and the second land vehicle within communication range.
297. The collision detection system of clause 294, wherein the communication module is configured to establish the communication link in response to detecting that the first land vehicle is proximate to the second land vehicle.
298. The collision detection system of clause 297, wherein the communication module is configured to broadcast a communication discovery signal to detect that the first land vehicle is proximate to the second land vehicle.
299. The collision detection system of clause 297, wherein the communication module is configured to determine a location of one of the first land vehicle and the second land vehicle.
300. The collision detection system of clause 299, further comprising registering the location of one of the first land vehicle and the second land vehicle.
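As a non-limiting illustration of the discovery broadcast of clauses 298-300, a vehicle could announce its identity and location over a local wireless segment and let peers within range respond to establish a peer-to-peer or ad hoc link (clause 295). The transport, port, and message format below are assumptions, not part of the disclosure.

```python
import socket

DISCOVERY_PORT = 50_000  # hypothetical port and message format


def broadcast_discovery(vehicle_id, location):
    """Broadcast a communication discovery signal announcing this vehicle's
    identity and location; peers in radio range may reply to set up a link."""
    msg = f"DISCOVER {vehicle_id} {location[0]:.6f} {location[1]:.6f}".encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(msg, ("255.255.255.255", DISCOVERY_PORT))
```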
301. The collision detection system of clause 159, further comprising a communication module configured to transmit, via a network, monitoring data comprising at least a portion of the collision detection model.
302. The collision detection system of clause 301, further comprising transmitting the monitoring data to one of a traffic control system, a network-accessible service station, an insurance company, and a law enforcement agency.
303. The collision detection system of clause 301, wherein the communication module is configured to secure the monitoring data transmitted via the network.
304. The collision detection system of clause 301, wherein the communication module is configured to sign the monitoring data.
305. The collision detection system of clause 301, wherein the communication module is configured to encrypt the monitoring data.
306. The collision detection system of clause 301, wherein the communication module is configured to apply a timestamp to the monitoring data.
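A non-limiting sketch of how monitoring data might be timestamped, tagged with a location identifier, and signed before transmission or storage (clauses 304-307); the record layout and the use of an HMAC are assumptions for illustration, and the secret key must be provisioned out of band.

```python
import hashlib
import hmac
import json
import time


def package_monitoring_data(model_fragment, secret_key, location_id):
    """Timestamp, tag, and sign a monitoring record containing at least a
    portion of the collision detection model (model_fragment must be
    JSON-serializable; secret_key is bytes)."""
    record = {
        "model": model_fragment,
        "timestamp": time.time(),
        "location_id": location_id,
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(secret_key, payload, hashlib.sha256).hexdigest()
    return record
```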
307. The collision detection system of clause 301, wherein the monitoring data comprises a location identifier.
308. The collision detection system of clause 159, further comprising a storage module to store monitoring data comprising at least a portion of the collision detection model on a persistent storage medium.
309. The collision detection system of clause 308, wherein the storage module is configured to secure the stored monitoring data.
310. A machine-readable storage medium comprising instructions configured to cause a collision detection system to perform a method, the method comprising:
generating, at a first land vehicle, a request to configure a sensing system of a second land vehicle; and
generating a collision detection model using sensor data acquired by use of the sensing system of the second land vehicle.
311. The machine-readable storage medium of clause 310, the method further comprising transmitting the request to the second land vehicle.
312. The machine-readable storage medium of clause 310, the method further comprising receiving, in response to the request, sensor data acquired by use of the sensing system of the second land vehicle.
313. The machine-readable storage medium of clause 310, the method further comprising:
acquiring sensor data by use of a sensing system of the first land vehicle; and
generating the collision detection model using the sensor data acquired by use of the sensing system of the second land vehicle and the sensor data acquired by use of the sensing system of the first land vehicle.
314. The machine-readable storage medium of clause 310, the method further comprising:
selecting the second land vehicle based at least in part on one or more of: a position of the second land vehicle, an orientation of the second land vehicle, a sensing capability of the second land vehicle, a position of the second land vehicle relative to a specified region, a position of the second land vehicle relative to a specified object, an orientation of the second land vehicle relative to the specified region, and an orientation of the second land vehicle relative to the specified object.
315. The machine-readable storage medium of clause 310, wherein the sensor data acquired by use of the sensing system of the second land vehicle comprises at least one of a range, a velocity, an angle, an angle-dependent range, and an angular extent of an object relative to the second land vehicle.
316. The machine-readable storage medium of clause 310, wherein the request identifies a region, and wherein the sensing system of the second land vehicle directs a detection signal to the identified region in response to the request.
317. The machine-readable storage medium of clause 316, wherein the detection signal is configured to be detected by a receiver of the sensing system of the first land vehicle.
318. The machine-readable storage medium of clause 310, wherein the request identifies an object, and wherein the sensing system of the second land vehicle directs a detection signal to the identified object in response to the request.
319. The machine-readable storage medium of clause 318, wherein the detection signal is configured to be detected by a receiver of the sensing system of the first land vehicle.
320. The machine-readable storage medium of clause 310, the method further comprising providing sensor data acquired by use of the sensing system of the first land vehicle to another land vehicle.
321. The machine-readable storage medium of clause 310, the method further comprising forming a multistatic sensor, the multistatic sensor comprising at least a portion of the sensing system of the first land vehicle and at least a portion of the sensing system of the second land vehicle.
322. The machine-readable storage medium of clause 321, wherein the multistatic sensor comprises one or more detection signal emitters, the one or more detection signal emitters including a detection signal emitter of the first land vehicle.
323. The machine-readable storage medium of clause 321, wherein the multistatic sensor comprises one or more receivers, the one or more receivers including a receiver of the first land vehicle.
324. The machine-readable storage medium of clause 321, wherein forming the multistatic sensor comprises configuring the sensing system of the first land vehicle to receive a detection signal emitted by the sensing system of the second land vehicle.
325. The machine-readable storage medium of clause 321, wherein forming the multistatic sensor comprises configuring the sensing system of the first land vehicle to emit a sensing signal configured to be detected by a receiver of the second land vehicle.
326. The machine-readable storage medium of clause 325, the method further comprising generating the collision detection model using sensor data acquired by use of the sensing system of the second land vehicle in response to the sensing signal emitted by the sensing system of the first land vehicle.
327. The machine-readable storage medium of clause 321, wherein forming the multistatic sensor comprises steering a sensing signal toward a predetermined region.
328. The machine-readable storage medium of clause 321, wherein forming the multistatic sensor comprises beamforming the sensing system of the first land vehicle and the sensing system of the second land vehicle.
329. The machine-readable storage medium of clause 328, wherein beamforming comprises directing a detection signal of the multistatic radar along a predetermined direction.
330. The machine-readable storage medium of clause 328, wherein beamforming comprises modifying a phase of a detection signal emitted by one of the sensing system of the first land vehicle and the sensing system of the second land vehicle.
331. The machine-readable storage medium of clause 328, wherein beamforming comprises modifying an amplitude of a detection signal emitted by one of the sensing system of the first land vehicle and the sensing system of the second land vehicle.
332. The machine-readable storage medium of clause 321, wherein forming the multistatic sensor comprises forming a multistatic radar, the multistatic radar comprising at least a portion of a radar sensing system of the first land vehicle and at least a portion of a radar sensing system of the second land vehicle.
333. The machine-readable storage medium of clause 321, wherein forming the multistatic sensor comprises forming a multistatic radar comprising a plurality of radar receivers, the plurality of radar receivers including a radar receiver of the first land vehicle.
334. The machine-readable storage medium of clause 321, wherein forming the multistatic sensor comprises forming a multistatic radar comprising a plurality of radar transmitters, the plurality of radar transmitters including a radar transmitter of the first land vehicle.
335. The machine-readable storage medium of clause 321, wherein forming the multistatic sensor comprises forming a bistatic radar, the bistatic radar comprising a radar transmitter of the first land vehicle and a radar receiver of the second land vehicle.
336. The machine-readable storage medium of clause 321, wherein forming the multistatic sensor comprises forming a bistatic radar, the bistatic radar comprising a radar receiver of the first land vehicle and a radar transmitter of the second land vehicle.
337. The machine-readable storage medium of clause 321, wherein forming the multistatic sensor comprises forming a phased array, the phased array comprising at least a portion of the sensing system of the first land vehicle and at least a portion of the sensing system of the second land vehicle.
338. The machine-readable storage medium of clause 337, the method further comprising steering the phased array toward a predetermined direction.
339. The machine-readable storage medium of clause 321, the method further comprising:
acquiring auxiliary data pertaining to the second land vehicle; and
forming the multistatic sensor using the acquired auxiliary data.
340. The machine-readable storage medium of clause 339, wherein the acquired auxiliary data comprises one of an acceleration, a velocity, a position, and an orientation of the second land vehicle.
341. The machine-readable storage medium of clause 339, the method further comprising requesting auxiliary data from the second land vehicle.
342. The machine-readable storage medium of clause 339, wherein the sensing system of the second land vehicle comprises one of an electro-optical sensor, a three-dimensional sensor, a LIDAR, an acoustic sensor, an ultrasonic sensor, an imaging sensor, a radar, an electromagnetic sensor, a magnetic sensor, and a capacitance sensor.
343. The machine-readable storage medium of clause 310, wherein the sensing system of the first land vehicle comprises one of an electro-optical sensor, a three-dimensional sensor, a LIDAR, an acoustic sensor, an ultrasonic sensor, an imaging sensor, a radar, an electromagnetic sensor, a magnetic sensor, and a capacitance sensor.
344. The machine-readable storage medium of clause 310, the method further comprising generating the request to configure the sensing system of the second land vehicle in response to determining that motion information pertaining to an object fails to satisfy a threshold.
345. The machine-readable storage medium of clause 344, the method further comprising determining that a signal-to-noise ratio of sensor data pertaining to the object fails to satisfy the threshold.
346. The machine-readable storage medium of clause 344, the method further comprising determining that an orientation of the sensing system of the first land vehicle precludes determining one or more motion characteristics of the object.
347. The machine-readable storage medium of clause 310, the method further comprising generating the request in response to determining that an object is outside of a detection envelope of the sensing system of the first land vehicle.
348. The machine-readable storage medium of clause 310, the method further comprising generating the request in response to determining that an object is within a detection envelope of the sensing system of the second land vehicle.
349. The machine-readable storage medium of clause 347, the method further comprising determining that the object is obscured by another object.
350. The machine-readable storage medium of clause 347, the method further comprising determining that a dead-reckoned position of the object is outside of a detection range of the sensing system of the first land vehicle.
351. The machine-readable storage medium of clause 310, wherein the request comprises a payment offer.
352. The machine-readable storage medium of clause 310, wherein the request comprises a bid.
353. The machine-readable storage medium of clause 310, wherein the request comprises an offer of access to the collision detection model.
354. The machine-readable storage medium of clause 310, wherein the request comprises an offer of access to sensor data acquired by use of a sensing system of the first land vehicle.
355. The machine-readable storage medium of clause 310, wherein the request comprises an offer of access to stored monitoring data.
356. The machine-readable storage medium of clause 310, wherein the collision detection model is generated at least in part at the first land vehicle.
357. The machine-readable storage medium of clause 310, the method further comprising transmitting at least a portion of the collision detection model to the second land vehicle.
358. The machine-readable storage medium of clause 310, wherein the collision detection model is generated at least in part at the second land vehicle.
359. The machine-readable storage medium of clause 310, wherein the sensing system of the second land vehicle is configured to acquire sensor data pertaining to an object.
360. The machine-readable storage medium of clause 359, wherein the object is one of a vehicle, a pedestrian, a roadblock, and an animal.
361. The machine-readable storage medium of clause 359, wherein the collision detection model comprises an orientation of the object.
362. The machine-readable storage medium of clause 359, wherein the collision detection model comprises a size of the object.
363. The machine-readable storage medium of clause 359, wherein the collision detection model comprises a position of the object.
364. The machine-readable storage medium of clause 359, wherein the collision detection model comprises a velocity of the object.
365. The machine-readable storage medium of clause 359, wherein the collision detection model comprises an acceleration of the object.
366. The machine-readable storage medium of clause 359, wherein the collision detection model comprises an identity of the object.
367. The machine-readable storage medium of clause 310, the method further comprising translating sensor data acquired by use of the sensing system of the second land vehicle into a frame of reference of the first land vehicle.
368. The machine-readable storage medium of clause 310, the method further comprising translating the collision detection model into an absolute frame of reference.
369. The machine-readable storage medium of clause 310, the method further comprising translating the collision detection model into a frame of reference of the first land vehicle.
370. The machine-readable storage medium of clause 310, wherein the collision detection model comprises a position of an object relative to the first land vehicle.
371. The machine-readable storage medium of clause 310, wherein the collision detection model comprises an orientation of an object relative to the first land vehicle.
372. The machine-readable storage medium of clause 310, wherein the collision detection model comprises a velocity of an object relative to the first land vehicle.
373. The machine-readable storage medium of clause 310, wherein the collision detection model comprises an acceleration of an object relative to the first land vehicle.
374. The machine-readable storage medium of clause 310, the method further comprising translating sensor data acquired by use of a sensing system of the first land vehicle into a frame of reference of the second land vehicle.
375. The machine-readable storage medium of clause 310, the method further comprising translating the collision detection model into a frame of reference of the second land vehicle.
376. The machine-readable storage medium of clause 310, wherein the collision detection model comprises a position of an object relative to the second land vehicle.
377. The machine-readable storage medium of clause 310, wherein the collision detection model comprises an orientation of an object relative to the second land vehicle.
378. The machine-readable storage medium of clause 310, wherein the collision detection model comprises a velocity of an object relative to the second land vehicle.
379. The machine-readable storage medium of clause 310, wherein the collision detection model comprises an acceleration of an object relative to the second land vehicle.
380. The machine-readable storage medium of clause 310, the method further comprising translating sensor data acquired by use of the sensing system of the second land vehicle into a frame of reference of a third land vehicle.
381. The machine-readable storage medium of clause 310, the method further comprising translating the collision detection model into a frame of reference of the third land vehicle.
382. The machine-readable storage medium of clause 310, wherein the collision detection model comprises a position of an object relative to a third land vehicle.
383. The machine-readable storage medium of clause 310, wherein the collision detection model comprises an orientation of an object relative to a third land vehicle.
384. The machine-readable storage medium of clause 310, wherein the collision detection model comprises a velocity of an object relative to a third land vehicle.
385. The machine-readable storage medium of clause 310, wherein the collision detection model comprises an acceleration of an object relative to a third land vehicle.
386. The machine-readable storage medium of clause 310, the method further comprising detecting a potential collision between objects in the collision detection model.
387. The machine-readable storage medium of clause 310, the method further comprising detecting a potential collision between an object and the first land vehicle by use of the collision detection model.
388. The machine-readable storage medium of clause 310, the method further comprising detecting a potential collision between an object and the second land vehicle by use of the collision detection model.
389. The machine-readable storage medium of clause 310, the method further comprising detecting a potential collision between an object and a third land vehicle by use of the collision detection model.
390. The machine-readable storage medium of clause 389, the method further comprising determining a time of occurrence of the potential collision.
391. The machine-readable storage medium of clause 389, the method further comprising determining a position of the potential collision.
392. The machine-readable storage medium of clause 389, the method further comprising determining an impact velocity of the potential collision.
393. The machine-readable storage medium of clause 389, the method further comprising estimating one or more impact forces of the potential collision.
394. The machine-readable storage medium of clause 389, the method further comprising estimating post-collision kinematics of one or more objects of the potential collision.
395. The machine-readable storage medium of clause 389, the method further comprising generating an alert in response to detecting the potential collision.
396. The machine-readable storage medium of clause 389, the method further comprising providing the alert to an object predicted to be involved in the potential collision.
397. The machine-readable storage medium of clause 389, the method further comprising activating a collision avoidance system of the first land vehicle in response to detecting the potential collision.
398. The machine-readable storage medium of clause 389, the method further comprising activating a collision warning system of the first land vehicle in response to detecting the potential collision.
399. The machine-readable storage medium of clause 398, wherein activating the collision warning system comprises activating an audible warning system of the first land vehicle.
400. The machine-readable storage medium of clause 398, wherein activating the collision warning system comprises activating an electro-optical emitter of the first land vehicle.
401. The machine-readable storage medium of clause 389, the method further comprising displaying an alert in response to detecting the potential collision.
402. The machine-readable storage medium of clause 389, the method further comprising generating an audible alert in the first land vehicle in response to detecting the potential collision.
403. The machine-readable storage medium of clause 389, the method further comprising displaying a visual indication of the potential collision on a user display of the first land vehicle.
404. The machine-readable storage medium of clause 389, the method further comprising displaying a visual indication of the potential collision on a head-up display of the first land vehicle.
405. The machine-readable storage medium of clause 404, the method further comprising identifying an object of the potential collision within the head-up display of the first land vehicle.
406. The machine-readable storage medium of clause 389, the method further comprising generating a collision avoidance instruction based on the collision detection model in response to detecting the potential collision.
407. The machine-readable storage medium of clause 406, wherein the collision avoidance instruction comprises one of an instruction to decelerate, an instruction to accelerate, and an instruction to steer.
408. The machine-readable storage medium of clause 406, the method further comprising generating an alert comprising the collision avoidance instruction.
409. The machine-readable storage medium of clause 389, the method further comprising:
predicting a result of the potential collision by use of the collision detection model; and
generating a collision avoidance instruction by use of the predicted result.
410. The machine-readable storage medium of clause 389, the method further comprising transmitting an alert to the second land vehicle in response to detecting the potential collision.
411. The machine-readable storage medium of clause 410, wherein the alert comprises a position of an object of the potential collision relative to the second land vehicle.
412. The machine-readable storage medium of clause 410, wherein the alert comprises a velocity of an object of the potential collision relative to the second land vehicle.
413. The machine-readable storage medium of clause 410, the method further comprising generating a collision avoidance instruction for the second land vehicle.
414. The machine-readable storage medium of clause 410, the method further comprising:
predicting a result of the potential collision using the collision detection model; and
generating a collision avoidance instruction for the second land vehicle by use of the predicted result.
415. The machine-readable storage medium of clause 389, the method further comprising transmitting an alert to a third land vehicle in response to detecting the potential collision.
416. The machine-readable storage medium of clause 415, wherein transmitting the alert comprises broadcasting the alert.
417. The machine-readable storage medium of clause 415, wherein the third land vehicle is involved in the potential collision.
418. The machine-readable storage medium of clause 415, wherein the alert comprises a position of an object of the potential collision relative to the third land vehicle.
419. The machine-readable storage medium of clause 415, wherein the alert comprises a velocity of an object of the potential collision relative to the third land vehicle.
420. The machine-readable storage medium of clause 415, the method further comprising:
predicting a result of the potential collision using the collision detection model; and
generating an avoidance instruction for the third land vehicle based on the predicted result.
421. The machine-readable storage medium of clause 420, the method further comprising generating a collision avoidance instruction for the third land vehicle based on the predicted result.
422. The machine-readable storage medium of clause 310, the method further comprising requesting auxiliary data from the second land vehicle.
423. The machine-readable storage medium of clause 310, the method further comprising receiving auxiliary data from the second land vehicle.
424. The machine-readable storage medium of clause 422, the method further comprising using the auxiliary data to translate the collision detection model into a frame of reference of the second land vehicle.
425. The machine-readable storage medium of clause 422, the method further comprising using the auxiliary data to translate sensor data acquired by use of the sensing system of the second land vehicle into a frame of reference of the first land vehicle.
426. The machine-readable storage medium of clause 422, the method further comprising using the auxiliary data to translate sensor data received at the sensing system of the first land vehicle in response to a detection signal emitted by the sensing system of the second land vehicle.
427. The machine-readable storage medium of clause 422, wherein the auxiliary data comprises one of an acceleration, a velocity, a position, and an orientation of the second land vehicle.
428. The machine-readable storage medium of clause 422, wherein the auxiliary data comprises information pertaining to a capability of the sensing system of the second land vehicle.
429. The machine-readable storage medium of clause 422, wherein the auxiliary data comprises global positioning system coordinates.
430. The machine-readable storage medium of clause 422, wherein the auxiliary data comprises a speedometer measurement.
431. The machine-readable storage medium of clause 422, wherein the auxiliary data comprises a time synchronization signal.
432. The machine-readable storage medium of clause 422, the method further comprising receiving auxiliary data from a third land vehicle.
433. The machine-readable storage medium of clause 310, the method further comprising:
generating the collision detection model on the first land vehicle; and
providing at least a portion of the collision detection model to another land vehicle.
434. The machine-readable storage medium of clause 310, the method further comprising receiving at least a portion of a second collision detection model from another land vehicle.
435. The machine-readable storage medium of clause 434, the method further comprising combining the collision detection model and the second collision detection model.
436. The machine-readable storage medium of clause 434, the method further comprising refining the collision detection model using the second collision detection model.
437. The machine-readable storage medium of clause 310, the method further comprising receiving a request to access sensor data acquired by use of the sensing system of the second land vehicle and a sensing system of the first land vehicle.
438. The machine-readable storage medium of clause 437, wherein the request to access the sensor data comprises a payment offer, the method further comprising providing access to the requested sensor data in response to the payment offer satisfying a predetermined payment threshold.
439. The machine-readable storage medium of clause 437, wherein the request to access the sensor data comprises a bid, the method further comprising providing access to the requested sensor data in response to the bid satisfying a predetermined bid threshold.
440. The machine-readable storage medium of clause 437, wherein the request to access the sensor data comprises an offer to provide access to collision detection data of a designated vehicle.
441. The machine-readable storage medium of clause 440, the method further comprising:
providing access to the requested sensor data in response to determining that the collision detection data of the designated vehicle is usable to generate the collision detection model.
442. The machine-readable storage medium of clause 310, the method further comprising:
requesting access to a collision detection model generated by a designated vehicle; and
generating the collision detection model using the collision detection model of the designated vehicle.
443. The machine-readable storage medium of clause 442, wherein requesting the collision detection model comprises offering a payment in exchange for access to the requested collision detection model.
444. The machine-readable storage medium of clause 442, wherein requesting the collision detection model comprises bidding for access to the requested collision detection model.
445. The machine-readable storage medium of clause 442, wherein requesting the collision detection model comprises offering access to sensor data in exchange for access to the requested collision detection model.
446. The machine-readable storage medium of clause 310, the method further comprising receiving a request to access the collision detection model.
447. The machine-readable storage medium of clause 446, wherein the request to access the collision detection model comprises a payment, the method further comprising providing access to the collision detection model in response to the payment satisfying a predetermined payment threshold.
448. The machine-readable storage medium of clause 446, wherein the request to access the collision detection model comprises a bid, the method further comprising providing access to the collision detection model in response to the bid satisfying a predetermined bid threshold.
449. The machine-readable storage medium of clause 446, wherein the request to access the collision detection model comprises an offer to provide access to collision detection data of a designated vehicle.
450. The machine-readable storage medium of clause 449, the method further comprising:
providing access to the collision detection model in response to determining that the collision detection data of the designated vehicle is usable to generate the collision detection model.
451. The machine-readable storage medium of clause 310, the method further comprising establishing a communication link between the first land vehicle and the second land vehicle.
452. The machine-readable storage medium of clause 451, wherein the communication link comprises one of a peer-to-peer network, an ad hoc network, an infrastructure network, a wireless network, a cellular data network, and an electro-optical network.
453. The machine-readable storage medium of clause 451, the method further comprising establishing the communication link in response to detecting one of the first land vehicle and the second land vehicle within communication range.
454. The machine-readable storage medium of clause 451, the method further comprising establishing the communication link in response to detecting that the first land vehicle is proximate to the second land vehicle.
455. The machine-readable storage medium of clause 454, wherein detecting that the first land vehicle is proximate to the second land vehicle comprises one of broadcasting a communication discovery signal and detecting a communication discovery signal broadcast.
456. The machine-readable storage medium of clause 451, wherein detecting that the first land vehicle is proximate to the second land vehicle comprises determining a location of one of the first land vehicle and the second land vehicle.
457. The machine-readable storage medium of clause 456, the method further comprising registering the location of one of the first land vehicle and the second land vehicle.
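Clauses 451-457 cover establishing the vehicle-to-vehicle communication link when the vehicles are detected within range or proximate to one another, including by way of registered locations. A minimal illustrative sketch, assuming a hypothetical location registry keyed by vehicle identifier:

import math

# Hypothetical registry of last-known vehicle locations (cf. clause 457), keyed by vehicle id.
location_registry = {
    "vehicle_A": (47.6205, -122.3493),   # (latitude, longitude)
    "vehicle_B": (47.6212, -122.3480),
}

def haversine_m(p1, p2):
    """Great-circle distance in meters between two (lat, lon) points."""
    R = 6371000.0
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def should_establish_link(first_id, second_id, comm_range_m=300.0):
    """Establish a link when registered locations place the vehicles within range (cf. clauses 453, 456)."""
    d = haversine_m(location_registry[first_id], location_registry[second_id])
    return d <= comm_range_m

print(should_establish_link("vehicle_A", "vehicle_B"))  # True: the vehicles are roughly 125 m apart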
458. The machine-readable storage medium of clause 310, the method further comprising transmitting, via a network, monitoring data comprising at least a portion of the collision detection model.
459. The machine-readable storage medium of clause 458, the method further comprising transmitting the monitoring data to one of a traffic control system, a network-accessible service, an insurance company, and a law enforcement authority.
460. The machine-readable storage medium of clause 458, the method further comprising securing the monitoring data transmitted via the network.
461. The machine-readable storage medium of clause 460, wherein securing the monitoring data comprises signing the monitoring data.
462. The machine-readable storage medium of clause 460, wherein securing the monitoring data comprises encrypting the monitoring data.
463. The machine-readable storage medium of clause 458, the method further comprising applying a timestamp to the monitoring data.
464. The machine-readable storage medium of clause 458, the method further comprising including a location identifier in the monitoring data.
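Clauses 458-464 concern securing transmitted monitoring data by signing or encryption and tagging it with a timestamp and a location identifier. The sketch below shows one possible packaging of such a record using an HMAC signature; the shared-key scheme and field names are illustrative assumptions, and a deployed system might instead rely on asymmetric signatures.

import hashlib
import hmac
import json
import time

# Hypothetical shared signing key, for illustration only.
SIGNING_KEY = b"example-signing-key"

def package_monitoring_data(model_summary: dict, location_id: str) -> dict:
    """Timestamp (cf. clause 463), tag with a location identifier (cf. clause 464),
    and sign (cf. clause 461) monitoring data derived from the collision detection model."""
    record = {
        "model": model_summary,
        "location_id": location_id,
        "timestamp": time.time(),
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify_monitoring_data(record: dict) -> bool:
    """Recompute the signature over the signed fields to check integrity."""
    unsigned = {k: v for k, v in record.items() if k != "signature"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["signature"])

packaged = package_monitoring_data({"objects_tracked": 3}, location_id="I-90/exit-2")
assert verify_monitoring_data(packaged)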
465. The machine-readable storage medium of clause 310, the method further comprising storing monitoring data comprising at least a portion of the collision detection model on a persistent storage medium.
466. The machine-readable storage medium of clause 465, the method further comprising securing the stored monitoring data.
467. The machine-readable storage medium of clause 465, the method further comprising transmitting the stored monitoring data on a network.
468. The machine-readable storage medium of clause 310, wherein the machine-readable storage medium is non-transitory.

Claims (38)

1. A collision detection method, comprising:
generating, at a first land vehicle, a request to configure a sensing system of a second land vehicle to acquire sensor data at the second land vehicle by use of one or more sensors of the sensing system of the second land vehicle;
generating a collision detection model at the first land vehicle using the sensor data acquired by use of the one or more sensors of the sensing system of the second land vehicle; and
forming a multi-static sensor comprising at least a portion of a sensing system of the first land vehicle and at least a portion of the sensing system of the second land vehicle;
wherein forming the multi-static sensor comprises configuring the sensing system of the first land vehicle to emit a sensing signal configured to be detected by a receiver of the second land vehicle.
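To make the flow of claim 1 concrete, the sketch below models the configuration request sent from the first land vehicle and a minimal handler on the second land vehicle; the message fields, sensor stubs, and function names are hypothetical illustrations rather than the claimed implementation.

from dataclasses import dataclass, field
from typing import Optional, List

@dataclass
class SensingConfigRequest:
    """Hypothetical V2V message by which a first vehicle asks a second vehicle
    to configure its sensing system and return sensor data (cf. claim 1)."""
    requester_id: str
    target_region: Optional[tuple] = None     # (x, y, radius) of interest, if any
    emit_for_receiver: bool = True            # also emit a signal the requester can receive
    sensor_types: List[str] = field(default_factory=lambda: ["radar"])

def handle_request(req: SensingConfigRequest, local_sensors: dict) -> dict:
    """Second vehicle's side: read out the matching sensors and return their data."""
    readings = {name: sensor() for name, sensor in local_sensors.items()
                if name in req.sensor_types}
    return {"responder_sensors": list(readings), "data": readings}

# Example with a stubbed radar that reports one detection at 42 m.
response = handle_request(
    SensingConfigRequest(requester_id="vehicle_A", target_region=(80.0, 10.0, 25.0)),
    local_sensors={"radar": lambda: [{"range_m": 42.0, "bearing_deg": 5.0}]},
)
print(response["data"]["radar"][0]["range_m"])  # 42.0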
2. The method of claim 1, wherein the multi-static sensor comprises one or more detection signal emitters, the one or more detection signal emitters including a detection signal emitter of the first land vehicle.
3. The method of claim 1, wherein the multi-static sensor comprises one or more receivers, the one or more receivers including a receiver of the first land vehicle.
4. The method of claim 1, wherein forming the multi-static sensor comprises configuring the sensing system of the first land vehicle to receive a detection signal emitted by the sensing system of the second land vehicle.
5. The method of claim 1, further comprising generating the collision detection model at the first land vehicle using sensor data acquired by use of the sensing system of the second land vehicle in response to a sensing signal emitted by the sensing system of the first land vehicle.
6. The method of claim 1, wherein forming the multi-static sensor comprises steering a sensing signal toward a predetermined region.
7. The method of claim 1, wherein forming the multi-static sensor comprises configuring the sensing system of the first land vehicle and the sensing system of the second land vehicle by beamforming.
8. The method of claim 7, wherein beamforming comprises directing a multi-static radar detection signal in a predetermined direction.
9. The method of claim 7, wherein beamforming comprises modifying a phase of a detection signal emitted by one of the sensing system of the first land vehicle and the sensing system of the second land vehicle.
10. The method of claim 7, wherein beamforming comprises modifying an amplitude of a detection signal emitted by one of the sensing system of the first land vehicle and the sensing system of the second land vehicle.
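Claims 7-10 recite beamforming by adjusting the phase or amplitude of the detection signals emitted by the two vehicles. A minimal geometric sketch of the phase adjustment, assuming ideal, phase-synchronized emitters whose positions are known in a common road-plane frame (the positions, wavelength, and steering angle below are illustrative):

import numpy as np

def steering_phases(element_positions_m, direction_deg, wavelength_m):
    """Per-element phase shifts (radians) that align emissions from a distributed
    array -- e.g., emitters on two cooperating vehicles -- toward direction_deg
    (measured from the x-axis in the road plane)."""
    k = 2 * np.pi / wavelength_m                       # wavenumber
    theta = np.deg2rad(direction_deg)
    u = np.array([np.cos(theta), np.sin(theta)])       # unit vector toward the target region
    positions = np.asarray(element_positions_m, dtype=float)
    # Negate the projected path length so all contributions arrive in phase along u.
    return -k * positions.dot(u)

# Example: one emitter per vehicle, 3.9 mm wavelength (roughly 77 GHz automotive radar),
# steering 30 degrees off the baseline between the two vehicles.
phases = steering_phases([[0.0, 0.0], [4.0, 0.0]], direction_deg=30.0, wavelength_m=0.0039)
print(np.rad2deg(phases) % 360.0)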
11. The method of claim 1, wherein forming the multi-static sensor comprises forming a multi-static radar comprising at least a portion of a radar sensing system of the first land vehicle and at least a portion of a radar sensing system of the second land vehicle.
12. The method of claim 1, wherein forming the multi-static sensor comprises forming a multi-static radar comprising a plurality of radar receivers, the plurality of radar receivers including a radar receiver of the first land vehicle.
13. The method of claim 1, wherein forming the multi-static sensor comprises forming a multi-static radar comprising a plurality of radar transmitters, the plurality of radar transmitters including a radar transmitter of the first land vehicle.
14. The method of claim 1, wherein forming the multi-static sensor comprises forming a bistatic radar comprising a radar transmitter of the first land vehicle and a radar receiver of the second land vehicle.
15. The method of claim 1, wherein forming the multi-static sensor comprises forming a bistatic radar comprising a radar receiver of the first land vehicle and a radar transmitter of the second land vehicle.
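A bistatic pair of the kind recited in claims 14 and 15 can recover a target position from the total transmitter-target-receiver path length together with the arrival direction at the receiver. The following sketch solves the underlying geometry directly; it illustrates bistatic ranging in general and is not the algorithm of the specification.

import numpy as np

C = 299_792_458.0  # speed of light, m/s

def bistatic_target_position(tx_pos, rx_pos, total_delay_s, arrival_unit_vec):
    """Locate a target from a bistatic measurement: the total path length S = c*tau
    equals |target - tx| + |target - rx|, and the target lies along arrival_unit_vec
    from the receiver. Solving |d + R*u| = S - R for R gives
    R = (S**2 - |d|**2) / (2 * (S + d.u)), with d = rx_pos - tx_pos."""
    tx, rx = np.asarray(tx_pos, float), np.asarray(rx_pos, float)
    u = np.asarray(arrival_unit_vec, float)
    u = u / np.linalg.norm(u)
    d = rx - tx
    S = C * total_delay_s
    R = (S**2 - d.dot(d)) / (2.0 * (S + d.dot(u)))
    return rx + R * u

# Example: transmitter on the first vehicle at the origin, receiver on the second
# vehicle 20 m ahead, target actually at (50, 30).
target_true = np.array([50.0, 30.0])
tx, rx = np.array([0.0, 0.0]), np.array([20.0, 0.0])
tau = (np.linalg.norm(target_true - tx) + np.linalg.norm(target_true - rx)) / C
u = (target_true - rx) / np.linalg.norm(target_true - rx)
print(bistatic_target_position(tx, rx, tau, u))  # approximately [50. 30.]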
16. The method of claim 1, wherein forming the multi-static sensor comprises forming a phased array comprising at least a portion of the sensing system of the first land vehicle and at least a portion of the sensing system of the second land vehicle.
17. The method of claim 16, further comprising steering the phased array toward a predetermined direction.
18. The method of claim 1, further comprising:
acquiring, at the first land vehicle, auxiliary data pertaining to the second land vehicle; and
using the acquired auxiliary data at the first land vehicle to form the multi-static sensor.
19. The method of claim 18, wherein the acquired auxiliary data comprises one of an acceleration, a velocity, a position, and an orientation of the second land vehicle.
20. The method of claim 18, further comprising requesting the auxiliary data from the second land vehicle.
21. A collision detection system, comprising:
a coordination module of a first land vehicle configured to generate a request to configure a sensing system of a second land vehicle to acquire sensor data at the second land vehicle by use of one or more sensors of the sensing system of the second land vehicle, and to form a multi-static sensor comprising at least a portion of a sensing system of the first land vehicle and at least a portion of the sensing system of the second land vehicle, wherein forming the multi-static sensor comprises configuring the sensing system of the first land vehicle to emit a sensing signal configured to be detected by a receiver of the second land vehicle; and
a processing module configured to generate a collision detection model at the first land vehicle using the sensor data acquired by use of the one or more sensors of the sensing system of the second land vehicle.
22. The collision detection system of claim 21, further comprising a communication module configured to transmit the request to the second land vehicle.
23. The collision detection system of claim 21, further comprising a communication module configured to receive the sensor data acquired by use of the sensing system of the second land vehicle, wherein the processing module is configured to generate the collision detection model using the sensor data acquired by use of the sensing system of the second land vehicle and sensor data acquired by use of the sensing system of the first land vehicle.
24. The collision detection system of claim 21, wherein the coordination module is configured to select the second land vehicle based at least in part on at least one of: a position of the second land vehicle, an orientation of the second land vehicle, a sensing capability of the second land vehicle, a position of the second land vehicle relative to a specified region, a position of the second land vehicle relative to a specified object, an orientation of the second land vehicle relative to the specified region, and an orientation of the second land vehicle relative to the specified object.
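Claim 24 lists the criteria the coordination module may use to select the second land vehicle. One hypothetical way to combine such criteria is a simple coverage score over candidate vehicles, as sketched below; the score weighting and advertised fields are assumptions for illustration only.

import math
from dataclasses import dataclass

@dataclass
class Candidate:
    """Hypothetical summary of a nearby vehicle advertised over V2V."""
    vehicle_id: str
    position: tuple          # (x, y) in meters, local frame
    heading_deg: float       # orientation of its sensor boresight
    sensor_range_m: float    # advertised sensing capability

def coverage_score(c: Candidate, region_center: tuple) -> float:
    """Score how well a candidate can cover a specified region: closer is better,
    and the region should lie near the candidate's sensor boresight."""
    dx, dy = region_center[0] - c.position[0], region_center[1] - c.position[1]
    dist = math.hypot(dx, dy)
    if dist > c.sensor_range_m:
        return 0.0
    bearing = math.degrees(math.atan2(dy, dx))
    off_boresight = abs((bearing - c.heading_deg + 180) % 360 - 180)
    return (1.0 - dist / c.sensor_range_m) * max(0.0, 1.0 - off_boresight / 90.0)

def select_second_vehicle(candidates, region_center):
    return max(candidates, key=lambda c: coverage_score(c, region_center))

candidates = [Candidate("B", (10, 0), 0.0, 150.0), Candidate("C", (-40, 5), 180.0, 150.0)]
print(select_second_vehicle(candidates, region_center=(80.0, 10.0)).vehicle_id)  # "B"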
25. The collision detection system of claim 21, wherein the request identifies a region, and wherein the sensing system of the second land vehicle directs a detection signal toward the identified region in response to the request.
26. The collision detection system of claim 25, wherein the detection signal is configured to be detected by a receiver of the sensing system of the first land vehicle.
27. The collision detection system of claim 21, wherein the request identifies an object, and wherein the sensing system of the second land vehicle directs a detection signal toward the identified object in response to the request.
28. The collision detection system of claim 27, wherein the detection signal is configured to be detected by a receiver of the sensing system of the first land vehicle.
29. The collision detection system of claim 21, wherein the coordination module is configured to generate the request in response to determining that movement information pertaining to an object fails to satisfy a threshold.
30. The collision detection system of claim 29, wherein determining that the movement information pertaining to the object fails to satisfy the threshold comprises determining that a signal-to-noise ratio of sensor data pertaining to the object fails to satisfy the threshold.
31. The collision detection system of claim 29, wherein determining that the movement information pertaining to the object fails to satisfy the threshold comprises determining that an orientation of the sensing system of the first land vehicle prevents determination of one or more movement characteristics of the object.
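Claims 29-31 trigger the coordination request when movement information about an object is unreliable, for example because the track's signal-to-noise ratio falls below a threshold or the object lies outside the first vehicle's sensor coverage. A hypothetical trigger check (the threshold and field-of-view values are illustrative):

import math

def snr_db(signal_power: float, noise_power: float) -> float:
    """Signal-to-noise ratio in decibels."""
    return 10.0 * math.log10(signal_power / noise_power)

def should_request_remote_sensing(track_snr_db: float,
                                  object_bearing_deg: float,
                                  sensor_fov_deg: tuple = (-60.0, 60.0),
                                  snr_threshold_db: float = 10.0) -> bool:
    """Generate a coordination request when movement information about an object is
    unreliable: the track's SNR is below threshold (cf. claim 30) or the object lies
    outside the first vehicle's sensor field of view (cf. claim 31)."""
    outside_fov = not (sensor_fov_deg[0] <= object_bearing_deg <= sensor_fov_deg[1])
    return track_snr_db < snr_threshold_db or outside_fov

# A weak track (about 8 dB) triggers a request; so does an object 75 degrees off boresight.
print(should_request_remote_sensing(snr_db(6.3, 1.0), object_bearing_deg=20.0))   # True
print(should_request_remote_sensing(15.0, object_bearing_deg=75.0))               # True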
32. A machine-readable storage medium comprising instructions configured to cause a collision detection system to perform a method, the method comprising:
generating, at a first land vehicle, a request to configure a sensing system of a second land vehicle to acquire sensor data at the second land vehicle by use of one or more sensors of the sensing system of the second land vehicle;
generating a collision detection model at the first land vehicle using the sensor data acquired by use of the one or more sensors of the sensing system of the second land vehicle; and
forming a multi-static sensor comprising at least a portion of a sensing system of the first land vehicle and at least a portion of the sensing system of the second land vehicle;
wherein forming the multi-static sensor comprises configuring the sensing system of the first land vehicle to emit a sensing signal configured to be detected by a receiver of the second land vehicle.
33. The machine-readable storage medium of claim 32, wherein the multi-static sensor comprises one or more detection signal emitters, the one or more detection signal emitters including a detection signal emitter of the first land vehicle.
34. The machine-readable storage medium of claim 32, the method further comprising:
acquiring, at the first land vehicle, auxiliary data pertaining to the second land vehicle; and
using the acquired auxiliary data at the first land vehicle to form the multi-static sensor.
35. The machine-readable storage medium of claim 32, the method further comprising generating the request in response to determining that an object is within a detection envelope of the sensing system of the second land vehicle.
36. The machine-readable storage medium of claim 32, wherein the request comprises a payment offer.
37. The machine-readable storage medium of claim 32, wherein the request comprises an offer of access to the collision detection model.
38. The machine-readable storage medium of claim 32, wherein the request comprises an offer of access to sensor data acquired by use of the sensing system of the first land vehicle.
CN201380046869.3A 2012-07-09 2013-07-08 Coordinate the system and method for the sensor operations for collision detection Expired - Fee Related CN104620298B (en)

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
US13/544,770 2012-07-09
US13/544,770 US9165469B2 (en) 2012-07-09 2012-07-09 Systems and methods for coordinating sensor operation for collision detection
US13/544,757 2012-07-09
US13/544,799 2012-07-09
US13/544,757 US9558667B2 (en) 2012-07-09 2012-07-09 Systems and methods for cooperative collision detection
US13/544,799 US9000903B2 (en) 2012-07-09 2012-07-09 Systems and methods for vehicle monitoring
PCT/US2013/049579 WO2014011552A1 (en) 2012-07-09 2013-07-08 Systems and methods for coordinating sensor operation for collision detection

Publications (2)

Publication Number Publication Date
CN104620298A CN104620298A (en) 2015-05-13
CN104620298B true CN104620298B (en) 2018-09-18

Family

ID=49916493

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201380046869.3A Expired - Fee Related CN104620298B (en) 2012-07-09 2013-07-08 Coordinate the system and method for the sensor operations for collision detection

Country Status (3)

Country Link
EP (1) EP2870592A4 (en)
CN (1) CN104620298B (en)
WO (3) WO2014011552A1 (en)

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9505412B2 (en) 2013-08-02 2016-11-29 Honda Motor Co., Ltd. System and method for detection and utilization of driver distraction level
US9786178B1 (en) 2013-08-02 2017-10-10 Honda Motor Co., Ltd. Vehicle pedestrian safety system and methods of use and manufacture thereof
JP6429368B2 (en) 2013-08-02 2018-11-28 本田技研工業株式会社 Inter-vehicle communication system and method
JP6391536B2 (en) * 2015-06-12 2018-09-19 日立建機株式会社 In-vehicle device, vehicle collision prevention method
TWI584238B (en) * 2015-12-16 2017-05-21 Optimization Method of Vehicle Coordinated Object Location and Vehicle Coordinate Location Device
CN105774800B (en) * 2016-03-28 2018-06-26 清华大学 A kind of impact-moderation method and device in hybrid vehicle queue between vehicle
CN108091154A (en) * 2016-11-23 2018-05-29 比亚迪股份有限公司 Information of vehicles treating method and apparatus
US10360797B2 (en) * 2017-01-27 2019-07-23 Qualcomm Incorporated Request-response-based sharing of sensor information
US11214143B2 (en) * 2017-05-02 2022-01-04 Motional Ad Llc Visually obstructed object detection for automated vehicle using V2V/V2I communications
US10796501B2 (en) * 2017-06-19 2020-10-06 Qualcomm Incorporated Interactive sharing of vehicle sensor information
US11300958B2 (en) * 2017-07-13 2022-04-12 Waymo Llc Sensor adjustment based on vehicle motion
CN107464436B (en) * 2017-08-02 2019-12-24 北京邮电大学 Information processing method and device based on vehicle clustering
FR3076045A1 (en) * 2017-12-22 2019-06-28 Orange METHOD FOR MONITORING AN ENVIRONMENT OF A FIRST ELEMENT POSITIONED AT THE LEVEL OF A CIRCULATION PATH, AND ASSOCIATED SYSTEM
JP7151234B2 (en) 2018-07-19 2022-10-12 株式会社デンソー Camera system and event recording system
US10611372B2 (en) * 2018-09-06 2020-04-07 Zebra Technologies Corporation Dual-mode data capture system for collision detection and object dimensioning
US11106209B2 (en) * 2019-02-11 2021-08-31 Toyota Jidosha Kabushiki Kaisha Anomaly mapping by vehicular micro clouds
DE102019207302A1 (en) * 2019-05-20 2020-11-26 Robert Bosch Gmbh Method for operating a sensor device of a vehicle
CN112153567A (en) * 2019-06-28 2020-12-29 大陆泰密克汽车系统(上海)有限公司 Method and vehicle for constructing real-time regional electronic map
TW202102392A (en) * 2019-07-02 2021-01-16 帷享科技股份有限公司 Driving safety enhancing system and method for making or enabling highly accurate judgment and providing advance early warning
JP7132895B2 (en) * 2019-08-27 2022-09-07 本田技研工業株式会社 Communication control device, communication control method, and program
WO2021087942A1 (en) * 2019-11-08 2021-05-14 Qualcomm Incorporated Distributed congestion control for sensor sharing
CN112977410A (en) * 2019-12-02 2021-06-18 奥迪股份公司 Vehicle braking system, auxiliary braking system, vehicle and corresponding method and medium
US11812371B2 (en) 2020-09-28 2023-11-07 Qualcomm Incorporated Adaptive node activation and configuration in cooperative sensing
CN116457694A (en) * 2020-11-13 2023-07-18 谷歌有限责任公司 User equipment collaborative radar sensing
WO2023133045A1 (en) * 2022-01-07 2023-07-13 Qualcomm Incorporated Establishing cooperative radar sensing sessions

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101299301A (en) * 2007-05-04 2008-11-05 通用汽车环球科技运作公司 Slow or stopped vehicle ahead advisor with digital map integration
CN101687509A (en) * 2007-03-08 2010-03-31 丰田自动车株式会社 Vicinity environment estimation device with blind region prediction, road detection and intervehicle communication
CN101751703A (en) * 2008-12-09 2010-06-23 财团法人资讯工业策进会 Vehicle running collision management system and method

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6720920B2 (en) * 1997-10-22 2004-04-13 Intelligent Technologies International Inc. Method and arrangement for communicating between vehicles
US6202023B1 (en) * 1996-08-22 2001-03-13 Go2 Systems, Inc. Internet based geographic location referencing system and method
US6950013B2 (en) * 1998-06-01 2005-09-27 Robert Jeffery Scaman Incident recording secure database
US6988276B2 (en) * 1999-12-14 2006-01-17 Koninklijke Philips Electronics N.V. In-house TV to TV channel peeking
US7236596B2 (en) * 2000-02-07 2007-06-26 Mikos, Ltd. Digital imaging system for evidentiary use
DE60107692T2 (en) * 2000-08-16 2005-12-15 Raytheon Company, Waltham SYSTEM FOR RECORDING NEARBY OBJECTS
WO2003001474A2 (en) * 2001-06-26 2003-01-03 Medius, Inc. Method and apparatus for detecting possible collisions and transferring information between vehicles
US8381982B2 (en) * 2005-12-03 2013-02-26 Sky-Trax, Inc. Method and apparatus for managing and controlling manned and automated utility vehicles
JP4752486B2 (en) * 2005-12-15 2011-08-17 株式会社日立製作所 Imaging device, video signal selection device, driving support device, automobile
JP4345832B2 (en) * 2007-03-12 2009-10-14 トヨタ自動車株式会社 Road condition detection system
US20080320036A1 (en) * 2007-06-22 2008-12-25 Winter Gentle E Automatic data collection
US7812758B2 (en) * 2007-11-27 2010-10-12 Northrop Grumman Space And Mission Systems Corporation Synthetic aperture radar (SAR) imaging system
US8400507B2 (en) * 2008-03-17 2013-03-19 International Business Machines Corporation Scene selection in a vehicle-to-vehicle network
US8468073B2 (en) * 2008-06-30 2013-06-18 The Invention Science Fund I, Llc Facilitating compensation arrangements providing for data tracking components
US8050880B2 (en) * 2008-10-28 2011-11-01 C & P Technologies, Inc. Generation of a constant envelope signal
US20100131300A1 (en) * 2008-11-26 2010-05-27 Fred Collopy Visible insurance
US20100164789A1 (en) * 2008-12-30 2010-07-01 Gm Global Technology Operations, Inc. Measurement Level Integration of GPS and Other Range and Bearing Measurement-Capable Sensors for Ubiquitous Positioning Capability
US7994902B2 (en) * 2009-02-25 2011-08-09 Southwest Research Institute Cooperative sensor-sharing vehicle traffic safety system
US20110106442A1 (en) * 2009-10-30 2011-05-05 Indian Institute Of Technology Bombay Collision avoidance system and method
US20110122026A1 (en) * 2009-11-24 2011-05-26 Delaquil Matthew P Scalable and/or reconfigurable beamformer systems
US8330645B2 (en) * 2010-08-31 2012-12-11 Raytheon Company Radar activation multiple access system and method

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101687509A (en) * 2007-03-08 2010-03-31 丰田自动车株式会社 Vicinity environment estimation device with blind region prediction, road detection and intervehicle communication
CN101299301A (en) * 2007-05-04 2008-11-05 通用汽车环球科技运作公司 Slow or stopped vehicle ahead advisor with digital map integration
CN101751703A (en) * 2008-12-09 2010-06-23 财团法人资讯工业策进会 Vehicle running collision management system and method

Also Published As

Publication number Publication date
WO2014011556A1 (en) 2014-01-16
WO2014011545A1 (en) 2014-01-16
WO2014011552A1 (en) 2014-01-16
EP2870592A4 (en) 2016-05-04
EP2870592A1 (en) 2015-05-13
CN104620298A (en) 2015-05-13

Similar Documents

Publication Publication Date Title
CN104620298B (en) Coordinate the system and method for the sensor operations for collision detection
US9558667B2 (en) Systems and methods for cooperative collision detection
US9000903B2 (en) Systems and methods for vehicle monitoring
US9165469B2 (en) Systems and methods for coordinating sensor operation for collision detection
US10992755B1 (en) Smart vehicle
US9230442B2 (en) Systems and methods for adaptive vehicle sensing systems
US10928826B2 (en) Sensor fusion by operations-control vehicle for commanding and controlling autonomous vehicles
KR102325049B1 (en) Electronic device for transmitting communication signal associated with pedestrian safety and method for operating thereof
US20210108926A1 (en) Smart vehicle
US20190256088A1 (en) Methods and apparatus to generate vehicle warnings
CN108297880A (en) Divert one's attention driver notification system
US20150039218A1 (en) Systems and methods for adaptive vehicle sensing systems
US9269268B2 (en) Systems and methods for adaptive vehicle sensing systems
US20200398743A1 (en) Method and apparatus for learning how to notify pedestrians
US20230288927A1 (en) Vehicular management
US20210089048A1 (en) Smart vehicle
US20160304028A1 (en) Advanced warning and risk evasion system and method
US10836346B2 (en) Methods and systems for providing a protect occupants mode with an autonomous vehicle
JP2020501224A (en) Network and connected devices for emergency response and roadside work
US20140358324A1 (en) System and method for road side equipment of interest selection for active safety applications
JP2017527939A (en) Method and apparatus for monitoring traffic areas
CN108307295A (en) The method and apparatus for avoiding accident for vulnerable road user
JP2017535008A (en) Method and apparatus for forming a road traffic user mobility model
Khan et al. Autonomous vehicles: A study of implementation and security.
JP2022535454A (en) Classification of objects based on radio communication

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
EXSB Decision made by sipo to initiate substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20180918

Termination date: 20200708