WO2019044456A1 - Information processing device, vehicle, and roadside unit - Google Patents

Information processing device, vehicle, and roadside unit

Info

Publication number
WO2019044456A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
vehicle
user
processing device
information processing
Prior art date
Application number
PCT/JP2018/030025
Other languages
French (fr)
Inventor
Kenta Takahira
Original Assignee
Kyocera Corporation
Priority date
Filing date
Publication date
Application filed by Kyocera Corporation filed Critical Kyocera Corporation
Publication of WO2019044456A1 publication Critical patent/WO2019044456A1/en

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/161 Decentralised systems, e.g. inter-vehicle communication
    • G08G1/162 Decentralised systems, e.g. inter-vehicle communication event-triggered
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/005 Traffic control systems for road vehicles including pedestrian guidance indicator
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30 Services specially adapted for particular environments, situations or purposes
    • H04W4/40 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 Services making use of location information
    • H04W4/025 Services making use of location information using location based information parameters
    • H04W4/027 Services making use of location information using location based information parameters using movement velocity, acceleration information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/90 Services for handling of emergency or hazardous situations, e.g. earthquake and tsunami warning systems [ETWS]

Definitions

  • the present technology relates to an information processing device.
  • a pedestrian terminal device for ensuring the safety of a pedestrian has been developed, in which pedestrian information is transmitted to an on-board terminal device in a case where the pedestrian enters a dangerous area set based on person attribute information of the pedestrian.
  • An information processing device, a vehicle, and a roadside unit are disclosed.
  • If an information processing device acquires, from a mobile electronic device, information notifying that a user of the mobile electronic device may behave dangerously in a manner likely to induce a traffic accident, the information processing device checks appropriateness of the information. If the information is inappropriate information, the information processing device discards the information.
  • a vehicle includes the information processing device of the above-mentioned embodiment, and a vehicle controller. If the information processing device determines that the information is appropriate information, the information processing device instructs the vehicle controller to perform vehicle control to avoid contact with the user.
  • a roadside unit includes the information processing device of the above-mentioned embodiment, and a communication unit configured to communicate the information. If the information processing device determines that the information is appropriate information, the roadside unit transmits the information to a vehicle in a vicinity of the roadside unit and to another roadside unit in a vicinity of the roadside unit via the communication unit.
  • FIG. 1 illustrates a diagram showing one example of an information processing system.
  • FIG. 2 illustrates a diagram showing one example of the information processing system.
  • FIG. 3 illustrates a perspective view showing one example of external appearance of an electronic device.
  • FIG. 4 illustrates a back view showing one example of external appearance of the electronic device.
  • FIG. 5 illustrates a block diagram showing one example of a configuration of the electronic device.
  • FIG. 6 illustrates a block diagram showing one example of a configuration of a roadside unit.
  • FIG. 7 illustrates a flowchart showing one example of operations of the electronic device.
  • FIG. 8 illustrates a flowchart showing one example of operations of the electronic device.
  • FIG. 9 illustrates a block diagram showing one example of a configuration of an information processing device installed in a vehicle.
  • FIG. 10 illustrates a diagram schematically showing a state where a pedestrian is about to enter a vehicle.
  • FIG. 11 illustrates a flowchart showing one example of operations of the electronic device.
  • FIG. 12 illustrates a diagram schematically showing a state where a pedestrian is about to enter a vehicle.
  • FIG. 13 illustrates a flowchart showing one example of operations of a controller of a vehicle.
  • FIG. 1 and FIG. 2 each illustrate a diagram showing one example of an information processing system 1.
  • the information processing system 1 is, for example, a system used in intelligent transport systems (ITS).
  • the information processing system 1 is a safe driving support communication system of ITS.
  • the safe driving support communication system may be referred to as a safe driving support system, or as a safe driving support wireless system.
  • the information processing system 1 includes a plurality of types of information processing devices.
  • the plurality of types of information processing devices include a roadside unit 5, and an electronic device installed in a vehicle 6.
  • a roadside unit 5 disposed at an intersection 2 etc., a vehicle 6 such as an automobile that travels in a roadway 7, and an electronic device 10 of a user 9 being a pedestrian can wirelessly communicate with each other.
  • the roadside unit 5, the vehicle 6, and the electronic device 10 can exchange information with each other.
  • a plurality of vehicles 6 can wirelessly communicate with one another. With this, the plurality of vehicles 6 can exchange information with one another.
  • Communication between the roadside unit 5 and the vehicle 6, communication between the vehicles 6, communication between the roadside unit 5 and the electronic device 10 of a pedestrian, and communication between the electronic device 10 of a pedestrian and the vehicle 6 are respectively referred to as roadside-vehicle communication, inter-vehicle communication, roadside-pedestrian communication, and pedestrian-vehicle communication.
  • the electronic device 10, the roadside unit 5, and a server device 8 are connected to a network 900.
  • the network 900 includes, for example, a relay device, an internet, etc.
  • Each of the electronic device 10 and the roadside unit 5 can communicate with the server device 8 via the network 900.
  • the electronic device 10 is, for example, a mobile electronic device such as a smartphone, and may be referred to as a mobile electronic device.
  • the electronic device 10 can specify an operation state of its user 9.
  • the electronic device 10 can inform the roadside unit 5, the vehicle 6, etc. of information about an operation state of a specific user 9 or the like. Operation of the electronic device 10 will be described later in detail.
  • the roadside unit 5 can, for example, inform the vehicle 6 and the electronic device 10 of information about lighting of a traffic light 4, information about road traffic control, etc. Further, the roadside unit 5 can detect a nearby vehicle 6 and a nearby pedestrian. The roadside unit 5 disposed at the intersection 2 can, for example, detect a pedestrian passing across a crosswalk 3. Further, the roadside unit 5 can inform the vehicle 6 and the electronic device 10 of information about the detected vehicle 6 and pedestrian. Further, the roadside unit 5 can inform another vehicle 6 and another electronic device 10 of the information informed of by the vehicle 6 and the electronic device 10.
  • the vehicle 6 can inform another vehicle 6, the roadside unit 5, and the electronic device 10 of information about its own position, speed, turn signals, etc. Further, the vehicle 6 can support safe driving of its driver by issuing various notifications such as warning to the driver based on information informed of by another device.
  • the vehicle 6 can issue various notifications to the driver with the use of a speaker, a display, etc.
  • the vehicle 6 can, for example, issue various notifications to the driver with the use of a car navigation device installed in the vehicle 6. Further, the vehicle 6 can automatically control a steering operation, a braking operation, and an accelerating operation.
  • the server device 8 manages map information including road information, facility information, etc.
  • the server device 8 can transmit map information to the electronic device 10, the roadside unit 5, and the vehicle 6.
  • the electronic device 10 and the vehicle 6 can, for example, display a map or the like based on the received map information.
  • the roadside-vehicle communication, the inter-vehicle communication, the roadside-pedestrian communication, and the pedestrian-vehicle communication are performed, thereby supporting safe driving of the driver of the vehicle 6.
  • an automobile is illustrated as the vehicle 6.
  • the vehicle 6 may be a vehicle other than an automobile.
  • the vehicle 6 may be, for example, a bus or a streetcar.
  • FIG. 3 and FIG. 4 respectively illustrate a perspective view and a back view each showing one example of external appearance of the electronic device 10.
  • the electronic device 10 includes a plate-like device case 11 having a substantially rectangular shape in plan view.
  • the device case 11 forms the exterior of the electronic device 10.
  • a display region 12 for displaying various pieces of information, such as characters, symbols, and figures, is located on a front surface 11a of the device case 11.
  • a touch panel 130 to be described later is located on a back surface side of the display region 12.
  • the user 9 can input various pieces of information to the electronic device 10 by operating the display region 12 on the front surface of the electronic device 10 with a finger etc.
  • the user 9 can also input various pieces of information to the electronic device 10 by operating the display region 12 with a pointer other than a finger, that is, for example, a touch-panel pen such as a stylus pen.
  • a receiver hole 13 is located at an upper end portion of the front surface 11a of the device case 11.
  • a speaker hole 14 is located at a lower end portion of the front surface 11a.
  • a microphone hole 15 is located in a side surface 11c on a lower side of the device case 11.
  • a lens 181 of a first camera 180 to be described later is visually recognizable from an upper end portion of the front surface 11a of the device case 11.
  • a lens 191 of a second camera 190 to be described later is visually recognizable from an upper end portion of a back surface 11b of the device case 11.
  • the electronic device 10 includes an operation button group 220 made up of a plurality of operation buttons 22 (refer to FIG. 5).
  • Each of the plurality of operation buttons 22 is a hardware button.
  • each of the plurality of operation buttons 22 is a push button.
  • at least one operation button 22 included in the operation button group 220 may be a software button displayed in the display region 12.
  • the operation button group 220 includes operation buttons 22a, 22b, and 22c that are located at a lower end portion of the front surface 11a of the device case 11. Further, the operation button group 220 may include a power button and a volume button that are located on a surface of the device case 11.
  • the operation button 22a is, for example, a back button.
  • the back button is an operation button for switching the display of the display region 12 to a previous display.
  • the operation button 22b is, for example, a home button.
  • the home button is an operation button for displaying a home screen in the display region 12.
  • the operation button 22c is, for example, a history button.
  • the history button is an operation button for displaying in the display region 12 the history of applications executed in the electronic device 10. When the user 9 operates the operation button 22c, the history of applications executed in the electronic device 10 is displayed in the display region 12.
  • FIG. 5 illustrates a block diagram mainly showing one example of an electrical configuration of the electronic device 10.
  • the electronic device 10 includes a controller 100, a wireless communication unit 110, a display 120, an operation unit 210, and a satellite signal receiver 140.
  • the electronic device 10 further includes a receiver 150, a speaker 160, a microphone 170, a first camera 180, a second camera 190, an acceleration sensor 200, and a battery 230. These components of the electronic device 10 are accommodated inside the device case 11.
  • the controller 100 can integrally manage the operations of the electronic device 10 by controlling other components of the electronic device 10.
  • the controller 100 includes at least one processor for providing control and processing capability to perform various functions as described in further detail below.
  • the at least one processor may be implemented as a single integrated circuit (IC) or as multiple communicatively coupled ICs and/or discrete circuits. It is appreciated that the at least one processor can be implemented according to various known technologies.
  • the processor includes one or more circuits or units configurable to perform one or more data computing procedures or processes by executing instructions stored in an associated memory, for example.
  • the processor may be implemented as firmware (e.g. discrete logic components) configured to perform one or more data computing procedures or processes.
  • the processor may include one or more processors, controllers, microprocessors, microcontrollers, application specific integrated circuits (ASICs), digital signal processors, programmable logic devices, field programmable gate arrays, or any combination of these devices or structures, or other known devices and structures, to perform the functions described herein.
  • the controller 100 includes a central processing unit (CPU) 101, a digital signal processor (DSP) 102, and a storage 103.
  • the storage 103 includes a non-transitory recording medium readable with the CPU 101 and the DSP 102, such as read only memory (ROM) and random access memory (RAM).
  • the ROM of the storage 103 is, for example, flash ROM (flash memory) being non-volatile memory.
  • the storage 103 can store a plurality of control programs 103a for controlling the electronic device 10, acceleration data 103b, movement determination data 103c, threshold value data 103d, setting data 103z, etc.
  • the CPU 101 and the DSP 102 execute the various control programs 103a in the storage 103, whereby various functions of the controller 100 are implemented.
  • the plurality of control programs 103a in the storage 103 include various applications (application programs).
  • the control programs 103a can provide a function of determining whether the behavior of a user of the subject device is likely to induce a traffic accident based on acceleration being a detection result of the acceleration sensor 200. Specifically, the control programs 103a can measure vibration and motion acting on the subject device based on the direction and magnitude of the acceleration being a detection result of the acceleration sensor 200. The control programs 103a can determine whether the user of the subject device is in a walking state by comparing a measurement result of the measured vibration and motion with the movement determination data 103c. The control programs 103a can start monitoring acceleration if determining that the user is in a walking state.
  • the control programs 103a can determine that the behavior of the user is likely to induce a traffic accident if the monitored acceleration exceeds a predetermined threshold value contained in the threshold value data 103d.
  • the control programs 103a can use, for example, the average walking acceleration of the user as the predetermined threshold value. In this case, the control programs 103a can determine that the behavior of the user is likely to induce a traffic accident if the monitored acceleration exceeds the average walking acceleration of the user. By determining whether the monitored acceleration exceeds the average walking acceleration of the user with a function provided by the control programs 103a, a behavior in which the user suddenly starts to run, for example, can be detected.
  • the control programs 103a can also determine whether the user of the subject device is in a stopping state by comparing the measurement result of the vibration and motion acting on the subject device with the movement determination data 103c.
  • the control programs 103a can start monitoring acceleration if determining that the user is in a stopping state. Further, similarly to the case of determining that the user is in a walking state, the control programs 103a can determine that the behavior of the user is likely to induce a traffic accident if the monitored acceleration exceeds a predetermined threshold value contained in the threshold value data 103d.
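  • The determination described in the preceding bullets might be organized roughly as in the following sketch. The constants, the variation-based walking/stopping classifier, and the assumption that the samples are gravity-removed acceleration magnitudes are illustrative only; the disclosure does not specify any of them.

```python
import math

# Hypothetical values standing in for the movement determination data 103c and the
# threshold value data 103d; the disclosure does not give concrete numbers.
WALK_MIN_VARIATION = 0.5        # m/s^2: magnitude variation typical of walking (assumed)
STOP_MAX_VARIATION = 0.2        # m/s^2: magnitude variation typical of standing still (assumed)
AVG_WALKING_ACCELERATION = 1.5  # m/s^2: pre-measured average walking acceleration of the user


def magnitude(ax, ay, az):
    """Combined vector of the three-axis (x, y, z) acceleration."""
    return math.sqrt(ax * ax + ay * ay + az * az)


def classify_movement(samples):
    """Classify the user as walking or stopping from recent gravity-removed
    acceleration magnitudes (a stand-in for the movement determination data 103c)."""
    variation = max(samples) - min(samples)
    if variation >= WALK_MIN_VARIATION:
        return "walking"
    if variation <= STOP_MAX_VARIATION:
        return "stopping"
    return "unknown"


def behavior_likely_to_induce_accident(samples):
    """True if the monitored acceleration exceeds the user's average walking acceleration."""
    return max(samples) > AVG_WALKING_ACCELERATION
```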
  • If the control programs 103a determine that the behavior of the user of the subject device is likely to induce a traffic accident, the control programs 103a can provide a function to issue an alert notification to the user and a notification about the danger to an entity other than the user.
  • the acceleration data 103b contains a value of the acceleration obtained by the acceleration sensor 200.
  • the acceleration data 103b contains a direction and magnitude of the acceleration obtained by the acceleration sensor 200.
  • the acceleration data 103b may contain all of the measurement results measured by the acceleration sensor 200.
  • the movement determination data 103c contains, for example, information of determination conditions to be used at the time of determining a movement state of the user.
  • the information of the determination conditions may contain a direction and magnitude of the acceleration acting on the subject device, an acceleration pattern made up of time-series changes in a direction and magnitude of the acceleration, or a combined vector in which the accelerations along three axes, an X-axis, a Y-axis, and a Z-axis, are combined.
  • the information of the determination conditions contains at least information for determining whether the user is in a walking state or in a stopping state that is obtained from the detection result of the acceleration sensor 200.
  • the threshold value data 103d contains information of a predetermined threshold value for determining whether the behavior of the user is likely to induce a traffic accident.
  • the threshold value data 103d contains, as information of the predetermined threshold value, for example, information of average walking acceleration of the user of the subject device that is measured in advance.
  • the setting data 103z contains information of various settings about operations of the electronic device 10.
  • the setting data 103z contains information about notification modes of an alert issued when the behavior of the user is determined to be likely to induce a traffic accident.
  • the notification modes may include a pattern using at least one of sound, image, light, vibration, etc.
  • the controller 100 may include a plurality of CPUs 101. Further, the controller 100 may omit the DSP 102, or may include a plurality of DSPs 102. Further, all of the functions of the controller 100 or a part of the functions of the controller 100 may be implemented by a hardware circuit that does not require software for implementing such functions.
  • the storage 103 may include a non-transitory recording medium readable with a computer other than ROM and RAM.
  • the storage 103 may include, for example, a small-sized hard disk drive, a solid state drive (SSD), and the like.
  • the wireless communication unit 110 includes an antenna 111.
  • the wireless communication unit 110 may also be referred to as a wireless communication circuit. With the use of the antenna 111, the wireless communication unit 110 can, for example, wirelessly communicate in a plurality of types of communication methods. The wireless communication of the wireless communication unit 110 is controlled by the controller 100.
  • the wireless communication unit 110 can wirelessly communicate with a base station of a mobile phone system.
  • the wireless communication unit 110 can communicate with an information processing device different from the electronic device 10, such as the server device 8, a mobile phone, and a web server, via the network 900 including the base station.
  • the electronic device 10 can perform data communication, a voice call, a video call, etc. with another mobile phone.
  • the wireless communication unit 110 can wirelessly communicate with the roadside unit 5 and the vehicle 6.
  • the wireless communication unit 110 can wirelessly communicate using a 700 MHz band that is allocated in ITS.
  • the wireless communication unit 110 can communicate in conformity to the wireless standard of IEEE 802.11p used in ITS.
  • the wireless communication unit 110 can wirelessly communicate using a wireless local area network (LAN), such as Wi-Fi.
  • the wireless communication unit 110 can perform near-field wireless communication.
  • the wireless communication unit 110 can wirelessly communicate in conformity to Bluetooth (registered trademark).
  • the wireless communication unit 110 may be able to wirelessly communicate in conformity to at least one of ZigBee (registered trademark) and near field communication (NFC).
  • the wireless communication unit 110 subjects a signal received at the antenna 111 to various types of processing such as amplification processing, and outputs the processed received signal to the controller 100.
  • the controller 100 subjects the input received signal to various types of processing, and acquires information contained in the received signal. Further, the controller 100 outputs a transmission signal containing information to the wireless communication unit 110.
  • the wireless communication unit 110 subjects the input transmission signal to various types of processing such as amplification processing, and wirelessly transmits the processed transmission signal from the antenna 111.
  • the display 120 includes the display region 12 located on the front surface of the electronic device 10, and a display panel 121.
  • the display 120 can display various pieces of information in the display region 12.
  • the display panel 121 is, for example, a liquid crystal display panel or an organic EL panel.
  • the display panel 121 can display various pieces of information, such as characters, symbols, and figures under control of the controller 100.
  • the display panel 121 is opposed to the display region 12 inside the device case 11. Information displayed in the display panel 121 is displayed in the display region 12.
  • the operation unit 210 can accept various operations performed on the electronic device 10 by the user 9.
  • the operation unit 210 includes the touch panel 130 and the operation button group 220.
  • the touch panel 130 can detect operations performed on the display region 12 by a pointer such as a finger. With this, the touch panel 130 can accept operations performed on the display region 12 by the user 9.
  • the touch panel 130 is, for example, a projected capacitive touch panel.
  • the touch panel 130 is, for example, located on the back of the display region 12.
  • the touch panel 130 can input an electrical signal to the controller 100 in accordance with the operation. Based on the electrical signal (output signal) from the touch panel 130, the controller 100 can identify the detail of the operation performed on the display region 12. Then, the controller 100 can perform processing in accordance with the identified detail of the operation.
  • each operation button 22 of the operation button group 220 can output an operation signal to the controller 100, indicating that the operation button 22 has been operated. With this, regarding each operation button 22, the controller 100 can determine whether or not the operation button 22 has been operated. The controller 100 that has received the input operation signal controls other components, whereby a function allocated to the operated operation button 22 is implemented in the electronic device 10.
  • the satellite signal receiver 140 can receive a satellite signal transmitted from a positioning satellite. Then, based on the received satellite signal, the satellite signal receiver 140 can acquire positional information indicating the position of the electronic device 10.
  • the positional information acquired by the satellite signal receiver 140 contains, for example, latitude and longitude indicating the position of the electronic device 10.
  • the controller 100 can initiate operation of the satellite signal receiver 140, and can also stop the operation.
  • the satellite signal receiver 140 may simply be referred to as the “receiver 140.”
  • the receiver 140 is, for example, a GPS receiver, and can receive wireless signals from positioning satellites of the Global Positioning System (GPS). Based on the received wireless signals, the receiver 140 calculates the current position of the electronic device 10 by means of latitude and longitude, for example, and outputs the positional information containing the calculated latitude and longitude to the controller 100. It can also be said that the positional information of the electronic device 10 corresponds to positional information indicating the position of the user 9 possessing the electronic device 10.
  • the receiver 140 may obtain positional information of the electronic device 10 based on signals from positioning satellites of a global navigation satellite system (GNSS) other than GPS.
  • the receiver 140 may obtain positional information of the electronic device 10 based on signals from positioning satellites of the Global Navigation Satellite System (GLONASS), the Indian Regional Navigational Satellite System (IRNSS), COMPASS, Galileo, or the Quasi-Zenith Satellite System (QZSS).
  • the microphone 170 can convert sound input from the outside of the electronic device 10 into an electrical sound signal, and can output the converted signal to the controller 100. Sound from the outside of the electronic device 10 is input to the inside of the electronic device 10 through the microphone hole 15, to be input into the microphone 170.
  • the speaker 160 is, for example, a dynamic speaker.
  • the speaker 160 can convert the electrical sound signal from the controller 100 into sound, and can output the converted sound.
  • the sound output from the speaker 160 is output to the outside through the speaker hole 14.
  • the user 9 can hear the sound output from the speaker hole 14 even at a place far from the electronic device 10.
  • the receiver 150 can output phone-call received sound.
  • the receiver 150 is, for example, a dynamic speaker.
  • the receiver 150 can convert the electrical sound signal from the controller 100 into sound, and can output the converted sound.
  • the sound output from the receiver 150 is output to the outside through the receiver hole 13.
  • the volume of the sound output through the receiver hole 13 is smaller than the volume of the sound output through the speaker hole 14.
  • the user 9 brings his/her ear closer to the receiver hole 13, whereby the user 9 can hear the sound output through the receiver hole 13.
  • a vibration element, such as a piezoelectric vibration element, that vibrates a front surface portion of the device case 11 may be provided. In this case, the sound is conveyed to the user through the vibration of the front surface portion.
  • the first camera 180 includes the lens 181, an image sensor, etc.
  • the second camera 190 includes the lens 191, an image sensor, etc.
  • Each of the first camera 180 and the second camera 190 can capture an image of an object based on control of the controller 100, can generate a still image or a moving image showing the captured object, and can output the captured image to the controller 100.
  • the acceleration sensor 200 can detect acceleration of the electronic device 10.
  • the acceleration sensor 200 is, for example, a three-axis acceleration sensor.
  • the acceleration sensor 200 can detect acceleration of the electronic device 10 in an x-axis direction, a y-axis direction, and a z-axis direction.
  • the x-axis direction, the y-axis direction, and the z-axis direction are, for example, set in a long-side direction, a short-side direction, and a thickness direction of the electronic device 10, respectively.
  • the battery 230 can output power for the electronic device 10.
  • the battery 230 is, for example, a rechargeable battery.
  • the power output from the battery 230 is supplied to various components of the electronic device 10, such as the controller 100 and the wireless communication unit 110.
  • the electronic device 10 may include a sensor other than the acceleration sensor 200.
  • the electronic device 10 may include at least one of an air pressure sensor, a geomagnetic sensor, a temperature sensor, a proximity sensor, an illuminance sensor, and a gyro sensor.
  • FIG. 6 illustrates a block diagram showing one example of a configuration of the roadside unit 5.
  • the roadside unit 5 includes a controller 500, a wireless communication unit 510, a wired communication unit 520, and a camera 530 (imaging unit).
  • the controller 500 can integrally manage the operations of the roadside unit 5 by controlling other components of the roadside unit 5.
  • the controller 500 includes at least one processor for providing control and processing capability to perform various functions as described in further detail below.
  • the above description about the processor of the controller 100 of the electronic device 10 is applicable to the processor of the controller 500 as well.
  • the controller 500 includes a CPU 501, a DSP 502, and a storage 503.
  • the storage 503 includes a non-transitory recording medium readable with the CPU 501 and the DSP 502, such as ROM and RAM.
  • the ROM of the storage 503 is, for example, flash ROM being non-volatile memory.
  • the storage 503 stores a plurality of control programs 503a etc. for controlling the roadside unit 5.
  • the CPU 501 and the DSP 502 execute the various control programs 503a in the storage 503, whereby various functions of the controller 500 are implemented.
  • the storage 503 may include a non-transitory recording medium readable with a computer other than ROM and RAM. At least one control program 503a in the storage 503 may be stored in the storage 503 in advance. Further, at least one control program 503a in the storage 503 may be downloaded by the roadside unit 5 from another device to be stored in the storage 503. Further, all of the functions of the controller 500 or a part of the functions of the controller 500 may be implemented by a hardware circuit that does not require software for implementing such functions.
  • the wireless communication unit 510 includes an antenna 511. With the use of the antenna 511, the wireless communication unit 510 can wirelessly communicate with the vehicle 6 (specifically, an electronic device inside the vehicle 6) and the electronic device 10.
  • the wireless communication unit 510 can, for example, wirelessly communicate using a 700 MHz band that is allocated in ITS.
  • the wireless communication unit 510 can communicate in conformity to the wireless standard of IEEE 802.11p used in ITS.
  • the wireless communication unit 510 subjects a signal received at the antenna 511 to various types of processing such as amplification processing, and outputs the processed received signal to the controller 500.
  • the controller 500 subjects the input received signal to various types of processing, and acquires information contained in the received signal. Further, the controller 500 outputs a transmission signal containing information to the wireless communication unit 510.
  • the wireless communication unit 510 subjects the input transmission signal to various types of processing such as amplification processing, and wirelessly transmits the processed transmission signal from the antenna 511.
  • the wireless communication unit 510 may be able to wirelessly communicate with the vehicle 6 and the electronic device 10 using a wireless local area network (LAN), such as Wi-Fi.
  • the wired communication unit 520 is connected to the network 900 in a wired manner.
  • the wired communication unit 520 can communicate with a device connected to the network 900, such as the server device 8, via the network 900.
  • the wired communication unit 520 can input information received from the network 900 to the controller 500. Further, the wired communication unit 520 can output information received from the controller 500 to the network 900.
  • the camera 530 can capture an image of a place where the roadside unit 5 is installed.
  • the image generated by the camera 530 is input to the controller 500.
  • the camera 530 can capture a moving image and a still image.
  • the controller 500 can, for example, make the wireless communication unit 510 transmit the image received from the camera 530 to the electronic device 10 and the vehicle 6. Further, the controller 500 can analyze the image received from the camera 530 to detect a pedestrian (e.g. the user 9 of the electronic device 10) who is about to enter a vehicle, and the vehicle itself.
  • the controller 500 can transmit information to the vehicle 6 and the electronic device 10 via the wireless communication unit 510.
  • the roadside unit 5 may be able to wirelessly communicate with the network 900. Further, the roadside unit 5 may be able to perform near-field wireless communication. Further, the roadside unit 5 may be able to wirelessly communicate in conformity to Bluetooth.
  • FIG. 7 and FIG. 8 each illustrate a flowchart showing one example of a flow of a process executed by the electronic device 10.
  • the controller 100 executes the control programs 103a stored in the storage 103, whereby the process illustrated in FIG. 7 and FIG. 8 is implemented.
  • FIG. 7 illustrates a flowchart showing one example of a process of, when the user of the electronic device 10 is in a walking state, starting to monitor the acceleration acting on the electronic device 10 and determining whether the monitored acceleration exceeds the average walking acceleration, thereby determining whether the behavior of the user is likely to induce a traffic accident.
  • the controller 100 acquires a detection result of the acceleration sensor 200 (Step S101).
  • In Step S102, the controller 100 determines whether the user of the electronic device 10 is in a walking state.
  • If the user is in a walking state as a result of the determination in Step S102 (if Yes), the controller 100 starts monitoring the acceleration being the detection result of the acceleration sensor 200 (Step S103). On the other hand, if the user is not in a walking state as a result of the determination in Step S102 (if No), the process proceeds to Step S106.
  • the controller 100 determines whether the monitored acceleration exceeds the average walking acceleration of the user of the subject device (Step S104).
  • If the monitored acceleration exceeds the average walking acceleration as a result of the determination in Step S104 (if Yes), the controller 100 issues an alert notification to the user of the subject device and also issues a notification about danger information to an entity other than the user (Step S105). On the other hand, if the monitored acceleration does not exceed the average walking acceleration as a result of the determination in Step S104 (if No), the process proceeds to Step S106.
  • the “another entity” corresponds to at least one of an information processing device 600 installed in the vehicle 6 and the roadside unit 5.
  • the controller 100 transmits, to the information processing device 600 installed in the vehicle 6 and to the roadside unit 5, danger information indicating that there is a pedestrian behaving in a manner likely to induce a traffic accident, through pedestrian-vehicle communication and roadside-pedestrian communication via the wireless communication unit 110.
  • In Step S106, the controller 100 determines whether to continue the process. If the process is to be continued as a result of the determination in Step S106 (if Yes), the process starting with Step S101 is repeated. On the other hand, if the process is not to be continued as a result of the determination in Step S106 (if No), the series of the process is ended.
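  • The flow of FIG. 7 (Steps S101 to S106) could be sketched as a loop like the one below, reusing the hypothetical helpers from the earlier sketch. `read_accelerometer`, `notify_user`, and `broadcast_danger_information` are placeholders for the acceleration sensor 200, the alert notification, and the transmission via the wireless communication unit 110; they are not interfaces defined by the disclosure.

```python
import time


def monitoring_loop(read_accelerometer, notify_user, broadcast_danger_information,
                    window_size=50, period_s=0.02):
    """A possible reading of the FIG. 7 flow (walking-state case, Steps S101 to S106)."""
    window = []
    while True:
        # Step S101: acquire a detection result of the acceleration sensor 200.
        ax, ay, az = read_accelerometer()
        # Rough gravity removal so the magnitudes are comparable with the thresholds above.
        window.append(abs(magnitude(ax, ay, az) - 9.81))
        window = window[-window_size:]

        # Step S102: determine whether the user is in a walking state.
        if len(window) == window_size and classify_movement(window) == "walking":
            # Steps S103 and S104: monitor the acceleration and compare it with the
            # average walking acceleration of the user.
            if behavior_likely_to_induce_accident(window):
                # Step S105: alert the user and notify other entities (the information
                # processing device 600 of a vehicle 6 and the roadside unit 5).
                notify_user("Your movement may lead to a traffic accident.")
                broadcast_danger_information({"type": "caution-running-out",
                                              "timestamp": time.time()})

        # Step S106: decide whether to continue; this sketch simply keeps looping.
        time.sleep(period_s)
```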
  • FIG. 8 illustrates a flowchart showing one example of a process of, when the user of the electronic device 10 is in a stopping state, starting to monitor the acceleration acting on the subject device and determining whether the monitored acceleration exceeds the average walking acceleration, thereby determining whether the behavior of the user is likely to induce a traffic accident.
  • the controller 100 acquires a detection result of the acceleration sensor 200 (Step S201).
  • In Step S202, the controller 100 determines whether the user of the electronic device 10 is in a stopping state.
  • If the user is in a stopping state as a result of the determination in Step S202 (if Yes), the controller 100 starts monitoring the acceleration being the detection result of the acceleration sensor 200 (Step S203). On the other hand, if the user is not in a stopping state as a result of the determination in Step S202 (if No), the process proceeds to Step S206.
  • the controller 100 determines whether the monitored acceleration exceeds the average walking acceleration of the user of the subject device (Step S204).
  • If the monitored acceleration exceeds the average walking acceleration as a result of the determination in Step S204 (if Yes), the controller 100 issues an alert notification to the user of the subject device and also issues a notification about danger information to an entity other than the user (Step S205). On the other hand, if the monitored acceleration does not exceed the average walking acceleration as a result of the determination in Step S204 (if No), the process proceeds to Step S206.
  • In Step S206, the controller 100 determines whether to continue the process. If the process is to be continued as a result of the determination in Step S206 (if Yes), the process starting with Step S201 is repeated. On the other hand, if the process is not to be continued as a result of the determination in Step S206 (if No), the series of the process is ended.
  • the electronic device 10 starts monitoring the acceleration of the subject device when the user is in a walking state or in a stopping state, and issues an alert notification to the user and a notification about danger information to an entity other than the user on condition that the monitored acceleration exceeds the average walking acceleration of the user of the subject device. Therefore, danger information can be transmitted to the traveling vehicle 6 if the user behaves in a manner of running out into a roadway, for example.
  • the above-mentioned method of determining the behavior of the user with the use of the electronic device 10 is merely one example, and the behavior of the user may be determined by any method, not limited to the above. In short, it is only necessary that, in a case where the user may behave in a manner of running out into a roadway, for example, such a behavior can be transmitted to the traveling vehicle 6 as danger information.
  • FIG. 9 illustrates a block diagram showing one example of a configuration of the information processing device 600.
  • the information processing device 600 includes a CPU 601, a DSP 602, and a storage 603, and is connected to a wireless communication unit 610, a vehicle controller 620, and a camera 630 (imaging unit).
  • the vehicle controller 620 is a unit, such as an Advanced Driving Assistance System (ADAS) Electronic Control Unit (ECU), that automatically controls a steering operation, a braking operation, and an accelerating operation of the vehicle 6.
  • the wireless communication unit 610 includes an antenna 611, and can wirelessly communicate with the roadside unit 5 and the electronic device 10 with the use of the antenna 611.
  • the wireless communication unit 610 can, for example, wirelessly communicate using a 700 MHz band that is allocated in ITS.
  • the wireless communication unit 610 can communicate in conformity to the wireless standard of IEEE 802.11p used in ITS.
  • the storage 603 includes a non-transitory recording medium readable with the CPU 601 and the DSP 602, such as ROM and RAM.
  • the ROM of the storage 603 is, for example, flash ROM being non-volatile memory.
  • the storage 603 stores a plurality of control programs 603a etc. for controlling the information processing device 600.
  • the CPU 601 and the DSP 602 execute the various control programs 603a in the storage 603, whereby various functions of the information processing device 600 are implemented.
  • the storage 603 may include a non-transitory recording medium readable with a computer other than ROM and RAM. At least one control program 603a in the storage 603 may be stored in the storage 603 in advance. Further, at least one control program 603a in the storage 603 may be downloaded by the information processing device 600 from another device to be stored in the storage 603. Further, all of the functions of the information processing device 600 or a part of the functions of the information processing device 600 may be implemented by a hardware circuit that does not require software for implementing such functions.
  • the camera 630 can capture an image in at least a front direction of the vehicle 6.
  • the image generated by the camera 630 is input to the information processing device 600.
  • the camera 630 can capture a moving image and a still image.
  • the information processing device 600 can analyze the image received from the camera 630 to detect a pedestrian (e.g. the user 9 of the electronic device 10) who is about to enter a vehicle in the front direction, and the vehicle in the front direction itself.
  • the electronic device 10 can transmit danger information to the traveling vehicle 6.
  • the information processing device 600 that has received the danger information via the wireless communication unit 610 issues a danger aversion instruction to the vehicle controller 620 so that the vehicle controller 620 performs a steering operation, a braking operation, etc. that are necessary for avoiding contact with the user behaving dangerously.
  • the vehicle controller 620 that has received the danger aversion instruction performs a danger aversion operation such as sudden braking and sudden steering operations. Such operations, however, may raise a sense of discomfort in a passenger of the vehicle.
  • Alternatively, the information processing device 600 that has received the danger information notifies a driver about the reception of the danger information with voice etc. so as to prompt danger aversion, which causes the driver who has received such a notification to perform sudden braking and sudden steering operations. This, however, may impose a burden on the driver.
  • A problem here is the case where the determination result of the behavior of the user obtained by the electronic device 10 is wrong.
  • For example, the pedestrian may hurry and run, behaving with acceleration exceeding the average walking acceleration.
  • the electronic device 10 may transmit danger information even in such a case, and the traveling vehicle 6 that has received the danger information ends up performing the above-mentioned danger aversion operation.
  • FIG. 10 illustrates a diagram schematically showing a state where the pedestrian 9 possessing the electronic device 10 is about to enter a vehicle 60, and the vehicle 6 installed with the information processing device 600 approaches the vehicle 60.
  • the electronic device 10 of the pedestrian 9 determines that the behavior of the pedestrian 9 who is about to enter the vehicle 60 is a behavior likely to induce a traffic accident, and transmits danger information EI.
  • FIG. 10 illustrates one example in which a “caution running-out” message and a running-out picture image are transmitted to the vehicle 6.
  • In the information processing device 600 of the vehicle 6, appropriateness of the determination result of the behavior of the user obtained by the electronic device 10 is determined based on an image in the front direction captured by the camera 630. If the determination result is wrong, the received danger information is discarded so that the danger aversion operation is not performed. Note that, as examples of discarding the received danger information, the information processing device 600 may delete the danger information from the storage 603, or the information processing device 600 may not regard the received danger information as a condition for performing the danger aversion operation, irrespective of whether or not the information processing device 600 deletes the danger information from the storage 603.
  • the information processing device 600 determines whether it has received danger information via the wireless communication unit 610 (Step S1). If the information processing device 600 determines that it has received danger information (if Yes), the process proceeds to Step S2, and if the information processing device 600 determines that it has not received danger information (if No), Step S1 is repeated.
  • the information processing device 600 analyzes the image captured by the camera 630 (Step S2). Then, based on the analyzed image, the presence or absence of a vehicle in the front direction and the presence or absence of a pedestrian on the same side are checked (Step S3). That is, if the image captured by the camera 630 is an image in which a vehicle is stopped ahead of the subject vehicle in its traveling direction and a standing or bending-down pedestrian on the same side is about to enter that vehicle, the acceleration is thought to have exceeded the average walking acceleration because the pedestrian 9 possessing the electronic device 10 hurried to enter the vehicle.
  • As a method of detecting a vehicle and a pedestrian from the image, a method of learning feature quantities of images through deep learning and then recognizing a vehicle and a pedestrian through pattern recognition is conceivable.
  • a vehicle and a pedestrian may be determined through simple pattern matching.
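  • As a rough illustration of the simpler option, OpenCV's built-in HOG pedestrian detector could supply the pedestrian candidates, with the vehicle detector left as a placeholder; the disclosure names no concrete model, library, or matching rule, so everything below is an assumption.

```python
import cv2  # OpenCV; an assumed choice, the disclosure names no specific library

# Built-in HOG-based pedestrian detector bundled with OpenCV.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())


def detect_pedestrians(frame):
    """Return bounding boxes (x, y, w, h) of pedestrian candidates in the frame."""
    boxes, _weights = hog.detectMultiScale(frame, winStride=(8, 8))
    return list(boxes)


def detect_stopped_vehicles(frame):
    """Stand-in for a vehicle detector (e.g. a model trained through deep learning)."""
    return []  # plug in a real detector here


def beside_each_other(person, vehicle, margin_px=20):
    """Heuristic: the pedestrian box touches or overlaps the vehicle box horizontally."""
    px, _, pw, _ = person
    vx, _, vw, _ = vehicle
    return vx - margin_px <= px + pw and px <= vx + vw + margin_px


def person_about_to_enter_vehicle(frame):
    """Approximate the Step S3 check: a pedestrian standing beside (or bending toward)
    a stopped vehicle is taken as someone about to enter that vehicle."""
    pedestrians = detect_pedestrians(frame)
    vehicles = detect_stopped_vehicles(frame)
    return any(beside_each_other(p, v) for p in pedestrians for v in vehicles)
```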
  • If the image analyzed in Step S3 is an image in which a person is about to enter a vehicle (if Yes), the process proceeds to Step S4 to discard the received danger information and end the series of the process.
  • On the other hand, if the image analyzed in Step S3 is not an image in which a person is about to enter a vehicle (if No), the process proceeds to Step S5 to perform, based on the received danger information, a process necessary for avoiding contact with the pedestrian 9 possessing the electronic device 10 that has transmitted the danger information. That is, the information processing device 600 issues the danger aversion instruction to the vehicle controller 620 so that the vehicle controller 620 performs a steering operation, a braking operation, etc. that are necessary for avoiding contact with the user behaving dangerously. Note that, if the subject vehicle is not adapted to automated driving, the information processing device 600 that has received the danger information performs a process of notifying a driver about the reception of the danger information with voice etc. so as to prompt danger aversion.
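  • Putting Steps S1 to S5 together, the appropriateness check in the information processing device 600 might look like the sketch below, reusing the `person_about_to_enter_vehicle` sketch from above. `receive_danger_information`, `capture_front_image`, `issue_danger_aversion_instruction`, and `notify_driver` stand in for the wireless communication unit 610, the camera 630, the instruction to the vehicle controller 620, and a voice notification to the driver; none of these interfaces is defined by the disclosure.

```python
def handle_danger_information(receive_danger_information, capture_front_image,
                              issue_danger_aversion_instruction, notify_driver,
                              automated_driving=True):
    """A possible reading of the FIG. 11 flow in the information processing device 600."""
    while True:
        # Step S1: wait for danger information from the wireless communication unit 610.
        danger_info = receive_danger_information()
        if danger_info is None:
            continue

        # Steps S2 and S3: analyze the image from the camera 630 and check whether the
        # pedestrian who sent the information is merely about to enter a stopped vehicle.
        frame = capture_front_image()
        if person_about_to_enter_vehicle(frame):
            # Step S4: the danger information is judged inappropriate and discarded.
            continue

        # Step S5: the danger information is judged appropriate; avoid contact.
        if automated_driving:
            issue_danger_aversion_instruction(danger_info)  # to the vehicle controller 620
        else:
            notify_driver("Received danger information: a pedestrian may run out nearby.")
```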
  • the received danger information is discarded on condition that the pedestrian 9 possessing the electronic device 10 that has transmitted the danger information is determined to be about to enter a vehicle based on the image analysis.
  • the subject vehicle is not required to perform the operation of avoiding danger based on the danger information, thereby eliminating a burden on the passenger.
  • an obstacle present in the front direction of the vehicle 6 may be detected based on detection data obtained by an obstacle sensor such as light detection and ranging (LIDAR) installed in the vehicle 6, and appropriateness of the danger information may be determined based on the presence or absence of a stopped vehicle as an obstacle and a person on the same side.
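  • That alternative check could be made from obstacle-sensor output rather than from an image; the sketch below assumes a hypothetical list of detected obstacles, each with a class label, a position, and a speed, which is not a format defined by the disclosure or by any particular LIDAR driver.

```python
def danger_information_appropriate(obstacles, lateral_margin_m=2.0, along_margin_m=5.0):
    """Return False (discard the danger information) when a stopped vehicle and a person
    are detected right next to each other, i.e. the person is likely just entering the car.

    Each obstacle is assumed to be a dict such as
    {"class": "vehicle" or "person", "x_m": ..., "y_m": ..., "speed_m_s": ...};
    this format is an assumption made for this sketch only.
    """
    stopped_vehicles = [o for o in obstacles
                        if o["class"] == "vehicle" and o["speed_m_s"] < 0.5]
    persons = [o for o in obstacles if o["class"] == "person"]
    for vehicle in stopped_vehicles:
        for person in persons:
            if (abs(person["x_m"] - vehicle["x_m"]) < along_margin_m
                    and abs(person["y_m"] - vehicle["y_m"]) < lateral_margin_m):
                return False
    return True
```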
  • the roadside unit 5 is, as described above with reference to FIG. 1, often installed at an intersection, but may also be installed along a road other than an intersection. In such a case, the roadside unit 5 may have a function similar to that of the information processing device 600.
  • the roadside unit 5 includes the camera 530, and can therefore determine whether or not the pedestrian 9 possessing the electronic device 10 that has transmitted danger information is about to enter a vehicle by analyzing an image captured by the camera 530.
  • FIG. 12 illustrates a diagram schematically showing a state where the pedestrian 9 possessing the electronic device 10 is about to enter the vehicle 60, and the vehicle 6 approaches the vehicle 60.
  • the roadside unit 5 is installed along the roadway 7 between the vehicle 6 and the vehicle 60.
  • the electronic device 10 of the pedestrian 9 determines that the behavior of the pedestrian 9 who is about to enter the vehicle 60 is a behavior likely to induce a traffic accident, and transmits danger information EI.
  • FIG. 12 illustrates one example in which a “caution running-out” message and a running-out picture image are transmitted to the roadside unit 5.
  • In the controller 500 of the roadside unit 5, appropriateness of the determination result of the behavior of the user obtained by the electronic device 10 is determined based on an image captured by the camera 530. If the determination result is wrong, the received danger information is not relayed to the vehicle 6. Since the danger information is not relayed to the vehicle 6, the vehicle 6 is not required to perform the operation of avoiding danger based on the danger information.
  • the controller 500 that performs the process as above implements a function similar to that of the information processing device 600, and can therefore be referred to as an information processing device as well. Now, the operations of the controller 500 are described with reference to the flowchart illustrated in FIG. 13.
  • the controller 500 determines whether it has received danger information via the wireless communication unit 510 (Step S11). If the controller 500 determines that it has received danger information (if Yes), the process proceeds to Step S12, and if the controller 500 determines that it has not received danger information (if No), Step S11 is repeated.
  • the controller 500 analyzes the image captured by the camera 530 (Step S12).
  • FIG. 12 illustrates one example in which the camera 530 captures an image in the direction of the vehicle 60.
  • the camera 530 is a camera for capturing an image of the vicinity of the place where the roadside unit 5 is installed, and may therefore be made up of a plurality of cameras so as to be capable of capturing images in multiple directions, not only in one direction.
  • In Step S13, the presence or absence of a vehicle in the vicinity of the roadside unit 5 and the presence or absence of a pedestrian on the same side are checked.
  • If the camera 530 is made up of a plurality of cameras as described above, and each of the images captured by the cameras is analyzed to detect an image in which a pedestrian stands beside a stopped vehicle or bends down to enter a vehicle, the acceleration is thought to have exceeded the average walking acceleration because the pedestrian 9 possessing the electronic device 10 hurried to enter the vehicle.
  • If the image analyzed in Step S13 is an image in which a person is about to enter a vehicle (if Yes), the process proceeds to Step S14 to discard the received danger information and end the series of the process.
  • On the other hand, if the image analyzed in Step S13 is not an image in which a person is about to enter a vehicle (if No), the process proceeds to Step S15 to transmit the received danger information to the vehicle 6 in the vicinity of the roadside unit 5.
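  • Analogously, the controller 500 could implement the FIG. 13 flow as sketched below, again reusing `person_about_to_enter_vehicle`. Here `receive_danger_information`, `capture_images`, and `relay_to_nearby_vehicles` are placeholders for the wireless communication unit 510, the camera 530 (possibly several cameras), and the roadside-vehicle transmission; they are assumptions, not interfaces taken from the disclosure.

```python
def roadside_unit_loop(receive_danger_information, capture_images,
                       relay_to_nearby_vehicles):
    """A possible reading of the FIG. 13 flow in the controller 500 of the roadside unit 5."""
    while True:
        # Step S11: wait for danger information from an electronic device 10
        # via the wireless communication unit 510.
        danger_info = receive_danger_information()
        if danger_info is None:
            continue

        # Steps S12 and S13: analyze every image from the camera 530 and look for a
        # pedestrian standing beside, or bending down to enter, a stopped vehicle.
        if any(person_about_to_enter_vehicle(frame) for frame in capture_images()):
            # Step S14: discard the received danger information.
            continue

        # Step S15: relay the danger information to vehicles 6 (and possibly other
        # roadside units) in the vicinity of the roadside unit 5.
        relay_to_nearby_vehicles(danger_info)
```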
  • If the vehicle 6 that has received the danger information includes the information processing device 600 illustrated in FIG. 9, appropriateness of the danger information is determined in the information processing device 600 in accordance with the flow illustrated in FIG. 11.
  • In this case, the image is determined in Step S3 not to be an image in which a person is about to enter a vehicle, and a process necessary for avoiding contact with the pedestrian 9 possessing the electronic device 10 that has transmitted the danger information is performed in Step S5 based on the received danger information.
  • Alternatively, if the vehicle 6 does not include the information processing device 600, a process necessary for avoiding contact with the pedestrian 9 possessing the electronic device 10 that has transmitted the danger information may be performed immediately after the reception of the danger information.
  • an obstacle present at a place where the roadside unit 5 is installed may be detected based on detection data obtained by an obstacle sensor such as LIDAR installed in the roadside unit 5, and appropriateness of the danger information may be determined based on the presence or absence of a stopped vehicle as an obstacle and a person on the same side.
  • the roadside unit 5 transmits received danger information to the vehicle 6 in the vicinity of the roadside unit 5.
  • the destination for transmitting the danger information is not limited to a vehicle.
  • the danger information may be transmitted to another roadside unit, and may then be relayed. With this, even if the vehicle 6 is present at a place far from the roadside unit 5 that first received the danger information, the danger information can be transmitted to the vehicle 6.
  • such another roadside unit that has received danger information from the roadside unit 5 performs a process to relay the danger information without executing the flow illustrated in FIG. 13.
  • the flow illustrated in FIG. 13 may be configured in advance so as to be executed on condition that danger information is received from the electronic device 10.
  • the roadside unit 5 may transmit received danger information to an electronic device 10 of another pedestrian.
  • another pedestrian who has received the danger information via the electronic device 10 is alerted that there is a pedestrian nearby who behaves in a manner likely to induce a traffic accident, and can thus make an action to avoid the danger.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

An information processing device is disclosed. If the information processing device acquires, from a mobile electronic device, information notifying that a user of the mobile electronic device may behave dangerously in a manner likely to induce a traffic accident, the information processing device checks appropriateness of the information. If the information is inappropriate information, the information processing device discards the information.

Description

INFORMATION PROCESSING DEVICE, VEHICLE, AND ROADSIDE UNIT Cross-reference to Related Application
The present application claims priority to Japanese Patent Application No. 2017-163999 (filed on August 29, 2017), the content of which is incorporated by reference herein in its entirety.
The present technology relates to an information processing device.
For example, a pedestrian terminal device for ensuring the safety of a pedestrian has been developed, in which pedestrian information is transmitted to an on-board terminal device in a case where the pedestrian enters a dangerous area set based on person attribute information of the pedestrian.
Summary
An information processing device, a vehicle, and a roadside unit are disclosed. In one embodiment, if an information processing device acquires, from a mobile electronic device, information notifying that a user of the mobile electronic device may behave dangerously in a manner likely to induce a traffic accident, the information processing device checks appropriateness of the information. If the information is inappropriate information, the information processing device discards the information.
Further, in another embodiment, a vehicle includes the information processing device of the above-mentioned embodiment, and a vehicle controller. If the information processing device determines that the information is appropriate information, the information processing device instructs the vehicle controller to perform vehicle control to avoid contact with the user.
Further, in another embodiment, a roadside unit includes the information processing device of the above-mentioned embodiment, and a communication unit configured to communicate the information. If the information processing device determines that the information is appropriate information, the roadside unit transmits the information to a vehicle in a vicinity of the roadside unit and to another roadside unit in a vicinity of the roadside unit via the communication unit.
FIG. 1 illustrates a diagram showing one example of an information processing system. FIG. 2 illustrates a diagram showing one example of the information processing system. FIG. 3 illustrates a perspective view showing one example of external appearance of an electronic device. FIG. 4 illustrates a back view showing one example of external appearance of the electronic device. FIG. 5 illustrates a block diagram showing one example of a configuration of the electronic device. FIG. 6 illustrates a block diagram showing one example of a configuration of a roadside unit. FIG. 7 illustrates a flowchart showing one example of operations of the electronic device. FIG. 8 illustrates a flowchart showing one example of operations of the electronic device. FIG. 9 illustrates a block diagram showing one example of a configuration of an information processing device installed in a vehicle. FIG. 10 illustrates a diagram schematically showing a state where a pedestrian is about to enter a vehicle. FIG. 11 illustrates a flowchart showing one example of operations of the electronic device. FIG. 12 illustrates a diagram schematically showing a state where a pedestrian is about to enter a vehicle. FIG. 13 illustrates a flowchart showing one example of operations of a controller of a vehicle.
FIG. 1 and FIG. 2 each illustrate a diagram showing one example of an information processing system 1. The information processing system 1 is, for example, a system used in intelligent transport systems (ITS). Specifically, the information processing system 1 is a safe driving support communication system of ITS. The safe driving support communication system may be referred to as a safe driving support system, or as a safe driving support wireless system.
The information processing system 1 includes a plurality of types of information processing devices. The plurality of types of information processing devices include a roadside unit 5, and an electronic device installed in a vehicle 6.
In the information processing system 1, as illustrated in FIG. 1, a roadside unit 5 disposed at an intersection 2 etc., a vehicle 6 such as an automobile that travels on a roadway 7, and an electronic device 10 of a user 9 being a pedestrian can wirelessly communicate with each other. With this, the roadside unit 5, the vehicle 6, and the electronic device 10 can exchange information with each other. Further, a plurality of vehicles 6 can wirelessly communicate with one another. With this, the plurality of vehicles 6 can exchange information with one another. Communication between the roadside unit 5 and the vehicle 6, communication between the vehicles 6, communication between the roadside unit 5 and the electronic device 10 of a pedestrian, and communication between the electronic device 10 of a pedestrian and the vehicle 6 are respectively referred to as roadside-vehicle communication, inter-vehicle communication, roadside-pedestrian communication, and pedestrian-vehicle communication.
Further, as illustrated in FIG. 2, in the information processing system 1, the electronic device 10, the roadside unit 5, and a server device 8 are connected to a network 900. The network 900 includes, for example, a relay device, the Internet, etc. Each of the electronic device 10 and the roadside unit 5 can communicate with the server device 8 via the network 900.
The electronic device 10 is, for example, a mobile electronic device such as a smartphone. The electronic device 10 can specify an operation state of its user 9. The electronic device 10 can inform the roadside unit 5, the vehicle 6, etc. of information about the specified operation state of the user 9 or the like. Operation of the electronic device 10 will be described later in detail.
The roadside unit 5 can, for example, inform the vehicle 6 and the electronic device 10 of information about lighting of a traffic light 4, information about road traffic control, etc. Further, the roadside unit 5 can detect a nearby vehicle 6 and a nearby pedestrian. The roadside unit 5 disposed at the intersection 2 can, for example, detect a pedestrian passing across a crosswalk 3. Further, the roadside unit 5 can inform the vehicle 6 and the electronic device 10 of the information about the detected vehicle 6 and pedestrian. Further, the roadside unit 5 can inform another vehicle 6 and another electronic device 10 of the information informed of by the vehicle 6 and the electronic device 10.
The vehicle 6 can inform another vehicle 6, the roadside unit 5, and the electronic device 10 of information about its own position, speed, turn signals, etc. Further, the vehicle 6 can support safe driving of its driver by issuing various notifications such as warning to the driver based on information informed of by another device. The vehicle 6 can issue various notifications to the driver with the use of a speaker, a display, etc. The vehicle 6 can, for example, issue various notifications to the driver with the use of a car navigation device installed in the vehicle 6. Further, the vehicle 6 can automatically control a steering operation, a braking operation, and an accelerating operation.
The server device 8 manages map information including road information, facility information, etc. The server device 8 can transmit map information to the electronic device 10, the roadside unit 5, and the vehicle 6. When receiving map information from the server device 8, the electronic device 10 and the vehicle 6 can, for example, display a map or the like based on the received map information.
In this manner, in the information processing system 1, the roadside-vehicle communication, the inter-vehicle communication, the roadside-pedestrian communication, and the pedestrian-vehicle communication are performed, thereby supporting safe driving of the driver of the vehicle 6.
Note that, in one example of FIG. 1, an automobile is illustrated as the vehicle 6. However, the vehicle 6 may be a vehicle other than an automobile. For example, the vehicle 6 may be a bus or a streetcar.
FIG. 3 and FIG. 4 respectively illustrate a perspective view and a back view each showing one example of external appearance of the electronic device 10. As illustrated in FIG. 3 and FIG. 4, the electronic device 10 includes a plate-like device case 11 having a substantially rectangular shape in plan view. The device case 11 forms the exterior of the electronic device 10.
A display region 12 for displaying various pieces of information, such as characters, symbols, and figures, is located on a front surface 11a of the device case 11. A touch panel 130 to be described later is located on a back surface side of the display region 12. With this, the user 9 can input various pieces of information to the electronic device 10 by operating the display region 12 on the front surface of the electronic device 10 with a finger etc. Note that, the user 9 can also input various pieces of information to the electronic device 10 by operating the display region 12 with a pointer other than a finger, that is, for example, a touch-panel pen such as a stylus pen.
A receiver hole 13 is located at an upper end portion of the front surface 11a of the device case 11. A speaker hole 14 is located at a lower end portion of the front surface 11a. A microphone hole 15 is located in a side surface 11c on a lower side of the device case 11.
A lens 181 of a first camera 180 to be described later is visually recognizable from an upper end portion of the front surface 11a of the device case 11. As illustrated in FIG. 4, a lens 191 of a second camera 190 to be described later is visually recognizable from an upper end portion of a back surface 11b of the device case 11.
The electronic device 10 includes an operation button group 220 made up of a plurality of operation buttons 22 (refer to FIG. 5). Each of the plurality of operation buttons 22 is a hardware button. Specifically, each of the plurality of operation buttons 22 is a push button. Note that, at least one operation button 22 included in the operation button group 220 may be a software button displayed in the display region 12.
The operation button group 220 includes operation buttons 22a, 22b, and 22c that are located at a lower end portion of the front surface 11a of the device case 11. Further, the operation button group 220 may include a power button and a volume button that are located on a surface of the device case 11.
The operation button 22a is, for example, a back button. The back button is an operation button for switching the display of the display region 12 to a previous display. When the user 9 operates the operation button 22a, the display of the display region 12 is switched to a previous display. The operation button 22b is, for example, a home button. The home button is an operation button for displaying a home screen in the display region 12. When the user 9 operates the operation button 22b, a home screen is displayed in the display region 12. The operation button 22c is, for example, a history button. The history button is an operation button for displaying in the display region 12 the history of applications executed in the electronic device 10. When the user 9 operates the operation button 22c, the history of applications executed in the electronic device 10 is displayed in the display region 12.
Next, one example of an electrical configuration of the electronic device is described. FIG. 5 illustrates a block diagram mainly showing one example of an electrical configuration of the electronic device 10. As illustrated in FIG. 5, the electronic device 10 includes a controller 100, a wireless communication unit 110, a display 120, an operation unit 210, and a satellite signal receiver 140. The electronic device 10 further includes a receiver 150, a speaker 160, a microphone 170, a first camera 180, a second camera 190, an acceleration sensor 200, and a battery 230. These components of the electronic device 10 are accommodated inside the device case 11.
The controller 100 can integrally manage the operations of the electronic device 10 by controlling other components of the electronic device 10. The controller 100 includes at least one processor for providing control and processing capability to perform various functions as described in further detail below.
According to various embodiments, the at least one processor may be implemented as a single integrated circuit (IC) or as multiple communicatively coupled ICs and/or discrete circuits. It is appreciated that the at least one processor can be implemented according to various known technologies.
In one embodiment, the processor includes one or more circuits or units configurable to perform one or more data computing procedures or processes by executing instructions stored in an associated memory, for example. In other embodiments, the processor may be implemented as firmware (e.g. discrete logic components) configured to perform one or more data computing procedures or processes.
According to various embodiments, the processor may include one or more processors, controllers, microprocessors, microcontrollers, application specific integrated circuits (ASICs), digital signal processors, programmable logic devices, field programmable gate arrays, or any combination of these devices or structures, or other known devices and structures, to perform the functions described herein.
In this example, the controller 100 includes a central processing unit (CPU) 101, a digital signal processor (DSP) 102, and a storage 103. The storage 103 includes a non-transitory recording medium readable with the CPU 101 and the DSP 102, such as read only memory (ROM) and random access memory (RAM). The ROM of the storage 103 is, for example, flash ROM (flash memory) being non-volatile memory. The storage 103 can store a plurality of control programs 103a for controlling the electronic device 10, acceleration data 103b, movement determination data 103c, threshold value data 103d, setting data 103z, etc.
The CPU 101 and the DSP 102 execute the various control programs 103a in the storage 103, whereby various functions of the controller 100 are implemented.
The plurality of control programs 103a in the storage 103 include various applications (application programs).
The control programs 103a can provide a function of determining whether the behavior of a user of the subject device is likely to induce a traffic accident based on acceleration being a detection result of the acceleration sensor 200. Specifically, the control programs 103a can measure vibration and motion acting on the subject device based on the direction and magnitude of the acceleration being a detection result of the acceleration sensor 200. The control programs 103a can determine whether the user of the subject device is in a walking state by comparing a measurement result of the measured vibration and motion with the movement determination data 103c. The control programs 103a can start monitoring acceleration if determining that the user is in a walking state.
The control programs 103a can determine that the behavior of the user is likely to induce a traffic accident if the monitored acceleration exceeds a predetermined threshold value contained in the threshold value data 103d. The control programs 103a can use, for example, average walking acceleration of the user as the predetermined threshold value. In this case, the control programs 103a can determine that the behavior of the user is likely to induce a traffic accident if the monitored acceleration exceeds the average walking acceleration of the user. By determining whether the monitored acceleration exceeds the average walking acceleration of the user with the use of a function provided by the control programs 103a, a behavior in which the user suddenly starts to run, for example, can be detected.
Further, the control programs 103a can also determine whether the user of the subject device is in a stopping state by comparing the measurement result of the vibration and motion acting on the subject device with the movement determination data 103c. The control programs 103a can start monitoring acceleration if determining that the user is in a stopping state. Further, similarly to the case of determining that the user is in a walking state, the control programs 103a can determine that the behavior of the user is likely to induce a traffic accident if the monitored acceleration exceeds a predetermined threshold value contained in the threshold value data 103d.
If the control programs 103a determine that the behavior of the user of the subject device is likely to induce a traffic accident, the control programs 103a can provide a function to issue an alert notification to the user and a notification about the danger to another entity other than the user.
The acceleration data 103b contains a value of the acceleration obtained by the acceleration sensor 200. The acceleration data 103b contains a direction and magnitude of the acceleration obtained by the acceleration sensor 200. The acceleration data 103b may contain all of the measurement results measured by the acceleration sensor 200.
The movement determination data 103c contains, for example, information of determination conditions to be used at the time of determining a movement state of the user. The information of the determination conditions may contain a direction and magnitude of the acceleration acting on the subject device, an acceleration pattern made up of time-series changes in a direction and magnitude of the acceleration, or a combined vector in which the acceleration in three axes of an X-axis, a Y-axis, and a Z-axis are combined. The information of the determination conditions contains at least information for determining whether the user is in a walking state or in a stopping state that is obtained from the detection result of the acceleration sensor 200.
The threshold value data 103d contains information of a predetermined threshold value for determining whether the behavior of the user is likely to induce a traffic accident. The threshold value data 103d contains, as information of the predetermined threshold value, for example, information of average walking acceleration of the user of the subject device that is measured in advance.
The setting data 103z contains information of various settings about operations of the electronic device 10. In one embodiment, the setting data 103z contains information about notification modes of an alert issued when the behavior of the user is determined to be likely to induce a traffic accident. The notification modes may include a pattern using at least one of sound, image, light, vibration, etc.
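Although the disclosure does not specify any particular implementation, the following Python sketch illustrates one possible reading of the detection logic described above. The class name BehaviorMonitor and its methods are hypothetical and are not part of this disclosure; the sketch only assumes a pre-measured average walking acceleration as the threshold and the combined vector of the three-axis acceleration as the monitored quantity.

```python
import math

class BehaviorMonitor:
    """Hypothetical sketch of the monitoring performed by the control programs 103a."""

    def __init__(self, average_walking_acceleration):
        # Threshold corresponding to the threshold value data 103d
        # (the average walking acceleration of the user, measured in advance).
        self.threshold = average_walking_acceleration
        self.monitoring = False

    def update_state(self, is_walking, is_stopping):
        # Monitoring starts when the user is determined, from the movement
        # determination data 103c, to be in a walking state or a stopping state.
        self.monitoring = is_walking or is_stopping

    def is_dangerous(self, ax, ay, az):
        # Combined vector of the three-axis (X, Y, Z) acceleration; the behavior
        # is treated as likely to induce a traffic accident when the magnitude
        # exceeds the average walking acceleration.
        if not self.monitoring:
            return False
        return math.sqrt(ax * ax + ay * ay + az * az) > self.threshold
```

For example, a monitor constructed with a threshold measured in advance for the user would report danger only while the user is in a walking or stopping state and the combined acceleration exceeds that threshold.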
The controller 100 may include a plurality of CPUs 101. Further, the controller 100 may omit the DSP 102, or may include a plurality of DSPs 102. Further, all of the functions of the controller 100 or a part of the functions of the controller 100 may be implemented by a hardware circuit that does not require software for implementing such functions.
The storage 103 may include a non-transitory recording medium readable with a computer other than ROM and RAM. The storage 103 may include, for example, a small-sized hard disk drive, a solid state drive (SSD), and the like.
The wireless communication unit 110 includes an antenna 111. The wireless communication unit 110 may also be referred to as a wireless communication circuit. With the use of the antenna 111, the wireless communication unit 110 can, for example, wirelessly communicate in a plurality of types of communication methods. The wireless communication of the wireless communication unit 110 is controlled by the controller 100.
The wireless communication unit 110 can wirelessly communicate with a base station of a mobile phone system. The wireless communication unit 110 can communicate with an information processing device different from the electronic device 10, such as the server device 8, a mobile phone, and a web server, via the network 900 including the base station. The electronic device 10 can perform data communication, a voice call, a video call, etc. with another mobile phone.
Further, the wireless communication unit 110 can wirelessly communicate with the roadside unit 5 and the vehicle 6. For example, the wireless communication unit 110 can wirelessly communicate using a 700 MHz band that is allocated in ITS. For example, the wireless communication unit 110 can communicate in conformity to the wireless standard of IEEE 802.11p used in ITS. Further, the wireless communication unit 110 can wirelessly communicate using a wireless local area network (LAN), such as Wifi. Further, the wireless communication unit 110 can perform near-field wireless communication. For example, the wireless communication unit 110 can wirelessly communicate in conformity to Bluetooth (registered trademark). The wireless communication unit 110 may be able to wirelessly communicate in conformity to at least one of ZigBee (registered trademark) and near field communication (NFC).
The wireless communication unit 110 subjects a signal received at the antenna 111 to various types of processing such as amplification processing, and outputs the processed received signal to the controller 100. The controller 100 subjects the input received signal to various types of processing, and acquires information contained in the received signal. Further, the controller 100 outputs a transmission signal containing information to the wireless communication unit 110. The wireless communication unit 110 subjects the input transmission signal to various types of processing such as amplification processing, and wirelessly transmits the processed transmission signal from the antenna 111.
The display 120 includes the display region 12 located on the front surface of the electronic device 10, and a display panel 121. The display 120 can display various pieces of information in the display region 12. The display panel 121 is, for example, a liquid crystal display panel or an organic EL panel. The display panel 121 can display various pieces of information, such as characters, symbols, and figures under control of the controller 100. The display panel 121 is opposed to the display region 12 inside the device case 11. Information displayed in the display panel 121 is displayed in the display region 12.
The operation unit 210 can accept various operations performed on the electronic device 10 by the user 9. The operation unit 210 includes the touch panel 130 and the operation button group 220.
The touch panel 130 can detect operations performed on the display region 12 by a pointer such as a finger. With this, the touch panel 130 can accept operations performed on the display region 12 by the user 9. The touch panel 130 is, for example, a projected capacitive touch panel. The touch panel 130 is, for example, located on the back of the display region 12. When the user 9 operates the display region 12 with a pointer such as a finger, the touch panel 130 can input an electrical signal to the controller 100 in accordance with the operation. Based on the electrical signal (output signal) from the touch panel 130, the controller 100 can identify the detail of the operation performed on the display region 12. Then, the controller 100 can perform processing in accordance with the identified detail of the operation.
When operated by the user 9, each operation button 22 of the operation button group 220 can output an operation signal to the controller 100, indicating that the operation button 22 has been operated. With this, regarding each operation button 22, the controller 100 can determine whether or not the operation button 22 has been operated. The controller 100 that has received the input operation signal controls other components, whereby a function allocated to the operated operation button 22 is implemented in the electronic device 10.
The satellite signal receiver 140 can receive a satellite signal transmitted from a positioning satellite. Then, based on the received satellite signal, the satellite signal receiver 140 can acquire positional information indicating the position of the electronic device 10. The positional information acquired by the satellite signal receiver 140 contains, for example, latitude and longitude indicating the position of the electronic device 10. The controller 100 can initiate operation of the satellite signal receiver 140, and can also stop the operation. Hereinafter, the satellite signal receiver 140 may simply be referred to as the “receiver 140.”
The receiver 140 is, for example, a GPS receiver, and can receive wireless signals from positioning satellites of the Global Positioning System (GPS). Based on the received wireless signals, the receiver 140 calculates the current position of the electronic device 10 by means of latitude and longitude, for example, and outputs the positional information containing the calculated latitude and longitude to the controller 100. It can also be said that the positional information of the electronic device 10 corresponds to positional information indicating the position of the user 9 possessing the electronic device 10.
Note that, the receiver 140 may obtain positional information of the electronic device 10 based on signals from positioning satellites of a global navigation satellite system (GNSS) other than GPS. For example, the receiver 140 may obtain positional information of the electronic device 10 based on signals from positioning satellites of the Global Navigation Satellite System (GLONASS), the Indian Regional Navigational Satellite System (IRNSS), COMPASS, Galileo, or the Quasi-Zenith Satellites System (QZSS).
The microphone 170 can convert sound input from the outside of the electronic device 10 into an electrical sound signal, and can output the converted signal to the controller 100. Sound from the outside of the electronic device 10 is input to the inside of the electronic device 10 through the microphone hole 15, to be input into the microphone 170.
The speaker 160 is, for example, a dynamic speaker. The speaker 160 can convert the electrical sound signal from the controller 100 into sound, and can output the converted sound. The sound output from the speaker 160 is output to the outside through the speaker hole 14. The user 9 can hear the sound output from the speaker hole 14 even at a place far from the electronic device 10.
The receiver 150 can output phone-call received sound. The receiver 150 is, for example, a dynamic speaker. The receiver 150 can convert the electrical sound signal from the controller 100 into sound, and can output the converted sound. The sound output from the receiver 150 is output to the outside through the receiver hole 13. The volume of the sound output through the receiver hole 13 is smaller than the volume of the sound output through the speaker hole 14. The user 9 brings his/her ear closer to the receiver hole 13, whereby the user 9 can hear the sound output through the receiver hole 13. Note that, in place of the receiver 150, a vibration element, such as a piezoelectric vibration element, that vibrates a front surface portion of the device case 11 may be provided. In this case, the sound is conveyed to the user through the vibration in the front surface portion.
The first camera 180 includes the lens 181, an image sensor, etc. The second camera 190 includes the lens 191, an image sensor, etc. Each of the first camera 180 and the second camera 190 can capture an image of an object based on control of the controller 100, can generate a still image or a moving image showing the captured object, and can output the captured image to the controller 100.
The acceleration sensor 200 can detect acceleration of the electronic device 10. The acceleration sensor 200 is, for example, a three-axis acceleration sensor. The acceleration sensor 200 can detect acceleration of the electronic device 10 in an x-axis direction, a y-axis direction, and a z-axis direction. The x-axis direction, the y-axis direction, and the z-axis direction are, for example, set in a long-side direction, a short-side direction, and a thickness direction of the electronic device 10, respectively.
The battery 230 can output power for the electronic device 10. The battery 230 is, for example, a rechargeable battery. The power output from the battery 230 is supplied to various components of the electronic device 10, such as the controller 100 and the wireless communication unit 110.
Further, the electronic device 10 may include a sensor other than the acceleration sensor 200. For example, the electronic device 10 may include at least one of an air pressure sensor, a geomagnetic sensor, a temperature sensor, a proximity sensor, an illuminance sensor, and a gyro sensor.
Next, one example of a configuration of the roadside unit is described. FIG. 6 illustrates a block diagram showing one example of a configuration of the roadside unit 5. As illustrated in FIG. 6, the roadside unit 5 includes a controller 500, a wireless communication unit 510, a wired communication unit 520, and a camera 530 (imaging unit).
The controller 500 can integrally manage the operations of the roadside unit 5 by controlling other components of the roadside unit 5. The controller 500 includes at least one processor for providing control and processing capability to perform various functions as described in further detail below. The above description about the processor of the controller 100 of the electronic device 10 is applicable to the processor of the controller 500 as well.
In this example, the controller 500 includes a CPU 501, a DSP 502, and a storage 503. The storage 503 includes a non-transitory recording medium readable with the CPU 501 and the DSP 502, such as ROM and RAM. The ROM of the storage 503 is, for example, flash ROM being non-volatile memory. The storage 503 stores a plurality of control programs 503a etc. for controlling the roadside unit 5. The CPU 501 and the DSP 502 execute the various control programs 503a in the storage 503, whereby various functions of the controller 500 are implemented.
Similarly to the storage 103 of the electronic device 10, the storage 503 may include a non-transitory recording medium readable with a computer other than ROM and RAM. At least one control program 503a in the storage 503 may be stored in the storage 503 in advance. Further, at least one control program 503a in the storage 503 may be downloaded by the roadside unit 5 from another device to be stored in the storage 503. Further, all of the functions of the controller 500 or a part of the functions of the controller 500 may be implemented by a hardware circuit that does not require software for implementing such functions.
The wireless communication unit 510 includes an antenna 511. With the use of the antenna 511, the wireless communication unit 510 can wirelessly communicate with the vehicle 6 (specifically, an electronic device inside the vehicle 6) and the electronic device 10. The wireless communication unit 510 can, for example, wirelessly communicate using a 700 MHz band that is allocated in ITS. The wireless communication unit 510 can communicate in conformity to the wireless standard of IEEE 802.11p used in ITS.
The wireless communication unit 510 subjects a signal received at the antenna 511 to various types of processing such as amplification processing, and outputs the processed received signal to the controller 500. The controller 500 subjects the input received signal to various types of processing, and acquires information contained in the received signal. Further, the controller 500 outputs a transmission signal containing information to the wireless communication unit 510. The wireless communication unit 510 subjects the input transmission signal to various types of processing such as amplification processing, and wirelessly transmits the processed transmission signal from the antenna 511.
Note that, the wireless communication unit 510 may be able to wirelessly communicate with the vehicle 6 and the electronic device 10 using a wireless local area network (LAN), such as Wifi.
The wired communication unit 520 is connected to the network 900 in a wired manner. The wired communication unit 520 can communicate with a device connected to the network 900, such as the server device 8, via the network 900. The wired communication unit 520 can input information received from the network 900 to the controller 500. Further, the wired communication unit 520 can output information received from the controller 500 to the network 900.
The camera 530 can capture an image of a place where the roadside unit 5 is installed. The image generated by the camera 530 is input to the controller 500. The camera 530 can capture a moving image and a still image. The controller 500 can, for example, make the wireless communication unit 510 transmit the image received from the camera 530 to the electronic device 10 and the vehicle 6. Further, the controller 500 can analyze the image received from the camera 530 to detect a pedestrian (e.g. the user 9 of the electronic device 10) who is about to enter a vehicle, and the vehicle itself. The controller 500 can transmit information to the vehicle 6 and the electronic device 10 via the wireless communication unit 510.
Note that, the roadside unit 5 may be able to wirelessly communicate with the network 900. Further, the roadside unit 5 may be able to perform near-field wireless communication. Further, the roadside unit 5 may be able to wirelessly communicate in conformity to Bluetooth.
Next, a flow of a process executed by the electronic device 10 is described. FIG. 7 and FIG. 8 each illustrate a flowchart showing one example of a flow of a process executed by the electronic device 10. The controller 100 executes the control programs 103a stored in the storage 103, whereby the process illustrated in FIG. 7 and FIG. 8 is implemented.
FIG. 7 illustrates a flowchart showing one example of a process of, when the user of the electronic device 10 is in a walking state, starting monitoring of the acceleration acting on the electronic device 10 and determining whether the monitored acceleration exceeds the average walking acceleration, thereby determining whether the behavior of the user is likely to induce a traffic accident.
As illustrated in FIG. 7, the controller 100 acquires a detection result of the acceleration sensor 200 (Step S101).
Subsequently, based on the detection result of the acceleration sensor 200, the controller 100 determines whether the user of the electronic device 10 is in a walking state (Step S102).
If the user is in a walking state as a result of the determination in Step S102 (if Yes), the controller 100 starts monitoring the acceleration being the detection result of the acceleration sensor 200 (Step S103). On the other hand, if the user is not in a walking state as a result of the determination in Step S102 (if No), the process proceeds to Step S106.
Next, the controller 100 determines whether the monitored acceleration exceeds average walking acceleration of the user of the subject device (Step S104).
If the monitored acceleration exceeds the average walking acceleration as a result of the determination in Step S104 (if Yes), the controller 100 issues an alert notification to the user of the subject device and also issues a notification about danger information to another entity other than the user (Step S105). On the other hand, if the monitored acceleration does not exceed the average walking acceleration as a result of the determination in Step S104 (if No), the process proceeds to Step S106.
Here, the “another entity” corresponds to at least one of an information processing device 600 installed in the vehicle 6 and the roadside unit 5. The controller 100 transmits, to the information processing device 600 installed in the vehicle 6 and the roadside unit 5, danger information that there is a pedestrian who behaves in a manner likely to induce a traffic accident through pedestrian-vehicle communication and roadside-pedestrian communication via the wireless communication unit 110.
Next, the controller 100 determines whether to continue the process (Step S106). If the process is to be continued as a result of the determination in Step S106 (if Yes), the process starting with Step S101 is repeated. On the other hand, if the process is not to be continued as a result of the determination in Step S106 (if No), the series of the process is ended.
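As an illustrative, non-limiting sketch, the FIG. 7 flow (Steps S101 to S106) might be organized as follows in Python. The callables read_acceleration, is_walking, notify_user, notify_others, and should_continue are placeholders standing in for the acceleration sensor 200, the movement determination, the alert notification, the wireless transmission of danger information, and the continuation check; none of these names is defined in this disclosure.

```python
import math

def monitoring_loop(read_acceleration, is_walking, threshold,
                    notify_user, notify_others, should_continue):
    while True:
        # Step S101: acquire a detection result of the acceleration sensor.
        ax, ay, az = read_acceleration()
        # Steps S102 and S103: if the user is in a walking state, monitoring starts.
        if is_walking(ax, ay, az):
            # Step S104: compare the monitored acceleration with the average
            # walking acceleration (threshold value data 103d).
            if math.sqrt(ax * ax + ay * ay + az * az) > threshold:
                # Step S105: alert the user and transmit danger information to
                # another entity (the vehicle 6 and the roadside unit 5).
                notify_user("Alert: behavior likely to induce a traffic accident")
                notify_others({"type": "danger_information",
                               "detail": "pedestrian may run out into a roadway"})
        # Step S106: decide whether to continue the process.
        if not should_continue():
            break
```

The FIG. 8 flow for the stopping state would differ only in the state check performed before monitoring starts.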
FIG. 8 illustrates a flowchart showing one example of a process of, when the user of the electronic device 10 is in a stopping state, starting monitoring of the acceleration acting on the subject device and determining whether the monitored acceleration exceeds the average walking acceleration, thereby determining whether the behavior of the user is likely to induce a traffic accident.
As illustrated in FIG. 8, the controller 100 acquires a detection result of the acceleration sensor 200 (Step S201).
Subsequently, based on the detection result of the acceleration sensor 200, the controller 100 determines whether the user of the electronic device 10 is in a stopping state (Step S202).
If the user is in a stopping state as a result of the determination in Step S202 (if Yes), the controller 100 starts monitoring the acceleration as the detection result of the acceleration sensor 200 (Step S203). On the other hand, if the user is not in a stopping state as a result of the determination in Step S202 (if No), the process proceeds to Step S206.
Next, the controller 100 determines whether the monitored acceleration exceeds average walking acceleration of the user of the subject device (Step S204).
If the monitored acceleration exceeds average walking acceleration as a result of the determination in Step S204 (if Yes), the controller 100 issues an alert notification to the user of the subject device and also issues a notification about danger information to another entity other than the user (Step S205). On the other hand, if the monitored acceleration does not exceed the average walking acceleration as a result of the determination in Step S204 (if No), the process proceeds to Step S206.
Next, the controller 100 determines whether to continue the process (Step S206). If the process is to be continued as a result of the determination in Step S206 (if Yes), the process starting with Step S201 is repeated. On the other hand, if the process is not to be continued as a result of the determination in Step S206 (if No), the series of the process is ended.
As described above, the electronic device 10 starts monitoring the acceleration of the subject device when the user is in a walking state or in a stopping state, and issues an alert notification to the user and a notification about danger information to another entity other than the user on condition that the monitored acceleration exceeds average walking acceleration of the user of the subject device. Therefore, danger information can be transmitted to the traveling vehicle 6 if the user behaves in a manner of running out into a roadway, for example.
Note that, the above-mentioned method of determining the behavior of the user with the use of the electronic device 10 is merely one example, and the behavior of the user may be determined by any method, not limited to the above. In short, it is only necessary that, in a case where the user may behave in a manner of running out into a roadway, for example, such a case be transmitted to the traveling vehicle 6 as danger information.
Next, one example of a configuration of the information processing device 600 installed in the vehicle 6 is described. FIG. 9 illustrates a block diagram showing one example of a configuration of the information processing device 600. As illustrated in FIG. 9, the information processing device 600 includes a CPU 601, a DSP 602, and a storage 603, and is connected to a wireless communication unit 610, a vehicle controller 620, and a camera 630 (imaging unit).
The vehicle controller 620 is a unit, such as Advanced Driving Assistance Systems (ADAS)-Electronic Control Unit (ECU), that automatically controls a steering operation, a braking operation, and an accelerating operation of the vehicle 6.
The wireless communication unit 610 includes an antenna 611, and can wirelessly communicate with the roadside unit 5 and the electronic device 10 with the use of the antenna 611. The wireless communication unit 610 can, for example, wirelessly communicate using a 700 MHz band that is allocated in ITS. The wireless communication unit 610 can communicate in conformity to the wireless standard of IEEE 802.11p used in ITS.
The storage 603 includes a non-transitory recording medium readable with the CPU 601 and the DSP 602, such as ROM and RAM. The ROM of the storage 603 is, for example, flash ROM being non-volatile memory. The storage 603 stores a plurality of control programs 603a etc. for controlling the information processing device 600. The CPU 601 and the DSP 602 execute the various control programs 603a in the storage 603, whereby various functions of the information processing device 600 are implemented.
Similarly to the storage 103 of the electronic device 10, the storage 603 may include a non-transitory recording medium readable with a computer other than ROM and RAM. At least one control program 603a in the storage 603 may be stored in the storage 603 in advance. Further, at least one control program 603a in the storage 603 may be downloaded by the information processing device 600 from another device to be stored in the storage 603. Further, all of the functions of the information processing device 600 or a part of the functions of the information processing device 600 may be implemented by a hardware circuit that does not require software for implementing such functions.
The camera 630 can capture an image in at least a front direction of the vehicle 6. The image generated by the camera 630 is input to the information processing device 600. The camera 630 can capture a moving image and a still image.
The information processing device 600 can analyze the image received from the camera 630 to detect a pedestrian (e.g. the user 9 of the electronic device 10) who is about to enter a vehicle in the front direction, and the vehicle in the front direction itself.
As in the description above, the electronic device 10 can transmit danger information to the traveling vehicle 6. The information processing device 600 that has received the danger information via the wireless communication unit 610 issues a danger aversion instruction to the vehicle controller 620 so that the vehicle controller 620 performs a steering operation, a braking operation, etc. that are necessary for avoiding contact with the user behaving dangerously. The vehicle controller 620 that has received the danger aversion instruction performs a danger aversion operation such as sudden braking and sudden steering operations. Such operations, however, may raise a sense of discomfort in a passenger of the vehicle. Further, in a vehicle not adapted to automated driving, the information processing device 600 that has received the danger information notifies a driver about the reception of the danger information with voice etc. so as to prompt danger aversion, which causes the driver having received such a notification to perform sudden braking and sudden steering operations. This, however, may result in imposing a burden on the driver.
Here, the problem is the case where the determination result of the behavior of the user obtained by the electronic device 10 is wrong. For example, in a case where a pedestrian possessing the electronic device 10 is about to enter a bus stopping at a bus stop, the pedestrian may run and hurry, behaving at acceleration exceeding the average walking acceleration. The electronic device 10 may transmit danger information even in such a case, and the traveling vehicle 6 that has received the danger information ends up performing the above-mentioned danger aversion operation.
The information processing device 600 can reduce the number of times such a danger aversion operation is performed. FIG. 10 illustrates a diagram schematically showing a state where the pedestrian 9 possessing the electronic device 10 is about to enter a vehicle 60, and the vehicle 6 installed with the information processing device 600 approaches the vehicle 60. The electronic device 10 of the pedestrian 9 determines that the behavior of the pedestrian 9 who is about to enter the vehicle 60 is a behavior likely to induce a traffic accident, and transmits danger information EI. FIG. 10 illustrates one example in which a "caution running-out" message and a running-out picture image are transmitted to the vehicle 6.
At this time, in the information processing device 600 of the vehicle 6, appropriateness of the determination result of the behavior of the user obtained by the electronic device 10 is determined based on an image in the front direction captured by the camera 630. If the determination result is wrong, the received danger information is discarded so as not to perform the danger aversion operation. Note that, as examples of discarding the received danger information, the information processing device 600 may delete the danger information from the storage 603, or the information processing device 600 may not regard the received danger information as a condition for performing the danger aversion operation, irrespective of whether or not the information processing device 600 deletes the danger information from the storage 603.
Such operations of the information processing device 600 are described with reference to a flowchart illustrated in FIG. 11.
The information processing device 600 determines whether it has received danger information via the wireless communication unit 610 (Step S1). If the information processing device 600 determines that it has received danger information (if Yes), the process proceeds to Step S2, and if the information processing device 600 determines that it has not received danger information (if No), Step S1 is repeated.
If the information processing device 600 receives danger information, the information processing device 600 analyzes the image captured by the camera 630 (Step S2). Then, based on the analyzed image, the presence or absence of a vehicle in the front direction and the presence or absence of a pedestrian on the same side are checked (Step S3). That is, if the image captured by the camera 630 is an image in which a vehicle is stopped in the traveling front direction of the subject vehicle and a standing or bending-down pedestrian on the same side is about to enter the vehicle, the acceleration exceeding the average walking acceleration is thought to result from the pedestrian 9 possessing the electronic device 10 hurrying to enter the vehicle.
As a method of analyzing the image, a method of detecting a feature quantity of the image through deep learning and then identifying a vehicle and a pedestrian through pattern recognition is conceivable. Alternatively, a vehicle and a pedestrian may be identified through simple pattern matching.
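A minimal, hypothetical sketch of this check is shown below. It assumes that some detector, whether a deep-learning model or simple pattern matching, has already produced labeled bounding boxes; the function name person_about_to_enter_vehicle and the pixel threshold are illustrative only and are not defined in this disclosure.

```python
def person_about_to_enter_vehicle(detections, max_gap_px=50):
    # detections: list of (label, x, y, width, height) bounding boxes produced
    # by any detector (deep-learning based or simple pattern matching).
    vehicles = [d for d in detections if d[0] == "vehicle"]
    persons = [d for d in detections if d[0] == "pedestrian"]
    for _, vx, _, vw, _ in vehicles:
        for _, px, _, pw, _ in persons:
            # A pedestrian standing beside (or bending down next to) a stopped
            # vehicle is treated as being about to enter it when the two
            # bounding boxes are horizontally close to each other.
            gap = max(vx - (px + pw), px - (vx + vw))
            if gap <= max_gap_px:
                return True
    return False
```

Whether the bounding boxes come from a learned feature extractor or from simple template matching does not change this final geometric check.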
If the image analyzed in Step S3 is an image in which a person is about to enter a vehicle (if Yes), the process proceeds to Step S4 to discard the received danger information and end the series of the process.
On the other hand, if the image analyzed in Step S3 is not an image in which a person is about to enter a vehicle (if No), the process proceeds to Step S5 to perform, based on the received danger information, a process necessary for avoiding contact with the pedestrian 9 possessing the electronic device 10 that has transmitted the danger information. That is, the information processing device 600 issues the danger aversion instruction to the vehicle controller 620 so that the vehicle controller 620 performs a steering operation, a braking operation, etc. that are necessary for avoiding contact with the user behaving dangerously. Note that, if the subject vehicle is not adapted to automated driving, the information processing device 600 that has received the danger information performs a process of notifying a driver about the reception of the danger information with voice etc. so as to prompt danger aversion.
In this manner, even when danger information is received, the received danger information is discarded on condition that the pedestrian 9 possessing the electronic device 10 that has transmitted the danger information is determined to be about to enter a vehicle based on the image analysis. Thus, the subject vehicle is not required to perform the operation of avoiding danger based on the danger information, thereby eliminating a burden on the passenger.
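Putting the above together, the FIG. 11 flow (Steps S1 to S5) could be sketched as follows. It reuses the hypothetical person_about_to_enter_vehicle helper from the previous sketch, and all other names (receive_danger_info, capture_front_image, detect_objects, vehicle_controller.avert_danger) are likewise placeholders rather than interfaces defined in this disclosure.

```python
def handle_danger_information(receive_danger_info, capture_front_image,
                              detect_objects, vehicle_controller):
    # Step S1: wait for danger information via the wireless communication unit 610.
    danger_info = receive_danger_info()
    if danger_info is None:
        return
    # Steps S2 and S3: analyze the front image and check for a stopped vehicle
    # with a pedestrian on the same side.
    detections = detect_objects(capture_front_image())
    if person_about_to_enter_vehicle(detections):
        # Step S4: the danger information is judged inappropriate and discarded;
        # no danger aversion operation is performed.
        return
    # Step S5: instruct the vehicle controller 620 to avoid contact with the
    # pedestrian (steering, braking, etc.), or notify the driver by voice when
    # the vehicle is not adapted to automated driving.
    vehicle_controller.avert_danger(danger_info)
```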
Note that, in the above, description is given of one example in which appropriateness of the danger information is determined based on the image analysis on the pedestrian 9. However, an obstacle present in the front direction of the vehicle 6 may be detected based on detection data obtained by an obstacle sensor such as light detection and ranging (LIDAR) installed in the vehicle 6, and appropriateness of the danger information may be determined based on the presence or absence of a stopped vehicle as an obstacle and a person on the same side.
In this embodiment described above, description is given of a configuration in which danger information is directly transmitted from the electronic device 10 to the vehicle 6 through pedestrian-vehicle communication between the electronic device 10 of the pedestrian 9 and the vehicle 6 installed with the information processing device 600. However, such a configuration is also conceivable that danger information that has been transmitted from the electronic device 10 is transmitted to the vehicle 6 via the roadside unit 5.
The roadside unit 5 is, as described above with reference to FIG. 1, often installed at an intersection, but may be installed along a road other than an intersection. In such a case, the roadside unit 5 may have a function similar to that of the information processing device 600.
That is, as described above with reference to FIG. 6, the roadside unit 5 includes the camera 530, and can therefore determine whether or not the pedestrian 9 possessing the electronic device 10 that has transmitted danger information is about to enter a vehicle by analyzing an image captured by the camera 530.
FIG. 12 illustrates a diagram schematically showing a state where the pedestrian 9 possessing the electronic device 10 is about to enter the vehicle 60, and the vehicle 6 approaches the vehicle 60. The roadside unit 5 is installed along the roadway 7 between the vehicle 6 and the vehicle 60. The electronic device 10 of the pedestrian 9 determines that the behavior of the pedestrian 9 who is about to enter the vehicle 60 is a behavior likely to induce a traffic accident, and transmits danger information EI. FIG. 12 illustrates one example in which a “caution running-out” message and a running-out picture image are transmitted to the roadside unit 5.
At this time, in the controller 500 of the roadside unit 5, appropriateness of the determination result of the behavior of the user obtained by the electronic device 10 is determined based on an image in the front direction captured by the camera 530. If the determination result is wrong, the received danger information is not relayed to the vehicle 6. Since the danger information is not relayed to the vehicle 6, the vehicle 6 is not required to perform the operation of avoiding danger based on the danger information.
The controller 500 that performs the process as above implements a function similar to that of the information processing device 600, and can therefore be referred to as an information processing device as well. Now, the operations of the controller 500 are described with reference to a flowchart illustrated in FIG. 13.
The controller 500 determines whether it has received danger information via the wireless communication unit 510 (Step S11). If the controller 500 determines that it has received danger information (if Yes), the process proceeds to Step S12, and if the controller 500 determines that it has not received danger information (if No), Step S11 is repeated.
If the controller 500 receives danger information, the controller 500 analyzes the image captured by the camera 530 (Step S12). FIG. 12 illustrates one example in which the camera 530 captures an image in the direction of the vehicle 60. The camera 530, however, is a camera for capturing an image of the vicinity of the place where the roadside unit 5 is installed, and may therefore be made up of a plurality of cameras so as to be capable of capturing an image in multi-directions, not only in one direction.
Then, based on the analyzed image, the presence or absence of a vehicle in the vicinity of the roadside unit 5 and the presence or absence of a pedestrian on the same side are checked (Step S13). In a case where the camera 530 is made up of a plurality of cameras as described above, when each of the images captured by the cameras is analyzed and an image is detected in which a pedestrian stands beside a stopped vehicle or bends down to enter a vehicle, the acceleration exceeding the average walking acceleration is thought to result from the pedestrian 9 possessing the electronic device 10 hurrying to enter the vehicle.
If the image analyzed in Step S13 is an image in which a person is about to enter a vehicle (if Yes), the process proceeds to Step S14 to discard the received danger information and end the series of the process.
On the other hand, if the image analyzed in Step S13 is not an image in which a person is about to enter a vehicle (if No), the process proceeds to Step S15 to transmit the received danger information to the vehicle 6 in the vicinity of the roadside unit 5.
If the vehicle 6 that has received the danger information includes the information processing device 600 as illustrated in FIG. 9, appropriateness of the danger information is determined in the information processing device 600 in accordance with the flow illustrated in FIG. 11. In this case, the image is determined to be not an image in which a person is about to enter a vehicle in Step S3, and a process necessary for avoiding contact with the pedestrian 9 possessing the electronic device 10 that has transmitted the danger information is performed in Step S5 based on the received danger information. Further, if the vehicle 6 that has received the danger information does not include the information processing device 600 as illustrated in FIG. 9, a process necessary for avoiding contact with the pedestrian 9 possessing the electronic device 10 that has transmitted the danger information is performed immediately after the reception of the danger information.
Note that, in the above, description is given of one example in which appropriateness of the danger information is determined based on the image analysis on the pedestrian 9. However, an obstacle present at a place where the roadside unit 5 is installed may be detected based on detection data obtained by an obstacle sensor such as LIDAR installed in the roadside unit 5, and appropriateness of the danger information may be determined based on the presence or absence of a stopped vehicle as an obstacle and a person on the same side.
Note that, in the above, description is given of a case where the roadside unit 5 transmits received danger information to the vehicle 6 in the vicinity of the roadside unit 5. However, the destination to which the danger information is transmitted is not limited to a vehicle. The danger information may be transmitted to another roadside unit and then relayed. With this, even if the vehicle 6 is present at a place far from the roadside unit 5 that first received the danger information, the danger information can be transmitted to the vehicle 6. In this case, the other roadside unit that has received the danger information from the roadside unit 5 performs a process to relay the danger information without executing the flow illustrated in FIG. 13. To this end, the flow illustrated in FIG. 13 may be configured in advance to be executed only on condition that the danger information is received from the electronic device 10.
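The distinction between danger information received directly from an electronic device 10 and danger information relayed from another roadside unit might be expressed as follows; the source attribute and the method names are assumptions made only for this sketch.

```python
# Hypothetical sketch of the relay behavior between roadside units.
# The 'source' attribute and the method names are illustrative assumptions.

def handle_incoming_danger_info(roadside_unit, danger_info):
    if danger_info.source == "electronic_device":
        # Received directly from a pedestrian's electronic device 10:
        # run the appropriateness check of FIG. 13 before forwarding.
        roadside_unit.run_fig13_flow(danger_info)
    else:
        # Received from another roadside unit: simply relay it onward so that
        # vehicles far from the originating roadside unit can also be reached.
        roadside_unit.relay_to_neighbors(danger_info)
```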
Further, the roadside unit 5 may transmit received danger information to an electronic device 10 of another pedestrian. In this case, the other pedestrian who has received the danger information via the electronic device 10 is alerted that there is a pedestrian nearby behaving in a manner likely to induce a traffic accident, and can thus take an action to avoid the danger.
While an information processing device, a roadside unit, and a vehicle have been described in detail above, the above description is in all aspects illustrative and the present disclosure is not limited thereto. Further, it is understood that numerous modifications not illustrated herein can be devised without departing from the scope of the present disclosure.

Claims (9)

  1. An information processing device, wherein
    if the information processing device acquires, from a mobile electronic device, information notifying that a user of the mobile electronic device may behave in a dangerous manner likely to induce a traffic accident, the information processing device checks appropriateness of the information, and
    if the information comprises inappropriate information, the information processing device discards the information.
  2. The information processing device according to claim 1, wherein
    the information processing device receives an image of the user captured by an imaging unit, and checks appropriateness of the information based on the image of the user.
  3. The information processing device according to claim 2, wherein
    if the image of the user is an image in which the user is about to enter a vehicle or an image in which the user stands beside a vehicle, the information processing device determines that the information is inappropriate information.
  4. A vehicle comprising:
    the information processing device of claim 1; and
    a vehicle controller, wherein
    if the information processing device determines that the information is appropriate information, the information processing device instructs the vehicle controller to perform vehicle control to avoid contact with the user.
  5. The vehicle according to claim 4, further comprising
    an imaging unit configured to capture an image in at least a front direction of the vehicle, wherein
    the information processing device receives an image of the user captured by the imaging unit, and checks appropriateness of the information based on the image of the user.
  6. The vehicle according to claim 5, wherein
    if the image of the user is an image in which the user is about to enter a vehicle or an image in which the user stands beside a vehicle, the information processing device determines that the information is inappropriate information.
  7. A roadside unit comprising:
    the information processing device of claim 1; and
    a communication unit configured to communicate the information, wherein
    if the information processing device determines that the information is appropriate information, the roadside unit transmits the information to a vehicle in a vicinity of the roadside unit and to another roadside unit in a vicinity of the roadside unit via the communication unit.
  8. The roadside unit according to claim 7, further comprising
    an imaging unit configured to capture an image in at least a front direction of the roadside unit, wherein
    the information processing device receives an image of the user captured by the imaging unit, and checks appropriateness of the information based on the image of the user.
  9. The roadside unit according to claim 8, wherein
    if the image of the user is an image in which the user is about to enter a vehicle or an image in which the user stands beside a vehicle, the information processing device determines that the information is inappropriate information.

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-163999 2017-08-29
JP2017163999A JP2019040551A (en) 2017-08-29 2017-08-29 Information processor, vehicle and roadside machine

Publications (1)

Publication Number Publication Date
WO2019044456A1 true WO2019044456A1 (en) 2019-03-07

Family

ID=63405296

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/030025 WO2019044456A1 (en) 2017-08-29 2018-08-10 Information processing device, vehicle, and roadside unit

Country Status (3)

Country Link
JP (1) JP2019040551A (en)
DE (1) DE102018119554A1 (en)
WO (1) WO2019044456A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102303023B1 (en) * 2019-08-01 2021-09-16 정규홍 Taxi nomitoring system using smart band, gateway module and integrated control device connected with taxi service using smart band
JP7462076B2 (en) 2020-12-21 2024-04-04 京セラ株式会社 Wireless roadside unit, traffic communication system, and control method

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130201036A1 (en) * 2010-07-12 2013-08-08 Continental Teves Ag & Co. Ohg Road safety communication system for increasing the road safety of pedestrians
US20150228195A1 (en) * 2014-02-07 2015-08-13 Here Global B.V. Method and apparatus for providing vehicle synchronization to facilitate a crossing
US20150251599A1 (en) * 2014-03-04 2015-09-10 Magna Electronics Inc. Vehicle alert system utilizing communication system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190319793A1 (en) * 2019-06-28 2019-10-17 Eve M. Schooler Data offload and time synchronization for ubiquitous visual computing witness
US11646886B2 (en) * 2019-06-28 2023-05-09 Intel Corporation Data offload and time synchronization for ubiquitous visual computing witness

Also Published As

Publication number Publication date
DE102018119554A1 (en) 2019-02-28
JP2019040551A (en) 2019-03-14

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 18760065; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 EP: PCT application non-entry in European phase (Ref document number: 18760065; Country of ref document: EP; Kind code of ref document: A1)