WO2008015375A1 - Assistance device for blind and partially sighted people - Google Patents
- Publication number
- WO2008015375A1 (PCT/GB2007/002649)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- portable device
- user
- voice
- scanning
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H3/00—Appliances for aiding patients or disabled persons to walk about
- A61H3/06—Walking aids for blind persons
- A61H3/061—Walking aids for blind persons with electronic detecting or guiding means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B21/00—Teaching, or communicating with, the blind, deaf or mute
- G09B21/001—Teaching or communicating with blind persons
- G09B21/006—Teaching or communicating with blind persons using audible presentation of the information
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B29/00—Maps; Plans; Charts; Diagrams, e.g. route diagram
- G09B29/003—Maps
- G09B29/006—Representation of non-cartographic information on maps, e.g. population distribution, wind direction, radiation levels, air and sea routes
- G09B29/008—Touring maps or guides to public transport networks
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H3/00—Appliances for aiding patients or disabled persons to walk about
- A61H3/06—Walking aids for blind persons
- A61H3/061—Walking aids for blind persons with electronic detecting or guiding means
- A61H2003/063—Walking aids for blind persons with electronic detecting or guiding means with tactile perception
Definitions
- the present invention relates to a voice activated device suitable for assisting blind and partially sighted people.
- GPS-Talk Global Positioning System
- GPS-Talk, together with MoBIC and Visuaide Trekkers, are navigation systems that utilise GPS (Global Positioning System) and wireless communication networks for outdoor way-finding. Orientation and navigation are achieved by the use of sophisticated systems such as Talking Sign technology; the system consists of short audio signals sent by invisible infrared light beams from permanently installed transmitters, which are placed at sign locations or on key environmental features, to a hand-held receiver that decodes the signal and delivers the voice message through its speaker or headset.
- RFID Radio Frequency Identification
- RFID tags embedded in plates at sushi bars to calculate the customer's bill.
- New applications are expected in baggage delivery and in inventory management, as well as medical supervision.
- RFID technology is also being studied to develop a system to help visually impaired people navigate.
- Echo locators offer users little or no information about an object, only its location. Access to transport and to unfamiliar destinations remains problematic, time-consuming and challenging, and individual preferences and personal objectives when travelling are frequently realised only with difficulty or, indeed, not at all.
- the objective of the present invention is to solve the above problems and to pull together diverse technologies, which would otherwise evolve independently.
- the invention provides a device comprising a laser scanning technology, mobile service facility, voice recognition facilities, sensory identification, memory, data acquisition and storage.
- a first specific object of the present invention is for the device to be used in "smart homes", by virtual travel communities, and by emergency response agencies. Moreover, the device could be used in shops and supermarkets for automated check-out.
- a voice-activated, portable device to assist visually impaired people comprising: communicating means for transmitting and receiving data to and/or from a network; scanning means for scanning an object selected by a user; memory means for storing information regarding a scanned object; sensor and identification means for locating an object previously scanned; journey means for planning a route to a destination selected by the user and for identifying the correct transport for travel to a selected destination; wherein the sensor identification and journey means are able to communicate with the network through the communicating means.
- the invention is configured with a wireless system for assisting visually impaired people comprising: a number of portable devices according to the first aspect of the invention and connected to a sensor network; receiving means adapted to receive data from the portable devices and to transmit said data to a server which is adapted to store the data; transmitting means adapted to transmit data from the server to the portable devices; selecting means adapted to select the data transmitted by the transmitting means according to each portable device; wherein each portable device is adapted to receive and store the selected data, and to update its existing data.
- the present invention is configured as a method for assisting visually impaired people using a wireless system comprising a number of portable devices according to its primary mode and connected to a sensor network comprising: receiving data from the portable devices and transmitting said data to a server for storage; transmitting data from the server to the portable devices; selecting the data to be transmitted by the transmitting means according to each portable device.
- each portable device receives and stores the selected data, and updates its own existing data.
- FIG. 1 is a schematic view of a device according to the invention.
- Figure 1 is a schematic view showing the first configuration of the present invention.
- the device of the present invention comprises laser scanning means, voice recognition means, sensory identification means and storage means.
- the device is portable, suitable to be hand- or pocket-held, wireless and lightweight, weighing less than 250 grams. It can be upgraded on demand, in a similar way to the upgrading of a computer, and can be configured either for home-based services, i.e. indoor use only, using a form of "wall-mounted control board", or for indoor and outdoor use. In the latter case, subscription to the Talking@Gadget service will usually be preferred.
- the device is able to provide and support three main functionalities: talking@object, talking@travel and talking@care.
- buttons generally indicated in Figure 1. Textured or tactile buttons of various shapes will select specific functions and can be easily identified by touch by a visually impaired user. For example, a square button can be provided to select the talking@object function, a round button to select the talking@travel function and a triangular button to select the talking@care function.
- the user is provided with a supply of thin electronic tags that the device can read using the scanning facility.
- Each tag is attached to an object selected by the user if the object is not already provided with a tag or tag equivalent, which can be read by the scanning facility already inbuilt.
- the user can select the talking@object mode using the square button. Then, in the talking@object mode, also referred to as "objectFinder" mode, the user uses the device to scan the object once and assigns a preferred name for the object via the device, which is recorded and associated with the tag identity.
- the device in the talking@object mode can scan the object and use the scanned tag identity to inform the user what the object is.
- Further information about the tagged object can also be stored in the device or in an associated storage device able to communicate with the device, such as a home computer, so that, on scanning the tag associated with the object, the device will inform the user of this further information. For example, in addition to what the object is, its colour or other attributes can be described. This further information allows the device in the talking@object or objectFinder mode to be used as a "mirror": the device reads the tagged clothes and gives a description of the clothes, e.g. their colour, to the user.
- the talking@travel or "travelFinder” mode can be selected using the round button.
- the destination is fed into the device, preferably using a voice recognition system.
- the device then either plans the route or updates previous information and tells the individual how to get to the desired destination and how long it will take.
- the device will identify the correct bus, for example, and announce the stop at which the user has to alight.
- the talking@care or "careFinder" mode is selected using the triangular button. In this mode, the device, when attached to a specific add-on medical sensor, will wirelessly collect and transport data, such as blood pressure, weight, and blood sugar levels, to a home computer to be analysed. Alternatively, the device could either transmit the data to a doctor's surgery immediately or store it temporarily for later transmission or periodic retrieval.
- Table 1 shown below sets out a range of desired functionalities for the device and the advantages to the user of the different functionalities.
- the laser scanning means is based on a principle similar to bar code recognition and thus will be multi-purpose, allowing for identification of tagged objects of all kinds.
- Foodstuffs, for example, will usually be bar-coded at source, i.e. by the manufacturer.
- the bar code would ideally identify the product, its cost and contents. It may also be possible to extend the bar code so that cooking instructions and recipes can also be included on the packet, tin or bag.
- some clothing manufacturers include "Braille information" on clothing tags. However, there are relatively few who provide this facility and indeed relatively few visually impaired people who use Braille as a communication method. Consequently, bar coding clothing to include cost, size, colour, material and washing instructions would be, without doubt, hugely helpful to the visually impaired person.
- Sensor identification means will allow the visually impaired person to locate specific objects.
- Each object when bought may have an identification bar code or alternatively could be "tagged" ideally, by the manufacturer and/or by the visually impaired person.
- the object information, when scanned, would then be stored either in the device itself or sent to a home-based computer and/or a central server. The latter could be provided by an Internet agency, phone company or an independent provider.
- the device would behave like a personal server and data distributor (data hub). Thus data storage and management would be possible both locally and remotely. If stored locally, i.e. in the device or on a home computer, vital information could still also be sent to the central server as a back-up facility.
- Another aspect of the present invention relates to a system for public transport comprising a service provider, a computer and detecting means.
- Bus and train companies could subscribe to a service provider, and each mode of transport would have inbuilt sensors capable of communicating with a computer into which is programmed route information, timetables and travel times, in much the same way as GPS and navigation systems currently operate. Even if bus numbers were altered mid-route, this would be recognised by the computer and the visually impaired person would be informed accordingly.
- the system would also be capable of identifying the correct bus and where it was in relation to the individual and/or bus stop.
- a med-alert which is an added health and safety feature and can be tailored to each user.
- the user monitors various aspects of body function by attaching body sensors.
- the information is picked up by the device and then sent to a remote server, which is then accessed by the medical practice responsible for the individual's care.
- the practice would, however, need to subscribe to such a service.
- significant changes in a visually impaired person's health status would be recorded. Immediate action by the health agencies involved would then be possible.
- the device will have a face recognition and fingerprint security facility requiring biometric verification of user identity to enable use of the device.
- a further extension of the invention is to use the device to support the Talking@Gadget service, which tailors the device to the specific needs of each user: personalised voice commands, unique privacy protection, and the creation of home-based services as alternative mobility solutions are all possible.
- the service uses the voice-commanded device to enable users to identify, find and manage tagged objects and object information indoors and/or out of doors.
- the device can be designed to access a wireless sensor network which will then send the information to either a home based server or a central (remote) service centre.
- the device will act like a personal service gateway once services are created, subscribed and activated.
- the service may have particular appeal and could ultimately be extended to other ad hoc mobile services, and provide a platform on which various services could be developed.
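The transport behaviour in the bullets above (vehicle sensors reporting to a route computer, with mid-route number changes recognised and the traveller informed) can be sketched as follows. This is only an illustrative sketch: `check_bus`, the report format and the announcement wording are assumptions, not part of the patent.

```python
# Minimal sketch of mid-route bus-number detection, as described in the
# transport bullets. Each report is a (vehicle_id, route_number) pair sent
# by the vehicle's inbuilt sensors to the route computer.

def check_bus(expected_route, reports, announce):
    """Compare live reports from the tracked vehicle against the planned route.

    If the route number is altered mid-route, announce the change and return
    the new number so the travel plan can be updated; otherwise return the
    expected number unchanged.
    """
    for vehicle, route in reports:
        if route != expected_route:
            announce(f"Bus {vehicle} is now running as route {route}, "
                     f"not {expected_route}.")
            return route
    return expected_route


messages = []
route = check_bus("42", [("BUS-17", "42"), ("BUS-17", "42A")], messages.append)
print(route)          # 42A
print(len(messages))  # 1
```

In a real deployment the reports would arrive over the wireless sensor network and the announcement would be spoken through the device's voice output rather than appended to a list.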
Abstract
A wireless system to assist visually impaired people comprising voice-activated portable devices comprising communicating means for transmitting and receiving data to and/or from a network; scanning means for scanning an object selected by a user; memory means for storing information regarding a scanned object; sensor and identification means for locating an object previously scanned; journey means for planning a route to a destination selected by the user and for identifying the correct transport for travel to a selected destination, wherein the sensor identification and journey means are able to communicate with the wireless system through the communicating means.
Description
ASSISTANCE DEVICE FOR BLIND AND PARTIALLY SIGHTED PEOPLE
TECHNICAL FIELD
The present invention relates to a voice activated device suitable for assisting blind and partially sighted people.
BACKGROUND ART
In physical space, visually impaired users still depend mainly on basic low-technology aids such as canes, guide dogs or spoken directions to reach only known destinations along familiar learned routes. Tactile maps exist, but are hard to use and, as a result, are often ineffective and not widely used. In general, low-technology aids are not intuitive enough for travel to new or unknown destinations, and access to unfamiliar social environments such as a hospital or shopping mall is restricted.
Visually impaired people have great difficulties participating in necessary daily activities such as shopping, going to hospital, a bank or just visiting a friend.
More recently, there have been attempts to use technology to detect nearby objects. For example, some high-tech devices detect potential hazards at head level. Several more advanced navigation systems, such as GPS-Talk, utilise the Global Positioning System (GPS) and offer potential to develop outdoor way-finding solutions. However, to realise this potential, devices must be integrated with real-time sensory information about users' dynamic environment. Such a device is the front-end of a complex integration of technologies, and must be affordable, wearable and intuitive.
In virtual space, access to the Web is very difficult - often impossible. Sensor-based technology applications, such as smart homes and telecare, offer users some limited alternatives. However, in order to fully address the needs of blind and partially sighted people, Internet-mediated services have to be dedicated to users' needs and living environments.
The global incidence of blindness has increased in the last twelve years, and the incidence of blindness in the population over 50 years of age has increased even more. There is therefore a need not only to prevent blindness and to find effective strategies to combat its progression, but also an urgent requirement for appropriate, effective rehabilitative and assistance strategies to promote independent functioning in the growing visually impaired population.
In the United Kingdom, services to people with significant sight loss are well established and comprise, in the main, mobility and rehabilitation. However, these services, which tend to be fragmented and/or geographically dependent, are limited in their potential and are not intuitive in function.
People with sight loss experience significant difficulties in undertaking everyday tasks and achieving daily goals. Various aids, ranging from tactile maps to touch-pad orientation systems, exist. However, the inherent limitations of such systems prevent their widespread use by people with sight loss.
Considerable advances in both science and technology have, however, resulted in a variety of low and high tech mobility devices ranging from sonic canes, guides and pathfinders to sophisticated sensory substitutions such as echo-locators, and to more complex navigation, orientation and location aware systems.
For example, there are products such as the Sonar Cane, Sonic Guide and Sonic Pathfinder, which detect obstacles and hazards. VOICe, on the other hand, translates video images from a regular PC camera into sounds so that users can "see with their ears". GPS-Talk, together with MoBIC and Visuaide Trekkers, are navigation systems that utilise GPS (Global Positioning System) and wireless communication networks for outdoor way-finding. Orientation and navigation are achieved by the use of
sophisticated systems such as Talking Sign technology; the system consists of short audio signals sent by invisible infrared light beams from permanently installed transmitters, which are placed at sign locations or on key environmental features, to a hand-held receiver that decodes the signal and delivers the voice message through its speaker or headset. Other systems such as indoor location systems are all encompassing and are, in the main, sensor-based technologies providing location information, space identifiers, and position coordinates. Cricket, for example, piggybacks onto applications running on handheld computers, laptops, and sensors and is designed for use indoors and in urban areas where outdoor systems such as Global Positioning Systems (GPS) cannot be used.
Other relevant technologies to the present invention are RFID (Radio Frequency Identification) tags. The usage of RFID tags is increasing strongly and they are already being used for the following applications:
• cards for transport
• access control/corporate ID
• libraries/archives
• raw material inventory in industry
There are also some original applications such as RFID tags embedded in plates at sushi bars to calculate the customer's bill. New applications are expected in baggage delivery and in inventory management, as well as medical supervision. RFID technology is also being studied to develop a system to help visually impaired people navigate.
Card-shaped products for personal usage, such as telephone cards or commuter passes, have traditionally accounted for the majority of RFID usage, but future growth is especially expected in retail and logistics-related applications. The international standardization of RFID at the high-frequency band (ISO/IEC 14443
proximity application, ISO/IEC 15693 vicinity application) is expected to facilitate wider usage of RFID.
It is evident that there is an abundance of assistive technologies; however, they exist in isolation and are, as a consequence, of limited use to the visually impaired person. Echo locators, for example, offer users little or no information about an object, only its location. Access to transport and to unfamiliar destinations remains problematic, time-consuming and challenging, and individual preferences and personal objectives when travelling are frequently realised only with difficulty or, indeed, not at all.
As mentioned above, there are a number of relevant technologies that provide information for blind and partially sighted people. Generally, however, they are single-service providers and thus severely limited in use. A device which suits a person who is partially sighted can be completely useless for a totally blind person.
There is thus an urgent requirement to address the diverse and changing needs of the blind and partially sighted population by integrating existing technologies such as sensor and data networks, mobile and wearable smart devices and remote healthcare monitoring into one simple device i.e. a multi-modal integrated technology aid. Such technology is not available at the present time.
DISCLOSURE OF THE INVENTION
The objective of the present invention is to solve the above problems and to pull together diverse technologies, which would otherwise evolve independently.
In general terms the invention provides a device comprising a laser scanning technology, mobile service facility, voice recognition facilities, sensory identification, memory, data acquisition and storage.
A first specific object of the present invention is for the device to be used in "smart homes", by virtual travel communities, and by emergency response agencies. Moreover, the device could be used in shops and supermarkets for automated check-out.
To attain the above objective in the invention's primary mode, it is configured as a voice-activated, portable device to assist visually impaired people comprising: communicating means for transmitting and receiving data to and/or from a network; scanning means for scanning an object selected by a user; memory means for storing information regarding a scanned object; sensor and identification means for locating an object previously scanned; journey means for planning a route to a destination selected by the user and for identifying the correct transport for travel to a selected destination; wherein the sensor identification and journey means are able to communicate with the network through the communicating means.
In its secondary mode, the invention is configured with a wireless system for assisting visually impaired people comprising: a number of portable devices according to the first aspect of the invention and connected to a sensor network; receiving means adapted to receive data from the portable devices and to transmit said data to a server which is adapted to store the data; transmitting means adapted to transmit data from the server to the portable devices; selecting means adapted to select the data transmitted by the transmitting means according to each portable device; wherein each portable device is adapted to receive and store the selected data, and to update its existing data.
In addressing a third aspect of its use, the present invention is configured as a method for assisting visually impaired people using a wireless system comprising a number of portable devices according to its primary mode and connected to a sensor network comprising: receiving data from the portable devices and transmitting said data to a server for storage; transmitting data from the server to the portable devices; selecting the data to be transmitted by the transmitting means according to each portable device. In this way, each portable device receives and stores the selected data, and updates its own existing data.
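The server-mediated method just described (receive data from the devices, store it, select what is relevant per device, transmit it back, and let each device update its existing data) can be sketched as below. This is a minimal illustration only; `SyncServer`, `PortableDevice` and the idea of per-device "interests" as the selecting means are assumptions, not the patent's wording.

```python
# Sketch of the receive/select/transmit/update cycle of the wireless system.

class SyncServer:
    """Stores data pooled from all portable devices and selects per device."""

    def __init__(self):
        self.store = {}  # key -> value, pooled from every device

    def receive(self, data):
        # Receiving means: accept data from a portable device and store it.
        self.store.update(data)

    def select_for(self, device):
        # Selecting means: choose only the data relevant to this device.
        return {k: v for k, v in self.store.items() if k in device.interests}


class PortableDevice:
    def __init__(self, interests):
        self.interests = set(interests)
        self.data = {}

    def update_from(self, selected):
        # The device receives and stores the selected data, updating its
        # existing data in place.
        self.data.update(selected)


server = SyncServer()
dev_a = PortableDevice(interests={"bus:42", "tag:TAG-0001"})
server.receive({"bus:42": "on time", "bus:7": "delayed"})
dev_a.update_from(server.select_for(dev_a))
print(dev_a.data)  # {'bus:42': 'on time'}
```

Selection on the server side keeps each device's local store small, which matters for a lightweight handheld unit.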
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic view of a device according to the invention.
MODES FOR CARRYING OUT THE INVENTION
Below, preferred configurations of the present invention are explained by way of example with reference to the accompanying drawings.
Figure 1 is a schematic view showing the first configuration of the present invention.
The device of the present invention comprises laser scanning means, voice recognition means, sensory identification means and storage means.
The device is portable, suitable to be hand- or pocket-held, wireless and lightweight, weighing less than 250 grams. It can be upgraded on demand, in a similar way to the upgrading of a computer, and can be configured either for home-based services, i.e. indoor use only, using a form of "wall-mounted control board", or for indoor and outdoor use. In the latter case, subscription to the Talking@Gadget service will usually be preferred.
The device is able to provide and support three main functionalities: talking@object, talking@travel and talking@care.
The selection of the mode of operation of the device is by push buttons generally indicated in Figure 1. Textured or tactile buttons of various shapes will select specific functions and can be easily identified by touch by a visually impaired user.
For example, a square button can be provided to select the talking@object function, a round button to select the talking@travel function and a triangular button to select the talking@care function.
The user is provided with a supply of thin electronic tags that the device can read using the scanning facility. Each tag is attached to an object selected by the user if the object is not already provided with a tag or tag equivalent, which can be read by the scanning facility already inbuilt.
The user can select the talking@object mode using the square button. Then, in the talking@object mode, also referred to as "objectFinder" mode, the user uses the device to scan the object once and assigns a preferred name for the object via the device, which is recorded and associated with the tag identity.
Subsequently, when the user wants to know the location of the tagged object, he or she speaks the assigned name so that the device can recognise it, locate the tag using the scanning means, and tell the user where the object is.
Similarly, if a user wishes to identify an object, the device in the talking@object mode can scan the object and use the scanned tag identity to inform the user what the object is.
Further information about the tagged object can also be stored in the device, or in an associated storage device able to communicate with the device, such as a home computer, so that, on scanning the tag associated with the object, the device will inform the user of this further information. For example, in addition to what the object is, its colour or other attributes can be described.
This further information allows the device in the talking@object or objectFinder mode to be used as a "mirror". The device reads the tagged clothes and gives a description of the clothes, e.g. colour to the user.
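The objectFinder workflow described above — scan a tag once, assign a spoken name, then resolve either the name or the tag back to stored attributes — can be sketched as follows. This is a minimal illustration, not the patented implementation; the class, tag IDs and attribute names are all hypothetical.

```python
class ObjectRegistry:
    """Hypothetical in-device store linking tag IDs, user-assigned
    names and object attributes (colour, material, etc.)."""

    def __init__(self):
        self._by_name = {}  # spoken name -> tag id
        self._by_tag = {}   # tag id -> stored attributes

    def register(self, tag_id, spoken_name, **attributes):
        # Called once, after the user scans the tag and speaks a name.
        self._by_name[spoken_name.lower()] = tag_id
        self._by_tag[tag_id] = dict(attributes, name=spoken_name)

    def identify(self, tag_id):
        # "Mirror" use: a fresh scan returns the stored description.
        return self._by_tag.get(tag_id)

    def find(self, spoken_name):
        # Voice lookup: a recognised name resolves to the tag to locate.
        return self._by_name.get(spoken_name.lower())


registry = ObjectRegistry()
registry.register("TAG-0042", "blue jumper", colour="blue", material="wool")
```

With this registry, speaking "blue jumper" resolves to `TAG-0042` for the location search, while scanning `TAG-0042` returns the stored colour and material for the "mirror" description.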
The talking@travel or "travelFinder" mode can be selected using the round button. In this mode, before travelling, the destination is fed into the device, preferably using a voice recognition system. The device then either plans the route or updates previous information and tells the individual how to get to the desired destination and how long it will take. The device will identify the correct bus, for example, and announce the stop at which the user has to alight.
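The route-planning step of the travelFinder mode can be illustrated with a standard shortest-path search over a graph of stops weighted by travel time. The stop names and times below are invented for illustration; the patent does not specify a particular routing algorithm.

```python
import heapq

# Hypothetical stop graph: each edge carries a travel time in minutes.
routes = {
    "Home": [("High Street", 5)],
    "High Street": [("Station", 7), ("Market", 4)],
    "Market": [("Station", 6)],
    "Station": [],
}


def plan_route(graph, start, goal):
    """Return (total_minutes, list_of_stops) for the quickest route."""
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        cost, stop, path = heapq.heappop(queue)
        if stop == goal:
            return cost, path
        if stop in seen:
            continue
        seen.add(stop)
        for nxt, minutes in graph.get(stop, []):
            heapq.heappush(queue, (cost + minutes, nxt, path + [nxt]))
    return None


minutes, stops = plan_route(routes, "Home", "Station")
# The device would speak each stop in turn and announce the final
# stop as the point at which the user should alight.
```

The returned journey time lets the device tell the user "how long it will take", and the final entry in the stop list is the alighting stop to be announced.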
The talking@care or "careFinder" mode is selected using the triangular button. In this mode, the device, when attached to a specific add-on medical sensor, will wirelessly collect and transmit data, such as blood pressure, weight and blood sugar levels, to a home computer to be analysed. Alternatively, the device could either transmit the data to a doctor's surgery immediately or store it temporarily for later transmission or periodic retrieval.
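The careFinder behaviour — forward a sensor reading immediately when a link is available, otherwise buffer it for later transmission — can be sketched as below. The class name, reading format and `send` callable are assumptions for illustration only.

```python
class CareFinder:
    """Sketch: collect readings from an add-on medical sensor and
    forward them when a link is available, otherwise buffer them
    temporarily for later transmission or periodic retrieval."""

    def __init__(self, send):
        self._send = send  # callable: transmits one reading, True on success
        self._buffer = []

    def record(self, kind, value):
        reading = {"kind": kind, "value": value}
        if not self._send(reading):
            self._buffer.append(reading)  # store for later transmission

    def flush(self):
        # Retry buffered readings; keep any that still fail to send.
        pending, self._buffer = self._buffer, []
        for reading in pending:
            if not self._send(reading):
                self._buffer.append(reading)


# Simulate an offline device: the reading is buffered, not lost.
device = CareFinder(send=lambda r: False)
device.record("blood_pressure", "120/80")
```

Once connectivity returns, a call to `flush()` delivers the buffered readings, matching the "stores it temporarily for later transmission" alternative in the text.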
Table 1 below sets out a range of desired functionalities for the device and the advantages each offers the user.
TABLE 1: Gadget Facility (examples) | User Benefits
(the individual table entries are not reproduced in this text)
The laser scanning means is based on a principle similar to bar code recognition and thus will be multi-purpose, allowing for identification of tagged objects of all kinds. Foodstuffs, for example, will usually be bar coded, at source, i.e. by the manufacturer. The bar code would ideally identify the product, its cost and contents. It may also be possible to extend the bar code so that cooking instructions and recipes can also be included on the packet, tin or bag. At the present time, some clothing manufacturers include "Braille information" on clothing tags. However, there are relatively few who provide this facility and indeed relatively few visually impaired people who use Braille as a communication method. Consequently, bar coding clothing to include cost, size, colour, material and washing instructions would be, without doubt, hugely helpful to the visually impaired person.
Sensor identification means will allow the visually impaired person to locate specific objects. Each object when bought may have an identification bar code, or could alternatively be "tagged", ideally by the manufacturer and/or by the visually impaired person. The object information, when scanned, would then be stored either in the device itself or sent to a home-based computer and/or a central server. The latter could be provided by an Internet agency, phone company or an independent provider. The device would behave like a personal server and data distributor (data hub). Thus data storage and management would be possible both locally and remotely. If stored locally, i.e. in the device or on a home computer, vital information could still also be sent to the central server as a back-up facility.
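The "data hub" pattern just described — keep object data locally but mirror vital information to a central server as a back-up — can be sketched with two stores standing in for device memory and the remote service. The function names and dict-based stores are hypothetical.

```python
# Two stores standing in for device memory and the central server.
local_store = {}
central_backup = {}


def store_object_info(tag_id, info, backup=True):
    """Store object data locally; also send it to the central
    server as a back-up facility."""
    local_store[tag_id] = info
    if backup:
        central_backup[tag_id] = dict(info)


def lookup(tag_id):
    # Prefer the local copy; fall back to the remote back-up.
    return local_store.get(tag_id) or central_backup.get(tag_id)


store_object_info("TAG-7", {"name": "keys", "location": "hall table"})
local_store.clear()  # simulate loss of the local copy
# lookup("TAG-7") still succeeds via the central back-up.
```

The fallback in `lookup` is the point of the back-up facility: losing the device or home computer does not lose the user's object data.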
Another aspect of the present invention relates to a system for public transport comprising a service provider, a computer and detecting means. Bus and train companies could subscribe to a service provider and each mode of transport has inbuilt sensors capable of communicating with a computer into which is programmed route information, timetables and travel times, in much the same way as GPS and navigation systems currently operate. Even if bus numbers were altered mid route, this would be recognised by the computer and the visually impaired person would be informed accordingly. The system would also be capable of identifying the correct bus and where it was in relation to the individual and/or bus stop.
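The bus-identification idea above — the provider's computer tracks each vehicle's underlying route, so the correct bus is recognised even if its displayed number is altered mid-route — can be sketched as follows. The vehicle IDs, route strings and broadcast format are invented for illustration.

```python
# The service provider's authoritative record of each vehicle's route.
vehicle_routes = {"BUS-17": "Route 42 to Station"}


def identify_bus(broadcast):
    """Match a sensor broadcast from a vehicle to the authoritative
    route information, flagging a mid-route renumbering."""
    route = vehicle_routes.get(broadcast["vehicle_id"])
    if route is None:
        return "Unknown vehicle"
    if broadcast["displayed_number"] not in route:
        # Displayed number changed mid-route; the underlying route
        # is still recognised and the user is informed accordingly.
        return f"{route} (displayed number changed mid-route)"
    return route
```

Because the match is keyed on the vehicle, not the displayed number, the visually impaired person is told the correct route even after a renumbering.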
Another functional aspect of the invention is provided by a med-alert, which is an added health and safety feature and can be tailored to each user. The user monitors various aspects of body function by attaching body sensors. The information is picked up by the device and then sent to a remote server, which is then accessed by the medical practice responsible for the individual's care. The practice would, however, be required to subscribe to such a service. Thus significant changes in a visually impaired person's health status would be recorded, and immediate action by the health agencies involved would then be possible.
Preferably, the device will have a face recognition and fingerprint security facility requiring biometric verification of user identity to enable use of the device.
A further extension of the invention is to use the device to support the Talking@Gadget service, which tailors the device to the specific needs of each user: for example, personalised voice commands, unique privacy protection and the creation of home-based services as alternative mobility solutions are all possible.
The service uses the voice-commanded device to enable users to identify, find and manage tagged objects and object information both indoors and outdoors. The device, as previously indicated, can be designed to access a wireless sensor network which will then send the information to either a home-based server or a central (remote) service centre. The device will act like a personal service gateway once services are created, subscribed to and activated.
The service may have particular appeal and could ultimately be extended to other ad hoc mobile services, and provide a platform on which various services could be developed.
The embodiments described above are exemplary and not exhaustive; the skilled person will be able to envisage other alternatives within the scope of the invention, as set out in the claims.
Claims
1. A voice-activated portable device to assist visually impaired people comprising: communicating means for transmitting and receiving data to and/or from a network; scanning means for scanning an object selected by a user; memory means for storing information regarding a scanned object; sensor and identification means for locating an object previously scanned; and journey means for planning a route to a destination selected by the user and for identifying the correct transport for travel to a selected destination, wherein the sensor identification and journey means are able to communicate with the network through the communicating means.
2. The voice-activated portable device according to claim 1, further comprising monitoring means for monitoring medical information related to the user and storing said information in the memory means.
3. The voice-activated portable device according to claim 1 or claim 2, further comprising textured and shaped tactile control buttons.
4. The voice-activated portable device according to any of the preceding claims, further comprising biometric means adapted to identify the user and enable operation of the device.
5. The voice-activated portable device according to claim 4, in which the biometric means is adapted to identify the user's face or fingerprints.
6. The voice-activated portable device according to any of the preceding claims, wherein the communication to the network is wireless and the device is adapted to be used both indoors and outdoors.
7. The voice-activated portable device according to any of the preceding claims, in which the scanning means is adapted to read electronic tags attached to an object selected by the user when scanning the object.
8. The voice-activated portable device according to claim 7 and adapted to store a voice signal together with the identity of said object after scanning the object and to subsequently recognise said stored voice signal.
9. The voice-activated portable device according to claim 8, adapted to respond to a voice signal recognised as said stored voice signal by informing the user of the location of said object by using the sensor and identification means.
10. The voice-activated portable device according to any of the preceding claims and adapted to be upgradeable on demand by the user.
11. The voice-activated portable device according to any of the preceding claims and having a weight of less than 250 grams.
12. A wireless system for assisting visually impaired people comprising: a plurality of portable devices as defined in any of the preceding claims connected to a sensor network; receiving means adapted to receive data from the portable devices and to transmit said data to a server which is adapted to store the data; transmitting means adapted to transmit data from the server to the portable devices; selecting means adapted to select the data transmitted by the transmitting means according to each portable device; wherein each portable device is adapted to receive and store the selected data, and to update its existing data.
13. A wireless system according to claim 12, wherein the transmitting means is adapted to transmit medical data to a service centre to be analysed.
14. A method for assisting visually impaired people using a wireless system comprising a plurality of portable devices as defined in any one of claims 1 to 13 connected to a sensor network; and comprising the steps of: receiving data from the portable devices and transmitting said data to a server which stores the data; transmitting data from the server to the portable devices; selecting the data to be transmitted by the transmitting means according to each portable device; whereby each portable device receives and stores the selected data, and updates its existing data.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0615559A GB2440583A (en) | 2006-08-04 | 2006-08-04 | A portable route planning and object identification device for the visually impaired |
GB0615559.2 | 2006-08-04 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2008015375A1 true WO2008015375A1 (en) | 2008-02-07 |
Family
ID=37027279
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/GB2007/002649 WO2008015375A1 (en) | 2006-08-04 | 2007-07-13 | Assistance device for blind and partially sighted people |
Country Status (2)
Country | Link |
---|---|
GB (1) | GB2440583A (en) |
WO (1) | WO2008015375A1 (en) |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5470233A (en) * | 1994-03-17 | 1995-11-28 | Arkenstone, Inc. | System and method for tracking a pedestrian |
JP3398126B2 (en) * | 2000-06-05 | 2003-04-21 | 株式会社九北エレクトロニクス | Article identification device |
NL1016812C2 (en) * | 2000-12-06 | 2002-06-07 | Sjirk Van Der Zee | Route navigation method, especially for blind or visually impaired people, detects user position and sends information from central database server to user |
FR2839805A1 (en) * | 2002-05-17 | 2003-11-21 | Florence Daumas | Speech synthesising transport information unit includes microphone and voice recognition unit linked to algorithm providing itinerary information |
JP4190843B2 (en) * | 2002-09-25 | 2008-12-03 | Necフィールディング株式会社 | Personal navigation system |
US7199725B2 (en) * | 2003-11-06 | 2007-04-03 | International Business Machines Corporation | Radio frequency identification aiding the visually impaired with synchronous sound skins |
- 2006-08-04: GB application GB0615559A filed (published as GB2440583A; status: withdrawn)
- 2007-07-13: PCT application PCT/GB2007/002649 filed (published as WO2008015375A1; status: application filing)
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020121986A1 (en) * | 2001-02-07 | 2002-09-05 | William Krukowski | Method and system for identifying an object and announcing a voice message |
GB2388194A (en) * | 2002-05-02 | 2003-11-05 | Nec Technologies | Remote medical monitor utilising a mobile telephone |
WO2005008914A1 (en) * | 2003-07-10 | 2005-01-27 | University Of Florida Research Foundation, Inc. | Mobile care-giving and intelligent assistance device |
US20050060088A1 (en) * | 2003-07-10 | 2005-03-17 | University Of Florida Research Foundation, Inc. | Pedestrian navigation and spatial relation device |
EP1685794A1 (en) * | 2003-11-18 | 2006-08-02 | Sony Corporation | Input device, input method, and electronic device |
Non-Patent Citations (2)
Title |
---|
DELIS A ET AL: "Navigation and Multimodal Transportation with EasyTransport", IEEE INTELLIGENT SYSTEMS, vol. 20, no. 2, March 2005 (2005-03-01), pages 54 - 61, XP011129178 * |
THURTIG ET AL: "Modality Fusion in a Route Navigation System", PROCEEDINGS OF THE WORKSHOP ON EFFECTIVE MULTIMODAL DIALOGUE INTERFACES EMMDI, X, XX, 29 January 2006 (2006-01-29), pages 6pp, XP007903172 * |
Cited By (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102641198A (en) * | 2012-04-27 | 2012-08-22 | 浙江大学 | Blind person environment sensing method based on wireless networks and sound positioning |
US9843678B2 (en) | 2013-12-31 | 2017-12-12 | Sorenson Ip Holdings, Llc | Visual assistance systems and related methods |
US9307073B2 (en) | 2013-12-31 | 2016-04-05 | Sorenson Communications, Inc. | Visual assistance systems and related methods |
US9915545B2 (en) | 2014-01-14 | 2018-03-13 | Toyota Motor Engineering & Manufacturing North America, Inc. | Smart necklace with stereo vision and onboard processing |
US10360907B2 (en) | 2014-01-14 | 2019-07-23 | Toyota Motor Engineering & Manufacturing North America, Inc. | Smart necklace with stereo vision and onboard processing |
US10024679B2 (en) | 2014-01-14 | 2018-07-17 | Toyota Motor Engineering & Manufacturing North America, Inc. | Smart necklace with stereo vision and onboard processing |
US9629774B2 (en) | 2014-01-14 | 2017-04-25 | Toyota Motor Engineering & Manufacturing North America, Inc. | Smart necklace with stereo vision and onboard processing |
US10248856B2 (en) | 2014-01-14 | 2019-04-02 | Toyota Motor Engineering & Manufacturing North America, Inc. | Smart necklace with stereo vision and onboard processing |
US9578307B2 (en) | 2014-01-14 | 2017-02-21 | Toyota Motor Engineering & Manufacturing North America, Inc. | Smart necklace with stereo vision and onboard processing |
CN103919663A (en) * | 2014-03-31 | 2014-07-16 | 浙江大学 | Method for blind persons to sense outdoor environment |
US10024667B2 (en) | 2014-08-01 | 2018-07-17 | Toyota Motor Engineering & Manufacturing North America, Inc. | Wearable earpiece for providing social and environmental awareness |
US9922236B2 (en) | 2014-09-17 | 2018-03-20 | Toyota Motor Engineering & Manufacturing North America, Inc. | Wearable eyeglasses for providing social and environmental awareness |
US10024678B2 (en) | 2014-09-17 | 2018-07-17 | Toyota Motor Engineering & Manufacturing North America, Inc. | Wearable clip for providing social and environmental awareness |
US9576460B2 (en) | 2015-01-21 | 2017-02-21 | Toyota Motor Engineering & Manufacturing North America, Inc. | Wearable smart device for hazard detection and warning based on image and audio data |
US10490102B2 (en) | 2015-02-10 | 2019-11-26 | Toyota Motor Engineering & Manufacturing North America, Inc. | System and method for braille assistance |
US10391631B2 (en) | 2015-02-27 | 2019-08-27 | Toyota Motor Engineering & Manufacturing North America, Inc. | Modular robot with smart device |
US9586318B2 (en) | 2015-02-27 | 2017-03-07 | Toyota Motor Engineering & Manufacturing North America, Inc. | Modular robot with smart device |
US9811752B2 (en) | 2015-03-10 | 2017-11-07 | Toyota Motor Engineering & Manufacturing North America, Inc. | Wearable smart device and method for redundant object identification |
US9677901B2 (en) | 2015-03-10 | 2017-06-13 | Toyota Motor Engineering & Manufacturing North America, Inc. | System and method for providing navigation instructions at optimal times |
US9972216B2 (en) | 2015-03-20 | 2018-05-15 | Toyota Motor Engineering & Manufacturing North America, Inc. | System and method for storing and playback of information for blind users |
US10395555B2 (en) | 2015-03-30 | 2019-08-27 | Toyota Motor Engineering & Manufacturing North America, Inc. | System and method for providing optimal braille output based on spoken and sign language |
US9898039B2 (en) | 2015-08-03 | 2018-02-20 | Toyota Motor Engineering & Manufacturing North America, Inc. | Modular smart necklace |
US10912281B2 (en) | 2016-02-24 | 2021-02-09 | Toyota Motor Engineering & Manufacturing North America, Inc. | Systems and methods for communicating with a guide animal |
US9829322B2 (en) | 2016-03-03 | 2017-11-28 | Toyota Motor Engineering & Manufacturing North America, Inc. | Systems and methods for directing a vision-impaired user to a vehicle |
US10024680B2 (en) | 2016-03-11 | 2018-07-17 | Toyota Motor Engineering & Manufacturing North America, Inc. | Step based guidance system |
US9996730B2 (en) | 2016-03-18 | 2018-06-12 | Toyota Motor Engineering & Manufacturing North America, Inc. | Vision-assist systems adapted for inter-device communication session |
US9958275B2 (en) | 2016-05-31 | 2018-05-01 | Toyota Motor Engineering & Manufacturing North America, Inc. | System and method for wearable smart device communications |
US10561519B2 (en) | 2016-07-20 | 2020-02-18 | Toyota Motor Engineering & Manufacturing North America, Inc. | Wearable computing device having a curved back to reduce pressure on vertebrae |
US10432851B2 (en) | 2016-10-28 | 2019-10-01 | Toyota Motor Engineering & Manufacturing North America, Inc. | Wearable computing device for detecting photography |
US10012505B2 (en) | 2016-11-11 | 2018-07-03 | Toyota Motor Engineering & Manufacturing North America, Inc. | Wearable system for providing walking directions |
US10521669B2 (en) | 2016-11-14 | 2019-12-31 | Toyota Motor Engineering & Manufacturing North America, Inc. | System and method for providing guidance or feedback to a user |
US10172760B2 (en) | 2017-01-19 | 2019-01-08 | Jennifer Hendrix | Responsive route guidance and identification system |
Also Published As
Publication number | Publication date |
---|---|
GB0615559D0 (en) | 2006-09-13 |
GB2440583A (en) | 2008-02-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2008015375A1 (en) | Assistance device for blind and partially sighted people | |
KR101835832B1 (en) | Calling system for using local area wireless communication | |
Ivanov | Indoor navigation system for visually impaired | |
US6977579B2 (en) | Radio frequency identification aiding the visually impaired | |
US9977938B2 (en) | Method and apparatus for accessing electronic data via a plurality of electronic tags | |
US7688211B2 (en) | Apparatus and method for enhancing face-to-face communication | |
US9626697B2 (en) | Method and apparatus for accessing electronic data via a plurality of electronic tags | |
WO2004032019A3 (en) | Universal communications, monitoring, tracking, and control system for a healthcare facility | |
US20060109083A1 (en) | Method and apparatus for accessing electronic data about at least one person of interest | |
US8094012B1 (en) | Active composite RFID tag for object localization and instruction | |
USRE41171E1 (en) | System for monitoring a person's location in a defined area | |
JP6160447B2 (en) | Terminal device and program | |
EP2587426A2 (en) | RFID tracker and locator | |
Murad et al. | RFAIDE—An RFID based navigation and object recognition assistant for visually impaired people | |
MX2015002352A (en) | Guiding users in an area. | |
US7375641B2 (en) | Centralized implementation of portal announcing method and system | |
EP2230614A1 (en) | Mobile browsing | |
US9972216B2 (en) | System and method for storing and playback of information for blind users | |
US20170269799A1 (en) | Method and apparatus for accessing electronic data via a plurality of electronic tags | |
KR100886077B1 (en) | The melody information furnish system using the mobile RFID toy and method thereof | |
Liu et al. | On smart-care services: Studies of visually impaired users in living contexts | |
US11687754B1 (en) | Automated location capture system | |
KR20110115205A (en) | Using wireless identification technology system and method for providing customized content | |
KR102462309B1 (en) | Short range wireless communication calling system for performing customer service using tagging informaion | |
JP2004334439A (en) | Corporeal thing information management system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 07766226 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
NENP | Non-entry into the national phase |
Ref country code: RU |
122 | Ep: pct application non-entry in european phase |
Ref document number: 07766226 Country of ref document: EP Kind code of ref document: A1 |