NL1043806B1 - Inclusive personal wayfinding assistive method and system for electronic devices for all, in particular for visually impaired, dyslectics, the color blind, elderly and children. - Google Patents
- Publication number
- NL1043806B1
- Authority
- NL
- Netherlands
- Prior art keywords
- user
- wayfinding
- personal
- navigation
- route
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G01C21/206—Instruments for performing navigational calculations specially adapted for indoor navigation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H3/00—Appliances for aiding patients or disabled persons to walk about
- A61H3/06—Walking aids for blind persons
- A61H3/061—Walking aids for blind persons with electronic detecting or guiding means
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2201/00—Characteristics of apparatus not provided for in the preceding codes
- A61H2201/50—Control means thereof
- A61H2201/5023—Interfaces to the user
- A61H2201/5043—Displays
- A61H2201/5046—Touch screens
Abstract
The present invention is an inclusive personal wayfinding assistive methodology and system (The System) for electronic devices such as handheld or in-vehicle devices (The Device). The System is inclusively designed, making independent wayfinding more intuitive and thereby accessible to a larger audience in general and in particular to visually impaired persons, dyslectics, the color blind, the elderly and children (The User). The concept transforms all current navigation guidance complexity into an intuitive, personal user experience that steps away from commonly used complex user interfaces that utilize a traditional digital map on top of which arrows and routes are displayed to indicate the user's position, direction and route information, in combination with navigation voice instructions to guide the user, as described in examples US2013325322A1 and US2019212151A1. The disclosed differs from these traditional digital map-and-arrow interfaces by using well distinguishable, high contrast circular visual objects instead of maps, arrows and the like, to make the interface accessible to a much larger audience. Moreover, the disclosed uses a unique combination of multi-sensory interfaces (visual, haptic, voice and sound) in a game-like dynamic interface to guide The User. The System offers flexible wayfinding assistance, allowing The User to deviate from the selected route to avoid obstacles. The flexible wayfinding assistance will in this case recognize the deviation and alter its navigation guidance automatically without the need for route recalculations.
Description
INCLUSIVE PERSONAL WAYFINDING ASSISTIVE METHOD AND SYSTEM FOR ELECTRONIC DEVICES FOR ALL, IN PARTICULAR FOR VISUALLY IMPAIRED, DYSLECTICS, THE COLOR BLIND, ELDERLY AND CHILDREN.

Embodiments

1. The present invention introduces an intuitive personalized game-like multi-sensory interface for personal wayfinding (The System) for electronic devices. The disclosed will make real personal assistive wayfinding for electronic personal and in-vehicle devices accessible to a larger and more diverse audience, like visually impaired persons (VIP), dyslectics, illiterates, the color blind, the elderly and children (The User), that require additional assistance to navigate their environment independently and to create location awareness. This is accomplished by transforming the complexity of currently used digital navigation systems that utilize a traditional digital map on top of which arrows and routes are displayed to guide the user, as described in examples US2013325322A1 and US2019212151A1, into an intuitive, everyone-can-understand personal navigation tool that uses a new dynamic visual interface supported by haptic, sound and voice interfaces.
2. The System's visual user interface provides personal wayfinding assistance to The User towards a desired destination associated with a facility by introducing large, easily distinguishable, high contrast circular objects that objectify in a dynamic manner The User's relative position and movement towards the desired direction: The User's position relative to the desired route, the destination and waypoints, where a waypoint represents a point on the desired route that requires The User's action, like a change of direction or an interaction with an object such as a crossing or door. The System also allows for haptic, voice and sound guidance.
3. The System introduces a 2D/3D guidance switch simplifying the user experience by offering The User the choice between a 2D or 3D representation of the circular object representing The User's position relative to the desired route in the visual user interface described in 1.
4. The System offers flexible wayfinding when The User deviates from the desired route to avoid (temporary) obstacles. The flexible wayfinding will in this case recognize The User's deviation and alter the navigation guidance by rearranging the circular objects to the new deviated situation.
5. The System offers location awareness in an easy to understand, intuitive way by its visual interface and by introducing the explore function, which uses comparable multi-sensory interface elements as described in 1 to indicate objects in The User's direct surroundings. More specifically, the explore function presents objects in the direction The User's device is pointing, which in this document will be referred to as The Heading.
6. The System offers presets to make the visual interface suitable for people with various visual disabilities.
Description

Field

[1] Embodiments described here represent a wayfinding guidance methodology and application that introduces a novel concept of navigation to users of an electronic device, inclusively designed. The System is in particular designed for individuals with a cognitive or visual impairment, such as the visually impaired, dyslectics, the elderly and children, but also improves the user experience for the general public, increasing the use of personal wayfinding on personal and in-vehicle electronic devices.
Background

[2] Individuals with a visual or cognitive disability, such as the visually impaired, dyslectics, the color blind, illiterates, the elderly and children, have many difficulties navigating in a given environment.
For example, walking from a given location to a destination may be a difficult task for such individuals, particularly in city and indoor environments.
In case of a visual disability, walking canes and seeing-eye dogs are helpful for avoiding some obstacles but may not address the broader issue of providing personal wayfinding assistance and location awareness.
For this the visually impaired have depended on walking sticks, audible traffic signals, and Braille signs to navigate.
Without Orientation and Mobility training, blind people cannot navigate new places on their own. Moreover, information needed by these individuals for navigating certain environments (e.g., streets, subways, train stations, museums, office buildings and the like) may not be readily available in any accessible form that is of use to them.
Further, orientation and navigation information may take on diverse forms, which is already complex for the general public and therefore in particular for the individuals described here.
The disclosed offers real personal assistive wayfinding and creates location awareness for electronic personal and in-vehicle devices, accessible to a larger and more diverse audience.

[3] Most currently used navigation systems utilize a user interface consisting of a traditional digital map on top of which arrows and routes are displayed to guide people, as described in examples US2013325322A1 and US2019212151A1. These systems are complex for the general public to understand and a burden for users belonging to the following groups: the visually impaired (VIP), dyslectics,
illiterates, the color blind, the elderly and children (The User). These individuals require other assistive methods to empower them to independently create location awareness and navigate to a desired location.
[4] Commonly used solutions on the market today trying to address these issues focus on audio systems to guide the visually impaired that are built on top of these digital map interfaces and often use additional technology like Bluetooth or RFID beacons, such as described in US2016259027A1.

[5] Other previous attempts address these issues with infrared audible signs. This approach requires handheld devices that have to be pointed in the right direction to hear the sign. Additionally, infrared audible signs are expensive, have high installation and maintenance costs and cannot be used by just any electronic device (smartphones or in-vehicle devices).
Definitions used in this document

[6] Navigation guidance information consists of information about the user's position relative to The User's desired route, The User's next-waypoint heading (the heading towards the next waypoint), The User's destination heading and The User's deviation from the desired route, which are all relative to The User's position.
[7] A heading or view angle in this document is equal to the direction in which the top of the electronic device is pointing, represented by N (Figure 1 introduces the high contrast circular objects of The System) when the device is held in vertical position, a.k.a. portrait mode, or by O when the device is held in horizontal position, a.k.a. landscape mode.
[8] The User's start position is the position from which the user starts navigating. It is determined by sources such as GPS, triangulation methods and/or beacons or the like from outside the present invention.
[9] The User's desired route (Route) is a set of waypoints and points of interest calculated from The User's start position and destination. The route is generated by calling a route generation Application Programming Interface (API), such as the one provided by Google Maps, or another route provisioning method external to the present invention.
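The Route, POI and waypoint concepts defined in [9] above and [10]-[11] below could be modeled as plain data structures. A minimal sketch, assuming a simple latitude/longitude representation (all class and field names here are hypothetical, not taken from the patent):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class POI:
    """A Point of Interest: a named location The User can navigate to or pass by."""
    name: str
    lat: float
    lon: float

@dataclass
class Waypoint(POI):
    """A POI on the route where the direction changes or an interaction is required.
    A non-None interaction makes this an interaction point."""
    interaction: Optional[str] = None  # e.g. "open door", "press elevator button"

@dataclass
class Route:
    """The desired route: an ordered list of waypoints ending at the destination."""
    waypoints: List[Waypoint]

    @property
    def destination(self) -> Waypoint:
        # The Destination is the POI representing The User's destination,
        # usually the final waypoint of the route.
        return self.waypoints[-1]

# Example: a two-waypoint route imported from an external routing source.
route = Route(waypoints=[
    Waypoint("Crossing", 52.3702, 4.8952, interaction="cross street"),
    Waypoint("Main entrance", 52.3710, 4.8960, interaction="open door"),
])
```

Such a structure would be filled from an external route-provisioning API, as paragraph [9] describes; the present invention itself does not generate routes.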
[10] A Point of Interest (POI) is a location that someone may find useful or interesting. In our case a POI is a location which The User can navigate to or pass by on the desired route. POIs are imported into the present invention using APIs from available external sources.

[11] A waypoint is a POI on The User's desired route where the route direction changes or where an interaction with the environment is required by The User. A waypoint where an interaction is required is called an interaction point. Examples of interaction points are a door which has to be opened by the user, pushing an elevator button, going up or down stairs or an escalator, or other similar actions. Waypoints are imported into the present invention using APIs from available external sources. The Destination is the POI representing The User's destination.

Brief description of drawings
[12] The invention will be further elucidated below on the basis of drawings. In the drawings, which are not necessarily drawn to scale, like numerals may describe similar components in different views. Like numerals having different letter suffixes may represent different instances of similar components. The drawings illustrate generally, by way of example, but not by way of limitation, various embodiments discussed in the present document.

[13] Figure 1 introduces the high contrast circular objects of The System.
[14] Figure 2 depicts how The System guides The User over a desired route.

[15] Figure 3 depicts the behavior of waypoint and destination circles when navigating a desired route.

[16] Figure 4 depicts reaching a waypoint or destination.

[17] Figure 5 depicts the 2D/3D switch.
[18] Figure 6 depicts the flexible wayfinding assistance.
[19] Figure 7 depicts the explore function for location awareness.
[20] Figure 8 depicts custom color options for various visual conditions.
Detailed description of the drawings

[21] The visually impaired have depended on walking sticks, audible traffic signals, and Braille signs to navigate. Without Orientation and Mobility training, blind people cannot navigate new places on their own. They are confined to routes and places they are familiar with, and they must be constantly alert to sense cues and count steps to recall routes memorized through training and practice.
[22] To address these non-intuitive user interface issues, a visual user interface for electronic devices is herein described that provides more context and awareness to all users in general and in particular to users belonging to the groups described in [21].
[23] The present invention relates generally to methods and systems for receiving navigation routing information from various soft- and hardware resources and objectifying that information into a novel navigational user experience that can be used in both in- and outdoor environments on portable and in-vehicle electronic devices, such as smartphones, smart wearable devices, laptops, personal digital assistants, tablets, in-vehicle navigation devices and the like. More specifically, the present invention relates to a system and method for interpreting navigation instructions to a desired destination and creating location awareness in a novel intuitive user experience for users belonging to the groups described in [21].
[24] FIG. 1 introduces the high contrast objects of the visual navigation user interface, which objectify in a dynamic manner the user's relative position and movement towards a destination along a desired route. The center of circle 3 represents the user's optimal position on the desired route. 1 represents the user's destination by a pulsating circle. 2 represents the user's next waypoint by another circle. P represents the relative position of the user to the desired route. 4 represents the 2D/3D switch for P. 5 is a button which stops the wayfinding process when pressed. The right screen represents a detailed enlargement of the smartphone screen at the left. The dotted line 6 represents an imaginary radius on which 1 (the user's destination) is presented. Dotted line 7 represents an imaginary circle on which 2 (the user's next waypoint) is depicted. When navigating, the positions of 1 and 2 on their radii are determined by their relative position from the user's perspective, as will be explained in FIG. 2-5. 8 is a distance-to-destination indicator which displays the remaining distance to The Destination, described in [11], in meters as part of the total distance from navigation start point to destination point. Whenever The User reaches a waypoint, a voice message is triggered communicating the remaining distance to the destination in meters. N and O represent The Heading of the electronic device, depending on whether it is positioned vertically (portrait) or horizontally (landscape), as introduced in [7].

[25] FIG. 2 illustrates the user experience of navigating the Route. When both The User's initial position and the Route are determined from sources outside the present invention, they are used to determine the starting positions of circles 1, 2 and P of the visual user interface from FIG. 1.
In the situation depicted here, the user is to the right of the Route, in which case circle P is placed to the left of the central circle 3, which indicates the User's optimal position on the Route. If the user next moves towards the Route, in this case to the left, circle P will start moving towards the center of 3, the optimal position. Meanwhile the final destination circle 1 will appear and disappear intermittently for some seconds to indicate the User's final destination. Meanwhile the User's next waypoint direction is constantly indicated by the position of circle 2. The behavior of 1 and 2 will be elucidated in more detail in FIG. 3.

[26] The dynamic movement of the circles transforms complex (maps-and-arrows style) navigation into a pleasant, easy-for-all-to-understand, game-like user experience while creating location awareness at the same time.
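The placement of the destination and waypoint circles on their imaginary radii follows from the difference between the real-world bearing to the target and The Heading of the device: 0 degrees puts the circle at 12 o'clock, meaning the target is straight ahead. A sketch of that geometry under assumed conventions (the function names, pixel radius and screen coordinate system are illustrative, not from the patent):

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, degrees clockwise from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def circle_position(user, target, device_heading_deg, radius_px):
    """Screen offset (dx, dy) of a destination/waypoint circle on its imaginary radius.

    The circle sits at the angle between The Heading (N, at 12 o'clock on screen)
    and the real-world bearing to the target, so 0 degrees means the target is
    straight ahead of the device.
    """
    angle = (bearing_deg(user[0], user[1], target[0], target[1]) - device_heading_deg) % 360.0
    theta = math.radians(angle)
    # Screen coordinates: +x right, +y up; 0 degrees at 12 o'clock, clockwise.
    return radius_px * math.sin(theta), radius_px * math.cos(theta)

# A target due north of the user, device also heading north: circle at 12 o'clock.
dx, dy = circle_position((52.0, 4.0), (52.01, 4.0), 0.0, 100.0)  # (0.0, 100.0)
```

As the user turns or moves, recomputing this angle each frame produces the dynamic rotation of circles 1 and 2 described above.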
[27] FIG. 3 illustrates in detail the dynamic behavior of 1 (the user's destination) and 2 (the next waypoint), as described in [11], on their respective radii, as described in [24], when The User navigates to a destination over the desired route. P indicates that the user currently is to the right of the desired route and should therefore move to the left. Circle 1 indicates the direction of the user's destination and 2 the direction of the user's next waypoint in the real world. 1 is positioned on imaginary line L1 that originates from the center of 3 (representing the user's current position) to the real destination 1a in the real world. Likewise, 2 is positioned on imaginary line L2 that originates from the center of 3 (representing the user's current position) to the position of next waypoint 2a. In this case, when the user moves to the left, P will move to the center of 3, indicating to The User that he is moving towards the desired route.
In case 3a is in the center of 3, The User is at the optimal position relative to the desired route. Subsequently 2 will move clockwise on radius 7 (defined in FIG. 1) in such a way that it is always on imaginary line L2. When 2 arrives at the top position of circle 3, it means that the user is heading straight at the next waypoint.
Meanwhile the user is informed about the direction of the destination by 1. As 1 appears and disappears intermittently for some seconds, it does not distract the user while navigating to the next waypoint.
When P moves out of circle 3, The User is too far off the desired route; this triggers an off-route sound and vibration to warn The User to move in the direction of P.
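The off-route warning described here amounts to a threshold test on The User's lateral deviation, with circle P mirrored so that a user to the right of the route sees P left of center (as in FIG. 2). A minimal sketch; the 25 m threshold and the pixel scale are assumed values, not specified in the patent:

```python
def p_offset_px(deviation_m, max_deviation_m=25.0, circle_radius_px=80.0):
    """Map The User's lateral deviation from the route (metres, signed:
    positive = right of the route) to the horizontal offset of circle P
    in pixels. P is drawn mirrored: right of the route puts P left of center."""
    scale = circle_radius_px / max_deviation_m
    return -deviation_m * scale

def off_route_alert(deviation_m, max_deviation_m=25.0):
    """True when P would leave circle 3, i.e. the deviation exceeds the
    threshold; the interface then triggers the off-route sound and vibration."""
    return abs(deviation_m) > max_deviation_m

# 10 m right of the route: P sits left of center, no alert yet.
offset = p_offset_px(10.0)     # approx. -32.0 px
alert = off_route_alert(30.0)  # True: beyond the assumed 25 m threshold
```

Because the alert is driven purely by current deviation, no route recalculation is needed: as soon as the user moves back within the threshold, P re-enters circle 3 and the warning stops.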
The functionality as described can also be used in case a user has to deviate from the desired route when faced with a (temporary) obstacle, as illustrated in FIG. 6. This dynamic behavior of the circles while The User is on the move creates a unique game-like navigation user experience and simultaneously provides location awareness.

[28] FIG. 4 illustrates what happens when The User reaches a waypoint. In that case P, positioned inside 3, will dynamically increase in size, after which it decreases in size while a waypoint arrival sound is triggered by the system. If the reached waypoint is an interaction point, the arrival will trigger the associated voice instruction. Next, 2 will rotate on radius 7 to the new position indicating the next waypoint to The User, and P will take the new position relative to the route to the next waypoint. 1 will take on the new position on radius 6 indicating the destination, which is usually the same as the final waypoint.

[29] FIG. 5 illustrates the 2D/3D guidance switch that determines how P will be positioned in the graphical user interface.
We consider 3a as the closest point on the desired route (indicated by L5) relative to The User's position, represented by 3. In a., the 3D switch is deactivated; the distance between P and 3 is proportional to the distance between The User 3 and 3a. Circle P moves in horizontal direction over imaginary line L3. In b., the 3D switch is activated, in which case again the distance between P and 3 is proportional to the distance between The User 3 and 3a.
Circle P moves in horizontal direction over imaginary line L1. In addition, P is positioned at an angle B on circle 3, where 0 degrees is at 12 o'clock.
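The 2D/3D switch can be read as two ways of mapping the same deviation distance onto the position of P. A sketch under assumed parameters (the threshold, radius and the exact angular mapping in 3D mode are interpretations; the patent leaves them open):

```python
import math

def position_p(deviation_m, switch_3d, angle_b_deg=90.0,
               max_deviation_m=25.0, radius_px=80.0):
    """Offset (dx, dy) of circle P from the center of circle 3.

    In both modes the distance |P - 3| is proportional to the distance between
    The User (3) and the closest point on the route (3a). With the 3D switch
    off, P moves only horizontally; with it on, P is additionally placed at an
    angle B on circle 3, where 0 degrees is at 12 o'clock (clockwise).
    """
    r = min(abs(deviation_m), max_deviation_m) / max_deviation_m * radius_px
    if not switch_3d:
        # Mirrored horizontal placement: right of the route puts P left of center.
        return (-r if deviation_m > 0 else r), 0.0
    theta = math.radians(angle_b_deg)  # 0 degrees = 12 o'clock
    return r * math.sin(theta), r * math.cos(theta)

# 2D mode, 12.5 m right of the route: P halfway left along the horizontal axis.
dx2, dy2 = position_p(12.5, switch_3d=False)                 # (-40.0, 0.0)
# 3D mode, same deviation, angle B = 90 degrees: P at 3 o'clock.
dx3, dy3 = position_p(12.5, switch_3d=True, angle_b_deg=90.0)
```

Clamping the deviation at the threshold keeps P on or inside circle 3, so the geometry stays readable even for large deviations.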
[30] FIG. 6 illustrates the behavior of the interface when The User has to deviate from the desired route. Details are described in [27].

[31] FIG. 7 introduces the explore function, which is developed to create location awareness by identifying points of interest (POIs) in The User's surroundings with The User as the central point of reference. POIs and their locations are imported into the present invention using APIs from available external sources. We consider a circle R1 around The User, represented by center point C, with a radius of 50 meters. All POIs that lie within the set range of, in this case, 50 meters are presented in R1 at an angle a calculated from the center of C to N, representing The Heading of the electronic device. An angle a of 0 degrees is therefore an indication that the POI is straight ahead of the electronic device; at 90 degrees the POI will be to the right of the current heading, etc. Whenever The User or vehicle turns, thus changing The Heading of the electronic device, the a of the POIs will dynamically change, creating a rotating effect in the visual user interface of POIs orbiting around The User. If a POI makes contact with L6, a text label D displaying the given name of the POI will appear. Text label D will also contain a button V which, when activated, will start the navigation to this POI by starting the visual navigation user interface as introduced in [13].

[32] FIG. 8 illustrates accessibility pre-set options for the visual interface that together will allow individuals with minimal eyesight to make use of the visual interface: a. visualizes the standard setting and b. the high contrast setting.
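The explore function of [31] combines a range filter with a heading-relative angle. A sketch of that computation using standard great-circle formulas (the 50 m range is from [31]; the coordinates, names and function signatures are made up for illustration):

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, degrees clockwise from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres (haversine formula, mean Earth radius)."""
    R = 6371000.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2
    return 2 * R * math.asin(math.sqrt(h))

def explore(user, heading_deg, pois, range_m=50.0):
    """POIs within range of The User (center point C), each paired with the
    display angle a relative to The Heading N: 0 deg = straight ahead,
    90 deg = to the right of the current heading."""
    results = []
    for name, lat, lon in pois:
        if haversine_m(user[0], user[1], lat, lon) <= range_m:
            a = (bearing_deg(user[0], user[1], lat, lon) - heading_deg) % 360.0
            results.append((name, a))
    return results

# Device heading east (90 deg); a POI roughly 34 m due east shows nearly straight
# ahead, while one about 7 km away is filtered out by the 50 m range.
pois = [("Cafe", 52.0, 4.0005), ("Museum", 52.0, 4.1)]
visible = explore((52.0, 4.0), 90.0, pois)
```

Re-running `explore` whenever the device heading changes yields the rotating, orbiting effect described in [31].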
State of the art

[33] Most currently used navigation systems utilize a user interface consisting of a traditional digital map on top of which arrows and routes are displayed to guide people, as described in examples US2013325322A1 and US2019212151A1.

[34] Commonly used solutions on the market today trying to address these issues focus on audio systems to guide the visually impaired that are built on top of these digital map interfaces and often use additional technology like Bluetooth or RFID beacons, such as described in US2016259027A1.
[35] Other navigation related patents, such as KR2017/0104541A1 depicted in FIG. 9, US10132910B2 depicted in FIG. 10, US2005/0099291A1 depicted in FIG. 11 and US2015302774A1 depicted in FIG. 12, have similar ambitions navigating the impaired and others, but all take a completely different approach, as can be deduced from the key drawings depicted in the associated figures (FIG. 9, FIG. 10, FIG. 11 and FIG. 12).
Claims (3)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
NL1043806A NL1043806B1 (en) | 2020-10-05 | 2020-10-05 | Inclusive personal wayfinding assistive method and system for electronic devices for all, in particular for visually impaired, dyslectics, the color blind, elderly and children. |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
NL1043806A NL1043806B1 (en) | 2020-10-05 | 2020-10-05 | Inclusive personal wayfinding assistive method and system for electronic devices for all, in particular for visually impaired, dyslectics, the color blind, elderly and children. |
Publications (1)
Publication Number | Publication Date |
---|---|
NL1043806B1 true NL1043806B1 (en) | 2022-06-03 |
Family
ID=74347663
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
NL1043806A NL1043806B1 (en) | 2020-10-05 | 2020-10-05 | Inclusive personal wayfinding assistive method and system for electronic devices for all, in particular for visually impaired, dyslectics, the color blind, elderly and children. |
Country Status (1)
Country | Link |
---|---|
NL (1) | NL1043806B1 (en) |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050099291A1 (en) | 2003-11-12 | 2005-05-12 | Steven Landau | System for guiding visually impaired pedestrian using auditory cues |
EP1614997A1 (en) * | 2004-06-30 | 2006-01-11 | Navteq North America, LLC | Method of operating a navigation system using images |
US20060253226A1 (en) * | 2005-04-12 | 2006-11-09 | Ehud Mendelson | System and method of detecting and navigating to empty parking spaces |
EP2116811A2 (en) * | 2008-05-09 | 2009-11-11 | Eurocopter Deutschland GmbH | Flight guidance and navigation display for a helicopter |
WO2012035191A1 (en) * | 2010-09-13 | 2012-03-22 | Hyperin Inc. | Arrangement and related method for digital signage |
US20130325322A1 (en) | 2012-06-05 | 2013-12-05 | Christopher Blumenberg | System and method for navigation with inertial characteristics |
US20150302774A1 (en) | 2014-02-11 | 2015-10-22 | Sumit Dagar | Device Input System and Method for Visually Impaired Users |
US20160259027A1 (en) | 2015-03-06 | 2016-09-08 | Sensible Innovations, LLC | Audio navigation system for the visually impaired |
KR20170104541A (en) | 2015-01-12 | 2017-09-15 | 트레카스 테크놀로지스 엘티디 | Navigation devices and methods |
US20190212151A1 (en) | 2018-01-05 | 2019-07-11 | Lynette Parker | Facility navigation |
-
2020
- 2020-10-05 NL NL1043806A patent/NL1043806B1/en active
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050099291A1 (en) | 2003-11-12 | 2005-05-12 | Steven Landau | System for guiding visually impaired pedestrian using auditory cues |
EP1614997A1 (en) * | 2004-06-30 | 2006-01-11 | Navteq North America, LLC | Method of operating a navigation system using images |
US20060253226A1 (en) * | 2005-04-12 | 2006-11-09 | Ehud Mendelson | System and method of detecting and navigating to empty parking spaces |
EP2116811A2 (en) * | 2008-05-09 | 2009-11-11 | Eurocopter Deutschland GmbH | Flight guidance and navigation display for a helicopter |
WO2012035191A1 (en) * | 2010-09-13 | 2012-03-22 | Hyperin Inc. | Arrangement and related method for digital signage |
US20130325322A1 (en) | 2012-06-05 | 2013-12-05 | Christopher Blumenberg | System and method for navigation with inertial characteristics |
US20150302774A1 (en) | 2014-02-11 | 2015-10-22 | Sumit Dagar | Device Input System and Method for Visually Impaired Users |
KR20170104541A (en) | 2015-01-12 | 2017-09-15 | 트레카스 테크놀로지스 엘티디 | Navigation devices and methods |
US20160259027A1 (en) | 2015-03-06 | 2016-09-08 | Sensible Innovations, LLC | Audio navigation system for the visually impaired |
US10132910B2 (en) | 2015-03-06 | 2018-11-20 | Sensible Innovations, LLC | Audio navigation system for the visually impaired |
US20190212151A1 (en) | 2018-01-05 | 2019-07-11 | Lynette Parker | Facility navigation |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11576817B1 (en) | Selective information provision and indoor navigation assistance for the visually impaired | |
AU2020256377B2 (en) | Facilitating interaction between users and their environments using sounds | |
US10132910B2 (en) | Audio navigation system for the visually impaired | |
US10330490B2 (en) | Touch-based exploration of maps for screen reader users | |
Loomis et al. | Assisting wayfinding in visually impaired travelers | |
Giudice et al. | Blind navigation and the role of technology | |
Jain | Path-guided indoor navigation for the visually impaired using minimal building retrofitting | |
JP2000352521A (en) | System and method for providing navigation support for user, tactile-sense direction indicating device for providing the same via direction indicating cue, and method for providing the same via the same device | |
De Oliveira et al. | Indoor navigation with mobile augmented reality and beacon technology for wheelchair users | |
Kappers et al. | Hand-held haptic navigation devices for actual walking | |
NL1043806B1 (en) | Inclusive personal wayfinding assistive method and system for electronic devices for all, in particular for visually impaired, dyslectics, the color blind, elderly and children. | |
KR20090059834A (en) | Electronic apparatus with haptic module | |
Dias et al. | Future directions in indoor navigation technology for blind travelers | |
Ren et al. | Experiments with RouteNav, A Wayfinding App for Blind Travelers in a Transit Hub | |
Swaminathan et al. | From tactile to NavTile: opportunities and challenges with multi-modal feedback for guiding surfaces during non-visual navigation | |
Hub | Precise indoor and outdoor navigation for the blind and visually impaired using augmented maps and the TANIA system | |
Doush et al. | Non-visual navigation interface for completing tasks with a predefined order using mobile phone: a case study of pilgrimage | |
Pastrana-Brincones | Virtual Dog | |
Abraham et al. | An Accessible BLE Beacon-based Indoor Wayfinding System | |
Giudice | Wayfinding without vision: Learning real and virtual environments using dynamically-updated verbal descriptions | |
Clark et al. | Human Factors Suggestions for the Advanced Pedestrian Assistant: Usability and performance concerns of Advanced Accessible Pedestrian Signals for low vision users | |
Liu | Design of an adaptive wayfinding system for individuals with cognitive impairments | |
GB2625404A (en) | Navigation guidance devices | |
White et al. | Wayfinding White Paper | |
Hub et al. | Integration of Voice-Operated Route Planning and Route Guidance into a Portable Navigation and Object Recognition System for the Blind and Visually Impaired |