The white cane and beyond: Part 2

Optometry Times Journal, November digital edition 2024, Volume 16, Issue 10

Navigation and orientation technologies offer practical solutions for people with vision loss.

Man using a white cane on stairs. Image credit: AdobeStock/Pixel-Shot

In Part 1, Wolynski and Matos described accessible GPS and wearable navigation tech. Apps such as Lazarillo, BlindSquare, and HapticNav make navigation tech easily accessible by pairing the power of smartphone tech (location services, GPS, etc.) with navigation resources for people with vision loss, such as audible turn-by-turn directions and information on upcoming intersections and points of interest. External devices, such as the WeWALK Smart Cane, Ara, biped, and SideSight, can further enhance navigation with cameras, sensors, and haptic actuators that alert users to obstacles. Some work with the GPS apps previously mentioned, and others leverage artificial intelligence (AI) to communicate with the user. In Part 2, Wolynski and Matos discuss technology that helps people with vision loss navigate street crossings and indoor spaces, where precise GPS is often unavailable.


Pedestrian traffic signal detection

Knowing when it is safe to cross a busy street can pose a challenge for someone with vision loss. The Oko application by AYES aids with street crossing by combining the iPhone’s camera with AI software. With the iPhone camera properly oriented, the user receives, in real time, the status of pedestrian traffic signals via sound, haptic vibration, or a visual on the screen, indicating whether a signal reads “Don’t walk” or “Walk” or is about to change. The app is free to use and available only on the iPhone platform, and the company notes that it is not meant to replace orientation and mobility skills.1 However, Matos says it adds another layer of protection and confidence when crossing streets in a busy city. The company reports that GPS routes and mapping will soon be added.
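
The exact pipeline Oko uses is proprietary; the following is a minimal sketch of how a camera-based signal detector could turn frame-by-frame classifications into the sound, haptic, and on-screen feedback described above. The classify_frame() model, feedback assets, and smoothing threshold are hypothetical.

```python
# Illustrative sketch only; not Oko's actual implementation.
# Assumes a hypothetical on-device model, classify_frame(), that labels each
# camera frame as "walk", "dont_walk", or "unknown", and smooths the output
# before announcing it so the reported signal state does not flicker.
from collections import deque

FEEDBACK = {  # hypothetical feedback assets per signal state
    "walk": {"sound": "walk_tone.wav", "haptic": "short_pulse", "text": "Walk"},
    "dont_walk": {"sound": "wait_tone.wav", "haptic": "long_pulse", "text": "Don't walk"},
}

recent = deque(maxlen=10)  # last 10 frame classifications
last_announced = None

def classify_frame(frame):
    """Placeholder for a vision model that reads the pedestrian signal."""
    raise NotImplementedError

def announce(state):
    fb = FEEDBACK[state]
    # A real app would call platform audio, haptic, and UI APIs here.
    print(f"{fb['text']} (play {fb['sound']}, vibrate {fb['haptic']})")

def on_camera_frame(frame):
    global last_announced
    recent.append(classify_frame(frame))
    for state in FEEDBACK:
        # Announce only when most recent frames agree and the state has changed.
        if recent.count(state) > 0.7 * len(recent) and state != last_announced:
            last_announced = state
            announce(state)
```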

Accessible signs for indoor or outdoor orientation

Figure 1a. NaviLens code at the M66 bus stop in New York, New York, gives bus information, including time of next bus arrival. (Image courtesy of Bryan Wolynski, OD, FAAO)

Figure 1b. NaviLens app recognizing a NaviLens code used for indoor orientation, giving information about the Lighthouse Guild Technology Center in New York, New York. (Image courtesy of Bryan Wolynski, OD, FAAO)

Another approach to accessible travel is to create accessible signs. Based on this concept, NaviLens has developed an accessible code. Unlike a QR code, a NaviLens code is designed to be easily and instantly recognized by a smartphone’s camera from far distances and extreme angles. The square-shaped code has a white and black border surrounding a grid of smaller colored squares (Figure 1) and can be used for outdoor or indoor signage. The NaviLens app is free on iOS and Android; the user scans the environment for a code with the smartphone camera, and once the code is recognized, auditory information preprogrammed into the code is given instantly. Although not meant to give step-by-step navigation, the information can help orient an individual to their surroundings, provide points of interest (POI), or direct someone toward the code with its magnet feature. NaviLens codes are currently used for indoor signage on building directories and museum exhibits and for outdoor orientation in public spaces. As part of a pilot project, the Metropolitan Transportation Authority in New York, New York, has NaviLens codes displayed at select train stations, bus stops, and routes, giving entrance and other physical orientation information, including next transport arrivals.2 Feedback can also be spoken in other languages, no matter what language the data were initially entered in, making this useful for everyone and very helpful for tourists. NaviLens codes are also being adopted for accessible product packaging.3
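
As a rough illustration of the behavior described above (not NaviLens’ actual software), the sketch below maps a detected code to its preprogrammed message and speaks it in the user’s preferred language; the code IDs, messages, and translate() helper are hypothetical.

```python
# Minimal sketch of code-to-message lookup with language selection.
# Code IDs, messages, and translate() are hypothetical placeholders.
MESSAGES = {
    "code_4821": {"en": "M66 bus stop. Next bus arrives in 4 minutes."},
    "code_1093": {"en": "Lighthouse Guild Technology Center, entrance ahead."},
}

def translate(text, target_lang):
    """Placeholder for translating the stored message into the user's language."""
    return text  # passthrough in this sketch

def speak(text):
    print(f"[TTS] {text}")  # a real app would hand this to the device's TTS engine

def on_code_detected(code_id, user_lang="en"):
    entry = MESSAGES.get(code_id)
    if entry is None:
        return
    message = entry.get(user_lang) or translate(entry["en"], user_lang)
    speak(message)

on_code_detected("code_4821")
```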

Indoor navigation and orientation

Since the Americans with Disabilities Act was passed in 1990, braille has been included on indoor signage to make it accessible. However, relatively few people with vision impairment read braille, and those who do can have difficulty locating it on signage. Other technology solutions for those with vision loss have been researched, focusing on sensor networks such as ultrawideband, Bluetooth low-energy (BLE) beacons, radio-frequency identification (RFID), near-field communication (NFC) tags, computer vision/camera systems, or a combination of these options.4-6

A BLE beacon is a small, lightweight, energy-efficient, short-range transmitter that periodically broadcasts a radio signal, which is picked up by a receiver in range, usually a smartphone. Because location accuracy can be off by as much as 7.81 m,7 the information provided, as with the other technologies mentioned previously, is best used for orientation rather than turn-by-turn navigation. Although BLE beacons have been studied for indoor and outdoor accessible navigation,4 commercial BLE beacons have mainly been used as an indoor solution.
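
The accuracy figure above reflects how difficult it is to infer distance from signal strength. A standard textbook estimate, shown below, is the log-distance path-loss model; it is not the method any particular product uses, and the transmit power and path-loss exponent are assumed values that vary widely indoors.

```python
# Back-of-the-envelope sketch of why BLE beacon positioning is imprecise.
# The log-distance path-loss model is a textbook estimate, not a product's
# method; tx_power_dbm and the exponent n are assumed values.

def estimate_distance_m(rssi_dbm, tx_power_dbm=-59, n=2.0):
    """Estimate beacon distance from received signal strength (RSSI).

    tx_power_dbm: expected RSSI at 1 m (assumed, often advertised by beacons)
    n: path-loss exponent (~2 in free space, roughly 2.5-4 indoors)
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * n))

# The same reading yields very different distances as conditions change,
# which is why beacons suit proximity/orientation cues more than turn-by-turn.
for rssi in (-59, -70, -80):
    print(f"RSSI {rssi} dBm -> ~{estimate_distance_m(rssi):.1f} m (n=2.0), "
          f"~{estimate_distance_m(rssi, n=3.0):.1f} m (n=3.0)")
```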

Figure 2a. Bluetooth low-energy beacon, part of the RightHear system, placed on the wall left of an elevator. (Image courtesy of Bryan Wolynski, OD, FAAO)

RightHear uses BLE beacons in its audible wayfinding system, which includes:

  1. Web portal to enter information
  2. BLE beacon(s) to transmit the data (Figure 2a)
  3. RightHear smartphone app (free to users on iOS and Android) to receive the information (Figure 2b)

Figure 2b. Screenshot of the RightHear app on iPhone giving current location and orientation information. (Image courtesy of Bryan Wolynski, OD, FAAO)

With RightHear, there is no need to search for signage or use the smartphone’s camera. When the system is in use, information is transmitted to the user’s smartphone, which speaks content such as business hours, emergency information, location of restrooms, or floor directory signage. Information is also segmented into cardinal positions, so the application can tell the individual what is ahead in the direction they are facing. Another function allows a user who is blind or visually impaired to virtually explore a RightHear-enabled location in the app before traveling there. The RightHear app also supports an outdoor navigation option using GPS. Other companies using BLE beacons for indoor navigation include Lazarillo and BlindSquare.
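
As a rough sketch of the cardinal segmentation idea (not RightHear’s code), the example below maps the phone’s compass heading to one of four sectors and announces the information stored for that sector; the zone descriptions are invented.

```python
# Illustrative sketch: information at a beacon zone is segmented by cardinal
# direction, and the phone's compass heading selects what is announced as
# "ahead." Zone data and headings are assumed examples.

ZONE_INFO = {
    "north": "Elevators ahead.",
    "east": "Restrooms ahead, about 20 feet.",
    "south": "Lobby exit ahead.",
    "west": "Conference rooms ahead.",
}

def heading_to_cardinal(heading_deg):
    """Map a compass heading (0-359, 0 = north) to one of four sectors."""
    sectors = ["north", "east", "south", "west"]
    return sectors[int(((heading_deg + 45) % 360) // 90)]

def announce_ahead(heading_deg):
    direction = heading_to_cardinal(heading_deg)
    print(f"[TTS] Facing {direction}: {ZONE_INFO[direction]}")

announce_ahead(92)   # facing roughly east
announce_ahead(350)  # facing roughly north
```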

In addition to BLE beacons, other radio-transmitting sensor technologies include RFID and NFC tags.4,8 Many of us are familiar with NFC tags that are used for contactless payment. The WayAround company is using this technology in a similar way. Through a free smartphone app, users can get helpful information transmitted to their smartphone by tapping on strategically placed WayAround NFC tags. Information can include office room numbers, location, and orienting content within a building. These tags can also be purchased for home use to digitally label products, clothing, and other items around the home.

Unlike NFC tags, which require close touch contact, RFID tags can be read by simply passing by an RFID reader, much like we do at continuing education events. Hearsee Mobility, a nonprofit company in Utah, has developed a white cane designed to receive RFID tag signals, allowing users to receive information about POI and location of offices or restrooms while navigating an indoor route.

Most indoor navigation technologies provide orientation, proximity, POI, and audible signage information. As previously mentioned, GPS can provide approximate turn-by-turn navigation outdoors but does not work for indoor routes. GoodMaps addresses this gap using light detection and ranging (LiDAR) technology.

Setting up GoodMaps involves several steps:

  • An initial LiDAR scan of the indoor facility, which can take several hours depending on the size of the area
  • Creation of a digital map
  • Tagging and labeling all POI and mapping connecting routes

Figure 3. Using the GoodMaps app to navigate to a destination. The app gives directional instructions aloud, as on-screen text, and with visual arrows. (Image courtesy of Bryan Wolynski, OD, FAAO)

Once a venue is mapped, users choose a destination in the GoodMaps application, which is free on iOS and Android (Figure 3). Holding the smartphone with its camera facing forward, the user receives audible and visual step-by-step guidance. As with any technology, there is a learning curve, so individuals should always rely on their orientation and mobility skills. GoodMaps is currently available in airports, train stations, and retail businesses worldwide; a list of locations can be found in the app.9
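
The sketch below is a hedged illustration of the general idea behind a pre-mapped venue, not GoodMaps’ actual system: POIs tagged during mapping become nodes on a route graph, and the app announces each leg of the shortest path to the chosen destination. The venue graph is invented.

```python
# Illustrative sketch of routing over a pre-mapped indoor venue.
# Nodes and distances below are invented examples.
import heapq

GRAPH = {  # node -> {neighbor: distance in meters}
    "entrance": {"info_desk": 12},
    "info_desk": {"entrance": 12, "elevator": 20, "restroom": 15},
    "elevator": {"info_desk": 20, "gate_b4": 40},
    "restroom": {"info_desk": 15},
    "gate_b4": {"elevator": 40},
}

def shortest_path(start, goal):
    """Dijkstra over the mapped route graph."""
    queue, seen = [(0, start, [start])], set()
    while queue:
        dist, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for nbr, d in GRAPH[node].items():
            heapq.heappush(queue, (dist + d, nbr, path + [nbr]))
    return None

def announce_route(start, goal):
    path = shortest_path(start, goal)
    for a, b in zip(path, path[1:]):
        print(f"[TTS] Continue about {GRAPH[a][b]} meters to {b.replace('_', ' ')}.")

announce_route("entrance", "gate_b4")
```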

Another approach to indoor navigation comes from Microsoft’s Seeing AI application. This free app, available on iOS and Android, is used by many people with blindness or visual impairment for reading text, describing scenes, recognizing barcodes and currency, and other tasks. The app’s features are organized into channels; one of them, the World Channel, includes navigation using virtual beacons, or breadcrumbs. Users can follow saved routes visually on the screen, walking toward a virtual beacon (a large transparent blue dot) or guided by spatial sounds, which require headphones. Here’s how it works (a conceptual sketch follows the steps below):

1. Initial setup: Use the smartphone camera to scan the starting point until the app notifies you that 100% of the area has been scanned.

2. Save a route: Start walking toward your destination. The app drops virtual beacons, or breadcrumbs, as you walk; you can then save and name the route.

3. Follow the route: To navigate your saved route, select the route in the World Channel. Follow the virtual beacons on the screen or use spatial sounds with headphones to guide you from one beacon to the next until you complete the route.
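
The following is a conceptual sketch of the breadcrumb idea, not Microsoft’s implementation: while recording, waypoints are dropped along the walked path; while following, the app guides the user toward the next unvisited beacon. The coordinates and thresholds are hypothetical.

```python
# Conceptual sketch of breadcrumb route following.
# Coordinates are hypothetical local positions in meters.
import math

route = [(0, 0), (5, 0), (5, 8), (12, 8)]  # saved breadcrumbs, in walked order
REACHED_RADIUS = 1.5  # meters within which a beacon counts as reached

def guide(current_pos, next_idx):
    """Return guidance toward the next beacon, or None when the route is done."""
    if next_idx >= len(route):
        return None, next_idx
    bx, by = route[next_idx]
    dx, dy = bx - current_pos[0], by - current_pos[1]
    dist = math.hypot(dx, dy)
    if dist < REACHED_RADIUS:
        return f"Beacon {next_idx + 1} reached.", next_idx + 1
    bearing = (math.degrees(math.atan2(dx, dy)) + 360) % 360
    return f"Next beacon {dist:.0f} m away, bearing {bearing:.0f} degrees.", next_idx

msg, idx = guide((4.2, 0.5), 1)
print(msg)  # "Beacon 2 reached." once within 1.5 m of breadcrumb (5, 0)
```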

Sighted and virtual assistance

Companies such as Be My Eyes and Aira provide live sighted assistance to individuals with blindness or visual impairment. Be My Eyes is a free service and app on iOS and Android that connects sighted volunteers with users with visual impairment who need assistance. Volunteers see through the user’s smartphone camera to help with tasks such as shopping, finding things, and navigation, with calls lasting anywhere from 1 to 3 minutes.10 Be My Eyes recently introduced a feature called Be My AI, which uses ChatGPT to provide highly descriptive scene explanations, offering orientation information and allowing users to ask follow-up questions about their surroundings.

Aira also connects users to sighted assistance; however, the service is subscription based and connects users to trained professional agents. A study of Aira’s service found that more than 10,000 calls over a 3-month period averaged 8 to 9 minutes and were used primarily for reading, navigation, and home management.11 Aira is also developing an AI virtual option, which is currently in beta testing.

Other noteworthy options include Seeing AI, Envision AI, and Ray-Ban Meta smart glasses, all of which offer AI scene description. Envision AI provides this through its free smartphone application or on its smart glasses, which are available for purchase. The Envision glasses also offer sighted assistance through their Call a Companion feature. Ray-Ban Meta smart glasses use WhatsApp, allowing call recipients to see through the glasses’ camera and communicate directly with the wearer through the glasses.

The future

Technology is constantly evolving, and the systems mentioned in this article continue to improve. Commercial options are growing, especially for indoor navigation, delivered as smartphone apps today and, in the future, with the promise of wearables. For these advancements to be effective, they need to be user-friendly, aesthetically acceptable in public, and customizable in how they provide feedback. Recently, OpenAI showcased a future capability in which an AI assistant seamlessly provides real-time assistance, demonstrated in a video of a blind person touring London, United Kingdom, and hailing a taxi. The future of AI integration and capabilities is promising, but we should remain cautiously optimistic.

The marketing and promotion of assistive and mainstream technologies do not always accurately represent what the technology can do, and more data-driven research is needed as technology advances. Although technologies such as GPS can get a user close to a destination, they can fall short on the last few feet of a journey. Consequently, individuals with blindness or visual impairment still need to rely on their orientation and mobility skills; what’s more, they need those skills to learn how to use and integrate technology into their lives. Optometrists should consider more orientation and mobility referrals, particularly for older patients with low vision, who are at greater risk of falls and of becoming detached from their community because of fear of travel.

Nevertheless, technology is revolutionizing all our lives. Fortunately, new solutions that can help patients with low vision are emerging and are being brought to and tested at Lighthouse Guild in New York, New York. Incorporating technology into low-vision care can significantly enhance patients’ safety and functionality, enabling them to accomplish their daily activities and not only meet their goals but surpass them.

References:
1. Get help with Oko. AYES. Accessed July 6, 2024. https://www.ayes.ai/help
2. Expanding innovative accessibility solutions. Metropolitan Transportation Authority. Updated September 3, 2024. Accessed September 25, 2024. https://new.mta.info/accessibility/innovation
3. Discover NaviLens: the cutting edge technology for the visually impaired. NaviLens. Accessed July 7, 2024. https://www.navilens.com/en/#where-section
4. Plikynas D, Žvironas A, Budrionis A, Gudauskis M. Indoor navigation systems for visually impaired persons: mapping the features of existing technologies to user needs. Sensors (Basel). 2020;20(3):636. doi:10.3390/s20030636
5. Martinez-Sala AS, Losilla F, Sánchez-Aarnoutse JC, García-Haro J. Design, implementation and evaluation of an indoor navigation system for visually impaired people. Sensors (Basel). 2015;15(12):32168-32187. doi:10.3390/s151229912
6. Prandi C, Delnevo G, Salomoni P, Mirri S. On supporting university communities in indoor wayfinding: an inclusive design approach. Sensors (Basel). 2021;21(9):3134. doi:10.3390/s21093134
7. Fachri M, Khumaidi A. Positioning accuracy of commercial Bluetooth low energy beacon. IOP Conf Ser Mater Sci Eng. 2019;662(5):052018. doi:10.1088/1757-899X/662/5/052018
8. Ivanov R. Indoor navigation system for visually impaired. In: ACM International Conference Proceeding Series. Association for Computing Machinery; 2010:143-149. doi:10.1145/1839379.1839405
9. Enhance your venue today. GoodMaps. Accessed July 5, 2024. https://goodmaps.com/enhance-your-venue/
10. Avila M, Wolf K, Brock A, Henze N. Remote assistance for blind users in daily life: a survey about Be My Eyes. In: PETRA ’16: Proceedings of the 9th ACM International Conference on Pervasive Technologies Related to Assistive Environments. Association for Computing Machinery; 2016;85:1-2. doi:10.1145/2910674.2935839
11. Nguyen BJ, Chen WS, Chen AJ, et al. Large-scale assessment of needs in low vision individuals using the Aira assistive technology. Clin Ophthalmol. 2019;13:1853-1868. doi:10.2147/OPTH.S215658