iPhone video link FaceTime as an orientation tool: Remote O&M for people with vision impairment


International Journal of Orientation & Mobility

Guide Dogs NSW/ACT

Subject: Health Care Sciences & Services , Medicine , Rehabilitation


ISSN: 1836-0254



Nicole Holmes * / Kelly Prentice *

Citation Information: International Journal of Orientation & Mobility, Volume 7, Issue 1 (November 2015), pages 60-67. DOI: 10.21307/ijom-2017-057. © 2017.

License : (CC-BY-NC-ND-4.0)

Published Online: 16-April-2018


ABSTRACT

This case study investigated the effectiveness of the application FaceTime on the Apple iPhone as an O&M tool. The two participants were a traveller (blind) who is an experienced long cane and guide dog user, and a qualified O&M instructor. The traveller and instructor tested FaceTime in five varying scenarios: shop identification, product identification in a supermarket, identification of buses at a transport interchange, orientation while free walking in residential streets, and road intersection identification. It was found that the information provided remotely by the instructor, which could not be obtained via GPS or other means, enhanced the independence of the traveller.


In recent years, people with vision impairment have been able to access and use, with their sighted peers, such mainstream technologies as Global Positioning Systems (GPS) applications (apps) and software. Interestingly, some educational institutions and service providers serving students with disabilities are successfully using video conferencing to reach those students who would otherwise not have access to their services (Dewald & Smyth, 2013-14; Royal Institute for Deaf and Blind Children, 2015).

The development of accessible smart-phone GPS apps for people with vision impairment means that GPS can, with varying degrees of inaccuracy, announce a location, nearest cross streets, and points of interest, and give directions to a destination. However, more recent technology, for example, ‘FaceTime’ on the iPhone, has the capacity to provide increasingly accurate and reliable information that might be used by people with vision impairment. This paper explores the potential for using FaceTime on the iPhone to assist a person with vision impairment in a number of different day-to-day situations in which GPS might not be the most effective means of obtaining accurate and reliable information.

GPS has been an evolving technology in relation to the orientation and mobility (O&M) of a person who is blind or vision impaired for over 20 years. Sendero Group was the first company to develop adaptive GPS equipment, beginning with a laptop computer carried in a backpack, followed by accessible GPS software for the BrailleNote notetaker, to the more recent GPS iPhone apps. While this technology has proved helpful to people with vision impairment, it has not yet addressed particular mobility issues, for example, identifying an object, place, or landmark in real-time and with refined accuracy. The CEO of Sendero Group (Sendero Group, 2015) referred to the “frustrating 50 feet”, a situation whereby GPS technology can orientate the user only to within approximately 50 feet (15.2 metres) of where they want to be. GPS can provide some independence to a traveller; however, it leaves the traveller relying on prior knowledge of the area or members of the public to provide assistance.
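The scale of the “frustrating 50 feet” can be made concrete with a standard great-circle (haversine) distance calculation between a reported and an actual position. The sketch below is illustrative only; the coordinates are hypothetical, chosen so that the two fixes differ by roughly the error a consumer GPS receiver might produce:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS-84 points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# A reported fix displaced from the true position by 0.00014 degrees
# of latitude is about 15.6 m off -- enough to miss a shop doorway.
error = haversine_m(-33.86500, 151.20900, -33.86514, 151.20900)
```

An error of this size is why a traveller left at the “frustrating 50 feet” still needs prior knowledge of the area or sighted assistance to cover the final distance.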

Several studies have investigated the use of video link to provide orientation to a traveller with vision impairment. For example, Garaj, Jirawimut, Ptasinski, Cecelja, and Balachandran (2003) trialled video link using two personal computers. One computer was inside a backpack, with a camera on the traveller’s chest and a GPS receiver on the shoulder; the other was a personal computer (PC) with an onscreen display used by the O&M instructor, who was located at a desk. This video link system allowed the remote guide (instructor) on the PC to see the traveller’s location on a map, and also provided a video link via cell towers. The trials revealed that the system increased the independence of the traveller with vision impairment on both a macro and micro level. That is, the person with vision impairment was provided by the instructor with a verbal preview of the route before travelling it, and was also prewarned about landmarks as the route was travelled. Their study did not progress past the trial; however, this particular video link system appeared to improve the navigation of the participant who was blind. That is, the remote guide (instructor) detected when the person moved off route and corrected him accordingly.

Similarly, Baranski, Polanczyk, and Strumillo (2010) used two terminals that were connected via cell towers. One terminal was a wearable and compact mobile device with a digital camera, GPS receiver, and headset worn by the traveller; the other was a personal computer used by the guide (instructor). This research used both video link and transmission of GPS data to allow the instructor to see the location of the traveller on a map. The difference between the former study and this one was that the instructor was able to navigate the traveller by controlling the GPS system while also warning him of hazardous obstacles.

A further study, by Scheggi, Talarico, and Prattichizzo (2014), used video camera glasses and provided haptic feedback to the traveller via two vibrating tactile bracelets. This procedure allowed the remote operator (instructor) to guide the traveller by activating one of the two vibrating bracelets rather than using audio. The study was conducted in an outdoor environment and the system was found to be effective for users who were unfamiliar with the environment.

The introduction of smartphones and apps led to a number of object recognition and crowd sourcing apps to assist people with vision impairment. For example, “VizWiz” introduced the idea of taking a photo of an object, pairing it with a question, and then receiving a response via web workers (Bigham, Jayant, Miller, White, & Yeh, 2010). Similarly, “TapTapSee”, developed by the Royal National Institute for the Blind, allows a person to take a photo of an object, compares the photo to a database of items, and then attempts to match the photo with the particular item in the database (Holton, 2015). Both these apps reduce the need for a person with vision impairment to organise face-to-face assistance, and instead allow access to assistance at any time of the day or night. Although these two apps do not permit video link, they have the potential for independent identification of features in the environment. The most recent iPhone app for people who are blind or vision impaired is “Be My Eyes.” This app allows sighted volunteers to connect with people who are blind anywhere in the world via live video link. This development means that the person who is blind can get assistance with identifying objects in their environment at any time through the iPhone or iPad (Holton, 2015).

“FaceTime” builds on the concept of identifying features in the environment. FaceTime is a free video calling capability between Apple iPhones and iPads: a live video link between the iDevices of two people. It requires only iPhones or iPads to work and can stream live video and audio between the two devices, with the data connection the only cost. Unlike ‘Be My Eyes’, FaceTime allows the person who is vision impaired to connect to an assistant of their choice. Importantly, and in contrast to the studies discussed, FaceTime is accessible mainstream technology, and does not require adaptive equipment or software to run.

Method

The traveller, who is blind, is an experienced and independent traveller and a regular user of GPS information. Using her primary aids, first a guide dog and then a long cane, she employed the FaceTime application in known and unknown environments, in the following situations that she identified as frustrating:

  • Locating shop entrances

  • Reading shop signage

  • Reading bus numbers and their destinations as they drive past

  • Identifying various department store sections

  • Identifying obstacles on regular routes

  • Negotiating barriers on footpaths

  • Identifying complex road crossings, roundabouts, islands, angles, lights

  • Identifying street signs

These situations were divided into five scenarios and specific tasks (Table 1).

Table 1.

Five scenarios in which the FaceTime application was used.


Equipment

Both the traveller and the instructor used an iPhone 4s on the Optus data service. The traveller wore a lanyard that looped around her neck and placed the camera at chest level (Figure 1), enabling her to be hands-free. The instructor used the application FaceTime while sitting at her workstation in her office (Figure 2).

Figure 1.

Lanyard around the traveller’s neck; the iPhone screen faces inward so the instructor’s face is not visible to the public.

Figure 2.

The instructor using the iPhone screen to view the traveller’s environment and talking to the traveller via headphones.


Camera Positioning

Prior to the commencement of the trial, the instructor and traveller reviewed effective ways to communicate the position of the camera so that it could detect the traveller’s environment. For example, they agreed to use the base of the iPhone as a reference point: tilting the camera down brought it closer to the body, and tilting it up moved it away. Once the angle of the phone was set this way, they used left and right for scanning. Movement directions were then given using the ‘clock face method’ and angles, for example, “move left to 10 o’clock”; “move 90 degrees to the right.”
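The clock-face convention maps directly onto turn angles, which is what makes the two phrasings (“10 o’clock” and “90 degrees to the right”) interchangeable. A minimal sketch of that mapping, purely illustrative and not software used in the study:

```python
def clock_to_degrees(hour: int) -> int:
    """Convert a clock-face direction (12 = straight ahead) to a signed
    turn angle in degrees: negative = turn left, positive = turn right."""
    if not 1 <= hour <= 12:
        raise ValueError("hour must be between 1 and 12")
    angle = (hour % 12) * 30           # each clock hour spans 30 degrees
    return angle - 360 if angle > 180 else angle

# "move left to 10 o'clock" is a 60-degree left turn: clock_to_degrees(10) == -60
# "move 90 degrees to the right" matches 3 o'clock: clock_to_degrees(3) == 90
```

Either party can therefore translate between the two conventions without ambiguity, which is useful when camera adjustments must be communicated quickly.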

Further, during the scenarios the traveller tested two methods of scanning items: first, moving the iPhone in her hand in front of various items (hand scanning); and second, moving her body with the camera resting on her chest, attached to the lanyard loop around her neck (Figure 1).

Results

In general, FaceTime video link with an O&M instructor appeared to enhance the traveller’s independence (Table 2). When the traveller was walking, the instructor was able to provide useful information about the type of shops being approached, e.g., a hair salon and a newsagency. However, the instructor also experienced some motion sickness, although this decreased when the traveller held the phone rather than let it hang around her neck. When the traveller was stationary, the level of detail provided by the instructor increased, as it was easier for her to see items (e.g., jewellery in a glass cabinet) and the environment (e.g., street signs) clearly.

Table 2.

The effectiveness of FaceTime across the five scenarios.


Discussion

FaceTime video link with an O&M instructor appeared to enhance the independence of the traveller. Such tasks as identifying shop signs, supermarket products, obstacles, hazards on the footpath, providing details about buses, and intersection layout would have previously been done face-to-face with either an O&M instructor or another person with vision.

Having a video link with the instructor meant that the traveller could request assistance only when she believed it was required, and feel confident that she was receiving the correct technical orientation information.

The main difference between the present study and previous studies was in the equipment used. While previous studies used video links, these were not as portable and easy to access as FaceTime for the end user.

No major differences were apparent between use of a cane and a guide dog. The instructor provided more detail about obstacles ahead when the traveller walked with the cane, whereas this was not required as much when the guide dog did its job correctly.

Providing information was easiest when the traveller was stationary, as the instructor received a clear view of what was in front of the camera and was not rushed to decipher the traveller’s location. Stationary orientation was used when the traveller required a high level of detail, for example, identifying products, shop signage, or buses at a bus stand. This result is in contrast to previous studies, in which the remote guide followed the traveller along a particular route and was able to detect whether or not she had gone off route. What previous studies did not do, and what makes the traveller with vision impairment more independent using FaceTime, is use technology for ad hoc tasks, for example, identifying products and bus numbers. Having the confidence to do such tasks without needing to ask a member of the public means that the traveller can complete them independently and efficiently.

Although the study by Baranski, Polanczyk, and Strumillo (2010) used wearable camera technology on the traveller, it did not increase the independence of the traveller as the remote guide had complete control of alerting the traveller to obstacles in the environment, as well as controlling the GPS system used to guide the traveller. However, in the present study the traveller was free to use GPS apps on the iPhone, but this information was not controlled by the O&M instructor. Instead, the traveller had autonomy since she was able to travel in the environments of her choice and could initiate questions to the O&M instructor to seek further information about features of the environment.

Verbal communication in real-time was important in the present study since it allowed the O&M instructor to identify the traveller’s location, and also permitted the traveller to gain the information she needed at any time. This result is in contrast to the system where haptic feedback is given in order to guide the traveller along a route.

Instructor familiarity with the environment was not analysed but would be an important element in further research. In addition, to reduce the motion sickness experienced by the instructor, perhaps the camera needs to be fixed in some way, yet removable for hand-held use when required, for example, to identify objects on shelves. It is recommended that training be provided to the traveller to explain the way the camera works and how to position it correctly prior to the commencement of travel.

Conclusion

Results of the current study indicate that FaceTime orientation with the iPhone might be a useful tool to increase the independence of a person with vision impairment. FaceTime orientation appears to enhance the traveller’s journey by giving an additional level of information to that provided by any GPS system. FaceTime provides information in real-time and allows important two-way communication and questioning between the O&M instructor and traveller. The O&M instructor is a trusted professional in the area of orientation of people who are blind or vision impaired, and thus the information provided via FaceTime might be considered of a higher standard than that provided by a member of the public. Advantages of using FaceTime include: (i) cost and resource efficiency, in that instructors are not required to provide face-to-face training; (ii) clients accessing an increasing number of environments; (iii) a cost-efficient strategy to expand an organisation’s mobility services; and (iv) an increase in autonomy for the traveller.

References


  1. Baranski, P., Polanczyk, M., & Strumillo, P. (2010). A remote guidance system for the blind. e-Health Networking Applications and Services, 386-390.
  2. Bigham, J. P., Jayant, C., Miller, A., White, B., & Yeh, T. (2010). VizWiz: LocateIt - enabling blind people to locate objects in their environment. Computer Vision and Pattern Recognition Workshops (CVPRW), 2010 IEEE Computer Society Conference, 65-72.
  3. Dewald, H. P., & Smyth, C. A. (2013-14). Feasibility of O&M services for young children with vision impairment using teleintervention. International Journal of Orientation & Mobility, 6(1), 83-92.
  4. Garaj, V., Jirawimut, R., Ptasinski, P., Cecelja, F., & Balachandran, W. (2003). A system for remote sighted guidance of visually impaired pedestrians. British Journal of Visual Impairment, 21(2), 55-63.
  5. Holton, B. (2015). A review of the Be My Eyes Remote Sighted Helper App for Apple iOS. AccessWorld. Retrieved from http://www.afb.org/afbpress/pub.asp?DocID=aw160202&utm_content=bufferbb105&utm_medium=social&utm_source=facebook.com&utm_campaign=buffer
  6. Royal Institute for Deaf and Blind Children. (2015). Services across Australia. Retrieved from http://www.ridbc.org.au/services-across-australia
  7. Scheggi, S., Talarico, A., & Prattichizzo, D. (2014). A remote guidance system for blind and visually impaired people via vibrotactile haptic feedback. Control and Automation (MED), 20-23.
  8. Sendero Group. (2015). Sendero Group: Accessible location and navigation. Retrieved from http://www.senderogroup.com/