
Visual Augmentation for the Blind

  • Posted On: 2nd June 2014


By Patrick Degenaar

Human beings have six dominant senses – vision, hearing, touch, taste, smell, and vestibular balance. Of these, vision is perhaps our most important. The loss of vision can be devastating, but not entirely disabling, especially in the young. The brain is sufficiently adaptable to re-route its functions, so that visual loss is often accompanied by significantly improved hearing, touch, and associated memory. The visually impaired can therefore augment their reality via sensory substitution. In rare cases, individuals blind from birth are even able to develop an auditory visual system through echolocation – making small clicking noises and detecting their 3-D environment by timing how long each echo takes to return.
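The principle behind echolocation is simple time-of-flight ranging: a click travels to a surface and back, so the distance is half the round-trip delay multiplied by the speed of sound. A minimal sketch of that arithmetic (the function name and the example delay are illustrative, not taken from the article):

```python
# Time-of-flight ranging, the physics behind human echolocation:
# a click travels out to a surface and back, so
#   distance = speed_of_sound * delay / 2

SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in dry air at 20 C

def distance_from_echo(delay_s: float) -> float:
    """Return the one-way distance in metres for a measured echo delay."""
    if delay_s < 0:
        raise ValueError("echo delay cannot be negative")
    return SPEED_OF_SOUND_M_S * delay_s / 2.0

# An echo heard ~12 ms after the click puts the surface about 2 m away.
print(round(distance_from_echo(0.012), 2))  # → 2.06
```

The same calculation underlies ultrasonic aids such as the echolocating walking stick mentioned below, with an ultrasonic pulse standing in for the audible click.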

Techniques to enhance sensory substitution, such as walking sticks, have been around for some time. Now, in the 21st century, blind computer users regularly use a screen-reader program called JAWS, which converts on-screen information to sound. Such users can navigate using auditory information streams that are unintelligible to the normally sighted. Day-to-day mobility can now be enhanced via an echolocating walking stick – the UltraCane – which gives vibrating feedback about objects three to five metres away. There are even attempts to use a camera and processing engine to convert the visual world into auditory or gustatory signals. And yet the overwhelming desire of the majority of the visually impaired is to regain at least some visual function rather than rely on sensory substitution.

Electrical stimulation of the visual system has been attempted for almost a century. The first experiments on live humans date to 1929, but subsequent progress has been slow. The most impressive results to date have come from the German company Retina Implant AG, whose low-resolution implant has allowed individuals to read large, high-contrast characters.

Perhaps the most exciting work in the field involves the new optogenetic method. Instead of implanting electrodes, it is now possible to genetically re-engineer the remaining cells of the eye to be light sensitive. Specially adapted virtual reality headsets can then transmit high-intensity pulses of light to return vision. This is the basis of the European OptoNeuro project. What is particularly exciting is that resolution is no longer limited by the curvature of the eye or the degradation of implanted electrodes. Additionally, without the need for expensive surgery, the cost of such systems can be reduced significantly, from hundreds of thousands to thousands.

We must, however, accept that in the early phases, perfect vision will not be returned. As such, the prosthesis needs to interpret the world and send only the most useful information. By cartoonizing the scene, less important textures are suppressed to give better contrast to the more important features (e.g., body shapes). A particularly intriguing question is whether the users of such visual systems will want to limit themselves entirely to the visual range. In Song of the Machine, my team at Newcastle, in conjunction with Superflux, conceptually explores the possibilities of the future – where users can view the world in the infrared and ultraviolet as well as the visible. This can be combined with assistive technologies such as navigation and face recognition, so that the total package – mixing augmented visual return with supplemented sensory substitution – will allow the blind to lead normal lives.
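Cartoonization of this kind can be approximated with basic image processing: quantize the grey levels to flatten fine texture, then darken strong edges so object outlines dominate. A minimal NumPy sketch of the idea (the level count and edge threshold are illustrative assumptions, not the OptoNeuro pipeline):

```python
import numpy as np

def cartoonize(gray: np.ndarray, levels: int = 4,
               edge_thresh: float = 0.3) -> np.ndarray:
    """Suppress texture by quantizing intensities, then darken strong edges.

    gray: 2-D float array with values in [0, 1].
    """
    # 1. Posterize: collapse the intensity range into a few flat bands,
    #    which wipes out low-contrast texture detail.
    quantized = np.floor(gray * levels) / levels

    # 2. Crude gradient magnitude via forward differences.
    gx = np.zeros_like(gray)
    gy = np.zeros_like(gray)
    gx[:, :-1] = np.abs(np.diff(gray, axis=1))
    gy[:-1, :] = np.abs(np.diff(gray, axis=0))
    edges = np.maximum(gx, gy) > edge_thresh

    # 3. Paint strong edges black so the important contours stand out.
    out = quantized.copy()
    out[edges] = 0.0
    return out

# Tiny demo: a bright square on a dark background keeps a crisp outline
# while its interior collapses to one flat band.
img = np.zeros((8, 8))
img[2:6, 2:6] = 0.9
result = cartoonize(img)
```

A real prosthesis front end would run a richer pipeline (smoothing, shape detection, prioritization of body shapes), but the suppress-texture / keep-contour trade-off is the same.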

Full human trials of such optogenetic retinal prostheses are a few years away, pending safety trials, and will in the first instance help only those suffering from retinitis pigmentosa. Trials for those blinded by other conditions will come further down the line. Assistive technologies, however, are already here. Touchscreen smartphones are becoming surprisingly popular amongst the visually impaired because physical keypads are being replaced with vibrating and auditory feedback. As such, GPS-based navigational apps are already changing lives.

Patrick Degenaar, Ph.D.
School of Electrical and Electronic Engineering
Newcastle University, United Kingdom
patrick.degenaar@newcastle.ac.uk

About Brenda Wiederhold
President of the Virtual Reality Medical Institute (VRMI) in Brussels, Belgium; Executive VP of the Virtual Reality Medical Center (VRMC), based in San Diego and Los Angeles, California; CEO of the Interactive Media Institute, a 501(c)(3) non-profit; Clinical Instructor in the Department of Psychiatry at UCSD; Founder of the CyberPsychology, CyberTherapy, & Social Networking Conference; Visiting Professor at Catholic University Milan.
