Environmental Localization Mapping and Guidance for Visual Prosthesis Users (SLAM)

Johns Hopkins University

Status

Completed

Conditions

Visual Prosthesis
Retinitis Pigmentosa
Visual Impairment

Treatments

Device: Navigation system mode: Argus Vision
Device: Distance test vision mode: Low Resolution / High Field-of-View
Device: Navigation system mode: Depth Vision
Device: Distance test vision mode: Low Resolution / Low Field-of-View
Device: Distance test vision mode: High Resolution / High Field-of-View
Device: Distance test vision mode: High Resolution / Low Field-of-View
Device: Navigation system mode: Depth Vision with Haptic / Audio
Device: Navigation system mode: High Field-of-View Depth Vision
Device: Navigation system mode: Haptic / Audio

Study type

Interventional

Funder types

Other
NIH

Identifiers

NCT04359108
1R01EY029741-01A1 (U.S. NIH Grant/Contract)
IRB00228932

Details and patient eligibility

About

This study is driven by the hypothesis that independent navigation by blind users of visual prosthetic devices can be greatly aided by use of an autonomous navigational aid that provides information about the environment and guidance for navigation through multimodal sensory cues. For this study, the investigators developed a navigation system that uses on-board sensing to map the user's environment and compute navigable paths to desired destinations in real-time. Information regarding obstacles and directional guidance is communicated to the user via a combination of sensory modalities including limited vision (through the user's visual prosthesis), haptic, and audio cues. This study evaluates how effectively this navigational aid improves prosthetic vision users' ability to perform navigational tasks. The participants for this study include both retinal prosthesis users of the Argus II Retinal Prosthesis System (Argus II) and normally sighted individuals who use a virtual reality headset to simulate the limited vision of the Argus II system.

Full description

About 1.3 million Americans aged 40 and older are legally blind, a majority because of diseases with onset later in life, such as glaucoma and age-related macular degeneration. Second Sight Medical Products (SSMP) has developed the world's first FDA-approved retinal implant, Argus II, intended to restore some functional vision for people suffering from retinitis pigmentosa (RP).

In this era of smart devices, generic navigation technology, such as GPS mapping apps for smartphones, can provide directions to help guide a blind user from point A to point B. However, these navigational aids do little to enable blind users to form an egocentric understanding of their surroundings, are not suited to indoor navigation, and do nothing to assist in avoiding obstacles to mobility. The Argus II, on the other hand, provides blind users with a limited visual representation of the user's surroundings that improves their ability to orient themselves and traverse obstacles, yet it lacks features for high-level navigation and semantic interpretation of the surroundings. The proposed study aims to address these limitations of the Argus II through a synergy of state-of-the-art simultaneous localization and mapping (SLAM) and scene recognition technologies.

This study is driven by the hypothesis that independent navigation by blind users of visual prosthetic devices can be greatly aided by use of an autonomous navigational aid that provides information about the environment and guidance for navigation through multimodal sensory cues. The investigators developed a navigation system that uses on-board sensing and SLAM-based algorithms to continuously construct a map of the user's environment and locate the user within that map in real-time. On-board path planning algorithms compute optimal navigation routes to reach desired destinations based on the constructed map. As the user navigates, the system communicates obstacle locations and navigational cues via a combination of sensory modalities. The participants for this study include blind Argus II users, who use their retinal implant for vision, and normally sighted individuals, who use a virtual reality headset to simulate the limited vision of a retinal prosthesis.
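For illustration only, the sketch below (written in Python; the function names, grid layout, and toy map are hypothetical, and this is not the study's software) shows how a navigable route might be computed over an occupancy grid of the kind a SLAM front end builds from on-board depth sensing:

    from collections import deque

    import numpy as np

    def plan_path(grid, start, goal):
        """Breadth-first search over an occupancy grid: a simple stand-in for
        the on-board planner that computes a route on the SLAM-built map."""
        prev = {start: None}
        queue = deque([start])
        while queue:
            cell = queue.popleft()
            if cell == goal:
                path = []
                while cell is not None:
                    path.append(cell)
                    cell = prev[cell]
                return path[::-1]            # ordered from start to goal
            r, c = cell
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                nxt = (r + dr, c + dc)
                if (0 <= nxt[0] < grid.shape[0] and 0 <= nxt[1] < grid.shape[1]
                        and grid[nxt] == 0 and nxt not in prev):
                    prev[nxt] = cell
                    queue.append(nxt)
        return []                            # no route to the goal was found

    # Toy occupancy map of the kind the SLAM mapping step would maintain:
    # 0 = free space, 1 = mapped obstacle.
    grid = np.zeros((10, 10), dtype=int)
    grid[4, 2:8] = 1                         # a wall the route must skirt around
    route = plan_path(grid, start=(8, 5), goal=(0, 5))
    print(route)                             # sequence of grid cells to traverse

In the actual system the map is updated continuously and the route is re-planned as the user moves; the sketch only illustrates the planning step on a fixed snapshot of the map.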

The sensory modalities used by the navigational aid to communicate information back to the user include:

  • Limited vision is provided via the user's visual prosthesis, with the Argus II retinal implant supporting an image size of 10 x 6 pixels spanning approximately 18 x 11 degrees field-of-view. Images sent to the visual implant are derived from video frames provided by forward-facing cameras integrated within headgear worn by the user. Three forms of vision feedback are evaluated in this study: 1) the standard vision output of the Argus II (which uses texture-based processing including difference-of-Gaussian and contrast enhancement filters), 2) an enhanced depth-based vision mode that uses the depth sensing capabilities of the navigational aid to highlight above-ground obstacles, and 3) a high field-of-view depth-based vision mode that doubles the image size and field of view of the visual feedback in each dimension. The depth-vision modes display only above-ground obstacles, rendered more brightly the closer they are to the user; the ground and obstacles beyond a threshold distance are not displayed, in order to declutter the visual scene. The high field-of-view depth-vision mode is only utilized by normally sighted participants, as this mode exceeds the vision capabilities of the Argus II implant.
  • Haptic cues indicate the direction in which the user should advance in order to follow the path computed by the navigational aid to reach a target destination. The haptic cues are generated by vibrators situated at five positions arranged left-to-right along the user's forehead, which are built into user-worn headgear. The five vibration points indicate that the user should turn "far left", "slight left", "straight ahead", "slight right", or "far right".
  • Audio cues provide an audible alert when the user approaches within 1.5 feet of an obstacle and provide verbal updates on the remaining distance to the destination along the path computed by the navigational aid. (A brief sketch of all three feedback mappings follows this list.)
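As a rough illustration of these three feedback mappings, the following Python sketch makes only limited use of the description above: the 10 x 6 pixel image size, the five haptic zones, and the 1.5-foot alert distance are stated in the study description, while the range threshold, zone boundaries, function names, and the omission of ground-plane removal are hypothetical simplifications.

    import numpy as np

    def depth_to_phosphene_image(depth_m, max_range_m=3.0, shape=(6, 10)):
        """Depth-vision mode: obstacles nearer than max_range_m are shown,
        brighter as they get closer; farther regions are blanked. (Removal of
        the ground plane is omitted here.) Output is downsampled to the
        implant's 10 x 6 pixel image."""
        brightness = np.clip(1.0 - depth_m / max_range_m, 0.0, 1.0)
        bh, bw = depth_m.shape[0] // shape[0], depth_m.shape[1] // shape[1]
        return brightness[:bh * shape[0], :bw * shape[1]].reshape(
            shape[0], bh, shape[1], bw).mean(axis=(1, 3))

    def heading_to_haptic_zone(heading_err_deg):
        """Haptic mode: quantize the heading error into the five forehead
        vibrator zones."""
        zones = ["far left", "slight left", "straight ahead",
                 "slight right", "far right"]
        edges = [-60, -20, 20, 60]           # hypothetical zone boundaries (deg)
        return zones[np.searchsorted(edges, heading_err_deg)]

    def audio_cues(nearest_obstacle_ft, remaining_ft):
        """Audio mode: alert within 1.5 feet of an obstacle, plus a verbal
        update on the remaining distance to the destination."""
        msgs = ["obstacle ahead"] if nearest_obstacle_ft < 1.5 else []
        msgs.append(f"{remaining_ft:.0f} feet to destination")
        return msgs

    depth = np.full((60, 100), 5.0)          # toy depth frame (meters)
    depth[20:40, 45:55] = 1.0                # an obstacle about 1 m ahead
    print(depth_to_phosphene_image(depth).round(2))
    print(heading_to_haptic_zone(-35.0))     # -> "slight left"
    print(audio_cues(nearest_obstacle_ft=1.2, remaining_ft=12))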

This study compares participants' performance in completing navigation tasks using five different modes and combinations of the foregoing sensory modalities as follows: 1) Argus vision, 2) depth vision, 3) depth vision with haptic and audio, 4) haptic and audio (without vision), and 5) high field-of-view depth vision.

The navigation tasks performed by the participants using these modalities include navigating through a dense obstacle field and navigating between rooms within an indoor facility that requires successful traversal of non-trivial paths.

In addition, a third experiment evaluates the effect of the retinal implant's resolution and field-of-view upon participants' ability to visually discern relative distances to obstacles, based on the optical flow patterns induced by the participant's motion when approaching obstacles situated at different distances ahead. For this experiment, the following four vision settings are evaluated: 1) low resolution / low field-of-view, 2) low resolution / high field-of-view, 3) high resolution / low field-of-view, and 4) high resolution / high field-of-view. The "low" settings correspond to the values of the Argus II system, whereas the "high" settings correspond to a doubling of the "low" values along each dimension. For Argus II user participants, only the low resolution / low field-of-view setting is evaluated, since the Argus II retinal implant is incapable of supporting the higher vision settings.
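As a compact reference, the sketch below enumerates the four settings under the stated assumption that the "low" values equal the Argus II parameters given earlier (10 x 6 pixels over approximately 18 x 11 degrees) and that the "high" values double each dimension; the layout and names are illustrative only.

    # Distance-test vision settings; "low" = Argus II values, "high" = doubled.
    LOW_RES, HIGH_RES = (10, 6), (20, 12)    # pixels (width x height)
    LOW_FOV, HIGH_FOV = (18, 11), (36, 22)   # degrees (horizontal x vertical)

    DISTANCE_TEST_SETTINGS = {
        "low res / low FOV":   {"resolution": LOW_RES,  "fov_deg": LOW_FOV},
        "low res / high FOV":  {"resolution": LOW_RES,  "fov_deg": HIGH_FOV},
        "high res / low FOV":  {"resolution": HIGH_RES, "fov_deg": LOW_FOV},
        "high res / high FOV": {"resolution": HIGH_RES, "fov_deg": HIGH_FOV},
    }

    for name, cfg in DISTANCE_TEST_SETTINGS.items():
        px_per_deg = cfg["resolution"][0] / cfg["fov_deg"][0]
        print(f"{name}: {cfg['resolution']} px over {cfg['fov_deg']} deg "
              f"({px_per_deg:.2f} px/deg horizontally)")

Note that doubling both resolution and field-of-view leaves the angular sampling density (pixels per degree) unchanged, whereas doubling only one of them changes it.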

Enrollment

26 patients

Sex

All

Ages

18+ years old

Volunteers

Accepts Healthy Volunteers

Inclusion and exclusion criteria

Criteria for inclusion of normally sighted individuals:

  • Subject speaks English;
  • Subject must be an adult (at least 18 years of age);
  • Subject has the cognitive and communication ability to participate in the study (i.e., follow spoken directions, perform tests, and give feedback);
  • Subject is willing to undergo psychophysics testing for up to 4-6 hours per day on 3-5 consecutive days;
  • Subject has visual acuity of 20/40 or better (corrected);
  • Subject is capable of understanding participant information materials and giving written informed consent;
  • Subject is able to walk unassisted.

Criteria for inclusion of Argus II users:

  • Subject is at least 25 years of age;
  • Subject has been implanted with the Argus II system;
  • Subject's eye has healed from surgery and the surgeon has cleared the subject for programming;
  • Subject has the cognitive and communication ability to participate in the study (i.e., follow spoken directions, perform tests, and give feedback);
  • Subject is willing to undergo psychophysics testing for up to 4-6 hours per day on 3-5 consecutive days;
  • Subject is capable of understanding patient information materials and giving written informed consent;
  • Subject is able to walk unassisted.

Exclusion criteria for all subjects are the following:

  • Subject is unwilling or unable to travel to testing facility for at least 3 days of testing within a one-week timeframe;
  • Subject does not speak English;
  • Subject has a language or hearing impairment.

Trial design

Primary purpose

Basic Science

Allocation

Non-Randomized

Interventional model

Single Group Assignment

Masking

None (Open label)

26 participants in 2 patient groups

Evaluation of normally sighted participants using a simulated visual prosthesis
Experimental group
Description:
All normally sighted participants will be assigned to this arm of the study. This study group will perform all experiments using a virtual reality headset (Oculus Go) to simulate the limited vision of a retinal prosthesis system. This study group will receive all interventions, including both those that match and those that exceed the visual performance of the Argus II system.
Treatment:
Device: Navigation system mode: High Field-of-View Depth Vision
Device: Navigation system mode: Haptic / Audio
Device: Navigation system mode: Depth Vision with Haptic / Audio
Device: Distance test vision mode: High Resolution / Low Field-of-View
Device: Distance test vision mode: High Resolution / High Field-of-View
Device: Distance test vision mode: Low Resolution / Low Field-of-View
Device: Distance test vision mode: Low Resolution / High Field-of-View
Device: Navigation system mode: Depth Vision
Device: Navigation system mode: Argus Vision
Evaluation of blind participants using the Argus II retinal prosthesis
Experimental group
Description:
All blind participants who have been implanted with the Argus II Retinal Prosthesis System will be assigned to this arm of the study. This study group will perform only the subset of interventions that does not exceed the visual performance of the Argus II system.
Treatment:
Device: Navigation system mode: Haptic / Audio
Device: Navigation system mode: Depth Vision with Haptic / Audio
Device: Distance test vision mode: Low Resolution / Low Field-of-View
Device: Navigation system mode: Depth Vision
Device: Navigation system mode: Argus Vision

Trial documents

Trial contacts and locations

Central trial contact

Seth Billings, Ph.D.; Francesco Tenore, Ph.D.

Data sourced from clinicaltrials.gov
