A Mobile Robotic Arm for People with Severe Disabilities: Trial Development of a Vision-Based User Interface

Open Access

Abstract: This paper describes a vision-based user interface for a mobile robotic arm intended for people with severe disabilities. The robotic arm's main body fits into a laptop computer briefcase without removing any parts. The user interface consists of a single web camera to capture the user's eye movements, a computer running a program that detects the center of the iris and pupil in the captured images, and a display unit that presents feedback information to the user. Control boxes for detecting the user's eye movements were also created and superimposed on the processed images. The experimental results show that an able-bodied subject could operate the proposed user interface without difficulty, and that the interface shows promise for supporting the selection of food items on a table.

Keywords: Robotic arm; self-feeding system; people with severe disabilities; eye movement; image processing
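The paper's own detection program is not reproduced here, but the core idea described in the abstract (locating the pupil center in a captured eye image) can be illustrated with a simplified stand-in: treating the pupil as the darkest region of a grayscale eye image and taking the centroid of those pixels. The function name, threshold value, and synthetic test image below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def pupil_center(gray, threshold=50):
    """Estimate the pupil center as the centroid of the darkest pixels.

    gray: 2-D uint8 array (grayscale eye image).
    Returns (row, col) of the estimated center, or None if no pixel
    falls below the darkness threshold.
    """
    mask = gray < threshold           # pupil pixels are the darkest
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()   # centroid of the dark region

# Synthetic eye image: bright background with a dark "pupil" disk
img = np.full((120, 160), 200, dtype=np.uint8)
rr, cc = np.ogrid[:120, :160]
img[(rr - 60) ** 2 + (cc - 90) ** 2 <= 15 ** 2] = 10

print(pupil_center(img))  # estimated center, close to (60.0, 90.0)
```

A practical system would add robustness steps the sketch omits (eye-region localization, adaptive thresholding, and circle fitting for the iris boundary), but the centroid-of-dark-pixels idea is a common starting point for this kind of detection.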

Authors: Fei Gao, Hiroki Higa, Hideyuki Uehara and Takashi Soken

Affiliations: University of the Ryukyus, Japan; National Institute of Technology, Japan

Volume 1, Issue 1

Pages: 25-30

Publication Date: 25-12-2015
