Prototype Electric Wheelchair Controlled By Eye Only Computer Science Essay

Recently, 2% of US citizens are paralyzed, caused by stroke, spinal cord injury, and multiple sclerosis. It has become a nightmare for these Americans that they lose their mobility and, with it, much of their way of life. An electric wheelchair controlled by eye-only is being developed to provide independent mobility for paralyzed individuals and restore their quality of life.

Methods: A prototype wheelchair controlled by eye-only has been developed using a camera mounted on the user's glasses, a micro-controller, and a Yamaha JW-II. The prototype allows the user to move forward, right, and left by looking in one of these directions for 2 seconds.

Several image processing analyses are used to estimate the user's gaze, which is then used to control the wheelchair.

Results: The prototype has been tested by five normal users under real circumstances. Vibration, accuracy, and illumination changes have also been evaluated. In all evaluations, our prototype handled the problems of varied users, vibration, and illumination changes.

Conclusion: The prototype demonstrates the feasibility and reliability of providing computer input for paralyzed users to control a wheelchair.


I. Background

Introduction

The development of wheelchairs for paralyzed users is relatively new, and represents a smart option for people who cannot use their limbs to control a wheelchair as their mobility device. The journey of wheelchair development started with the conventional wheelchair, which relies on the user's hands to apply force to the wheels.

Besides providing force, the user's hands also steer the wheelchair. Although conventional wheelchairs are still available, electric wheelchairs have reduced the role of the user's hands by adding an electric motor and battery to replace hand force. A joystick, a hand-based controller, is used to control the movement of the wheelchair. Recently, 1 in 50 Americans is living with paralysis [11][12].

It is generally caused by stroke (29%), spinal cord injury (23%), and multiple sclerosis (17%). Diseases or accidents that damage the nervous system can cause a person to lose the ability to move muscles voluntarily. Because voluntary muscles are the main actuators that enable our bodies to move, paralysis may leave a person unable to move organs such as the face, arms, and legs. Paralysis can be local, global, or follow a specific pattern.

Most paralysis is permanent; however, there are other forms, such as periodic paralysis (mostly caused by genetic disease) and sleep paralysis (which occurs when the brain wakes from REM (Rapid Eye Movement) sleep but the body cannot be moved for several seconds or minutes), which are caused by other factors. A person with paralysis cannot use a typical electric wheelchair, because they cannot use their hands (or other limbs) to operate the joystick controller. In this case, the eyes are the only organ that can express the user's intent.

Considering the reasons above, we developed an electric wheelchair that can be controlled using the eyes only.

Related Works

Hands-free wheelchairs as assistive mobility devices can be broadly categorized as follows. Bio-signal based systems [1][2]: electrooculographs, electroencephalographs, electromyographs, and other bio-signal instruments are used to acquire bio-signals from the user and use them to control the wheelchair. Ref.

[1] proposed an electric wheelchair controlled using electrooculography (EOG). EOG analyzes eye movements via electrodes attached around the eye. EOG obtains two kinds of signals, horizontal and vertical, that represent eye muscle activity. Each eye movement has its own signal pattern, so right, left, up, and down eye movements can be distinguished by analyzing the EOG signal. The results are used to control the wheelchair. Ref.

[2] proposed a wheelchair controlled by muscular and brain signals. EMG and EEG are used together to analyze the user's intent and control the wheelchair. Electrodes are attached to the head; the output signals are analyzed and converted into wheelchair commands.

Voice based systems. Ref. [3] proposed a wheelchair guided by voice commands. The prototype consists of speech recognition, motor control, user interface, and central processor modules. Speech recognition is used to recognize voice commands. The user has to record the oral command associated with each function. After the first recording, the user can start normal operating mode.

For example, when the user says "Forward", the wheelchair moves forward. Likewise, when the user says "Stop", the wheelchair stops. Vision based systems [4][5][6] use a camera to acquire user images and analyze the user's intent.

Ref. [4] proposed a wheelchair controlled by head gestures. Viola-Jones face detection is used to recognize the face profile. Head gestures such as up, down, left, right, and center are mapped to the commands speed up, slow down, turn left, turn right, or keep the current speed.

Ref. [5] proposed a wheelchair controlled using gaze direction and eye blinks. The gaze direction is expressed as the horizontal angle of gaze, derived from the triangle formed by the center positions of the eyes and the nose. The gaze direction and eye blinks provide the direction and timing commands. The direction command relates to the movement direction of the electric wheelchair, and the timing command relates to when the wheelchair should move. Ref. [6] proposed a wheelchair with two cameras: an indoor camera to monitor wheelchair movement, and another camera mounted on the wheelchair for obstacle detection.

Ref. [7] proposed a wheelchair controlled by gaze. Stereo CCD cameras are used to estimate the user's gaze and head pose. In addition, a range finder is used to recognize the surrounding environment. Systems of type (1) require direct contact with the user, and electrodes must be attached to the user's body; these systems are expensive and inconvenient to use. Systems of type (2) are easy and simple to develop, but speech interference must be considered when the wheelchair is used in a real environment. For the aforementioned reasons, we propose a wheelchair using a vision system.

The objective of our proposed system is to develop a wheelchair, specially designed for paralyzed users, that overcomes the problems of previous systems: varied users, vibration, user movement, and illumination changes. We designed our wheelchair specifically for paralyzed users. Paralyzed users cannot use their hands, feet, body gestures, head, and so on. In most cases, although they cannot move these organs, the eyes can still express their intent. The easiest way to express intent is by blinking.

By controlling the duration of their blinks, they can communicate information to others. However, even though blinking can be used, sustained blinking over a long period makes the eyes tired. For this reason, we decided to use the user's gaze as the source of information. Our proposed system is basically the same as Ref. [7]: both use gaze to capture user intent.

Ref. [7] uses a stereo CCD camera to analyze gaze, but our system uses only a single camera mounted on the user's glasses. This way, our wheelchair performs well under real circumstances. Furthermore, our system works with an inexpensive netbook PC, which makes it more marketable from an economic standpoint. Our proposed system consists of a single infrared camera, a netbook, a micro-controller, and a modified wheelchair. The camera is mounted on the user's glasses in order to allow user movement. Infrared LEDs adjust the illumination when the ambient light changes.

In addition, this camera placement allows user movement, because the camera always follows head movement. Furthermore, this arrangement eliminates vibration, because the user's body absorbs shock and vibration coming from below. After the user's image is acquired by the camera, image processing methods estimate the user's gaze from the image. Viola-Jones eye detection, adaptive thresholding, and a Kalman filter are used to estimate the gaze. A single ultrasonic sensor, used to avoid collisions, is placed on the front of the wheelchair.

In order to command the wheelchair, an invisible key layout is used. Turn left, turn right, and go forward are selected by looking at the corresponding key for 2 seconds. Invisible keys mean that the user knows the key positions without any physical marks. Our wheelchair does not use a stop key, for safety reasons.

When the user changes gaze direction, the wheelchair stops automatically. Likewise, when the system fails to analyze the user's gaze, the wheelchair stops. With this scheme, the wheelchair moves safely when used by a paralyzed user.

Furthermore, this wheelchair will restore their quality of life, because they regain their mobility. This paper is organized as follows: Section 2 describes our proposed system, covering the hardware configuration, gaze estimation method, eye model, key selection flow, and micro-controller circuit; Section 3 describes our experimental results, covering robustness against varied users, noise, and illumination changes, vibration testing, and evaluation of wheelchair performance; and Section 4 presents the conclusion.

II. Proposed System

The utmost requirement of our proposed wheelchair is that the prototype must be guaranteed to work for all users, and also under real circumstances such as vibration, illumination changes, and possible user movement, while the controller maintains high accuracy.

Furthermore, the system must be guaranteed to work safely. In order to realize a practical wheelchair for paralyzed users, an infrared camera is utilized and mounted on the user's glasses. This arrangement has benefits: it allows user movement, reduces vibration, and the infrared LEDs automatically adjust the illumination, keeping the output image stable. This arrangement must also be paired with accurate image processing analysis of the user's gaze.

Our gaze estimation method employs pupil knowledge when estimating the user's gaze. Pupil properties such as size, color, shape, sequential location, and motion are used. After the pupil location is found, a simple model converts the pupil location into the user's gaze. A micro-controller circuit connects and manages communication between the netbook and the wheelchair. This circuit converts RS-232 serial data into wheelchair commands.

When the user looks at a key for 2 seconds, the netbook sends a command to the wheelchair and makes it move in the selected direction. When the user changes gaze direction, the wheelchair stops automatically. No stop key is used, for safety reasons. Furthermore, in order to avoid collision with obstacles, an ultrasonic range finder detects obstacles in front. When an obstacle is detected, the wheelchair automatically stops and the user can only turn left or right. A backward key is also omitted for safety, because moving backward while the user cannot look backward is dangerous. Fig.

1 shows the block diagram of our proposed system.

Fig. 1. Block diagram of our proposed method; the micro-controller circuit replaces the function of the original controller of the Yamaha JW-II wheelchair.

II.1 Hardware Configuration

Our proposed system utilizes an infrared camera, the NetCowBow DC NCR-131, to acquire the user's image.

This camera has 7 LEDs which automatically adjust the illumination and obtain a stable image even when the ambient light changes. The use of an IR camera solves the problem of illumination changes. We put the camera on the user's glasses. The distance between the camera and the eye is 15.5 cm. This value comes from trial and error, considering both that the camera must capture the eye and that its placement must not disturb the user's view. The camera is positioned in front of the eye but slightly above it, to avoid disturbing the user's view. Furthermore, the placement of the camera has the extra effect of reducing vibration.

Naturally, the road makes the wheelchair vibrate. When this happens, the user's body absorbs the vibration, so the vibration does not influence the camera. The camera position is shown in Fig. 2. As the main processing device we used an Asus Eee PC 1002HA netbook, based on an Intel Atom N270 CPU (1.6 GHz), with 1 GB memory, a 160 GB SATA HDD, and a small 10-inch display.

In order to convert USB to serial, we use a Keyspan USA-19Qi USB-to-serial converter. A micro-controller, the AT89S51, is used to replace the original controller. Our software is developed in C++ with Visual Studio 2005 and OpenCV, an image processing library that can be downloaded for free from its website. In addition, to detect obstacles, a PING ultrasonic range finder is used. This range finder is able to detect obstacles from about 3 cm up to 3 m. Fig. 2 shows our prototype hardware. Fig.

2 Hardware of the proposed wheelchair

Table 1. Characteristics of the ultrasonic range finder [8]

Supply voltage: 5 V (DC)
Supply current: 30 mA (typ.), 35 mA (max.)
Range: 3 cm to 3 m
Input trigger: positive TTL pulse, 2 µs min, 5 µs (typ.)
Echo pulse: positive TTL pulse, 115 µs to 18.5 ms
Echo hold-off: 750 µs from fall of trigger pulse
Burst frequency: 40 kHz for 200 µs
Delay before next measurement: 200 µs
Dimensions: 22 mm H x 46 mm W x 16 mm D
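The echo-pulse range in Table 1 can be related to distance using the speed of sound. The sketch below assumes roughly 343 m/s in air at room temperature and that the echo pulse covers the round trip to the obstacle and back:

```python
SPEED_OF_SOUND_M_PER_S = 343.0  # approximate speed of sound in air at ~20 degrees C

def echo_to_distance_m(echo_pulse_s: float) -> float:
    """Convert an ultrasonic echo pulse width (seconds) to distance (meters).

    The pulse spans the round trip to the obstacle and back, so halve it.
    """
    return SPEED_OF_SOUND_M_PER_S * echo_pulse_s / 2.0

# The 115 us minimum echo in Table 1 corresponds to roughly 2 cm, and the
# 18.5 ms maximum to roughly 3.2 m, consistent with the stated 3 cm - 3 m range.
print(round(echo_to_distance_m(115e-6) * 100, 1))  # distance in cm for the minimum pulse
print(round(echo_to_distance_m(18.5e-3), 2))       # distance in m for the maximum pulse
```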

II.2 Gaze Estimation

In order to estimate the user's gaze, several image processing methods are used: Viola-Jones eye detection, deformable template matching, adaptive thresholding, and a Kalman filter.

The flow of gaze estimation is shown in Fig. 3.

Fig. 3 Flow of gaze estimation

The estimation starts by detecting the eye location. Using the eye location, the estimated eye area is locked. Because the camera is mounted on the user's glasses, once the eye location is known, the next eye location will be in the same place. This means the eye detection step runs only once, at the beginning.

The next process is detection of the pupil location. The pupil location is detected using pupil knowledge: color, size, shape, sequential location, and motion. Using the pupil location, an eye model converts it into gaze direction and obtains the user's gaze. The detailed explanation of each process is described below. Normally, the eye is detected using a deformable template. We capture the eye image and apply a Gaussian smoother to it. The eye image is matched against the deformable template to find the eye location. The advantages of this method are that it is fast and can be used for a larger range of users than plain template matching.

Because this method does not always succeed in detecting the eye location, Viola-Jones eye detection [9] is used as a backup. It takes over eye detection when the deformable template fails. An XML file is required when using the Viola-Jones eye detection function of the OpenCV image processing library. This file can be created by collecting object (positive sample) and non-object (negative sample) images.

This function can be used with the following code:

CvSeq* objects = cvHaarDetectObjects(small_img, cascade, storage, 1.1, 2, 0, cvSize(30, 30));

Using both methods together has the advantage that processing is faster and robust against different circumstances. After the eye location is found, it is used to lock the eye image. This means that in the following frames, the eye detection step is skipped. Next is the pupil detection step. We estimate the pupil location using pupil knowledge. In order to extract pupil knowledge, we use an adaptive threshold method to separate the pupil from the other eye components.

We set the threshold value T at 27% below the mean µ of the eye image I:

T = µ − 0.27 µ    (1)
B(x, y) = 1 if I(x, y) < T, else 0    (2)

The output of the adaptive threshold is a set of black pixels that represent the pupil in the image. In order to eliminate noise, we use a median filter. The output of the adaptive threshold varies widely, so we classify it into three cases: (1) case 1, when the black pixels clearly represent the pupil without any noise; (2) case 2, when noise appears in the image with nearly the same size and shape as the pupil; and (3) case 3, when no pupil properties remain that can be used to find the pupil location. The three cases of adaptive threshold output are shown in Fig. 4, Fig. 5, and Fig. 6.

Fig. 4 Case 1: the output can be distinguished by its shape and size.

Fig. 5 Case 2: the output contains noise of nearly the same size and shape as the pupil.

Fig. 6 Case 3: no pupil properties can be used, because there are no black pixels in the output image.

After classifying the adaptive threshold output, we estimate the pupil location with a three-step process based on pupil knowledge. In case 1, the pupil location is easily estimated by shape and size. Even if noise appears in the image, we can still distinguish the true pupil by considering its shape and size.
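The mean-relative thresholding rule of Eqs. (1) and (2) can be sketched in a few lines. This is a minimal illustration assuming a grayscale image stored as a list of lists, not the authors' implementation:

```python
def adaptive_threshold(image, ratio=0.27):
    """Binarize a grayscale eye image with a mean-relative threshold.

    The threshold T is set `ratio` below the image mean, i.e. T = (1 - ratio) * mean,
    so dark pixels (pupil candidates) come out as 1 and the rest as 0.
    The 0.27 ratio follows the paper; the list-of-lists image format is ours.
    """
    pixels = [p for row in image for p in row]
    mean = sum(pixels) / len(pixels)
    t = (1.0 - ratio) * mean
    return [[1 if p < t else 0 for p in row] for row in image]

# Toy 3x3 "eye image": a dark pupil pixel (20) surrounded by brighter sclera/skin.
eye = [[200, 200, 200],
       [200,  20, 200],
       [200, 200, 200]]
binary = adaptive_threshold(eye)
print(binary[1][1], binary[0][0])  # pupil pixel is 1, background is 0
```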

In case 2, the condition is more specific than in case 1. Here, noise appears with nearly the same size and shape as the pupil. This condition may occur when the adaptive threshold fails to separate the other eye components. Eye components such as the eyelid and eye corner may appear with nearly the same size and shape as the pupil.

In order to handle this case, we estimate the pupil based on its sequential locations. Every pupil location is recorded as history. When the method cannot decide which candidate is the true pupil, we trust that the true pupil must always be closest to the previous location:

|P(t) − P(t−1)| ≤ C    (3)

The plausible pupil location P(t) always lies within the area of radius C around the previous location P(t−1).
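The nearest-to-previous-location rule of Eq. (3) can be sketched as follows. The tuple-based candidate format and the rejection behaviour when no candidate falls within C are our assumptions:

```python
import math

def pick_pupil(candidates, prev, max_dist):
    """Select the pupil among look-alike candidates (Case 2).

    Following Eq. (3), the true pupil is taken to be the candidate closest to
    the previous location P(t-1), accepted only if it lies within the search
    area of radius `max_dist` (the constant C); otherwise detection is rejected.
    """
    best = None
    best_d = float("inf")
    for c in candidates:
        d = math.hypot(c[0] - prev[0], c[1] - prev[1])
        if d < best_d:
            best, best_d = c, d
    return best if best_d <= max_dist else None

# Previous pupil at (50, 40); an eyelid noise blob appears far away at (90, 10).
print(pick_pupil([(52, 41), (90, 10)], prev=(50, 40), max_dist=15))  # → (52, 41)
```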

The last case is case 3, which occurs when the pupil moves to the extreme end of any direction except upward. It occurs because the pupil appears smaller and then disappears. In order to handle this case, we estimate the pupil location based on its motion. We adopt a Kalman filter [10] to estimate the pupil location.
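The paper does not give the filter's parameters, so the following is a generic 1-D constant-velocity Kalman filter sketch showing how a pupil coordinate can be predicted when detection fails (Case 3); all noise values are illustrative:

```python
class Kalman1D:
    """Minimal 1-D constant-velocity Kalman filter (illustrative values, not
    the paper's exact parameters). State: [position, velocity]."""

    def __init__(self, q=1e-3, r=1.0):
        self.x = [0.0, 0.0]                  # position, velocity
        self.p = [[1.0, 0.0], [0.0, 1.0]]    # state covariance
        self.q, self.r = q, r                # process / measurement noise

    def predict(self, dt=1.0):
        # x = F x with F = [[1, dt], [0, 1]]
        self.x = [self.x[0] + dt * self.x[1], self.x[1]]
        p = self.p
        self.p = [[p[0][0] + dt * (p[1][0] + p[0][1]) + dt * dt * p[1][1] + self.q,
                   p[0][1] + dt * p[1][1]],
                  [p[1][0] + dt * p[1][1], p[1][1] + self.q]]
        return self.x[0]

    def update(self, z):
        # Measurement of position only: H = [1, 0]
        s = self.p[0][0] + self.r
        k0, k1 = self.p[0][0] / s, self.p[1][0] / s
        y = z - self.x[0]
        self.x = [self.x[0] + k0 * y, self.x[1] + k1 * y]
        p = self.p
        self.p = [[(1 - k0) * p[0][0], (1 - k0) * p[0][1]],
                  [p[1][0] - k1 * p[0][0], p[1][1] - k1 * p[0][1]]]

kf = Kalman1D()
for z in [10.0, 12.0, 14.0, 16.0]:   # pupil x-coordinate drifting right
    kf.predict()
    kf.update(z)
# When the pupil disappears, keep predicting from the learned velocity.
print(kf.predict() > 16.0)  # the filter extrapolates beyond the last measurement
```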

II.3 Eye Model

In order to convert the pupil location into gaze, a simple eye model is used. We assume that the eye moves like a sphere of radius R.

Even though the real movement is not perfectly spherical, this difference has only a small effect on our method. The pupil is assumed to lie on the front of the eyeball. When the pupil moves, its movement follows the sphere's surface.

We can model the pupil movements as shown in Fig. 7. (a) In the x direction. (b) In the y direction.

Fig. 7. Eye model: the eye is modeled as a sphere of radius R.

If the distance between the normal (straight-ahead) axis and the current pupil location is r, the relation between θx, θy, r, and R can be written as:

sin θx = rx / R    (4)
θx = arcsin(rx / R)    (5)
sin θy = ry / R    (6)
θy = arcsin(ry / R)    (7)

The final result of the gaze estimation process is (θx, θy). Although this output could also be used for other purposes, our system requires only three outputs (left, right, and down), which are calculated from the gaze angle. In order to convert the user's gaze into a wheelchair command, we apply a threshold to the gaze angle.
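Under the spherical model above, the conversion from pupil offset to gaze angle can be sketched as below. The pixel-unit eyeball radius is a free parameter, since the paper does not state its value:

```python
import math

def gaze_angles(dx_px, dy_px, r_eye_px):
    """Convert pupil displacement from the neutral position into gaze angles.

    Following the spherical eye model of Fig. 7, a pupil offset r corresponds
    to a rotation theta with sin(theta) = r / R, so theta = arcsin(r / R).
    Units are pixels for both offset and radius; r_eye_px is an assumed value,
    as the paper does not give the eyeball radius it used.
    """
    tx = math.degrees(math.asin(dx_px / r_eye_px))
    ty = math.degrees(math.asin(dy_px / r_eye_px))
    return tx, ty

tx, ty = gaze_angles(dx_px=20.0, dy_px=0.0, r_eye_px=40.0)
print(round(tx, 1))  # an offset of half the radius → 30.0 degrees horizontally
```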

When the user looks left or right beyond the threshold angle, left or right is selected. We use the same threshold for the left and right directions. For the down direction, we use a different threshold so that the user's normal view is not affected.
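A minimal sketch of this angle-threshold mapping, with illustrative threshold values (the paper does not state the exact angles):

```python
def classify_gaze(theta_x, theta_y, side_th=15.0, down_th=25.0):
    """Map gaze angles (degrees) to one of the invisible keys.

    The same threshold is used for left and right; a larger, separate threshold
    is used for down so that ordinary viewing is not misread as a command.
    Positive theta_y is taken as "down"; the numeric thresholds are assumptions.
    """
    if theta_x <= -side_th:
        return "LEFT"
    if theta_x >= side_th:
        return "RIGHT"
    if theta_y >= down_th:
        return "FORWARD"   # looking down selects "go forward"
    return "FREE"          # free area: the wheelchair stops

print(classify_gaze(-20.0, 0.0))  # → LEFT
print(classify_gaze(3.0, 30.0))   # → FORWARD
print(classify_gaze(3.0, 5.0))    # → FREE
```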

II.4 Micro-controller Circuit

After the user's gaze is estimated and the command is translated, the netbook sends the command to the wheelchair.

In order to establish communication between the netbook and the wheelchair, we replaced the wheelchair's original controller with a new one. As shown in Fig. 1, our new controller consists of a micro-controller, a buffer, and a digital-to-analog circuit. The micro-controller communicates with the netbook over a serial link. After a command is sent to the micro-controller, the micro-controller converts it into digital I/O, which is then converted to analog using a digital-to-analog converter. The analog output drives the wheelchair through its analog-to-digital converter.
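The byte-level protocol between the netbook and the AT89S51 is not specified in the paper, so the sketch below assumes a simple one-byte-per-command encoding purely for illustration:

```python
# Hypothetical one-byte command protocol; the paper does not specify the actual
# byte values exchanged over the RS-232 link, so these values are assumptions.
COMMANDS = {"STOP": b"\x00", "FORWARD": b"\x01", "LEFT": b"\x02", "RIGHT": b"\x03"}

def encode_command(name: str) -> bytes:
    """Encode a wheelchair command for the serial link to the micro-controller."""
    try:
        return COMMANDS[name]
    except KeyError:
        # Unknown commands degrade to STOP, matching the fail-safe behaviour
        # described in the paper (the wheelchair stops on any failure).
        return COMMANDS["STOP"]

# With pyserial, the netbook side would then look roughly like:
#   ser = serial.Serial("COM3", 9600); ser.write(encode_command("FORWARD"))
print(encode_command("FORWARD") == b"\x01", encode_command("???") == b"\x00")
```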


II.5 Controlling of Wheelchair

In order to control the EWC, we designed an invisible layout with three keys: go forward, turn right, and turn left. A stop key is not required, for safety reasons. No display screen is required. Users learn the locations of the keys so that a key can be selected with the eyes only.

For instance, when the user looks at the right key for 2 seconds, the EWC moves right until the user changes gaze. While the user has not changed his/her gaze, the EWC keeps moving. When the user changes gaze direction, the EWC stops automatically. This method is safer than using a stop key: with a stop key, the user would need more time to hit it, making the EWC less safe. Also for safety reasons, we did not use a backward key; it is too dangerous to move backward while the user cannot see the situation behind. Besides the mapping above, our EWC also stops when the user looks at a free area.
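The dwell-based selection described above can be sketched as a small state machine; the frame-based interface and timing constants are our assumptions:

```python
class DwellSelector:
    """Select a key once the same gaze target is held for `dwell_s` seconds.

    This mirrors the 2-second rule described above: a command fires only after
    continuous fixation, and any change of gaze target resets the timer.
    """

    def __init__(self, dwell_s=2.0):
        self.dwell_s = dwell_s
        self.target = None
        self.held_s = 0.0

    def feed(self, target, dt):
        """Feed one frame (current gaze target, elapsed seconds); return the
        selected command, or None while still dwelling."""
        if target != self.target:          # gaze moved: restart the dwell timer
            self.target, self.held_s = target, 0.0
        else:
            self.held_s += dt
        return target if self.held_s >= self.dwell_s else None

sel = DwellSelector()
out = None
for _ in range(21):                        # ~2.1 s of frames at 10 fps on "RIGHT"
    out = sel.feed("RIGHT", dt=0.1)
print(out)  # → RIGHT
```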

The flow of controlling the wheelchair is shown in Fig. 8. (States in the flow: STOP, HOLD ON/OFF, TURN LEFT, TURN RIGHT, GO FORWARD.) Fig.

8 Controlling of Wheelchair: this figure shows that every command change always passes through the stop step first. That is, on every command change, the wheelchair always stops first.

III. Experimental Results

In order to measure the performance of our prototype, several experiments were carried out in our laboratory, for each method and also for the integrated system. The experiments consist of testing pupil detection performance, gaze estimation accuracy, the influence of illumination changes, the influence of vibration, and integrated system performance.

The detailed experiments are described below.

III.1 Pupil Detection Performance

In order to test the performance of our pupil detection, we involved five different users of different ethnicities and nationalities (Indonesian, Japanese, Sri Lankan, and Vietnamese). The use of many samples demonstrates that our method works for all types of users. Eye movement data were collected from each user while making several eye movements. Eye images of three Indonesians were collected, as shown in Fig. 9. This figure shows that even though the images were taken in the same country, each person has a different ethnicity and eye shape.

Fig. 9 Collected images of three Indonesians; the top person has slanted eyes and the bottom two have wide eyes and clear pupils.

Fig. 10 Collected images of the Sri Lankan user, whose skin is dark and eyelids are thick.

Fig. 11 Collected images of the Japanese user, whose skin is bright and eyes are slanted.

Fig. 12 Collected images of the Vietnamese user.

The number of images of the Indonesian with slanted eyes is 882 samples, and the other two Indonesians, with wide eyes and clear pupils, contributed 552 and 668 samples. The data collected from the Sri Lankan user are shown in Fig. 10, with 828 samples; his skin is dark and his eyelids are thick. The data collected from the Japanese user are shown in Fig. 11, with 665 samples; his skin is bright and his eyes are slanted. The last data were collected from the Vietnamese user, as shown in Fig.

12. This experiment evaluates pupil detection accuracy and its variance across different users by counting the successful and failed samples. After counting pupil detection accuracy, our method was compared with an adaptive threshold method and a template matching method. The adaptive threshold comparison uses a combination of adaptive thresholding itself and connected-component labeling. The other comparison uses a pupil template as reference, matched against the images. The robustness of our pupil detection method against different users is shown in Table 2.

Table 2. Robustness of our pupil detection method against different users; this table shows that our method is robust against different users and has a high success rate.

User Types | Adaptive Threshold (%) | Template Matching (%) | Our Method (%)
The resulting data show that our pupil detection method has a high success rate and is robust against different users, with a variance of 16.27. Next, we measured the performance of pupil detection under the influence of illumination changes. The experiment applied an adjustable light source to the system and recorded the degradation of the success rate. We measured the illumination conditions using a multifunctional environment sensor, the LM-8000.

First, zero illumination was given to the system (dark condition). Even with no external light, our IR camera automatically adjusts the illumination, allowing the system to be used without any light. The IR camera, with seven IR LEDs and a light sensor, adjusts the illumination and keeps the resulting image stable. Unfortunately, when strong light hits the camera, pupil detection does not run well. The result of the illumination influence experiment is shown in Fig.

13.

Fig. 13. Influence of illumination changes. This figure shows that our pupil detection method works without any illumination. Strong light causes the method to fail; this condition may occur when sunlight hits the camera directly.


III.2 Influence of Vibration

The objective of this experiment is to show that mounting the camera on the user's glasses gives the remarkable advantage of reducing, and almost eliminating, vibration. The experiment recorded the vibration using a G-MEN DR10 shock recorder. With this experiment, we compare the camera placement of our system with that of other systems [5][7]. In Ref.

[5] and Ref. [7], the camera is placed on the wheelchair, as shown at point 2 in Fig. 14. Our system mounts the camera on the user's glasses, as shown at point 1 in Fig. 14.

Fig. 14. Placement of the camera: other systems put the camera at point 2, but we mount our camera on the user's glasses (point 1).

In order to test the performance of each camera placement, two shock recorders were placed at point 1 and point 2. After both shock recorders were turned on, the wheelchair was driven over a step while the vibration at point 1 and point 2 was recorded. The vibration data at point 2 are shown in Fig. 15, and the vibration data at point 1 are shown in Fig.

16. The comparison of vibration between point 1 and point 2 is shown in Fig. 17.

Fig. 15 Vibration at point 2: this figure shows that the vibration at point 2 is high.

Fig. 16 Vibration at point 1: this figure shows that the vibration at point 1 is small.

Fig. 17 Vibration reduction: this figure shows that placing the camera at point 1 yields a large reduction in vibration.

The reduction in vibration occurs because the user's body is elastic and behaves like a spring.

In order to explain the vibration reduction, we modeled the placement of the camera as shown in Fig. 18.

Fig.

18. Vibration model: this model shows that point 1 has more spring stages to reduce vibration than point 2.

If the mass of the wheelchair is m2 with spring stiffness k2, and the mass of the user is m1 with spring stiffness k1, we get the following equation:

Fs = k1 x1 + m1 g + k2 x2 + m2 g,    (8)

where g is gravitational acceleration. Equation (8) shows that there are two spring stiffnesses, k1 and k2, that absorb the vibration. If we measure vibration at point 2, only the wheelchair's own spring is involved; if we measure at point 1, both spring stiffnesses act. That is why the camera placement at point 1 is more robust against vibration.
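Equation (8) can be evaluated numerically; the masses, stiffnesses, and deflections below are made-up illustrative values, not measurements from the paper:

```python
G = 9.81  # gravitational acceleration, m/s^2

def support_force(k1, x1, m1, k2, x2, m2):
    """Total support force Fs from Eq. (8): both springs' restoring forces
    plus both weights. All argument values used below are illustrative."""
    return k1 * x1 + m1 * G + k2 * x2 + m2 * G

# Illustrative numbers: a 60 kg user on a 5 cm-compressed body "spring"
# (k1 = 2000 N/m) atop a 30 kg chair on 2 cm-compressed tires (k2 = 5000 N/m).
print(round(support_force(2000, 0.05, 60, 5000, 0.02, 30), 1))  # force in newtons
```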


III.3 Testing of Integrated System

The objective of this experiment is to examine our wheelchair as a whole while it is ridden by users. All functions (go forward, turn left, turn right, and stop) are used. Starting from the start line, the user rides the eye-only-controlled wheelchair to the finish line. The elapsed time is recorded. The road map used for this experiment is shown in Fig. 19.

Fig. 19. Road map: users used every function to complete the course, riding the wheelchair forward, turning left, and turning right from the start line to the finish line.

This experiment involved five users, including expert users (who had ridden before) and novice users (who had never ridden before).

Before the experiment, our users practiced first. We explained and taught them how to ride this wheelchair. Because this wheelchair, like other vehicles such as a motorcycle or car, requires practice before riding, each user practiced for about 10 minutes.

They turned left, turned right, and moved forward freely as they wished. Once they felt able to control the wheelchair, the experiment began. The user started riding from the start line and the stopwatch was turned on.

The user moves forward by looking down. The wheelchair moves forward as long as the user's gaze does not change. Unfortunately, users inevitably blink when their eyes become tired. Because of this blink, the wheelchair stops automatically. Although this makes the wheelchair move slowly, we chose this mapping for safety reasons. After the wheelchair stops because of a blink, the user can move forward again by looking down.

After moving forward about 3.5 meters, the user must stop and turn left. The user turns left by looking left. After the wheelchair turns left, the user moves forward about 1.9 meters and stops. The user then turns right by looking right and moves forward about 3 meters.

After the user passes the finish line, we record the time, as shown in Table 3.

Table 3. Recorded times when users rode the wheelchair from the start line to the finish line; this table shows that users can easily use our wheelchair even if they have never ridden it before.


User | Time (second) | Type of user
1 | 80 | expert
2 | 70 | expert
3 | 85 | novice
4 | 99 | novice
5 | 95 | novice

In addition, we compared the elapsed times of the eye-based and hand-based systems using the same road as in Fig.

19. When the user controlled the wheelchair by hand, the required time was 23 seconds. Compared with our results, this is much faster, about four times faster than our system. Nevertheless, we can say that an eye-based controller can be an alternative to the hand-based controller under specific conditions. After users rode our wheelchair, we held short interviews with them about how the wheelchair felt.

Almost all of them said that it was easy to control even having never driven it before. They could also use their eyes freely when HOLD mode was selected: when the user looks upward, the system freezes and ignores all eye movements. This mode is an advantage because when the user's eyes become tired, the user can rest them by selecting it; it is also helpful when the user wants to look around freely.



IV. Conclusion

A prototype electric wheelchair controlled by eye-only for paralyzed users has been successfully realized. The use of an IR camera mounted on the user's glasses gives large benefits: it allows user movement, maintains the illumination condition, and eliminates vibration. Furthermore, our pupil detection based on pupil knowledge detects the pupil almost perfectly across different users. In addition, our control method makes the wheelchair easy for the user to control.

Not only is the wheelchair easy to control; the combination of ultrasonic obstacle detection and this control method also guarantees that it is safe to ride.