Joystick

Similarly, when the joystick is deflected beyond a certain threshold (e.g., 140–180), the motors begin to move in repetitive 1000-step increments at a constant rate, regardless of further changes in the magnitude of joystick deflection.

From: Methods in Neurosciences, 1992
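The excerpt above describes the behaviour but not the implementation. A minimal C sketch of this kind of dead-band, fixed-increment control is given below; the specific threshold value, the hardware-access helpers read_joystick_deflection() and step_motor(), and the loop structure are illustrative assumptions, not the original code.

#include <stdlib.h>

#define DEFLECTION_THRESHOLD 160   /* within the 140-180 range cited above */
#define STEP_INCREMENT       1000  /* fixed block of motor steps per move  */

/* Hypothetical hardware-access helpers; the names are placeholders. */
extern int  read_joystick_deflection(void);  /* current deflection value     */
extern void step_motor(int steps);           /* issue a block of motor steps */

/* Issue repeated 1000-step moves for as long as the joystick stays past
 * the threshold.  How far past the threshold the stick is deflected is
 * deliberately ignored, matching the constant-rate behaviour described. */
void run_motor_control_loop(void)
{
    for (;;) {
        if (abs(read_joystick_deflection()) > DEFLECTION_THRESHOLD)
            step_motor(STEP_INCREMENT);      /* constant size, constant rate */
        /* below threshold: motors remain idle */
    }
}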

Genuine Imitation?

CECILIA M. HEYES, in Social Learning in Animals, 1996

Local or Stimulus Enhancement

Observation of the demonstrator pushing the joystick may have drawn the observer's attention to the joystick and thereby resulted in the observers approaching and pushing the joystick sooner than they would have done if they had not seen the demonstrators. However, this kind of local or stimulus enhancement process is not sufficient to explain why the observers pushed the joystick in the same direction as the demonstrators.

Byrne and Tomasello (1995) have suggested that the rats may have learned during observation that the joystick should be moved toward a particular part of the chamber, for example, toward location L1 in Fig. 4, and that in the perpendicular test condition (Heyes et al., 1992) the rats thought they were moving it to L1 when they were in fact pushing it the other way, toward L2. Although it was presented as a local enhancement explanation of the bidirectional control effect, this hypothesis suggests that observational conditioning (see below) is responsible for the effect. Regardless of the label we assign to it, the hypothesis is implausible. As Byrne and Tomasello (in press) pointed out themselves, rats generally have an “excellent sense of space,” and, even if they were confused, the hypothesis does not explain why any error that occurred should have been so systematic (Heyes, 1995).

Read full chapter
URL: https://www.sciencedirect.com/science/article/pii/B9780122739651500182

Computers and Computations in the Neurosciences

Warren G. Tourtellotte, in Methods in Neurosciences, 1992

Joystick Programming

Two functions have been written to demonstrate reading the joystick position (Fig. 4; Function JOYSTICK) and the status of its buttons (Fig. 4; Function BUTTON). The JOYSTICK function has been written in machine language to increase execution speed. It requires one parameter (Axis), which specifies the axis of the joystick position to calculate. An output instruction to I/O port 303h starts the quad timer (Fig. 2; NE558). The bit corresponding to the requested axis (Fig. 3; I/O port 303h bit 0 for the X axis and bit 1 for the Y axis) is read continuously while incrementing a counter until a reset of the timer is detected. The value of the counter when the time-out is detected represents a numerical index of the absolute joystick position for the requested axis.
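The original routine is in machine language, but the same polling technique can be sketched in C. The fragment below assumes Linux-style inb()/outb() port helpers from <sys/io.h> (which require ioperm(0x303, 1, 1) and suitable privileges); the function name and the assumption that any write starts the timer are illustrative.

#include <sys/io.h>   /* inb()/outb(); call ioperm(0x303, 1, 1) first */

#define JOYSTICK_PORT 0x303   /* game-adapter I/O port cited in the text */

/* Read one joystick axis by timing the NE558 one-shot: start the timer
 * with a write, then count how long the axis bit stays set.
 * axis: 0 for X (bit 0), 1 for Y (bit 1).
 * The returned count is a numerical index of the axis position. */
unsigned long read_joystick_axis(int axis)
{
    unsigned char mask = (unsigned char)(1u << axis);
    unsigned long count = 0;

    outb(0, JOYSTICK_PORT);            /* any write starts the quad timer */
    while (inb(JOYSTICK_PORT) & mask)  /* poll until the axis bit resets  */
        count++;

    return count;
}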

The joystick button status can be read easily by using the BUTTON function. A simple read instruction from I/O port 303h bit 6 (button 1) and bit 7 (button 2) indicates whether either button is currently depressed.
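A corresponding C sketch of the button read, under the same port-access assumptions, is shown below. The text gives the bit positions (bit 6 for button 1, bit 7 for button 2) but not the active level, so treating a set bit as "pressed" is an assumption.

#include <sys/io.h>

#define JOYSTICK_PORT 0x303

/* Return nonzero if the requested button (1 or 2) is currently pressed.
 * Bit 6 reports button 1 and bit 7 reports button 2; the active level is
 * hardware dependent, so the polarity below is an assumption. */
int joystick_button_pressed(int button)
{
    unsigned char status = inb(JOYSTICK_PORT);
    unsigned char mask   = (button == 1) ? 0x40 : 0x80;  /* bit 6 or bit 7 */
    return (status & mask) != 0;
}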

Read full chapter
URL: https://www.sciencedirect.com/science/article/pii/B9780121852696500072

A Powered Wheelchair Controlled by EMG Signals from Neck Muscles

Hiroaki Seki, ... Mitsuyoshi Maekawa, in Human Friendly Mechatronics, 2001

1 INTRODUCTION

Powered wheelchairs are used by people whose disabilities prevent them from walking and from propelling the wheels of a manual wheelchair. Such wheelchairs are usually controlled with a joystick, which the user tilts to set the desired direction and speed. However, many disabled users cannot handle a joystick or hold it stably in a tilted position. These users operate special joysticks with the tongue or chin, or rely on voice commands [1-3]; in such cases it is very difficult to control the wheelchair smoothly, and its operation imposes a large burden. Therefore, we propose to utilize electromyogram (EMG) signals from the operator's neck or face muscles as a wheelchair interface for disabled people who cannot use normal joysticks. Since motions of the operator's head or face can be estimated from his/her EMG signals, the user can control the wheelchair through intended motions of the head or face. This EMG interface has the following characteristics [4, 5].

EMG signals are relatively easy to obtain compared with other bioelectric signals.

Although head or face motions can also be detected with an accelerometer, a gyroscope, and similar sensors, these sensors measure not only the essential motion but also wheelchair vibration. EMG signals, on the other hand, are generated only when the user activates his/her muscles.

The system therefore needs a function to discriminate whether these signals (motions) reflect the user's intention to operate the wheelchair or not.

The EMG interface can potentially be applied to more severe physical handicaps.

The problems are how to estimate motions from EMG signals and how to control the wheelchair from those motions. In this paper, a powered wheelchair controlled by an EMG interface is developed and tested.

Read full chapter
URL: https://www.sciencedirect.com/science/article/pii/B9780444506498500164

Using Eye Movements as an Experimental Probe of Brain Function

Qi N. Cui, ... Gary D. Paige, in Progress in Brain Research, 2008

Experimental paradigm and response measures

Subjects manually localized stationary auditory targets using a red laser-LED pointer mounted on a 2-axis cylindrical joystick, aligning its beam with the perceived sound locations. For each target presentation, subjects registered response endpoint with a key press, and the target and pointer positions were recorded. Subjects were instructed to localize quickly but accurately. Auditory localization was studied before, during, and after 4 h of adaptation to either base-right (R) or base-left (L) prisms (20 prism-diopters) that induced an 11° shift L or R, respectively. For each session, a normal baseline of sound localization was first established without prisms or other devices. Subjects then donned the prisms for 4 h, during which time they engaged in normal active behaviour in and around the University of Rochester Medical Center (noise level ≤ 90 dB SPL), and returned to the laboratory for repeat testing after 1 and 4 h. Testing occurred without prisms or other devices (always removed or restored while subjects were on the bite-bar with eyes closed). In separate sessions, subjects were exposed to: (1) natural binaural hearing; (2) diotic hearing (portable Lavalier microphone-amplifier presented the same monaural signal to both ears through earphones); and (3) sound-attenuation (44±5 dB SPL) using earmuffs and earplugs (near-deafness). Twenty-six randomly distributed target locations were tested (repeated measures design; Fig. 1a). To complete all three acoustic conditions under both prism conditions (L and R), six sessions on different days at least 2 days apart were required per subject.

Fig. 1. (a) Auditory target distribution. (b) Mean change in Az localization accuracy (shift magnitude) for both diotic and binaural hearing conditions following 1 and 4 h of prism adaptation (R and L combined), binned in 10° intervals of target Az. (c) Average shift magnitude across subjects, pooled for all target locations (error bars are SDs).

To effectively separate eye position effects from those of cross-sensory interaction, it is important to control for the availability of auditory localization cues. Control sessions examined the efficacy of the lavalier microphone–amplifier (diotic condition) and the earplugs–earmuffs (near-deafness condition) in eliminating effective auditory spatial cues. Linear regression was performed to quantify spatial gain (slope) of response relative to target positions across horizontal space. As would be expected following the elimination of useable cues, spatial gains in Az fell to near zero for both diotic (0.05±0.07) and near-deafness (0.03±0.21) conditions, in contrast to a normal binaural baseline of 1.19 (±0.12, or a 19% overshoot of target positions).
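The spatial gain reported here is simply the least-squares slope of response azimuth regressed on target azimuth. A small, self-contained C sketch of that computation follows; the data in main() are illustrative only (chosen to overshoot the targets by roughly 20%, as in the binaural baseline).

#include <stdio.h>

/* Least-squares slope of response azimuth vs. target azimuth.
 * A gain near 1 means responses track the targets; a gain near 0 means
 * the usable spatial cues have been effectively eliminated. */
double spatial_gain(const double *target_az, const double *response_az, int n)
{
    double sum_t = 0.0, sum_r = 0.0, sum_tt = 0.0, sum_tr = 0.0;
    for (int i = 0; i < n; i++) {
        sum_t  += target_az[i];
        sum_r  += response_az[i];
        sum_tt += target_az[i] * target_az[i];
        sum_tr += target_az[i] * response_az[i];
    }
    return (n * sum_tr - sum_t * sum_r) / (n * sum_tt - sum_t * sum_t);
}

int main(void)
{
    /* Illustrative data only: responses overshoot targets by about 20%. */
    double target[]   = { -40.0, -20.0, 0.0, 20.0, 40.0 };
    double response[] = { -48.0, -24.0, 1.0, 23.0, 49.0 };
    printf("spatial gain = %.2f\n", spatial_gain(target, response, 5));
    return 0;
}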

Read full chapter
URL: https://www.sciencedirect.com/science/article/pii/S0079612308006377

Social Cognitive Neuroscience, Cognitive Neuroscience, Clinical Brain Mapping

D.J. Brooks, in Brain Mapping, 2015

Motor Activation in Dystonia

Cerebral activation studies in idiopathic torsion dystonia have suggested an imbalance between sensorimotor and premotor cortex functions. If dystonia patients perform paced joystick movements with their right hands in freely selected directions, they show significantly increased levels of contralateral putamen, rostral supplementary motor area, lateral premotor cortex, and dorsolateral prefrontal area activation (Ceballos-Baumann et al., 1995). In contrast, activation of contralateral sensorimotor cortex and caudal SMA is impaired; these are the motor cortical areas that send direct pyramidal tract projections to the spinal cord. Attenuation of sensorimotor cortex and caudal SMA activation in generalized and focal dystonia during vibrotactile stimulation have also been reported (Tempel & Perlmutter, 1990, 1993). Abnormal overactivation of premotor and parietal areas and the cerebellum has been demonstrated during sequence learning in nonmanifesting DYT1 carriers despite their impaired performance on the task (Carbon et al., 2008).

Read full chapter
URL: https://www.sciencedirect.com/science/article/pii/B9780123970251000865

The System Perspective on Human Factors in Aviation

Thomas B. Sheridan, in Human Factors in Aviation (Second Edition), 2010

The Human Transfer Function

During the 1950s the U.S. Air Force sponsored a major effort to determine the transfer function of the human operator (the pilot), that is, the differential equation relating the output (joystick position) to the error input (the discrepancy in pitch, roll, or yaw relative to a given reference setting on the attitude indicator). The reason was that the dynamic equations of the aircraft itself were known, but because the pilot was part of the control loop, predicting system stability required that an equation for the pilot be included in the overall system analysis. This led to a number of real-human-in-the-loop simulation experiments in target tracking with various unpredictable ("noise") disturbances and controlled-element (aircraft) dynamics. McRuer and Jex (1967) were able to generalize this work with a precedent-breaking idea: whatever the controlled-element dynamics (within a reasonable range), the human operator adjusts his own transfer function so that the resulting forward-loop dynamics, the combination (Fc Fp), approximate a single integrator plus a time delay (the human reaction time of about 0.2 seconds, which is inherent in the human neuromuscular system). This has the effect of making the closed-loop system (goal variable G to planned output O1) behave as a "good servo": stable, with minimal overshoot in response to sudden transient inputs G (or disturbances D). Historically, this is one of the rare achievements in human factors in which an equation predicts human behavior all the way from sensory input to motor output. (There are many mathematical models of visual and hearing sensitivity, muscle strength, memory, etc., but they describe only components of behavior.)
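In transfer-function form, the McRuer and Jex crossover model described above can be written as follows, with Fp the pilot's describing function, Fc the controlled-element (aircraft) dynamics, ωc the crossover frequency, and τ the effective time delay of roughly 0.2 s mentioned in the text:

\[
F_p(s)\,F_c(s) \;\approx\; \frac{\omega_c}{s}\, e^{-\tau s}, \qquad \tau \approx 0.2\ \text{s}.
\]

Because the combined open-loop dynamics reduce to an integrator with a delay regardless of Fc (within a reasonable range), the closed loop behaves as the "good servo" described above.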

Read full chapter
URL: https://www.sciencedirect.com/science/article/pii/B978012374518700002X

Cerebral Lateralization and Cognition: Evolutionary and Developmental Investigations of Behavioral Biases

William D. Hopkins, in Progress in Brain Research, 2018

3.5 Joystick Manipulation

Beginning in the late 1980s, automated test systems were developed for assessing cognitive functions and motor skills in different primate species. One such automated system uses a joystick that controls the movements of a cursor projected on a computer screen (Hopkins, 1991; Rumbaugh et al., 1989; Washburn and Rumbaugh, 1992). Isomorphic movements of the cursor are controlled by the subject's manipulation of the joystick, and most subjects develop a consistent hand preference. Though the number of subjects tested on these systems has been relatively small, there are some reports of data on hand use. Hopkins et al. (1989) reported that two rhesus monkeys and three chimpanzees all preferred their right hand for manipulating the joystick and, moreover, we found that all five subjects performed a psychomotor task better with their right hand than with their left. In a follow-up study, Hopkins et al. (1992) examined hand preference and the acquisition of two psychomotor tasks (SIDE and CHASE) involving manipulation of a joystick in a sample of 35 rhesus monkeys (all males except one female). There were two phases to the study. In Phase I, hand preference in joystick use was quantified on each trial in all 35 monkeys; 21 were found to prefer the right hand and 14 the left hand for manipulating the joystick. In Phase II, acquisition of the two psychomotor tasks was compared in 18 monkeys that were trained using the exact same training criteria. Hopkins et al. then compared the acquisition data between those that preferred the right hand and those that preferred the left, and found that right-handed individuals reached the asymptotic training criterion performance on both the SIDE and CHASE tasks in significantly fewer trials than left-handed individuals. Using a slightly different method, Andrews and Rosenblum (2001) similarly measured the number of trials needed to reach criterion on the joystick manipulation task in eight bonnet macaques living in social groups. Microchips were placed in the left and right wrists of the monkeys, and when a monkey manipulated the joystick with one hand or the other, a computer recorded the hand used and the performance on that specific trial. Across increasing levels of visuomotor difficulty, a right-hand preference for joystick manipulation emerged in six monkeys and a left-hand bias in two monkeys. When the joystick manipulation hand preference data from the macaques in the Hopkins et al. (1992) and Andrews and Rosenblum (2001) studies are combined, a significantly higher proportion of right- compared to left-handed monkeys is found (z = 2.25, P < 0.05).

Fagot and Vauclair (1993) trained eight baboons on a similar type of joystick testing apparatus, though their system used a digital rather than an analog interface between the movements of the joystick and the cursor on the screen. Fagot and Vauclair (1993) initially trained four monkeys on the device with their left hand and four with their right hand. After the training criteria were reached with the initial hand, transfer of learning was tested with the opposite hand in each group. For the initial training, the number of trials needed to reach criterion did not differ significantly between the baboons initially trained on the right hand (mean = 1014.5, SD = 261.5) and those trained on the left (mean = 1099.5, SD = 298.5). When testing for transfer of learning, baboons tested on their right hand (after being trained on the left) needed fewer trials to reach criterion (mean = 310.2, SD = 84) than baboons tested on their left hand after initially being trained with the right (mean = 633.2, SD = 173.4), though the difference was not significant. Fagot and Vauclair (1993) also tested for differences in latency and found no significant differences, but did find that the baboons used a shorter path between the cursor and the target on the computer screen when performing with their left hand (mean = 431.5, SD = 40.1) than with their right (mean = 623.3, SD = 107.4).

Read full chapter
URL: https://www.sciencedirect.com/science/article/pii/S0079612318300578

Functional Neurologic Disorders

Q. Deeley, in Handbook of Clinical Neurology, 2016

Nonepileptic seizures, involuntary movement, and loss of awareness

While suggested convulsions cannot be safely or informatively produced in an fMRI scanner, it is possible to model nonepileptic seizures by suggesting involuntary movements with and without loss of awareness. Suggested simple involuntary actions (joystick movement) were associated with altered functional connectivity between motor-planning brain regions (supplementary motor area, SMA) and regions involved in movement execution (e.g., premotor areas, M1, S1) (Deeley et al., 2013b). Reduced awareness of hand movement was associated with decreased activity in brain areas involved in bodily awareness (BA 7) and sensation (insula), suggesting a mechanism for the loss or narrowing of awareness reported in about half of patients with nonepileptic seizures (Brown et al., 2011), as well as other forms of dissociation.

Read full chapter
URL: https://www.sciencedirect.com/science/article/pii/B9780128017722000096

Guidelines for Parents

SUSAN N. SCHRIBER ORLOFF OTR/L, in Learning Re-enabled (Second Edition), 2004

“Why is my child a wiz on the computer but can't do work in class? Why not just give him a laptop to work with instead of paper and pencil?”

Computers are useful tools after the child has attained adequate in-hand manipulation skills. These skills can only be acquired by using his own hands, and putting his forefinger on a mouse or moving a joystick or an adapted keyboard cannot replace the agility learned through old-fashioned play: jacks, finger painting, hand looms, lacing crafts, etc. Computers reinforce straight-ahead, focal-vision play and not peripheral vision (awareness of surrounding side stimuli). A child who does not develop peripheral vision at a young age may have difficulty picking up on visual cues in the classroom (i.e., blackboard work, following sequences, and visual tracking).

Read full chapter
URL: https://www.sciencedirect.com/science/article/pii/B9780323027724500111

Computers and Creativity

M.R. Sarsani, in Encyclopedia of Creativity (Second Edition), 2011

The Computer – Its Functions

A computer is a device that computes, especially a programmable electronic machine that performs high-speed mathematical or logical operations or that assembles, stores, correlates, or otherwise processes information. A computer can perform a variety of tasks easily. Working under a set of instructions, it automatically accepts supplied data, processes and analyses the data, and produces information. It can solve complex problems quickly and accurately if proper instructions (programs) and data are provided. Such functions are possible because of its exceptional characteristics, such as high speed, large memory and data-storage capacity, accuracy, reliability, endurance, versatility, automation, and diligence.

Computer technology includes four basic functions: input, storage, control, and output.

1.

Input. Input entails entering information or data into the computer, for example, via a keyboard, mouse, scanner, joystick, touch screen, barcode reader, or optical/magnetic character recognizer (OCR).

2.

Storage. Once information is inputted, it is stored for eventual use on hard disks, floppy disks, compact disks (CDs), flash drives, etc.

3.

Control. Control of stored information, as well as of new input, is achieved through programs written in one of several possible computer languages that are translated by the computer's controller into the computer's assembly language.

4.

Output. The output or retrieval process transfers the processed information or data from the computer to the researcher, using one of a number of devices to communicate the results. The output may be displayed on a monitor, printed on paper, or recorded on disks.

In 1998, UNESCO identified 15 special properties of computers that enhance student learning processes along a number of orthogonal dimensions. Dwight Egbert et al. believed that using computer vision research results could provide a high level of motivation for students, and that computer vision is also an excellent learning tool for teaching students to integrate and use their acquired knowledge.

Read full chapter
URL: https://www.sciencedirect.com/science/article/pii/B9780123750389000418