Perception and Society: The Autism Emulation Project
Keywords: Autism, Autism Spectrum Disorders (ASD), mediated reality devices, sensory experience emulation.
Overview of Research
Autism is an umbrella term for a complex spectrum of neurological conditions that may affect, among many other neural processes, how the brain processes sensory information. These atypical sensory processing patterns usually translate, to varying degrees, into atypical perception schemes in people with autism. The advantages of these altered perception schemes may include, for instance, a heightened sensitivity to visual and auditory stimuli, which can give rise to extraordinary artistic ability. However, sensory overload may also lead to social isolation, repetitive and insistent behaviour, and difficulties with verbal and non-verbal communication. Understanding the perception schemes of people with autism may help us create tools, places and therapies that take advantage of the benefits of these perception schemes while reducing their less advantageous consequences.
We performed an exhaustive literature review, considered previously reported case studies and first-hand accounts, and conducted a number of semi-structured interviews in order to inform the design of a device that can recreate a subset of the autism sensory experience for neuro-typical (i.e. non-autistic) individuals. We focused our design on emulating particular auditory and visual perception schemes. The result is a prototype emulator built from a novel combination of available audio/visual technologies and real-time signal- and image-processing algorithms.
The autism emulation device is a mediated-reality interface between the environment and the user that alters the user's visual and auditory experience according to information extracted from personal accounts by people with autism.
Figure 1: Autism emulation device hardware
The hardware can be divided into equipment worn by the user and a remote station where video and signal processing occurs (Figure 1). The worn equipment captures the video and audio that the user would normally see and hear without the device: a colour board camera and two sub-mini omnidirectional lavalier microphones. The microphones are connected to phantom power and to transformers that convert their high-impedance unbalanced outputs into low-impedance balanced signals. The signals are sent via a 2.4 GHz transmitter worn by the user to its complementary receiver at the remote station, where the analog video signal is digitized into a computer. After processing, the video is output through a DVI-D port and converted back into an analog video signal before being sent to a second 2.4 GHz transmitter (set to a different channel) on the remote side. The audio signals from the microphone pair are, after pre-amplification, acquired on the computer through an external soundcard. Processed audio signals are sent from the computer, through the same soundcard, to the remote-side transmitter for wireless transmission back to the user.
The video signal is displayed on eyewear embedded in lightweight, light-occluding foam (Figure 2). The audio signal is played to the user through noise-blocking canal phones.
Figure 2: Eyewear of the autism emulation device
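Conceptually, the remote station runs a simple acquire-transform-return loop for each stream. The following is a minimal sketch of that loop with synthetic stand-ins for the capture and display hardware; all function names, frame sizes and the placeholder transformation are illustrative assumptions, not the project's code:

```python
import numpy as np

def capture_frame(t):
    """Stand-in for the video capture hardware: a synthetic 8-bit greyscale frame."""
    return np.full((120, 160), (t * 10) % 256, dtype=np.uint8)

def transform(frame):
    """Placeholder for the visual transformations of Table 1 (here: simple dimming)."""
    return (frame // 2).astype(np.uint8)

def run(n_frames):
    """Acquire, process and collect frames, mimicking the remote-station loop."""
    return [transform(capture_frame(t)) for t in range(n_frames)]
```

In the actual device the processed frames would be sent back over the second wireless link rather than collected in memory.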
Table 1 and Table 2 describe the visual and auditory transformation algorithms implemented, respectively. A computer-based software program with a graphical user interface allows the parameters governing these algorithms to be adjusted.
| Autism-related condition | Description | Implemented algorithm |
| --- | --- | --- |
| Reduced Gestalt (Figure 3.B) | Gestalt is the ability to self-organize the components of a scene into a cohesive whole. | Frames are divided into a grid and the cells are randomly rearranged. |
| Hypersensitivity to bright highlights (Figure 3.C) | Hypersensitivity to brightness is frequently described by individuals with autism, who can also be drawn to brightly coloured patterns. | Areas of a frame that pass a brightness threshold are made brighter while the remaining areas are made duller, effectively highlighting the bright areas of each frame. |
| Visual shutdown (Figure 3.D) | In response to overwhelming visual stimuli, the amount of visual information being processed can drop significantly as a coping mechanism. | A Gaussian pattern narrows the visual field in response to the amount of information that must be processed, estimated by quantifying the change between consecutive frames. |
Figure 3: Sample visual transformations implemented with the autism emulation device. Figure 3.A is the original frame. Table 1 includes details on the implementations.
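The three visual transformations in Table 1 can be sketched with NumPy operating on greyscale frames. This is an illustrative reconstruction from the table's descriptions, not the project's actual code; the function names, grid size, brightness threshold, gains and sigma values are all assumptions:

```python
import numpy as np

def reduce_gestalt(frame, grid=4, rng=None):
    """Reduced Gestalt: split the frame into a grid x grid lattice and
    randomly rearrange the cells (assumes dimensions divisible by grid)."""
    rng = np.random.default_rng(rng)
    h, w = frame.shape
    ch, cw = h // grid, w // grid
    cells = [frame[r*ch:(r+1)*ch, c*cw:(c+1)*cw].copy()
             for r in range(grid) for c in range(grid)]
    out = np.empty_like(frame)
    for dst, src in enumerate(rng.permutation(len(cells))):
        r, c = divmod(dst, grid)
        out[r*ch:(r+1)*ch, c*cw:(c+1)*cw] = cells[src]
    return out

def highlight_brights(frame, threshold=200, boost=1.3, dull=0.6):
    """Hypersensitivity to bright highlights: brighten pixels above the
    threshold and dull the rest (8-bit input)."""
    f = frame.astype(np.float32)
    f = np.where(f >= threshold, f * boost, f * dull)
    return np.clip(f, 0, 255).astype(np.uint8)

def visual_shutdown(frame, prev, full_sigma=1.0, min_sigma=0.2):
    """Visual shutdown: apply a Gaussian vignette whose width shrinks as the
    change between consecutive frames (the information estimate) grows."""
    diff = np.abs(frame.astype(np.float32) - prev.astype(np.float32))
    change = float(np.mean(diff)) / 255.0          # 0 = static, 1 = maximal change
    sigma = max(min_sigma, full_sigma * (1.0 - change))
    h, w = frame.shape
    y = np.linspace(-1, 1, h)[:, None]
    x = np.linspace(-1, 1, w)[None, :]
    mask = np.exp(-(x**2 + y**2) / (2 * sigma**2))  # 1 at centre, falls off outward
    return np.clip(frame * mask, 0, 255).astype(np.uint8)
```

In the device these run continuously on the live video stream; here they act on single arrays for clarity.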
| Autism-related condition | Description | Implemented algorithm |
| --- | --- | --- |
| Heightened frequency sensitivity | This condition is characterized by hypersensitivity to certain frequencies. | Frequencies in the sensitive range are bandpass filtered and amplified. |
| Spatial disorientation | Reduced ability to judge spatial orientation, which often affects motor coordination. | The sense of direction is altered by combining signals from the left and right microphones in different proportions for one earphone, and in opposite proportions for the other. |
| Voice distortion | Reduced ability to distinguish between voice and background noise. | The frequency range where voice usually resides is bandpass filtered and equalized to reduce intelligibility. |
| Audio shutdown | In response to overwhelming auditory stimuli, the amount of auditory information being processed can drop significantly as a coping mechanism. | The volume of the audio signal is decreased and a lowpass filter is applied to produce a muffled effect. A sudden increase in the amplitude (i.e. volume) of the audio signal triggers this transformation. |
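The auditory transformations in Table 2 can likewise be sketched as array operations on sampled audio. Again, this is an illustrative reconstruction from the descriptions rather than the project's real-time DSP code; the band edges, mixing ratio, gains and filter coefficient below are assumptions:

```python
import numpy as np

def boost_band(signal, fs, lo, hi, gain=4.0):
    """Heightened frequency sensitivity: amplify an FFT band (lo-hi Hz)."""
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spec[(freqs >= lo) & (freqs <= hi)] *= gain
    return np.fft.irfft(spec, n=len(signal))

def disorient(left, right, mix=0.7):
    """Spatial disorientation: cross-mix the microphone pair so each ear
    receives the opposite left/right proportion."""
    out_l = mix * right + (1.0 - mix) * left
    out_r = mix * left + (1.0 - mix) * right
    return out_l, out_r

def audio_shutdown(signal, gain=0.3, alpha=0.1):
    """Audio shutdown: attenuate and lowpass-filter for a muffled effect.
    In the device this is triggered by a sudden jump in volume."""
    out = np.empty(len(signal))
    y = 0.0
    for i, x in enumerate(signal):
        y = alpha * float(x) + (1.0 - alpha) * y   # one-pole lowpass
        out[i] = gain * y
    return out
```

Voice distortion (the remaining row of Table 2) would follow the same pattern as `boost_band` but attenuate and equalize the voice band (roughly 300-3400 Hz) instead of amplifying it.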
- Poster: Chung J., Silva J. and Chau T., “A Mediated-Reality Device Based on the Autistic Sensory Experience,” Engineering Science Research Day, University of Toronto, ON, Canada, August 21, 2006. Winner of the Best Poster Award.
- Invited Lecture: Silva J., “The Autism Emulation Project and the New Social Studies of Childhood and Disability,” Child Life Program, McMaster Children’s Hospital, Hamilton, ON, Canada, March 10, 2006.
Health Care Technology and Place (HCTP - U of T)
Jorge Silva, Ph.D. candidate (IBBME/HCTP, University of Toronto)
Jennifer Chung (Engineering Science, University of Toronto)
Ceilidh Eaton Russell, CCLS, CLSt. Dip. (McMaster University)
Alex Mihailidis, Ph.D. P.Eng. (University of Toronto)
Tom Chau, Ph.D. P.Eng. (CRC in Paediatric Rehabilitation Engineering)
Pascal Lehoux (CRC on Innovation in Health)
Hilde Zitselberger (Nursing, University of Toronto)