Perception and Society: The Autism Emulation Project

Keywords: Autism, Autism Spectrum Disorders (ASD), mediated reality devices, sensory experience emulation.

In collaboration with: Bloorview Kids Rehab, PRISM Lab, the Geneva Centre for Autism, and Autism Society Ontario.


Overview of Research

Autism is an umbrella term for a complex spectrum of neurological conditions that may affect, among many other neural processes, how the brain processes sensory information. These atypical sensory information processing patterns usually translate, to varying degrees, into atypical perception schemes in people with autism. The advantages of these altered perception schemes may include, for instance, a heightened sensitivity to visual and auditory stimuli, which may result in extraordinary artistic ability. However, social isolation, repetitive and insistent behaviour, and verbal and non-verbal communication difficulties may also result when there is sensory overload. Understanding the perception schemes of people with autism may help us create tools, places, and therapies that take advantage of the benefits of these perception schemes while reducing their less advantageous consequences.

We performed an exhaustive literature review, considered previously reported case studies and first-hand accounts, and completed a number of semi-structured interviews to inform the design of a device that may be used to recreate a subset of the autism sensory experience for neurotypical (i.e. non-autistic) individuals. We focused our design on the emulation of particular auditory and visual perception schemes. The result is a prototype emulator built through a novel combination of available audio/visual technologies and real-time signal and image processing algorithms.

The autism emulation device is a mediated-reality interface between the environment and the user that alters the user's visual and auditory experience according to information extracted from personal accounts by people with autism.

Hardware

Figure 1: Autism emulation device hardware

The hardware can be divided into equipment worn by the user and equipment remote from the user, where the video and signal processing occurs (Figure 1). The equipment worn by the user captures video and audio of what the user would normally see and hear without the device; specifically, a colour board camera and two sub-miniature omnidirectional lavalier microphones. The microphones are connected to phantom power and to transformers that convert their high-impedance unbalanced outputs into low-impedance balanced signals. The signals are sent via a 2.4 GHz transmitter worn by the user to its complementary receiver at the remote station. At the remote station, the analog video signal is digitized and acquired by a computer. After processing, the video is output through a DVI-D port and converted back into an analog signal before it is sent to another 2.4 GHz transmitter (set to a different channel) on the remote side. Similarly, after pre-amplification, the audio signals from the microphone pair are acquired on the computer through an external soundcard. Processed audio signals are sent from the computer, through the same soundcard, to the remote-side transmitter for wireless transmission back to the user.

The video signal is displayed on eyewear embedded in lightweight, light-occluding foam (Figure 2). The audio signal is delivered to the user through noise-blocking canal phones.

Figure 2: Eyewear of the autism emulation device

Software

Table 1 and Table 2 describe the visual and auditory transformation algorithms implemented, respectively. A computer-based software program and graphical user interface allow the parameters governing these algorithms to be adjusted. Illustrative code sketches of the transformations follow each table.
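As a rough sketch of how such adjustable parameters might be organized, the hypothetical structure below groups plausible settings for the transformations listed in Tables 1 and 2. Every name and default value here is an assumption for illustration and is not taken from the project's software.

```python
# Hypothetical parameter container (illustrative only, not the project's code).
from dataclasses import dataclass

@dataclass
class EmulationParameters:
    # Visual transformations (Table 1)
    gestalt_grid_size: int = 8            # grid cells per side for the Gestalt shuffle
    highlight_threshold: int = 200        # brightness cutoff (0-255) for highlight emphasis
    shutdown_sensitivity: float = 0.5     # how strongly frame-to-frame change narrows the view
    # Auditory transformations (Table 2)
    sensitive_band_hz: tuple = (2000.0, 4000.0)   # band amplified for frequency sensitivity
    band_gain: float = 4.0                # amplification applied inside the sensitive band
    voice_band_hz: tuple = (300.0, 3400.0)        # band flattened for voice distortion
    shutdown_trigger_db: float = 10.0     # sudden level jump (dB) that triggers audio shutdown
```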

Table 1: Visual Transformation Algorithms

Reduced Gestalt (Figure 3.B)
Description: Gestalt is the ability to self-organize the components of a scene into a cohesive whole.
Implemented algorithm: Frames are divided into a grid and the grid cells are randomly redistributed.

Hypersensitivity to bright highlights (Figure 3.C)
Description: Hypersensitivity to brightness is frequently described in individuals with autism, who can also be drawn to brightly coloured patterns.
Implemented algorithm: Areas of a frame that exceed a brightness threshold are made brighter while the remaining areas are made duller, effectively highlighting the bright areas of each frame.

Visual shutdown (Figure 3.D)
Description: In response to overwhelming visual stimuli, the amount of visual information being processed can drop significantly as a coping mechanism.
Implemented algorithm: A Gaussian pattern is used to narrow the visual field in response to the amount of information that must be processed, which is estimated by quantifying the change between consecutive frames.

 

Figure 3: Sample visual transformations implemented with the autism emulation device. Figure 3.A is the original frame; Table 1 includes details on the implementations.
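The following is a minimal Python/NumPy sketch of how the three visual transformations in Table 1 could be implemented. The grid size, brightness threshold, gain factors, and the way frame-to-frame change narrows the Gaussian window are all illustrative assumptions, not the parameters of the actual device.

```python
import numpy as np

def reduced_gestalt(frame: np.ndarray, grid: int = 8) -> np.ndarray:
    """Divide the frame into a grid and randomly redistribute the cells."""
    h, w = frame.shape[:2]
    ch, cw = h // grid, w // grid
    out = frame.copy()
    order = np.random.permutation(grid * grid)       # random source cell for each destination
    for idx, src in enumerate(order):
        sr, sc = divmod(int(src), grid)              # source cell coordinates
        dr, dc = divmod(idx, grid)                   # destination cell coordinates
        out[dr*ch:(dr+1)*ch, dc*cw:(dc+1)*cw] = frame[sr*ch:(sr+1)*ch, sc*cw:(sc+1)*cw]
    return out

def highlight_brightness(frame: np.ndarray, threshold: int = 200) -> np.ndarray:
    """Brighten areas above a brightness threshold and dull the rest."""
    luma = frame.mean(axis=2) if frame.ndim == 3 else frame.astype(float)
    bright = luma >= threshold
    out = frame.astype(float)
    out[bright] = np.clip(out[bright] * 1.5, 0, 255)   # emphasize highlights
    out[~bright] = out[~bright] * 0.5                  # dull everything else
    return out.astype(np.uint8)

def visual_shutdown(frame: np.ndarray, prev: np.ndarray, sensitivity: float = 0.5) -> np.ndarray:
    """Narrow the visual field with a Gaussian window as frame-to-frame change grows."""
    h, w = frame.shape[:2]
    change = np.abs(frame.astype(float) - prev.astype(float)).mean() / 255.0
    width = max(0.05, 1.0 - sensitivity * change * 10.0)   # more change -> narrower field
    y, x = np.ogrid[:h, :w]
    mask = np.exp(-(((x - w / 2) / (width * w)) ** 2 + ((y - h / 2) / (width * h)) ** 2))
    if frame.ndim == 3:
        mask = mask[..., None]
    return (frame * mask).astype(np.uint8)
```

Each function takes and returns a NumPy image array, so the transformations can be chained frame by frame inside the capture-and-display loop described above.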

Table 2: Auditory Transformation Algorithms

Heightened frequency sensitivity
Description: This condition is characterized by hypersensitivity to certain frequencies.
Implemented algorithm: Frequencies in the sensitive range are bandpass filtered and amplified.

Spatial disorientation
Description: Reduced ability to judge spatial orientation, which often affects motor coordination.
Implemented algorithm: The sense of direction is altered by combining the signals from the left and right microphones in different proportions for one earphone, and in the opposite proportions for the other.

Voice distortion
Description: Reduced ability to distinguish between voices and background noise.
Implemented algorithm: The frequency range where voice usually resides is bandpass filtered and equalized to reduce intelligibility.

Audio shutdown
Description: In response to overwhelming auditory stimuli, the amount of auditory information being processed can drop significantly as a coping mechanism.
Implemented algorithm: The volume of the audio signal is decreased and a lowpass filter is applied to produce a muffled effect. A sudden increase in the amplitude (i.e. volume) of the audio signal triggers this transformation.
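Similarly, the sketch below illustrates one plausible way to realize the auditory transformations in Table 2 using NumPy and SciPy. The filter bands, gains, and mixing proportions are assumptions chosen for illustration; the shutdown trigger (a sudden jump in signal level) is assumed to be detected elsewhere and passed in as a flag.

```python
import numpy as np
from scipy.signal import butter, lfilter

def bandpass(signal, low_hz, high_hz, fs, order=4):
    """Butterworth bandpass used by several of the transformations below."""
    b, a = butter(order, [low_hz / (fs / 2), high_hz / (fs / 2)], btype="band")
    return lfilter(b, a, signal)

def frequency_sensitivity(signal, fs, band=(2000.0, 4000.0), gain=4.0):
    """Heightened frequency sensitivity: amplify the sensitive band and mix it back in."""
    return signal + gain * bandpass(signal, band[0], band[1], fs)

def spatial_disorientation(left, right, mix=0.3):
    """Spatial disorientation: cross-mix the microphones in opposite proportions.
    With mix < 0.5 each earphone is dominated by the opposite-side microphone."""
    return mix * left + (1 - mix) * right, (1 - mix) * left + mix * right

def voice_distortion(signal, fs, voice_band=(300.0, 3400.0), attenuation=0.3):
    """Voice distortion: flatten the voice band so speech blends into background noise."""
    voice = bandpass(signal, voice_band[0], voice_band[1], fs)
    return signal - (1 - attenuation) * voice

def audio_shutdown(signal, fs, triggered, cutoff_hz=500.0, gain=0.2):
    """Audio shutdown: muffle the output (lowpass plus volume drop) when a sudden
    jump in signal level has been detected for the current block."""
    if not triggered:
        return signal
    b, a = butter(4, cutoff_hz / (fs / 2), btype="low")
    return gain * lfilter(b, a, signal)
```

In practice these functions would be applied block by block to the pre-amplified microphone signals before the processed audio is transmitted back to the user.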

 

Read more about the Autism Emulation Project



Select Publications

  1. Poster: Chung, J., Silva, J., and Chau, T., “A Mediated-Reality Device Based on the Autistic Sensory Experience,” Engineering Science Research Day, University of Toronto, ON, Canada, August 21, 2006. Winner of the Best Poster Award.
  2. Invited Lecture: Silva, J., “The Autism Emulation Project and the New Social Studies of Childhood and Disability,” Child Life Program, McMaster Children’s Hospital, Hamilton, ON, Canada, March 10, 2006.

Funding Source

Health Care Technology and Place (HCTP - U of T)


Research Team

Jorge Silva, Ph.D. candidate (IBBME/HCTP, University of Toronto)

Jennifer Chung, (Engineering Science, University of Toronto)

Ceilidh Eaton Russell, CCLS, CLSt. Dip., (McMaster University)

Alex Mihailidis, Ph.D. P.Eng. (University of Toronto)

Tom Chau, Ph.D. P.Eng. (CRC in Paediatric Rehabilitation Engineering)

Pascal Lehoux, (CRC on Innovation in Health)

Hilde Zitselberger, (Nursing, University of Toronto)