
Novel approaches to icon-based AAC presented by Karl Wiegand

One can easily argue that few are as keenly interested in the well-being of a person with a disability as is a parent. Expanding from that core of support one can also include siblings, guardians, educators, social workers and health care professionals. One can further include advocates, friends, spouses and co-workers, all of whom are concerned about quality of life. That covers just about everyone, and just about everyone should be in attendance at Karl Wiegand’s presentation at this year’s Conference on Disability, hosted by CSUN.

Mr. Wiegand is presenting some astonishing work in the field of augmentative and alternative communication (AAC). His presentation, entitled “Novel Approaches to Icon-Based AAC,” will explore two different methodologies for message construction and input. These two approaches can elevate the quality of communication for a person who has locked-in syndrome. “Locked-in syndrome” is an umbrella term describing people whose paralysis is so extensive that they are unable to move any major body parts, apart from some movement above the neck. Even a person in a full-body cast is an example of someone who may have a near-complete lack of motor function, albeit temporarily.

The choices in augmentative and alternative communication devices now commonly involve mouth sticks, switches, or eye-gaze input devices that can be cumbersome and fatiguing for the user. Current systems were designed on the assumption that the user can press a button, make repetitive movements, or maintain a movement or body position for extended periods in order to select letters, short words, or phrases from a menu. Letter-based systems can be time-consuming: although they are more generative, many users prefer icon-based systems for face-to-face or real-time communication.

The challenge for Wiegand and his colleagues was to answer two questions: How can a screen be redesigned to offer a large number of icons without displaying them all at once, which can be cognitively burdensome? And how can icon-based systems be redesigned for faster, more efficient communication while accommodating users with upper-limb motor impairments?

Together with his advisor and colleagues at Northeastern University, Wiegand is working on initial designs of two new approaches to icon-based AAC: one using continuous motion and one using a brain-computer interface (BCI). The continuous-motion system, called SymbolPath, displays 120 on-screen icons representing semantically salient words. “Continuous motion” means that a user can touch a word to begin a sentence and, without breaking contact with the screen, swipe or drag from icon to icon, ultimately completing a sentence.
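To make that concrete, here is a minimal sketch, in Python, of how continuous-motion selection could work in principle. It is not Wiegand’s SymbolPath implementation; the icon names, layout, and hit-testing logic are assumptions for illustration only. Given the stream of touch coordinates recorded during one drag, the sketch works out which icons the finger passed through, in order.

# Simplified illustration of continuous-motion icon selection.
# NOT the SymbolPath implementation: the icon layout, names, and
# hit-testing approach below are assumptions for demonstration only.

from typing import Dict, List, Tuple

# Hypothetical icon layout: name -> (left, top, right, bottom) in pixels.
ICONS: Dict[str, Tuple[int, int, int, int]] = {
    "I":     (0,   0, 100, 100),
    "want":  (110, 0, 210, 100),
    "water": (220, 0, 320, 100),
}

def icons_along_path(touch_points: List[Tuple[int, int]]) -> List[str]:
    """Return the icons a continuous drag passes through, in order,
    collapsing consecutive hits on the same icon."""
    selected: List[str] = []
    for x, y in touch_points:
        for name, (left, top, right, bottom) in ICONS.items():
            if left <= x <= right and top <= y <= bottom:
                if not selected or selected[-1] != name:
                    selected.append(name)
                break
    return selected

# A single swipe that starts on "I", drags through "want", and ends on "water".
path = [(50, 50), (120, 40), (180, 60), (250, 50), (300, 55)]
print(icons_along_path(path))  # ['I', 'want', 'water']

A full system would still need to turn that word sequence into a well-formed sentence; the sketch stops at identifying the icons.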

His second approach makes use of a practice borrowed from the field of psychology. It is a system that shows the user icons, each representing a word or short phrase, one at a time in a serial fashion. It’s called Rapid Serial Visual Presentation. It allows for more efficient sentence construction than presenting the user with a screen full of icons, which must be made small in order to offer a full complement of choices and which may be overwhelming.

This method of presenting information in rapid-fire fashion has been used before. If it sounds familiar, you may have used this same technique if you’ve ever tried to tackle “speed reading.”
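For readers who want a feel for the mechanics, here is a minimal sketch, in Python, of serial presentation. It is not the research system; the candidate words, the timing, and the console display stand in for a real interface, where the user would signal a selection (for example, with a switch or a BCI-detected response) when the desired item appears.

# Simplified illustration of Rapid Serial Visual Presentation (RSVP).
# NOT the actual research system: the word list, timing, and console
# output below are assumptions for demonstration only.

import time

candidates = ["hungry", "thirsty", "tired", "cold"]  # hypothetical choices
display_time = 0.75  # seconds each candidate stays on screen

def present_serially(words, seconds_per_word):
    """Show one candidate at a time instead of a full grid of icons."""
    for word in words:
        print(f"\r{word:<12}", end="", flush=True)
        time.sleep(seconds_per_word)
    print()

present_serially(candidates, display_time)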

“My goal is to build a Star Trek computer,” Wiegand declares. He goes on to explain: “A computer like the one in the program Star Trek, that can understand anybody and will do its best to fill a person’s desires or needs.”

Karl was gracious enough to patiently explain what essential elements of communication would be required in order to make a “Star Trek computer” possible. First, a computer would have to be capable of parsing, which makes sense of context, and of speech recognition. Another element would be learning contexts, whereby a computer would come to understand how people interact with systems and what responses users expect. Finally, artificial intelligence would have to be achieved, enabling problem-solving with incomplete information and natural language processing.

Until the point at which Mr. Wiegand has utterly changed our lives, and I do not doubt for a moment that he will, he says he’d like to work on Siri. To achieve his ultimate ends, Karl has worked in a number of other fields that have led him to this research. “I like AAC,” Wiegand continues. “It is a very focused area that is actually a vertex for four or five other fields.”

At CSUN, Karl will demonstrate the SymbolPath system, a prototype version of which is currently available for free on the Android app store (search for “SymbolPath”), show the BCI system, explain how both systems work, and talk about future directions for both. Wiegand hopes to have a system in place at his CSUN session so that attendees who interact with AAC users, friends or loved ones of AAC users, or AAC users themselves can help create a corpus: a data set that shows what users want to say at certain times or in certain settings or situations.

“We have revised both approaches based on initial testing and user feedback, and we are currently conducting several iterations of user-assisted design and revision before proceeding to full user testing,” Wiegand notes.

Attendees can help build this database by contributing realistic text, utterances, or phrases that AAC users like to say. If you attend the session, or find Karl throughout the week, you can contribute to the database or ask questions. In exchange, Karl will give you a copy of SymbolPath.

Karl will be presenting on Friday, March 1st at 3:10 pm in the Ford AB room, third floor.
Here is the link to the session page:
bit.ly/15yOOND

More about Karl Wiegand:

Karl Wiegand is a Ph.D. student in computer science at Northeastern University in Boston, Massachusetts. He works in the Communication Analysis and Design Laboratory (CadLab) under the advisement of Dr. Rupal Patel. Since joining the CadLab in 2009, Karl has been working on alternative methods of communication for users with neurological impairments and severely limited mobility. His research includes aspects of interface design, artificial intelligence, and language theory.

Here are more ways to contact Karl and help with his corpus-gathering project:

Karl Wiegand’s homepage: www.ccs.neu.edu/home/wiegand/
Karl’s lab: www.cadlab.neu.edu/
Link to Karl on LinkedIn: www.linkedin.com/in/karlwiegand/

Finally, if you know or love an AAC user, you can help get the ball rolling on data-gathering here:

www.cadlab.neu.edu/corpus/

Don’t forget to use hashtag #CSUN13 when tweeting about the event. See you in San Diego!

LL

Published in: Accessible experts, AT articles, Cool Tools, Data Mining
