National Science Foundation

Adaptive Technologies Encourage Independent Learning

November/December 1998

James Lynds runs a small defense engineering company in Farmington, Utah. Following the birth of his granddaughter Darci, who has physical disabilities, he was inspired to develop products that could help the disabled use computers.

First, with his own funds, he designed an external box called "Darci Too," which could be used in place of a keyboard and mouse configuration. Then, with the help of an NSF grant, Lynds expanded this technology into the "Darci card," a PC card that enables the user to activate the computer with either an on-screen keyboard or Morse code. The card allows a variety of methods for controlling the computer, including a "sip and puff" switch that activates commands through breath control; a head-mounted device that uses a light sensor; and a joystick that can be operated by a single extremity such as a foot. The Darci card has several advantages over the external box: It is smaller (the size of a playing card), it can perform more functions, and it costs half as much.
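Morse input of this kind amounts to mapping dot-and-dash groups onto keystrokes. A minimal sketch of that mapping, illustrative only and not Westest Engineering's actual firmware:

```python
# Minimal sketch of Morse-to-keystroke decoding, the kind of translation a
# switch-based input device performs. For a sip-and-puff switch, for example,
# a puff might register a dot and a sip a dash.

MORSE_TO_CHAR = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
    "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
    "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
    ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
    "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
    "--..": "Z",
}

def decode_morse(signal):
    """Translate space-separated Morse groups into text.

    Unknown groups become '?' rather than raising, so a stray
    switch activation does not interrupt typing.
    """
    return "".join(MORSE_TO_CHAR.get(group, "?") for group in signal.split())
```

In a real device, each decoded character would be injected as a keystroke event rather than collected into a string.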

Darci cards can be inserted into small, hand-held computers equipped with Microsoft's Windows CE. These small portable computers enable wheelchair-bound children or adults to take their adaptive technology with them wherever they go. "It is not uncommon for persons with disabilities to control these small computers far faster than non-disabled persons who are less able to use the smaller keys and are less skilled in their use," says Lynds.

According to Larry Scadden, senior director for Programs for Persons with Disabilities at NSF, "The Darci cards provide a level of flexibility for computer use that did not exist before." Lynds' company, Westest Engineering Corporation, has been selling the Darci cards since 1996, primarily to vocational rehabilitation agencies.


Several new advances in adaptive technology are being developed to enable persons with disabilities to be more independent and to become lifelong learners.

One NSF-sponsored project underway at Arizona State University in Tempe is making real progress in this area. For example, chemical engineering students with limited sight can now hold the results of their experiments in their hands. First, images produced by Scanning Probe Microscopes (SPM) are geometrically modeled on a computer screen. Then the images are converted into three-dimensional plastic models using a Layered Manufacturing (Rapid Prototyping) machine. Using these tactile devices, visually impaired students not only can touch, feel and observe the data models, but also can use a wire grid to measure different components.
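The "layered" in Layered Manufacturing refers to building the model as a stack of thin cross-sections. A minimal sketch of slicing a height field, such as SPM topography data, into such layers (the grid values and layer height here are invented for illustration, not the PRISM Center's actual process):

```python
# Sketch of the layer-slicing step in rapid prototyping: a height field is
# cut into stacked binary layers, each marking where the machine would
# deposit plastic for one cross-section of the model.

def slice_height_field(heights, layer_height=1.0):
    """Return a list of layers, bottom to top; each layer is a grid of
    0/1 flags marking where material is present at that height."""
    top = max(max(row) for row in heights)
    n_layers = int(top // layer_height) + 1
    layers = []
    for k in range(n_layers):
        z = k * layer_height
        layers.append([[1 if h > z else 0 for h in row] for row in heights])
    return layers
```

A real system would also generate toolpaths and support structures for each layer; the slicing itself is the core idea.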

Although these models were designed to enable students with limited vision to be more independent learners in scientific and mathematical courses, the possible adaptations are broad. "The way science is taught in our schools needs to be restructured to make it more accessible to every student, with or without disabilities," says Scadden. The principal investigator, Anshuman Razdan, director of the Partnership for Research in Stereo Modeling (PRISM) Center, believes that this project can be used in a wide variety of settings. "About 99 percent of us are learning disabled when it comes to understanding three-dimensional concepts," he says. For example, students with and without sight limitations can use this technology to better understand calculus formulas that can be modeled into three-dimensional shapes.

Local university students with limited sight are now testing these models at Arizona State. Next steps include efforts to vary the surface texture on the models to better represent the organic images produced by the microscope. Then, Braille text could be added to provide a legend describing the various parts of the model based on the specific textures used. It is expected that, because these models are made out of plastic, they could be used by faculty and students for many years.


The electronic information age poses challenges to people with vision impairments who prefer to use the 170-year-old Braille system for reading materials available on the Internet and on CD-ROMs. Although this material can be accessed inexpensively through computer speech technology, until now there has been no low-cost way to present it in Braille.

Dan Hinton, senior design engineer at S.A.I.C. in Arlington, VA, is working on two products originally invented by John Becker of Tactilics, also in Arlington, VA, to improve Tactile Display Units (TDUs) used by blind people. Currently, paperless Braille machines can only display one line of 20 characters at a time. Tactilics is developing display screens--six lines of 40 characters--that can be refreshed, or redrawn. Thus far, the company has developed a multi-line screen as well as a single-line TDU that displays Braille characters at up to 240 characters per minute.
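Driving a refreshable display comes down to translating each character into a six-dot Braille cell. A minimal sketch of that translation, using Unicode Braille characters to stand in for raised pins (uncontracted letters only; this is a general illustration of the standard Braille alphabet, not Tactilics' actual encoding):

```python
# Sketch of text-to-Braille-cell translation for a refreshable display.
# Each letter maps to a set of raised dots (numbered 1-6); here the cell is
# rendered as a Unicode Braille character instead of driving physical pins.

BASE = {  # first-decade letters a-j; later decades add dots 3 and 6
    "a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}, "e": {1, 5},
    "f": {1, 2, 4}, "g": {1, 2, 4, 5}, "h": {1, 2, 5}, "i": {2, 4},
    "j": {2, 4, 5},
}

def letter_dots(ch):
    """Dot pattern for a lowercase letter a-z."""
    if ch == "w":  # w is irregular (added after the French original)
        return {2, 4, 5, 6}
    order = "abcdefghij"
    i = ord(ch) - ord("a")
    if i < 10:
        return set(BASE[order[i]])
    if i < 20:  # k-t: first-decade pattern plus dot 3
        return BASE[order[i - 10]] | {3}
    # u, v, x, y, z: first-decade pattern plus dots 3 and 6
    return BASE[order["uvxyz".index(ch)]] | {3, 6}

DOT_BIT = {1: 0x01, 2: 0x02, 3: 0x04, 4: 0x08, 5: 0x10, 6: 0x20}

def to_braille(text):
    """Render letters and spaces as Unicode Braille cells (U+2800 block)."""
    cells = []
    for ch in text.lower():
        if ch == " ":
            cells.append("\u2800")  # blank cell
        else:
            bits = sum(DOT_BIT[d] for d in letter_dots(ch))
            cells.append(chr(0x2800 + bits))
    return "".join(cells)
```

A multi-line TDU would send these dot patterns, one cell per pin cluster, to the rows of the display each time the screen is refreshed.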

These NSF-sponsored advances are being tested to see how adaptable they are to everyday concerns. "Developing an affordable, refreshable display unit has been the highest priority for the blind for many years," says Scadden. The intent is that affordable, reliable units will open up many job opportunities for blind employees, he adds.


Kathleen McCoy of the University of Delaware has spent several years developing software to help deaf students improve their written English skills. Specifically, McCoy is teaching English as a second language to high school and college-age students already competent in American Sign Language (ASL). "Students very well-versed in ASL are better able to learn written English with the help of software that improves grammar and syntax errors commonly made by deaf students," says McCoy.

After students enter text into the computer, the program analyzes errors and labels mistakes in a color code. For example, if students see the passage "horses stops" highlighted in yellow, they know that a subject-verb agreement mistake has been made. The program explains the nature of the mistake to the students and suggests how they can address the error.
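The subject-verb agreement check in the example above can be caricatured in a few lines. This toy sketch only matches a plural noun followed by a singular verb by word endings; McCoy's software relies on far more sophisticated language analysis, and the color names here are assumptions:

```python
# Toy sketch of color-coded error flagging in the spirit of the tool
# described above: flag a plural noun followed by a singular verb, as in
# "horses stops". Real grammar checkers parse the whole sentence; this
# endings-based check is only illustrative.

def flag_agreement_errors(sentence):
    """Return (phrase, color) pairs for naive subject-verb mismatches."""
    words = sentence.split()
    flags = []
    for noun, verb in zip(words, words[1:]):
        plural_noun = noun.endswith("s") and not noun.endswith("ss")
        singular_verb = verb.endswith("s") and not verb.endswith("ss")
        if plural_noun and singular_verb:
            # Yellow marks a subject-verb agreement problem, matching the
            # article's "horses stops" example.
            flags.append((noun + " " + verb, "yellow"))
    return flags
```

The tutoring value comes less from the flag itself than from the explanation attached to each color, which tells the student what kind of mistake was made and how to repair it.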

As part of this project, McCoy has analyzed many examples of material written by deaf students to understand the most common patterns of mistakes. The goal of this NSF-sponsored project is to help deaf people be more successful both in school and in the workplace as they are increasingly asked to produce reports in standard written English.


A "talking head" is another NSF-funded project that addresses the needs of deaf students. This computerized talking head, known as "Baldi," moves his jaw, lips, tongue and cheek muscles to mimic the way people speak. Elementary-age deaf students at the Tucker-Maxon Oral School in Portland, Oregon, are currently using Baldi to help them learn to articulate words and sentences better so they can eventually be mainstreamed into traditional public schools.

Ronald Cole, director of the Oregon Graduate Institute of Science and Technology's Center for Spoken Language Understanding, is a principal investigator of the study. He is working with the developers of the talking head, Dominic Massaro and Michael Cohen of the University of California, Santa Cruz, to develop software to make Baldi interactive.

For example, a teacher enters a series of words for Baldi to pronounce, and a facial animation program instructs Baldi to make the appropriate movements and sounds. In addition, Baldi can ask the students questions and create games based on the classroom curriculum. In the past, computerized speech was available but students could not interact easily with the machine, so they couldn't engage in real "conversations." Massaro says, "The primary goal of the project is to determine if the eye can instruct language learning as well as the ear can." Cole and Massaro are working to replace the synthesized voice with real speech and to expand the variety of languages available.
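At the core of any talking head is a lookup from speech sounds to mouth shapes, or "visemes." A toy sketch of that step, with a simplified phoneme set and invented viseme names that are assumptions for illustration, not Massaro and Cohen's actual model:

```python
# Toy sketch of phoneme-to-viseme lookup, the basic step behind driving an
# animated face from speech. Several phonemes share one mouth shape: p, b,
# and m all close the lips, so they map to the same viseme.

PHONEME_TO_VISEME = {
    "p": "lips_closed", "b": "lips_closed", "m": "lips_closed",
    "f": "lip_teeth",   "v": "lip_teeth",
    "aa": "jaw_open",   "iy": "lips_spread", "uw": "lips_rounded",
}

def viseme_track(phonemes):
    """Map a phoneme sequence to the mouth poses an animator would key,
    collapsing consecutive duplicates since adjacent sounds share a pose."""
    track = []
    for ph in phonemes:
        pose = PHONEME_TO_VISEME.get(ph, "neutral")
        if not track or track[-1] != pose:
            track.append(pose)
    return track
```

A full system like Baldi goes much further, blending jaw, lip, tongue, and cheek parameters continuously over time rather than snapping between poses.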


As we move into the next century, researchers stress that adaptive technology is crucial in allowing the disabled to acquire and maintain skills that help them adapt to changes in their personal and professional lives.

"Technology can now address serious problems faced by people with disabilities, who in the past were passed over for opportunities in work and daily life," says Gary Strong, deputy division director in NSF's Division of Information and Intelligent Systems. "This technology is also a good example of ways we can put tools in the hands of people with disabilities to help them learn and create new knowledge."

See Sidebar: Computer Interface Helps Deaf-Blind Community
