[sugar] Re: An agenda for accessibility?

Henrik Nilsen Omma
Wed Nov 15 11:10:28 EST 2006


Benjamin Hawkes-Lewis <bhawkeslewis <at> googlemail.com> writes:

> 
> Preamble
> --------
> 
> I'm a web accessibility obsessive and a member of the Ubuntu
> Accessibility team. I came looking to see what efforts OLPC was making
> towards accessibility

...

Hi Benjamin and others,

I'm also on the Ubuntu accessibility team. Here are my two cents:

The framework used by Gnome, AT-SPI, is slow even on modern systems and
will probably be way too heavy for OLPC at this point. I'm sure the AT-SPI
code can be tightened a great deal with more testing and profiling, so it
may well be that a second- or third-generation OLPC will run it just fine.

In the meantime we should look at lighter alternatives. There are several useful
packages that should run well on OLPC out of the box:

 * onBoard - an on-screen keyboard written in Python using Cairo. It does
not use AT-SPI. It does have a GConf dependency, but that can be removed.

 * eSpeak - a lightweight speech synthesiser with support for a growing
number of languages (making new voices for eSpeak is relatively easy).

 * brltty - provides output to braille displays.

 * The speakup kernel patch, which reads console output aloud through a
speech synthesiser - this may be missing the target group a bit though.

Regarding general accessibility areas:

Screen reading
--------------

Normally you would have a screen reader that collects content from other
applications and sends it to the speech synthesiser for output. This can be
the text in a document, but it may also be UI information such as buttons
and windows.

To avoid using the whole Gnome AT stack we may need to make some of the
applications accessible within themselves. AbiWord, for example, should be
able to send text directly to eSpeak. This is a much more limited approach
than what is currently available on Gnome, where basically any GTK+
application can be probed by a screen reader without the application
needing special modification.
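
As a rough sketch of the idea (the speak() helper is just an illustrative
name; it assumes the espeak binary is installed and on the PATH):

    import subprocess

    def speak(text, voice="en", wpm=160):
        # -v selects the voice/language, -s sets the speed in words
        # per minute; both are standard espeak command-line options.
        subprocess.call(["espeak", "-v", voice, "-s", str(wpm), text])

    speak("File saved.")

An application could call something like this whenever it has text to
announce, without any AT-SPI involvement.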

I think the application-side approach is viable on OLPC for two reasons:
there are only a limited number of applications on the platform, and their
interfaces are simple. The first means that even if we have to hack
accessibility directly into each application, the number is finite, so it
should be doable. The simple interfaces also reduce the number of extra
lines of output code needed in each.

The same two factors also make the platform easier for kids to use, even if
the implementation is not very complete to start with. Visually impaired
people generally navigate computers with keyboard shortcuts and commands.
With a simpler interface there will be fewer of these to remember. In a
standard Gnome application you can navigate the menu structure using the
screen reader to discover what options are available as you go along. On
OLPC we might need to simply provide keyboard shortcuts for each function
and document them well so they can be learned.

Let's imagine a simple OLPC application that only has 10 functions you can
perform. If each of those can be activated by a hotkey, then it should be
fairly quick to learn how to use. A screen reader has its own keybindings
you would need to learn anyway. Learning all the hotkeys for 10 simple
applications on the OLPC should not be too bad if there is some sensible
overlap -- e.g. Ctrl+S will always be save.
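
As a hypothetical sketch (the key strings and action names here are
invented for illustration, not an existing OLPC API), the whole keybinding
layer of such an application could be one flat table:

    import subprocess

    def speak(text):
        # Stand-in for the eSpeak helper sketched earlier.
        subprocess.call(["espeak", text])

    def save_document():
        speak("Saving document")

    def read_current_line():
        speak("Reading current line")

    # One hotkey per function; Ctrl+S is always save.
    HOTKEYS = {
        "ctrl+s": save_document,
        "ctrl+r": read_current_line,
    }

    def on_key_combo(combo):
        # Called from the activity's key-press handler with a
        # normalised combo string such as "ctrl+s".
        action = HOTKEYS.get(combo)
        if action is not None:
            action()
        else:
            speak("Unknown shortcut")

Documenting such a table is also trivial: the table itself is the
documentation.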

So in the case of a word processor, it could be hacked to send the text
content to eSpeak. It could be set to read continuously or one line at a
time (as you move the cursor down). Ctrl+S would open the save dialog,
which would also be modified to read out its contents.
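
A sketch of the cursor-following part, using the standard "mark-set"
signal on a GTK text buffer (shown with modern PyGObject purely for
illustration; speak() is the eSpeak helper from before):

    import subprocess

    import gi
    gi.require_version("Gtk", "3.0")
    from gi.repository import Gtk

    def speak(text):
        subprocess.call(["espeak", text])

    def on_mark_set(buffer, location, mark):
        # "insert" is the insertion cursor; read out the line it
        # has landed on.
        if mark.get_name() == "insert":
            start = buffer.get_iter_at_line(location.get_line())
            end = start.copy()
            if not end.ends_line():
                end.forward_to_line_end()
            speak(buffer.get_text(start, end, False))

    buffer = Gtk.TextBuffer()
    buffer.connect("mark-set", on_mark_set)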

Fire Vox, a self-voicing extension for Firefox, is an example of an
application with built-in accessibility.


Speech recognition
------------------

We would all love more work in this area, but at the moment there are no
good free options for voice dictation, and all recognition engines are very
computationally intensive.


Braille
-------

Given the screen reading discussion above, it should be fairly easy to add
braille output via brltty once a suitable text stream is available.
Unfortunately braille displays are very expensive (simply by virtue of
being specialised equipment). We would of course love to see a simple $100
USB braille display :)
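
For what it's worth, brltty exposes a client API (BrlAPI), and a sketch
using its Python bindings (assuming the brlapi module is available and the
brltty daemon is running) could be as small as:

    import brlapi

    # Connect to the local brltty daemon, take over braille output
    # for the current tty, and push one line of text to the display.
    b = brlapi.Connection()
    b.enterTtyMode()
    b.writeText("Hello from OLPC")
    b.leaveTtyMode()
    b.closeConnection()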


OLPC is destined to do great things for the education of economically
disadvantaged children all over the world, and it's good to see a focus on
accessibility in OLPC now. Kids with disabilities often find learning
difficult and can fall behind early, with lasting consequences. As an
example, young deaf children do not get the same natural exposure to
language as other kids and therefore miss developing certain key language
skills. As a result, they find it very difficult to learn to read and write
later in their education. Being unable to communicate verbally, combined
with poor reading and writing skills, can leave them doubly disadvantaged
for life. It would be great if the OLPC project could contribute to
reducing those gaps rather than widening them!


Henrik



