Kirk Wood writes:
> In short, the only way to program with known accessibility is to limit
> oneself to the M$ Foundation Library. But this negates the whole point
> of C++. A developer is supposed to be able to create a new control.
> Should it then rest on him to modify the OS so that speech works?

I thought the whole point of MSAA was that applications could define their own controls and use the MSAA API to make them accessible, by providing information about them to accessibility aids like screen readers. MS has implemented MSAA in Internet Explorer, and to some degree in Office as well. Before MSAA was introduced, screen readers could provide access to applications that used the standard Windows controls, but not necessarily to applications that had their own custom controls. I thought MSAA was designed to solve that problem.

But anyway, this is a Linux mailing list, so I'd like to present a few ideas about how access to GUIs under Linux could be achieved.

I think the main problem in achieving access to Linux GUIs is that there are many GUI toolkits, each with its own set of widgets and controls. I'm aware of a project called GSpeech which is trying to add speech to the GTK toolkit, which is used by the GNOME desktop. It's a GTK module that's automatically loaded into any application that uses GTK. It has its own commands for reading things on the screen, and it currently supports the Festival software synthesizer.

I'm glad the author of GSpeech is putting time and effort into this area, but I don't think he's using the right approach. GSpeech only provides access to GTK and GNOME, and though GNOME may be your primary desktop, you're bound to run an application sometime that doesn't use GTK as its toolkit. I suppose every toolkit and window manager could be speech-enabled in a similar way, but then you'd have to use different screen review commands for each toolkit, and keep track of which one is used by each application you run.
I think a better approach is to have a screen reader which runs outside of all your applications, and then modify or extend the GUI toolkits so that they expose information about the GUIs to the screen reader in a consistent way. And this screen reader should be able to at least read the text in any application window, regardless of which GUI toolkit is being used, so that you have some level of access to all X applications.

--
Matt Campbell <mattcamp at crosswinds.net>
Web site: http://www.crosswinds.net/~mattcamp/
ICQ #: 33005941