The interaction is best handled at the level of the access software, so that the braille and speech devices can be configured to perform complementary functions; finding the most effective division of labour would require exploration and experimentation. For example, the braille display could be set to present the text and cursor position while editing, with speech remaining silent, and speech could then be activated whenever an error or other informative message arose.
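As a rough illustration of this division of labour, the sketch below routes output events either to a braille display or to a speech synthesizer according to their kind. It is only a minimal sketch: the BrailleDisplay, SpeechSynthesizer, and event names are hypothetical stand-ins for whatever interfaces a particular access program actually exposes, not a real screen-reader API.

```python
# Minimal sketch of complementary braille/speech output routing.
# BrailleDisplay and SpeechSynthesizer are hypothetical stand-ins for
# whatever device interfaces a given access program provides.

from dataclasses import dataclass
from enum import Enum, auto


class EventKind(Enum):
    TEXT_UPDATE = auto()      # edited text and cursor position
    ERROR_MESSAGE = auto()    # error or other informative message


@dataclass
class OutputEvent:
    kind: EventKind
    content: str


class BrailleDisplay:
    def show(self, text: str) -> None:
        print(f"[braille] {text}")


class SpeechSynthesizer:
    def speak(self, text: str) -> None:
        print(f"[speech] {text}")


class AccessRouter:
    """Send each event to the device configured for its kind."""

    def __init__(self, braille: BrailleDisplay, speech: SpeechSynthesizer):
        self.braille = braille
        self.speech = speech

    def handle(self, event: OutputEvent) -> None:
        if event.kind is EventKind.TEXT_UPDATE:
            # While editing, braille shows text and cursor; speech stays silent.
            self.braille.show(event.content)
        elif event.kind is EventKind.ERROR_MESSAGE:
            # Errors and other informative messages are spoken instead.
            self.speech.speak(event.content)


if __name__ == "__main__":
    router = AccessRouter(BrailleDisplay(), SpeechSynthesizer())
    router.handle(OutputEvent(EventKind.TEXT_UPDATE, "line 12, column 5"))
    router.handle(OutputEvent(EventKind.ERROR_MESSAGE, "File could not be saved."))
```

In practice the mapping from event kinds to devices would itself be a user-configurable setting, which is precisely the kind of arrangement that would have to be refined through experimentation.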