Adding APIs for an accessibility service to intercept key events.
Now that gestures are detected by the system and interpreted by an accessibility service, behavior is inconsistent between using gestures and using the keyboard, and some devices have both. Therefore, an accessibility service should be able to interpret keys in addition to gestures to provide a consistent user experience, for example by exposing a keyboard shortcut for each gestural action.

This change adds APIs for an accessibility service to observe and, at its discretion, intercept key events before they are dispatched to the rest of the system. The service returns true or false from its onKeyEvent callback to either consume the event or let it be delivered to the rest of the system. However, the service is *not* able to inject key events or modify the observed ones.

An earlier idea of letting the service declare that it "tracks" an event, so that the event is withheld from the system until a subsequent event is marked either "handled" or "not handled", does not work: if the service tracks a key but no other key is pressed, that key is never delivered to the app, and the stashed event could be delivered much later in a completely different context.

The correct way to implement shortcuts is a combination of modifier keys plus some other key or key sequence. Key events already carry information about which modifier keys are down, and the service can also track modifier state itself.

bug:8088812

Change-Id: I81ba9a7de9f19ca6662661f27fdc852323e38c00
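The consume-or-pass-through contract described above (return true from onKeyEvent to consume, false to let the event reach the rest of the system, with modifier state read from the event) can be sketched as a small plain-Java model. This is a minimal sketch: the class name, constant values, and keycode below are hypothetical stand-ins, not Android's real android.view.KeyEvent constants.

```java
// Hypothetical model of the onKeyEvent dispatch contract.
// Constant names and values are illustrative stand-ins, not Android's.
public class KeyShortcutFilter {
    // Stand-in modifier masks (real ones live in android.view.KeyEvent).
    public static final int META_CTRL_ON = 1 << 0;
    public static final int META_ALT_ON  = 1 << 1;
    // Stand-in keycode for the key that completes the shortcut.
    public static final int KEYCODE_SAMPLE = 47;

    // Mirrors onKeyEvent's contract: true means the service consumes the
    // event; false means it is delivered to the rest of the system.
    public static boolean shouldConsume(int keyCode, int metaState) {
        int required = META_CTRL_ON | META_ALT_ON;
        boolean modifiersDown = (metaState & required) == required;
        return modifiersDown && keyCode == KEYCODE_SAMPLE;
    }

    public static void main(String[] args) {
        // Ctrl+Alt+<key> is consumed; the bare key passes through.
        System.out.println(shouldConsume(KEYCODE_SAMPLE,
                META_CTRL_ON | META_ALT_ON)); // true
        System.out.println(shouldConsume(KEYCODE_SAMPLE, 0)); // false
    }
}
```

In a real service the decision would live inside onKeyEvent(KeyEvent), reading the keycode and meta state from the event object; the point here is only that the shortcut check is a pure function of those two values.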