I am writing an accessibility service for Android that aims to provide different alternatives for people with physical disabilities to control a device, for instance using switches, scanning, head tracking, and others.
Currently, to perform the actual actions on the application's interface we use the accessibility API, essentially the AccessibilityNodeInfo.performAction() method. This works well most of the time, but we found some important restrictions.
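For context, this is roughly how we trigger an action through that API; the clickNodeByText helper and the label parameter are only an illustrative sketch, not our actual code:

import android.view.accessibility.AccessibilityNodeInfo;

// Hypothetical helper inside an AccessibilityService subclass: clicks the first
// clickable node whose visible text matches the given label.
private boolean clickNodeByText(String label) {
    AccessibilityNodeInfo root = getRootInActiveWindow();
    if (root == null) {
        return false;
    }
    for (AccessibilityNodeInfo node : root.findAccessibilityNodeInfosByText(label)) {
        if (node.isClickable()) {
            return node.performAction(AccessibilityNodeInfo.ACTION_CLICK);
        }
    }
    return false;
}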
A solution would be to use a different API to perform the actions, and android.app.UiAutomation seems to fit the purpose. According to the documentation, "It also allows injecting of arbitrary raw input events simulating user interaction with keyboards and touch devices", which is exactly what I am looking for. I understand that UiAutomation is intended for testing purposes (and, perhaps, not ready for production-quality code) and that it might not behave the same on different devices. I also understand that such an API could be a security hole if any application could use it. But it seems reasonable to allow an accessibility service to use UiAutomation, given that AccessibilityNodeInfo.performAction() provides similar "powers".
So, I tried the following inside my accessibility service:
Instrumentation i = new Instrumentation();
UiAutomation automation = i.getUiAutomation();
But getUiAutomation() always returns null.
Is there any option to use UiAutomation (or similar API) inside an accessibility service?
BTW: rooting the device is not an option for us, so we cannot inject events through the screen driver.
I am answering my own question.
Since Nougat (API 24), the way to programmatically execute actions on other applications is by using an accessibility service and the AccessibilityService#dispatchGesture method.
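For completeness, here is a minimal sketch of how such a gesture can be dispatched from a service. The dispatchTap helper, the coordinates, and the timings are placeholders, and the service must declare android:canPerformGestures="true" in its accessibility-service configuration:

import android.accessibilityservice.GestureDescription;
import android.graphics.Path;

// Hypothetical helper inside an AccessibilityService subclass (API 24+):
// dispatches a short tap at the given screen coordinates.
private void dispatchTap(float x, float y) {
    Path tapPath = new Path();
    tapPath.moveTo(x, y);

    GestureDescription.StrokeDescription stroke =
            new GestureDescription.StrokeDescription(tapPath, 0 /* start time */, 50 /* duration ms */);

    GestureDescription gesture = new GestureDescription.Builder()
            .addStroke(stroke)
            .build();

    dispatchGesture(gesture, null /* callback */, null /* handler */);
}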