Using UiAutomation from an accessibility service

I am writing an accessibility service for Android that aims to provide people with physical disabilities alternative ways to control a device, for instance via switches, scanning, head tracking, and others.

Currently, to perform the actual actions on the application's interface we use the accessibility API, basically the AccessibilityNodeInfo.performAction() method (a sketch of this approach follows the list below). This works fine most of the time, but we found some important restrictions:

  • Most keyboards (IMEs) simply do not work. We only had success with the Google keyboard on Lollipop (API 22), and we had to use AccessibilityService.getWindows(). For lower API levels we had to develop a special keyboard (undoubtedly not the optimal solution).
  • Most games are not accessible. Period. They do not expose an AccessibilityNodeInfo tree.
  • Web navigation is not practical (no scrolling, among other issues).
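
For reference, this is roughly what our current approach looks like; a minimal sketch, where the helper name clickNodeLabeled and the lookup-by-text strategy are illustrative, not our exact code:

    import android.view.accessibility.AccessibilityNodeInfo;

    // Find a clickable node by its visible text and ask the framework
    // to perform the click on our behalf.
    void clickNodeLabeled(AccessibilityNodeInfo root, String label) {
        for (AccessibilityNodeInfo node : root.findAccessibilityNodeInfosByText(label)) {
            if (node.isClickable()) {
                node.performAction(AccessibilityNodeInfo.ACTION_CLICK);
                break;
            }
        }
    }

This only works when the target app actually exposes its UI as AccessibilityNodeInfo objects, which is precisely what fails in the cases listed above.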

A solution would be to use a different API to perform the actions, and android.app.UiAutomation seems to fit the purpose. According to the documentation, it "also allows injecting of arbitrary raw input events simulating user interaction with keyboards and touch devices", which is what I am looking for. I understand that UiAutomation is intended for testing purposes (and thus perhaps not ready for production-quality code) and that it might not behave the same on different devices. I also understand that such an API could be a security hole if any application could use it. But it seems reasonable to allow an accessibility service to use UiAutomation, given that AccessibilityNodeInfo.performAction() provides similar "powers".

So, I tried the following inside my accessibility service:

 Instrumentation i = new Instrumentation();  // android.app.Instrumentation
 UiAutomation automation = i.getUiAutomation();  // always returns null

But getUiAutomation() always returns null.

Is there any option to use UiAutomation (or similar API) inside an accessibility service?

BTW: rooting the device is not an option for us, so we cannot inject events through the screen driver.

Asked Nov 09 '22 by Cesar Mauri


1 Answer

I am answering my own question.

Since Nougat (API 24), the way to programmatically execute actions on other applications is by using an accessibility service and the AccessibilityService#dispatchGesture method.
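
For example, here is a minimal sketch of a service that dispatches a tap at given screen coordinates. The class and helper names are illustrative; the service must declare android:canPerformGestures="true" in its accessibility-service XML:

    import android.accessibilityservice.AccessibilityService;
    import android.accessibilityservice.GestureDescription;
    import android.graphics.Path;
    import android.view.accessibility.AccessibilityEvent;

    public class SwitchControlService extends AccessibilityService {

        // Simulate a short tap at screen coordinates (x, y).
        private void dispatchTap(float x, float y) {
            Path tap = new Path();
            tap.moveTo(x, y);

            GestureDescription gesture = new GestureDescription.Builder()
                    // The stroke starts immediately and lasts 50 ms.
                    .addStroke(new GestureDescription.StrokeDescription(tap, 0, 50))
                    .build();

            // Callback and handler may be null; the return value indicates
            // whether the gesture was accepted for dispatch.
            dispatchGesture(gesture, null, null);
        }

        @Override
        public void onAccessibilityEvent(AccessibilityEvent event) { }

        @Override
        public void onInterrupt() { }
    }

Because gestures are injected at this level, they also reach games and other apps that never expose an AccessibilityNodeInfo tree.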

Answered Nov 15 '22 by Cesar Mauri