I have a game that has touch-based controls and no built-in controller support. I would like to use a gamepad with it and get some hands-on Android programming experience along the way.
I want to figure out whether I can have my own app listen for controller events and forward them to the game.
I learned how to process controller events. Then I found out I cannot forward them due to security - programmatically sending touch events to another app is not allowed, for good reasons.
According to this blog (its SSL cert expired), there are three ways to inject touch events:
Other methods I have found include building your own platform image, so that you can install your application as a system app and sign it with the platform certificate, which grants it the required permissions.
Another is wrapping the target application inside your own, but using Instrumentation to send touch events to it still requires root.
I have also read about the Accessibility API, but it looks like this is considered a misuse of the API and may become impossible in the future.
I would rather not root my device, as many games have issues if they detect root.
I was about to give up, but then I found apps like these:
Despite Android's security, option 2 seems to be able to do it regardless. How is it doing it?
EDIT: I downloaded Octopus' APK and took a look inside. It includes a file named libinject.so... so that's where it is doing its magic. I have a new direction to research now, but if anyone has links to resources related to this I would really appreciate it.
Set up a touch listener
In order to make your OpenGL ES application respond to touch events, you must implement the onTouchEvent() method in your GLSurfaceView class. The example implementation below shows how to listen for MotionEvent.ACTION_MOVE events and translate them to an angle of rotation for a shape.
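The referenced example is not reproduced here, but a minimal sketch of the idea could look like the following. The class name MyGLSurfaceView is a placeholder, and a real app would forward the computed angle to its renderer rather than just storing it in a field:

```java
import android.content.Context;
import android.opengl.GLSurfaceView;
import android.view.MotionEvent;

// Sketch: a GLSurfaceView that converts ACTION_MOVE events into a rotation angle.
public class MyGLSurfaceView extends GLSurfaceView {

    private static final float TOUCH_SCALE_FACTOR = 180.0f / 320;
    private float angle;       // rotation to apply to the shape
    private float previousX;
    private float previousY;

    public MyGLSurfaceView(Context context) {
        super(context);
        // setEGLContextClientVersion(2) and setRenderer(yourRenderer) would
        // normally be called here before any rendering happens.
    }

    @Override
    public boolean onTouchEvent(MotionEvent e) {
        float x = e.getX();
        float y = e.getY();

        if (e.getAction() == MotionEvent.ACTION_MOVE) {
            float dx = x - previousX;
            float dy = y - previousY;
            // Scale the finger movement into degrees of rotation.
            angle += (dx + dy) * TOUCH_SCALE_FACTOR;
            requestRender();   // ask the renderer to draw the next frame
        }

        previousX = x;
        previousY = y;
        return true;           // consume the event so we keep receiving the gesture
    }
}
```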
Touch event propagation
The DOWN touch event is passed to "View C"'s onTouchEvent, and the boolean result of TRUE or FALSE determines whether the action is captured. Because "View C" returns true and is handling the gesture, the event is not passed to "ViewGroup B"'s nor "ViewGroup A"'s onTouchEvent methods.
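As a small illustration of that behaviour, a custom view playing the role of "View C" might consume the gesture like this (the class name is hypothetical):

```java
import android.content.Context;
import android.util.AttributeSet;
import android.view.MotionEvent;
import android.view.View;

// By returning true for ACTION_DOWN this view claims the gesture, so the
// parent ViewGroups never see the event in their own onTouchEvent() methods.
public class ViewC extends View {

    public ViewC(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                return true;   // capture the gesture; follow-up events come here
            case MotionEvent.ACTION_MOVE:
            case MotionEvent.ACTION_UP:
                // handle the rest of the gesture here
                return true;
            default:
                return super.onTouchEvent(event);
        }
    }
}
```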
HOW TO HANDLE CONTROLLER ACTIONS [ALSO API 16+]
At the system level, Android reports input event codes from game controllers as Android key codes and axis values. In your game, you can receive these codes and values and convert them to specific in-game actions.
When players physically connect or wirelessly pair a game controller to their Android-powered devices, the system auto-detects the controller as an input device and starts reporting its input events. Your game can receive these input events by implementing the following callback methods in your active Activity or focused View (you should implement the callbacks for either the Activity or the View, but not both).
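A minimal sketch of those callbacks in an Activity, assuming placeholder handleJump() and handleMove() methods for your own in-game actions, could look like this:

```java
import android.app.Activity;
import android.view.InputDevice;
import android.view.KeyEvent;
import android.view.MotionEvent;

// Sketch: button presses arrive as key codes in onKeyDown(), analog sticks
// as axis values in onGenericMotionEvent().
public class GameActivity extends Activity {

    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
        if ((event.getSource() & InputDevice.SOURCE_GAMEPAD) == InputDevice.SOURCE_GAMEPAD) {
            if (keyCode == KeyEvent.KEYCODE_BUTTON_A) {
                handleJump();              // map the A button to an in-game action
                return true;
            }
        }
        return super.onKeyDown(keyCode, event);
    }

    @Override
    public boolean onGenericMotionEvent(MotionEvent event) {
        if ((event.getSource() & InputDevice.SOURCE_JOYSTICK) == InputDevice.SOURCE_JOYSTICK
                && event.getAction() == MotionEvent.ACTION_MOVE) {
            float x = event.getAxisValue(MotionEvent.AXIS_X);   // left stick, horizontal
            float y = event.getAxisValue(MotionEvent.AXIS_Y);   // left stick, vertical
            handleMove(x, y);
            return true;
        }
        return super.onGenericMotionEvent(event);
    }

    private void handleJump() { /* game-specific */ }

    private void handleMove(float x, float y) { /* game-specific */ }
}
```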
You can find more details in this documentation.
SUPPORT CONTROLLERS ACROSS ANDROID VERSIONS [API 12+]
If you are supporting game controllers in your game, it's your responsibility to make sure that your game responds to controllers consistently across devices running on different versions of Android. This lets your game reach a wider audience, and your players can enjoy a seamless gameplay experience with their controllers even when they switch or upgrade their Android devices.
You can find more details in this documentation.
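As one example of the kind of check involved, a sketch that scans the connected input devices for game controllers (it relies on SOURCE_JOYSTICK, so API 12+) might look like this; the class name is hypothetical:

```java
import android.view.InputDevice;
import java.util.ArrayList;
import java.util.List;

// Sketch: walk the connected input devices and keep the ones that report
// gamepad or joystick sources. Listening for hot-plug events via
// InputManager.InputDeviceListener additionally needs API 16.
public class ControllerScanner {

    public static List<Integer> findGameControllerIds() {
        List<Integer> gameControllerIds = new ArrayList<>();
        for (int deviceId : InputDevice.getDeviceIds()) {
            InputDevice device = InputDevice.getDevice(deviceId);
            if (device == null) {
                continue;
            }
            int sources = device.getSources();
            boolean isGamepad = (sources & InputDevice.SOURCE_GAMEPAD) == InputDevice.SOURCE_GAMEPAD;
            boolean isJoystick = (sources & InputDevice.SOURCE_JOYSTICK) == InputDevice.SOURCE_JOYSTICK;
            if (isGamepad || isJoystick) {
                gameControllerIds.add(deviceId);
            }
        }
        return gameControllerIds;
    }
}
```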
SINGLE TOUCH EVENTS
For single touch events that you want to control, you can use an AccessibilityService combined with dispatchGesture. But I don't think that is what you are really looking for: you don't want to handle the touch itself, but the input coming from the controller. That's why I added the two paragraphs above, to achieve what you want to have.
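For completeness, a minimal sketch of such a service (API 24+) could look like the following. The class and method names are hypothetical, the service must declare android:canPerformGestures="true" in its accessibility-service XML and be enabled by the user, and, as noted in the question, driving games this way may be treated as misuse of the API:

```java
import android.accessibilityservice.AccessibilityService;
import android.accessibilityservice.GestureDescription;
import android.graphics.Path;
import android.view.accessibility.AccessibilityEvent;

// Sketch: injecting a single tap through the Accessibility API (API 24+).
public class TapInjectorService extends AccessibilityService {

    public void tap(float x, float y) {
        Path path = new Path();
        path.moveTo(x, y);   // a zero-length path is interpreted as a tap

        GestureDescription.StrokeDescription stroke =
                new GestureDescription.StrokeDescription(path, 0 /* start time */, 50 /* duration ms */);
        GestureDescription gesture = new GestureDescription.Builder()
                .addStroke(stroke)
                .build();

        dispatchGesture(gesture, null /* callback */, null /* handler */);
    }

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) { }

    @Override
    public void onInterrupt() { }
}
```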
REASON WHY I ADDED THIS ANSWER
I know it's kinda late, but so far there is no accepted answer out there. And in terms of reaching higher API levels, it is useful to know how to handle this stuff. Cheers :)