I am currently looking into the sample Sceneform app, and I notice that when I run it, I can translate the model by dragging with one finger and rotate it with two fingers.
What I want is for translation to happen when two fingers are on screen and moving, and rotation when a single finger moves left/right.
Since the documentation for ARCore is currently unfinished, it is hard to figure this out on my own from the decompiled ARCore code.
Thanks!
You would have to provide your own TransformationGestureDetector for the TransformationSystem of the ArFragment, but that does not seem to be easily possible at the moment. So you would have to skip the ArFragment and use ArSceneView directly. ArSceneView behaves like a regular Android View, so you can attach an OnTouchListener and use a GestureDetector to detect the gestures. In that case, however, you have to implement the rotation and translation of your objects yourself.
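A rough sketch of that approach is below. It assumes you already have an `ArSceneView` (here called `arSceneView`) and a `Node` with your model attached (here called `modelNode`); both names, as well as the rotation and translation multipliers, are placeholder values, not part of any Sceneform API. The Sceneform classes used (`Node`, `Quaternion`, `Vector3`) are real, but this is just one way to map the gestures, not the official one.

```java
import android.view.MotionEvent;
import android.view.View;

import com.google.ar.sceneform.ArSceneView;
import com.google.ar.sceneform.Node;
import com.google.ar.sceneform.math.Quaternion;
import com.google.ar.sceneform.math.Vector3;

// Somewhere after arSceneView and modelNode are set up:
arSceneView.setOnTouchListener(new View.OnTouchListener() {
    private float lastX;

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
            case MotionEvent.ACTION_POINTER_DOWN:
                // Remember where the first pointer is so we can compute deltas.
                lastX = event.getX(0);
                return true;

            case MotionEvent.ACTION_MOVE:
                float dx = event.getX(0) - lastX;
                lastX = event.getX(0);

                if (event.getPointerCount() == 1) {
                    // One finger moving left/right: rotate the node around its up axis.
                    // 0.25 degrees per pixel is an arbitrary sensitivity value.
                    Quaternion delta = Quaternion.axisAngle(Vector3.up(), dx * 0.25f);
                    modelNode.setLocalRotation(
                            Quaternion.multiply(modelNode.getLocalRotation(), delta));
                } else if (event.getPointerCount() == 2) {
                    // Two fingers: translate the node sideways.
                    // 0.001 meters per pixel is again an arbitrary factor.
                    Vector3 pos = modelNode.getWorldPosition();
                    modelNode.setWorldPosition(
                            new Vector3(pos.x + dx * 0.001f, pos.y, pos.z));
                }
                return true;
        }
        return false;
    }
});
```

For anything beyond simple horizontal movement you would typically feed the events into a `GestureDetector`/`ScaleGestureDetector` instead of tracking pointers by hand, and translate along a plane hit-tested from the ARCore frame rather than along the world x-axis, but the structure stays the same.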