Imagine a layout with 4 buttons:
 _______________________________
|              |                |
|      A       |       B        |
|______________|________________|
|              |                |
|      C       |       D        |
|______________|________________|
I'd like to detect the fling gesture over the whole layout, but when the fling starts over a button it is not detected.
I'm using:
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.main);

    gesturedetector = new GestureDetector(this, this);

    findViewById(R.id.touchContainer).setOnTouchListener(new OnTouchListener() {
        @Override
        public boolean onTouch(View v, MotionEvent event) {
            Log.e("", "TouchEvent");
            return gesturedetector.onTouchEvent(event);
        }
    });
}
It works when there are no clickable items, but it fails if the fling starts over a clickable item.
How can I solve that? Offering a bounty of 50 points for a complete working answer.
One way I have achieved this is to override the following method:
public boolean onInterceptTouchEvent(MotionEvent event) {
    super.onInterceptTouchEvent(event);
    ...
You can override this method in your layout container (e.g. the ViewGroup, or whatever you're holding the buttons with) and continue to return false from it in order to 'intercept' touch events that are being consumed by child Views (i.e. your buttons). Within that overridden method you can then call your gesture detector object with the MotionEvents. This method also 'sees' events that target the ViewGroup itself, which means - if I remember correctly - you would only need to call your gesture detector from within that method, and in doing so the gesture detector will 'see' all events, no matter whether they're over the buttons or not. So if you drag your finger starting over a button and ending at some point on the layout background, the gesture detector should see the entire swipe. You would not need to feed the gesture detector with the events from the layout's own onTouchEvent(), because it will have already seen them.
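Here is a minimal sketch of that first approach, assuming the container is a FrameLayout subclass; the class name FlingInterceptLayout and the listener body are my own illustration, not code from the original project. If the gesture can also start on the empty background, events that no child consumes may only reach onInterceptTouchEvent() for the initial down, so the sketch defensively forwards the follow-up events from onTouchEvent() as well:

import android.content.Context;
import android.util.AttributeSet;
import android.util.Log;
import android.view.GestureDetector;
import android.view.MotionEvent;
import android.widget.FrameLayout;

// Hypothetical container: children keep handling their clicks, but every
// touch event is also fed to one GestureDetector so a fling is detected
// even when it starts over a button.
public class FlingInterceptLayout extends FrameLayout {

    private final GestureDetector gestureDetector;

    public FlingInterceptLayout(Context context, AttributeSet attrs) {
        super(context, attrs);
        gestureDetector = new GestureDetector(context,
                new GestureDetector.SimpleOnGestureListener() {
                    @Override
                    public boolean onFling(MotionEvent e1, MotionEvent e2,
                                           float velocityX, float velocityY) {
                        Log.d("FlingInterceptLayout", "fling detected");
                        return true;
                    }
                });
    }

    @Override
    public boolean onInterceptTouchEvent(MotionEvent event) {
        super.onInterceptTouchEvent(event);
        // Peek at every event headed for a child, but return false so the
        // buttons still receive it and handle their clicks normally.
        gestureDetector.onTouchEvent(event);
        return false;
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Events that no child consumed (gesture started on the background)
        // arrive here; the initial down was already seen in
        // onInterceptTouchEvent(), so only forward the follow-up events.
        if (event.getActionMasked() != MotionEvent.ACTION_DOWN) {
            gestureDetector.onTouchEvent(event);
        }
        return true;
    }
}

In the layout XML you would then use this class as the container that wraps the four buttons, instead of attaching the OnTouchListener from the question.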
A second way:
I just looked at my project where I used this, and realised that I switched to a different way of doing it. What I actually did was design all of my child Views such that the parent Activity (or the containing ViewGroup) could register the same gesture detector object with all of those child Views (each of my special Views has a method called registerGestureDetector()). Then, in the overridden onTouchEvent() of my child Views, I pass the MotionEvents to the gesture detector that has been registered with that View. In other words, the parent ViewGroup layout and all the child Views simply share the same gesture detector.
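As a rough sketch of that pattern, assuming the children are Button subclasses; the class name GestureAwareButton and its field are my own, and only registerGestureDetector() is the method described above:

import android.content.Context;
import android.util.AttributeSet;
import android.view.GestureDetector;
import android.view.MotionEvent;
import android.widget.Button;

// Hypothetical child view that shares the parent's gesture detector.
public class GestureAwareButton extends Button {

    private GestureDetector sharedDetector;

    public GestureAwareButton(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    // Called by the parent Activity or ViewGroup so that every child
    // forwards its touches to the same detector instance.
    public void registerGestureDetector(GestureDetector detector) {
        this.sharedDetector = detector;
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Let the shared detector see the event first, then let the
        // Button process it as a normal click.
        if (sharedDetector != null) {
            sharedDetector.onTouchEvent(event);
        }
        return super.onTouchEvent(event);
    }
}

The parent would then create a single GestureDetector and call registerGestureDetector() on each child after setContentView().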
I realise that this may sound like a bit of a hassle and not necessary considering it could be done using onInterceptTouchEvent(), but my application deals with some pretty complicated rules regarding how my Views need to respond to touch events and gestures, and it allowed me to apply some additional logic specific to my application. However, both of these methods achieve the same basic objective here: to channel the MotionEvents that target the various Views to the same gesture detector object.