I have an xib with a single UIView containing a bunch of images and labels. If I drag a Tap Gesture Recognizer onto the parent view, it works well, but I need to add code to determine the position of the tap so I can respond appropriately to the label that was tapped.
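The kind of position-based hit-testing I'd have to write looks roughly like this (a sketch in Swift; the label outlets are placeholders and assume the labels are direct subviews of the view the recognizer is attached to):

@IBAction func viewTapped(_ recognizer: UITapGestureRecognizer) {
    // Work out which label was hit from the tap's location in the parent view.
    let point = recognizer.location(in: view)
    if nameLabel.frame.contains(point) {
        // respond to a tap on nameLabel
    } else if dateLabel.frame.contains(point) {
        // respond to a tap on dateLabel
    }
}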
I thought it would be a lot easier if I could instead drag a Tap Gesture Recognizer onto each label and/or image and wire an IBAction to each so I can respond appropriately. Unfortunately, I cannot get this to work; I can only get the delegate methods to execute from the parent view.
What I did was drag a Tap Gesture Recognizer onto a label, wire the view controller as the recognizer's delegate, and wire an IBAction method to handle the tap. In addition to using the IBAction method, I tried, without success, assigning the target and action in the gestureRecognizerShouldBegin: delegate method.
Shouldn't this work, or is that just wishful thinking?
Thanks for any help.
John
On UIImageViews and UILabels, userInteractionEnabled is set to NO by default. You can use a gesture recognizer with both of them, but first you have to enable this property.
In Interface Builder or a storyboard, select the label or image view and check the "User Interaction Enabled" box in the Attributes inspector.
Or, in code, just do as follows (Objective-C):
myLabel.userInteractionEnabled = YES;
myImage.userInteractionEnabled = YES;
Or in Swift (the property is named isUserInteractionEnabled in Swift 3 and later):
myLabel.isUserInteractionEnabled = true
myImage.isUserInteractionEnabled = true
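With interaction enabled, you can also attach the recognizer to the label entirely in code rather than in Interface Builder. A minimal Swift sketch (the outlet and method names are just placeholders):

import UIKit

class ViewController: UIViewController {
    @IBOutlet weak var myLabel: UILabel!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Labels ignore touches by default, so enable interaction first.
        myLabel.isUserInteractionEnabled = true
        // Attach a tap recognizer directly to the label.
        let tap = UITapGestureRecognizer(target: self, action: #selector(labelTapped(_:)))
        myLabel.addGestureRecognizer(tap)
    }

    @objc func labelTapped(_ recognizer: UITapGestureRecognizer) {
        // Fires only for taps on myLabel, so no hit-testing is needed.
        print("label tapped")
    }
}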