I have a custom UIControl that contains a few other controls. In between those controls there is empty space, and the background of my UIControl needs to be transparent.
I need to catch all touch events that happen on my custom UIControl, even if they occur between the other controls (over the transparent areas). I cannot use gesture recognizers; I need more control than they provide. Instead, I would like to register touch-handling functions like this:
myControl.addTarget(self, action: "handleTouchDown:event:", forControlEvents: UIControlEvents.TouchDown)
With this approach I receive touches that happen over non-transparent areas of myControl, but not those that happen over the transparent background.
I tried overriding hitTest:withEvent: in my custom control so that it does not check the alpha value. But hitTest:withEvent: is not even called when a touch happens over a transparent area of the control. I also replaced my control's layer with a custom CALayer and overrode hitTest: on that too, with no result (hitTest: on the layer does not seem to be called at all).
To provide a perfect answer (and win the bounty), all you need to do is:
1) Create a custom UIControl containing another control (for example a UIButton).
2) Put some content into the UIControl (text from the UIButton) and make its background transparent (either set it to the clear color or set its alpha channel to 0).
3) Use the addTarget:action:forControlEvents: method to register for UIControlEvents.TouchDown events on the control. In the handler method, print something to the console.
4) Make sure that touches over the transparent background are also delivered through addTarget:action:forControlEvents:. No hacking solutions are preferred. I know that setting the control's background alpha channel to 0.01 will suddenly make it work, but that is the kind of hack I do not want.
5) Describe here what you did.
Following your EDIT section:
https://github.com/soxjke/TransparentControl
1) If I set the background colour to +[UIColor clearColor], the touches work wonderfully. So you have no need to do anything more; go ahead with the clear color (top button).
2) If I set alpha = 0, touches are not handled. OK (middle button).
3) To handle these touches there's a simple solution (bottom button): subclass UIButton (actually you can go with anything in the hierarchy up to UIView) and override hitTest:withEvent::
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    // Report a hit anywhere inside our bounds, regardless of transparency.
    return CGRectContainsPoint(self.bounds, point) ? self : nil;
}
PROFIT
4) If you need to go deeper, use the UIResponder touch methods on your subclass, as Rob Glassey proposed in his answer:
touchesBegan:withEvent:
touchesMoved:withEvent:
touchesEnded:withEvent:
touchesCancelled:withEvent:
P.S. I'll end with something off topic. I don't know what your task actually is, but saying that you can't use recognizers because you need "more control" over all touch events suggests that you don't know the possibilities of gesture recognizers. So, judging from my experience, I'd say you're reinventing the wheel rather than building a good solution.
P.P.S. If the methods proposed by me and the other guys here don't work for you, check your control's userInteractionEnabled.
Appendix (view controller code to test with):
#import "ViewController.h"
#import "TransparentControl.h"
@interface ViewController ()
@property (weak, nonatomic) IBOutlet UIButton *buttonClearColor;
@property (weak, nonatomic) IBOutlet UIButton *buttonAlpha0;
@property (weak, nonatomic) IBOutlet TransparentControl *customButtonAlpha0;
@end
@implementation ViewController
- (void)viewDidLoad
{
[super viewDidLoad];
[self.buttonClearColor addTarget:self action:@selector(touchUpInside:) forControlEvents:UIControlEventTouchUpInside];
[self.buttonClearColor addTarget:self action:@selector(touchUpOutside:) forControlEvents:UIControlEventTouchUpOutside];
[self.buttonClearColor addTarget:self action:@selector(touchDown:) forControlEvents:UIControlEventTouchDown];
[self.buttonAlpha0 addTarget:self action:@selector(touchUpInside:) forControlEvents:UIControlEventTouchUpInside];
[self.buttonAlpha0 addTarget:self action:@selector(touchUpOutside:) forControlEvents:UIControlEventTouchUpOutside];
[self.buttonAlpha0 addTarget:self action:@selector(touchDown:) forControlEvents:UIControlEventTouchDown];
[self.customButtonAlpha0 addTarget:self action:@selector(touchUpInside:) forControlEvents:UIControlEventTouchUpInside];
[self.customButtonAlpha0 addTarget:self action:@selector(touchUpOutside:) forControlEvents:UIControlEventTouchUpOutside];
[self.customButtonAlpha0 addTarget:self action:@selector(touchDown:) forControlEvents:UIControlEventTouchDown];
}
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
}
- (void)touchUpInside:(id)sender
{
NSLog(@"%s", __PRETTY_FUNCTION__);
}
- (void)touchDown:(id)sender
{
NSLog(@"%s", __PRETTY_FUNCTION__);
}
- (void)touchUpOutside:(id)sender
{
NSLog(@"%s", __PRETTY_FUNCTION__);
}
@end
The documentation for hitTest:withEvent: mentions that totally transparent views are ignored, so it is possible that overriding hitTest:withEvent: itself is not enough to get around this, as whatever calls hitTest:withEvent: isn't calling it when the object is transparent.
Instead, I'd suggest falling back to the lower-level UIResponder touch methods if you need access to the raw touch events no matter what (these are inherited by UIView and UIControl, so they are available to you).
They are:
touchesBegan:withEvent:
touchesMoved:withEvent:
touchesEnded:withEvent:
touchesCancelled:withEvent:
The first parameter is an NSSet of touches; the second is a UIEvent like the one you've been referring to in your other methods.
With this approach you don't need to add a target-action pair; instead you override these methods on your custom control. They are lower level (and old-school in the extreme) but should give you total control over the touch events.
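As a minimal sketch of this approach (assuming a hypothetical UIControl subclass named TouchThroughControl, not code from the question), you could override the UIResponder touch methods and re-emit them as control events. Note that UIControl's own tracking machinery already sends some of these events when super is called, so in a real app you would want to guard against double-firing:

```objectivec
#import <UIKit/UIKit.h>

// Hypothetical UIControl subclass illustrating the UIResponder-level approach.
@interface TouchThroughControl : UIControl
@end

@implementation TouchThroughControl

// Claim hits anywhere in our bounds so transparency doesn't swallow touches.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    return CGRectContainsPoint(self.bounds, point) ? self : nil;
}

- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    [super touchesBegan:touches withEvent:event];
    // Notify any targets registered for TouchDown.
    [self sendActionsForControlEvents:UIControlEventTouchDown];
}

- (void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    [super touchesEnded:touches withEvent:event];
    CGPoint point = [[touches anyObject] locationInView:self];
    // Distinguish touch-up-inside from touch-up-outside by the final location.
    if (CGRectContainsPoint(self.bounds, point)) {
        [self sendActionsForControlEvents:UIControlEventTouchUpInside];
    } else {
        [self sendActionsForControlEvents:UIControlEventTouchUpOutside];
    }
}

- (void)touchesCancelled:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    [super touchesCancelled:touches withEvent:event];
    [self sendActionsForControlEvents:UIControlEventTouchCancel];
}

@end
```

touchesMoved:withEvent: can be overridden the same way if you need drag events.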
I subclassed UIControl with an empty drawRect: method and it worked.
According to the docs, opaque is ignored by UIButton and some other controls, so it can't be used as the control point for this technique. Curiously, though, the default background color for a view is transparent (nil).
By subclassing UIControl and setting opaque = NO, you can write a drawRect: method which doesn't fully fill the frame, allowing for "transparent" regions without setting alpha = 0, so hitTest:withEvent: still picks up events. Since the element is a UIView, you should be able to add subviews and then implement your own drawRect: which calls all the subviews' equivalent functions while not drawing the regions which are supposed to be transparent.
Here are my basic view controller elements; the UIImageView was there to prove the control is transparent.
@implementation MyViewController
- (void)viewDidLoad {
[ super viewDidLoad ];
TransparentControl * transparentControl = [ [ TransparentControl alloc ] initWithFrame:CGRectMake( 0, 0, 400, 400 ) ];
[ transparentControl addTarget:self action:@selector(printText) forControlEvents:UIControlEventTouchUpInside];
// Create an image view below the button for proof the control is transparent
UIImageView * imageView = [ [ UIImageView alloc ] initWithImage:[ UIImage imageNamed:@"BGImage.jpg" ] ];
imageView.frame = self.view.frame;
[ self.view addSubview:imageView ];
[ self.view addSubview:transparentControl ];
}
-( void )printText {
NSLog( @"Hello, this is a transparent button." );
}
@end
And my transparent control.
@implementation TransparentControl
- ( instancetype )initWithFrame:( CGRect )frame {
if( self = [ super initWithFrame:frame ] ) {
self.opaque = NO;
self.userInteractionEnabled = YES;
}
return self;
}
- ( void )drawRect:(CGRect)rect {
    // Intentionally empty: with opaque = NO, not drawing anything leaves the
    // control transparent while it still receives touches.
}
@end