I'm creating a Windows Forms application with a couple of clickable panels that need the touchscreen equivalent of the MouseDown and MouseUp events.
When I test with a keyboard and mouse, the events fire correctly and the application reacts as expected. On a touchscreen, however, they do not. The only event that works correctly is Click, but my application needs the MouseDown event so it can continuously update a value while the panel is held.
Has anyone encountered a problem like this and found a solution?
From a little reading, I think you need to override WndProc and look for WM_TOUCH messages.
Have a look at the Windows 7 Multitouch .NET Interop Sample Library, which has examples of handling touch and gestures in WinForms.
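As a rough sketch of that approach (the form and method bodies here are illustrative, not from the sample library): a window only receives WM_TOUCH after opting in with the Win32 `RegisterTouchWindow` function, and the raw touch data then has to be decoded with `GetTouchInputInfo`, which the Interop Sample Library wraps for you.

```csharp
using System;
using System.Runtime.InteropServices;
using System.Windows.Forms;

// Minimal sketch: opt a form in to WM_TOUCH messages (Windows 7+).
public class TouchForm : Form
{
    private const int WM_TOUCH = 0x0240;

    [DllImport("user32.dll")]
    private static extern bool RegisterTouchWindow(IntPtr hWnd, uint ulFlags);

    protected override void OnHandleCreated(EventArgs e)
    {
        base.OnHandleCreated(e);
        RegisterTouchWindow(Handle, 0); // opt in to WM_TOUCH for this window
    }

    protected override void WndProc(ref Message m)
    {
        if (m.Msg == WM_TOUCH)
        {
            // Decode the touch points here with GetTouchInputInfo;
            // the Interop Sample Library shows the full marshalling.
        }
        base.WndProc(ref m);
    }
}
```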
You have to override WndProc, capture the pointer messages, and raise your MouseDown and MouseUp events manually:
// Pointer messages sent for touch/pen input (Windows 8+)
public const int WM_POINTERDOWN = 0x0246;
public const int WM_POINTERUP = 0x0247;

[PermissionSet(SecurityAction.Demand, Name = "FullTrust")]
protected override void WndProc(ref Message m)
{
    base.WndProc(ref m);
    switch (m.Msg)
    {
        case WM_POINTERDOWN:
        {
            // MousePosition is in screen coordinates; convert to client coordinates
            Point p = PointToClient(MousePosition);
            OnMouseDown(new MouseEventArgs(MouseButtons.Left, 1, p.X, p.Y, 0));
            break;
        }
        case WM_POINTERUP:
        {
            Point p = PointToClient(MousePosition);
            OnMouseUp(new MouseEventArgs(MouseButtons.Left, 1, p.X, p.Y, 0));
            break;
        }
    }
}
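With the events raised this way, the continuous-update requirement from the question can be handled the same as with a mouse, for example by starting a Timer on MouseDown and stopping it on MouseUp. A minimal sketch (the timer interval and `currentValue` field are illustrative assumptions, not from the question):

```csharp
using System;
using System.Windows.Forms;

// Illustrative wiring: while the pointer is held down on the form,
// a Timer repeatedly updates a value; releasing stops the updates.
public class ValueForm : Form
{
    private readonly Timer repeatTimer = new Timer { Interval = 100 };
    private int currentValue; // the value being continuously updated

    public ValueForm()
    {
        repeatTimer.Tick += (s, e) => currentValue++;
        MouseDown += (s, e) => repeatTimer.Start(); // raised by the WndProc override on touch
        MouseUp += (s, e) => repeatTimer.Stop();
    }
}
```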