Problem with touch events on a Windows 7 touchscreen...

I'm making a kiosk application that unfortunately needs to run on
Windows 7, with a touchscreen that uses the Windows 7 touch drivers.
Everything works well except that my app's button events are triggered
when I remove my finger from the touchscreen instead of when I first
touch it. That means, for example, that the hit states on the on-
screen keyboard I built from wxBitmapButtons don't appear until I
*release* the key, which is disorienting.

Googling around, I see there's a Windows message called "WM_TOUCH".
Is wxPython able to bind to that message?

Or does anyone by chance know how to adjust the Windows 7 touchscreen
settings so it acts like a normal touchscreen, i.e. mouse emulation?
I've tried disabling all the flicks and gestures and whatnot, basically
stripping it down to nothing but "use touchscreen as an input device",
but the problem persists.

Thanks for any help.

wrybread wrote:

Or does anyone by chance know how to adjust the Windows 7 touchscreen
settings so it acts like a normal touchscreen, i.e. mouse emulation?

That's exactly what it's doing. Try doing a slow-click with a mouse,
and I believe you'll see that your button event does not actually fire
until the button is released. That, for example, is the only way to
separate the start of a drag-and-drop operation from a button click.
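
If you want feedback on the press itself, you can bind the low-level
mouse-down event alongside the normal button event. Here's a quick,
untested sketch (whether a native button passes EVT_LEFT_DOWN through
can vary by platform, so treat it as a starting point):

import wx

class DemoFrame(wx.Frame):
    def __init__(self):
        wx.Frame.__init__(self, None, title="Click timing demo")
        panel = wx.Panel(self)
        btn = wx.Button(panel, label="Press me", pos=(20, 20))

        # EVT_LEFT_DOWN arrives when the button (or emulated touch) goes
        # down; EVT_BUTTON only fires when it is released.
        btn.Bind(wx.EVT_LEFT_DOWN, self.on_down)
        btn.Bind(wx.EVT_BUTTON, self.on_click)

    def on_down(self, event):
        print("mouse down (press)")
        event.Skip()  # let the normal button handling continue

    def on_click(self, event):
        print("button clicked (release)")

if __name__ == "__main__":
    app = wx.App(False)
    DemoFrame().Show()
    app.MainLoop()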


--
Tim Roberts, timr@probo.com
Providenza & Boekelheide, Inc.

Tim Roberts wrote:

That's exactly what it's doing. Try doing a slow-click with a mouse,
and I believe you'll see that your button event does not actually fire
until the button is released. That, for example, is the only way to
separate the start of a drag-and-drop operation from a button click.

I know, but on other touchscreens, ones not using the Windows 7
driver, the event fires on first press. That's why, for example, the
keyboard buttons on an airport check-in kiosk can show you a pressed
state. With the Windows 7 driver, that appears not to be the case.

The Windows 7 drivers do, I'm told, fire a "WM_TOUCH" message when the
screen is first touched, but I'm not sure whether wxPython can see
that event.

You need to hook into the window's WndProc chain to see the native messages. There are a couple of examples here: HookingTheWndProc - wxPyWiki
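
A rough, untested sketch of that approach (not the wiki example itself)
might look something like this. WM_TOUCH is 0x0240, RegisterTouchWindow
is what makes Windows 7 deliver raw touch messages at all, and a 64-bit
build would need SetWindowLongPtrW with explicit argtypes instead of
the bare calls shown here:

import ctypes
import ctypes.wintypes as wt
import wx

WM_TOUCH = 0x0240
GWL_WNDPROC = -4

# Prototype for a window procedure callback.
WndProcType = ctypes.WINFUNCTYPE(
    ctypes.c_long,      # LRESULT (c_long is fine for a 32-bit sketch)
    wt.HWND, ctypes.c_uint, wt.WPARAM, wt.LPARAM)

class TouchFrame(wx.Frame):
    def __init__(self):
        wx.Frame.__init__(self, None, title="WM_TOUCH hook sketch")
        hwnd = self.GetHandle()
        user32 = ctypes.windll.user32

        # Without this, Windows 7 only synthesizes mouse input and never
        # sends WM_TOUCH to the window.
        user32.RegisterTouchWindow(hwnd, 0)

        # Keep a reference to the callback so it isn't garbage collected,
        # then splice it into the WndProc chain.
        self._my_wndproc = WndProcType(self._wnd_proc)
        self._old_wndproc = user32.SetWindowLongW(
            hwnd, GWL_WNDPROC, self._my_wndproc)

    def _wnd_proc(self, hwnd, msg, wparam, lparam):
        if msg == WM_TOUCH:
            # This arrives on first contact, so a pressed state can be
            # shown here before any emulated mouse-up happens.
            print("WM_TOUCH received")
        # Pass everything on to the original window procedure.
        return ctypes.windll.user32.CallWindowProcW(
            self._old_wndproc, hwnd, msg, wparam, lparam)

if __name__ == "__main__":
    app = wx.App(False)
    TouchFrame().Show()
    app.MainLoop()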


--
Robin Dunn
Software Craftsman