Gestures over user interface

Postby spook » Wed Apr 20, 2011 10:59 am


I was wondering how to create an application that supports gestures. Suppose an activity contains an area with buttons, which can be switched to two other such areas. The switching is performed with a swipe gesture, like switching desktops on the home screen.

Let's look at the matter from the following perspective. When the user touches a button, the program doesn't yet know what the user wishes to do. If the user lifts the finger from the screen, it should be interpreted as a button click. On the other hand, if the user moves the finger after touching, the program should start the gesture recognition process.
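The decision rule described above can be sketched without any Android dependencies: a touch that ends within a small "slop" radius counts as a click, while a touch that moves beyond it commits to gesture mode. Android encapsulates the same idea in `ViewConfiguration.getScaledTouchSlop()`; the class name and the threshold value below are illustrative assumptions, not framework API.

```java
// Minimal sketch of click-vs-gesture disambiguation (hypothetical class).
public class TouchClassifier {
    // Movement threshold in pixels; a real app would query
    // ViewConfiguration.getScaledTouchSlop() instead of hard-coding it.
    public static final int TOUCH_SLOP = 16;

    private float downX, downY;
    private boolean isGesture;

    // Finger touches the screen: remember the starting point.
    public void onDown(float x, float y) {
        downX = x;
        downY = y;
        isGesture = false;
    }

    // Finger moves: once it leaves the slop radius, commit to gesture
    // mode; from this point the button should stop reacting.
    public void onMove(float x, float y) {
        float dx = x - downX, dy = y - downY;
        if (dx * dx + dy * dy > TOUCH_SLOP * TOUCH_SLOP) {
            isGesture = true;
        }
    }

    // Finger lifts: report how the touch sequence should be interpreted.
    public String onUp() {
        return isGesture ? "gesture" : "click";
    }
}
```

A tap that stays put resolves to `"click"`; any drag beyond the slop resolves to `"gesture"`, even if the finger later returns to the starting point.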

I've put the button on a GestureOverlayView and confirmed that I was able to receive gesture events when swiping on the screen. However, the button was also responding to touch events, and it looked like both views were handling them. I wish to protect myself against such behavior: once the user moves the finger across the screen, the button should no longer react to touch events.
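For reference, GestureOverlayView has an interception switch aimed at exactly this situation: with `eventsInterceptionEnabled` set, the overlay steals motion events from its children once it decides a gesture is in progress, so the button stops receiving them mid-swipe. A layout sketch (the attribute names come from `android.gesture.GestureOverlayView`; the ids and button text are made up for illustration):

```xml
<android.gesture.GestureOverlayView
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@+id/gesture_overlay"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:eventsInterceptionEnabled="true"
    android:gestureStrokeType="multiple">

    <Button
        android:id="@+id/my_button"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Tap me" />

</android.gesture.GestureOverlayView>
```

The same flag can be set from code via `setEventsInterceptionEnabled(true)` on the overlay.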

How can I develop such a protection mechanism?

Best regards -- Spook.
Once Poster
Posts: 1
Joined: Wed Apr 20, 2011 9:15 am

