Why does MotionEvent.getEdgeFlags() always return 0?


Postby pnguin34 » Sun Nov 15, 2009 1:16 am

I was expecting the getEdgeFlags() method of MotionEvent to tell me
when a MotionEvent had reached the edge of something, but the value
returned from getEdgeFlags() is *always* zero. Is this the expected
behavior? The documentation says that the flags indicate when a touch
has reached the edge of the display. I've tried this on a real device
and in the emulator, and the location coordinates never quite reach
the edge of the display, and getEdgeFlags() always returns 0. By
"never quite reach" I mean that if dragging a finger off the left edge
of the display, the smallest X I got was 2. Reaching the edge of the
view doesn't seem to change the value returned either.

I suppose I could set the flags myself with setEdgeFlags(), computing them
from the known dimensions of the object whose edges I care about. Is that
how it's supposed to be used? Or is this simply broken for now?
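In case it helps clarify what I mean, here is a rough sketch of that workaround. The class and method names, and the slop parameter, are just placeholders I made up; the constant values are meant to mirror MotionEvent's EDGE_TOP/EDGE_BOTTOM/EDGE_LEFT/EDGE_RIGHT, and the slop allows for the fact that reported coordinates never quite reach the physical edge:

```java
// Sketch of computing edge flags by hand, for use with setEdgeFlags().
// Class/method names and the slop tolerance are placeholders, not platform API.
public class EdgeFlags {
    // These values mirror android.view.MotionEvent's EDGE_* constants.
    static final int EDGE_TOP    = 1;
    static final int EDGE_BOTTOM = 2;
    static final int EDGE_LEFT   = 4;
    static final int EDGE_RIGHT  = 8;

    /**
     * Derive edge flags from a touch position and the known bounds
     * (e.g. a View's getWidth()/getHeight()).
     *
     * @param slop tolerance in pixels, since coordinates rarely hit
     *             the exact edge (I saw a smallest X of 2, not 0)
     */
    static int computeEdgeFlags(float x, float y,
                                int width, int height, float slop) {
        int flags = 0;
        if (x <= slop)              flags |= EDGE_LEFT;
        if (x >= width - 1 - slop)  flags |= EDGE_RIGHT;
        if (y <= slop)              flags |= EDGE_TOP;
        if (y >= height - 1 - slop) flags |= EDGE_BOTTOM;
        return flags;
    }
}
```

Then in onTouchEvent() I could call event.setEdgeFlags(EdgeFlags.computeEdgeFlags(event.getX(), event.getY(), getWidth(), getHeight(), slop)) before checking the flags downstream. It feels like something the framework should do for me, though.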

Thanks for your help.

- dave
