I am looking for ideas to improve OGL performance, maybe by rendering into a texture.
My app shows an infrequently changing image with about 360,000 vertices, overlaid onto some other OGL geometry. The "image" takes way too long to render (600 ms on a G1). This makes panning (translating) or zooming (changing scale) unacceptably slow.
The key is that the data from which I render the image changes only on user command, and rarely; like a landscape in a game, it doesn't change often. It is okay to take a second or two to generate that image originally, but not okay to take that long to redraw it during a continuous pan (following a moving touch).
One idea is to render it (using OGL) into a texture, and then, in onDraw, just draw the underlying geometry and the two triangles holding the texture.
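What I have in mind is roughly the following (a sketch only, assuming the device exposes the `GL_OES_framebuffer_object` extension via `GL11ExtensionPack`, that `gl` is the `GL10` instance handed to the `Renderer` callbacks, and that a 512x512 texture is big enough; the texture size and all names here are my assumptions, not tested code):

```java
// Sketch: render-to-texture via the OES framebuffer object extension.
// Requires GL_OES_framebuffer_object in glGetString(GL10.GL_EXTENSIONS).
GL11ExtensionPack gl11ep = (GL11ExtensionPack) gl;

// Create an empty power-of-two texture to render into.
int[] tex = new int[1];
gl.glGenTextures(1, tex, 0);
gl.glBindTexture(GL10.GL_TEXTURE_2D, tex[0]);
gl.glTexImage2D(GL10.GL_TEXTURE_2D, 0, GL10.GL_RGBA, 512, 512, 0,
        GL10.GL_RGBA, GL10.GL_UNSIGNED_BYTE, null);
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR);
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);

// Create a framebuffer and attach the texture as its color buffer.
int[] fbo = new int[1];
gl11ep.glGenFramebuffersOES(1, fbo, 0);
gl11ep.glBindFramebufferOES(GL11ExtensionPack.GL_FRAMEBUFFER_OES, fbo[0]);
gl11ep.glFramebufferTexture2DOES(GL11ExtensionPack.GL_FRAMEBUFFER_OES,
        GL11ExtensionPack.GL_COLOR_ATTACHMENT0_OES,
        GL10.GL_TEXTURE_2D, tex[0], 0);

if (gl11ep.glCheckFramebufferStatusOES(GL11ExtensionPack.GL_FRAMEBUFFER_OES)
        != GL11ExtensionPack.GL_FRAMEBUFFER_COMPLETE_OES) {
    // Extension unsupported or misconfigured; would need a fallback here.
}

// ... draw the 360,000-vertex image here; it lands in tex[0] ...

// Switch back to the window framebuffer for normal onDrawFrame rendering.
gl11ep.glBindFramebufferOES(GL11ExtensionPack.GL_FRAMEBUFFER_OES, 0);
```

After that, each frame would only bind `tex[0]` and draw the two textured triangles, which should be far cheaper than 360,000 vertices.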
However, I don't see how to do this except to render to the screen, scrape the bits off the screen, rearrange the result into a texture, and display the texture. This is far from desirable.
Is there a way to render into a texture? Googling says no, but maybe something changed with Android 2.2.
Is there a better way to go about this?
Thanks in advance!