I found PlusMinus' PizzaTimer example very helpful for setting up my Thread and messaging system. I hope this forum will help as thoroughly with my current YUV decoding problem:
I want to display live camera data through a custom filter.
I am trying to decode the byte buffer passed to Android's Camera.PreviewCallback onPreviewFrame() callback. The data is in YCbCr_422_SP format. I have verified that the first (width * height) bytes are plain Y luminance values, which can be displayed (via Bitmap and ImageView) as a viable gray-scale image. The total buffer size is (width * height * 3 / 2) bytes.
The remaining (width * height / 2) bytes clearly hold the U, V (Cb, Cr) data, leaving (width * height / 4) bytes for each of the U and V components (i.e. each U, V sample covers 4 pixels of the image). That looks like 4:2:0 (or 4:1:1) subsampling, not 4:2:2, but we have bigger fish to fry.
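The size arithmetic above can be sketched as follows. The class name and the 640x480 preview size are illustrative, not from the Android docs; the layout shown (Y plane followed by a single interleaved chroma plane) is one common assumption for this format:

```java
public class PreviewBufferLayout {
    public static void main(String[] args) {
        int width = 640, height = 480;    // hypothetical preview size
        int ySize  = width * height;      // Y (luminance) plane: one byte per pixel
        int uvSize = width * height / 2;  // chroma plane: one U,V byte pair per 2x2 block
        int total  = ySize + uvSize;
        System.out.println(total == width * height * 3 / 2);  // prints "true"
    }
}
```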
I cannot determine whether the U, V data is stored adjacently (planar), in alternating rows, or interleaved per 2x2 block as in this Wikipedia diagram: http://en.wikipedia.org/wiki/Image:Yuv420.svg. Once I finally determine the structure of the U, V data, I have several equations to convert from YUV to RGB, and I have tried many ways of combining the U, V data with the luminance data in the first 2/3 of the buffer, to no avail. So far I can only display monochrome.
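In case it helps to have something concrete to test against: one common interpretation is that, despite the YCbCr_422_SP name, the preview buffer is actually 4:2:0 semi-planar (NV21), i.e. the full Y plane followed by interleaved V,U byte pairs, one pair per 2x2 pixel block. The sketch below assumes that layout; the class name and the integer-approximation constants are illustrative, not taken from the Android documentation:

```java
public class YuvDecoder {
    /**
     * Converts an assumed NV21 (4:2:0 semi-planar, V before U) frame to ARGB-8888.
     * yuv holds width*height Y bytes followed by width*height/2 interleaved V,U bytes;
     * rgb must have room for width*height pixels.
     */
    public static void decodeYUV420SP(int[] rgb, byte[] yuv, int width, int height) {
        final int frameSize = width * height;
        for (int j = 0, yp = 0; j < height; j++) {
            // Each chroma row serves two luma rows, hence (j >> 1).
            int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
            for (int i = 0; i < width; i++, yp++) {
                int y = (0xff & yuv[yp]) - 16;
                if (y < 0) y = 0;
                if ((i & 1) == 0) {           // new V,U pair every two columns
                    v = (0xff & yuv[uvp++]) - 128;
                    u = (0xff & yuv[uvp++]) - 128;
                }
                // Fixed-point approximation of the ITU-R BT.601 conversion.
                int y1192 = 1192 * y;
                int r = y1192 + 1634 * v;
                int g = y1192 - 833 * v - 400 * u;
                int b = y1192 + 2066 * u;
                if (r < 0) r = 0; else if (r > 262143) r = 262143;
                if (g < 0) g = 0; else if (g > 262143) g = 262143;
                if (b < 0) b = 0; else if (b > 262143) b = 262143;
                rgb[yp] = 0xff000000 | ((r << 6) & 0xff0000)
                        | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff);
            }
        }
    }
}
```

The resulting int[] can be handed to Bitmap.createBitmap(rgb, width, height, Bitmap.Config.ARGB_8888) and shown in an ImageView, the same path that already works for the gray-scale test. If the colors come out swapped (e.g. skin tones look blue), try reading U before V in the inner loop, which would indicate the other semi-planar ordering.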
If you or anyone on this list can decode the Android YCbCr_422_SP data, please post the solution as soon as you can. Your efforts and generosity will be greatly appreciated.
- Thank you, David Manpearl