I've written an application that streams images from the camera to the other side over a socket.
As a simple test, I used the take-picture callback and sent the data array to the client like this:
public void onPictureTaken(byte[] data, Camera camera) {
    PrintWriter outPrint = new PrintWriter(new BufferedWriter(
            new OutputStreamWriter(mOutputStream)), true);
    BufferedReader input = new BufferedReader(new InputStreamReader(
            mInputStream));
    new HHH().execute(data);
    try {
        // Send the image size as a text line
        outPrint.println(String.valueOf(data.length));
        // Wait for the client's acknowledgement
        String command = input.readLine();
        // Send the raw image bytes
        mOutputStream.write(data);
        mOutputStream.flush();
        System.out.println("Sent image...");
    } catch (IOException e) {
        e.printStackTrace();
    }
}
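One fragile spot in the snippet above is mixing a line-oriented `PrintWriter`/`BufferedReader` with raw writes and reads on the same underlying streams; buffered text I/O can interleave badly with raw bytes. A common alternative is binary length-prefixed framing. The sketch below is my own illustration (the class and method names are not from the original code), using only plain `java.io` streams:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.UncheckedIOException;

public class Framing {
    // Write one frame: a 4-byte big-endian length followed by the payload.
    static void writeFrame(OutputStream raw, byte[] payload) {
        try {
            DataOutputStream out = new DataOutputStream(raw);
            out.writeInt(payload.length);
            out.write(payload);
            out.flush();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // Read one frame: readFully blocks until the whole payload has arrived.
    static byte[] readFrame(InputStream raw) {
        try {
            DataInputStream in = new DataInputStream(raw);
            byte[] buf = new byte[in.readInt()];
            in.readFully(buf);
            return buf;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

With this framing there is no need for the acknowledgement round-trip, since the receiver always knows exactly how many bytes to read for each image.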
This works fine, and the data array is about 10,000-20,000 bytes.
Then I tried onPreviewFrame, but its data array is much bigger, around 400,000 bytes, which is too slow to send over the socket!
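For context on the ~400,000 figure: NV21 (the default preview format) stores a full-resolution Y plane plus a half-resolution interleaved VU plane, so one raw frame costs width * height * 3/2 bytes. Assuming a 640x480 preview, a common default resolution, that works out to 460,800 bytes, which is in line with the size seen here:

```java
public class Nv21Size {
    // NV21: full-resolution Y plane (width * height bytes) plus an
    // interleaved VU plane at quarter resolution (width * height / 2 bytes).
    static int nv21FrameBytes(int width, int height) {
        return width * height * 3 / 2;
    }
}
```

This is why raw preview frames dwarf the take-picture payload: onPictureTaken hands you an already-compressed JPEG, while onPreviewFrame hands you uncompressed pixels.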
Can I shrink that data? Is there another method like onPreviewFrame for capturing and streaming? Or could I capture with the camera repeatedly and send each image to the client?
Thank you.
EDIT
I found a solution: decode the data[] so it can be used normally. Hope this helps someone.
if (camera.getParameters().getPreviewFormat() == ImageFormat.NV21) {
    // Convert the NV21 preview frame to JPEG
    Size previewSize = camera.getParameters().getPreviewSize();
    YuvImage yuvimage = new YuvImage(data, ImageFormat.NV21,
            previewSize.width, previewSize.height, null);
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    yuvimage.compressToJpeg(
            new Rect(0, 0, previewSize.width, previewSize.height),
            50, baos); // quality 50 on a 0-100 scale
    byte[] jdata = baos.toByteArray();
    // Use jdata as a normal JPEG byte array.
}
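The 50 passed to compressToJpeg is the JPEG quality on a 0-100 scale, and lowering it is the simplest knob for shrinking each frame further. The same quality-vs-size tradeoff can be illustrated in plain desktop Java with javax.imageio (on Android you would keep using compressToJpeg; this helper and its names are just an illustration of the tradeoff):

```java
import javax.imageio.IIOImage;
import javax.imageio.ImageIO;
import javax.imageio.ImageWriteParam;
import javax.imageio.ImageWriter;
import javax.imageio.stream.ImageOutputStream;
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;

public class JpegQuality {
    // Encode an image as JPEG at the given quality (0.0f - 1.0f)
    // and return the compressed bytes.
    static byte[] toJpeg(BufferedImage img, float quality) {
        ImageWriter writer = ImageIO.getImageWritersByFormatName("jpeg").next();
        try {
            ImageWriteParam param = writer.getDefaultWriteParam();
            param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
            param.setCompressionQuality(quality);
            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            try (ImageOutputStream ios = ImageIO.createImageOutputStream(baos)) {
                writer.setOutput(ios);
                writer.write(null, new IIOImage(img, null, null), param);
            }
            return baos.toByteArray();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        } finally {
            writer.dispose();
        }
    }
}
```

Encoding the same frame at a lower quality yields a noticeably smaller byte array, at the cost of visible compression artifacts; for streaming preview frames, a modest quality like the 50 above is often a reasonable balance.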