Labels: Android, Android emulator, Camera, Live Preview
I've been looking into getting a live camera preview working in the Android emulator. Currently the Android emulator just gives a black and white chessboard animation. After having a look around I found the website of Tom Gibara, who has done some great work to get a live preview working in the emulator. The link to his work can be found here. The basics are that you run the WebcamBroadcaster as a standard Java app on your PC. If there are any video devices attached to your PC, it will pick them up and broadcast the captured frames over a socket connection. You then run a SocketCamera class as part of an app in the Android emulator, and as long as you have the correct IP address and port it should display the captured images in the emulator. On looking into Tom's code I saw that it seemed to be written for an older version of the Android API, so I thought I'd have a go at updating it. As a starting point I'm going to use the CameraPreview sample code available on the Android developers website. My aim was to take this code and, with as few changes as possible, make it give a live camera preview in the emulator.
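Before diving into the code, it helps to see how simple the per-frame exchange is: the broadcaster writes exactly one JPEG image to every connection it accepts and then closes it, so the emulator side just connects, decodes the stream, and draws the result. Here's a minimal sketch of that exchange (the class name FrameFetchSketch is mine, purely for illustration; the real work happens inside SocketCamera below):

    import java.io.InputStream;
    import java.net.InetSocketAddress;
    import java.net.Socket;

    import android.graphics.Bitmap;
    import android.graphics.BitmapFactory;

    // Minimal sketch of the per-frame exchange: the broadcaster writes one
    // JPEG image per accepted connection, so each preview frame is simply
    // "connect, decode the stream, close".
    public final class FrameFetchSketch {
        public static Bitmap fetchFrame(String host, int port, int timeoutMs) throws Exception {
            Socket socket = new Socket();
            socket.connect(new InetSocketAddress(host, port), timeoutMs);
            try {
                InputStream in = socket.getInputStream();
                return BitmapFactory.decodeStream(in); // one frame per connection
            } finally {
                socket.close();
            }
        }
    }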
So the first thing I did was to create a new class called SocketCamera. This is based on Tom's version of the SocketCamera, but unlike Tom's version I am trying to implement a subset of the new camera class android.hardware.Camera rather than the older android.hardware.CameraDevice. Please keep in mind that I've implemented just a subset of the Camera class API, and the code was put together fairly quickly, so it's a bit rough around the edges. Anyhow, here's my new SocketCamera class:
    package com.example.socketcamera;

    import java.io.IOException;
    import java.io.InputStream;
    import java.net.InetSocketAddress;
    import java.net.Socket;

    import android.graphics.Bitmap;
    import android.graphics.BitmapFactory;
    import android.graphics.Canvas;
    import android.graphics.Paint;
    import android.graphics.Rect;
    import android.hardware.Camera;
    import android.hardware.Camera.Size;
    import android.util.Log;
    import android.view.SurfaceHolder;

    public class SocketCamera {

        private static final String LOG_TAG = "SocketCamera:";
        private static final int SOCKET_TIMEOUT = 1000;

        static private SocketCamera socketCamera;
        private CameraCapture capture;
        private Camera parametersCamera;
        private SurfaceHolder surfaceHolder;

        // Set the IP address of your PC here!!
        private final String address = "192.168.1.12";
        private final int port = 9889;

        private final boolean preserveAspectRatio = true;
        private final Paint paint = new Paint();

        private int width = 240;
        private int height = 320;
        private Rect bounds = new Rect(0, 0, width, height);

        private SocketCamera() {
            // Just used so that we can pass Camera.Parameters in getters and setters
            parametersCamera = Camera.open();
        }

        static public SocketCamera open() {
            if (socketCamera == null) {
                socketCamera = new SocketCamera();
            }
            Log.i(LOG_TAG, "Creating Socket Camera");
            return socketCamera;
        }

        public void startPreview() {
            capture = new CameraCapture();
            capture.setCapturing(true);
            capture.start();
            Log.i(LOG_TAG, "Starting Socket Camera");
        }

        public void stopPreview() {
            capture.setCapturing(false);
            Log.i(LOG_TAG, "Stopping Socket Camera");
        }

        public void setPreviewDisplay(SurfaceHolder surfaceHolder) throws IOException {
            this.surfaceHolder = surfaceHolder;
        }

        public void setParameters(Camera.Parameters parameters) {
            // Bit of a hack so the interface looks like that of android.hardware.Camera
            Log.i(LOG_TAG, "Setting Socket Camera parameters");
            parametersCamera.setParameters(parameters);
            Size size = parameters.getPreviewSize();
            bounds = new Rect(0, 0, size.width, size.height);
        }

        public Camera.Parameters getParameters() {
            Log.i(LOG_TAG, "Getting Socket Camera parameters");
            return parametersCamera.getParameters();
        }

        public void release() {
            Log.i(LOG_TAG, "Releasing Socket Camera parameters");
            // TODO need to implement this function
        }

        // Background thread that repeatedly fetches a frame from the
        // broadcaster and draws it onto the preview surface.
        private class CameraCapture extends Thread {

            private boolean capturing = false;

            public boolean isCapturing() {
                return capturing;
            }

            public void setCapturing(boolean capturing) {
                this.capturing = capturing;
            }

            @Override
            public void run() {
                while (capturing) {
                    Canvas c = null;
                    try {
                        c = surfaceHolder.lockCanvas(null);
                        synchronized (surfaceHolder) {
                            Socket socket = null;
                            try {
                                socket = new Socket();
                                socket.bind(null);
                                socket.setSoTimeout(SOCKET_TIMEOUT);
                                socket.connect(new InetSocketAddress(address, port), SOCKET_TIMEOUT);

                                // obtain the bitmap
                                InputStream in = socket.getInputStream();
                                Bitmap bitmap = BitmapFactory.decodeStream(in);

                                // render it to canvas, scaling if necessary
                                if (bounds.right == bitmap.getWidth()
                                        && bounds.bottom == bitmap.getHeight()) {
                                    if (c != null) {
                                        c.drawBitmap(bitmap, 0, 0, null);
                                    }
                                } else {
                                    Rect dest;
                                    if (preserveAspectRatio) {
                                        dest = new Rect(bounds);
                                        dest.bottom = bitmap.getHeight() * bounds.right / bitmap.getWidth();
                                        dest.offset(0, (bounds.bottom - dest.bottom) / 2);
                                    } else {
                                        dest = bounds;
                                    }
                                    if (c != null) {
                                        c.drawBitmap(bitmap, null, dest, paint);
                                    }
                                }
                            } catch (RuntimeException e) {
                                e.printStackTrace();
                            } catch (IOException e) {
                                e.printStackTrace();
                            } finally {
                                try {
                                    socket.close();
                                } catch (IOException e) {
                                    /* ignore */
                                }
                            }
                        }
                    } catch (Exception e) {
                        e.printStackTrace();
                    } finally {
                        // do this in a finally so that if an exception is thrown
                        // during the above, we don't leave the Surface in an
                        // inconsistent state
                        if (c != null) {
                            surfaceHolder.unlockCanvasAndPost(c);
                        }
                    }
                }
                Log.i(LOG_TAG, "Socket Camera capture stopped");
            }
        }
    }
Make sure that you change the IP address to that of your PC.
Now we just need to make a few small modifications to the original CameraPreview. In this class, look for the Preview class that extends SurfaceView. We just need to comment out three lines and replace them with our own:
    class Preview extends SurfaceView implements SurfaceHolder.Callback {
        SurfaceHolder mHolder;
        //Camera mCamera;
        SocketCamera mCamera;

        Preview(Context context) {
            super(context);

            // Install a SurfaceHolder.Callback so we get notified when the
            // underlying surface is created and destroyed.
            mHolder = getHolder();
            mHolder.addCallback(this);
            //mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
            mHolder.setType(SurfaceHolder.SURFACE_TYPE_NORMAL);
        }

        public void surfaceCreated(SurfaceHolder holder) {
            // The Surface has been created, acquire the camera and tell it where
            // to draw.
            //mCamera = Camera.open();
            mCamera = SocketCamera.open();
            try {
                mCamera.setPreviewDisplay(holder);
            } catch (IOException exception) {
                mCamera.release();
                mCamera = null;
                // TODO: add more exception handling logic here
            }
        }
Here I've changed three lines:
1. Camera mCamera; is replaced with SocketCamera mCamera;
2. mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS); is replaced with mHolder.setType(SurfaceHolder.SURFACE_TYPE_NORMAL);
3. mCamera = Camera.open(); is replaced with mCamera = SocketCamera.open();
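The rest of the Preview class can stay essentially as it is in the sample, because SocketCamera mirrors the Camera calls it uses (getParameters, setParameters, startPreview, stopPreview). For example, surfaceChanged, reproduced here roughly as it appears in the CameraPreview sample rather than verbatim, just pushes the new preview size into the camera and starts the preview:

    public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
        // SocketCamera picks the preview size up via setParameters(), which
        // updates the Rect it scales incoming frames into.
        Camera.Parameters parameters = mCamera.getParameters();
        parameters.setPreviewSize(w, h);
        mCamera.setParameters(parameters);
        mCamera.startPreview();
    }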
So that's it. Now just make sure WebcamBroadcaster is running and start up the CameraPreview app in the Android emulator; you should now see live previews in the emulator. Here's a short video of my emulator with the live preview: (yes, I know, it's me waving a book around).
Note: if the WebcamBroadcaster is not picking up your devices, you most probably have a classpath issue. Make sure that your classpath points to the jmf.jar that is in the same folder as the jmf.properties file. If JMStudio works OK, it's very likely that you have a classpath issue.
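For reference, a typical invocation on Windows looks something like the line below. The JMF path is only illustrative, so adjust it to wherever jmf.jar and jmf.properties actually live on your machine; the trailing width, height and port arguments are optional and default to 320x240 on port 9889:

    java -cp .;"C:\Program Files\JMF2.1.1e\lib\jmf.jar" com.webcambroadcaster.WebcamBroadcaster 320 240 9889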
Oh, one last thing. I also updated the WebcamBroadcaster so that it can be used with YUV format cameras, so here's the code for that as well:
    package com.webcambroadcaster;

    import java.awt.Dimension;
    import java.awt.image.BufferedImage;
    import java.io.BufferedOutputStream;
    import java.io.DataOutputStream;
    import java.io.IOException;
    import java.io.OutputStream;
    import java.net.ServerSocket;
    import java.net.Socket;
    import java.util.Vector;

    import javax.imageio.ImageIO;
    import javax.media.Buffer;
    import javax.media.CannotRealizeException;
    import javax.media.CaptureDeviceInfo;
    import javax.media.CaptureDeviceManager;
    import javax.media.Format;
    import javax.media.Manager;
    import javax.media.MediaLocator;
    import javax.media.NoDataSourceException;
    import javax.media.NoPlayerException;
    import javax.media.Player;
    import javax.media.control.FrameGrabbingControl;
    import javax.media.format.RGBFormat;
    import javax.media.format.VideoFormat;
    import javax.media.format.YUVFormat;
    import javax.media.protocol.CaptureDevice;
    import javax.media.protocol.DataSource;
    import javax.media.util.BufferToImage;

    /**
     * A disposable class that uses JMF to serve a still sequence captured from a
     * webcam over a socket connection. It doesn't use TCP, it just blindly
     * captures a still, JPEG compresses it, and pumps it out over any incoming
     * socket connection.
     *
     * @author Tom Gibara
     */
    public class WebcamBroadcaster {

        public static boolean RAW = false;

        private static Player createPlayer(int width, int height) {
            try {
                Vector<CaptureDeviceInfo> devices = CaptureDeviceManager.getDeviceList(null);
                for (CaptureDeviceInfo info : devices) {
                    DataSource source;
                    Format[] formats = info.getFormats();
                    for (Format format : formats) {
                        if ((format instanceof RGBFormat)) {
                            RGBFormat rgb = (RGBFormat) format;
                            Dimension size = rgb.getSize();
                            if (size.width != width || size.height != height) continue;
                            if (rgb.getPixelStride() != 3) continue;
                            if (rgb.getBitsPerPixel() != 24) continue;
                            if (rgb.getLineStride() != width * 3) continue;
                            MediaLocator locator = info.getLocator();
                            source = Manager.createDataSource(locator);
                            source.connect();
                            System.out.println("RGB Format Found");
                            ((CaptureDevice) source).getFormatControls()[0].setFormat(rgb);
                        } else if ((format instanceof YUVFormat)) {
                            YUVFormat yuv = (YUVFormat) format;
                            Dimension size = yuv.getSize();
                            if (size.width != width || size.height != height) continue;
                            MediaLocator locator = info.getLocator();
                            source = Manager.createDataSource(locator);
                            source.connect();
                            System.out.println("YUV Format Found");
                            ((CaptureDevice) source).getFormatControls()[0].setFormat(yuv);
                        } else {
                            continue;
                        }
                        return Manager.createRealizedPlayer(source);
                    }
                }
            } catch (IOException e) {
                System.out.println(e.toString());
                e.printStackTrace();
            } catch (NoPlayerException e) {
                System.out.println(e.toString());
                e.printStackTrace();
            } catch (CannotRealizeException e) {
                System.out.println(e.toString());
                e.printStackTrace();
            } catch (NoDataSourceException e) {
                System.out.println(e.toString());
                e.printStackTrace();
            }
            return null;
        }

        public static void main(String[] args) {
            int[] values = new int[args.length];
            for (int i = 0; i < values.length; i++) {
                values[i] = Integer.parseInt(args[i]);
            }

            WebcamBroadcaster wb;
            if (values.length == 0) {
                wb = new WebcamBroadcaster();
            } else if (values.length == 1) {
                wb = new WebcamBroadcaster(values[0]);
            } else if (values.length == 2) {
                wb = new WebcamBroadcaster(values[0], values[1]);
            } else {
                wb = new WebcamBroadcaster(values[0], values[1], values[2]);
            }

            wb.start();
        }

        public static final int DEFAULT_PORT = 9889;
        public static final int DEFAULT_WIDTH = 320;
        public static final int DEFAULT_HEIGHT = 240;

        private final Object lock = new Object();

        private final int width;
        private final int height;
        private final int port;

        private boolean running;

        private Player player;
        private FrameGrabbingControl control;
        private boolean stopping;
        private Worker worker;

        public WebcamBroadcaster(int width, int height, int port) {
            this.width = width;
            this.height = height;
            this.port = port;
        }

        public WebcamBroadcaster(int width, int height) {
            this(width, height, DEFAULT_PORT);
        }

        public WebcamBroadcaster(int port) {
            this(DEFAULT_WIDTH, DEFAULT_HEIGHT, port);
        }

        public WebcamBroadcaster() {
            this(DEFAULT_WIDTH, DEFAULT_HEIGHT, DEFAULT_PORT);
        }

        public void start() {
            synchronized (lock) {
                if (running) return;
                player = createPlayer(width, height);
                if (player == null) {
                    System.err.println("Unable to find a suitable player");
                    return;
                }
                System.out.println("Starting the player");
                player.start();
                control = (FrameGrabbingControl) player.getControl("javax.media.control.FrameGrabbingControl");
                worker = new Worker();
                worker.start();
                System.out.println("Grabbing frames");
                running = true;
            }
        }

        public void stop() throws InterruptedException {
            Worker w;
            synchronized (lock) {
                if (!running) return;
                if (player != null) {
                    control = null;
                    player.stop();
                    player = null;
                }
                stopping = true;
                running = false;
                // keep a local reference so we can still join after clearing the field
                w = worker;
                worker = null;
            }
            try {
                w.join();
            } finally {
                stopping = false;
            }
        }

        private class Worker extends Thread {

            private final int[] data = new int[width * height];

            @Override
            public void run() {
                ServerSocket ss;
                try {
                    ss = new ServerSocket(port);
                } catch (IOException e) {
                    e.printStackTrace();
                    return;
                }

                while (true) {
                    FrameGrabbingControl c;
                    synchronized (lock) {
                        if (stopping) break;
                        c = control;
                    }

                    Socket socket = null;
                    try {
                        socket = ss.accept();

                        Buffer buffer = c.grabFrame();
                        BufferToImage btoi = new BufferToImage((VideoFormat) buffer.getFormat());
                        BufferedImage image = (BufferedImage) btoi.createImage(buffer);

                        if (image != null) {
                            OutputStream out = socket.getOutputStream();
                            if (RAW) {
                                image.getWritableTile(0, 0).getDataElements(0, 0, width, height, data);
                                image.releaseWritableTile(0, 0);
                                DataOutputStream dout = new DataOutputStream(new BufferedOutputStream(out));
                                for (int i = 0; i < data.length; i++) {
                                    dout.writeInt(data[i]);
                                }
                                dout.close();
                            } else {
                                ImageIO.write(image, "JPEG", out);
                            }
                        }

                        socket.close();
                        socket = null;
                    } catch (IOException e) {
                        e.printStackTrace();
                    } finally {
                        if (socket != null)
                            try {
                                socket.close();
                            } catch (IOException e) {
                                /* ignore */
                            }
                    }
                }

                try {
                    ss.close();
                } catch (IOException e) {
                    /* ignore */
                }
            }
        }
    }
Thanks a lot, it works for me and it's really useful. I have also written the takePicture methods for SocketCamera. They are on my blog, but it's in Italian, so in case any English-speaking reader is interested I'll paste the methods here. (Maybe not the best code, but it works.)
    // Test implementation of takePicture
    // (requires an extra import in SocketCamera: java.io.ByteArrayOutputStream)
    public final void takePicture(Camera.ShutterCallback shutter,
            Camera.PictureCallback raw, Camera.PictureCallback jpeg) {
        takePicture(shutter, raw, null, jpeg);
    }

    public final void takePicture(Camera.ShutterCallback shutter,
            Camera.PictureCallback raw, Camera.PictureCallback postview,
            Camera.PictureCallback jpeg) {
        stopPreview();
        try {
            Socket socket = null;
            try {
                socket = new Socket();
                socket.bind(null);
                socket.setSoTimeout(SOCKET_TIMEOUT);
                socket.connect(new InetSocketAddress(address, port), SOCKET_TIMEOUT);
                if (shutter != null) shutter.onShutter();

                // obtain the bitmap from the broadcaster
                InputStream in = socket.getInputStream();
                Bitmap bitmap = BitmapFactory.decodeStream(in);

                // compress the bitmap into a byte array to hand to the callbacks
                ByteArrayOutputStream baos = new ByteArrayOutputStream();
                bitmap.compress(Bitmap.CompressFormat.PNG, 100, baos);
                byte[] b = baos.toByteArray();

                // invoke the callbacks
                if (raw != null) raw.onPictureTaken(b, null);
                if (postview != null) postview.onPictureTaken(b, null);
                if (jpeg != null) jpeg.onPictureTaken(b, null);
            } catch (RuntimeException e) {
                e.printStackTrace();
            } catch (IOException e) {
                e.printStackTrace();
            } finally {
                try {
                    socket.close();
                } catch (IOException e) {
                    /* ignore */
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
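If you want to try these methods from the Preview class, a call would look something like the sketch below. It's just an illustration of wiring up a Camera.PictureCallback; note that whichever callback receives it, the byte array handed back by the code above is PNG-compressed.

    mCamera.takePicture(null, null, new Camera.PictureCallback() {
        public void onPictureTaken(byte[] data, Camera camera) {
            // 'data' holds the compressed image produced by SocketCamera.takePicture()
            Bitmap picture = BitmapFactory.decodeByteArray(data, 0, data.length);
            // ...display or save 'picture', then call mCamera.startPreview() if you
            // want the live preview to resume (takePicture() stops it first)
        }
    });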
@Alejandro: I don't have your problems running this code, but:
1. If I remember right, 127.0.0.1 is the loopback address of the emulator itself; if you want to connect to your computer you must use the computer's explicit IP address, or 10.0.2.2 (the emulator's alias for the host machine; see the snippet below and the emulator networking notes on the Android SDK pages).
2. An IllegalAccessException points me towards the permissions of your code; check the AndroidManifest to see whether you have added permissions for both CAMERA and INTERNET (android.permission.CAMERA and android.permission.INTERNET).
Beyond that I can't help, sorry.
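To put point 1 into code: if the WebcamBroadcaster is running on the same machine as the emulator, the address field in SocketCamera can use the emulator's special host alias rather than a LAN address. This is just the existing field from the class above with a different value:

    // 10.0.2.2 is the emulator's alias for the host machine's loopback interface,
    // so this reaches a WebcamBroadcaster running on the development PC itself.
    private final String address = "10.0.2.2";
    private final int port = 9889;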