Getting video stream from Android’s display

Capturing the screen is something people have tried to achieve in various ways. The usual approach is to take screenshots at regular intervals and stitch them together into a video. What I’m doing here is quite different and much better than that.
So here I present a way to capture video frames from Android’s default display and process them further however you please.
I’ll broadly use two main APIs: MediaCodec (added in API level 16) and DisplayManager (added in API level 17; its createVirtualDisplay() method was added in API level 19). This limits our app to a minimum API level of 19, which is KitKat. Furthermore, if you want to mirror the output of secure windows as well, you’ll have to push your APK to /system/priv-app/, which requires root access on your phone.
My logic is:
  • Create a video encoder.
  • Get an input `Surface` from the encoder using `createInputSurface()` method of the encoder object.
  • Pass this surface to the DisplayManager so that it routes its output to this surface.
  • Use the dequeueOutputBuffer() method, which returns the H.264-encoded frames of your video.
  • If you want raw video frames, you can further pass these AVC-encoded frames to a video decoder and get the raw frames. However, I’m not covering that in this blog post.
Let’s start with the code:
First we need to create an encoder, configure it and get an input surface out of it.


MediaFormat mMediaFormat = MediaFormat.createVideoFormat("video/avc", 1280, 720);
//For a Nexus 5, the max bit-rate is 4 Mbps
mMediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 3000000);
//For a Nexus 5, the supported frame-rate range is 15 to 30
mMediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
//COLOR_FormatSurface tells the codec that its input will come from a Surface
mMediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, 
    MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
mMediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 10);
Log.i(TAG, "Starting encoder");
//CodecUtils.selectCodec() is a small helper (not shown here) that picks an
//encoder supporting our MIME type ("video/avc") from the device's codec list
encoder = MediaCodec.createByCodecName(CodecUtils.selectCodec(CodecUtils.MIME_TYPE).getName());
encoder.configure(mMediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
//The input Surface must be created after configure() and before start()
Surface surface = encoder.createInputSurface();

encoder.start();
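A side note on the parameters above: KEY_I_FRAME_INTERVAL is specified in seconds, so together with KEY_FRAME_RATE it determines how many frames apart the key frames land. A tiny illustration (the class and method names below are hypothetical, not part of the post):

```java
public class EncoderMath {
    /** Number of frames between consecutive I-frames. */
    public static int framesBetweenKeyFrames(int frameRate, int iFrameIntervalSec) {
        return frameRate * iFrameIntervalSec;
    }

    public static void main(String[] args) {
        // With the configuration above: 15 fps and a 10-second I-frame interval
        System.out.println(framesBetweenKeyFrames(15, 10)); // prints 150
    }
}
```

A larger interval means fewer key frames and better compression, but seeking (and joining a live stream) becomes slower, since decoding can only start at a key frame.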
We will then pass the surface created above to the createVirtualDisplay() method.
DisplayManager mDisplayManager = (DisplayManager) mContext.getSystemService(Context.DISPLAY_SERVICE);
//VIRTUAL_DISPLAY_FLAG_PUBLIC lets the new display mirror the default display's
//content; VIRTUAL_DISPLAY_FLAG_SECURE additionally shows secure windows, which
//needs the system-level privileges discussed above (/system/priv-app/)
mDisplayManager.createVirtualDisplay("OpenCV Virtual Display", 960, 1280, 150, surface,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_PUBLIC | DisplayManager.VIRTUAL_DISPLAY_FLAG_SECURE);
The DisplayManager keeps drawing the contents of the Android screen onto our virtual display, which in turn feeds them to the video encoder through the surface.
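Note that the virtual display above is 960x1280 while the encoder was configured for 1280x720; when the two aspect ratios differ, the mirrored image gets scaled to fit. A hypothetical helper (plain Java, not from the original post) for choosing virtual-display dimensions that fit the encoder frame while preserving the screen’s aspect ratio might look like this:

```java
public class DisplaySizeUtils {
    /**
     * Scales (srcWidth x srcHeight) to the largest size that fits inside
     * (maxWidth x maxHeight) while preserving aspect ratio.
     * Returns {width, height}.
     */
    public static int[] scaleToFit(int srcWidth, int srcHeight,
                                   int maxWidth, int maxHeight) {
        double scale = Math.min((double) maxWidth / srcWidth,
                                (double) maxHeight / srcHeight);
        return new int[] {
            (int) Math.round(srcWidth * scale),
            (int) Math.round(srcHeight * scale)
        };
    }

    public static void main(String[] args) {
        // A 1080x1920 portrait screen fitted into a 1280x720 encoder frame
        int[] size = scaleToFit(1080, 1920, 1280, 720);
        System.out.println(size[0] + "x" + size[1]); // prints "405x720"
    }
}
```

You would then pass the resulting width and height to both createVideoFormat() and createVirtualDisplay() so the encoder receives an unstretched image.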

final int TIMEOUT_USEC = 10000;
ByteBuffer[] encoderOutputBuffers = encoder.getOutputBuffers();
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
while (!encoderDone) {
    int encoderStatus = encoder.dequeueOutputBuffer(info, TIMEOUT_USEC);
    if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
        // no output available yet
        if (VERBOSE) Log.d(TAG, "no output from encoder available");
    } else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
        // not expected for an encoder
        encoderOutputBuffers = encoder.getOutputBuffers();
        if (VERBOSE) Log.d(TAG, "encoder output buffers changed");
    } else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
        // happens once, before any encoded data, and carries the
        // codec-specific data (SPS/PPS) needed by muxers such as MediaMuxer
        MediaFormat newFormat = encoder.getOutputFormat();
        if (VERBOSE) Log.d(TAG, "encoder output format changed: " + newFormat);
    } else {
        ByteBuffer encodedData = encoderOutputBuffers[encoderStatus];
        if (encodedData == null) {
            //something's wrong with the encoder
            break;
        }
        // Adjust the ByteBuffer position/limit to match BufferInfo
        encodedData.position(info.offset);
        encodedData.limit(info.offset + info.size);

        // ... consume encodedData here (write to a file, stream it, etc.),
        // then return the buffer to the codec
        encoder.releaseOutputBuffer(encoderStatus, false);
    }
}
The variable `encodedData` holds one AVC-encoded frame and is updated on every iteration of the loop.
Now it’s up to you what you do with it. You can write it to a file, or you can stream it over the network (though that requires quite a bit of extra work).
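For example, if you dump the buffers to a file, each one contains Annex-B formatted NAL units delimited by start codes (0x000001 or 0x00000001). A small helper for locating those start codes, handy when splitting or inspecting the raw stream, could look like this (plain Java; the class and method are hypothetical, not part of the original post):

```java
import java.util.ArrayList;
import java.util.List;

public class NalUtils {
    /**
     * Returns the byte offsets of every Annex-B start code
     * (0x00 0x00 0x01 or 0x00 0x00 0x00 0x01) in the buffer.
     */
    public static List<Integer> findStartCodes(byte[] data) {
        List<Integer> offsets = new ArrayList<>();
        for (int i = 0; i + 3 <= data.length; i++) {
            if (data[i] == 0 && data[i + 1] == 0) {
                if (data[i + 2] == 1) {
                    offsets.add(i);          // 3-byte start code
                } else if (data[i + 2] == 0 && i + 4 <= data.length
                           && data[i + 3] == 1) {
                    offsets.add(i);          // 4-byte start code
                    i++;                     // skip the extra zero byte
                }
            }
        }
        return offsets;
    }
}
```

Also note that buffers whose BufferInfo.flags contain BUFFER_FLAG_CODEC_CONFIG hold the SPS/PPS headers rather than picture data; a decoder needs those before it can decode anything else.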

Arpit

Thank you for reading this post. Please share it with your friends.

2 comments:

  1. Is DisplayManager added in API level 17 or 19?
    http://developer.android.com/reference/android/hardware/display/DisplayManager.html

    Thanks a lot for the help, icc
  2. Just found that the createVirtualDisplay() method was added in API level 19:
    http://developer.android.com/reference/android/hardware/display/DisplayManager.html#createVirtualDisplay(java.lang.String, int, int, int, android.view.Surface, int, android.hardware.display.VirtualDisplay.Callback, android.os.Handler)

    I’m trying to do exactly what this article does, but I only have an Android 4.3 BSP environment. Is there another way to do this on 4.3?
    Thanks a lot, icc