Not able to play Android H.264-encoded video in VLC


I am using MediaCodec to encode the Android camera stream to H.264 and then serving it as an RTSP stream via GStreamer's gst-rtsp-server. I can play the stream with a GStreamer client pipeline, but not with VLC. Can anyone help me get the stream playing in VLC?

This is the MediaCodec code:

final int TIMEOUT_USEC = 10000;     // 10 ms timeout when dequeuing output buffers
try {
    ByteBuffer[] encoderOutputBuffers = encoder.getOutputBuffers();
    bufferInfo = new MediaCodec.BufferInfo();

    while (true) {
        int encoderStatus = encoder.dequeueOutputBuffer(bufferInfo, TIMEOUT_USEC);
        if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
            // no output available yet
            break;
        } else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
            // not expected for an encoder
            encoderOutputBuffers = encoder.getOutputBuffers();
        } else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            // Should happen before receiving buffers, and should only happen once.
            // The MediaFormat contains the csd-0 (SPS) and csd-1 (PPS) keys, which
            // we'd need for MediaMuxer.  It's unclear what else MediaMuxer might
            // want, so rather than extract the codec-specific data and reconstruct
            // a new MediaFormat later, we just grab it here and keep it around.

            if (muxerStarted) {
                throw new RuntimeException("format changed twice");
            }

            muxerStarted = true;
        } else if (encoderStatus < 0) {
            Log.w(TAG, "unexpected result from encoder.dequeueOutputBuffer: " + encoderStatus);
            // let's ignore it
        } else {
            ByteBuffer encodedData = encoderOutputBuffers[encoderStatus];
            if (encodedData == null) {
                throw new RuntimeException("encoderOutputBuffer " + encoderStatus +
                        " was null");
            }

            if (bufferInfo.size != 0) {
                encodedData.position(bufferInfo.offset);
                encodedData.limit(bufferInfo.offset + bufferInfo.size);

                byte[] outData = new byte[bufferInfo.size];
                encodedData.get(outData);

                if (spsPpsInfo == null) {
                    // The first buffer carries the SPS/PPS config data and starts
                    // with an Annex-B start code (0x00000001); cache it for later.
                    ByteBuffer spsPpsBuffer = ByteBuffer.wrap(outData);
                    if (spsPpsBuffer.getInt() == 0x00000001) {
                        spsPpsInfo = new byte[outData.length];
                        System.arraycopy(outData, 0, spsPpsInfo, 0, outData.length);
                    } else {
                        return;
                    }
                } else {
                    outputStream.write(outData);
                }
            }

            encoder.releaseOutputBuffer(encoderStatus, false);

            if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                Log.w(TAG, "reached end of stream unexpectedly");
                break;      // out of while
            }
        }
    }

    byte[] ret = outputStream.toByteArray();

    // 0x65 is the first byte of an IDR NAL unit (nal_ref_idc = 3, type = 5),
    // i.e. a keyframe: prepend the cached SPS/PPS so every keyframe is decodable.
    if (ret.length > 5 && ret[4] == 0x65) {
        Log.d(TAG, "----> Setting SPS PPS info");
        byte[] concBuffer = new byte[spsPpsInfo.length + ret.length];
        System.arraycopy(spsPpsInfo, 0, concBuffer, 0, spsPpsInfo.length);
        System.arraycopy(ret, 0, concBuffer, spsPpsInfo.length, ret.length);

        outputStream.reset();
        outputStream.write(concBuffer);
    }
} catch (Throwable t) {
    t.printStackTrace();
}

byte[] ret = outputStream.toByteArray();
outputStream.reset();

// Log.d(TAG, "Got buffer with size " + ret.length + " and needData " + needData);

if (needData == 1 && ret.length != 0) {
    if (streamMode == Native.STREAM_MODE_RTP) {
        // Log.d(TAG, "Sending buffer to RTP pipeline with size " + ret.length);
        Native.nativeRTPAddStream(ret, ret.length, bufferInfo.presentationTimeUs, native_custom);
    } else if (streamMode == Native.STREAM_MODE_RTSP) {
        // Log.d(TAG, "Sending buffer to RTSP pipeline with size " + ret.length);
        Native.nativeRTSPAddStream(ret, ret.length, bufferInfo.presentationTimeUs, native_custom);
    }
}
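
The native side (Native.nativeRTSPAddStream) is not shown in the question. Presumably it wraps the bytes in a GstBuffer and pushes them into the appsrc of the pipeline below; a minimal sketch of what such a push usually looks like is given here, with the function name push_encoded_frame and the appsrc handle being assumptions rather than the asker's actual code:

#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

/* Hypothetical native push: wrap the encoded Annex-B bytes in a GstBuffer
 * and hand them to the appsrc named "camsrc" in the pipeline below. */
static void
push_encoded_frame (GstElement *appsrc, const guint8 *data, gsize size,
                    gint64 presentation_time_us)
{
    GstBuffer *buffer = gst_buffer_new_allocate (NULL, size, NULL);
    gst_buffer_fill (buffer, 0, data, size);

    /* microseconds -> nanoseconds; this is where the camera timestamp
     * from bufferInfo.presentationTimeUs ends up on the GstBuffer */
    GST_BUFFER_PTS (buffer) = presentation_time_us * GST_USECOND;

    /* gst_app_src_push_buffer takes ownership of the buffer */
    if (gst_app_src_push_buffer (GST_APP_SRC (appsrc), buffer) != GST_FLOW_OK)
        g_warning ("push to appsrc failed");
}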

This is the GStreamer pipeline:

appsrc name=camsrc ! h264parse ! rtph264pay name=pay0 pt=96

gst-rtsp-server is attached to the end of the above pipeline.
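
For context, wiring that launch line into gst-rtsp-server typically looks like the sketch below. This is a generic example rather than the asker's actual native code; the /test mount point and the default port are assumptions:

#include <gst/gst.h>
#include <gst/rtsp-server/rtsp-server.h>

int main (int argc, char *argv[])
{
    gst_init (&argc, &argv);

    GMainLoop *loop = g_main_loop_new (NULL, FALSE);
    GstRTSPServer *server = gst_rtsp_server_new ();
    GstRTSPMountPoints *mounts = gst_rtsp_server_get_mount_points (server);
    GstRTSPMediaFactory *factory = gst_rtsp_media_factory_new ();

    /* the same launch line as above; gst-rtsp-server wraps it in a bin */
    gst_rtsp_media_factory_set_launch (factory,
        "( appsrc name=camsrc ! h264parse ! rtph264pay name=pay0 pt=96 )");
    gst_rtsp_media_factory_set_shared (factory, TRUE);

    gst_rtsp_mount_points_add_factory (mounts, "/test", factory);
    g_object_unref (mounts);

    gst_rtsp_server_attach (server, NULL);
    g_main_loop_run (loop);   /* serves rtsp://<host>:8554/test */
    return 0;
}

The appsrc named camsrc would then be looked up in the factory's media-configure callback so that the encoded buffers from the Java side can be pushed into it.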

android · video-streaming · gstreamer · vlc
asked on Stack Overflow May 5, 2018 by Kaustav

1 Answer


The problem is with the buffer timestamps (bufferInfo.presentationTimeUs). The timestamps coming from the Android camera are wrong, so you will need to re-timestamp each buffer before it is pushed into the GStreamer pipeline.

One way to do this is to intercept each buffer in the pipeline using the identity element (a sketch of this interception follows the formula below). Another is to correct the timestamp in Android itself, by manually calculating the timestamp for each buffer.

The retimestamping formula in GStreamer looks something like this:

GstClockTime time = gst_util_uint64_scale_int (1, GST_SECOND, absCurFps);  /* one frame duration */
GST_BUFFER_PTS (buffer) = GST_BUFFER_DTS (buffer) += time;                 /* advance DTS by one frame, keep PTS equal to it */
GST_BUFFER_DURATION (buffer) = GST_SECOND / absCurFps;

In effect this assumes each buffer holds exactly one frame and advances each buffer's timestamp by one frame duration, i.e. the new timestamp is the previous one plus 1 / absCurFps seconds.
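
A minimal sketch of the interception approach follows. It uses a buffer pad probe instead of identity's handoff signal, since a probe makes it straightforward to swap in a writable buffer; the probe placement, the assumption of one frame per buffer, and the frame rate of 30 are all placeholders:

#include <gst/gst.h>

#define ABS_CUR_FPS 30   /* assumed constant frame rate; use the real capture rate */

/* Rewrites PTS/DTS/duration of every buffer passing the probed pad. */
static GstPadProbeReturn
retimestamp_probe (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
    static GstClockTime running_ts = 0;
    GstClockTime frame_dur = gst_util_uint64_scale_int (1, GST_SECOND, ABS_CUR_FPS);
    GstBuffer *buffer = GST_PAD_PROBE_INFO_BUFFER (info);

    buffer = gst_buffer_make_writable (buffer);
    running_ts += frame_dur;
    GST_BUFFER_PTS (buffer) = running_ts;
    GST_BUFFER_DTS (buffer) = running_ts;
    GST_BUFFER_DURATION (buffer) = frame_dur;

    GST_PAD_PROBE_INFO_DATA (info) = buffer;   /* hand back the (possibly copied) buffer */
    return GST_PAD_PROBE_OK;
}

/* Attach it, for example, to the appsrc's src pad:
 *   GstPad *pad = gst_element_get_static_pad (appsrc, "src");
 *   gst_pad_add_probe (pad, GST_PAD_PROBE_TYPE_BUFFER, retimestamp_probe, NULL, NULL);
 *   gst_object_unref (pad);
 */

The equivalent fix on the Android side would be to overwrite bufferInfo.presentationTimeUs with a manually computed value (frame index times the frame duration in microseconds) before the buffer is handed to the native code.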

answered on Stack Overflow May 14, 2018 by Crearo Rotar

User contributions licensed under CC BY-SA 3.0