I have two cameras, listed below, that I am trying to use in a Media Foundation topology. Here is a summary of my topology:
Webcam --> MJPG Decoder --> Custom MFT --> H264 Encoder --> MP4 File Sink
The problem with this is that the generated MP4 file has incorrect duration and time scale tags, both for the MP4 container and the H264 stream. I can easily correct this with a tool like MP4Box or YAMB, but my eventual goal is to stream the video.
One potential cause I have identified is that the samples generated by the camera sources do not start at time 0. According to bullet #2 in http://msdn.microsoft.com/en-us/library/windows/desktop/ms700134(v=vs.85).aspx#live_sources, timestamps of a live source should start at 0.
Along this line, I've tried the following to "correct" the sample timestamps: calling IMFSample::SetSampleTime on each sample, and sending MEMediaSample and MEStreamTick events. In both cases the media session throws error 0xC00D4A44 (MF_E_SINK_NO_SAMPLES_PROCESSED), and the resulting MP4 file ends abruptly after the "mdat" atom declaration.
Cameras
Systems used (both have same issue):
Questions:
Try resetting the MFSampleExtension_Discontinuity flag on every sample:
pSample->SetUINT32( MFSampleExtension_Discontinuity, FALSE );