I want to encode video using the "Intel® Quick Sync Video H.264 Encoder MFT".
If I create the IMFSample from a system-memory buffer, it works well, like this:
IMFMediaBuffer *pBuffer = NULL;
MFCreateMemoryBuffer(cbSize, &pBuffer);

BYTE *pData = NULL;
pBuffer->Lock(&pData, NULL, NULL);
memcpy(pData, pSourceFrame, cbSize);   // pSourceFrame holds the frame in my YUV input format
pBuffer->Unlock();
pBuffer->SetCurrentLength(cbSize);     // mark how many bytes of the buffer are valid

IMFSample *pSample = NULL;
MFCreateSample(&pSample);
pSample->AddBuffer(pBuffer);
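The sample is then timestamped and handed to the encoder roughly like this (pEncoder, rtStart and rtDuration are placeholder names, not part of the snippet above):

// Sketch: timestamp the sample and submit it to the encoder MFT.
pSample->SetSampleTime(rtStart);        // presentation time in 100-ns units
pSample->SetSampleDuration(rtDuration); // e.g. 333333 for ~30 fps
HRESULT hr = pEncoder->ProcessInput(0, pSample, 0);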
Now I'm investigating whether I can feed it ID3D11Texture2D surfaces as input (DXGI_FORMAT_NV12, 1280x720) to improve performance. I tried passing IMFSample instances created with MFCreateVideoSampleFromSurface or MFCreateDXGISurfaceBuffer to IMFTransform::ProcessInput, and ran several experiments with different texture creation flags, roughly as sketched below. The best result so far is that all input samples are accepted, but no output samples are ever produced. In case it matters, I never actually uploaded data to the textures, assuming that textures filled with garbage pixel data would behave no differently for this purpose.
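For reference, the DXGI-buffer variant looks roughly like this (pD3DDevice and pEncoder are placeholders for my ID3D11Device and the encoder's IMFTransform; the bind flags shown are just one of the combinations I tried):

// Sketch of the D3D11 texture path (one of several experiments).
D3D11_TEXTURE2D_DESC desc = {};
desc.Width = 1280;
desc.Height = 720;
desc.MipLevels = 1;
desc.ArraySize = 1;
desc.Format = DXGI_FORMAT_NV12;
desc.SampleDesc.Count = 1;
desc.Usage = D3D11_USAGE_DEFAULT;
desc.BindFlags = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;

ID3D11Texture2D *pTexture = NULL;
pD3DDevice->CreateTexture2D(&desc, NULL, &pTexture);

// Wrap the texture in a media buffer and a sample...
IMFMediaBuffer *pDxgiBuffer = NULL;
MFCreateDXGISurfaceBuffer(__uuidof(ID3D11Texture2D), pTexture, 0, FALSE, &pDxgiBuffer);

IMFSample *pDxgiSample = NULL;
MFCreateSample(&pDxgiSample);
pDxgiSample->AddBuffer(pDxgiBuffer);
// ...or, in the other variant, MFCreateVideoSampleFromSurface(pTexture, &pDxgiSample).

// ProcessInput accepts these samples, but ProcessOutput never delivers anything.
pEncoder->ProcessInput(0, pDxgiSample, 0);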
Am I doing something wrong?