
directshow - Can't make IAMStreamConfig.SetFormat() work with LifeCam Studio

I'm brand new to DirectShow and am working on adding a video stream to my application. I've looked into many solutions out there (TouchLess, DirectShow.net, etc.) and ended up going with this small project on Code Project. There isn't much to it, which is why I selected it; I wanted a small code base to work with, as I need to get this feature implemented quickly.

After a solid day of reading, experimenting, and debugging, I finally have everything working nicely. There is a delay, which is a bummer, but I can worry about that later. The issue at this point is that the camera is capable of 1280x720 and I want to use that resolution; however, it seems determined to capture at 640x480. As I dug deeper and deeper into how to set the resolution, I finally thought I had it figured out. I also found code in the comments on that Code Project page that I used as a base.

After six hours of trying, I cannot get this camera to change its resolution. I'm not receiving any errors, and the HRESULT returned from SetFormat() is S_OK (0), yet the camera will not change resolution.

There is too much code to paste everything, but I would like to include the section that builds up the graph as I imagine that's where the problem is.

Here is the code that sets up the graph:

void CameraMethods::StartCamera(int camIndex, interior_ptr<int> width, 
    interior_ptr<int> height)
{
    if (g_pGraphBuilder != NULL)
        throw gcnew ArgumentException("Graph Builder already initialized");

    IMoniker *pMoniker = GetMoniker(camIndex);
    pMoniker->AddRef();

    HRESULT hr = S_OK;

    // Build all the necessary interfaces to start the capture
    if (SUCCEEDED(hr))
    {
        hr = CoCreateInstance(CLSID_FilterGraph,
            NULL,
            CLSCTX_INPROC,
            IID_IGraphBuilder,
            (LPVOID*)&g_pGraphBuilder);
    }

    if (SUCCEEDED(hr))
        hr = g_pGraphBuilder->QueryInterface(IID_IMediaControl, (LPVOID*)&g_pMediaControl);

    if (SUCCEEDED(hr))
    {
        hr = CoCreateInstance(CLSID_CaptureGraphBuilder2,
            NULL,
            CLSCTX_INPROC,
            IID_ICaptureGraphBuilder2,
            (LPVOID*)&g_pCaptureGraphBuilder);
    }

    // Setup the filter graph
    if (SUCCEEDED(hr))
        hr = g_pCaptureGraphBuilder->SetFiltergraph(g_pGraphBuilder);

    // Build the camera from the moniker
    if (SUCCEEDED(hr))
        hr = pMoniker->BindToObject(NULL, NULL, IID_IBaseFilter, (LPVOID*)&g_pIBaseFilterCam);

    // Add the camera to the filter graph
    if (SUCCEEDED(hr))
        hr = g_pGraphBuilder->AddFilter(g_pIBaseFilterCam, L"WebCam");

    // Create a SampleGrabber
    if (SUCCEEDED(hr))
        hr = CoCreateInstance(CLSID_SampleGrabber, NULL, CLSCTX_INPROC_SERVER, IID_IBaseFilter, 
            (void**)&g_pIBaseFilterSampleGrabber);

    // Configure the Sample Grabber
    if (SUCCEEDED(hr))
        hr = ConfigureSampleGrabber(g_pIBaseFilterSampleGrabber);

    //  Set the resolution - I have NO idea where this should be executed
    SetCaptureFormat(camIndex, *width, *height);

    // Add Sample Grabber to the filter graph
    if (SUCCEEDED(hr))
        hr = g_pGraphBuilder->AddFilter(g_pIBaseFilterSampleGrabber, L"SampleGrabber");

    // Create the NullRender
    if (SUCCEEDED(hr))
        hr = CoCreateInstance(CLSID_NullRenderer, NULL, CLSCTX_INPROC_SERVER, IID_IBaseFilter, 
            (void**)&g_pIBaseFilterNullRenderer);

    // Add the Null Render to the filter graph
    if (SUCCEEDED(hr))
        hr = g_pGraphBuilder->AddFilter(g_pIBaseFilterNullRenderer, L"NullRenderer");

    // Configure the render stream
    if (SUCCEEDED(hr))
        hr = g_pCaptureGraphBuilder->RenderStream(&PIN_CATEGORY_CAPTURE, &MEDIATYPE_Video, 
            g_pIBaseFilterCam, g_pIBaseFilterSampleGrabber, g_pIBaseFilterNullRenderer);

    // Grab the capture width and height
    if (SUCCEEDED(hr))
    {
        ISampleGrabber* pGrabber = NULL;
        hr = g_pIBaseFilterSampleGrabber->QueryInterface(IID_ISampleGrabber, (LPVOID*)&pGrabber);
        if (SUCCEEDED(hr))
        {
            AM_MEDIA_TYPE mt;
            hr = pGrabber->GetConnectedMediaType(&mt);
            if (SUCCEEDED(hr))
            {
                VIDEOINFOHEADER *pVih;
                if ((mt.formattype == FORMAT_VideoInfo) &&
                    (mt.cbFormat >= sizeof(VIDEOINFOHEADER)) &&
                    (mt.pbFormat != NULL) )
                {
                    pVih = (VIDEOINFOHEADER*)mt.pbFormat;
                    *width = pVih->bmiHeader.biWidth;
                    *height = pVih->bmiHeader.biHeight;
                }
                else
                {
                    hr = E_FAIL;  // Wrong format
                }

                // FreeMediaType(mt); (from MSDN)
                if (mt.cbFormat != 0)
                {
                    CoTaskMemFree((PVOID)mt.pbFormat);
                    mt.cbFormat = 0;
                    mt.pbFormat = NULL;
                }
                if (mt.pUnk != NULL)
                {
                    // Unnecessary because pUnk should not be used, but safest.
                    mt.pUnk->Release();
                    mt.pUnk = NULL;
                }
            }
        }

        if (pGrabber != NULL)
        {
            pGrabber->Release();
            pGrabber = NULL;
        }
    }

    // Start the capture
    if (SUCCEEDED(hr))
        hr = g_pMediaControl->Run();

    // If init fails then ensure that you cleanup
    if (FAILED(hr))
        StopCamera();
    else
        hr = S_OK;  // Make sure we return S_OK for success

    // Cleanup
    if (pMoniker != NULL)
    {
        pMoniker->Release();
        pMoniker = NULL;
    }

    if (SUCCEEDED(hr))
        this->activeCameraIndex = camIndex;
    else
        throw gcnew COMException("Error Starting Camera", hr);
}

[UPDATE] Added the ConfigureSampleGrabber() method below

HRESULT CameraMethods::ConfigureSampleGrabber(IBaseFilter *pIBaseFilter)
{
    HRESULT hr = S_OK;
    ISampleGrabber *pGrabber = NULL;

    hr = pIBaseFilter->QueryInterface(IID_ISampleGrabber, (void**)&pGrabber);
    if (SUCCEEDED(hr))
    {
        AM_MEDIA_TYPE mt;
        ZeroMemory(&mt, sizeof(AM_MEDIA_TYPE));
        mt.majortype = MEDIATYPE_Video;
        mt.subtype = MEDIASUBTYPE_RGB24;
        mt.formattype = FORMAT_VideoInfo;
        hr = pGrabber->SetMediaType(&mt);
    }

    if (SUCCEEDED(hr))
        hr = pGrabber->SetCallback(new SampleGrabberCB(), 1);

    if (pGrabber != NULL)
    {
        pGrabber->Release();
        pGrabber = NULL;
    }

    return hr;
}

That is pretty much the exact code from the CodeProject source code. I then added this method to set the resolution:

void CameraMethods::SetCaptureFormat(int camIndex, int width, int height)
{
    HRESULT hr = S_OK;
    IMoniker* pMoniker = GetMoniker(camIndex);

    IBaseFilter* pCap = NULL;
    hr = pMoniker->BindToObject(0, 0, IID_IBaseFilter, (void**)&pCap);

    if (FAILED(hr))
        return;

    IAMStreamConfig *pConfig = NULL;

    if (g_pCaptureGraphBuilder == NULL)  // no CaptureGraphBuilder initialised
        return;

    hr = g_pCaptureGraphBuilder->FindInterface(
        &PIN_CATEGORY_CAPTURE, // Capture pin.
        0,    // Any media type.
        pCap, // Pointer to the capture filter.
        IID_IAMStreamConfig, (void**)&pConfig);

    if (FAILED(hr))
        return;

    int iCount = 0, iSize = 0;
    hr = pConfig->GetNumberOfCapabilities(&iCount, &iSize);

    // Check the size to make sure we pass in the correct structure.
    if (iSize == sizeof(VIDEO_STREAM_CONFIG_CAPS)) {

        // Use the video capabilities structure.
        for (int iFormat = 0; iFormat < iCount; iFormat++)
        {
            VIDEO_STREAM_CONFIG_CAPS scc;
            AM_MEDIA_TYPE *pmt;
            /* Note:  Use of the VIDEO_STREAM_CONFIG_CAPS structure to configure a video device is 
            deprecated. Although the caller must allocate the buffer, it should ignore the 
            contents after the method returns. The capture device will return its supported 
            formats through the pmt parameter. */
            hr = pConfig->GetStreamCaps(iFormat, &pmt, (BYTE*)&scc);
            if (SUCCEEDED(hr))
            {
                /* Examine the format, and possibly use it. */
                if (pmt->formattype == FORMAT_VideoInfo) {
                    // Check the buffer size.
                    if (pmt->cbFormat >= sizeof(VIDEOINFOHEADER))
                    {
                        VIDEOINFOHEADER *pVih =  reinterpret_cast<VIDEOINFOHEADER*>(pmt->pbFormat);
                        BITMAPINFOHEADER *bmiHeader = &pVih->bmiHeader;

                        /* Access VIDEOINFOHEADER members through pVih. */
                        if( bmiHeader->biWidth == width && bmiHeader->biHeight == height && 
                            bmiHeader->biBitCount == 24)
                        {
                            hr = pConfig->SetFormat(pmt);
                        }
                    }
                }

                // Delete the media type when you are done.
                DeleteMediaType(pmt);
            }
        }
    }

    // Release the local COM references acquired above
    if (pConfig != NULL)
        pConfig->Release();
    if (pCap != NULL)
        pCap->Release();
    if (pMoniker != NULL)
        pMoniker->Release();
}

I've stepped through the code and verified that the call to SetFormat() is executed and returns a valid HRESULT. No change to the captured frames, though.
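
For what it's worth, one sanity check I could add (a minimal sketch, assuming pConfig is still in scope right after the SetFormat() call in SetCaptureFormat) is to read the format back with IAMStreamConfig::GetFormat and see what the pin reports:

AM_MEDIA_TYPE *pmtCurrent = NULL;
if (SUCCEEDED(pConfig->GetFormat(&pmtCurrent)))
{
    if (pmtCurrent->formattype == FORMAT_VideoInfo && pmtCurrent->pbFormat != NULL)
    {
        VIDEOINFOHEADER *pVih = reinterpret_cast<VIDEOINFOHEADER*>(pmtCurrent->pbFormat);
        // Inspect pVih->bmiHeader.biWidth / biHeight in the debugger here;
        // this is the format the pin advertises before the graph is connected.
    }
    DeleteMediaType(pmtCurrent);
}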

With no error messages, it's difficult to know where to start. I'm hoping there are some DirectShow experts here who will see the problem; I'd even be happy with a good ol' fashioned condescending "Well yeah, how do you expect the camera to change frame size once the buffer is allocated on the filter stack and the widget is initialized to the foobar! Pft... lol" ;)

Teach me, oh DirectShow/COM god!

[UPDATE #2] (FYI, it's odd that we can't just add a new message to this system and need to edit the original like this)

Per Roman's suggestion, I have used GraphStudio to look under the hood of my graph. I'll admit I still don't understand exactly what I'm looking at. I found a "text report" function and thought it would be helpful to post that report here in case it shows something valuable.

--------------------------------------------------
  Filters
--------------------------------------------------
  1. Smart Tee
  2. MJPEG Decompressor
  3. SampleGrabber
  4. NullRenderer
  5. WebCam

--------------------------------------------------
  Connections
--------------------------------------------------
  1. [Smart Tee]/(Capture) -> [MJPEG Decompressor]/(XForm In)
      Major:   MEDIATYPE_Video
      Subtype: MEDIASUBTYPE_MJPG
          bFixedSizeSamples:    TRUE
          bTemporalCompression: FALSE
          lSampleSize:          921600
          cbFormat:             88
      Format:  FORMAT_VideoInfo
      VIDEOINFOHEADER:
          rcSource:             (0,0,0,0)
          rcTarget:             (0,0,0,0)
          dwBitRate:            221184000
          dwBitErrorRate:       0
          AvgTimePerFrame:      333333
      BITMAPINFOHEADER:
          biSize:               40
       


1 Answer


You put it in the right place: after the filter is already in the graph via AddFilter, but before its output pin is connected. If you get a successful HRESULT, you can normally expect the format to change, but there are exceptions; for example, the media type may not be accepted by the downstream filter, and the pins then start negotiating from scratch.

You're not showing your ConfigureSampleGrabber here, so it might be the case that the media type you want is not accepted by the Sample Grabber, causing the filter graph to try alternate media types and/or intermediate filters (such as decoders).

There are a few things you can actually do.

  1. For troubleshooting you might want to:

    1. add the filter graph to the ROT (Running Object Table), or instead just install DirectShow Spy to have the same done for you automatically (see the AddToRot sketch after this list)
    2. add a MessageBox in your code right after your SetCaptureFormat call
    3. while the message box is still on the screen, use GraphEdit (or GraphStudio) to inspect your filter graph and see what media types it enumerates on its output pin; typically the first media type will be the one used for the real connection, so your successful HRESULT from SetFormat basically means the media type is now at the top of this list
  2. To force the media type, you might want to use IFilterGraph::ConnectDirect with the configured pin, its immediate downstream neighbor pin, and the media type of your interest (see the sketch below).
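
For step 1.1, here is a minimal sketch of the usual AddToRot helper (essentially the one from the DirectShow documentation). It registers the running graph in the Running Object Table so that GraphEdit/GraphStudio can attach to it via "Connect to Remote Graph"; it assumes <strsafe.h> is available for StringCchPrintfW.

#include <strsafe.h>

// Register the filter graph in the Running Object Table so that
// GraphEdit/GraphStudio can attach to it while the application runs.
HRESULT AddToRot(IUnknown *pUnkGraph, DWORD *pdwRegister)
{
    IMoniker *pMoniker = NULL;
    IRunningObjectTable *pROT = NULL;

    if (FAILED(GetRunningObjectTable(0, &pROT)))
        return E_FAIL;

    WCHAR wsz[256];
    StringCchPrintfW(wsz, 256, L"FilterGraph %08x pid %08x",
        (DWORD_PTR)pUnkGraph, GetCurrentProcessId());

    HRESULT hr = CreateItemMoniker(L"!", wsz, &pMoniker);
    if (SUCCEEDED(hr))
    {
        hr = pROT->Register(ROTFLAGS_REGISTRATIONKEEPSALIVE, pUnkGraph,
            pMoniker, pdwRegister);
        pMoniker->Release();
    }
    pROT->Release();
    return hr;
}

You would call AddToRot(g_pGraphBuilder, &dwRegister) once the graph is built, and revoke the registration with IRunningObjectTable::Revoke(dwRegister) on shutdown.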
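
For step 2, a rough sketch of forcing the connection with ConnectDirect. GetUnconnectedPin is a hypothetical helper (not in the question's code), and pmt is assumed to be the 1280x720 AM_MEDIA_TYPE selected via GetStreamCaps; this would replace the camera-to-SampleGrabber leg of the RenderStream call, and the grabber's output still has to be connected onwards afterwards.

// Hypothetical helper: return the first unconnected pin of the given
// direction (simplified; a real capture filter may expose several pins).
HRESULT GetUnconnectedPin(IBaseFilter *pFilter, PIN_DIRECTION dir, IPin **ppPin)
{
    *ppPin = NULL;
    IEnumPins *pEnum = NULL;
    HRESULT hr = pFilter->EnumPins(&pEnum);
    if (FAILED(hr))
        return hr;

    IPin *pPin = NULL;
    while (pEnum->Next(1, &pPin, NULL) == S_OK)
    {
        PIN_DIRECTION pinDir;
        pPin->QueryDirection(&pinDir);
        IPin *pTmp = NULL;
        if (pinDir == dir && pPin->ConnectedTo(&pTmp) == VFW_E_NOT_CONNECTED)
        {
            *ppPin = pPin;      // caller releases
            pEnum->Release();
            return S_OK;
        }
        if (pTmp != NULL)
            pTmp->Release();
        pPin->Release();
    }
    pEnum->Release();
    return E_FAIL;
}

// pmt: the AM_MEDIA_TYPE selected in SetCaptureFormat (1280x720)
IPin *pCamOut = NULL;
IPin *pGrabIn = NULL;
HRESULT hr = GetUnconnectedPin(g_pIBaseFilterCam, PINDIR_OUTPUT, &pCamOut);
if (SUCCEEDED(hr))
    hr = GetUnconnectedPin(g_pIBaseFilterSampleGrabber, PINDIR_INPUT, &pGrabIn);
if (SUCCEEDED(hr))
    hr = g_pGraphBuilder->ConnectDirect(pCamOut, pGrabIn, pmt);  // force this media type
if (pCamOut != NULL) pCamOut->Release();
if (pGrabIn != NULL) pGrabIn->Release();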

Hope this helps.

