mentation. Because it is nicely self-contained, the facility can be added to any
Windows application with little difficulty. We won't go into great detail, since
it takes us away from our main theme, but Listings 15.4 and 15.5 present a
C++ class and its methods to implement drag and drop. If you put this code
into a separate file, you can include it with all your programs and make use of
it by adding the code from Listing 15.6 in the appropriate places.
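To set the scene for those methods, recall that the class itself is declared in Listing 15.4. As a reminder of its overall shape only (this is a sketch, not the exact declaration; only the member m_cRef and the IDropTarget methods are taken from the code in Listing 15.5), a compatible drop-target class can be declared along these lines:

class CDropTarget : public IDropTarget {
public:
  CDropTarget() : m_cRef(1) {}
  // IUnknown
  STDMETHODIMP QueryInterface(REFIID riid, void **ppv);
  STDMETHODIMP_(ULONG) AddRef();
  STDMETHODIMP_(ULONG) Release();
  // IDropTarget
  STDMETHODIMP DragEnter(IDataObject *pDataObject, DWORD grfKeyState,
                         POINTL pt, DWORD *pdwEffect);
  STDMETHODIMP DragOver(DWORD grfKeyState, POINTL pt, DWORD *pdwEffect);
  STDMETHODIMP DragLeave(void);
  STDMETHODIMP Drop(IDataObject *pDataObject, DWORD grfKeyState,
                    POINTL pt, DWORD *pdwEffect);
private:
  LONG m_cRef; // COM reference count used by AddRef()/Release()
};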
STDMETHODIMP CDropTarget::QueryInterface(REFIID riid, void **ppv){
  if(ppv == NULL)return E_POINTER;
  if(riid == IID_IDropTarget || riid == IID_IUnknown) {
    AddRef();
    *ppv=this;
    return S_OK;
  }
  *ppv = NULL;
  return E_NOINTERFACE;
}

STDMETHODIMP_(ULONG) CDropTarget::AddRef() {
  return InterlockedIncrement(&m_cRef);
}

STDMETHODIMP_(ULONG) CDropTarget::Release(){
  LONG lRef = InterlockedDecrement(&m_cRef);
  if (lRef == 0)delete this;
  return lRef;
}

STDMETHODIMP CDropTarget::DragEnter(
  IDataObject *pDataObject, // Pointer to the interface of the source data object
  DWORD grfKeyState,        // Current state of keyboard modifier keys
  POINTL pt,                // Current cursor coordinates
  DWORD *pdwEffect          // Pointer to the effect of the drag-and-drop operation
){ return S_OK; }

STDMETHODIMP CDropTarget::DragLeave(void){ return S_OK; }

STDMETHODIMP CDropTarget::DragOver(
  DWORD grfKeyState,        // Current state of keyboard modifier keys
  POINTL pt,                // Current cursor coordinates
  DWORD *pdwEffect          // Pointer to the effect of the drag-and-drop operation
){ return S_OK; }
Listing 15.5. Methods of the drag-and-drop class.
STDMETHODIMP CDropTarget::Drop(
  IDataObject *pDataObject, // Pointer to the interface for the source data
  DWORD grfKeyState,        // Current state of keyboard modifier keys
  POINTL pt,                // Current cursor coordinates
  DWORD *pdwEffect          // Pointer to the effect of the drag-and-drop operation
){
  FORMATETC ff;
  ff.cfFormat=CF_HDROP;
  ff.ptd=NULL;
  ff.dwAspect=DVASPECT_CONTENT;
  ff.lindex= -1;
  ff.tymed=TYMED_HGLOBAL;
  STGMEDIUM pSM;
  pDataObject->GetData(&ff,&pSM);
  HDROP hDrop = (HDROP)(pSM.hGlobal);
  TCHAR name[512];
  int i=DragQueryFile(hDrop,0xFFFFFFFF,name,512);
  DragQueryFile(hDrop,0,name,512);
  // now that we've got the name as a text string use it ///////////
  OpenDraggedClip(name); // this is the application defined function
  //////////////////////////////////////////////////////////////////
  ReleaseStgMedium(&pSM);
  return S_OK;
}
Listing 15.5. (continued).
15.4 Video Sources
The other major use we will make of DirectShow concerns data acquisition.
Essentially, this means grabbing video or audio. In this section, we will set
up some example FilterGraphs to acquire data from live video sources. These
could be digital camcorders (DV) connected via the FireWire interface or
USB cameras ranging in complexity from a simple webcam using the USB 1
interface to a pair of miniature cameras on a head-mounted display device
multiplexed through a USB 2 hub. It could even be a digitized signal from an
analog video recorder or S-video output from a DVD player.
To DirectShow, all these devices appear basically the same. There are
subtle differences, but provided the camera/digitizer has a WDM (Windows
Driver Model) device driver, DirectShow has a source filter which any
application program can load into its FilterGraph and use to generate a
stream of video samples.
OleInitialize(NULL);    // Drag and drop requires OLE, not just COM, so
                        // replace CoInitialize(NULL) with this.
SetUpDragAndDrop(hWnd); // make the application D&D compatible
CloseDragAndDrop();     // destroy the D&D object
OleUninitialize();      // replaces CoUninitialize()
void OpenDraggedClip(TCHAR *filename){ // this is called when a file is dropped
  if (g_psCurrent != Init)CloseClip();
  OpenClip(T2A(filename)); // convert text string to basic "char" format
}
Listing 15.6. Modifications required in an application so that it can handle drag
and drop. When a file is dropped into the application window, the function
OpenDraggedClip() is called. Note also the change to OleInitialize() instead of
CoInitialize().
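The bodies of SetUpDragAndDrop() and CloseDragAndDrop() are not reproduced in the listings above; in essence they register and revoke the drop target with OLE. A minimal sketch, assuming the CDropTarget class of Listing 15.5 and a pair of file-scope variables (our own additions) to remember the object and the window handle, might be:

CDropTarget *g_pDropTarget = NULL;
HWND g_hDropWnd = NULL;

void SetUpDragAndDrop(HWND hWnd){
  g_pDropTarget = new CDropTarget;       // assumed to start its reference count at 1
  g_hDropWnd = hWnd;
  RegisterDragDrop(hWnd, g_pDropTarget); // the window now accepts drops
}

void CloseDragAndDrop(void){
  RevokeDragDrop(g_hDropWnd);            // undo the registration
  if(g_pDropTarget){ g_pDropTarget->Release(); g_pDropTarget = NULL; }
}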
We shall see in this section that using a live video source in a DirectShow
FilterGraph is as easy as using a file-based source. As a result, our
code listings will be relatively short and only complicated by the need to
provide code to enable the application programs to select the desired input
source when more than one possible device is connected to the host computer.
There are, of course, some differences between the types of video signal
sources that we must be aware of as we plan our application design. We note
three of significance:
1. Data rate. DV camcorders will typically provide 30 samples (or video
frames) per second (fps). A USB 1 webcam may only be able to deliver
15 fps.
2. Sample size or frame resolution. DV frame sizes in pixels are typically
720×576 for PAL regions and 720×480 in NTSC regions. A webcam
resolution might be 320×240.
3. Pixel data format. In most applications, we would like the source to de-
liver a 3-byte-per-pixel RGB24 data stream, but some video source de-
vices provide a YUV2 format. DirectShow's intelligent connect mech-
anism inserts appropriate conversion filters so that if we want to use
RGB24 (or RGB32 for that matter), a YUV2-to-RGB conversion filter
will be loaded and connected into the graph. (A sketch of how an
application can query the frame rates, sizes and formats a device offers
follows this list.)
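Here is such a sketch. The helper name ListCaptureFormats is an assumption, and g_pCapture and pSrcFilter are the objects used later in Listing 15.7; the IAMStreamConfig interface obtained from the device's capture pin reports every mode the device can deliver:

void ListCaptureFormats(ICaptureGraphBuilder2 *pCapture, IBaseFilter *pSrcFilter){
  IAMStreamConfig *pConfig = NULL;
  HRESULT hr = pCapture->FindInterface(&PIN_CATEGORY_CAPTURE, &MEDIATYPE_Video,
                                       pSrcFilter, IID_IAMStreamConfig,
                                       (void **)&pConfig);
  if(FAILED(hr))return;
  int nCaps = 0, nSize = 0;
  pConfig->GetNumberOfCapabilities(&nCaps, &nSize);
  for(int i = 0; i < nCaps; i++){
    AM_MEDIA_TYPE *pmt = NULL;
    VIDEO_STREAM_CONFIG_CAPS scc;
    if(pConfig->GetStreamCaps(i, &pmt, (BYTE *)&scc) == S_OK){
      if(pmt->formattype == FORMAT_VideoInfo){
        VIDEOINFOHEADER *vih = (VIDEOINFOHEADER *)pmt->pbFormat;
        // frame size: vih->bmiHeader.biWidth x vih->bmiHeader.biHeight
        // frame rate: 10000000 / vih->AvgTimePerFrame fps
        // pConfig->SetFormat(pmt) would select this mode
      }
      // free the media type (this is what the SDK helper DeleteMediaType does)
      if(pmt->pUnk)pmt->pUnk->Release();
      if(pmt->pbFormat)CoTaskMemFree(pmt->pbFormat);
      CoTaskMemFree(pmt);
    }
  }
  pConfig->Release();
}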
A special DirectShow filter, called the grabber filter, is often put
into the filter chain by an application program. This not only
lets an application program peek into the data stream and grab
samples, but it is also used to tell the upstream source filters that
they must deliver samples in a particular format.
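The grabber filter is configured through its ISampleGrabber interface (declared in the DirectX SDK header qedit.h). As a minimal sketch, with the helper name AddRGB24Grabber being our own invention, forcing the upstream filters to deliver RGB24 looks roughly like this:

HRESULT AddRGB24Grabber(IGraphBuilder *pGraph, IBaseFilter **ppGrabberF){
  IBaseFilter *pF = NULL;
  ISampleGrabber *pGrabber = NULL;
  HRESULT hr = CoCreateInstance(CLSID_SampleGrabber, NULL, CLSCTX_INPROC,
                                IID_IBaseFilter, (void **)&pF);
  if(FAILED(hr))return hr;
  hr = pGraph->AddFilter(pF, L"Sample Grabber"); // put it in the FilterGraph
  hr = pF->QueryInterface(IID_ISampleGrabber, (void **)&pGrabber);
  AM_MEDIA_TYPE mt;
  ZeroMemory(&mt, sizeof(mt));
  mt.majortype = MEDIATYPE_Video;    // we want video samples ...
  mt.subtype   = MEDIASUBTYPE_RGB24; // ... delivered as 24-bit RGB
  hr = pGrabber->SetMediaType(&mt);  // upstream filters must now supply RGB24
  pGrabber->Release();
  *ppGrabberF = pF;                  // caller connects and later releases this
  return hr;
}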
As we build the code for a couple of examples using live video source fil-
ters, you will see that the differences from the previous movie player program
are quite small. The longest pieces of code associated with video sources relate
to finding and selecting the source we want. In the context of FilterGraphs
using video sources, the
ICaptureGraphBuilder2 interface comes into its
own by helping to put in place any necessary intermediate filters using intel-
ligent connect. Without the help of the
ICaptureGraphBuilder2 interface,
our program would have to search the source and sink filters for suitable in-
put and output pins and link them. Some of the project codes on the CD
and many examples from the DXSDK use this strategy, and this is one of the
main reasons why two DirectShow programs that do the same thing can look
so different.
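To give a flavour of that manual search, here is a sketch (the helper name GetUnconnectedOutPin is an assumption) of locating a filter's first free output pin; with a matching input pin found the same way, the two can be joined by IGraphBuilder::Connect():

HRESULT GetUnconnectedOutPin(IBaseFilter *pFilter, IPin **ppPin){
  IEnumPins *pEnum = NULL;
  IPin *pPin = NULL;
  *ppPin = NULL;
  HRESULT hr = pFilter->EnumPins(&pEnum);
  if(FAILED(hr))return hr;
  while(pEnum->Next(1, &pPin, NULL) == S_OK){
    PIN_DIRECTION dir;
    pPin->QueryDirection(&dir);
    IPin *pTmp = NULL;
    if(dir == PINDIR_OUTPUT && pPin->ConnectedTo(&pTmp) == VFW_E_NOT_CONNECTED){
      *ppPin = pPin;   // caller releases the pin when finished
      pEnum->Release();
      return S_OK;
    }
    if(pTmp)pTmp->Release();
    pPin->Release();
  }
  pEnum->Release();
  return E_FAIL;       // no free output pin found
}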
15.4.1 Viewing Video from a DV or USB Camera
For the most part (the WinMain entry, message handler function etc.), the
code for this short program is very similar to the movie player code, so we
will not list it here. Instead, we go straight to look at the differences in the
structure of the FilterGraph given in Listing 15.7. A couple of functions merit
a comment:
1. g_pCapture->RenderStream(). This ICaptureGraphBuilder2
method carries out a multitude of duties. Using its first argument,
we select either a preview output or a rendered output. This gives an
application program the possibility of previewing the video while cap-
turing to a file at the same time. The second argument tells the method
that we need video samples. The last three arguments (pSrcFilter,
NULL, NULL) put the graph together. They specify:
(a) The source of the video data stream.
(b) An optional intermediate filter between source and destination; a
compression filter for example.
(c) An output filter. If this is NULL, the built-in renderer will be used.
If we are recording the video data into a file using a file writing filter (say
called pDestFilter), the last three arguments would be (pSrcFilter,
NULL, pDestFilter).
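A minimal sketch of that recording variant, using ICaptureGraphBuilder2::SetOutputFileName() to create the AVI multiplexer and file writer, is shown below. It assumes the g_pCapture, pSrcFilter and hr variables of Listing 15.7 (which follows), and the file name capture.avi is only an example:

IBaseFilter *pDestFilter = NULL;  // will point at the AVI multiplexer
IFileSinkFilter *pSink = NULL;
hr = g_pCapture->SetOutputFileName(&MEDIASUBTYPE_Avi, L"capture.avi",
                                   &pDestFilter, &pSink);
// capture pin -> (no compressor) -> AVI mux / file writer
hr = g_pCapture->RenderStream(&PIN_CATEGORY_CAPTURE, &MEDIATYPE_Video,
                              pSrcFilter, NULL, pDestFilter);
// the preview pin can still be rendered exactly as before
hr = g_pCapture->RenderStream(&PIN_CATEGORY_PREVIEW, &MEDIATYPE_Video,
                              pSrcFilter, NULL, NULL);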
ICaptureGraphBuilder2 *g_pCapture=NULL; // declare global pointer

// this is the main function to build/run the capture graph
HRESULT CaptureVideo(){
  HRESULT hr;
  // this will point to the video source filter
  IBaseFilter *pSrcFilter=NULL;
  hr = GetInterfaces(); // Get DirectShow interfaces
  // Tell the capture graph builder to do its work on the graph.
  hr = g_pCapture->SetFiltergraph(g_pGraph);
  // Use the system device enumerator and class enumerator to find
  // a video capture/preview device, such as a desktop USB video camera.
  hr = FindCaptureDevice(&pSrcFilter); // see next listing
  // Add the capture filter to our graph.
  hr = g_pGraph->AddFilter(pSrcFilter, L"Video Capture");
  // Render the preview pin on the video capture filter.
  // Use this instead of g_pGraph->RenderFile.
  hr = g_pCapture->RenderStream(&PIN_CATEGORY_PREVIEW,&MEDIATYPE_Video,
                                pSrcFilter, NULL, NULL);
  // Now that the filter has been added to the graph and we have
  // rendered its stream, we can release this reference to the filter.
  pSrcFilter->Release();
  hr = SetupVideoWindow(); // Set up the video window
  hr = g_pMC->Run();       // Start previewing video data
  return S_OK;
}

// very similar to the movie player - (Do check for errors in hr!!!)
HRESULT GetInterfaces(void){
  HRESULT hr;
  // Create the filter graph
  hr = CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC,
                        IID_IGraphBuilder, (void **) &g_pGraph);
  // Create the capture graph builder
  hr = CoCreateInstance(CLSID_CaptureGraphBuilder2, NULL, CLSCTX_INPROC,
                        IID_ICaptureGraphBuilder2, (void **) &g_pCapture);
  // Obtain interfaces for media control and the video window
  hr = g_pGraph->QueryInterface(IID_IMediaControl,(LPVOID *) &g_pMC);
  hr = g_pGraph->QueryInterface(IID_IVideoWindow, (LPVOID *) &g_pVW);
  hr = g_pGraph->QueryInterface(IID_IMediaEvent,  (LPVOID *) &g_pME);
  // Set the window handle used to process graph events
  hr = g_pME->SetNotifyWindow((OAHWND)ghApp, WM_GRAPHNOTIFY, 0);
  return hr;
}
Listing 15.7. Building a FilterGraph that previews live video from a DV/USB camera.
Note the call to FindCaptureDevice(), which finds the video source filter. Its code is
given in Listing 15.8.
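As a rough indication of what FindCaptureDevice() involves (Listing 15.8 gives the full version; this sketch simply takes the first video input device found rather than letting the user choose), the enumeration relies on the system device enumerator:

HRESULT FindCaptureDevice(IBaseFilter **ppSrcFilter){
  ICreateDevEnum *pDevEnum = NULL;
  IEnumMoniker *pEnum = NULL;
  IMoniker *pMoniker = NULL;
  HRESULT hr = CoCreateInstance(CLSID_SystemDeviceEnum, NULL, CLSCTX_INPROC,
                                IID_ICreateDevEnum, (void **)&pDevEnum);
  if(FAILED(hr))return hr;
  // Enumerate the video input (capture) device category.
  hr = pDevEnum->CreateClassEnumerator(CLSID_VideoInputDeviceCategory, &pEnum, 0);
  pDevEnum->Release();
  if(hr != S_OK)return E_FAIL; // S_FALSE means no capture devices are installed
  // Bind the first device's moniker to obtain its capture (source) filter.
  if(pEnum->Next(1, &pMoniker, NULL) == S_OK){
    hr = pMoniker->BindToObject(NULL, NULL, IID_IBaseFilter, (void **)ppSrcFilter);
    pMoniker->Release();
  }
  else hr = E_FAIL;
  pEnum->Release();
  return hr;
}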