12 Tools, Libraries and Templates for VR
This part of the book provides practical solutions and small projects for a
wide variety of VR requirements such as image and video processing, play-
ing movies, interacting with external devices, stereoscopy and real-time 3D
graphics.
The ideas, algorithms and theoretical concepts that we have been looking
at in Part I are interesting, but to put them into practice, one must turn them
into computer programs. It is only in the memory of the computer that the
virtual world exists. It is through VR programs and their control over the
computer's hardware that the virtual (internal) world is built, destroyed and
made visible/touchable to us in the real world. This part of the book will
attempt to give you useful and adaptable code covering most of the essential
interfaces that allow us to experience the virtual world, which only exists as
numbers in computer files and memory buffers.
To prepare the computer programs, it is necessary to use a wide variety of
software tools. No one would think of trying to use a computer these days if
it didn't have an operating system running on it to provide basic functions.
In this chapter, we will start by looking at the development tools needed to
write the VR application programs. We will also discuss the strategy used in
the examples: its goal is to make the code as simple and adaptable as possible.
And we will look at the special problems and constraints posed by VR ap-
plications, especially how hardware and software must collaborate to deliver
the interfaces and speed of execution we require. VR applications, more of-
ten than not, are required to operate in a real-time environment. There is a
definite symbiotic relationship between hardware platforms and application
software. The ultimate expression of this relationship is in the custom inte-
grated circuits of the real-time image and video processors or the embedded
software applications running in customizable integrated circuits (ICs).
Today, the hardware/software dichotomy has become even more blurred
with the introduction of the programmable graphics processing unit (the
GPU), so that almost every PC sold today has the potential to deliver per-
formance capable of driving a magnificently realistic VR environment. From
the graphics point of view, there is little that can't be done on a domestic PC.
It is only the physical interactions, such as the sense of touch and possibly the
sense of depth perception, that are still too expensive to simulate exactly in
consumer-level hardware.
In the remainder of this chapter, we will look at the application frame-
works within which our sample VR programs will execute, but first we begin
by discussing the hardware requirements for getting the best out of the exam-
ple programs.
12.1 The Hardware Environment
In order to build and execute all the sample applications described in the next
few chapters, you will need to have a basic processor. Because Windows is
the dominant PC desktop environment, all our applications will be targeted
for that platform. A few of the examples can be compiled on other platforms;
when this is possible, we will indicate how. The most important hardware
component for creating a VR system is the graphics adapter. These are dis-
cussed in Section 12.1.1.
For the video applications, you will also need a couple of video sources. These
can be as low-tech as hobbyists' USB (universal serial bus) webcams, a digi-
tal camcorder with a FireWire (or IEEE 1394) interface or a TV tuner card.
Unfortunately, if you try to use two digital camcorders connected via the
IEEE 1394 bus then your computer is unlikely to recognize them both. Ap-
plications making use of DirectShow
1
(aswewillbedoinginChapter15)
will only pick up one device. This is due to problems with the connection
1
DirectShow, DirectInput, Direct3D and DirectDraw are all part of Microsoft’s compre-
hensive API library for developing real-time interactive graphical and video applications called
Dire ctX. Details of these components will be introduced and discussed in the next few chap-
ters, but if you are unfamiliar with them, an overview of their scope can be found at [5].
i
i
i
i
i
i
i
i
12.1. The Hardware Environment 291
management procedure (CMP) and bandwidth issues. A full discussion of
this and other problems is given in [8]. The problem is not an easy one to
resolve, and for the examples we will create that need two video sources, we
will assume two USB-type webcams are available.
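Before moving on, it is worth knowing how to check which capture devices DirectShow can actually see. The sketch below, which anticipates the DirectShow material of Chapter 15, simply prints the friendly name of every device registered in the video input category. It uses only standard DirectShow and COM calls (ICreateDevEnum, CLSID_VideoInputDeviceCategory, IPropertyBag); it is a minimal illustration with error handling pared down, not production code.

#include <dshow.h>
#include <cstdio>
// Link with strmiids.lib, ole32.lib, oleaut32.lib and uuid.lib.

// Print the friendly name of every video capture device DirectShow exposes.
void ListVideoCaptureDevices(void)
{
    CoInitialize(NULL);
    ICreateDevEnum *pDevEnum = NULL;
    if (SUCCEEDED(CoCreateInstance(CLSID_SystemDeviceEnum, NULL,
                                   CLSCTX_INPROC_SERVER, IID_ICreateDevEnum,
                                   (void **)&pDevEnum)))
    {
        IEnumMoniker *pEnum = NULL;
        // CreateClassEnumerator returns S_FALSE (not a failure) when the
        // category is empty, so test for S_OK explicitly.
        if (pDevEnum->CreateClassEnumerator(CLSID_VideoInputDeviceCategory,
                                            &pEnum, 0) == S_OK)
        {
            IMoniker *pMoniker = NULL;
            while (pEnum->Next(1, &pMoniker, NULL) == S_OK)
            {
                IPropertyBag *pBag = NULL;
                if (SUCCEEDED(pMoniker->BindToStorage(NULL, NULL,
                                  IID_IPropertyBag, (void **)&pBag)))
                {
                    VARIANT name;
                    VariantInit(&name);
                    if (SUCCEEDED(pBag->Read(L"FriendlyName", &name, NULL)))
                        wprintf(L"Capture device: %s\n", name.bstrVal);
                    VariantClear(&name);
                    pBag->Release();
                }
                pMoniker->Release();
            }
            pEnum->Release();
        }
        pDevEnum->Release();
    }
    CoUninitialize();
}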
Where the applications have stereoscopic output, not only is the choice
of graphics cards important, but the choice of whether to use active or pas-
sive glasses can affect the way your application has to be written. However,
the good news is that if you program your 3D stereoscopic effects using the
OpenGL² graphics API (application programmer interface), the internal op-
eration of OpenGL offers a device-independent programming interface.

²OpenGL is a standard API for real-time 3D rendering. It provides a consistent interface to programs across all operating systems and takes advantage of any hardware acceleration available.
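To give a flavour of that device independence, the fragment below sketches how a frame might be rendered for active stereo on a quad-buffered OpenGL context: the scene is simply drawn twice, once into GL_BACK_LEFT and once into GL_BACK_RIGHT, with the camera shifted horizontally between the two passes. It assumes a stereo-capable pixel format has already been set up, and DrawScene() stands in for whatever the application uses to draw its virtual world; it is a simplified sketch, not the full treatment given later in the book.

#include <windows.h>
#include <GL/gl.h>

void DrawScene(void);   // hypothetical application function that draws the world

// Render one stereo frame into the left and right back buffers of a
// quad-buffered context (e.g. a pixel format created with PFD_STEREO).
void RenderStereoFrame(double eyeSeparation)
{
    for (int eye = 0; eye < 2; ++eye)
    {
        // Select the left or right back buffer.
        glDrawBuffer(eye == 0 ? GL_BACK_LEFT : GL_BACK_RIGHT);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        // Place the camera half the eye separation to the left or right.
        // Translating the world by -x is equivalent to moving the camera by +x.
        double eyeOffset = (eye == 0 ? -0.5 : +0.5) * eyeSeparation;
        glTranslated(-eyeOffset, 0.0, 0.0);

        DrawScene();
    }
    // SwapBuffers() (or the platform equivalent) then presents both eyes together.
}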
12.1.1 Graphics Adapters
Different display adapter vendors often provide functions that are highly op-
timized for some specific tasks, for example hardware occlusion and rendering
with a high dynamic range. While this has advantages, it does not make appli-
cations very portable. Nevertheless, where a graphics card offers acceleration,
say to assist with Phong shading calculations, it should be used. Another in-
teresting feature of graphics cards that could be of great utility in a practical
VR environment is support for multiple desktops or desktops that can span
two or more monitors. This has already been discussed in Chapter 4, but it
may have a bearing on the way in which the display software is written, since
two outputs might be configured to represent one viewport or two indepen-
dently addressable viewports. One thing the graphics card really should be
able to support is programmability. This is discussed next.
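As a small illustration of how a program can discover the desktop layout before deciding whether two outputs should form one wide viewport or two independent ones, the sketch below uses the standard Win32 calls EnumDisplayMonitors() and GetMonitorInfo() to list the rectangle occupied by each attached monitor. It is a minimal example that assumes nothing beyond the ordinary Windows API.

#include <windows.h>
#include <cstdio>
#include <vector>

// Collect the desktop rectangle of every attached monitor so that a VR
// display window can be placed on one output or stretched across two.
static BOOL CALLBACK MonitorCB(HMONITOR hMon, HDC, LPRECT, LPARAM lParam)
{
    MONITORINFO mi = { sizeof(MONITORINFO) };
    if (GetMonitorInfo(hMon, &mi))
        ((std::vector<RECT> *)lParam)->push_back(mi.rcMonitor);
    return TRUE;                      // keep enumerating
}

int main(void)
{
    std::vector<RECT> monitors;
    EnumDisplayMonitors(NULL, NULL, MonitorCB, (LPARAM)&monitors);
    for (size_t i = 0; i < monitors.size(); ++i)
        printf("Monitor %u: (%ld,%ld) to (%ld,%ld)\n", (unsigned)i,
               monitors[i].left, monitors[i].top,
               monitors[i].right, monitors[i].bottom);
    return 0;
}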
12.1.2 GPUs
The graphics display adapter is no longer a passive interface between the pro-
cessor and the viewer—it is a processor in its own right! Being a processor,
it can be programmed, and this opens up a fantastic world of opportuni-
ties, allowing us to make the virtual world more realistic, more detailed and
much more responsive. We will discuss this in Chapter 14. It’s difficult to
quantify the outstanding performance the GPU gives you, but, for example,
NVIDIA's GeForce 6800 Ultra chipset can deliver a sustained 150 Gflops,
which is about 2,000 times more than the fastest Pentium processor. It is no
wonder that there is much interest in using these processors for non-graphical
applications, particularly in the field of mathematics and linear algebra. There
are some interesting chapters on this subject in the GPU Gems book series,
volumes 1 and 2 [3, 9].
The architectural design of all GPUs conforms to a model in which there
are effectively two types of subprocessor, the vertex and fragment proces-
sors. The phenomenal processing speed is usually accomplished by having
multiple copies of each of these types of subprocessor: 8 or 16 are possi-
ble. Programming the vertex processor and fragment processor is discussed in
Chapter 14.
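As a small foretaste of that material, the sketch below compiles and installs a trivial GLSL fragment shader that colours every fragment orange, replacing the fixed-function pipeline for anything drawn afterwards. It assumes an OpenGL 2.0-capable context whose shader entry points have already been loaded (here through the GLEW library); it is only an illustrative sketch, not the framework developed in Chapter 14.

#include <GL/glew.h>   // loads the OpenGL 2.0 shader entry points
#include <cstdio>

// Trivial fragment program: paint every fragment a constant colour.
static const char *fragSrc =
    "void main(void)                              \n"
    "{                                            \n"
    "    gl_FragColor = vec4(1.0, 0.5, 0.0, 1.0); \n"
    "}                                            \n";

// Compile the fragment shader, link it into a program object and activate it.
GLuint InstallFragmentShader(void)
{
    GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(shader, 1, &fragSrc, NULL);
    glCompileShader(shader);

    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (!ok) { fprintf(stderr, "fragment shader failed to compile\n"); return 0; }

    GLuint program = glCreateProgram();
    glAttachShader(program, shader);
    glLinkProgram(program);
    glUseProgram(program);   // from now on this replaces the fixed pipeline
    return program;
}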
12.1.3 Human Interface Devices (HIDs)
For VR application programs, interaction with the user is vital. In addition to
the display hardware, we need devices to acquire input from the user. Devices
that make this possible come under the broad heading of human interface de-
vices (HIDs). For most normal computer applications, the keyboard and/or
mouse is usually sufficient. Some drawing applications can benefit from a sty-
lus tablet, and of course computer games link up with a variety of interesting
devices: joysticks, steering consoles etc. For gaming, many joysticks are not
simply passive devices; they can kick back. In other words, they provide force
feedback or haptic responses. In VR, these devices may be significant, and in
application programs we should be able to use them. From a programmer's
perspective, these devices can be tricky to access, especially if they have very
different forms of electrical connection to the computer. Fortunately, most
commercially available joysticks and game consoles have an accompanying
operating system driver that makes the device behave like a standard compo-
nent and which acts as an interface between application program and device.
With a system driver installed as part of Windows, a program can access a
broad range of devices in the same way and without the need to implement
system driver code itself. This is easily done using the DirectInput API and is
discussed in detail in Chapter 17.
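To give a flavour of what is involved, the sketch below uses DirectInput to open the first attached game controller and poll its state once per frame. The DirectInput calls (DirectInput8Create(), EnumDevices(), SetDataFormat(), GetDeviceState()) are the standard ones; the helper names InitJoystick() and PollJoystick() are our own, and error handling is pared down to the essentials.

#define DIRECTINPUT_VERSION 0x0800
#include <windows.h>
#include <dinput.h>
// Link with dinput8.lib and dxguid.lib.

static LPDIRECTINPUT8       g_pDI  = NULL;
static LPDIRECTINPUTDEVICE8 g_pJoy = NULL;

// Take the first game controller that DirectInput enumerates.
static BOOL CALLBACK EnumJoystickCB(const DIDEVICEINSTANCE *pInst, VOID *)
{
    if (SUCCEEDED(g_pDI->CreateDevice(pInst->guidInstance, &g_pJoy, NULL)))
        return DIENUM_STOP;           // got one, stop enumerating
    return DIENUM_CONTINUE;
}

bool InitJoystick(HINSTANCE hInst, HWND hWnd)
{
    if (FAILED(DirectInput8Create(hInst, DIRECTINPUT_VERSION,
                                  IID_IDirectInput8, (void **)&g_pDI, NULL)))
        return false;
    g_pDI->EnumDevices(DI8DEVCLASS_GAMECTRL, EnumJoystickCB,
                       NULL, DIEDFL_ATTACHEDONLY);
    if (!g_pJoy)
        return false;
    // Ask for the standard joystick data layout and non-exclusive access.
    g_pJoy->SetDataFormat(&c_dfDIJoystick2);
    g_pJoy->SetCooperativeLevel(hWnd, DISCL_NONEXCLUSIVE | DISCL_BACKGROUND);
    g_pJoy->Acquire();
    return true;
}

// Call once per frame; js.lX and js.lY then hold the stick position.
bool PollJoystick(DIJOYSTATE2 &js)
{
    if (!g_pJoy)
        return false;
    if (FAILED(g_pJoy->Poll()))
        g_pJoy->Acquire();            // reacquire the device if it was lost
    return SUCCEEDED(g_pJoy->GetDeviceState(sizeof(js), &js));
}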
With even a rudimentary knowledge of electronic circuits, it is not too
difficult to design and build a device interface to either the PC’s serial port
or the new standard for serial computer communication: USB. Section 17.5
discusses how one can use a PIC (peripheral interface controller) microcon-
troller to interface almost any electrical signal to a PC via the USB.
12.2 Software Tools
At the very least, we need a high-level language compiler. Many now come
with a friendly front end that makes it easier to manage large projects with
the minimum of fuss. (The days of the traditional makefile are receding fast.)
Unfortunately, this is only half the story. For most applications in VR, the
majority of the code will involve using one of the system API libraries. In
the case of PCs running Windows, this is the platform software development
kit (SDK). For real-time 3D graphics work, a specialist library is required.
Graphics APIs are introduced and discussed in Chapter 13, where we will
see that one of the most useful libraries, OpenGL, is wonderfully platform-
independent, and it can often be quite a trivial job to port an application be-
tween Windows, Linux, Mac and SGI boxes. We shall also see in Chapter 17
that libraries of what is termed middleware can be invaluable in building VR
applications.
12.2.1 Why C and C++
Almost without exception, VR and 3D graphics application programs are de-
veloped in C and/or C++. There are many reasons for this: speed of program
execution, a vast library of legacy code, the quality of developer tools and the
fact that all of the operating systems on the target platforms were created
using C and C++. So we decided to follow the majority and write our code
in C and C++. We do this primarily because all the appropriate real-time 3D
graphics and I/O libraries are used most comfortably from C or C++. Since
OpenGL has the longest history as a 3D graphics library, it is based on the
functional approach to programming, and therefore it meshes very easily with
applications written in C. It does, of course, work just as well with application
programs written in C++. The more recently specified API for Windows ap-
plication programs requiring easy and fast access to graphics cards, multime-
dia and other I/O devices, known as DirectX, uses a programming interface
that conforms to the component object model (COM). COM is intended to
be usable from any language, but in reality most of the examples and applica-
tion programs in the VR arena use C++. Sadly, programming with COM can
be a nightmare to do correctly, and so Microsoft’s Visual Studio development
tools include a library of advanced C++ template classes to help. Called the
ATL (Active Template Library), it is an enormous help to application devel-
opers, who sometimes get interface reference counting, object creation and
destruction wrong. We talk
more about this topic in Section 12.4.2.
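A small sketch shows the kind of help the ATL gives. Its CComPtr smart pointer releases a COM interface automatically when it goes out of scope, so the reference counting that is so easy to get wrong by hand is taken care of. The function name CreateDirectInput() is ours for illustration; the ATL and DirectInput calls are standard.

#define DIRECTINPUT_VERSION 0x0800
#include <atlbase.h>    // CComPtr
#include <dinput.h>
// Link with dinput8.lib and dxguid.lib.

// Without ATL, every interface obtained from DirectX must be Release()d
// exactly once on every code path. With CComPtr the interface is released
// automatically when the smart pointer goes out of scope, even on an
// early return.
HRESULT CreateDirectInput(HINSTANCE hInst)
{
    CComPtr<IDirectInput8> pDI;
    HRESULT hr = DirectInput8Create(hInst, DIRECTINPUT_VERSION,
                                    IID_IDirectInput8,
                                    (void **)&pDI, NULL);
    if (FAILED(hr))
        return hr;       // nothing to clean up: pDI is still empty

    // ... use pDI here, e.g. to enumerate and create input devices ...

    return S_OK;         // pDI.Release() happens automatically here
}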