A Glossary of AV/IT Terms

2K/4K (digital cinema)—Shorthand for 2K video (most often resolutions of 2,048 × 1,536 or 2,048 × 1,556 pixels) and 4K video (most often resolutions of 4,096 × 2,160 or 4,096 × 1,714) usually at frame rates of 24p or multiples.

24p—Video format at 24 progressive (not interlaced) frames per second. It is based on the traditional film rate of 24 FPS.

3:2 pull down—The process of converting 24-frame-per-second film to 60 (actually 59.94) fields-per-second video by repeating one film frame as three video fields and then the next film frame as two fields. The actual order is 2, 3, so 2:3 is more accurate, but the principle is the same.
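
In code, the cadence is a simple alternation; a minimal sketch (the function name is illustrative, not from any standard API):

```python
def pulldown_2_3(film_frames):
    """Map 24 FPS film frames onto 60-fields-per-second video.

    Frames alternately contribute 2 fields, then 3, so every 4 film
    frames yield 10 video fields (in practice the film runs at 23.976
    FPS so the result lands on 59.94 fields per second).
    """
    fields = []
    for i, frame in enumerate(film_frames):
        repeat = 2 if i % 2 == 0 else 3   # the 2:3 cadence
        fields.extend([frame] * repeat)
    return fields

# Four film frames A, B, C, D become ten fields:
print(pulldown_2_3(["A", "B", "C", "D"]))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
```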

4:2:2 (4:4:4, 4:1:1, 4:2:0)—Common designations for pixel sampling relationships in a digital component video format. The first term relates to the luma (Y’) sampling rate, and the other two relate to the chroma (Cr and Cb) sampling rates. See Chapter 11 to decipher the hidden codes.
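
Under the common J:a:b reading of these labels, the chroma density falls out directly; a short sketch (assuming the usual 4-wide, 2-row luma reference block):

```python
# Chroma samples per J:a:b scheme: 'a' = chroma samples in the first
# row of a 4-wide luma block; 'b' = chroma samples in the second row.
SCHEMES = {"4:4:4": (4, 4), "4:2:2": (2, 2), "4:1:1": (1, 1), "4:2:0": (2, 0)}

def chroma_fraction(scheme):
    """Fraction of chroma samples relative to luma (per Cr or Cb plane)."""
    a, b = SCHEMES[scheme]
    return (a + b) / (2 * 4)  # two rows of a 4-wide luma block

for s in SCHEMES:
    print(s, chroma_fraction(s))
# 4:4:4 -> 1.0, 4:2:2 -> 0.5, 4:1:1 -> 0.25, 4:2:0 -> 0.25
```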

480i—Shorthand for a 480 active line, SD-interlaced scanning standard usually with 720 horizontal pixels at various frame rates. The progressive version is 480p. Total lines are 525. NTSC is based on this scanning structure.

50i/60i—Shorthand for 50 or 60 interlaced fields per second video. Closely related is the shorthand 50p/60p, referring to the progressive frames per second version.

576i—Shorthand for a 576 active line, SD-interlaced scanning standard usually with 720 horizontal pixels at various frame rates. Total lines are 625. PAL is based on this scanning structure.

601—Formally CCIR 601. This is the original name of a standard published by the CCIR (now ITU-R) for converting analog video signals to digital form. The new name of the standard is ITU-R BT.601. It includes methods for converting 525/625 line analog video to digital.

720p—Shorthand for a 720 active line, HD-progressive scanning standard with 1,280 horizontal pixels at various frame rates. Total lines are 750.

1080i—Shorthand for a 1,080 active line, HD-interlaced scanning standard with 1,920 horizontal pixels at various frame rates. The progressive version is 1,080p. Total lines are 1,125.

AAF—Advanced Authoring Format. A format for annotating the composition of an A/V edit project. The AAF Association (rebranded as AMWA) is responsible for its development.

Active picture area—The portion of the raster scan that carries picture information. For 525/NTSC systems there are 480 active lines; for 625/PAL systems, 576. The safe picture area is a slightly reduced region that is likely viewable on most TVs.

AES—Audio Engineering Society. This group sets standards and recommendations for audio technology.

AES/EBU audio—The joint-effort standard for packaging digital audio up to 24 bits/sample onto a serial link. There is also a mapping for compressed audio.

Aliasing (Image)—Visual distortions resulting from digitally sampling an image at a rate too low to capture all significant spatial or temporal detail. A familiar example of temporal aliasing is a car’s wheels appearing to spin backward even though the car is moving forward.
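
Temporal aliasing folds a fast motion back into the range the sample rate can represent; a sketch of the wagon-wheel arithmetic (assuming an idealized, instantaneous shutter):

```python
def apparent_rate(true_hz, sample_hz):
    """Apparent rotation rate after sampling (the aliased frequency).

    Folds the true rate into (-sample_hz/2, +sample_hz/2]. A negative
    result means the wheel appears to spin backward.
    """
    return (true_hz + sample_hz / 2) % sample_hz - sample_hz / 2

# A wheel turning 23 times per second, filmed at 24 FPS, appears
# to rotate backward at 1 turn per second:
print(apparent_rate(23.0, 24.0))   # -1.0
```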

Alpha—The measure of a pixel’s opacity. A pixel with the maximum alpha value is solid, one with a value of zero is transparent, and one with an intermediate value is translucent.
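
A per-pixel compositing sketch makes this concrete (assuming a normalized alpha and non-premultiplied values):

```python
def blend(fg, bg, alpha):
    """Composite one foreground pixel over a background pixel.

    alpha is normalized to 0.0..1.0: 1.0 = solid, 0.0 = transparent.
    """
    return alpha * fg + (1.0 - alpha) * bg

print(blend(fg=200, bg=50, alpha=1.0))   # 200.0 -- solid foreground
print(blend(fg=200, bg=50, alpha=0.0))   # 50.0  -- fully transparent
print(blend(fg=200, bg=50, alpha=0.5))   # 125.0 -- translucent mix
```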

AMWA—Advanced Media Workflow Association. An industry organization dedicated to creating technology for open specifications relating to file-based workflows (including AAF, MXF, and other formats), Service Oriented Architectures, and application specifications.

Anamorphic—An image system that optically compresses the picture horizontally (usually) during capture and then restores it to normal proportions for viewing. A typical application is to capture a 16 × 9 image but compress it to 4 × 3 for storage and transport.

ASI—Asynchronous Serial Interface. This is a 270 Mbps serial link used most often to carry MPEG Transport Streams. MPEG data can be a single program of A/V or many multiplexed programs. Most MPEG streams have data rates less than 100 Mbps. DVB defines this spec.

Aspect ratio (AR)—The ratio of a display’s horizontal versus vertical size expressed as H:V or H × V; 4 × 3 and 16 × 9 are common. ARC is shorthand for Aspect Ratio Converter, a device that converts from one AR (for example, 16 × 9) to another (for example, 4 × 3).
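
The arithmetic an ARC performs for a letterbox down-conversion can be sketched as follows (square pixels assumed for simplicity; real SD rasters use non-square pixels):

```python
from fractions import Fraction

def letterbox(src_ar, dst_w, dst_h):
    """Fit a wider source into a narrower raster at full width.

    Returns (active_height, bar_height), where bar_height is the size
    of each black bar an ARC adds above and below the picture.
    """
    active = round(dst_w / src_ar)
    return active, (dst_h - active) // 2

# 16:9 content letterboxed onto a 640 x 480 (4:3) raster:
print(letterbox(Fraction(16, 9), 640, 480))   # (360, 60)
```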

ATA—Advanced Technology Attachment. A parallel link for connecting disc drives and other devices inside a PC or other product. Initially developed for low-end device attachments. See Chapter 3B.

ATM—Asynchronous Transfer Mode. A WAN service based on layer 2 switching. ATM cells carry upper layer payloads such as TCP/IP.

ATSC—Advanced Television Systems Committee. This group defined the digital terrestrial broadcast standard in use for North America and elsewhere. It is based on MPEG encoding and supports SD and HD resolutions.

Automation—The process of controlling A/V system operations with a hands-off scheduler. Facility routers, servers, VTRs, compositors, mixers, codecs, processors, and more are controlled by automation logic, often driven by a time-based schedule.

A/V (or AV)—Audio/visual or audio/video. A generic term for describing audio, video, graphics, animations, and associated technology.

AVC—Advanced video codec. This term describes the compression format also referred to as MPEG4 part 10 and separately as H.264. AVC-Intra is a constrained form used as a portable camera capture format.

AVI—Audio video interleaved. An A/V wrapper or container format for multiplexing A/V essence. It is not a compression format.

AVIT or AV/IT—Shorthand for an A/V + IT hybrid system and related concepts.

AXF—Archive eXchange Format. A SMPTE-specified, data-block tape layout to enable cartridge exchange between archive systems.

BWAV—The EBU’s broadcast audio WAV format.

BXF—Broadcast eXchange Format. BXF (SMPTE 2021) was developed to standardize methodologies for messaging and exchanging metadata between traffic, automation, content management, and workflow software systems.

CCIE—Cisco Certified Internetwork Expert.

CFS—Clustered file system. A file system shared in common by more than one client or server. A CFS provides users with a shared or common view of all stored files, usually from one large pool. A CFS is a networkable service either configured as standalone or distributed among the nodes. In the latter case, a CFS is sometimes called a distributed file system (DFS).

Chroma—A value that conveys a color signal independent from the luma component.

CIF—Common intermediate format. This is a spatial resolution image format of 352 (H) × 288 (V) pixels, 4:2:0. See also QCIF. CIF also has a secondary meaning: common image format. This second use is defined by MPEG as either 720 × 480 or 720 × 576. Beware of this acronym. See also SIF.

CIFS—Common Internet File System. A Microsoft-developed remote file-sharing protocol. NAS servers often support this. See also NFS.

CIM—Common Information Model. This is a model for describing managed information.

CIMOM—CIM Object Manager. This is a software component for accessing data elements in the CIM schema.

Closed captioning (CC)—Textual captioning on a TV screen for the hearing impaired. In NTSC systems, CC data are carried on the unseen line 21. Standard EIA-608 defines CC data structures for analog transmission, and EIA-708 defines CC for digital transmission systems. Teletext subtitling is a similar system used in PAL countries.

Cloudware—Software that runs from the Web rather than a local desktop or campus server. Google, Microsoft, and others offer networked applications that replace common desktop versions.

Color burst—A burst of 8–10 cycles of subcarrier inserted into a composite video signal after the H_Sync pulse. It is used to synchronize the receiver’s video color decoder circuitry.

Colorimetry—The science of defining and measuring color and color appearance.

Component video—A method of signal representation that maintains the original image elements separately rather than combined (encoded) as a single, composite signal. Video signal sets such as R’G’B’ and Y’CrCb are component video signals.

Composite video—A standardized way to combine luma, chroma, and timing information into one signal. The NTSC and PAL standards define the methods to create a composite video signal. See Chapter 11.

Content—In the context of A/V media, content = essence + metadata.

CORBA—Common Object Request Broker Architecture.

CoS—Class of service. An indicator of performance or feature set associated with a flow of information. A CoS may be set to prioritize an A/V flow.

COTS—Commercial off-the-shelf. Products that can be purchased and integrated with little or no customization, thus facilitating customer infrastructure expansion and reducing costs. They are generic and not designed specifically to meet A/V requirements for the most part.

CRUD—Create, read, update, delete. CRUD describes the four basic data operations; RESTful Web services, for example, are built on them.

CVBS—Composite video burst and sync. Shorthand to describe a composite video signal.

D1, D2, … D16—Various SMPTE-standardized videotape formats with D4, 8, and 13 not defined.

DAS—Direct attached storage. Storage that is local and dedicated to a device. Chapter 3B covers this in detail. Sometimes called direct access storage device (DASD).

DCML—Data Center Markup Language. A model that describes a data center environment.

DES—Data Encryption Standard. An international standard for encryption and decryption. The same key is used for both.

DHCP—Dynamic Host Configuration Protocol. A method to automatically assign an IP address to a newly attached network client. A DHCP server doles out IP addresses from a pool.

DI—Digital Intermediate. A process step in the digital cinema production chain in which the image is manipulated in digital form (as compared to pure optical) before being recorded to film or other media for display.

Diffserv—Differentiated services. A method defined by the IETF to segregate user data flows per class of service and associated QoS.

DNS—Domain Name Server. This network service translates between a named address (as in www.ebay.com) and its IP address (66.135.208.89). The DNS may select an IP address from a pool, thereby performing a type of load balancing.
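
Python’s standard library can query the resolver directly; a small sketch (the addresses returned depend on your network’s DNS, and several may come back from a pool):

```python
import socket

# Resolve a name to its IPv4 address(es); multiple answers enable
# simple load balancing across a server pool.
for family, _, _, _, sockaddr in socket.getaddrinfo("www.ebay.com", 80,
                                                    socket.AF_INET):
    print(sockaddr[0])
```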

DPX—Digital Picture Exchange. A file container format for digital cinema images. DPX is defined as SMPTE standard 268M.

DRM—Digital rights management. The processes and techniques to secure and control the use of digital content by users.

DTMB—Digital Terrestrial Multimedia Broadcast. This is the digital TV broadcast standard for China.

DV—Digital Video. This is a video compression format and tape format. It is in common use for news gathering, consumer cameras, and editing. The nominal video rate is 25 Mbps, but 50 and 100 are also standardized. See also HDV.

DVB—Digital Video Broadcasting. This is a family of standards for digital transmission over terrestrial, cable, and satellite. DVB standards are implemented by 55+ countries. It supports SD and HD resolutions.

EBU—European Broadcasting Union. A television broadcast users’ group dedicated to education, setting policy, and recommendations for its members. Based in Geneva.

EDL—Edit decision list. A text file for annotating the composition of an edit project. See also AAF.

Embedded audio—The process of carrying audio and video on the same link; usually SDI as defined by SMPTE 259M or 292M.

eSATA—External Serial Advanced Technology Attachment. This is an external interface (connector type) for the SATA link. It competes with FireWire 800 and USB 2.0 to provide fast data transfer speeds for external storage devices.

Essence—Basic, low-level A/V data structures such as uncompressed audio or video, MPEG, DV, or WAV data. It is distinguished from “content,” which normally has metadata and other non-A/V elements associated with it. An MXF file packages essence elements.

FCoE—Fibre Channel over Ethernet. FCoE is a proposed mapping of Fibre Channel frames over full-duplex IEEE 802.3 networks. Fibre Channel is able to leverage 10 gigabit Ethernet networks while preserving its higher level protocols.

FCP—Fibre Channel Protocol. This is a mapping protocol for carrying the SCSI command set over the Fibre Channel link.

FEC—Forward error correction. A method to correct for faulty transmitted data by applying error correction at the receiver. FEC needs overhead bits and can only correct for a maximum number of bad bits per sent payload.
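
A toy repetition code illustrates both the overhead bits and the correction limit; real systems use far more efficient codes (Reed-Solomon, LDPC, and others):

```python
def fec_encode(bits, n=3):
    """Encode with a simple repetition code: send each bit n times."""
    return [b for bit in bits for b in [bit] * n]

def fec_decode(coded, n=3):
    """Majority-vote each group of n received bits to correct errors."""
    return [int(sum(coded[i:i + n]) > n // 2)
            for i in range(0, len(coded), n)]

sent = fec_encode([1, 0, 1])   # [1,1,1, 0,0,0, 1,1,1] -- 3x overhead
sent[1] = 0                    # one bit corrupted in transit
print(fec_decode(sent))        # [1, 0, 1] -- the error is corrected
```

Note the trade-off the entry describes: this code triples the payload yet can only correct one bad bit per group of three.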

Fibre Channel (FC)—A serial link for moving data to/from a storage element or system. It may be optical or even copper based despite the name. Fibre (the British spelling) is used instead of fiber to distinguish it from other optical fiber links. 1-, 2-, 4-, and 8-Gbps links are defined.

Field—With interlaced video, two fields are used to create a full frame. The first complete field (odd lines of the frame) is followed by the second field (even lines of the frame) in time. In practice, the lines are numbered consecutively across fields.

FPS—Frames per second.

Frame—Essentially, one complete picture. An interlaced frame is composed of two fields (two complete interlaced scans of the monitor screen). A frame consists of 525 interlaced horizontal lines in NTSC and 625 in PAL. A progressive frame is a sequential scan of lines without any interleaving.

Frame accurate—The ability to perform actions on a video signal at an exact frame position.

Gamma—The exponent value applied to a linear-light signal to obtain an R’, G’, or B’ signal. For example, R’ = R^0.45 is a gamma-corrected red-valued video signal; the 0.45 is the gamma value. The apostrophe indicates a gamma-corrected variable. See Chapter 11.
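
A one-line sketch of the operation on a normalized linear-light value:

```python
def gamma_correct(value, gamma=0.45):
    """Apply gamma correction to a normalized (0.0..1.0) linear value."""
    return value ** gamma

# A mid-gray linear value is lifted by the 0.45 exponent:
print(gamma_correct(0.18))   # ~0.46
```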

Gen Lock—See video reference.

GOLF—Group of linked files. A directory of associated files that are part of the same program material.

GOP—Group of pictures. In MPEG, a GOP is a collection of sequential pictures (frames) bound by temporal associations. A short GOP is one I frame. The long GOP format is normally a 12- to 15-length IBP sequence for standard definition formats. See also IBP.

H.264—A video compression format also defined as MPEG4 part 10. It offers superior compression compared to the older MPEG2 methods.

HA—High availability. The ability of a device/system to withstand hardware or software failures. HA is achieved by using forms of element duplication. See Chapter 5.

HANC—Horizontal ANCillary data field. Both 292M and 259M SDI links contain ancillary data space in the horizontal and vertical dimensions. HANC is included in the portion of each scanning line outside the active picture area and may be used to carry embedded audio. The vertical ANCillary data space (VANC) corresponds to the analog vertical blanking interval. It encompasses much bigger chunks of data space than HANC. Metadata may be embedded in the VANC data space.

HBA—Host bus adaptor. An interface card that plugs into a computer’s bus and provides network connectivity.

HD—High-definition video resolution. See Chapter 11.

HDD—Hard disc drive or hard disk drive.

HDV—High-definition video. This is an A/V tape and compression format using MPEG at 25 Mbps for 1080i and ~19 Mbps for 720p. The tape cartridge is the same as used for standard definition DV.

Horizontal sync—The portion of the video signal that triggers the receiver to start the next left-to-right raster scan.

HSM—Hierarchical storage management. The process of automatically moving/storing data to the lowest-cost devices commensurate with upper layer application needs.

HTTP—Hypertext Transfer Protocol. A protocol used to carry data between a client and a server over the Web. HTTPS is an encrypted version of HTTP. See also SSL.

IBP—Intraframe, Bidirectionally predicted, Predicted. This is MPEG shorthand for three different compressed video frame types. The I picture is a standalone compressed picture frame. The B picture is predicted from one or two of its I or P neighbors. The P picture is predicted from the previous I or P picture.
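
Because a B picture needs both of its anchor (I or P) pictures before it can be decoded, transmission order differs from display order. A simplified reordering sketch (trailing B pictures of an open GOP actually wait for the next GOP’s I picture):

```python
def decode_order(display_order):
    """Reorder a GOP from display order to a typical transmission order.

    Anchors (I or P) are sent ahead of the B pictures that reference
    them, since each B needs both neighbors to be decoded first.
    """
    out, pending_b = [], []
    for pic in display_order:
        if pic in "IP":          # anchor: emit it, then the held Bs
            out.append(pic)
            out.extend(pending_b)
            pending_b = []
        else:                    # B picture: hold until next anchor
            pending_b.append(pic)
    return "".join(out) + "".join(pending_b)

print(decode_order("IBBPBBPBBPBBPBB"))   # IPBBPBBPBBPBBBB
```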

IETF—Internet Engineering Task Force. The body responsible for many of the Internet’s standards.

IFS—Installable file system. This client or server software component redirects local file system calls to another internal or external file system (a CFS).

IKE—Internet Key Exchange. IKE establishes a shared security policy and authenticates keys for services that require keys such as IPSec.

ILM—Information life cycle management.

InfiniBand—A switched-fabric I/O technology that ties together servers, storage devices, and network devices.

Interlace scan—A display image that is composed of time and spatially offset interleaved image fields. Two fields create a frame. Compare to progressive scan. See Chapter 11.

Interoperability—The capability to communicate and transfer data among various functional units without format or connectivity problems.

IP—Internet Protocol. The Internet Protocol, defined by RFC 791, is the network layer for the TCP/IP protocol suite. It is a connectionless, best-effort packet-switching protocol. This is most often referred to as IPV4. See Chapter 6.

IPSec—IP Security. A security protocol that provides for confidentiality and authentication of individual IP packets.

IPV6—A version upgrade of IPV4, including improved address space, quality of service, and data security.

iSCSI—SCSI commands over an IP-based link. It finds application in SAN environments and is a replacement technology for Fibre Channel. See Chapter 3B.

ISDB—Integrated Services Digital Broadcasting. The digital television broadcast standard for Japan.

iSNS—Internet Storage Naming Service. The iSNS protocol is designed to facilitate the automated discovery, management, and configuration of iSCSI and Fibre Channel devices on a TCP/IP network.

Isochronous—Signals that carry their own timing information embedded as part of the signal. A SMPTE 259M SDI signal is an isochronous signal.

IT—Information technology. Related to technologies for the creation, storage, processing, consumption, and management of digital information.

ITIL—Information Technology Infrastructure Library. This is a framework of best practices that promote quality services in the IT sector. ITIL addresses the organizational and skill requirements for an organization to manage its IT operations.

Java EE 5—Java Enterprise Edition version 5. A Java-based, runtime platform for developing, deploying, and managing multitier, server-centric applications on an enterprise-wide scale. Java Standard Edition 6 (Java SE 6) is a reduced form of the Java EE model.

JBOD—Just a bunch of disks. This informal term refers to a hard disk array that isn’t configured according to RAID principles.

JITFT—Just in Time File Transfer. A concept of file exchange where the delivered file arrives at its destination “just in time” by some comfortable margin to be used for editing, playout, format conversion, or some other operation.

JPEG—Joint Photographic Experts Group. A digital image file format standard using image compression.

JPEG2000—A wavelet-based image compression standard used for both photos and moving images. It is the basis of the digital cinema specification as defined by the Digital Cinema Initiatives (DCI) group. It is often abbreviated as J2K.

Key—See video key.

Key performance indicators (KPIs)—Quantifiable measurements that reflect the critical success factors of an organization. Business dashboards (sales volume, inventory, units/hour, etc.) show typical KPIs.

LAN—Local area network. A data network covering a limited area. See Chapter 6.

Long GOP—See GOP.

LTC—Linear (longitudinal) time code. The SMPTE 12M time code standard historically recorded onto the audio track of a VTR or audio recorder. See also VITC.

Luma—A video signal related to the monochrome or lightness component of a scene. Often tagged as Y’.

LUN—Logical unit number. A LUN addresses a fixed amount of storage from a pool. The SCSI protocol uses LUNs to address portions of total storage.

MAM—Media asset management. These are the technologies used to index, catalog, search, browse, retrieve, manage, and archive specific media content objects. Also, more generally referred to as digital asset management when there are no time-based data types but mainly text and graphics.

MAN—Metropolitan area network. See Chapter 6.

Media Dispatch Protocol (MDP)—A transaction-oriented protocol for establishing the contract for a file transfer between two entities. It is standardized as SMPTE 2032.

Metadata—Literally defined as structured data about data. Metadata are descriptive information about an object or resource. Dark metadata are a value-set undefined to the current application but may be useful to another application.

MIB—Management Information Base. A structured collection of managed-object definitions that a management protocol such as SNMP reads and writes.

MOS Protocol—Media Object Server Protocol. A protocol for managing the rundown list and associated operations per story for a news broadcast. See www.mosprotocol.com.

MPEG—Moving Picture Experts Group. This is an ISO/IEC standards body responsible for developing A/V compression formats (MPEG1, 2, 4) and other A/V-related standards (MPEG7, 21). MPEG2 and MPEG4 part 10 (H.264) are used as distribution formats for digital cable, satellite, over-the-air, and other applications.

MPLS—Multiprotocol Label Switching. See Chapter 6.

MCSE—Microsoft Certified Systems Engineer.

MSO—Multiple system operator. A cable industry term describing a company that operates more than one cable TV system.

MXF—Material eXchange Format. A file wrapper or container format for A/V professional use. MXF encapsulates audio + video + metadata elements in a time-aligned manner. It also supports streaming. See Chapter 7 for more information.

NAS—Network attached storage. Typically, a data server on a network that provides file storage. See Chapter 3B.

NAT—Network address translation. This method maps a local area network private IP address to/from a globally unique IP address. The method conserves the precious, global IP address space.

.NET—Microsoft’s programming framework for creating applications and services using combinations of servers and clients of all types. It relies on XML and Web services to implement solutions.

NFS—Network file system. A standardized protocol for networked file sharing. NAS file servers often support this. See also CIFS.

NLE—Nonlinear editor. A system for computer-assisted editing of A/V materials without the need to assemble them in a linear sequence; the visual equivalent of word processing. Tape-based editing is considered linear editing.

NRCS—Newsroom computer system. A set of software applications for managing the editorial aspects of a news story.

NRT—Non-real-time. See also RT.

NSPOF—No single point of failure. A system that tolerates a single component failure using fast bypass techniques. Performance should always be specified under a single failure mode.

NTP—Network Time Protocol. A method for synchronizing remote clocks to a master clock over a packet-switched network with delay and jitter. RFC 1305 specifies the methods. A related protocol is IEEE 1588, the Precision Time Protocol.

NTSC—National Television Systems Committee. It describes the SD system of color analog TV used mainly in North America, Japan, and parts of South America. NTSC uses 525 lines per frame and 29.97 frames (59.94 fields) per second.

OASIS—Organization for the Advancement of Structured Information Standards. The group is a consortium that drives the development, convergence, and adoption of open standards for the global information society.

OOP—Object-oriented programming.

PAL—Phase alternate line. The name of the SD analog color television system used mainly in Europe, China, Malaysia, Australia, New Zealand, the Middle East, and parts of Africa. It uses 25 frames per second and 625 lines per frame.

Plesiochronous—Plesiochronous is derived from the Greek plesio, meaning near, and chronos, time. Plesiochronous systems run in a state in which different parts of the system are almost, but not quite perfectly, synchronized.

POTS—Plain old telephone service.

Progressive scan—An image that is scanned sequentially from top to bottom to create a single frame. Compare to interlace scan.

PSTN—Public Switched Telephone Network.

QCIF—Quarter common intermediate format. This is a spatial resolution image format of 176 (H) × 144 (V) pixels, 4:2:0. See also CIF.

QoS—Quality of service. A guarantee of predictable metrics for the data rate, latency, jitter, and loss for a network connection. It can also apply to other services with corresponding QoS metrics for that service. For example, typical QoS metrics of a storage system are transaction latency, R/W access rate, and availability.

RAID—Redundant array of independent (or inexpensive) discs. A method to improve the reliability of an array of discs. See Chapter 5.

RDMA—Remote direct memory access. A method to move block data between two memory systems where one is local and one remote.

RESTful Services—A model for Web services based on HTTP, CRUD functions, and URIs to implement the service calls. REST is derived from REpresentational State Transfer.
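
The conventional mapping is small enough to state in full (the /clips URIs are hypothetical, for illustration only):

```python
# The conventional CRUD-to-HTTP mapping used by RESTful services.
CRUD_TO_HTTP = {
    "create": "POST",    # POST   /clips     -> add a new clip
    "read":   "GET",     # GET    /clips/42  -> fetch clip 42
    "update": "PUT",     # PUT    /clips/42  -> replace clip 42
    "delete": "DELETE",  # DELETE /clips/42  -> remove clip 42
}
```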

RFC—Request for comment. A specification developed by the IETF. The document series, begun in 1969, describes the Internet suite of protocols. Not all RFCs describe Internet standards, but many do. Other bodies also contribute to the standards pool.

RFP—Request for proposal.

RGB—Red, green, and blue primary linear-light components. The exact color interpretation depends on the colorimetry scheme used.

R’G’B’—Red, green, and blue primary non-linear light components. The prime symbol denotes a gamma-corrected value. See also gamma.

Router—A device that buffers and forwards data packets across an internetwork toward their destinations. Routing occurs at layer 3 (the network layer, e.g., IP) of the protocol stack.

RPC—Remote procedure call. A protocol for connecting to and running individual processes on remote computers across a network. The client/server model may use RPC-style message passing.

RT—Real time. An activity that occurs in “A/V real time” such as live-streamed video/audio or A/V device control.

RTP—Real-time Transport Protocol. The IETF standard RFC 1889 for streaming A/V, usually across IP/UDP networks. Most Web-based A/V streaming uses RTP. Professional IP-based video carriage often relies on RTP. Most A/V data types (MPEG, MP3 audio, others) have a mapping for carriage using RTP.

Samba—An open source implementation of Microsoft’s CIFS protocol for file and printer sharing. For example, a Linux computer using Samba appears as a Windows-networked file system.

SAN—Storage area network. This is a technology for sharing a pool of storage with many independent clients and servers. See Chapter 3B for more details.

SAS—Serial-attached SCSI. This is a serial version of the venerable parallel SCSI link.

SATA—Serial ATA. The serialized form of the common ATA interface. SATA and SAS connectivity have converged; see Chapter 3B. The 3.0 Gbps speed has been widely referred to as SATA II or SATA2. This is a misnomer; there is only SATA.

SCSI—Small Computer System Interface. The standard parallel interface for disc drives and other devices for high-end use. The SCSI command layer is used in Fibre Channel and other serial links; see Chapter 3A.

SD—Standard definition video resolution. See Chapter 11.

SDI—Serial digital interface. A serial coaxial link used to move A/V digital data from point to point in a professional video system. The nominal line rate is 270 Mbps for SD video. It is defined by SMPTE 259M for SD and 292M for HD.

SDTI—Serial Digital Transport Interface (SMPTE 305M). This link uses SMPTE 259M (SDI) as an agnostic payload carrier. There are several defined data mappings onto the SDTI link with compressed payloads (MPEG, DV, VC-3) being the most common.

SECAM—A French acronym (séquentiel couleur à mémoire, “sequential color with memory”) describing an analog color television system. It is closely related to PAL in line structure and rates.

SI—Systems integrator.

SIF—Source input format. A 4:2:0, 352 × 288 (25 FPS) or 352 × 240 (29.97 FPS) image.

SLA—Service Level Agreement. A service contract between a customer and a LAN/WAN/MAN service provider that specifies the working QoS and reliability a customer should expect.

SMB—Server message block. The foundation protocol for Microsoft Windows file-server access, also described as the Common Internet File System (CIFS). See also Samba.

SMEF—Standard Media Exchange Framework. A BBC-developed XML schema for describing MAM metadata.

SMI—Storage Management Initiative. A project of the Storage Networking Industry Association (SNIA) to develop and standardize storage management methods.

SMP—Symmetric multiprocessing. See Appendix C.

SMPTE—Society of Motion Picture and Television Engineers. A professional engineering society tasked with developing educational forums and technical standards for motion pictures and television.

SNMP—Simple Network Management Protocol. A widely deployed IETF protocol for monitoring and managing networked devices.

SOA—Service-oriented architecture. An architecture of distributed, loosely coupled services available over a network. Consumers (clients) access networked (using middleware) service providers (servers) to perform some well-defined task. SOA principles enable business agility and business process visibility.

SOAP—Simple Object Access Protocol. SOAP is a lightweight protocol for the exchange of information in a decentralized, distributed environment. It is XML based and consists of three parts: an envelope that defines the message and how to process it, a set of encoding rules for expressing the application-related data types, and a convention for representing remote calls and responses. SOAP is fundamental in W3C’s Web services specs.

SONET—Synchronous Optical NETwork. See Appendix F for more information.

SPOF—Single point of failure. Compare to NSPOF.

SQL—Structured Query Language. A standard language for querying and modifying relational databases.

SSD—Solid State Disc. An SSD mimics an HDD but with non-volatile Flash memory replacing rotating platters.

SSL—Secure Sockets Layer. This is a method of encrypting networked data using a public key. HTTPS uses SSL.

S_Video—A base-band analog video format in which the chroma and luma signals are carried separately to improve fidelity.

TCO—Total cost of ownership. This metric combines all the costs of a device from capital outlay to ongoing operational costs.

TCP—Transmission Control Protocol. This protocol provides payload multiplexing and end-to-end reliability for data transfers across a network. TCP packets are carried by IP packets. This is a layer 4, connection-based protocol. See Chapter 6.

Timecode or time code—A number of the form HH:MM:SS:FF (hours, minutes, seconds, frames) that defines the frame sequence in a video file/stream or film. For 29.97 frames per second systems, FF spans from 00 to 29, whereas for 25 frames per second systems, FF spans from 00 to 24. An example is 11:49:59:24, which is immediately followed by 11:50:00:00 one frame later. See Chapter 11.
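
Timecode arithmetic is plain base conversion; a sketch for non-drop-frame rates (29.97 FPS drop-frame timecode needs extra bookkeeping):

```python
def tc_to_frames(tc, fps=25):
    """Convert HH:MM:SS:FF to a frame count (non-drop-frame)."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_tc(frames, fps=25):
    """Convert a frame count back to HH:MM:SS:FF."""
    ss, ff = divmod(frames, fps)
    mm, ss = divmod(ss, 60)
    hh, mm = divmod(mm, 60)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

# The rollover example from the definition, at 25 FPS:
print(frames_to_tc(tc_to_frames("11:49:59:24") + 1))   # 11:50:00:00
```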

TLAN—Transparent LAN. This is a MAN that is Ethernet based end to end.

TOE—TCP Offload Engine. A hardware accelerator that offloads TCP/IP stack processing from the main device CPU.

Traffic system—A software application for managing the precise scheduling of programming, commercials, and live events throughout a TV station broadcast day. The output of a traffic system is the on-air, second-by-second schedule.

TS—Transport Stream. This is an MPEG systems-layer spec for carrying compressed audio, video, and user data. Some non-MPEG compression formats also have mappings into the TS wrapper.

UDDI—Universal Description, Discovery, and Integration protocol. UDDI is a specification for maintaining standardized directories of information about Web services, their capabilities, location, and requirements.

UDP—User Datagram Protocol. This protocol provides payload multiplexing and error detection (not correction) for end-to-end data transfers over IP. This is a layer 4, connectionless protocol. Compare to TCP. See Chapter 6.

UHDV—Ultra High Definition Video. An image system with a 7,680 × 4,320 raster format that is being developed by the Japan Broadcasting Corp. (NHK). It has 16 times the resolution of HDTV at 1080i. It is expected to be in operation circa 2016.

UMID—Unique Material Identifier. A SMPTE standard, the UMID is an identifier for picture, audio, and data essence that is globally unique.

URI—Uniform Resource Identifier. In computing, a URI is a character string that names a resource. The most common URI is a URL—http://www.google.com, for example. RESTful Web services rely on URIs to identify each resource that enables CRUD-style manipulation.

VANC—Vertical ANCillary data field. See also HANC.

VBI—Vertical blanking interval. For NTSC, all the horizontal lines from 7 to 21 (field one) and from 270 to 284 (field two). These lines may carry nonvisual information such as time code, teletext, test signals, and closed caption text. For PAL, VBI spans lines 7–21 and 319–333.

VC-1—Video coding 1. A shorthand descriptor of the SMPTE 421M standard. It is based on Microsoft’s WM9 video codec.

VC-2—Video coding 2. A shorthand descriptor of the tentative SMPTE 2042 standard. VC-2 defines a wavelet-based, intra frame video decoder for production applications. It provides coding at multiple resolutions including CIF, SDTV, and HDTV.

VC-3—Video coding 3. A shorthand descriptor for the SMPTE 2019 family of standards. Avid’s DNxHD video production compression format is the basis of VC-3 supporting intra frame, 4:2:2, 10-bit sampling, and bitstream rates to 220 Mbps.

VDCP—Video Disc Control Protocol. This is commonly used to control video server operations.

Vertical sync—The portion of the video signal that triggers the receiver to start the vertical retrace, thereby bringing the raster in position to start the top line.

Video key—A video signal used to “cut a hole” in a second video signal to allow for insertion of a third video signal (the fill) into that hole.

Video reference—Typically, an analog composite or SDI video signal with a black image active area. It is also called “black burst.” It is distributed throughout a facility to any element that needs a common horizontal and vertical timing reference.

Virtualization—A technique for hiding the physical characteristics of computing resources from the way in which other systems, applications, or end users interact with those resources.

VITC—Vertical interval time code. A time code data structure described by SMPTE 12M and encoded in one or more lines of the VBI. See also LTC.

VLAN—Virtual LAN. A logical, not physical, group of networked devices. VLANs enable administrators to segment their networks (department or region) without physically rearranging the devices or network connections. VLANs are segmented at layer 2 in the protocol stack.

VPN—Virtual Private Network. A secure, end-to-end, private data tunnel across the public Internet.

W3C—World Wide Web Consortium. A vendor-neutral industry body that develops standards for the Web. Popular W3C standards include HTML, HTTP, XML, SOAP, Web services, and others.

WAFS—Wide area file services. WAFS products accelerate data transfers across WANs using caching and protocol emulation techniques.

WAN—Wide area network. A network that connects computers or systems over a large geographic area. See Chapter 6.

WBEM—Web-Based Enterprise Management Initiative. A DMTF initiative that defines standard Web-based methods for managing enterprise computing environments.

Web service—A self-describing, self-contained unit of programming logic that provides functionality (the service) through a network connection. Applications access Web services using, for example, SOAP/XML without concern for how the Web service is implemented. Do not confuse Web services with the classic Web server; they rely on completely different software models.

WMI—Windows Management Instrumentation. Microsoft’s implementation of WBEM for the Windows platform.

WSDL—Web Services Description Language. WSDL defines a Web service’s functionality and data types. It is expressed using XML.

XML—eXtensible Markup Language. A data language for structured information exchange. Values are associated with tags, enabling the definition, validation, and interpretation of data elements between applications and systems.
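
A few lines show the tag/value idea (the clip fragment is hypothetical):

```python
import xml.etree.ElementTree as ET

# A hypothetical metadata fragment: values associated with tags.
doc = "<clip><title>Evening News</title><duration unit='s'>1800</duration></clip>"
root = ET.fromstring(doc)
print(root.find("title").text)              # Evening News
print(root.find("duration").get("unit"))    # s
```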

Y’—See Luma.

Y’CrCb—Digital component signal set for uncompressed SD and HD video. See Chapter 11.

Y’PrPb—Analog component signal set for uncompressed SD and HD video. See Chapter 11.
