Magazine article American Cinematographer

# A Layman's Introduction to Digital Video

## Article excerpt

There is a lot of excitement in some quarters these days about The Digital Revolution in television. For the filmmaker who has only recently decided that electronic technology may have something to offer him, discussions of digital video may seem simply to be heaping coals on the fires of his confusion. He probably couldn't care less whether a video signal is analog or digital so long as he gets what he wants on the screen. Nonetheless it may be useful to have some rudimentary understanding of digital video technology. The more one understands a tool, the better one is able to use it; and, if nothing else, a filmmaker may feel more comfortable in a video facility if he can understand a little bit of the language being spoken there.

The term "digital" refers to a method of converting information into an electrical signal alternating between two fixed levels which can be thought of as "on" and "off" or "1" and "0." Computers as a rule process information digitally. The key to digitization is the translation of information into "words" composed of 1's and 0's, i.e. the conversion of everything into binary numbers. Binary numbers are composed of only two digits (0 & 1), so that counting from zero to ten would go 0, 1, 10, 11, 100, 101, 110, 111, 1000, 1001, 1010. While a number system based on ten digits is much handier in everyday life, a binary number system connected to an electrical signal can accomplish wonders. For example, by assigning a specific binary number to each letter of the alphabet and to each normal decimal number, it is possible to convert all manner of information into electrical signals which can be recorded and manipulated by on/off switches in complex circuits.
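The counting and letter-assignment ideas above can be sketched in a few lines of Python; the choice of which binary number stands for which letter is illustrative (modern computers use schemes such as ASCII, which the `ord` function reflects):

```python
# Counting from zero to ten in binary, as described in the text.
for n in range(11):
    print(n, "->", bin(n)[2:])

# Assigning a specific binary number to each letter of the alphabet:
# here we show the ASCII codes for two letters as eight-bit words.
for letter in "TV":
    print(letter, "->", format(ord(letter), "08b"))
```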

The individual signal (i.e. a "1" or "0") is called a "bit" (condensed from binary digit), and most computers work with eight-bit "words" called "bytes." When someone talks about a "megabyte," he is talking about a million bytes, or 8 million bits, of information. With eight bits there are 256 different possible combinations (2^8), and most computers are designed so that eight bits can be transmitted simultaneously. This is called parallel transmission, as opposed to serial transmission, in which the bits are transmitted one after another over a single line with an indication of the beginning and end of each byte.
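The parallel/serial distinction can be illustrated with a toy byte; the `serialize` function and its "start"/"stop" markers are a simplified stand-in for the framing a real serial link would use:

```python
# An eight-bit byte can take 2**8 = 256 distinct values.
assert 2 ** 8 == 256

byte = 0b10110010  # an arbitrary 8-bit word

# Parallel transmission: all eight bits are available at once,
# listed here with the most significant bit first.
parallel = [(byte >> i) & 1 for i in range(7, -1, -1)]
print(parallel)

# Serial transmission: the same bits sent one after another over a
# single line, framed so the receiver can find byte boundaries.
def serialize(b):
    yield "start"
    for i in range(7, -1, -1):
        yield (b >> i) & 1
    yield "stop"

print(list(serialize(byte)))
```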

### Digital vs. Analog

To understand how all this can be applied to a television signal, we must first understand the difference between an "analog" signal and a digital signal. Generally speaking, an analog signal is a continuously variable signal rather than a series of discrete pulses at fixed levels. An audio signal created by a microphone is a continuously varying electrical signal the "shape" or characteristics of which are analogous to the "shape" of the sound wave. Similarly, the video signal generated by a pick-up tube is a continuously varying signal in which the amplitude corresponds to the brightness level of the image at a given point on a particular scan line. All video signals (like all audio signals) originate as analog signals, but it is possible to convert them into digital signals. This is done by analyzing the signal as a series of discrete voltage levels occurring one after the other. Just as a curve on a graph can be analyzed as a series of points having definite values, the changes in the amplitude of a video signal can be analyzed as a series of distinct voltage levels. The ability to reproduce smoothly and accurately the shape of a curve on a graph depends on the number of points selected. Similarly, the quality of a digital video signal depends on the "sampling rate," or the frequency with which the voltage level is measured. Generally the sampling rate for a digital video signal is greater than 12 million times a second.
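Sampling can be sketched as measuring a continuously varying signal at regular intervals. This is a minimal illustration only: the sine wave stands in for an analog voltage, and the toy sample rate is nowhere near the 12-million-per-second figure quoted above.

```python
import math

SAMPLE_RATE = 8  # samples per cycle; illustrative, not a real video rate

def analog_signal(t):
    """A stand-in for a continuously varying voltage (one sine cycle)."""
    return math.sin(2 * math.pi * t)

# Measure the signal at evenly spaced instants across one cycle.
samples = [analog_signal(i / SAMPLE_RATE) for i in range(SAMPLE_RATE)]
print([round(s, 3) for s in samples])
```

Raising `SAMPLE_RATE` reproduces the curve more faithfully, which is the point the paragraph makes about the number of points on a graph.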

Each time the video signal is sampled, the voltage level of the signal must be translated into a digital "word" which can then be transmitted. If the system is designed to use an eight-bit word, it can distinguish between 256 levels of amplitude (or levels of brightness in the image). …
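The translation of each sampled voltage into an eight-bit word can be sketched as follows; the normalization of the voltage to a 0.0–1.0 range is an assumption made for the illustration:

```python
# Quantization: map a sampled voltage (normalized here to 0.0-1.0)
# onto one of the 256 levels an eight-bit word can distinguish.
# Level 0 would correspond to the darkest point, level 255 the brightest.
def quantize_8bit(voltage):
    voltage = min(max(voltage, 0.0), 1.0)  # clip out-of-range values
    return round(voltage * 255)

for v in (0.0, 0.25, 0.5, 1.0):
    level = quantize_8bit(v)
    print(v, "->", level, "=", format(level, "08b"))
```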
