V-Chip

television

television, transmission and reception of still or moving images by means of electrical signals, originally primarily by means of electromagnetic radiation using the techniques of radio, now also by fiber-optic and coaxial cables and other means. Television has become a major industry, especially in the industrialized nations, and a major medium of communication and source of home entertainment. Television is put to varied use in industry, e.g., for surveillance in places inaccessible to or dangerous for human beings; in science, e.g., in tissue microscopy (see microscope); in medicine, e.g., in endoscopic surgery (see endoscope); and in education.

Evolution of the Scanning Process

The idea of "seeing by telegraph" engrossed many inventors after the discovery in 1873 of variation in the electrical conductivity of selenium when exposed to light. Selenium cells were used in early television devices; the results were unsatisfactory, however, chiefly because the response of selenium to light-intensity variations was not rapid enough. Moreover, until the development of the electron tube there was no way of sufficiently amplifying the weak output signals. These limitations precluded the success of a television method for which Paul Nipkow in Germany received (1884) a patent.

His system employed a selenium photocell and a scanning disk; it embodied the essential features of later successful devices. A scanning disk has a single row of holes arranged so that they spiral inward toward the center from a point near the edge. The disk revolves in front of a light-sensitive plate on which a lens forms an image; each hole passes across, or "scans," a narrow, ring-shaped sector of the image. Thus the holes trace contiguous concentric sectors, so that in one revolution of the disk the entire image is scanned. When the light-sensitive cell is connected in an electric circuit, the variations in light cause corresponding fluctuations in the electric current. The image can be reproduced by a receiver whose luminous area is scanned by a similar disk synchronized with the disk of the transmitter.
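
To illustrate the idea, the following sketch (in Python, with a grayscale image represented as a simple two-dimensional list of brightness values, an arrangement assumed here purely for illustration) mimics one revolution of such a disk: each hole samples the image along its own ring-shaped arc, and the concatenated samples form the single fluctuating signal a photocell would produce.

```python
import math

def nipkow_scan(image, num_holes=30, samples_per_arc=40):
    """Simulate one revolution of a Nipkow disk over a square grayscale
    image (a list of rows of brightness values in the range 0 to 1).

    Each hole sits at a slightly smaller radius than the previous one,
    so successive holes sweep adjacent ring-shaped sectors; concatenating
    the samples gives the one-dimensional signal a photocell would produce.
    """
    size = len(image)
    center = (size - 1) / 2.0
    signal = []
    for hole in range(num_holes):
        # Radius shrinks hole by hole, tracing the inward spiral.
        radius = center * (1.0 - hole / num_holes)
        for step in range(samples_per_arc):
            angle = 2 * math.pi * step / samples_per_arc
            x = int(round(center + radius * math.cos(angle)))
            y = int(round(center + radius * math.sin(angle)))
            signal.append(image[y][x])   # photocell output at this instant
    return signal

# Example: a uniform gray test image produces a flat signal.
test_image = [[0.5] * 64 for _ in range(64)]
print(len(nipkow_scan(test_image)))  # 30 holes x 40 samples = 1200 values
```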

Although selenium cells proved inadequate, the development of the phototube (see photoelectric cell) made the mechanical disk-scanning method practicable. In 1926, J. L. Baird in Great Britain and C. F. Jenkins in the United States successfully demonstrated television systems using mechanical scanning disks. As long as the aim was pictures made up of only 60 to 100 scanned lines, mechanical systems were competitive. These were soon superseded, however, by electronic scanning methods; a television system employing electronic scanning was patented by V. K. Zworykin in 1928. The 1930s saw the laboratory perfection of television equipment, and some programming became available in the United States beginning in 1939, but World War II almost entirely halted television programming and broadcasting. The television industry began to grow again only after 1945.

The television scanning process, used both to record and to reproduce an image, operates as the eyes do in reading a page of printed matter, i.e., line by line. Before the introduction of television cameras using charge-coupled devices (see below), a complex circuit of horizontal and vertical deflection coils caused an electron beam to scan the back of a mosaic of photoelectric cells in a 483-line zigzag 30 times each second; the actual viewing area when the image was reproduced was typically 440 lines, and 480 lines were later used by DVDs (digital versatile discs). (The standard was in fact a 525-line one, but not all the lines were used for the picture. The 525-line, 30-frame-per-second system was used in the United States, Japan, and elsewhere; many other countries used similar but incompatible systems.) Because of persistence of vision, only about 16 pictures need be viewed each second to give the effect of motion. Interlaced scanning was developed so that alternate lines were scanned every 1/60 sec: half the lines were scanned in the first 1/60 sec, and the remaining lines, each one falling between two lines scanned during the first pass, were covered in the next 1/60 sec.
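
As an illustration of interlaced scanning, the following sketch (Python; the 483-line figure is the visible-line count given above) lists the two fields of one frame, the odd-numbered lines scanned in the first 1/60 sec and the even-numbered lines in the second.

```python
def interlaced_fields(visible_lines=483):
    """Return the two fields of one interlaced frame as lists of line numbers.

    Field 1 holds the odd-numbered lines, field 2 the even-numbered lines;
    each field is scanned in 1/60 sec, so a complete frame takes 1/30 sec.
    """
    lines = list(range(1, visible_lines + 1))
    field_1 = lines[0::2]   # lines 1, 3, 5, ...
    field_2 = lines[1::2]   # lines 2, 4, 6, ...
    return field_1, field_2

f1, f2 = interlaced_fields()
print(len(f1), len(f2))   # 242 and 241 lines per field
print(f1[:3], f2[:3])     # [1, 3, 5] [2, 4, 6]
```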

Development of the Television Camera and Receiver

V. K. Zworykin's Iconoscope (1923) was the first successful camera tube in wide use. Its functioning involved many fundamental principles common to all television image pickup devices. The face of the iconoscope consisted of a thin sheet of mica upon which thousands of microscopic globules of a photosensitive silver-cesium compound had been deposited. Backed with a metallic conductor, this expanse of mica became a mosaic of tiny photoelectric cells and capacitors. The differing light intensities of various points of a scene caused the cells of the mosaic to emit varying quantities of electrons, leaving the cells with positive charges proportionate to the number of electrons lost. An electron gun, or scanner, passed its beam across the cells. As it did so, the charge was released, causing an electrical signal to appear on the back of the mosaic, which was connected externally to an amplifier. The strength of the signal was proportional to the amount of charge released. The iconoscope provided good resolution, but required very high light levels and needed constant manual correction.

The Orthicon and Image-Orthicon camera tubes improved on the Iconoscope. They used light-sensitive granules deposited on an insulator and low-velocity scanning. They could be used with lower light levels than the Iconoscope required and did not need constant manual correction. The Vidicon was the first successful television camera tube to use a photoconductive surface to derive a video signal.

Solid-state imaging devices were first demonstrated in the 1960s. Video cameras using semiconductor charge-coupled devices (CCDs) were developed in the 1970s and began replacing tube-based cameras in the mid-1980s. Each picture element (pixel) in a CCD stores a charge that is determined by the illumination incident on it. At the end of the exposure interval, the charge is transferred to a storage register and the CCD is freed for the next exposure; the charges in the storage register are transferred serially to the output stage during that time. By the mid-1990s, CCD-based television cameras had replaced tube-based cameras, but at the same time development was proceeding on a different solid-state technology, the complementary metal-oxide semiconductor (CMOS) image sensor. CMOS technology is also used for computer integrated circuits and random-access memory (RAM), and CMOS image sensors are less expensive to manufacture than CCDs. In general a CMOS image sensor operates similarly to a CCD, but additional processing occurs at each pixel, and the pixels transfer their output more quickly and in a digital format. Although CMOS-based cameras were initially inferior to CCD-based ones for high-quality uses, steady improvements in CMOS technology led by the 2010s to CMOS sensors replacing CCDs in many television and video cameras. High-end 3CCD and 3CMOS video cameras use three sensors, one each for red, green, and blue, for improved color image quality.
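
The exposure-and-readout cycle described above can be sketched as follows (Python; the tiny two-by-two "scene" and the function names are illustrative assumptions, not an actual device interface).

```python
def expose(scene, exposure_time=1.0):
    """Accumulate charge in each pixel in proportion to the incident light."""
    return [[illumination * exposure_time for illumination in row]
            for row in scene]

def transfer_to_storage(pixel_charges):
    """Shift the whole frame of charge into the storage register at once,
    freeing the image area for the next exposure."""
    return [row[:] for row in pixel_charges]

def serial_readout(storage):
    """Read the stored charges out one at a time through the output stage."""
    for row in storage:
        for charge in row:
            yield charge

scene = [[0.2, 0.8], [0.5, 1.0]]       # toy illumination pattern
storage = transfer_to_storage(expose(scene))
print(list(serial_readout(storage)))   # [0.2, 0.8, 0.5, 1.0]
```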

In the television receiver, the original image is reconstructed. In television receivers using cathode-ray tubes, this was done essentially by reversing the operation of the video camera. The final 483- or 480-line interlaced image was displayed on the face of the tube, where an electron beam scanned the fluorescent face, or screen, line for line with the pickup scanning. The fluorescent deposit on the tube's inside face glowed when hit by the electrons, and the visual image was reproduced. In a television set with a liquid crystal display (LCD, also called LED LCD or LED if light-emitting diode backlighting is used), which also recreates the image line by line, control signals are sent to the rows and columns formed by the hundreds of thousands of pixels in the display, each of which is connected to a switching device. In high-definition televisions, 720 or 1080 display lines are used, with 1080 now typically standard, and the scanning may be progressive (noninterlaced; 720 and 1080) or interlaced (1080 only), typically at 30 frames per second. Progressive scanning in general produces a picture with less flicker and better reproduction of motion, particularly on larger screens. An ultra high-definition, or 4K, display uses 2160 lines for the image. Other devices in the receiver extract the crucial synchronization information from the signal, demodulate it (separate the information signal from the carrier wave), and, in the case of a digital signal, demultiplex, decrypt, and decode it.
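
The display formats mentioned above can be summarized as follows (a Python sketch; the horizontal pixel counts are the common values for these formats and are an assumption, since the article gives only line counts).

```python
# (lines, scan type, typical pixels per line) for common display formats
DISPLAY_FORMATS = {
    "480i":  (480,  "interlaced",  720),   # standard definition (DVD)
    "720p":  (720,  "progressive", 1280),  # high definition
    "1080i": (1080, "interlaced",  1920),  # high definition
    "1080p": (1080, "progressive", 1920),  # high definition
    "2160p": (2160, "progressive", 3840),  # ultra high definition (4K)
}

for name, (lines, scan, width) in DISPLAY_FORMATS.items():
    print(f"{name}: {width} x {lines}, {scan}")
```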

Development of Color Television

Several systems of color television have been developed. In the first color system approved by the Federal Communications Commission (FCC), a motor-driven disk with segments in three primary colors—red, blue, and green—rotated behind the camera lens, filtering the light from the subject so that the colors could pass through in succession. The receiving unit of this system formed monochrome (black-and-white) images through the usual cathode-ray tube, but a color wheel, identical with that affixed to the camera and synchronized with it, transformed the images back to their original appearance. This method is said to be "field-sequential" because the monochrome image is "painted" first in one color, then another, and finally in the third, in rapid enough succession that the individual colors are blended by the retentive capacities of the eye, giving the viewer the impression of a full-color image. This system, developed by the Columbia Broadcasting System (CBS), was established in 1950 as the standard for the United States by the FCC. However, it was not "compatible," i.e., a good picture could not be obtained on standard black-and-white sets from the same signal, so it found scant public acceptance.

It was the development of a simultaneous compatible system by the Radio Corporation of America (RCA), first demonstrated in 1951, that led to the widespread acceptance of color television. In this electronic, "element-sequential" system, light from the subject is broken up into its three color components, which are simultaneously scanned by three pickups. The signals corresponding to the red, green, and blue portions of the scanned elements are combined electronically so that the required 4.1-MHz bandwidth can be used. In the receiver the three color signals are separated for display. The elements, or dots, on the picture tube screen are each subdivided into areas of red, green, and blue phosphor. Beams from three electron guns, modulated by the three color signals, scan the elements together in such a way that the beam from the gun using a given color signal strikes the phosphor of the same color. Provision is made electronically for forming proper gray tones in black-and-white receivers. In 1953 the FCC reversed its 1950 ruling and revised the standards for acceptable color television systems. The RCA system met the new standards (the CBS system did not) and was well received by the public.
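
The "proper gray tones" for black-and-white receivers are obtained by combining the three color signals into a single luminance (brightness) signal. A brief sketch (Python) using the standard NTSC luminance weights, figures that are well established but not stated in the article itself:

```python
def luminance(red, green, blue):
    """Combine normalized R, G, B signals (each in 0..1) into the single
    luminance (brightness) signal that a black-and-white receiver displays.

    The weights are the standard NTSC values, reflecting the eye's greater
    sensitivity to green than to red or blue.
    """
    return 0.299 * red + 0.587 * green + 0.114 * blue

print(luminance(1.0, 1.0, 1.0))  # pure white -> 1.0 (up to rounding)
print(luminance(0.0, 1.0, 0.0))  # pure green -> 0.587 (appears fairly bright)
print(luminance(0.0, 0.0, 1.0))  # pure blue  -> 0.114 (appears dark)
```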

Broadcast, Cable, and Satellite Television Transmission

Television programs may be transmitted either "live" or from a recording. For many years the principal means of recording television programs for future use was videotape recording, although programs were at first recorded (when they were recorded at all) by kinescope, a method that uses motion-picture film. Appropriate changes in the signal-carrying circuitry allow kinescopes to be played back from a developed negative as well as from a positive. Videotape recording is similar to conventional tape recording (see tape recorder; videocassette recorder) except that, because of the wide frequency range—4.2 megahertz (MHz)—occupied by a video signal, the effective speed at which the tape passes the head is kept very high. The sound is recorded along with the video signal on the same tape. Television programs may also be recorded on a computer drive that uses hard disks or solid-state flash memory and on optical discs such as DVDs in a variety of formats.

When a television program is broadcast, the varying electrical signals are then amplified and used to modulate a carrier wave (see modulation); the modulated carrier is fed to an antenna, where it is converted to electromagnetic waves and broadcast over a large region. The waves are sensed by antennas connected to television receivers. The range of waves suitable for radio and television transmission is divided into channels, which are assigned to broadcast companies or services. In the United States the Federal Communications Commission (FCC) currently has assigned 12 television channels between 54 and 216 MHz in the very-high-frequency (VHF) range and 47 channels between 470 and 698 MHz in the ultra-high-frequency (UHF) range; 32 additional channels at the upper end of the UHF range (698–890 MHz) originally assigned to television broadcasting were reassigned to other uses between 1983 and 2009 (see radio frequency). Since the transition to digital broadcasting was completed in 2009, the UHF range has increased in importance for television broadcasting even as the number of viewers receiving broadcast television programs has declined. Originally a television station's channel number was identical to the channel number of the radio frequency channel it used for its broadcasts, but as a result of the digital transition that often is no longer true.
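
As an illustration of the channel plan, the following sketch (Python) converts a U.S. broadcast channel number into the lower edge of its 6-MHz frequency slot; the band boundaries used are the standard U.S. assignments, which the article does not enumerate channel by channel.

```python
def channel_lower_edge_mhz(channel):
    """Return the lower edge, in MHz, of a U.S. broadcast TV channel.

    Channels are 6 MHz wide: 2-4 occupy 54-72 MHz, 5-6 occupy 76-88 MHz,
    7-13 occupy 174-216 MHz (VHF), and 14 and up start at 470 MHz (UHF).
    """
    if 2 <= channel <= 4:
        return 54 + (channel - 2) * 6
    if 5 <= channel <= 6:
        return 76 + (channel - 5) * 6
    if 7 <= channel <= 13:
        return 174 + (channel - 7) * 6
    if channel >= 14:
        return 470 + (channel - 14) * 6
    raise ValueError("no such broadcast channel")

print(channel_lower_edge_mhz(2))   # 54 MHz
print(channel_lower_edge_mhz(13))  # 210 MHz (channel 13 spans 210-216 MHz)
print(channel_lower_edge_mhz(51))  # 692 MHz (channel 51 spans 692-698 MHz)
```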

Most television viewers in the United States no longer receive signals by using antennas; instead, they receive programming via cable television. Cable delivery of television started as a way to improve reception. A single, well-placed community antenna received the broadcast signals and distributed them over coaxial cables to areas that otherwise would not be able to receive them. Today, cable television is popular because of the wide variety of programming it can deliver. Many systems now provide hundreds of channels of programming. A cable television company now typically receives signals relayed from a communications satellite and sends those signals to its subscribers over coaxial or fiber-optic cable. Some television viewers use small satellite dishes to receive signals directly from satellites. Most satellite-delivered signals are scrambled and require a special decoder to receive them clearly.

See also broadcasting.

Television Technology Innovations

The FCC established a stereo audio standard for television in 1984, and by the mid-1990s all major network programming was broadcast in stereo. In 1996 the FCC adopted a U.S. standard for an all-digital HDTV system, to be used by all commercial broadcast stations by mid-2002. Although it was hoped that the transition to digital broadcasting would be largely completed by 2006, less than a third of all stations had begun transmitting digital signals by the mid-2002 deadline. In 2005 the U.S. government mandated an end to analog broadcasting in Feb., 2009 (a deadline later changed to June, 2009, shortly before it arrived). After the transition to digital broadcasting was completed, older analog sets required an external digital converter in order to receive broadcast programs.

The next great advance was the adoption of a high-definition television (HDTV) system. Non-experimental analog HDTV broadcasting began in Japan in 1991. The most noticeable difference between the previously existing system and the HDTV system is the aspect ratio of the picture. While the ratio of the width of the old standard TV picture to its height is 4:3, the HDTV picture has a ratio of 16:9, about the same as the screen used in a typical motion-picture theater. HDTV also provides higher picture resolution and high-quality audio. A total of 750 or 1,125 scan lines are embedded in the HDTV signal. Each frame of video consists of 720 or 1080 visible horizontally scanned lines; the remaining scan lines may carry the time code, vertical synchronization information, closed captioning, and other information.
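
The figures above can be checked with a little arithmetic (a Python sketch; the horizontal pixel counts of 1280 and 1920 are common values assumed for illustration, not given in the article):

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a width:height pixel count to its simplest ratio."""
    d = gcd(width, height)
    return width // d, height // d

print(aspect_ratio(640, 480))    # (4, 3)  -> old standard-definition shape
print(aspect_ratio(1280, 720))   # (16, 9) -> HDTV shape at 720 visible lines
print(aspect_ratio(1920, 1080))  # (16, 9) -> HDTV shape at 1080 visible lines

# Visible picture lines vs. total scan lines embedded in the HDTV signal
print(1125 - 1080)  # 45 lines of the 1,125-line format carry no picture
print(750 - 720)    # 30 lines of the 750-line format carry no picture
```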

Television networks experimented with so-called three-dimensional (3D) or stereoscopic television during the late 20th and early 21st cent., using a variety of technologies to create an illusion of depth in the picture. In the early 2010s networks, filmmakers, and television manufacturers produced more regular 3D programming, an increasing number of 3D motion pictures, and a variety of 3D-capable television sets (most relying on special glasses that needed to be worn while viewing 3D programs and movies). Consumers, however, did not widely embrace 3D television, leading major television manufacturers to end the production of 3D sets by mid-decade.

Because the wide availability of television has raised concerns about the amount of time children spend watching television, as well as the increasingly violent and graphic sexual content of television programming, the FCC required television set manufacturers to install, starting in 1999, "V-Chip" technology that allows parents to block the viewing of specific programs. That same year the television industry adopted a voluntary ratings system to indicate the content of each program.
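
In outline, a V-chip compares the rating carried by a program against a limit configured by the parent and blocks anything more restrictive than that limit. The following is a minimal sketch (Python) of that comparison; the rating names are the U.S. TV Parental Guidelines categories, but the list-based encoding is an illustrative assumption, not the actual broadcast format.

```python
# Illustrative ordering of U.S. TV parental guideline ratings, mildest first.
# A real V-chip reads ratings encoded in the broadcast signal; this toy
# version simply compares positions in the list.
RATING_ORDER = ["TV-Y", "TV-Y7", "TV-G", "TV-PG", "TV-14", "TV-MA"]

def is_blocked(program_rating, parental_limit):
    """Return True if the program's rating is more restrictive than allowed."""
    return RATING_ORDER.index(program_rating) > RATING_ORDER.index(parental_limit)

print(is_blocked("TV-MA", "TV-PG"))  # True  -> blocked
print(is_blocked("TV-G", "TV-PG"))   # False -> allowed
```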

Various interactive television systems now exist. Cable television systems use an interactive system for instant ordering of "pay-per-view" programming or other on-demand viewing of programs. Cable systems also may poll their subscribers' equipment to compile information on program preferences, and interactive systems can be used for instant public-opinion polls or for home shopping. So-called smart televisions include an operating system and storage, and allow users to run computer applications (apps) that resemble those designed for smartphones. Standards have also been developed for the distribution of television programming via the Internet.

Bibliography

See D. G. Fink and D. M. Lutyens, The Physics of Television (1960); M. S. Kiver and M. Kaufman, Television Simplified (7th ed. 1973); R. Armes, On Video (1988); K. B. Benson and J. C. Whitaker, Television and Audio Handbook (1990); K. B. Benson, Television Engineering Handbook (1992); D. E. Fisher and M. J. Fisher, Tube (1996).

The Columbia Encyclopedia, 6th ed. Copyright © 2018, The Columbia University Press.

V-Chip: Selected full-text books and articles

"The V-Chip in Canada and the United States: Themes and Variations in Design and Deployment," by Stephen D. McDowell and Carleen Maitland, Journal of Broadcasting & Electronic Media, Vol. 42, No. 4 (Fall 1998). Peer-reviewed periodical.
A Cognitive Psychology of Mass Communication, by Richard Jackson Harris (Lawrence Erlbaum Associates, 4th ed., 2004). Librarian's tip: the V-Chip is discussed in "Helping Children Deal with Violent Media," which begins on p. 282.
Free Expression and Censorship in America: An Encyclopedia, by Herbert N. Foerstel (Greenwood Press, 1997). Librarian's tip: "V-Chip" begins on p. 222.
Society on the Line: Information Politics in the Digital Age, by Malcolm Peltu and William H. Dutton (Oxford University Press, 1999). Librarian's tip: "The Violence Chip (V-Chip)" begins on p. 67.
Regulating the Changing Media: A Comparative Study, by Stefaan Verhulst, David Goldberg, and Tony Prosser (Clarendon Press, 1998). Librarian's tip: "V-Chip Legislation" begins on p. 238.
"The Telecommunications Act of 1996: Its Impact on the Electronic Media of the 21st Century," by John Allen Hendricks, Communications and the Law, Vol. 21, No. 2 (June 1999).
"Ratings and the V-Chip," by Barbara Dority and John Perry Barlow, The Humanist, Vol. 56, No. 3 (May-June 1996).
