In the early 1950s, cable
systems began experimenting with ways to use microwave
transmitting and receiving towers to capture the signals from
distant stations. In some cases, this made television
available to people who lived outside the range of standard TV
broadcasts. In other cases, especially in the northeastern
United States, it meant that cable customers might have access
to several broadcast stations of the same network. For the
first time, cable was used to enrich television viewing, not
just make ordinary viewing possible. This started a trend that
would begin to blossom in the 1970s.
The addition of CATV (community antenna television)
stations and the spread of cable systems ultimately led
manufacturers to add a switch to most new television
sets. People could set their televisions to tune to channels
based on the Federal
Communications Commission (FCC) frequency allocation plan,
or they could set them for the plan used by most cable
systems. The two plans differed in important ways.
In both tuning systems, each television station was given a
6-megahertz (MHz) slice of the radio
spectrum. The FCC had originally devoted parts of the
very high frequency (VHF) spectrum to 12 television
channels (2-13). These were the original TV channels broadcast
over the air and were typically allocated to the major TV
networks such as CBS, NBC, and ABC. In a given market, for
example, channel 5 might be the local CBS station, channel 4
NBC, and channel 12 ABC. The channels weren't put into a single block of
frequencies, but were instead broken into two groups to
avoid interfering with existing radio
services.
Later, when the growing popularity of television
necessitated additional channels, the FCC allocated
frequencies in the ultra-high frequency (UHF) portion
of the spectrum. It established channels 14 to 69 using a
block of frequencies between 470 MHz and 806 MHz.
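The allocation is simple enough to express in code. Here is a
minimal Python sketch (the function name is invented for
illustration) that maps a broadcast channel number to its 6-MHz
slice using the standard FCC channel-edge frequencies; note the
gaps between channels 4 and 5 and between channels 6 and 7,
where other radio services live:

    def broadcast_band(channel):
        # Map a U.S. over-the-air TV channel to its 6-MHz slice
        # (low, high) in MHz under the FCC broadcast plan.
        if 2 <= channel <= 4:        # low VHF, first group
            low = 54 + (channel - 2) * 6
        elif 5 <= channel <= 6:      # low VHF, second group
            low = 76 + (channel - 5) * 6
        elif 7 <= channel <= 13:     # high VHF; FM radio, aircraft bands and
            low = 174 + (channel - 7) * 6   # other services sit below 174 MHz
        elif 14 <= channel <= 69:    # UHF block starting at 470 MHz
            low = 470 + (channel - 14) * 6
        else:
            raise ValueError("not a broadcast TV channel")
        return low, low + 6

    print(broadcast_band(6))   # (82, 88): channel 6 ends at 88 MHz
    print(broadcast_band(7))   # (174, 180): channel 7 starts far above it
    print(broadcast_band(69))  # (800, 806): the top of the UHF block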
Because CATV used cable instead of antennas, cable
television systems didn't have to worry about existing
broadcasting services. Engineers could use the frequencies between 88 MHz
and 174 MHz for 13 channels of programming and begin channel
14 at 216 MHz. The "CATV/Antenna" switch tells the
television's tuner whether to tune around the restricted
blocks in the FCC broadcast plan or tune "straight through"
for cable reception. In the CATV position, the switch tells
the tuner to start at 88 MHz and go straight up in 6-MHz
slices, with no break.
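For contrast, the cable plan described above amounts to nothing
more than counting upward from 88 MHz. A minimal sketch,
assuming the unbroken 6-MHz slices just described:

    def cable_band(slot):
        # Simplified "straight through" cable plan: start at
        # 88 MHz and step upward in unbroken 6-MHz slices.
        low = 88 + slot * 6
        return low, low + 6

    # The lowest slots reuse spectrum (FM radio, aircraft bands)
    # that is off-limits over the air but safe inside a shielded
    # cable.
    print(cable_band(0))   # (88, 94)
    print(cable_band(12))  # (160, 166)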
While we're on the subject of tuning, it's worth
considering why CATV systems don't use the same frequencies
for stations broadcasting on channels 1 to 6 that those
stations use to broadcast over the airwaves. Cable equipment
is designed to shield the signals carried on the cable
from outside interference, and televisions are designed to
accept signals only from the point of connection to the cable
or antenna; but interference can still enter the
system, especially at connectors. When the interference comes
from the same channel that's carried on the cable, there's a
problem, because the two signals travel at different speeds and
so arrive at slightly different times.
Radio signals travel through the air at a speed very close
to the speed
of light. In a coaxial cable like the one that
brings CATV signals to your house, radio signals travel at
about two-thirds the speed of light. When the broadcast and
cable signals reach the television tuner a few microseconds
apart, you see a double image called "ghosting."
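A rough calculation shows why those few microseconds are
visible. The 10-kilometer path length below is an invented
illustrative figure; the 63.5-microsecond scan line is the
standard NTSC value:

    C_AIR = 299.79            # signal speed in air, meters per microsecond (~c)
    C_CABLE = C_AIR * 2 / 3   # about two-thirds of c in coaxial cable

    path_m = 10_000           # assume both paths run about 10 km
    delay = path_m / C_CABLE - path_m / C_AIR   # ~16.7 microseconds

    # An NTSC scan line takes about 63.5 microseconds, so a leaked
    # broadcast signal arriving this much early paints a ghost about
    # a quarter of a screen width away from the cable picture.
    print(f"delay: {delay:.1f} us ({delay / 63.5:.0%} of a scan line)")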
Digital Cable
In 1989, General Instrument demonstrated that it was possible to convert an analog cable signal to digital and transmit it in a standard 6-MHz television channel. Using MPEG compression, CATV systems installed today can transmit up to 10 channels of video in the 6-MHz bandwidth of a single analog channel. Combined with an overall system bandwidth of 550 MHz, this makes nearly 1,000 channels of video possible on a single system. In addition, digital technology allows for error correction to ensure the quality of the received signal. The move to digital technology also changed the quality of one of cable television's most visible features: the scrambled channel.
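The arithmetic behind that channel count, using only the
figures from the paragraph above:

    analog_slots = 550 // 6         # 91 six-MHz slices fit in 550 MHz
    per_slot = 10                   # MPEG-compressed programs per slice
    print(analog_slots * per_slot)  # 910, i.e. nearly 1,000 channels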
The first system to "scramble" a channel on a cable system was demonstrated in 1971. In the first scrambling system, one of the signals used to synchronize the television picture was removed when the signal was transmitted, then reinserted by a small device at the customer's home. Later scrambling systems inserted a signal slightly offset from the channel's frequency to interfere with the picture, then filtered the interfering signal out of the mix at the customer's television. In both cases, the scrambled channel could generally be seen as a jagged, jumbled set of video images.
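As a toy illustration of the first, sync-suppression approach,
the sketch below models a scan line as a short list of samples;
real scan lines are analog waveforms, and every value here is
invented:

    SYNC = -40                     # sync-pulse level (illustrative units)
    line = [SYNC, 12, 55, 70, 31]  # sync pulse followed by picture samples

    def scramble(line):
        return line[1:]            # headend strips the sync pulse

    def descramble(line):
        return [SYNC] + line       # the home device reinserts it

    # Without the sync pulse a set can't lock onto the start of each
    # line, so the picture rolls and tears into a jumbled image.
    assert descramble(scramble(line)) == line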
In a digital system, the signal isn't scrambled, but encrypted. The encrypted signal must be decoded with the proper key. Without the key, the digital-to-analog converter can't turn the stream of bits into anything usable by the television's tuner.
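A minimal sketch of the difference, using a toy XOR stream
cipher; real digital cable relies on standardized
conditional-access encryption, not this:

    import secrets

    def xor_stream(data, key):
        # Toy cipher for illustration only: XOR each byte of the
        # stream with a repeating key. Applying it twice with the
        # same key restores the original bytes.
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    stream = b"compressed MPEG transport stream bits"
    key = secrets.token_bytes(16)

    encrypted = xor_stream(stream, key)
    # Without the key the converter sees only noise-like bytes; with
    # it, the original bit stream comes back exactly.
    assert xor_stream(encrypted, key) == stream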