A modem (a portmanteau constructed from modulate and demodulate) is a device that modulates an analog carrier signal to encode digital information, and also demodulates such a carrier signal to decode the transmitted information. The goal is to produce a signal that can be transmitted easily and decoded to reproduce the original digital data. Modems can be used over any means of transmitting analog signals, from light-emitting diodes to radio. Experiments have even been performed in the use of modems over the medium of two cans connected by a string.
The most familiar example of a modem turns the digital '1s and 0s' of a personal computer into sounds that can be transmitted over the telephone lines of Plain Old Telephone Systems (POTS), and once received on the other side, converts those sounds back into 1s and 0s. Modems are generally classified by the amount of data they can send in a given time, normally measured in bits per second, or "bps".
Far more exotic modems are used by Internet users every day, notably cable modems and ADSL modems. In telecommunications, "radio modems" transmit repeating frames of data at very high data rates over microwave radio links. Some microwave modems transmit more than a hundred million bits per second. Optical modems transmit data over optic fibers. Most intercontinental data links now use optic modems transmitting over undersea optical fibers. Optical modems routinely have data rates in excess of a billion (10⁹) bits per second.
History
[Image: 1957 AT&T Dataphone]

Modems in the United States were first introduced as a part of the SAGE air-defense system in the 1950s, connecting terminals at various airbases, radar sites, and command-and-control centers to the SAGE director centers scattered around the U.S. and Canada. SAGE ran on dedicated communications lines, but the devices at each end were otherwise similar in concept to today's modems. IBM was the primary contractor for both the computers and the modems used in the SAGE system.
A few years later, a chance meeting between the CEO of American Airlines and a regional manager of IBM led to development of a "mini-SAGE" as an automated airline ticketing system. The terminals were at ticketing offices, tied to a central computer that managed availability and scheduling. The system, known as SABRE, is the ancestor of today's Sabre system.
AT&T monopoly in the United States
For many years, AT&T maintained a monopoly in the United States on the use of its phone lines, allowing only AT&T-supplied devices to be attached to its network. For the growing group of computer users, AT&T introduced two digital sub-sets in 1958: a wideband device, and a low-speed modem which ran at 200 baud.
In the summer of 1960, the name Data-Phone was introduced to replace the earlier term digital subset. The 202 Data-Phone was a half-duplex asynchronous service that was marketed extensively in late 1960. In 1962, the 201A and 201B Data-Phones were introduced. They were synchronous modems using two-bit-per-baud phase-shift keying (PSK). The 201A operated half-duplex at 2000 bit/s over normal phone lines, while the 201B provided full duplex 2400 bit/s service on four-wire leased lines, the send and receive channels running on their own set of two wires each.
The famous 103A was also introduced in 1962. It provided full-duplex service at up to 300 baud over normal phone lines. Frequency-shift keying (FSK) was used with the call originator transmitting at 1070 or 1270 Hz and the answering modem transmitting at 2025 or 2225 Hz. The readily available 103A2 gave an important boost to the use of remote low-speed terminals such as the KSR33, the ASR33, and the IBM 2741. AT&T reduced modem costs by introducing the originate-only 113D and the answer-only 113B/C modems.
The Carterfone decision
[Image: The Novation CAT acoustically coupled modem]

Before 1968, AT&T maintained a monopoly on what devices could be electrically connected to its phone lines. This led to a market for 103A-compatible modems that were mechanically connected to the phone through the handset, known as acoustically coupled modems. Particularly common models from the 1970s were the Novation CAT (shown in the image) and the Anderson-Jacobson, spun off from an in-house project at Lawrence Livermore National Laboratory (LLNL).
In 1968, the FCC broke AT&T's monopoly on the lines in the landmark Carterfone decision. The lines were now open to anyone, as long as the attached devices passed a stringent set of AT&T-designed tests. AT&T made these tests complex and expensive, so acoustically coupled modems remained common into the early 1980s.
In December 1972, Vadic introduced the VA3400. This device was remarkable because it provided full duplex operation at 1200 bit/s over the dial network, using methods similar to those of the 103A in that it used different frequency bands for transmit and receive. In November 1976, AT&T introduced the 212A modem to compete with Vadic. It was similar in design to Vadic's model, but used the lower frequency set for transmission from the originating modem. It was also possible to use the 212A with a 103A modem at 300 bit/s. According to Vadic, the change in frequency assignments made the 212 intentionally incompatible with acoustic coupling, thereby locking out many potential modem manufacturers.
In 1977, Vadic responded with the VA3467 triple modem, an answer-only modem sold to computer center operators that supported Vadic's 1200-bit/s mode, AT&T's 212A mode, and 103A operation.
The Smartmodem
The next major advance in modems was the Smartmodem, introduced in 1981 by Hayes Communications. The Smartmodem was an otherwise standard 103A 300-bit/s modem, but was attached to a small controller that let the computer send commands to it to operate the phone line. The command set included instructions for picking up and hanging up the phone, dialing numbers, and answering calls. The basic Hayes command set remains the basis for computer control of most modern modems.
Before the Smartmodem, modems almost universally required a two-step process to activate a connection: first, manually dial the remote number on a standard phone handset, and then plug the handset into an acoustic coupler. Because the modem could not dial the phone, the acoustic coupler remained common, since users still needed a handset to dial. Hardware add-ons, known simply as dialers, were used in special circumstances, and generally operated by emulating someone dialing a handset.
With the Smartmodem, the computer could dial the phone directly by sending the modem a command. This eliminated the need for an associated phone to dial and the need for an acoustic coupler. The Smartmodem instead plugged directly into the phone line. This greatly simplified setup. Terminal programs that maintained lists of phone numbers and sent the dialing commands became common.
The Smartmodem and its clones also aided the spread of bulletin-board systems (BBSs). Modems had previously been typically either the call-only, acoustically coupled models used on the client side, or the much more expensive, answer-only models used on the server side. The Smartmodem could operate in either mode depending on the commands sent from the computer. There was now a low-cost server-side modem on the market, and the BBSs flourished.
Increasing speeds
Modems generally remained at 300 and 1200 bit/s into the mid 1980s, although, over this period, the acoustic coupler disappeared, seemingly overnight, as Smartmodem-compatible modems flooded the market.
[Image: An external 2400 bit/s modem for a laptop]

A 2400-bit/s system similar in concept to the 1200-bit/s Bell 212 signalling was introduced in the U.S., and a slightly different, and incompatible, one in Europe. By the late 1980s, most modems could support all of these standards, and 2400-bit/s operation was becoming common.
Many other standards were also introduced for special purposes, commonly using a high-speed channel for receiving, and a lower-speed channel for sending. One typical example was used in the French Minitel system, in which the user's terminals spent the majority of their time receiving information. The modem in the Minitel terminal thus operated at 1200 bit/s for reception, and 75 bit/s for sending commands back to the servers.
Such solutions were useful in many circumstances in which one side would be sending more data than the other. In addition to a number of "medium-speed" standards, like Minitel, four U.S. companies became famous for high-speed versions of the same concept.
Telebit introduced its Trailblazer modem in 1984, which used a large number of low-speed channels to send data one-way at rates up to 19,200 bit/s. A single additional channel in the reverse direction allowed the two modems to communicate how much data was waiting at either end of the link, and the modems could switch which side had the high-speed channels on the fly. The Trailblazer modems also supported a feature that allowed them to "spoof" the UUCP "g" protocol, commonly used on Unix systems to send e-mail, and thereby speed UUCP up by a tremendous amount. Trailblazers thus became extremely common on Unix systems, and maintained their dominance in this market well into the 1990s.
U.S. Robotics (USR) introduced a similar system, known as HST, although this supplied only 9600 bit/s (in early versions at least) and provided for a larger backchannel. Rather than offer spoofing, USR instead created a large market among Fidonet users by offering its modems to BBS sysops at a much lower price, resulting in sales to end users who wanted faster file transfers.
Hayes was forced to compete, and introduced its own 9600-bit/s standard, Express 96 (also known as "Ping-Pong"), which was generally similar to Telebit's PEP. Hayes, however, offered neither protocol spoofing nor sysop discounts, and its high-speed modems remained rare.
Operation at these speeds pushed the limits of the phone lines and would otherwise have been very error-prone. This led to the introduction of error-correction systems built into the modems, made most famous by Microcom's MNP systems. A string of MNP standards came out in the 1980s, each reducing the protocol overhead, from about 25% in MNP 1 to 5% in MNP 4. MNP 5 took this a step further, adding data compression to the system and thereby actually increasing the effective data rate: generally, the user could expect an MNP modem to transfer at about 1.3 times the normal data rate of the modem. MNP was later "opened" and became popular on a series of 2400-bit/s modems, although it was never universal.
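The effect of these overhead and compression figures on a 2400 bit/s modem can be sketched in a few lines of Python (an illustration using the percentages quoted above; the helper function is hypothetical):

```python
# The effective-throughput figures quoted above, for a 2400 bit/s modem:
# MNP 1's overhead cost about 25%, MNP 4's about 5%, and MNP 5's
# compression typically raised throughput to about 1.3x the line rate.
LINE_RATE = 2400  # bit/s

def effective_rate(line_rate, factor):
    """factor < 1 models protocol overhead; factor > 1 models compression gain."""
    return line_rate * factor

mnp1 = effective_rate(LINE_RATE, 1 - 0.25)  # 1800 bit/s
mnp4 = effective_rate(LINE_RATE, 1 - 0.05)  # about 2280 bit/s
mnp5 = effective_rate(LINE_RATE, 1.3)       # about 3120 bit/s on compressible data
```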
Another common feature of these high-speed modems was the concept of fallback, allowing them to talk to less-capable modems. During the call initiation the modem would play a series of signals into the line and wait for the remote modem to "answer" them. They would start at high speeds and progressively get slower and slower until they heard an answer. Thus, two USR modems would be able to connect at 9600 bit/s, but, when a user with a 2400-bit/s modem called in, the USR would "fall back" to the common 2400-bit/s speed. Without such a system, the operator would be forced to have multiple phone lines for high- and low-speed use.
v.32
Echo cancellation was the next major advance in modem design. Normally the phone system sends a small amount of the outgoing signal, called sidetone, back to the earphone, in order to give users some feedback that their voice is indeed being sent. However, this same signal can confuse the modem: is the signal it is "hearing" from the remote modem, or its own signal being sent back to itself? This was the reason for splitting the signal frequencies into answer and originate: if a modem received a signal on its own frequency set, it simply ignored it. Even with improvements to the phone system allowing for higher speeds, this splitting of the available phone signal bandwidth still imposed a half-speed limit on modems.
Echo cancellation was a way around this problem. By using the sidetone's well-known timing, a slight delay, it was possible for the modem to tell if the received signal was from itself or the remote modem. As soon as this happened the modems were able to send at "full speed" in both directions at the same time, leading to the development of the 9600 bit/s v.32 standard.
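The idea can be illustrated with a minimal sketch, assuming a fixed, known echo delay and gain (real echo cancellers estimate both adaptively):

```python
# A minimal echo-cancellation sketch: the modem knows what it sent and
# (here, as an assumption) the fixed sidetone delay and gain, so it can
# subtract its own echo from the line and keep only the remote signal.
# Real cancellers estimate the delay and gain adaptively.
ECHO_DELAY = 3   # samples of sidetone delay (assumed known)
ECHO_GAIN = 0.5  # fraction of the sent signal leaking back (assumed known)

def cancel_echo(sent, received):
    cleaned = []
    for i, sample in enumerate(received):
        echo = ECHO_GAIN * sent[i - ECHO_DELAY] if i >= ECHO_DELAY else 0.0
        cleaned.append(sample - echo)
    return cleaned

sent = [1.0, -1.0, 1.0, 1.0, -1.0, -1.0]   # what our modem transmitted
remote = [0.2, 0.4, -0.2, 0.1, 0.3, -0.4]  # what the far modem transmitted
# The line mixes the remote signal with a delayed, attenuated copy of ours:
line = [r + (ECHO_GAIN * sent[i - ECHO_DELAY] if i >= ECHO_DELAY else 0.0)
        for i, r in enumerate(remote)]
recovered = cancel_echo(sent, line)  # matches `remote` to within float error
```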
Starting in the late 1980s a number of companies started introducing v.32 modems, most of them also using the now-opened MNP standards for error correction and compression. These earlier systems were not very popular due to their price, but by the early 1990s the prices started falling.
The "tipping point" occurred with the introduction of the SupraFax 14400 in 1991. Rockwell had introduced a new chip-set supporting not only v.32 and MNP, but the newer 14,400 bit/s v.32bis and the higher-compression v.42bis as well, and even included 9600 bit/s fax capability. Supra, then known primarily for their hard drive systems for the Atari ST, used this chip set to build a low-priced 14,400 bit/s modem which cost the same as a 2400 bit/s modem from a year or two earlier (about 300 USD). The product was a runaway best-seller, and it was months before the company could keep up with demand.
The SupraFax was so successful that a huge number of companies joined the fray, and by the next year 14.4 modems from a wide variety of companies were available. The Rockwell chip set, while not terribly reliable, became extremely common, but Texas Instruments and AT&T Paragon quickly responded with similar chipsets of their own.
v.32bis was so successful that the older high-speed standards had little to recommend them. USR fought back with a 16,800 bit/s version of HST, but this small increase in performance did little to keep HST interesting. AT&T introduced a one-off 19,200 bit/s "standard" they referred to as v.32ter (also known as v.32 terbo), but this also did little to increase demand, and typically this mode came into use only when two users with AT&T-based modems just happened to call each other. Motorola also introduced another, incompatible, 19.2 standard, but charged very high prices for their modems, which they had previously sold into commercial settings only.
v.34
[Image: An ISA modem manufactured to conform to the v.34 protocol]

Any interest in these systems was destroyed during the lengthy introduction of the 28,800 bit/s v.34 standard. While waiting, several companies decided to "jump the gun" and introduced modems they referred to as "V.FAST". In order to guarantee compatibility with v.34 modems once the standard was ratified (which happened in 1994), the manufacturers were forced to use more "flexible" parts, generally a DSP and microcontroller, as opposed to purpose-designed "modem chips".
A good example of this was USR, which changed their modems to use a DSP from Texas Instruments, and introduced a top-of-the-line Courier product, the V.everything. As the name implied, the new model supported practically every standard on the market, including all of the HST modes, v.32bis, V.FAST and, later, v.34. Rockwell also introduced a V.FAST chipset in late 1993, which they referred to as V.FC (for "Fast Class").
Rapid commoditization in 1994 forced almost all vendors out of the market; Motorola gave up and disappeared without a trace, and AT&T threw in the towel soon after. Their attempts to introduce their own standards had failed in both a technical and a business sense.
v.90
With the rapid introduction of all-digital phone systems in the 1990s, it became possible to use much greater bandwidth, on the assumption that users would generally be based on digital lines – if not immediately, then in the near future. Digital lines are based on a standard using 8 bits of data for every voice sample, sampled 8,000 times a second, for a total data rate of 64 kbit/s. However, many systems use in-band signaling for command data, inserting one bit of command data per byte of signal, thereby reducing the real throughput to 56k. In 1996, modems began to appear on the market that took advantage of the widespread use of digital phone systems at ISPs in order to provide download speeds up to 56 kbit/s. Originally, there were two available protocols for achieving such speeds: K56flex, designed and promoted by Rockwell, and X2, designed and promoted by U.S. Robotics. The already widespread use of the Rockwell chip set made K56flex more popular. A standardization effort started around 1996 with the intent to create a single standard for 56k modems that would replace K56flex and X2. Originally known as V.pcm (PCM referring to the pulse-code modulation used in digital telephony), it became the v.90 protocol when finalized in 1998.
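The arithmetic behind the 56k figure is straightforward, as a quick sketch shows:

```python
# The digital-line arithmetic from the text: 8-bit samples at 8,000
# samples per second give 64 kbit/s; in-band signalling takes one bit
# per sample, leaving 7 usable bits and hence 56 kbit/s.
SAMPLES_PER_SECOND = 8000
BITS_PER_SAMPLE = 8
SIGNALLING_BITS = 1  # one bit per sample "robbed" for command data

raw_rate = SAMPLES_PER_SECOND * BITS_PER_SAMPLE                         # 64000 bit/s
usable_rate = SAMPLES_PER_SECOND * (BITS_PER_SAMPLE - SIGNALLING_BITS)  # 56000 bit/s
```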
There are certain special requirements and restrictions associated with v.90 modems. In order for users to obtain up to 56k download speeds from their ISP, the telephone line had to be completely digital between the ISP and the telephone company central office (CO) serving the user. From there the signal could be converted from digital to analog, but only at that point: if there was a second digital-to-analog conversion anywhere along the line, 56k speeds were impossible. The quality of the user's telephone line could also affect the speed of the 56k connection, with line noise causing slowdowns, sometimes to the point of being only marginally faster than a 33.6 kbit/s connection. An important restriction is that while v.90 modems can obtain up to 56 kbit/s download speeds, they are limited to 33.6 kbit/s upload speeds.
Prior to the adoption of the v.90 protocol, users were slow to adopt K56flex and X2 based 56K modems, many simply waiting for v.90 to arrive. Some modem manufacturers promised and later offered firmware or driver upgrades for their modems so that users could add v.90 functionality when it was released. V.90 modems can be backwards compatible with K56flex or X2. Thus users of non-upgradeable K56flex or X2 modems can often find ISP dial-up numbers that will support at least one of the older 56K protocols along with v.90.
Following the adoption of v.90, there was an attempt to adopt a protocol that would define a standard for all-digital communications (i.e., where both the ISP and the user have digital connections to the telephone network). It was to be known as v.91, but the process appears to be dead, as the rapid introduction of short-haul high-speed solutions such as ADSL and cable modems offers much higher speeds from the user's local machine onto the Internet. With the exception of rural areas, the need for point-to-point calls has generally disappeared as a result, as the bandwidth and responsiveness of the Internet have improved so much. It appears that v.90 will be the last analog modem standard to see widespread use.
V.92
V.92 is the standard that followed v.90. While it provides no increase in download speed (56 kbit/s appears to be the maximum for analog modems), it does raise the maximum upload speed from 33.6 kbit/s to 48 kbit/s, provided both the ISP and the caller have fully v.92-compatible modems. It also adds two features. The first is the ability for users who have call waiting to put their dial-up Internet connection on hold for extended periods while they answer a call. The second is the ability to "quick connect" to one's ISP, achieved by remembering key information about the telephone line being used and then using this saved information to speed up future calls from that line to the ISP.
ISPs have been slow to adopt V.92 due to the high cost of upgrading their equipment and the lack of demand from their customers. With the rise in broadband take-up reducing the number of dial-up users, some ISPs have decided never to bother upgrading to v.92.
Long haul modems
In the 1960s, Bell began to digitize the telephone system, and developed early high-speed radio modems for this purpose. Once digital long-haul networks were in place, they were leased for every other purpose.
Optic fiber manufacturing was mastered in the 1980s, and optic modems were first invented for these early systems. The first systems simply used light-emitting diodes and PIN diodes. Faster modulation was quickly adopted for long-haul networks. In the 1990s, multispectral optical modems were adopted as well.
Narrowband
[Image: 28.8 kbit/s serial-port modem from Motorola]

A standard modem of today is what would have been called a "smart modem" in the 1980s. It contains two functional parts: an analog section for generating the signals and operating the phone, and a digital section for setup and control. This functionality is usually incorporated into a single chip, but the division remains conceptually.
In operation the modem can be in one of two "modes", data mode in which data is sent to and from the computer over the phone lines, and command mode in which the modem listens to the data from the computer for commands, and carries them out. A typical session consists of powering up the modem (often inside the computer itself) which automatically assumes command mode, then sending it the command for dialing a number. After the connection is established to the remote modem, the modem automatically goes into data mode, and the user can send and receive data. When the user is finished, the escape sequence, "+++" followed by a pause of about a second, is sent to the modem to return it to command mode, and the command to hang up the phone is sent. One problem with this method of operation is that it is not really possible for the modem to know if a string is a command or data. When the modem misinterprets a string, it generally causes odd things to happen.
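The command-mode/data-mode split can be sketched as a toy state machine (a simplified illustration; the command names follow the Hayes convention, but real modems also enforce a guard time of silence around the "+++" escape):

```python
class ModemState:
    """Toy model of the command/data mode split described above."""

    def __init__(self):
        self.mode = "command"
        self.connected = False

    def send(self, data):
        if self.mode == "command":
            return self._run_command(data)
        if data == "+++":          # escape sequence: back to command mode
            self.mode = "command"
            return "OK"
        return None                # payload passed through to the remote modem

    def _run_command(self, cmd):
        if cmd.startswith("ATD"):  # dial; ATDT = tone dial, ATDP = pulse dial
            self.connected = True
            self.mode = "data"
            return "CONNECT"
        if cmd == "ATH":           # hang up
            self.connected = False
            return "OK"
        return "OK" if cmd.startswith("AT") else "ERROR"

m = ModemState()
m.send("ATDT5551234")  # dial: modem answers "CONNECT" and enters data mode
m.send("hello")        # passed through as data
m.send("+++")          # escape back to command mode
m.send("ATH")          # hang up the phone
```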
The commands themselves are typically from the Hayes command set, although that term is somewhat misleading. The original Hayes commands were useful for 300 bit/s operation only, and were then extended for Hayes's 1200 bit/s modems. Hayes, however, was much slower to upgrade to faster speeds, leading to a proliferation of command sets in the early 1990s as each of the high-speed vendors introduced its own command styles. Things became considerably more standardized in the second half of the 1990s, when most modems were built from one of a very small number of "chip sets", invariably supporting a rapidly converging command set. This is still called the Hayes command set today, although it has three or four times the number of commands of the actual standard.
The 300 bit/s modems used frequency-shift keying to send data. In this system the stream of 1s and 0s in computer data is translated into tones that can be easily sent over the phone lines. In the Bell 103 system the originating modem sends 0s by playing a 1070 Hz tone and 1s at 1270 Hz, with the answering modem putting its 0s on 2025 Hz and 1s on 2225 Hz. These frequencies were chosen carefully: they lie in the range that suffers minimum distortion on the phone system, and they are not harmonics of each other. For the 103F leased-line version, internal strapping selected originate or answer operation. For dial models, the selection was determined by which modem originated the call. Modulation was so slow and simple that some people were able to learn how to whistle short bits of data into the phone with some accuracy.
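A minimal FSK modulator for the originate-side Bell 103 frequencies might look like the following sketch (the sample rate and phase-continuity handling are illustrative choices, not part of the standard):

```python
import math

# Bell 103 originate-side frequencies from the text: 0 -> 1070 Hz, 1 -> 1270 Hz.
SPACE_HZ = 1070     # binary 0
MARK_HZ = 1270      # binary 1
SAMPLE_RATE = 8000  # samples per second (an assumed, convenient rate)
BAUD = 300          # one bit per symbol at 300 baud

def fsk_modulate(bits):
    """Return audio samples for a bit string, one tone burst per bit.

    Phase is kept continuous across bit boundaries so the waveform
    has no clicks at the transitions.
    """
    samples = []
    phase = 0.0
    samples_per_bit = SAMPLE_RATE // BAUD  # 26 samples per bit
    for bit in bits:
        freq = MARK_HZ if bit == "1" else SPACE_HZ
        step = 2 * math.pi * freq / SAMPLE_RATE
        for _ in range(samples_per_bit):
            samples.append(math.sin(phase))
            phase += step
    return samples

wave = fsk_modulate("1010")  # 4 bits -> 4 * 26 = 104 samples
```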
In the 1200 bit/s and faster systems, phase-shift keying was used. In this system the two tones for any one side of the connection are sent at similar frequencies as in the 300 bit/s systems, but slightly out of phase. By comparing the phase of the two signals, 1s and 0s could be recovered: for instance, if the signals were 90 degrees out of phase, this represented the two digits "1, 0"; at 180 degrees it was "1, 1". In this way each cycle of the signal represents two digits instead of one, so 1200 bit/s modems were, in effect, 600 baud modems with "tricky" signalling.
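The two-bits-per-baud idea can be sketched with a simple dibit-to-phase table (the mapping below is hypothetical; the real standards use differential encoding relative to the previous symbol):

```python
# Hypothetical dibit-to-phase mapping illustrating two bits per baud,
# in the spirit of the 201-series modems described above. Real PSK
# standards encode the phase *difference* between symbols.
DIBIT_PHASE = {
    (0, 0): 0,
    (0, 1): 90,
    (1, 0): 270,
    (1, 1): 180,
}
PHASE_DIBIT = {v: k for k, v in DIBIT_PHASE.items()}

def psk_encode(bits):
    """Group a bit list into dibits and return one phase (degrees) per baud."""
    assert len(bits) % 2 == 0
    return [DIBIT_PHASE[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

def psk_decode(phases):
    """Recover two bits from every received phase."""
    out = []
    for p in phases:
        out.extend(PHASE_DIBIT[p])
    return out

bits = [1, 0, 1, 1, 0, 0]
phases = psk_encode(bits)  # 3 symbols carry 6 bits: the "600 baud, 1200 bit/s" trick
assert psk_decode(phases) == bits
```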
It was at this point that the difference between baud and bits per second became real. Baud refers to the signalling rate of a system: a 300 bit/s modem sent one bit per symbol, so the data rate and signalling rate were the same. In the 1200 bit/s systems this was no longer true, since the modems were actually running at 600 baud. This led to a series of flame wars on the BBSes of the 1980s.
Increases in speed have since used increasingly sophisticated communications theory. The Milgo 4500 introduced eight-phase shift keying, which could transmit three bits per signalling instance (baud). The next major advance was introduced by the Codex Corporation in the late 1960s. Here the bits were encoded into a combination of amplitude and phase, best visualized as a two-dimensional constellation diagram: the bits are mapped onto points on a graph with the x (real) and y (quadrature) coordinates transmitted over a single carrier. This technique became very effective and was incorporated into an international standard, V.29, by the CCITT (now ITU) arm of the United Nations. The standard transmitted 4 bits per symbol at 2,400 baud, giving an effective bit rate of 9,600 bits per second. For many years, most considered this rate to be the limit of data communications over telephone networks.
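The relationship between baud, bits per symbol, and constellation size in V.29 works out as follows (a quick sketch):

```python
import math

# The V.29 arithmetic from the text: 4 bits per symbol at 2,400 baud
# gives 9,600 bit/s, and carrying 4 bits per symbol requires a
# constellation of 2**4 = 16 points.
BAUD = 2400
BITS_PER_SYMBOL = 4

bit_rate = BAUD * BITS_PER_SYMBOL     # 9600 bit/s
points_needed = 2 ** BITS_PER_SYMBOL  # 16 constellation points
assert math.log2(points_needed) == BITS_PER_SYMBOL
```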
In 1980, Gottfried Ungerboeck of IBM applied powerful channel-coding techniques to search for new ways to increase the speed of modems. His results were astonishing but conveyed only to a few colleagues. Finally, in 1982, he agreed to publish what is now a landmark paper in the theory of information coding. By applying powerful parity-check coding to the bits in each symbol, and mapping the encoded bits into a two-dimensional constellation, Ungerboeck showed that it was possible to increase the speed by a factor of two with the same error rate. The new technique was called mapping by set partitions (now known as trellis modulation). This new view was an extension of the "penny-packing" problem and the related, more general problem of how to pack points into an N-dimensional space so that they are far from their neighbors (so that noise cannot confuse the receiver).
The industry was galvanized into new research and development. More powerful coding techniques were developed, commercial firms rolled out new product lines, and the standards organizations rapidly adapted to the new technology. Today the ITU standard V.34 represents the culmination of these joint efforts. It employs the most powerful coding techniques, including channel encoding and shape encoding. Where V.29 used a mere 16 points per symbol, V.34 uses over 1,000 points and very sophisticated algorithms to achieve 33.6 kbit/s.
In the late 1990s, Rockwell and U.S. Robotics introduced new technology based upon the digital transmission used in modern telephony networks. The standard digital transmission in modern networks is 64 kbit/s, but some networks use a part of the bandwidth for remote-office signalling (e.g., to hang up the phone), limiting the effective rate to 56 kbit/s on a DS0. This new technology was adopted into the ITU standard V.90 and is common in modern computers. The 56 kbit/s rate is only possible from the central office to the user site (downlink); the uplink (from the user to the central office) still uses V.34 technology. Later, in V.92, the upload speed increased to a maximum of 48 kbit/s.
This rate is believed to be near the theoretical Shannon limit of the telephone channel. Higher speeds are possible, but may owe more to improvements in the underlying phone system than to anything in the technology of the modems themselves.
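The Shannon-Hartley theorem gives the limit as C = B·log₂(1 + S/N). With illustrative (assumed) figures for a telephone channel, the limit lands near the modem speeds described above:

```python
import math

def shannon_capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley limit: C = B * log2(1 + S/N)."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative (assumed) figures for an analog phone channel: roughly
# 3,100 Hz of bandwidth and an SNR near 38 dB put the limit in the
# high 30s of kbit/s, in the neighborhood of V.34's 33.6 kbit/s.
c = shannon_capacity(3100, 38)
```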
Software is as important to the operation of the modem today as the hardware. Even with the improvements in the performance of the phone system, modems still lose a considerable amount of data due to noise on the line. The MNP standards were originally created to automatically fix these errors, and later expanded to compress the data at the same time. Today's v.42 and v.42bis fill these roles in the vast majority of modems, and although later MNP standards were released, they are not common.
With such systems it is possible for the modem to transmit data faster than its basic rate would imply. For instance, a 2400 bit/s modem with v.42bis can transmit up to 9600 bit/s, at least in theory. One problem is that the achievable compression varies over time: at some points the modem will be sending the data at 4000 bit/s, and at others at 9000 bit/s. In such situations it becomes necessary to use hardware flow control, extra pins on the modem–computer connection that allow the two sides to signal when to pause the data flow. The computer is then set to supply the modem at some higher rate, in this example at 9600 bit/s, and the modem will tell the computer to stop sending if it cannot keep up. A small amount of memory in the modem, a buffer, is used to hold the data while it is being sent.
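The buffering-and-flow-control behavior can be sketched as a small simulation (buffer size, chunk sizes, and drain rates are all illustrative):

```python
# A sketch of hardware flow control: the computer feeds the modem at a
# fixed high rate, and the modem signals a pause (as RTS/CTS lines would)
# whenever its buffer would overflow because the compressed line rate has
# dropped. Buffer size, chunk sizes, and drain rates are illustrative.
BUFFER_SIZE = 1024  # bytes the modem can hold

def transfer(chunks, drain_rates):
    """Feed chunks into the buffer while the phone line drains it."""
    buffered = 0
    stops = 0
    for chunk, drain in zip(chunks, drain_rates):
        if buffered + chunk > BUFFER_SIZE:
            stops += 1  # modem tells the computer to stop sending
        else:
            buffered += chunk
        buffered = max(0, buffered - drain)
    return stops

# The computer sends 600-byte chunks; the line drains quickly at first
# (compressible data), then slowly (incompressible), forcing one pause.
stops = transfer([600, 600, 600], [600, 100, 100])
```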
Almost all modern modems also do double duty as fax machines. Digital faxes, introduced in the 1980s, are simply a particular image format sent over a high-speed (9600/1200 bit/s) modem. Software running on the host computer can convert any image into fax format, which can then be sent using the modem. Such software was at one time an add-on, but it has since become largely universal.
Winmodem
[Image: A PCI Winmodem/Softmodem (left) next to a traditional ISA modem (right). Note the less complex circuitry of the modem on the left.]

A Winmodem or Softmodem is a stripped-down modem for Windows that replaces tasks traditionally handled in hardware with software. In this case the modem is a simple digital signal processor designed to create sounds, or voltage variations, on the telephone line. Modern computers often include a very simple card slot, the Communications and Networking Riser (CNR) slot, to lower the cost of connecting such a modem. The CNR slot includes pins for sound, power, and basic signaling, instead of the more expensive PCI slot normally used. Winmodems are often cheaper than traditional modems, since they have fewer hardware components. One downside is that the software generating the modem tones is not that simple, and the performance of the computer as a whole often suffers while it is being used. For online gaming this can be a real concern. Another problem is lack of flexibility, due to the strong tie to the underlying operating system. A given Winmodem might not be supported by other operating systems (such as Linux), because its manufacturer may neither support the other operating system nor provide enough technical data to create an equivalent driver. A Winmodem might not even work (or work well) with a later version of Microsoft Windows, if its driver turns out to be incompatible with that later version of the operating system.
Apple's GeoPort modems from the second half of the 1990s were similar, and are generally regarded as having been a bad move. Although a clever idea in theory, enabling the creation of more-powerful telephony applications, in practice the only programs created were simple answering-machine and fax software, hardly more advanced than their physical-world counterparts, and certainly more error-prone and cumbersome. The software was finicky and ate up significant processor time, and no longer functions in current operating system versions.
Today's audio modems (the ITU-T V.92 standard) closely approach the Shannon capacity of the PSTN telephone channel. They are plug-and-play fax/data/voice modems that can broadcast voice messages and record touch-tone responses.
Radio modems
Direct broadcast satellite, WiFi, and mobile phones all use modems to communicate, as do most other wireless services today. Modern telecommunications and data networks also make extensive use of radio modems where long distance data links are required. Such systems are an important part of the PSTN, and are also in common use for high-speed computer network links to outlying areas where fibre is not economical.
Even where a cable is installed, it is often possible to get better performance or make other parts of the system simpler by using radio frequencies and modulation techniques through the cable. Coaxial cable has a very large bandwidth; however, signal attenuation becomes a major problem at high data rates if a baseband digital signal is used. By using a modem, a much larger amount of digital data can be transmitted through a single piece of wire. Digital cable television and cable Internet services use radio-frequency modems to provide the increasing bandwidth needs of modern households. Using a modem also allows for frequency-division multiple access, making full-duplex digital communication with many users possible over a single wire.
Wireless modems come in a variety of types, bandwidths, and speeds, and are often referred to as transparent or smart. They transmit information modulated onto a carrier frequency, which allows many wireless communication links to work simultaneously on different frequencies.
Transparent modems operate in a manner similar to their phone-line modem cousins. They are typically half duplex, meaning that they cannot send and receive data at the same time. Transparent modems are usually polled in a round-robin manner to collect small amounts of data from scattered locations that do not have easy access to wired infrastructure. They are most commonly used by utility companies for data collection.
Smart modems come with a media access controller inside, which prevents random data from colliding and resends data that is not correctly received. Smart modems typically require more bandwidth than transparent modems and typically achieve higher data rates. The IEEE 802.11 standard defines a short-range modulation scheme that is used on a large scale throughout the world.
WiFi and WiMax
Wireless data modems are used in the WiFi and WiMax standards, operating at microwave frequencies.
WiFi can be used in laptops for Internet connections through wireless access points.
Mobile modems
Modems for mobile phone networks (GPRS and UMTS) generally come as PC cards with a slot for the phone's SIM card included. USB modems are now appearing as well.