RE: modems

Hi, guys. Here is my contribution to this surprising debate about modems. :)

From the point of view of an application running above the Application layer (layer 7): on the receiving side, the Data Link layer (layer 2) receives BITS from the Physical layer (layer 1) and organizes them into FRAMES, then hands the frame's contents (the payload) up to the Network layer (layer 3), and so on. At each layer, the payload is extracted and passed to the upper layer, where it becomes that upper layer's Protocol Data Unit.
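
If it helps, here is a tiny Python sketch of that de-encapsulation on the receiving side (the header names and lengths are invented purely for illustration):

    # A toy view of de-encapsulation: each layer strips its own header and
    # hands the remaining payload up to the layer above.
    frame = b"ETH_HDR.......IP_HDR......TCP_HDR.....hello, world"

    def strip_header(pdu: bytes, header_len: int) -> bytes:
        """Remove this layer's header; what remains is the upper layer's PDU."""
        return pdu[header_len:]

    packet  = strip_header(frame, 14)    # layer 2 removes the frame header
    segment = strip_header(packet, 12)   # layer 3 removes the network header
    data    = strip_header(segment, 12)  # layer 4 removes the transport header
    print(data)                          # b'hello, world' reaches the application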

From the point of view of a transmission medium (copper cable, fiber optics, radio waves, etc.), BITS come down from the Data Link layer (layer 2) into the Physical layer (layer 1), where they are converted into a signal suited to the given medium. In the case of copper cabling, it will be an electrical signal. In the case of fiber optics, it will be light pulses, and so on.
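
Again as a toy Python sketch (the levels below are illustrative, not real line codes): the same bit stream becomes a different physical signal depending on the medium:

    bits = [1, 0, 1, 1, 0, 0, 1]

    # Copper (electrical): a simple NRZ-style mapping of bits to voltage levels.
    electrical = [+1.0 if b else -1.0 for b in bits]          # volts, illustrative

    # Fiber (optical): light pulse on for a 1, off for a 0.
    optical = ["light-on" if b else "light-off" for b in bits]

    print(electrical)
    print(optical)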

Now, let's talk about modems. Modems were invented to transmit digital data over an analog line. That means the data coming from a computer is digital (i.e. a non-continuous signal, carrying a limited number of values/levels) and needs to be converted into an analog signal (i.e. a continuous signal, carrying an infinite number of values/levels), using only frequencies within the range accepted by the telephone network.
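
To make that concrete, here is a minimal Python sketch of a simple FSK-style modulator, loosely in the spirit of early 300 bit/s modems. The sample rate, baud rate and tone frequencies are my own illustrative choices, not a real standard:

    import math

    RATE   = 8000        # audio samples per second
    BAUD   = 300         # symbols (here: bits) per second
    F_ZERO = 1070.0      # tone representing a 0 bit, in Hz (inside the voiceband)
    F_ONE  = 1270.0      # tone representing a 1 bit, in Hz (inside the voiceband)

    def modulate(bits):
        """Turn a list of bits into audio samples: one short tone burst per bit."""
        samples = []
        per_bit = RATE // BAUD
        for i, b in enumerate(bits):
            f = F_ONE if b else F_ZERO
            for n in range(per_bit):
                t = (i * per_bit + n) / RATE
                samples.append(math.sin(2 * math.pi * f * t))
        return samples

    signal = modulate([0, 1, 1, 0, 1, 0, 0, 1])
    print(len(signal), "audio samples, all inside the telephone voiceband")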

The range of frequencies accepted by the telephone network (i.e. its BANDWIDTH) is theoretically between 0 Hz and 4000 Hz - in practice, between 300 Hz and about 3400 Hz. Why not higher frequencies? Because high frequencies are more fragile: they are attenuated and corrupted first during transmission, which would degrade the whole signal - and with it the whole telephone conversation. So this filtering guarantees a minimum quality for a telephone conversation.
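
As a rough illustration (assuming a simple first-order low-pass filter with a 3400 Hz cut-off, which is only an approximation of the real telephone channel), you can see how quickly higher frequencies are attenuated:

    import math

    fc = 3400.0   # assumed cut-off frequency, in Hz

    # First-order low-pass response: |H(f)| = 1 / sqrt(1 + (f/fc)^2)
    for f in (300, 1000, 3400, 10000, 20000):
        h = 1.0 / math.sqrt(1.0 + (f / fc) ** 2)
        print(f"{f:>6} Hz -> gain {h:.2f} ({20 * math.log10(h):.1f} dB)")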

Why that range of frequencies rather than another one? Because it is the range used by the human voice - which is what we actually want to transmit over a telephone network - plus some harmonics necessary to ensure a minimum quality of speech and to allow the speaker to be recognized and identified.

As the telephone network was the only omnipresent, global, ubiquitous network available at the time (the 1950s), it was the obvious choice for interconnecting computers over long distances. But computers didn't use analog signals (a fortiori since Von Neumann had strongly recommended a digital architecture for computers back in the 1940s).

So Bell Labs developed the MODEM (MOdulator-DEModulator) to convert digital signals into analog signals so that data could be transported over the telephone network, and to convert the analog signals back into digital signals at the other end (so that the destination computer can understand the data transmitted over the telephone network).
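
Here is a minimal Python sketch of both ends of such a link (same toy FSK parameters as in the modulator sketch above): the receiving side decides each bit by correlating the incoming samples against the two candidate tones. It is only a clean-signal illustration, not a real modem implementation:

    import math

    RATE, BAUD, F_ZERO, F_ONE = 8000, 300, 1070.0, 1270.0
    PER_BIT = RATE // BAUD

    def modulate(bits):
        return [math.sin(2 * math.pi * (F_ONE if bits[n // PER_BIT] else F_ZERO) * n / RATE)
                for n in range(len(bits) * PER_BIT)]

    def tone_energy(chunk, f, start):
        # Non-coherent detection: energy of the correlation with sin and cos at f.
        s = sum(x * math.sin(2 * math.pi * f * (start + n) / RATE) for n, x in enumerate(chunk))
        c = sum(x * math.cos(2 * math.pi * f * (start + n) / RATE) for n, x in enumerate(chunk))
        return s * s + c * c

    def demodulate(samples):
        bits = []
        for i in range(0, len(samples), PER_BIT):
            chunk = samples[i:i + PER_BIT]
            bits.append(1 if tone_energy(chunk, F_ONE, i) > tone_energy(chunk, F_ZERO, i) else 0)
        return bits

    sent = [0, 1, 1, 0, 1, 0, 0, 1]
    assert demodulate(modulate(sent)) == sent      # the destination recovers the bits
    print("original bits recovered on the other end")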

Remember: the analog lines were low-pass-filtered and thus limited to a maximum of about 3400 Hz - i.e. a range of frequencies the human ear can hear (compare that to the range of frequencies supported by an ordinary Hi-Fi system, usually covering 20 Hz to 20000 Hz). So, when MODULATING the incoming digital signal, the modem created an analog signal in the frequency range between 300 Hz and 3400 Hz - i.e. a signal that can be HEARD, hence the noise generated by a modem.
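
You can even listen to it yourself: this little Python sketch (standard library only) writes a short sequence of alternating voiceband tones to a WAV file; play it and you will recognise the familiar modem-like screech. The file name and tone pattern are just examples:

    import math, struct, wave

    RATE = 8000
    samples = []
    for f in [1070.0, 1270.0] * 10:                  # alternate two voiceband tones
        for n in range(RATE // 10):                  # 100 ms per tone
            samples.append(int(32767 * 0.5 * math.sin(2 * math.pi * f * n / RATE)))

    with wave.open("modem_tones.wav", "wb") as w:
        w.setnchannels(1)          # mono
        w.setsampwidth(2)          # 16-bit samples
        w.setframerate(RATE)
        w.writeframes(b"".join(struct.pack("<h", s) for s in samples))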

Now, why do we hear that noise only at the beginning of the transmission? Because modems are configured to let the user hear the line only at the start of the call, as an audible confirmation that the dialling and handshake are actually working. Once the connection is established (typically after a few seconds), the modem's internal speaker is switched off so it doesn't annoy the user.
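
On Hayes-compatible modems this behaviour is controlled by the ATMn command (M0 = speaker always off, M1 = speaker on until the carrier is detected, which is the usual default, M2 = speaker always on; Ln sets the volume). As a sketch, assuming a modem on /dev/ttyS0 and the pyserial package:

    import serial                                            # pyserial

    modem = serial.Serial("/dev/ttyS0", 57600, timeout=1)    # port and speed are assumptions
    modem.write(b"ATM0\r")                                   # silence the speaker entirely
    print(modem.read(64))                                    # read back the echo and the OK response
    modem.close()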

What about the baud rate, the bit rate and "modulation"? Well, modems communicate over the telephone network by exchanging an analog, audible signal. How do you transport bits and bytes with such a noisy signal? By modulating one or several of the characteristics of this noisy analog signal, which are: frequency, amplitude and phase. Note that by combining several modulation techniques you increase the number of bits that can be represented - and thus transported - per symbol. That's why, today, at a symbol rate of 2400 baud we can transmit 33600 bits per second (because we transport 14 bits per symbol), whereas some years ago we transported 1 bit per symbol at 300 baud, achieving a bit rate of 300 bits per second.
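
In other words: bit rate = symbol rate (in baud) x bits per symbol, where the bits per symbol follow from how many distinct signal states the modulation can produce (log2 of the constellation size). A quick Python check of the figures above (the middle row is a V.22bis-style example added for comparison):

    import math

    examples = [
        ("1 bit/symbol at 300 baud",      300,      2),   # 2 states  -> 1 bit per symbol
        ("V.22bis-style, 600 baud",       600,     16),   # 16 states -> 4 bits per symbol
        ("14 bits/symbol at 2400 baud",  2400, 2 ** 14),
    ]

    for name, baud, states in examples:
        bits_per_symbol = int(math.log2(states))
        print(f"{name:30s} {baud:5d} baud x {bits_per_symbol:2d} = {baud * bits_per_symbol} bit/s")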

Why can't we hear Gigabit Ethernet?

1. In the case of copper cabling (IEEE 802.3ab), it is a pure baseband technology using purely digital signalling, at frequencies around 80 MHz (80 million hertz!) over Category 5 UTP copper cabling - far, far higher than the highest audible frequency.

2. In the case of fiber optics (IEEE 802.3z), it is pure light transmission, using optical signalling over the fiber - not audible at all, only visible. :)
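
For comparison, the 1000BASE-T arithmetic (as I understand it) is: four wire pairs, each signalling at 125 million symbols per second with PAM-5 modulation, carrying 2 data bits per symbol:

    pairs           = 4
    symbols_per_sec = 125_000_000
    bits_per_symbol = 2        # PAM-5: 2 data bits per symbol, the 5th level helps the coding

    print(pairs * symbols_per_sec * bits_per_symbol, "bit/s")   # 1,000,000,000 = 1 Gbit/s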

Last word: computers don't communicate by screeching or talking or whatever: they communicate by exchanging a specific signal over a physical medium. Depending on that medium, the signal will be electrical (copper cabling), optical (fiber optics), radio, infrared, microwave, etc.

I hope I have managed to combine the engineer's and the academic's points of view in a comprehensible, yet proper way. ;)

P.S.: if one of you detects a mistake or an error, please let me know - I'm always learning. Every single day of my life.

-----Original Message-----
From: Bill Cunningham [mailto:billcu@citynet.net]

I know modems communicate on the physical layer by electrical pulses or
binaries sent on copper wires. Is that screeching you hear electrical
communication? Computers don't communicate by screeching...or do they?

