
Talk:Direct-sequence spread spectrum

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Europrobe (talk | contribs) at 08:50, 1 August 2004 (explained...). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

I am wondering why raising the bitrate "spreads the energy" on a wider frequency band. Does anyone have an idea?

Shannon's law gives us the maximum possible bit rate given a frequency band and a signal-to-noise ratio: MaxBitRate = Bandwidth * log2(1 + SignalPower/NoisePower)
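For concreteness, here is the Shannon–Hartley formula evaluated directly in Python (a minimal sketch; the channel numbers are just an illustrative example, not from the discussion above):

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Maximum error-free bit rate over an AWGN channel (Shannon-Hartley)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: a 3.1 kHz telephone-style channel with a 30 dB signal-to-noise ratio
snr = 10 ** (30 / 10)            # 30 dB -> linear power ratio of 1000
c = shannon_capacity(3100, snr)
print(round(c))                  # about 30.9 kbit/s
```

Note this gives a ceiling, not the actual bit rate of any particular modulation, which is exactly the objection raised below.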

But I don't see that as an explanation, since it allows us to calculate the MAX bitrate, not the bitrate itself.

Then there's Nyquist's law: the sampling rate must be at least twice the signal's maximum frequency. I don't see this as an explanation either.

One explanation I can imagine is that abruptly switching a signal from one frequency to another generates harmonics. If you switch very often, a greater part of the energy goes into those harmonics. Does this even make sense?  ;-)

Anyway, does anyone have any good explanation for this rule?


If you assume that you always use close to maximum bitrate at a given bandwidth (which makes sense, since otherwise you'd be wasting spectrum), you see that it's only by increasing bandwidth that you can increase the bitrate (or baud rate, really). Have a look at modem, Quadrature amplitude modulation and some of their linked pages for more info.
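The spreading can also be seen directly in the spectrum: a stream of rectangular ±1 chips has a sinc²-shaped power spectrum whose main lobe extends out to the chip rate, so switching faster pushes energy into a proportionally wider band. A dependency-free Python sketch of this (the function name and parameters are my own, purely illustrative):

```python
import cmath
import random

def energy_below(cutoff_hz, chip_rate_hz, n_chips=64, fs=8000, seed=1):
    """Fraction of total energy below cutoff_hz for a random +/-1
    rectangular-chip stream at chip_rate_hz, sampled at fs Hz.
    A toy baseband model of a spread-spectrum chip sequence."""
    random.seed(seed)
    samples_per_chip = fs // chip_rate_hz
    x = []
    for _ in range(n_chips):
        chip = random.choice((-1.0, 1.0))
        x.extend([chip] * samples_per_chip)
    n = len(x)
    # Plain O(n^2) DFT over the positive-frequency half: slow but
    # dependency-free, and fine at this toy size.
    spec = [abs(sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))) ** 2
            for k in range(n // 2)]
    cutoff_bin = int(cutoff_hz * n / fs)   # bin index at f = cutoff_hz
    return sum(spec[:cutoff_bin]) / sum(spec)

# Same fixed 1 kHz measurement band, two chip rates: the faster stream
# leaves noticeably less of its energy inside that band, i.e. its
# energy has been spread over a wider spectrum.
print(energy_below(1000, 1000))   # most of the energy sits below 1 kHz
print(energy_below(1000, 4000))   # much less of it does
```

In other words, the faster you switch the signal, the wider the band its energy occupies, which matches the harmonics intuition above.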

europrobe 08:50, 2004 Aug 1 (UTC)