Matthew S. Hallacy wrote:
> 
> On Wed, Aug 07, 2002 at 12:03:31AM -0500, Bryan Halvorson wrote:
> 
> > With 30 mw transmitters and 24 db antennas a 10 mile link only has 
> > about 10 db of fade margin. Plus this isn't taking into consideration 
> > interference and the higher noise floor that we're going to be seeing 
> > as the band gets more crowded. I've heard enough people say that 10
> > mile links are doable that I think it should work. I think we'll just
> > have to get something up and running and see what the error rate ends up
> > being. If it ends up being a problem we can start looking for more
> > antenna gain or add amplifiers.
> 
> I wouldn't be comfortable using 30mw units for this, most likely we'd be
> using 100mw or 200mw radios.

Going from 30 mW to 100 mW isn't really that big of a change; it's only
about a 5 dB increase in power, and going from 30 to 200 mW is about 8 dB.
If possible it's usually better to increase the antenna gain or decrease
the feedline loss. Both of these improve both the transmit and the receive
signal levels, and narrowing the antenna pattern with a higher-gain
antenna can also decrease interference.
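The dB figures come straight from the power ratios; a quick sketch (Python, function name is mine) shows the arithmetic:

```python
from math import log10

def db_gain(p_out_mw: float, p_in_mw: float) -> float:
    """Express a power ratio in decibels: 10 * log10(P_out / P_in)."""
    return 10 * log10(p_out_mw / p_in_mw)

print(round(db_gain(100, 30), 1))  # 30 mW -> 100 mW: ~5.2 dB
print(round(db_gain(200, 30), 1))  # 30 mW -> 200 mW: ~8.2 dB
```

By contrast, each extra dB of antenna gain counts twice on a two-way link (once on transmit, once on receive), which is why it's usually the better place to spend effort.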

I'm not trying to say that increasing the transmitter power is the wrong
answer, just that you need to look at the whole transmitter and antenna
system to see where changes will be the most effective. With the small
signal levels we're forced to work with here, every dB of signal is
important.

-- 
Bryan Halvorson
bryan at edgar.sector14.net