Rain fade
Rain fade refers primarily to the absorption of a microwave radio frequency (RF) signal by atmospheric rain, snow, or ice; these losses are especially prevalent at frequencies above 11 GHz. It also refers to the degradation of a signal caused by electromagnetic interference from the leading edge of a storm front. Rain fade can be caused by precipitation at the uplink or downlink location. However, it does not need to be raining at a location for it to be affected by rain fade, as the signal may pass through precipitation many miles away, especially if the satellite dish has a low look angle. From 5% to 20% of rain fade or satellite signal attenuation may also be caused by rain, snow, or ice on the uplink or downlink antenna reflector, radome, or feed horn. Rain fade is not limited to satellite uplinks or downlinks; it can also affect terrestrial point-to-point microwave links (those on the Earth's surface).
Rain attenuation on a satellite link can be predicted using rain attenuation prediction models, which guide the selection of a suitable Fade Mitigation Technique (FMT). These models require rainfall-rate data, which can be obtained either from predicted rainfall maps, which may yield inaccurate attenuation estimates, or from actual measured rainfall data, which gives a more accurate prediction and hence a more appropriate choice of FMT. The altitude of the earth station above sea level is also an essential factor affecting rain attenuation. Satellite system designers and channel providers should account for rain impairments when setting up a link.
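A common form for such prediction models is the ITU-R power-law relation, in which the specific attenuation is γ = k·R^α (dB/km) for a rainfall rate R (mm/h), and the total fade is γ multiplied by an effective path length through the rain. The sketch below illustrates this shape only; the coefficient values and the 5 km effective path are illustrative assumptions (a real design would take k and α from the ITU-R P.838 tables for the actual frequency and polarization):

```python
def specific_attenuation(rain_rate_mm_h: float, k: float, alpha: float) -> float:
    """Specific rain attenuation gamma in dB/km: gamma = k * R**alpha."""
    return k * rain_rate_mm_h ** alpha

def path_attenuation(rain_rate_mm_h: float, k: float, alpha: float,
                     effective_path_km: float) -> float:
    """Total rain fade in dB over an effective slant-path length through rain."""
    return specific_attenuation(rain_rate_mm_h, k, alpha) * effective_path_km

# Illustrative coefficients (assumed, roughly Ku-band); not authoritative values.
K, ALPHA = 0.024, 1.18

# A heavy-rain event (42 mm/h) over an assumed 5 km effective path.
fade_db = path_attenuation(42.0, K, ALPHA, 5.0)
print(f"predicted fade: {fade_db:.1f} dB")
```

The power law makes the nonlinearity visible: doubling the rain rate more than doubles the attenuation, which is why heavy convective rain dominates the link budget even though it is rare.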
Possible ways to overcome the effects of rain fade are site diversity, uplink power control, variable-rate encoding, and receiving antennas larger than the size required for normal weather conditions.
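Of these techniques, uplink power control is the simplest to sketch: the transmitter raises its output to compensate for the measured fade, but only up to the amplifier's available headroom. The function below is a hypothetical illustration; the parameter names and the dBm values in the usage comments are assumptions, not a standard's API:

```python
def uplink_power_dbm(nominal_dbm: float, measured_fade_db: float,
                     max_boost_db: float) -> float:
    """Hypothetical uplink power control: raise transmit power by the
    measured rain fade, capped at the amplifier's remaining headroom."""
    boost = min(max(measured_fade_db, 0.0), max_boost_db)
    return nominal_dbm + boost

# Clear sky: no boost needed.
print(uplink_power_dbm(30.0, 0.0, 6.0))   # stays at nominal power
# Moderate fade within headroom: fully compensated.
print(uplink_power_dbm(30.0, 4.0, 6.0))
# Deep fade beyond headroom: boost saturates, residual fade remains.
print(uplink_power_dbm(30.0, 9.0, 6.0))
```

The saturation case is why uplink power control is usually combined with other FMTs such as variable-rate encoding: once the amplifier's headroom is exhausted, the remaining fade must be absorbed by a more robust modulation and coding scheme or by site diversity.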