Smoothing
In statistics and image processing, to smooth a data set is to create an approximating function that captures the important patterns in the data while leaving out noise and other fine-scale structures or rapid phenomena. In smoothing, the data points of a signal are modified so that individual points higher than their neighbors (presumably because of noise) are reduced, and points lower than their neighbors are increased, yielding a smoother signal.
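The neighbor-averaging idea described above can be sketched as a simple moving average, one of the most basic smoothers. This is a minimal illustration, not a production implementation; the window size and the truncated handling of the edges are assumptions made here for simplicity.

```python
def moving_average(signal, window=3):
    """Smooth a 1-D signal by replacing each point with the mean of the
    points in a window centered on it.

    Points near the edges use a truncated window. `window` is assumed odd.
    """
    half = window // 2
    smoothed = []
    for i in range(len(signal)):
        lo = max(0, i - half)
        hi = min(len(signal), i + half + 1)
        smoothed.append(sum(signal[lo:hi]) / (hi - lo))
    return smoothed

# A noisy signal with one spike (index 2) and one dip (index 5):
noisy = [1.0, 1.2, 5.0, 1.1, 0.9, -2.0, 1.0]
print(moving_average(noisy))
```

After smoothing, the spike is pulled down toward its neighbors and the dip is pulled up, exactly the behavior described above.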
Reducing noise by smoothing may aid in data analysis in two notable ways:
- Help uncover more meaningful information from the underlying data, such as trends.
- Provide analyses that are both flexible and robust.
Many different algorithms are used in smoothing; among the most common are binning, kernel smoothers, and locally weighted regression.
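As a sketch of the kernel approach, the following implements a Nadaraya-Watson smoother with a Gaussian kernel: each smoothed value is a weighted average of all observations, with weights that decay with distance. The function name, the choice of Gaussian kernel, and the `bandwidth` parameter are illustrative assumptions, not taken from the text.

```python
import math

def gaussian_kernel_smooth(x, y, bandwidth=1.0):
    """Nadaraya-Watson kernel smoother with a Gaussian kernel.

    For each point x[i], returns a weighted average of all y values,
    with weights exp(-(x[i]-x[j])^2 / (2*bandwidth^2)). A larger
    bandwidth averages over more neighbors, giving a smoother result.
    """
    smoothed = []
    for xi in x:
        weights = [math.exp(-((xi - xj) ** 2) / (2 * bandwidth ** 2))
                   for xj in x]
        total = sum(weights)
        smoothed.append(sum(w * yj for w, yj in zip(weights, y)) / total)
    return smoothed

# A flat signal with a single spike at x = 2:
x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [0.0, 0.0, 10.0, 0.0, 0.0]
print(gaussian_kernel_smooth(x, y, bandwidth=1.0))
```

The bandwidth plays the role that bin width plays in binning: it controls the trade-off between suppressing noise and blurring genuine features of the data.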