Concept and function of filtering (filter, mask, kernel, template, and window all mean the same thing)

Posted by Devsense on Tue, 08 Mar 2022 21:27:10 +0100

Reprinted from:
In essence, image filtering and enhancement use filtering techniques to strengthen certain spatial-frequency features of an image, improving the gray-level contrast between ground objects and their background. The "blurring" introduced by the imaging process of a remote-sensing system often leaves the lineaments, textures, and object boundaries that interest users too indistinct to identify. Neighborhood-processing methods are therefore needed to analyze, compare, and adjust the contrast relationship between each pixel and its surrounding neighbors, sharpening the image; in other words, the image needs to be processed with filtering techniques.

 1. Spatial filtering: image processing that uses a spatial convolution template; the template itself is called a spatial filter.

        (1) Linear filters: a natural extension of the concepts of linear systems and frequency-domain filtering into the spatial domain.

          They include: (1) Low-pass filter (passes low frequencies): smooths the image and removes noise

                          (2) High-pass filter: edge enhancement and edge extraction

                          (3) Band-pass filter: passes only a specific frequency band (its counterpart, the band-stop filter, removes one)

        (2) Nonlinear filters: when the template computes the output pixel value, the result depends directly on the pixel values in the neighborhood (for example, on their rank order) rather than on a weighted sum.

          They include: (1) Median filter: smooths the image and removes noise

                          (2) Maximum filter: finds the brightest point

                          (3) Minimum filter: finds the darkest point

  (3) Main purposes of smoothing filters: reducing noise, removing useless small details before extracting large objects, general smoothing, softening images that are excessively sharp, and producing a smoothed image.

        Several simple low-pass filters:

        (1) Mean filter: the value of the processed pixel equals the average of all pixels in a neighborhood of a given size.

        (2) Weighted-average filter: the output value of the processed pixel equals the weighted average of the neighboring pixels around it.

        (3) Median filter: uses the median of the pixels in the template area as the result. It eliminates isolated bright (or dark) spots and suppresses noise while preserving edge contours and image detail better than the mean filter.
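A minimal numpy sketch of the difference between the mean and median filters above: both are applied to a flat patch containing one isolated bright (salt) pixel. The function names `mean_filter` and `median_filter` are illustrative, not library routines.

```python
import numpy as np

def mean_filter(img, k=3):
    """Neighborhood average: each output pixel is the mean of its k*k window."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")  # replicate border pixels
    out = np.empty_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].mean()
    return out

def median_filter(img, k=3):
    """Median of the k*k window: removes isolated spots, preserves edges better."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.median(padded[i:i + k, j:j + k])
    return out

# A flat gray patch with one isolated bright (salt) pixel.
img = np.full((5, 5), 10.0)
img[2, 2] = 255.0

print(mean_filter(img)[2, 2])    # the mean smears the spike into the neighborhood
print(median_filter(img)[2, 2])  # the median removes the spike entirely
```

The median leaves the flat region at exactly 10, while the mean raises every pixel near the spike, which is why the median is preferred for salt-and-pepper noise.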

  (4) Sharpening filters: enhance the edges and contours of objects in the image; used to emphasize fine detail in printing, in military target recognition and localization, and so on.

        It includes:

        (1) Basic high-pass filter: enhances edges but loses the tonal levels and background brightness of the image; it can enhance small-scale features in the image.

        (2) High-boost (high-gain) filter: enhances the edges and details of the image while retaining its low-frequency components. It enhances edges while preserving tonal levels, but it also amplifies noise along with the edges.

        (3) Gradient (differential) filter: used directly, it behaves like a high-pass filter. Two thresholded variants are common: if the gradient exceeds 25, output 255, otherwise keep the original pixel value, which highlights the edges while retaining the background; or, if the gradient exceeds 25, output 255, otherwise output 0, which highlights the edges and binarizes the image at the same time.
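The two thresholded variants of the gradient filter can be sketched in numpy as follows. The gradient here is a simple first-difference magnitude, and the helper names are illustrative; the threshold of 25 comes from the text above.

```python
import numpy as np

def gradient_magnitude(img):
    """Simple first-difference gradient |df/dx| + |df/dy| (a basic differential filter)."""
    gx = np.abs(np.diff(img, axis=1, append=img[:, -1:]))
    gy = np.abs(np.diff(img, axis=0, append=img[-1:, :]))
    return gx + gy

def edges_keep_background(img, T=25):
    """Variant 1: gradient > T -> 255, otherwise keep the original pixel."""
    g = gradient_magnitude(img)
    return np.where(g > T, 255.0, img)

def edges_binarized(img, T=25):
    """Variant 2: gradient > T -> 255, otherwise 0 (edge map, binarized)."""
    g = gradient_magnitude(img)
    return np.where(g > T, 255.0, 0.0)

# A step edge: a dark half (20) next to a bright half (120).
img = np.hstack([np.full((4, 4), 20.0), np.full((4, 4), 120.0)])
print(edges_keep_background(img))  # edge column becomes 255, flat areas keep 20/120
print(edges_binarized(img))        # edge column 255, everything else 0
```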

2. Frequency-domain filtering, including:

        Low-pass filtering, divided into:

                    Ideal low-pass filter (ILPF)

                    Butterworth low-pass filter (BLPF)

                    Exponential low-pass filter (ELPF)

                    Trapezoidal low-pass filter (TLPF)

        High-pass filtering, including:

                    Ideal high-pass filter (IHPF)

                    Butterworth high-pass filter (BHPF)

                    Exponential high-pass filter (EHPF)

                    Trapezoidal high-pass filter (THPF)

        Band-pass / band-stop filtering: a band-pass filter allows signals within a certain frequency range to pass and blocks signals outside it; a band-stop filter does the opposite.
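As an illustration of frequency-domain filtering, here is a minimal numpy sketch of the ideal low-pass filter (ILPF) from the list above: H(u,v) is 1 within a circle of radius D0 around the spectrum center and 0 outside. The function name `ideal_lowpass` is illustrative.

```python
import numpy as np

def ideal_lowpass(img, D0):
    """Ideal low-pass filter: H(u,v) = 1 if the distance from the spectrum
    center is <= D0, else 0. The sharp cutoff causes ringing in practice."""
    F = np.fft.fftshift(np.fft.fft2(img))           # spectrum with DC at center
    rows, cols = img.shape
    u = np.arange(rows) - rows // 2
    v = np.arange(cols) - cols // 2
    D = np.sqrt(u[:, None] ** 2 + v[None, :] ** 2)  # distance from the center
    H = (D <= D0).astype(float)                     # ideal transfer function
    G = H * F                                       # G(u,v) = H(u,v) F(u,v)
    return np.real(np.fft.ifft2(np.fft.ifftshift(G)))

img = np.zeros((32, 32))
img[12:20, 12:20] = 1.0            # a bright square on a dark background
smooth = ideal_lowpass(img, D0=6)
print(smooth.min(), smooth.max())  # edges are blurred; values are no longer just 0/1
```

Because H is 1 at the DC component, the average brightness of the image is preserved; only the high-frequency detail is removed.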

Basic concepts of image processing -- convolution, filtering, smoothing

1. Image convolution (template)

(1). Concepts related to image processing using templates:

 Template: a matrix block whose mathematical meaning is a convolution operation.

 Convolution operation: can be viewed as a weighted-summation process. Each pixel in the image region is multiplied by the corresponding element of the convolution kernel (the weight matrix), and the sum of all the products becomes the new value of the region's center pixel.
 Convolution kernel: the weights used in the convolution, represented as a matrix; it is a weight matrix.
  Convolution example:
          Convolution of a 3×3 pixel region R with a kernel G:
           R5 (center pixel) = R1·G1 + R2·G2 + R3·G3 + R4·G4 + R5·G5 + R6·G6 + R7·G7 + R8·G8 + R9·G9
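The weighted sum above can be checked numerically. Note that as written, without flipping the kernel, the formula is strictly a correlation; for the symmetric kernels used in smoothing the two operations coincide.

```python
import numpy as np

# A 3x3 image region R and kernel G, as in the example above.
R = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]], dtype=float)
G = np.full((3, 3), 1.0 / 9.0)   # mean-filter kernel: all weights equal 1/9

# New value of the center pixel R5: the element-wise products summed over the window.
# (With a symmetric kernel this weighted sum equals true convolution; for an
# asymmetric kernel, convolution would first flip the kernel.)
new_R5 = np.sum(R * G)
print(new_R5)   # mathematically (1 + 2 + ... + 9) / 9 = 5
```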

(2). Problems when processing images with templates (boundary problems):
Boundary problem: when processing the boundary pixels of an image, the convolution kernel does not fully overlap the image; with the kernel centered on a boundary pixel, part of it falls outside the image, so the convolution cannot be computed directly.
Common treatments:
A. Ignore the boundary pixels; the processed image loses those pixels.
B. Retain the original boundary pixels, i.e. copy them unchanged into the processed image.
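Beyond the two treatments above, a common practical alternative is to pad the image so the kernel always has data to read; a small sketch with `numpy.pad`:

```python
import numpy as np

row = np.array([[1.0, 2.0, 3.0]])

# Replicate the border pixels (so a 3-wide kernel can be centered on them):
padded = np.pad(row, ((0, 0), (1, 1)), mode="edge")
print(padded)

# Zero padding is another common choice, but it darkens the border:
zeroed = np.pad(row, ((0, 0), (1, 1)), mode="constant", constant_values=0)
print(zeroed)
```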
(3). Common templates:

Let us first look at the concept of one-dimensional convolution.
In continuous space, the convolution of f(x) and g(x) at t is the integral of f(t−x)·g(x) over x from negative infinity to positive infinity. Since t−x must lie in the domain of f, the seemingly unbounded integral is in fact taken over a finite range. The actual process is: reflect f(x) about the vertical axis, translate it along the x-axis by t to obtain f(t−x), multiply by g(x), and integrate the product. If g(x) (or f(x)) is a unit step function, the convolution is simply the area of overlap between f(t−x) and g(x). That is convolution.

Replacing the integral by a summation gives the definition of convolution in discrete space. What does convolution mean for an image? The image plays the role of f, and the template plays the role of g. The template is moved across the image, and at each position the elements where the domains of f and g intersect are multiplied and summed.

Convolution is also used constantly in linear-system analysis. A linear system is one whose output depends linearly on its input: the whole system can be decomposed into N independent responses whose effects accumulate. For example, if x1 → y1 and x2 → y2, then a·x1 + b·x2 → a·y1 + b·y2 in a linear system. A linear system can be written in integral form,

    y(t) = ∫ f(t, x) g(x) dx,

where f(t, x) plays the role of the linear coefficients a and b. This looks like convolution, and indeed if f(t, x) = F(t − x) it is one. Replacing f(t, x) by F(t − x) means the system is linear shift-invariant: when the difference of the variables is unchanged, the value of the function is unchanged. This shows that the output of a linear shift-invariant system can be obtained by convolving the input with the function representing the linear characteristics of the system.
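The discrete definition (the integral replaced by a sum) can be implemented directly and checked against `numpy.convolve`:

```python
import numpy as np

def conv1d(f, g):
    """Discrete convolution (f*g)[t] = sum over x of f[t-x] * g[x],
    computed directly from the definition."""
    n = len(f) + len(g) - 1
    out = np.zeros(n)
    for t in range(n):
        for x in range(len(g)):
            if 0 <= t - x < len(f):
                out[t] += f[t - x] * g[x]
    return out

f = np.array([1.0, 2.0, 3.0])
g = np.array([0.0, 1.0, 0.5])
print(conv1d(f, g))        # matches the library routine below
print(np.convolve(f, g))
```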

2. Image filtering

(1) Image filtering suppresses the noise of the target image while preserving its detailed features as far as possible; it is an indispensable step in image preprocessing, and its quality directly affects the effectiveness and reliability of subsequent image processing and analysis. (Filtering removes useless information and retains useful information, which may be low-frequency or high-frequency.)

(2) Filtering serves two purposes: one is to extract features of the object as feature patterns for image recognition; the other is to satisfy the requirements of image processing by removing the noise introduced during digitization.

Filtering must also meet two requirements: first, it must not damage important information such as the contours and edges of the image; second, it should make the image clear, with a good visual effect.

(3) Image filtering methods: there are many, and they can be divided into frequency-domain methods and spatial-domain methods. A frequency-domain method computes the transform coefficients of the image in some transform domain and then obtains the enhanced image by the inverse transform; this is an indirect filtering method. Spatial-domain filtering is a direct method that operates on the gray levels of the image itself.

 <1> Frequency-domain filtering transforms the image from the spatial (or time) domain into the frequency domain and then uses the transform coefficients, which reflect certain characteristics of the image, to filter it. The Fourier transform is the most commonly used transform. In the Fourier domain, the DC component of the spectrum is proportional to the average brightness of the image, noise corresponds to the higher-frequency regions, and the main image content lies in the lower-frequency regions. These inherent properties of the transform can be exploited for filtering: a low-pass filter lets the low-frequency components pass smoothly and effectively blocks the high-frequency components, filtering out the noise; the inverse transform then yields a smoothed image.

The mathematical expression of low-pass filtering is:

                    G(u, v) = H(u, v) · F(u, v)

where F(u, v) is the Fourier transform of the original (noisy) image;
    H(u, v) is the transfer function (i.e. the low-pass filter);
    G(u, v) is the Fourier transform of the output image after low-pass filtering.
    H filters out the high-frequency components while letting the low-frequency information pass essentially without loss. After filtering, the smoothed image is obtained by the inverse Fourier transform, i.e.

                    g(x, y) = F⁻¹[G(u, v)]

Choosing an appropriate transfer function H(u, v) is the key to frequency-domain low-pass filtering. Commonly used transfer functions include the trapezoidal function, the exponential function, the Butterworth function, and so on.
Common frequency-domain low-pass filters include the ideal (circular) low-pass filter, the Butterworth low-pass filter, the exponential low-pass filter, and the trapezoidal low-pass filter. All of them can improve an image contaminated by noise.
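As one concrete transfer function, the Butterworth low-pass filter H(u,v) = 1 / (1 + (D/D0)^(2n)) can be sketched as follows; unlike the ideal filter, it rolls off smoothly, which reduces ringing. The function name is illustrative.

```python
import numpy as np

def butterworth_lowpass(shape, D0, n=2):
    """Butterworth low-pass transfer function H(u,v) = 1 / (1 + (D/D0)^(2n)),
    where D is the distance from the center of the (shifted) spectrum."""
    rows, cols = shape
    u = np.arange(rows) - rows // 2
    v = np.arange(cols) - cols // 2
    D = np.sqrt(u[:, None] ** 2 + v[None, :] ** 2)
    return 1.0 / (1.0 + (D / D0) ** (2 * n))

H = butterworth_lowpass((32, 32), D0=8)
print(H[16, 16])  # 1.0 at the DC (center): low frequencies pass unattenuated
print(H[16, 24])  # 0.5 exactly at D = D0: the cutoff is gradual, not a hard edge
```

Multiplying this H with the shifted spectrum F(u, v) and inverse-transforming, exactly as in the expression G = H·F above, yields the smoothed image.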
<2> Two kinds of spatial-domain filtering methods are commonly used:
      One is image fitting, including n-th order polynomial fitting, discrete orthogonal-polynomial fitting, quadric-surface fitting, and similar methods;

      The other is image smoothing, including neighborhood averaging, median filtering, gradient-reciprocal weighting, selective masking, and so on.

 <3> High-pass filtering: edge extraction and enhancement. Gray levels change rapidly in edge regions, i.e. the frequency there is high, so a high-pass filter retains the edge regions and filters out the non-edge regions;

       Low-pass filtering: smooths edges, so edge regions transition gradually.

       Gaussian filter: a linear smoothing filter, well suited to eliminating Gaussian noise and widely used for noise reduction in image processing. Generally speaking, Gaussian filtering is a weighted averaging of the whole image: the value of each pixel is the weighted average of itself and the other pixels in its neighborhood. The Gaussian smoothing filter is very effective at suppressing noise that follows a normal distribution. A 3×3 mask is as follows:


As the mask shows, the pixel at its center carries a larger weight than any other, so it matters most in the averaging; pixels farther from the center carry smaller weights, which reduces the blur introduced by smoothing.
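A commonly used 3×3 Gaussian-like mask with exactly this property (largest weight at the center, smallest at the corners, weights summing to 1) is [1 2 1; 2 4 2; 1 2 1] / 16; a minimal numpy sketch with an illustrative `smooth` helper:

```python
import numpy as np

# A standard 3x3 Gaussian-like smoothing mask; its weights sum to 1, with the
# center pixel weighted most heavily and the corners least.
mask = np.array([[1, 2, 1],
                 [2, 4, 2],
                 [1, 2, 1]], dtype=float) / 16.0

def smooth(img, kernel):
    """Weighted average over each 3x3 neighborhood (borders edge-replicated)."""
    padded = np.pad(img, 1, mode="edge")
    out = np.empty_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + 3, j:j + 3] * kernel)
    return out

img = np.full((5, 5), 100.0)
img[2, 2] = 180.0               # a mild noise spike on a flat background
print(smooth(img, mask)[2, 2])  # attenuated: 100 + 80 * (4/16) = 120
```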

3. Image smoothing

Image smoothing: suppressing, weakening, or eliminating details, abrupt changes, edges, and noise in an image. Smoothing is low-pass filtering of the image, and it can be realized in either the spatial domain or the frequency domain. Spatial-domain smoothing mainly uses low-pass convolution filters, median filters, and the like; frequency-domain smoothing commonly uses trapezoidal, Gaussian, exponential, and Butterworth low-pass filters.


Image convolution: an implementation mechanism, used by filtering and much else; it can be regarded as an extension of mathematics into image processing.

Image filtering: an image processing method to achieve different purposes.

Image smoothing: in essence, low-pass filtering.
