
Digital Image Processing - Knowledge Points

Without further ado, let the classic Lena test image open the post.

* Digital image: an image that can be stored, displayed, and processed on a computer.

* Digital image processing: using a computer to analyze, transform, and otherwise process images so that they serve a variety of purposes.

* Characteristics of digital images:

1. Large amount of information in the image

2. Large amount of data to process

3. Large amount of repetitive computation during processing

4. Highly integrated processing techniques

* Structure of human vision:

* Cone cells: sense both light and color; sensitive to color.

* Rod cells: sense only light, not color. (A deficiency of rod cells causes night blindness.)

* Brightness: how bright or dark the light is.

* Hue: the kind of color, determined by the relative strengths of the primaries, e.g. the mix of the red, green, and blue primaries under RGB.

* Saturation: the purity (concentration) of the color.

* Brightness contrast effects:

1. Simultaneous contrast effect: the perceived brightness of a region depends on the brightness of its surroundings, not only on its own luminance.

2. Mach band effect: the visual system perceives illusory bright or dark stripes wherever there is an abrupt change in brightness.

* Image digitization: converting a continuous analog signal into a discrete digital signal.

* Nyquist sampling theorem:

Conditions for a discrete signal to replace a continuous signal:

1. The original signal is a finite bandwidth signal.

2. The sampling frequency is at least twice the highest frequency of the signal.
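A quick numerical illustration (my own example, not from the notes): a cosine sampled below the Nyquist rate produces exactly the same samples as a lower-frequency alias.

```python
import numpy as np

fs = 4.0          # sampling frequency (Hz)
n = np.arange(8)  # sample indices
# A 3 Hz cosine sampled at 4 Hz violates the Nyquist condition (fs < 2 * 3 Hz),
# so its samples coincide with those of a 1 Hz cosine (3 = fs - 1): aliasing.
high = np.cos(2 * np.pi * 3.0 * n / fs)
low = np.cos(2 * np.pi * 1.0 * n / fs)
print(np.allclose(high, low))  # True: the undersampled signal is indistinguishable
```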

* Spatial resolution:

Units: pixel/inch, pixel/cm, pixel*pixel

Quantization of a digital image: converting gray levels to integer representations.

e.g. 8 bits can represent 2^8 = 256 gray levels (0 - 255)

Amplitude (gray-level) resolution: the more gray levels there are, the higher this resolution is.

(False contours: when there are too few gray levels, gradual tonal changes are discretized into visible bands that look like contours.)

* Calculating the amount of data in a digital image

For a pixel resolution of M*N at Q bits/pixel:

The amount of data is M*N*Q/8 bytes.

(The number of quantization levels is 2^Q.)
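The formula above can be checked with a tiny helper (the function name is my own, for illustration):

```python
# Hypothetical helper illustrating the formula M * N * Q / 8 bytes.
def image_bytes(m, n, q):
    """Data volume of an m x n image quantized at q bits/pixel, in bytes."""
    return m * n * q // 8

print(image_bytes(1024, 768, 8))   # 786432 bytes (768 KB)
print(image_bytes(1024, 768, 24))  # 2359296 bytes for a 24-bit RGB image
```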

* Classification of digital images:

1. Grayscale image: gray levels quantized between pure black and pure white.

2. Binary image: black and white only.

3. Color image: e.g. an RGB image, where each color channel is represented by its own bits.

* Basic relationships between pixels:

* Positional relationships:

* Adjacency:

Adjacency conditions:

1. 4-adjacency or 8-adjacency

2. Gray values are similar

* Connectivity: a property generated by adjacency.

Connected set: generated by connectivity.

(In the example figure: 6 connected components under 4-connectivity, 2 under 8-connectivity.)

Region: if R, a subset of the pixels of an image, is a connected set, then R is a region.

Boundary: the pixels of a region R that have one or more neighbors outside R. (The figures above illustrate boundaries.)

Pixel distances:

1. Euclidean distance = sqrt((x1-x2)^2 + (y1-y2)^2)

2. City-block distance = |x1-x2| + |y1-y2|

3. Chessboard distance = max(|x1-x2|, |y1-y2|)
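The three distance measures, as a small sketch (function names are my own):

```python
import math

def euclidean(p, q):
    # Straight-line distance
    return math.hypot(p[0] - q[0], p[1] - q[1])

def city_block(p, q):
    # D4 (city-block) distance: sum of absolute coordinate differences
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def chessboard(p, q):
    # D8 (chessboard) distance: maximum absolute coordinate difference
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

print(euclidean((0, 0), (3, 4)), city_block((0, 0), (3, 4)), chessboard((0, 0), (3, 4)))
# 5.0 7 4
```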

Algebraic operations on digital images:

Applications:

Addition: removing additive noise (by averaging multiple frames), superimposing images.

Subtraction: detecting changes between images.

Multiplication: masking (keying out regions), adjusting gray levels.

* Point operations: transform individual pixels.

* Spatial filtering: processing based on neighborhoods.

* Gray-scale transformation:

Original pixel -> mapping function -> transformed pixel

Application :

1. Image inversion (negative effect)

Taking 8 bits as an example: transformed gray level = 255 - original gray level.

2. Linear transformation (1)

Expansion: stretch the gray-scale dynamic range of an under- or over-exposed image to increase contrast and make the image clearer.

Compression: the opposite; softens the image.

* Piecewise linear transformation (2):

3. Non-linear transformation:

Purpose: apply different degrees of processing to pixels in different gray-scale ranges; for example, shadows and highlights may not need their dynamic range stretched.

* Logarithmic expansion:

* Exponential expansion:
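A small numpy sketch of the negative and logarithmic transformations described above (the 2x2 test image is my own example):

```python
import numpy as np

img = np.array([[0, 64], [128, 255]], dtype=np.uint8)  # toy 2x2 image

# Negative: g' = 255 - g for an 8-bit image
negative = 255 - img

# Logarithmic expansion: s = c * log(1 + r), stretching dark gray levels
c = 255.0 / np.log1p(255.0)
log_img = np.rint(c * np.log1p(img.astype(np.float64))).astype(np.uint8)
```

Note how the log transform lifts the mid-dark value 64 well above 64 while keeping 0 and 255 fixed.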

Gray-scale histogram: reflects the distribution of gray levels.

Horizontal axis: gray level; vertical axis: number of pixels or percentage.

* Calculation:

Histogram equalization

e.g. a practice problem:

Gray scale 0 - 7

Distribution probability is: 0.19, 0.25, 0.21, 0.16, 0.08, 0.06, 0.03, 0.02

Find the distribution of pixels in the histogram after equalization:

Answer:

After equalization only 5 gray levels remain: 1, 3, 5, 6, 7, with the following probabilities:

1: 0.19, 3: 0.25, 5: 0.21, 6: 0.24, 7: 0.11
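The worked problem can be verified with a short numpy calculation, mapping each level k to round((L-1) * CDF(k)):

```python
import numpy as np

p = np.array([0.19, 0.25, 0.21, 0.16, 0.08, 0.06, 0.03, 0.02])  # level probabilities
L = 8  # number of gray levels

# Equalization maps level k to round((L - 1) * CDF(k))
new_levels = np.rint((L - 1) * np.cumsum(p)).astype(int)
print(new_levels)  # [1 3 5 6 6 7 7 7]

# Merge the probabilities of old levels that map to the same new level
hist = {}
for lvl, prob in zip(new_levels, p):
    hist[lvl] = hist.get(lvl, 0.0) + prob
print(hist)  # approximately {1: 0.19, 3: 0.25, 5: 0.21, 6: 0.24, 7: 0.11}
```

Levels 3 and 4 both map to 6 (0.16 + 0.08 = 0.24), and levels 5, 6, 7 all map to 7 (0.06 + 0.03 + 0.02 = 0.11), matching the answer above.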

Histogram specification (matching)

In short: given a template histogram, transform the image so that its gray-level distribution resembles the template.

For example, in this problem, gray level 0 accounts for 0.19, close to the target template's 0.2, so it is mapped to the template's gray level 3. The intermediate gray levels 1, 2, and 3 add up to 0.62, close to the target template's 0.6, so they are mapped to 5.

* Spatial filter / template (mask): a matrix.

* Filtering process:

1. Slide the filter across the image, aligning it with each pixel in turn.

2. Perform the convolution (multiply each pixel by the corresponding template coefficient, then sum).

3. Assign the result to the image pixel under the center of the filter.

* Edge problem: since the filter cannot extend beyond the image, the edge pixels cannot be filtered directly.

* Handling methods:

1. Ignore the edges.

2. Pretend there are pixels outside the edge with the same gray values as the edge pixels (replicate padding).
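The filtering steps plus replicate padding can be sketched in numpy as follows (strictly speaking this computes correlation, which equals convolution for symmetric templates; the function name is my own):

```python
import numpy as np

def filter2d(img, kernel):
    """Slide a 3x3 template over the image and sum the weighted neighborhood.
    Edges are handled by replicating the border pixels (method 2 above)."""
    pad = np.pad(img.astype(np.float64), 1, mode='edge')
    out = np.zeros_like(img, dtype=np.float64)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(pad[i:i + 3, j:j + 3] * kernel)
    return out

mean_kernel = np.ones((3, 3)) / 9.0  # neighborhood-averaging template
img = np.array([[10., 10., 10.], [10., 100., 10.], [10., 10., 10.]])
smoothed = filter2d(img, mean_kernel)
print(smoothed[1, 1])  # approximately 20: the bright spike is averaged down
```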

Classification of spatial filters:

1. Smoothing filter: smooths the image by removing high-frequency components, so gray values change less and noise is reduced.

2. Sharpening filter: removes low-frequency components, increasing contrast and making edges prominent.

1. Neighborhood averaging

Reduces noise, but also blurs the image.

2. Weighted averaging

The gray values at different positions carry different importance (weights): the center matters most, and importance decreases outward.

3. Non-linear smoothing filter

1. Differences are used to measure how much the gray level changes between neighboring pixels. (For continuous signals the rate of change is the derivative; for discrete signals it is the difference. They are the same concept.)

2. The gradient is obtained from these differences. (The gradient can be used to detect edges, because gray levels change sharply at edges.)

3. Sharpened gray value = original gray value + sharpening coefficient * gradient.

Practical application :

1. First-order difference templates

2. Second-order difference template: the Laplacian operator

Calculate the gradient:

Direct sharpening:
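A minimal numpy sketch of second-order sharpening with the standard 4-neighbor Laplacian template (center weight -4). Because the center weight is negative, the sharpened value is the original minus the Laplacian response, which has the same effect as "original + coefficient * change"; the names are my own:

```python
import numpy as np

LAPLACIAN = np.array([[0., 1., 0.],
                      [1., -4., 1.],
                      [0., 1., 0.]])  # 4-neighbor second-order difference template

def laplacian_sharpen(img, k=1.0):
    """Sharpened value = original - k * Laplacian response, clipped to [0, 255]."""
    pad = np.pad(img.astype(np.float64), 1, mode='edge')
    out = np.empty_like(img, dtype=np.float64)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = img[i, j] - k * np.sum(pad[i:i + 3, j:j + 3] * LAPLACIAN)
    return np.clip(out, 0, 255)
```

A flat region is left unchanged (its second difference is zero), while a bright spike against a dark background is pushed to the maximum gray level.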

> The matrix filters we used earlier process the image in the spatial domain; now we move to the frequency domain.

> For those who don't understand the frequency domain, you can go to Zhihu and search.

> Brief introduction:

> The genius mathematician Fourier discovered that any periodic signal can be represented as a series of sinusoidal functions, and any aperiodic signal can be represented as a weighted integral of sinusoids.

> So the distribution of these sinusoidal functions gave rise to the concept of the frequency domain.

After taking the two-dimensional discrete Fourier transform of the image:

The four corners are the low-frequency part; the center holds the highest frequencies.

The brightest areas indicate the strongest low-frequency energy. (Look at the image: the black coat, the background, and so on. These pixels with small gray-level changes make up most of the image, and they are the low-frequency components.)

Thanks to the periodicity and conjugate symmetry of the 2-D DFT, we can shift the spectrum so that the low frequencies sit at the center.

Vertical and horizontal properties of the spectrum:

* Frequency filtering basics

Steps:

1. Transform the image from the spatial domain to the frequency domain.

2. Multiply the spectrum by the frequency-domain filter.

3. Apply the inverse Fourier transform to obtain the filtered image.
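The three steps can be sketched with numpy's FFT, here with an assumed Gaussian low-pass transfer function H(u, v) = exp(-D^2 / (2 * D0^2)); the function name is my own:

```python
import numpy as np

def gaussian_lowpass(img, d0):
    """The three filtering steps, using a Gaussian low-pass transfer function.
    d0 is the cutoff frequency; D is the distance from the spectrum center."""
    M, N = img.shape
    F = np.fft.fftshift(np.fft.fft2(img))   # step 1: spatial -> frequency (centered)
    u = np.arange(M) - M // 2
    v = np.arange(N) - N // 2
    D2 = u[:, None] ** 2 + v[None, :] ** 2  # squared distance from spectrum center
    H = np.exp(-D2 / (2.0 * d0 ** 2))
    G = F * H                               # step 2: multiply spectrum by the filter
    return np.real(np.fft.ifft2(np.fft.ifftshift(G)))  # step 3: inverse transform
```

Since H = 1 at the center (DC), the mean gray level is preserved while variation is smoothed away; a high-pass version can be built from 1 - H.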

* Frequency-domain filtering classification:

1, low-pass filtering

2, high-pass filtering

3, band-pass and band-stop filtering

4, homomorphic filtering

* Notch filtering

* Low-pass filtering

Idea: noise and edges are high-frequency components. A low-pass filter, as the name implies, lets the low frequencies pass and filters out the high frequencies.

Classification:

1, the ideal low-pass filter

where D0 is the cutoff frequency, chosen manually.

Disadvantage: may produce the ringing phenomenon.

Cause of ringing: the ideal filter's sharp cutoff corresponds to a sinc-shaped function in the spatial domain, whose oscillations show up as rings around edges.

2. Butterworth low-pass filter

Disadvantages: smoothing is not as good as the ideal low-pass

3, Gaussian low-pass filter (GLPF)

Disadvantages: the smoothing effect is not as good as the first two

Relationship between smoothing effect and cutoff frequency:

A high-pass filter lets the high frequencies pass and blocks the low frequencies, achieving sharpening.

High-pass filter template = 1 - low-pass filter template

Effect:

Again, the ideal high-pass filter (IHPF) exhibits ringing.

High-pass filtering yields only edge information; non-edge regions come out black. To obtain an enhanced, sharpened image, use high-frequency emphasis filtering.

Method:

k * high-pass filter + c

where k > 1 is a gain coefficient and c is a constant that adds back part of the low-frequency background.

For images with a large dynamic range (blacks are very black, whites are very white) whose detail lies in the dark or bright parts:

Expanding the gray levels improves contrast, but the dynamic range grows even larger.

Compressing the gray levels shrinks the dynamic range, but the details become even harder to distinguish.

At this point, frequency-domain filtering must be combined with gray-scale transformation: homomorphic filtering.

* Rationale:

An image can be described by the illumination/reflectance model: f(x, y) = i(x, y) * r(x, y).

Illuminance: sunlight or other light source, generally less variable, low frequency.

Reflectance: Determined by the material on the surface of the object, it varies greatly and is high frequency.

(As an example, let's say you look out the window and the sunlight hits all the objects with almost the same light. But the different details and so on are determined by the reflectivity of flowers, plants, houses, and so on)

So:

Attenuating the illumination component i(x, y) reduces the gray-scale dynamic range.

Strengthening the reflectance component r(x, y) improves the image contrast.

Process:

In this way the homomorphic filter automatically attenuates the low-frequency illumination, reducing the dynamic range, while enhancing the high frequencies to improve contrast.
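A rough numpy sketch of the homomorphic process (log, DFT, high-emphasis filter, inverse DFT, exp). The Gaussian-shaped filter and all parameter values are illustrative assumptions, not from the notes:

```python
import numpy as np

def homomorphic(img, d0=30.0, gamma_l=0.5, gamma_h=2.0):
    """log -> DFT -> high-emphasis filter -> inverse DFT -> exp.
    gamma_l < 1 attenuates low frequencies (illumination); gamma_h > 1
    boosts high frequencies (reflectance). Values here are illustrative."""
    M, N = img.shape
    z = np.log1p(img.astype(np.float64))   # f = i * r, so log f = log i + log r
    Z = np.fft.fftshift(np.fft.fft2(z))
    u = np.arange(M) - M // 2
    v = np.arange(N) - N // 2
    D2 = u[:, None] ** 2 + v[None, :] ** 2
    H = gamma_l + (gamma_h - gamma_l) * (1.0 - np.exp(-D2 / (2.0 * d0 ** 2)))
    s = np.real(np.fft.ifft2(np.fft.ifftshift(Z * H)))
    return np.expm1(s)                     # undo the logarithm
```

Taking the logarithm first turns the product i * r into a sum, so the filter can act on illumination and reflectance separately.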

Image degradation : The image quality is corrupted due to imperfections in the equipment during generation, storage, and transmission.

Image restoration: based on a priori knowledge, establish a model of the degradation, then perform the inverse operation to recover the original image.

* The connection and difference between image enhancement and image restoration

Connection: Both are to improve the visual quality of the image

Difference: enhancement is subjective and does not consider the causes of degradation; restoration is objective, aiming to recover the original image as faithfully as possible.

Noise models (a common source of degradation):

Described by their probability density functions.

Classification:

1. Gaussian noise

2. Rayleigh noise

3. Gamma noise

4. Uniformly distributed noise

5. Impulse noise (salt-and-pepper noise)

6. Periodic noise

Gray scale histograms of some noises:

Case:

Analysis:

Take a patch of the image with very little variation and plot its histogram; it turns out to match a Gaussian noise model.

Handling additive noise (Gaussian, uniformly distributed noise): spatial filtering

1. Arithmetic mean filtering: take the arithmetic mean.

2. Geometric mean filtering: take the geometric mean.

Advantage: geometric mean filtering retains more image detail, with smoothing comparable to the arithmetic mean filter.

3. Harmonic mean filtering

Works well against "salt" noise, but not against "pepper" noise.

4. Contraharmonic (inverse harmonic) mean filtering

Filter order Q:

Q > 0 removes "pepper" noise

Q < 0 removes "salt" noise

Q == 0 reduces to arithmetic mean filtering

Q == -1 reduces to harmonic mean filtering
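A small numpy sketch of the contraharmonic mean filter, whose order Q controls which impulse polarity is removed; the function name is my own:

```python
import numpy as np

def contraharmonic(img, Q, size=3):
    """Contraharmonic mean of order Q: sum(g^(Q+1)) / sum(g^Q) over the window.
    Q > 0 removes pepper (dark) impulses; Q < 0 removes salt (bright) impulses;
    Q = 0 gives the arithmetic mean and Q = -1 the harmonic mean."""
    r = size // 2
    pad = np.pad(img.astype(np.float64), r, mode='edge')
    out = np.empty_like(img, dtype=np.float64)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            w = pad[i:i + size, j:j + size]
            den = np.sum(w ** Q)
            out[i, j] = np.sum(w ** (Q + 1)) / den if den != 0 else 0.0
    return out

# A lone pepper (0) pixel is pulled back toward its neighbors with Q > 0:
pepper = np.full((3, 3), 100.0)
pepper[1, 1] = 0.0
print(contraharmonic(pepper, Q=1.5)[1, 1])  # approximately 100
```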

5. Statistical ordering filters:

Median filter: at the same size, it blurs less than the mean filter and is very effective against impulse noise; however, repeated application blurs the image.

Maximum filter: Good for "pepper" noise, but removes some black pigment from the edges of black objects.

Minimum filter: works well with "salt" noise, but removes some white color from the edges of white objects.

Midpoint filter: Calculates the arithmetic average of the maximum and minimum values in the filter template, which is the midpoint value. Works best with Gaussian and uniform noise.
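The median filter, as a minimal numpy sketch (the function name and test image are my own):

```python
import numpy as np

def median_filter(img, size=3):
    """Replace each pixel with the median of its size x size neighborhood.
    Edges are handled by replicating the border pixels."""
    r = size // 2
    pad = np.pad(img, r, mode='edge')
    out = np.empty_like(img, dtype=np.float64)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.median(pad[i:i + size, j:j + size])
    return out

# A single bright impulse is removed completely:
noisy = np.full((3, 3), 10.0)
noisy[1, 1] = 255.0
print(median_filter(noisy)[1, 1])  # 10.0
```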

6. Adaptive filter (determines the strength of restoration from the information of the pixels currently being processed)

Effect:

7. Adaptive median filter

Find the median within the template. If the median is not an impulse, check whether the center value Zxy is an impulse: if Zxy is not an impulse, output Zxy (preserving detail); otherwise output the median. If the median itself is an impulse, enlarge the template and repeat.