Digital Image Processing MCQ (Multiple Choice Questions)

Here are 1000 MCQs on Digital Image Processing (Chapterwise).

1. What is Digital Image Processing?
a) It’s an application that alters digital videos
b) It’s a software that allows altering digital pictures
c) It’s a system that manipulates digital medias
d) It’s a machine that allows altering digital images

Answer: b
Explanation: Digital Image Processing (DIP) is software-based manipulation of digital images using a computer. It is also used to improve images and to extract useful information from them.

2. Which of the following processes helps in image enhancement?
a) Digital Image Processing
b) Analog Image Processing
c) Both a and b
d) None of the above

Answer: c
Explanation: Image enhancement can be performed by both analog and digital image processing. In the digital case, it is the process of modifying a stored image with software to improve its appearance.

3. Which of the following functions can be performed by digital image processing?
a) Fast image storage and retrieval
b) Controlled viewing
c) Image reformatting
d) All of the above

Answer: d
Explanation: Functions that can be performed by digital image processing are:

  1. Image reconstruction
  2. Image reformatting
  3. Dynamic range image data acquisition
  4. Image processing
  5. Fast image storage and retrieval
  6. Fast and high-quality image distribution
  7. Controlled viewing
  8. Image analysis

4. Which of the following is an example of Digital Image Processing?
a) Computer Graphics
b) Pixels
c) Camera Mechanism
d) All of the mentioned

Answer: d
Explanation: Digital image processing is image processing carried out in software. Computer graphics, signals, photography, camera mechanisms, pixels, etc. are all examples of the areas it involves.

5. What are the categories of digital image processing?
a) Image Enhancement
b) Image Classification and Analysis
c) Image Transformation
d) All of the mentioned

Answer: d
Explanation: Digital image processing is categorized into:
i. Preprocessing
ii. Image Enhancement
iii. Image Transformation
iv. Image Classification and Analysis

6. How does picture formation in the eye vary from image formation in a camera?
a) Fixed focal length
b) Varying distance between lens and imaging plane
c) No difference
d) Variable focal length

Answer: d
Explanation: In the eye, the fibers of the ciliary body change the curvature of the lens and hence its focal length, whereas in a camera the focal length is fixed and focusing is achieved by varying the distance between the lens and the imaging plane.

7. What are the names of the various colour image processing categories?
a) Pseudo-color and Multi-color processing
b) Half-color and pseudo-color processing
c) Full-color and pseudo-color processing
d) Half-color and full-color processing

Answer: c
Explanation: Full-color and pseudo-color processing are the two main categories of color image processing. Images in the first category are acquired with a full-color sensor, such as a color TV camera or a color scanner. In the second category, the problem is assigning a color to a particular monochrome intensity or range of intensities.

8. Which characteristics are taken together in chromaticity?
a) Hue and Saturation
b) Hue and Brightness
c) Saturation, Hue, and Brightness
d) Saturation and Brightness

Answer: a
Explanation: The combination of hue and saturation is known as chromaticity, and a color’s brightness and chromaticity can be used to describe it.

9. Which of the following statements describes the term pixel depth?
a) It is the number of units used to represent each pixel in RGB space
b) It is the number of mm used to represent each pixel in RGB space
c) It is the number of bytes used to represent each pixel in RGB space
d) It is the number of bits used to represent each pixel in RGB space

Answer: d
Explanation: The RGB color model represents images as three-component images, one for each primary color. These three images mix on the phosphor screen to generate a composite color image when input into an RGB display. The pixel depth refers to the number of bits required to represent each pixel in RGB space.
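
As a small illustration of the idea (a minimal sketch assuming NumPy is available; the image is a hypothetical stand-in), pixel depth is just bits per component times the number of components:

```python
import numpy as np

# Hypothetical 4x4 RGB image with 8 bits per component (values 0-255).
rgb = np.zeros((4, 4, 3), dtype=np.uint8)

bits_per_component = rgb.dtype.itemsize * 8       # 8 bits for uint8
pixel_depth = rgb.shape[-1] * bits_per_component  # 3 components x 8 bits

print(pixel_depth)  # 24 -> a "24-bit" full-color image
```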

10. The aliasing effect on an image can be reduced using which of the following methods?
a) By reducing the high-frequency components of image by clarifying the image
b) By increasing the high-frequency components of image by clarifying the image
c) By increasing the high-frequency components of image by blurring the image
d) By reducing the high-frequency components of image by blurring the image

Answer: d
Explanation: By adding additional frequency components to the sampled function, aliasing corrupts the sampled image. As a result, the most common method for decreasing aliasing effects on an image is to blur the image prior to sampling to lower its high-frequency components.
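
A minimal sketch of this idea, assuming NumPy and SciPy are available (the image and the sigma value are arbitrary stand-ins): blur first to suppress high frequencies, then subsample.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def downsample_with_antialias(img, factor=2, sigma=1.0):
    """Blur to suppress high frequencies, then keep every `factor`-th pixel."""
    blurred = gaussian_filter(img.astype(float), sigma=sigma)
    return blurred[::factor, ::factor]

img = np.random.rand(256, 256)           # stand-in for a real grayscale image
small = downsample_with_antialias(img)   # 128 x 128, with aliasing reduced
```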

11. Which of the following is the first and foremost step in Image Processing?
a) Image acquisition
b) Segmentation
c) Image enhancement
d) Image restoration

Answer: a
Explanation: The initial step in image processing is image acquisition. It’s worth noting that acquisition might be as simple as being provided a digital image. Preprocessing, such as scaling, is usually done during the image acquisition stage.

12. Which of the following image processing approaches is the fastest, most accurate, and flexible?
a) Photographic
b) Electronic
c) Digital
d) Optical

Answer: c
Explanation: Because it is fast, accurate, and dependable, digital image processing is a more versatile and agile technology.

13. Which of the following is the next step in image processing after compression?
a) Representation and description
b) Morphological processing
c) Segmentation
d) Wavelets

Answer: b
Explanation: Steps in image processing:
Step 1: Image acquisition
Step 2: Image enhancement
Step 3: Image restoration
Step 4: Color image processing
Step 5: Wavelets and multi-resolution processing
Step 6: Compression
Step 7: Morphological processing
Step 8: Segmentation
Step 9: Representation & description
Step 10: Object recognition

14. ___________ determines the quality of a digital image.
a) The discrete gray levels
b) The number of samples
c) discrete gray levels & number of samples
d) None of the mentioned

Answer: c
Explanation: The number of samples and discrete grey levels employed in sampling and quantization determine the quality of a digital image.

15. Image processing involves how many steps?
a) 7
b) 8
c) 13
d) 10

Answer: d
Explanation: Steps in image processing:
Step 1: Image acquisition
Step 2: Image enhancement
Step 3: Image restoration
Step 4: Color image processing
Step 5: Wavelets and multi-resolution processing
Step 6: Compression
Step 7: Morphological processing
Step 8: Segmentation
Step 9: Representation & description
Step 10: Object recognition

16. Which of the following is the abbreviation of JPEG?
a) Joint Photographic Experts Group
b) Joint Photographs Expansion Group
c) Joint Photographic Expanded Group
d) Joint Photographic Expansion Group

Answer: a
Explanation: Most computer users are aware of picture compression in the form of image file extensions, such as the jpg file extension used in the JPEG (Joint Photographic Experts Group) image compression standard.

17. Which of the following is the role played by segmentation in image processing?
a) Deals with property in which images are subdivided successively into smaller regions
b) Deals with partitioning an image into its constituent parts or objects
c) Deals with extracting attributes that result in some quantitative information of interest
d) Deals with techniques for reducing the storage required to save an image, or the bandwidth required to transmit it

Answer: b
Explanation: Segmentation partitions an image into its constituent parts or objects. In general, autonomous segmentation is one of the most difficult tasks in digital image processing; a robust segmentation procedure goes a long way toward solving imaging problems that require objects to be identified individually.

18. The digitization process, in which the digital image comprises M rows and N columns, necessitates choices for M, N, and the number of grey levels per pixel, L. M and N must have which of the following values?
a) M has to be positive and N has to be a negative integer
b) M has to be negative and N has to be a positive integer
c) M and N have to be negative integers
d) M and N have to be positive integers

Answer: d
Explanation: The digitization process, in which the digital image contains M rows and N columns, requires decisions about M, N, and the number of gray levels per pixel, L. Other than the requirement that M and N be positive integers, there are no constraints on them.

19. Which of the following tool is used in tasks such as zooming, shrinking, rotating, etc.?
a) Filters
b) Sampling
c) Interpolation
d) None of the Mentioned

Answer: c
Explanation: The basic tool for zooming, shrinking, rotating, and other operations is interpolation.

20. The effect caused by the use of an insufficient number of intensity levels in smooth areas of a digital image is called _____________
a) False Contouring
b) Interpolation
c) Gaussian smooth
d) Contouring

Answer: a
Explanation: Using too few intensity levels in smooth areas produces ridge-like structures that resemble the contour lines of a topographic map, hence the name false contouring.

21. What is the procedure done on a digital image to alter the values of its individual pixels known as?
a) Geometric Spatial Transformation
b) Single Pixel Operation
c) Image Registration
d) Neighbourhood Operations

Answer: b
Explanation: It is written as s = T(z), where z is the intensity of a pixel in the input image, s is the corresponding output intensity, and T is the transformation function.

22. Points whose locations are known exactly in the input and reference images, and which are used in geometric spatial transformations, are called ________
a) Known points
b) Key-points
c) Réseau points
d) Tie points

Answer: d
Explanation: Tie points, also known as Control points, are spots in input and reference images whose locations are known precisely.

23. ___________ is a commercial use of Image Subtraction.
a) MRI scan
b) CT scan
c) Mask mode radiography
d) None of the Mentioned

Answer: c
Explanation: Mask mode radiography, which is based on Image Subtraction, is an important medical imaging field.

24. Approaches to image processing that work directly on the pixels of an incoming image operate in ____________
a) Spatial domain
b) Inverse transformation
c) Transform domain
d) None of the Mentioned

Answer: a
Explanation: Spatial-domain approaches operate directly on the pixels of the input image.

25. Which of the following in an image can be removed by using a smoothing filter?
a) Sharp transitions of brightness levels
b) Sharp transitions of gray levels
c) Smooth transitions of gray levels
d) Smooth transitions of brightness levels

Answer: b
Explanation: A smoothing filter replaces the value of each pixel with the average of the gray levels in its neighborhood, which reduces sharp transitions in gray levels between pixels. This is useful because random noise typically consists of sharp gray-level transitions.
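
A minimal sketch of neighborhood averaging, assuming NumPy and SciPy (the image is a random stand-in):

```python
import numpy as np
from scipy.ndimage import uniform_filter

img = np.random.randint(0, 256, (64, 64)).astype(float)  # stand-in image

# Each output pixel becomes the mean of its 3x3 neighborhood, which damps
# sharp gray-level transitions such as those produced by random noise.
smoothed = uniform_filter(img, size=3)
```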

26. The Region of Interest (ROI) operation is generally known as _______
a) Masking
b) Dilation
c) Shading correction
d) None of the Mentioned

Answer: a
Explanation: Masking, commonly known as the ROI operation, is a typical use of image multiplication.

27. Which of the following comes under the application of image blurring?
a) Image segmentation
b) Object motion
c) Object detection
d) Gross representation

Answer: d
Explanation: An essential use of spatial averaging is blurring an image to obtain a gross representation of objects of interest, so that the intensity of small objects blends with the background and larger objects become easier to detect.

28. Which of the following filters has a response based on ranking the pixels?
a) Sharpening filters
b) Nonlinear smoothing filters
c) Geometric mean filter
d) Linear smoothing filters

Answer: b
Explanation: Order-statistic filters are nonlinear smoothing spatial filters whose response is based on ordering (ranking) the pixels in the image area covered by the filter and then replacing the value of the center pixel with the value determined by the ranking result.
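
For illustration, a median filter (the most common order-statistic filter) can be sketched as follows, assuming NumPy and SciPy and a stand-in image:

```python
import numpy as np
from scipy.ndimage import median_filter

noisy = np.random.randint(0, 256, (64, 64)).astype(float)  # stand-in image

# The center pixel of each 3x3 window is replaced by the median (the middle
# value of the ranked neighborhood) rather than a weighted sum, which is
# what makes the filter nonlinear.
cleaned = median_filter(noisy, size=3)
```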

29. Which of the following illustrates three main types of image enhancing functions?
a) Linear, logarithmic and power law
b) Linear, logarithmic and inverse law
c) Linear, exponential and inverse law
d) Power law, logarithmic and inverse law

Answer: d
Explanation: An introduction to gray-level transformations covers the three fundamental types of functions used frequently for image enhancement: linear (negative and identity transformations), logarithmic (log and inverse-log transformations), and power-law (nth power and nth root transformations). The identity function is the trivial case in which output intensities equal input intensities; it is included only for completeness.
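
A minimal NumPy sketch of the three families applied to intensities normalized to [0, 1] (the scaling constants and the gamma value are arbitrary illustrative choices):

```python
import numpy as np

r = np.linspace(0, 1, 256)               # normalized input intensities

negative = 1.0 - r                       # linear: image negative
log_t    = np.log(1 + r) / np.log(2)     # logarithmic, scaled so 1 maps to 1
gamma    = 0.5
power    = r ** gamma                    # power-law (nth root when gamma < 1)
```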

30. Which of the following is the primary objective of sharpening of an image?
a) Decrease the brightness of the image
b) Increase the brightness of the image
c) Highlight fine details in the image
d) Blurring the image

Answer: c
Explanation: Sharpening an image aids in highlighting small features in the image or enhancing details that have become blurred owing to factors such as noise addition.

31. Which of the following operation is done on the pixels in sharpening the image, in the spatial domain?
a) Differentiation
b) Median
c) Integration
d) Average

Answer: a
Explanation: We know that when we blur an image, we produce a pixel average, which might be termed integration. Because sharpening is the inverse of blurring, we may deduce that we sharpen the image by doing differentiation on the pixels.
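
One common discrete form of differentiation used for sharpening is the Laplacian, a second-derivative operator. A minimal sketch, assuming NumPy and SciPy and a stand-in image:

```python
import numpy as np
from scipy.ndimage import convolve

img = np.random.rand(64, 64)   # stand-in grayscale image in [0, 1]

# Discrete Laplacian kernel (second derivative in both directions).
laplacian = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=float)

detail = convolve(img, laplacian, mode='reflect')

# With a negative center coefficient, subtracting the Laplacian adds the
# detail back into the image, i.e. sharpens it.
sharpened = np.clip(img - detail, 0, 1)
```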

32. The principal objective of sharpening is to highlight transitions in ________
a) Brightness
b) Pixel density
c) Composure
d) Intensity

Answer: d
Explanation: The principal objective of sharpening is to highlight transitions in intensity.

33. Image differentiation enhances which of the following?
a) Pixel Density
b) Contours
c) Edges
d) None of the mentioned

Answer: c
Explanation: Edges and other discontinuities are enhanced via image differentiation.

34. Which of the following fact is correct for an image?
a) An image is the multiplication of illumination and reflectance component
b) An image is the subtraction of reflectance component from illumination component
c) An image is the subtraction of illumination component from reflectance component
d) An image is the addition of illumination and reflectance component

Answer: a
Explanation: An image is formed as the product of an illumination component and a reflectance component, f(x, y) = i(x, y) r(x, y).

35. Which of the following occurs in Unsharp Masking?
a) Subtracting blurred image from original
b) Blurring the original image
c) Adding a mask to the original image
d) All of the mentioned

Answer: d
Explanation: All of these occur in unsharp masking, in this order: blur the original image, subtract the blurred image from the original to form the mask, and add the mask back to the original.
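
A minimal sketch of those three steps, assuming NumPy and SciPy (sigma and k are arbitrary illustrative values; k = 1 gives plain unsharp masking, k > 1 gives high-boost filtering):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

img = np.random.rand(64, 64)                 # stand-in grayscale image

blurred = gaussian_filter(img, sigma=2.0)    # 1. blur the original image
mask    = img - blurred                      # 2. subtract the blur to form the mask
k = 1.0
sharp   = np.clip(img + k * mask, 0, 1)      # 3. add the mask back to the original
```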

36. Which of the following makes an image difficult to enhance?
a) Dynamic range of intensity levels
b) High noise
c) Narrow range of intensity levels
d) All of the mentioned

Answer: d
Explanation: Dynamic range of intensity levels, High noise and Narrow range of intensity levels make it difficult to enhance an image.

37. _________ is the process of moving a filter mask over the image and computing the sum of products at each location.
a) Nonlinear spatial filtering
b) Convolution
c) Correlation
d) Linear spatial filtering

Answer: c
Explanation: Correlation is the process of moving a filter mask over the image and computing the sum of products at each location.
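
As a small illustration of "sum of products at each location", here is the correlation response at one mask position, assuming NumPy (the image and mask are stand-ins):

```python
import numpy as np

img  = np.random.rand(8, 8)            # stand-in image
mask = np.ones((3, 3)) / 9.0           # example 3x3 averaging mask

y, x = 4, 4                            # location of the mask center
patch = img[y - 1:y + 2, x - 1:x + 2]  # pixels currently under the mask

# Correlation response at (y, x): sum of products of mask coefficients
# and the corresponding image pixels.
response = np.sum(mask * patch)
```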

38. On which side of the gray scale are the components of the histogram concentrated in a dark image?
a) Medium
b) Low
c) Evenly distributed
d) High

Answer: b
Explanation: In a dark image, the histogram components are concentrated on the low (dark) side of the gray scale. Similarly, the histogram components of a bright image are biased toward the high end of the gray scale.

39. Which of the following is the application of Histogram Equalisation?
a) Blurring
b) Contrast adjustment
c) Image enhancement
d) None of the Mentioned

Answer: c
Explanation: Histogram equalization is an image enhancement technique; dark or low-contrast images, for example, are enhanced by spreading their intensity values over the full gray-level range.
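
A minimal NumPy sketch of histogram equalization for an 8-bit image (the input here is a random stand-in):

```python
import numpy as np

img = np.random.randint(0, 256, (64, 64)).astype(np.uint8)  # stand-in image

hist, _ = np.histogram(img, bins=256, range=(0, 256))
cdf = hist.cumsum() / img.size   # cumulative distribution of intensity levels

# Map each input level through the scaled CDF: a dark, low-contrast image
# has its intensities spread over the full 0-255 range.
equalized = (255 * cdf[img]).astype(np.uint8)
```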

40. Which of the following is the expansion of PDF, in uniform PDF?
a) Probability Density Function
b) Previously Derived Function
c) Post Derivation Function
d) Portable Document Format

Answer: a
Explanation: PDF is abbreviated as Probability Density Function.

41. ____________ filters are known as averaging filters.
a) Bandpass
b) Low pass
c) High pass
d) None of the Mentioned

Answer: b
Explanation: Averaging filters are also known as Low pass filters.

42. What is/are the resultant image of a smoothing filter?
a) Image with reduced sharp transitions in gray levels
b) Image with high sharp transitions in gray levels
c) None of the mentioned
d) All of the mentioned

Answer: a
Explanation: Smoothing filters reduce random noise, which typically appears as sharp gray-level transitions, so the result is an image with reduced sharp transitions in gray levels.

43. The response for linear spatial filtering is given by the relationship __________
a) Difference of filter coefficient’s product and corresponding image pixel under filter mask
b) Product of filter coefficient’s product and corresponding image pixel under filter mask
c) Sum of filter coefficient’s product and corresponding image pixel under filter mask
d) None of the mentioned

Answer: c
Explanation: In spatial filtering, the mask is moved from point to point and the response at each point is computed using a predefined relationship. For linear spatial filtering, that relationship is the sum of the products of the filter coefficients and the corresponding image pixels in the area spanned by the filter mask.

44. ___________ is/are the feature(s) of a highpass filtered image.
a) An overall sharper image
b) Have less gray-level variation in smooth areas
c) Emphasized transitional gray-level details
d) All of the mentioned

Answer: d
Explanation: A highpass filter attenuates low frequencies, which reduces gray-level variation in smooth areas, and passes high frequencies, which emphasizes transitional gray-level detail and yields an overall sharper image.

45. The filter order of a Butterworth lowpass filter determines whether it is a very sharp or extremely smooth filter function, or an intermediate filter function. Which of the following filters does the filter approach if the parameter value is very high?
a) Gaussian lowpass filter
b) Ideal lowpass filter
c) Gaussian & Ideal lowpass filters
d) None of the mentioned

Answer: b
Explanation: Butterworth lowpass filter functions like an Ideal lowpass filter at high order values, but it has a smoother form at lower order values, behaving like a Gaussian lowpass filter.
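
A sketch of the Butterworth lowpass transfer function H(u, v) = 1 / (1 + (D/D0)^(2n)), assuming NumPy; raising the order n makes the transition sharper and closer to the ideal filter, while n = 1 gives a smooth, Gaussian-like roll-off (the cutoff and array sizes are arbitrary):

```python
import numpy as np

def butterworth_lowpass(shape, cutoff, order):
    """Transfer function H(u, v) = 1 / (1 + (D / D0)**(2 * n)), centered spectrum."""
    rows, cols = shape
    u = np.arange(rows) - rows // 2
    v = np.arange(cols) - cols // 2
    V, U = np.meshgrid(v, u)
    D = np.hypot(U, V)              # distance from the center of the spectrum
    return 1.0 / (1.0 + (D / cutoff) ** (2 * order))

H_smooth = butterworth_lowpass((256, 256), cutoff=30, order=1)   # Gaussian-like roll-off
H_sharp  = butterworth_lowpass((256, 256), cutoff=30, order=20)  # approaches the ideal LPF
```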

46. Which of the following image component is characterized by a slow spatial variation?
a) Reflectance and Illumination components
b) Reflectance component
c) Illumination component
d) None of the mentioned

Answer: c
Explanation: The illumination component of an image is characterized by a slow spatial variation.

47. Gamma Correction is defined as __________
a) Light brightness variation
b) A Power-law response phenomenon
c) Inverted Intensity curve
d) None of the Mentioned

Answer: b
Explanation: Gamma correction is the process of correcting for a device's power-law intensity response by applying a power-law transformation with the exponent gamma.
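
A minimal sketch of gamma correction as a power-law transform on normalized intensities, assuming NumPy (the gamma values are illustrative):

```python
import numpy as np

def gamma_correct(img, gamma, c=1.0):
    """Power-law transform s = c * r**gamma on intensities normalized to [0, 1]."""
    r = img.astype(float) / 255.0
    s = c * np.power(r, gamma)
    return np.clip(s * 255.0, 0, 255).astype(np.uint8)

img = np.random.randint(0, 256, (64, 64)).astype(np.uint8)  # stand-in image
brighter = gamma_correct(img, gamma=0.5)   # gamma < 1 expands dark intensities
darker   = gamma_correct(img, gamma=2.2)   # gamma > 1 compresses them
```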

48. Highlighting the contribution made to the total image appearance by specific bits, rather than highlighting intensity-level changes, is known as ____________________
a) Bit-plane slicing
b) Intensity Highlighting
c) Byte-Slicing
d) None of the Mentioned

Answer: a
Explanation: It is called bit-plane slicing: the image is decomposed into its individual bit planes, showing the contribution each bit makes to the total image.
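
A minimal sketch of extracting a single bit plane from an 8-bit image, assuming NumPy:

```python
import numpy as np

img = np.random.randint(0, 256, (64, 64)).astype(np.uint8)  # stand-in 8-bit image

# Bit plane k (k = 0 is the least significant bit, k = 7 the most significant).
k = 7
plane = (img >> k) & 1   # binary image: the contribution of bit k to each pixel

# Higher-order planes carry most of the visually significant detail, while
# the low-order planes tend to look like noise.
```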

49. Which gray-level transformation increases the dynamic range of gray-level in the image?
a) Negative transformations
b) Contrast stretching
c) Power-law transformations
d) None of the mentioned

Answer: b
Explanation: The primary principle behind contrast stretching is to increase the dynamic range of gray-levels in an image.
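
A minimal min-max contrast-stretching sketch, assuming NumPy (the narrow input range is a stand-in for a low-contrast image):

```python
import numpy as np

def contrast_stretch(img, out_min=0, out_max=255):
    """Linearly map the image's own min/max onto the full output range."""
    img = img.astype(float)
    lo, hi = img.min(), img.max()
    if hi == lo:                                  # flat image: nothing to stretch
        return np.full(img.shape, out_min, dtype=np.uint8)
    stretched = (img - lo) / (hi - lo) * (out_max - out_min) + out_min
    return stretched.astype(np.uint8)

narrow = np.random.randint(100, 140, (64, 64))    # stand-in low-contrast image
wide   = contrast_stretch(narrow)                 # now spans the full 0-255 range
```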

50. What is/are the gray-level slicing approach(es)?
a) To brighten the pixels gray-value of interest and preserve the background
b) To give all gray level of a specific range high value and a low value to all other gray levels
c) All of the mentioned
d) None of the mentioned

Answer: c
Explanation: Gray-level slicing can be done in one of two ways:
One method is to assign a high value to all grey levels in a certain range and a low value to all other grey levels.
The second method is to brighten the pixels with the gray-value of interest while leaving the background alone.
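
A minimal NumPy sketch of both approaches (the intensity range of interest is arbitrary):

```python
import numpy as np

img = np.random.randint(0, 256, (64, 64)).astype(np.uint8)  # stand-in image
lo, hi = 100, 150                                           # range of interest

# Approach 1: high value for the range of interest, low value for everything else.
sliced_binary = np.where((img >= lo) & (img <= hi), 255, 0).astype(np.uint8)

# Approach 2: brighten the range of interest, preserve the rest of the image.
sliced_preserved = np.where((img >= lo) & (img <= hi), 255, img).astype(np.uint8)
```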


Chapterwise Multiple Choice Questions on Digital Image Processing


Our 1000+ MCQs focus on all topics of the Digital Image Processing subject, covering 100+ topics. This will help you to prepare for exams, contests, online tests, quizzes, viva-voce, interviews, and certifications. You can practice these MCQs chapter by chapter starting from the 1st chapter or you can jump to any chapter of your choice.
  1. Basics of Digital Image Processing
  2. Digital Image Fundamentals
  3. Intensity Transformations and Spatial Filtering
  4. Filtering in Frequency Domain
  5. Image Restoration and Reconstruction
  6. Color Image Processing
  7. Image Compression
  8. Morphological Image Processing
  9. Image Segmentation
  10. Representation and Description
  11. Wavelet based Image Processing
  12. Image Enhancement
  13. Object Recognition

1. MCQ on Basics of Digital Image Processing

The section contains multiple choice questions and answers on digital image processing introduction, steps and components.

  • Introduction to Digital Image Processing
  • Steps in Image Processing
  • Components of Image Processing System

2. Multiple Choice Question on Digital Image Fundamentals

The section contains questions and answers on image sampling and quantization, image sensing and acquisition, electromagnetic spectrum, relationship between pixels, mathematical tools of digital image processing.

  • Basics Of Image Sampling & Quantization
  • Representing Digital Images
  • Image Sampling and Quantization
  • Image Sensing and Acquisition
  • Light and the Electromagnetic Spectrum
  • Mathematical Tools in Digital Image Processing
  • Basic Relationships between Pixels

3. Digital Image Processing MCQ on Intensity Transformations and Spatial Filtering

The section contains MCQs on smoothing and sharpening spatial filters, intensity transformation functions, spatial filtering and its fundamentals, spatial enhancement methods, histogram processing, smoothing linear and non-linear spatial filters, fuzzy techniques for intensity, transformation and filtering, unsharp masking, intensity transformation techniques, piecewise-linear transformation functions, noise reduction by spatial and frequency domain filtering.

  • Smoothing Spatial Filters
  • Basic Intensity Transformation Functions
  • Sharpening Spatial Filters-1
  • Sharpening Spatial Filters-2
  • Sharpening Spatial Filters-3
  • Combining Spatial Enhancements Methods
  • Fundamentals of Spatial Filtering
  • Histogram Processing-1
  • Histogram Processing-2
  • Smoothing Spacial Filters
  • Smoothing Linear Spatial Filters
  • Smoothing Nonlinear Spatial Filters
  • Spatial Filtering
  • Filtering in Frequency Domain
  • Smoothing Frequency-Domain Filters
  • Unsharp Masking, High-boost filtering and Emphasis Filtering
  • Homomorphic filtering
  • Intensity Transformation Functions
  • Fuzzy Techniques – Transformations and Filtering
  • Piecewise-Linear Transformation Functions
  • Noise Reduction by Spatial Filtering
  • Fuzzy Techniques for Intensity
  • Discrete Fourier Transform Implementation
  • Noise Reduction by Frequency Domain Filtering
  • Smoothing and Sharpening
  • Noise Models

4. Digital Image Processing MCQ on Filtering in Frequency Domain

The section contains multiple choice questions and answers on frequency domain filtering basics, dft of one and two variables, fourier transform of sampled functions, image sharpening, smoothing and implementation, 2-d discrete fourier transform, sampling and selective filtering.

  • Gaussian Lowpass and Sharpening Frequency-Domain Filters
  • Basics of Frequency Domain Filtering
  • DFT of One Variable
  • DFT of Two Variables
  • Fourier Transform of Sampled Functions
  • Image Sharpening
  • Image Smoothing
  • Implementation
  • Preliminary Concepts of Frequency Domain Filtering
  • Properties of 2-D Discrete Fourier Transform
  • Sampling
  • Selective Filtering
  • Filtering in the Frequency Domain

5. Digital Image Processing Multiple Choice Question on Image Restoration and Reconstruction

The section contains questions and answers on relationship between pixels, visual perception, adaptive filters, bandpass and band reject filters, geometric mean filters, inverse filters, notch and static filters, wiener filtering, fourier transform of functions and variables, noise restoration and reduction, least squares filtering and degradation function estimation.

  • Elements of Visual Perception
  • Relationships between Pixels
  • Adaptive Filters
  • Bandpass and Bandreject Filters
  • Constrained Least Squares Filtering
  • Estimating the Degradation Function
  • Geometric Mean Filter
  • Image Reconstructions from Projections
  • Inverse Filtering
  • Mean Filters
  • Model of Image Restoration and Degradation Process
  • Notch Filters
  • Order Statistic Filters
  • Wiener Filtering
  • Sampling – Fourier Transform of Sampled Functions
  • Discrete Fourier Transform of One Variable
  • Extension to Functions of Two Variables
  • Restoration in the Presence of Noise
  • Periodic Noise Reduction
  • Linear, Position-Invariant Degradations
  • Degradation Function Estimation
  • Constrained Least Squares Filtering

6. MCQ on Color Image Processing

The section contains MCQs on color transformation and segmentation, full color and pseudo color image processing, image construction and formulation, color slicing and correction.

  • Color Fundamentals
  • Color Models
  • Color Transformations
  • Color Segmentation
  • Full Color Image Processing
  • Noise in Color Images
  • Pseudo color Image Processing
  • Image Segmentation based on Color
  • Color Image Compression
  • Geometric Transformations
  • Color Formulation
  • Color Slicing
  • Color Correction

7. Digital Image Processing Multiple Choice Question on Image Compression

The section contains multiple choice questions and answers on compression methods basics and fundamentals, bit plane and block transform coding, digital image watermarking, run length and symbol based coding, lossy and error free compression, image compression standards and models, multiresolution expansions and compression methods.

  • Basic Compression Methods
  • Bit Plane Coding
  • Block Transform Coding
  • Digital Image Watermarking
  • Fundamentals of Image Compression
  • Lossless Predictive Coding
  • Lossy Predictive Coding
  • Run Length Coding
  • Symbol Based Coding
  • Multiresolution Expansions
  • Redundancy in Images
  • Image Compression Models
  • Error Free Compression
  • Lossy Compression
  • Image Compression Standards
  • Compression Methods

8. Multiple Choice Question on Morphological Image Processing

The section contains questions and answers on boundary extraction, convex hull, erosion and dilation, gray scale morphology, hit or miss transform, morphological reconstruction, skeletons and pruning, thinning and thickening, morphological algorithms, grey scale morphology applications.

  • Boundary Extraction and Hole Filling
  • Convex Hull
  • Erosion and Dilation
  • Extraction of Connected Components
  • Gray Scale Morphology
  • Hit or Miss Transform
  • Morphological Reconstruction
  • Opening and Closing
  • Skeletons and Pruning
  • Thinning and Thickening
  • Morphology- Erosion and Dilation
  • Morphological Algorithms
  • Dilation and Erosion
  • The Hit or Miss Transformation
  • Some Basic Morphological Algorithms
  • Applications of Grey-Scale Morphology

9. Digital Image Processing MCQ on Image Segmentation

The section contains MCQs on edge detection, edge linking and boundary detection, line and point detection, thresholding and variable thresholding, image segmentation, segmentation using morphological watersheds and boundary segments.

  • Advanced Techniques for Edge Detection
  • Edge Detection
  • Edge Linking and Boundary Detection
  • Fundamentals of Image Segmentation
  • Line Detection
  • Multiple Thresholds
  • Point Detection
  • Region Based Segmentation
  • Segmentation Using Morphological Watersheds
  • Thresholding
  • Use of Motion in Segmentation
  • Variable Thresholding
  • Point, Line and Edge Detection
  • Segmentation by Morphological Watershed
  • Boundary Segments

10. Digital Image Processing Multiple Choice Question on Representation and Description

The section contains multiple choice questions and answers on regional and boundary descriptors, boundary following, chain codes, perimeter polygons, principal components for description, signatures, decision recognition, structural methods, detection of discontinuities and relational descriptors.

  • Regional Descriptors
  • Boundary Descriptors
  • Boundary Following
  • Chain Codes
  • Polygonal Approximations Using Minimum Perimeter Polygons
  • Principal Components for Description
  • Signatures
  • Representation
  • Components for Description
  • Recognition Based on Decision
  • Structural Methods
  • Detection Of Discontinuities
  • Relational Descriptors

11. Multiple Choice Question on Wavelet based Image Processing

The section contains questions and answers on wavelet transform in one and two dimensions, fast wavelet transform and wavelet packets.

  • Wavelet Transforms in One Dimension
  • Fast Wavelet Transform
  • Wavelet Transforms in Two Dimensions
  • Wavelet Packets

12. Digital Image Processing MCQ on Image Enhancement

The section contains MCQs on spatial and grey level resolutions, zooming and shrinking, image enhancement basics, histogram equalization, histogram specification, logic and arithmetic operations enhancement, first and second order derivatives for enhancement and laplacian in frequency domain.

  • Spatial and Grey-Level Resolutions and Aliasing
  • Zooming and Shrinking Digital Images
  • Relationship between Pixels and Image Enhancement Basics
  • Basic Grey Level Transformation
  • Histogram Equalization and Processing
  • Histogram Specification and Use of Histogram Statistics for Image Enhancement
  • Enhancement using Logic Operations
  • Enhancement using Arithmetic Operations
  • Use of Second Order Derivative for Enhancement
  • Use of First Order Derivative for Enhancement
  • Laplacian in Frequency Domain
  • Elements of Information Theory

13. Digital Image Processing Multiple Choice Question on Object Recognition

The section contains multiple choice questions and answers on patterns and pattern classes, template and shape matching, optimum statistical classifiers, syntactic recognition of string and trees.

  • Patterns and Pattern Classes
  • Template matching
  • Optimum Statistical Classifiers
  • Neural Networks
  • Matching Shape Numbers
  • Syntactic Recognition of String
  • Syntactic Recognition of Trees

If you would like to learn “Digital Image Processing” thoroughly, you should attempt to work on the complete set of 1000+ MCQs - multiple choice questions and answers mentioned above. It will immensely help anyone trying to crack an exam or an interview.

Wish you the best in your endeavor to learn and master Digital Image Processing!
