Application of Convolutional Neural Network-Based Feature Extraction and Data Fusion for Geographical Origin Identification of Radix Astragali by Visible/Short-Wave Near-Infrared and Near Infrared Hyperspectral Imaging
Abstract
1. Introduction
2. Materials and Methods
2.1. Sample Preparation
2.2. Hyperspectral Image Acquisition
2.3. Image Preprocessing and Spectral Extraction
2.4. Data Analysis Methods
2.4.1. Principal Component Analysis
2.4.2. Convolutional Neural Network
2.4.3. Data Fusion Strategy
2.4.4. Traditional Discriminant Model
2.4.5. Model Evaluation
2.4.6. Significance Test
2.4.7. Software
3. Results and Discussion
3.1. Overview of Spectral Profiles
3.2. PCA Score Images
3.3. Discrimination Results of Models Using Full Wavelengths and Extracted Features
3.4. Prediction Maps
3.5. Discrimination Results of Models Using Fusion Strategy
4. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
| Spectra | Model | Calibration accuracy (%) | Validation accuracy (%) | Prediction accuracy (%) |
|---|---|---|---|---|
| Vis-NIR | SVM | 99.477 ± 0.243 a | 98.295 ± 0.572 b | 98.953 ± 0.446 b |
| | LR | 98.418 ± 0.171 b | 98.101 ± 0.704 b | 98.798 ± 0.677 b |
| | CNN | 99.606 ± 0.368 a | 99.961 ± 0.087 a | 99.961 ± 0.087 a |
| NIR | SVM | 99.785 ± 0.105 b | 99.109 ± 0.524 b | 98.837 ± 0.930 b |
| | LR | 99.535 ± 0.067 c | 99.264 ± 0.442 ab | 99.263 ± 0.913 ab |
| | CNN | 99.994 ± 0.014 a | 99.806 ± 0.237 a | 99.923 ± 0.105 a |

Superscript letters in this and the following tables mark the significance groupings from the test described in Section 2.4.6.
| Spectra | Model | Calibration accuracy (%) | Validation accuracy (%) | Prediction accuracy (%) |
|---|---|---|---|---|
| Vis-NIR | SVM | 99.238 ± 0.286 b | 98.295 ± 0.634 b | 98.835 ± 0.566 a |
| | LR | 99.548 ± 0.082 b | 99.380 ± 0.442 a | 99.380 ± 0.347 a |
| | CNN | 99.780 ± 0.053 a | 98.527 ± 0.652 b | 98.682 ± 0.860 a |
| NIR | SVM | 99.703 ± 0.090 b | 99.264 ± 0.420 b | 98.760 ± 0.884 a |
| | LR | 99.968 ± 0.000 a | 99.806 ± 0.237 a | 99.453 ± 0.912 a |
| | CNN | 99.890 ± 0.054 a | 98.953 ± 0.221 b | 98.760 ± 0.446 a |
| Spectra | Model | Calibration accuracy (%) | Validation accuracy (%) | Prediction accuracy (%) |
|---|---|---|---|---|
| Vis-NIR | SVM | 99.612 ± 0.208 a | 99.186 ± 0.373 a | 99.496 ± 0.294 a |
| | LR | 99.683 ± 0.136 a | 99.109 ± 0.591 a | 99.496 ± 0.402 a |
| | CNN | 99.729 ± 0.132 a | 99.070 ± 0.373 a | 99.535 ± 0.221 a |
| NIR | SVM | 100.000 ± 0.000 a | 99.845 ± 0.087 a | 99.884 ± 0.106 a |
| | LR | 99.994 ± 0.014 a | 99.806 ± 0.137 a | 99.767 ± 0.213 a |
| | CNN | 99.974 ± 0.027 a | 99.767 ± 0.212 a | 99.806 ± 0.237 a |
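For orientation, the sketch below shows the kind of one-dimensional spectral CNN that is compared against SVM and LR in the tables above, classifying each extracted spectrum by geographical origin. It is a minimal illustration, not the authors' exact architecture (described in Section 2.4.2): the class name `SpectralCNN`, the layer sizes, kernel widths, and the band and class counts in the usage comment are all assumptions.

```python
import torch
import torch.nn as nn

class SpectralCNN(nn.Module):
    """Small 1-D CNN for per-sample spectral classification (illustrative only)."""

    def __init__(self, n_bands: int, n_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2),   # convolve along the wavelength axis
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * (n_bands // 4), 64),            # compact "deep features"
            nn.ReLU(),
            nn.Linear(64, n_classes),                      # one logit per geographical origin
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_bands) reflectance spectra -> add a channel dimension for Conv1d
        return self.classifier(self.features(x.unsqueeze(1)))

# Hypothetical usage with 256 spectral bands and 4 origin classes:
# model = SpectralCNN(n_bands=256, n_classes=4)
# logits = model(torch.randn(8, 256))
```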
| Fusion strategy | Model | Calibration accuracy (%) | Validation accuracy (%) | Prediction accuracy (%) |
|---|---|---|---|---|
| Measurement fusion | SVM | 99.903 ± 0.069 b | 99.147 ± 0.378 b | 99.535 ± 0.324 a |
| | LR | 99.852 ± 0.029 b | 99.612 ± 0.137 a | 99.690 ± 0.402 a |
| | CNN | 99.981 ± 0.018 a | 99.845 ± 0.087 a | 99.922 ± 0.174 a |
| PCA feature fusion | SVM | 99.897 ± 0.058 b | 99.186 ± 0.373 b | 99.535 ± 0.324 a |
| | LR | 99.987 ± 0.029 a | 99.961 ± 0.087 a | 99.767 ± 0.318 a |
| | CNN | 99.936 ± 0.076 ab | 99.031 ± 0.565 b | 99.457 ± 0.162 a |
| Deep feature fusion | SVM | 100.000 ± 0.000 a | 99.922 ± 0.174 a | 99.922 ± 0.106 a |
| | LR | 99.987 ± 0.018 a | 99.845 ± 0.162 a | 99.922 ± 0.106 a |
| | CNN | 99.981 ± 0.029 a | 99.884 ± 0.260 a | 99.922 ± 0.106 a |
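As a rough illustration of the fusion strategies named in this table, the sketch below implements measurement fusion (concatenating the raw Vis-NIR and NIR spectra of each sample) and PCA feature fusion (concatenating per-sensor PCA scores); deep feature fusion follows the same pattern with CNN activations in place of PCA scores. The function names, component count, and scikit-learn classifier settings are assumptions for illustration, not the authors' exact configuration (see Section 2.4.3).

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def measurement_fusion(x_visnir: np.ndarray, x_nir: np.ndarray) -> np.ndarray:
    """Concatenate the raw Vis-NIR and NIR spectra of each sample band-wise."""
    return np.hstack([x_visnir, x_nir])

def pca_feature_fusion(x_visnir: np.ndarray, x_nir: np.ndarray,
                       n_components: int = 10) -> np.ndarray:
    """Reduce each sensor's spectra with its own PCA, then concatenate the scores."""
    scores_visnir = PCA(n_components=n_components).fit_transform(x_visnir)
    scores_nir = PCA(n_components=n_components).fit_transform(x_nir)
    return np.hstack([scores_visnir, scores_nir])

# Deep feature fusion would substitute activations from a spectral CNN
# (e.g., its penultimate layer) for the PCA scores above.

# One of the classifiers compared in the table, fit on the fused inputs:
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
# clf.fit(pca_feature_fusion(x_visnir, x_nir), y)   # x_*: (n_samples, n_bands); y: origin labels
```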
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).