Sensor-Based Indoor Fire Forecasting Using Transformer Encoder
Abstract
1. Introduction
2. Related Work
3. Method
3.1. Problem
3.2. Solution
4. Experiment
4.1. Dataset
4.2. Result
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Correction Statement
References
- Hochreiter, S.; Schmidhuber, J. Long Short-Term Memory. Neural Comput. 1997, 9, 1735–1780.
- Chung, J.; Gulcehre, C.; Cho, K.; Bengio, Y. Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling. In Proceedings of the NIPS 2014 Deep Learning and Representation Learning Workshop, Montreal, QC, Canada, 12–13 December 2014; pp. 1–9.
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, L.; Polosukhin, I. Attention Is All You Need. In Proceedings of the Advances in Neural Information Processing Systems 30, Long Beach, CA, USA, 4–9 December 2017; pp. 5998–6008.
- Devlin, J.; Chang, M.W.; Lee, K.; Toutanova, K. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. arXiv 2018, arXiv:1810.04805.
- Li, Y.; Miao, N.; Ma, L.; Shuang, F.; Huang, X. Transformer for object detection: Review and benchmark. Eng. Appl. Artif. Intell. 2023, 126, 107021.
- Luptáková, I.D.; Kubovčík, M.; Pospíchal, J. Wearable Sensor-Based Human Activity Recognition with Transformer Model. Sensors 2022, 22, 1911.
- Ting, Y.Y.; Hsiao, C.W.; Wang, H.S. A Data Fusion-Based Fire Detection System. IEICE Trans. Inf. Syst. 2018, E101-D, 977–984.
- Chen, S.; Ren, J.; Yan, Y.; Sun, M.; Hu, F.; Zhao, H. Multi-sourced sensing and support vector machine classification for effective detection of fire hazard in early stage. Comput. Electr. Eng. 2022, 101, 108046.
- Burges, C.J.C. A Tutorial on Support Vector Machines for Pattern Recognition. Data Min. Knowl. Discov. 1998, 2, 121–167.
- Jana, S.; Shome, S.K. Hybrid Ensemble Based Machine Learning for Smart Building Fire Detection Using Multi Modal Sensor Data. Fire Technol. 2023, 59, 473–496.
- Dampage, U.; Bandaranayake, L.; Wanasinghe, R.; Kottahachchi, K.; Jayasanka, B. Forest fire detection system using wireless sensor networks and machine learning. Sci. Rep. 2022, 12, 46.
- Wu, L.; Chen, L.; Hao, X. Multi-Sensor Data Fusion Algorithm for Indoor Fire Early Warning Based on BP Neural Network. Information 2021, 12, 59.
- Nakip, M.; Güzeliş, C.; Yildiz, O. Recurrent Trend Predictive Neural Network for Multi-Sensor Fire Detection. IEEE Access 2021, 9, 84204–84216.
- Li, Y.; Su, Y.; Zeng, X.; Wang, J. Research on Multi-Sensor Fusion Indoor Fire Perception Algorithm Based on Improved TCN. Sensors 2022, 22, 4550.
- Jesubalan, A.; Nallasamy, S.; Anbukaruppusamy, S.; Upreti, K.; Dubey, A.K. Forest fire prediction using IoT and deep learning. Int. J. Adv. Technol. Eng. Explor. 2022, 9, 246–256.
- Liu, P.; Xiang, P.; Lu, D. A new multi-sensor fire detection method based on LSTM networks with environmental information fusion. Neural Comput. Appl. 2023, 35, 25275–25289.
- Qiao, Y.; Jiang, W.; Wang, F.; Su, G.; Li, X.; Jiang, J. FireFormer: An efficient Transformer to identify forest fire from surveillance cameras. Int. J. Wildland Fire 2023, 32, 1364–1380.
- Mardani, K.; Vretos, N.; Daras, P. Transformer-Based Fire Detection in Videos. Sensors 2023, 23, 3035.
- Radford, A.; Narasimhan, K.; Salimans, T.; Sutskever, I. Improving Language Understanding by Generative Pre-Training. Preprint, 2018. Available online: https://paperswithcode.com/paper/improving-language-understanding-by (accessed on 11 February 2024).
- Radford, A.; Wu, J.; Child, R.; Luan, D.; Amodei, D.; Sutskever, I. Language Models are Unsupervised Multitask Learners. OpenAI Blog 2019, 1, 9.
- Brown, T.B.; Mann, B.; Ryder, N.; Subbiah, M.; Kaplan, J.; Dhariwal, P.; Neelakantan, A.; Shyam, P.; Sastry, G.; Askell, A.; et al. Language Models are Few-Shot Learners. arXiv 2020, arXiv:2005.14165.
- Pettersson, J.; Falkman, P. Comparison of LSTM, Transformers, and MLP-mixer neural networks for gaze based human intention prediction. Front. Neurorobot. 2023, 17, 1157957.
- Dandwate, P.; Shahane, C.; Jagtap, V.; Karande, S.C. Comparative study of Transformer and LSTM Network with attention mechanism on Image Captioning. In Proceedings of the International Conference on Information and Communication Technology for Intelligent Systems, Ahmedabad, India, 27–28 April 2023; pp. 527–539.
- Kusumawardani, S.S.; Alfarozi, S.A.I. Transformer Encoder Model for Sequential Prediction of Student Performance Based on Their Log Activities. IEEE Access 2023, 11, 18960–18971.
- Wass, D. Transformer Learning for Traffic Prediction in Mobile Networks. Master’s Thesis, KTH Royal Institute of Technology, Stockholm, Sweden, 2021.
- Bilokon, P.; Qiu, Y. Transformers versus LSTMs for electronic trading. arXiv 2023, arXiv:2309.11400.
- Pascal, V. Indoor Fire Dataset with Distributed Multi-Sensor Nodes. Mendeley Data 2023, V1. Available online: https://data.mendeley.com/datasets/npk2zcm85h/1 (accessed on 11 February 2024).
- Jang, K.; Cho, S.B.; Cho, Y.S.; Son, S. Development of Fire Engine Travel Time Estimation Model for Securing Golden Time. J. Korea Inst. Intell. Transp. Syst. 2020, 19, 1–13.
- Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32.
- Kingma, D.P.; Ba, J. Adam: A Method for Stochastic Optimization. In Proceedings of the 3rd International Conference on Learning Representations, San Diego, CA, USA, 7–9 May 2015; pp. 1–15.
Table. Composition of the train/test splits for the two datasets.

Dataset | Split | File Names of the Original Dataset | Number of Samples
---|---|---|---
NIST | Train | sdc02, sdc08, sdc11, sdc12, sdc13, sdc30, sdc31, sdc34, sdc35, sdc36, sdc37, sdc38, sdc39, sdc40, sdc41 | normal:fire = 873:586
NIST | Test | sdc07, sdc09, sdc10, sdc14, sdc15, sdc32, sdc33 | normal:fire = 112:113
Pascal | Train | Indoor Fire Dataset with Distributed Multi-Sensor Nodes | normal:nuisance:fire = 3813:199:578
Pascal | Test | Indoor Fire Dataset with Distributed Multi-Sensor Nodes | normal:nuisance:fire = 416:21:66
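For concreteness, the NIST split above can be assembled with a short loader. The sketch below is only a hypothetical example: the per-experiment file names come from the table, but the directory name, the `.csv` extension, and the column layout of the sensor files are assumptions, not details taken from the paper.

```python
# Hypothetical loader reproducing the NIST train/test split from the table.
# File names are from the table; the CSV format is an assumption.
from pathlib import Path
import pandas as pd

NIST_TRAIN = ["sdc02", "sdc08", "sdc11", "sdc12", "sdc13", "sdc30", "sdc31",
              "sdc34", "sdc35", "sdc36", "sdc37", "sdc38", "sdc39", "sdc40", "sdc41"]
NIST_TEST = ["sdc07", "sdc09", "sdc10", "sdc14", "sdc15", "sdc32", "sdc33"]

def load_split(data_dir: str, names: list[str]) -> pd.DataFrame:
    """Concatenate the per-experiment sensor files belonging to one split."""
    frames = [pd.read_csv(Path(data_dir) / f"{name}.csv") for name in names]
    return pd.concat(frames, ignore_index=True)

train_df = load_split("nist_data", NIST_TRAIN)  # expected 873 normal / 586 fire samples
test_df = load_split("nist_data", NIST_TEST)    # expected 112 normal / 113 fire samples
```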
Table. Forecasting performance on the NIST test set.

Model | Accuracy | FPR | TPR | TPR − FPR
---|---|---|---|---
SVM | 0.6281 | 0.6964 | 0.9499 | 0.2535
RF (200 estimators) | 0.6400 | 0.4256 | 0.7050 | 0.2794
GRU (4 layers) | 0.6089 | 0.7679 | 0.9823 | 0.2144
Transformer encoder stack (1 layer) | 0.6948 | 0.4911 | 0.8791 | 0.3880
Transformer encoder stack (2 layers) | 0.6518 | 0.6101 | 0.9115 | 0.3014
Transformer encoder stack (4 layers) | 0.6963 | 0.4405 | 0.8319 | 0.3914
Transformer encoder stack (6 layers) | 0.6444 | 0.5119 | 0.7994 | 0.2875
Transformer encoder stack (8 layers) | 0.6652 | 0.2798 | 0.6106 | 0.3308
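The columns of both result tables read as follows: FPR is the fraction of normal samples wrongly flagged as fire, TPR is the fraction of fire samples correctly flagged, and TPR − FPR (Youden's J statistic) summarizes the trade-off between the two. A minimal sketch of these metrics, assuming binary labels with fire as the positive class (function and variable names are illustrative, not from the paper):

```python
# Metrics reported in the tables: accuracy, false positive rate (FPR),
# true positive rate (TPR), and their difference (TPR - FPR, Youden's J).
import numpy as np

def fire_metrics(y_true: np.ndarray, y_pred: np.ndarray) -> dict:
    tp = np.sum((y_true == 1) & (y_pred == 1))  # fires correctly forecast
    tn = np.sum((y_true == 0) & (y_pred == 0))  # normal correctly passed
    fp = np.sum((y_true == 0) & (y_pred == 1))  # false alarms
    fn = np.sum((y_true == 1) & (y_pred == 0))  # missed fires
    tpr = tp / (tp + fn)
    fpr = fp / (fp + tn)
    return {
        "accuracy": (tp + tn) / len(y_true),
        "fpr": fpr,
        "tpr": tpr,
        "tpr_minus_fpr": tpr - fpr,
    }
```

Note that a high TPR alone is not informative if the FPR is also high, as in the GRU (4 layers) row above, which is why TPR − FPR is the more telling summary column.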
Table. Forecasting performance on the Pascal test set.

Model | Accuracy | FPR | TPR | TPR − FPR
---|---|---|---|---
SVM | 0.9616 | 0.0091 | 0.8125 | 0.8034
RF (200 estimators) | 0.9907 | 0.0023 | 0.9688 | 0.9665
GRU (2 layers) | 0.9607 | 0.0334 | 0.9167 | 0.8833
Transformer encoder stack (1 layer) | 0.9734 | 0.0091 | 0.8594 | 0.8503
Transformer encoder stack (2 layers) | 0.9766 | 0.0091 | 0.8958 | 0.8867
Transformer encoder stack (4 layers) | 0.9726 | 0.0091 | 0.8698 | 0.8607
Transformer encoder stack (6 layers) | 0.9766 | 0.0114 | 0.8906 | 0.8792
Transformer encoder stack (8 layers) | 0.9744 | 0.0099 | 0.8802 | 0.8703
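The "Transformer encoder stack (N layers)" rows refer to classifiers built from N stacked Transformer encoder layers applied to windows of multi-sensor readings. The PyTorch sketch below shows one way such a model can be assembled; the hyperparameters (`d_model`, head count, window length, feature count), the learned positional embedding, and the mean-pooling readout are illustrative assumptions, not the configuration used in the paper.

```python
# Minimal sketch of an N-layer Transformer encoder stack over sensor windows.
import torch
import torch.nn as nn

class SensorFireClassifier(nn.Module):
    def __init__(self, n_features: int, n_classes: int, n_layers: int,
                 d_model: int = 64, n_heads: int = 4, seq_len: int = 30):
        super().__init__()
        self.input_proj = nn.Linear(n_features, d_model)               # embed each time step
        self.pos_emb = nn.Parameter(torch.zeros(1, seq_len, d_model))  # learned positions
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_features) window of multi-sensor readings
        h = self.encoder(self.input_proj(x) + self.pos_emb)
        return self.head(h.mean(dim=1))  # pool over time, then classify

# e.g. a 4-layer stack for the 3-class Pascal setting (normal/nuisance/fire):
model = SensorFireClassifier(n_features=8, n_classes=3, n_layers=4)
logits = model(torch.randn(16, 30, 8))  # 16 windows -> (16, 3) class logits
```

Varying `n_layers` from 1 to 8 corresponds to the stack depths compared in the table rows.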
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Jeong, Y.-S.; Hwang, J.; Lee, S.; Ndomba, G.E.; Kim, Y.; Kim, J.-I. Sensor-Based Indoor Fire Forecasting Using Transformer Encoder. Sensors 2024, 24, 2379. https://doi.org/10.3390/s24072379