An Image-Based User Interface Testing Method for Flutter Programming Learning Assistant System
Abstract
1. Introduction
2. Literature Review
2.1. Programming Education
2.2. UI Testing
2.3. Image Detection Algorithms
3. Adopted Software Tools
3.1. Flask
3.2. OpenCV
3.2.1. ORB
3.2.2. SIFT
3.3. GitHub
3.4. Moodle
4. Review of FPLAS Platform
4.1. Overview of FPLAS
4.2. Usage Procedure and Access Exercises by Student
- Install Docker and VSCode based on the student’s PC operating system.
- Install the three VSCode extensions for Flutter, Remote Development, and Docker.
- Obtain the Docker container image for FPLAS.
- Download the GitHub project containing the essential files, or clone the project if students have already installed Git on their PCs.
- Open the downloaded project in VSCode, start containerized development through remote development, and activate the FPLAS development environment in the Docker container.
- Transfer each exercise to the designated workspace in the container.
- Access the exercise directory. Students navigate to the specific exercise folder, where they can modify the source code according to the provided modification guidance.
- Initiate the Flutter web server by executing “flutter run -d web-server”.
- Preview the output generated by the source code by opening the local web server address “http://localhost:port” in the web browser (a scripted version of these last two steps is sketched below).
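For reference, the last two steps can also be scripted. The following is a minimal Python sketch, assuming the flutter CLI is available on the container’s PATH; the exercise folder name, port, and the use of the webbrowser module are illustrative placeholders, since students normally type the command in the VSCode terminal and open the URL manually.

```python
import subprocess
import webbrowser

PROJECT_DIR = "exercise1"  # placeholder exercise folder
PORT = "8080"              # placeholder port; Flutter prints the actual one

# Start the Flutter web server for the exercise project.
server = subprocess.Popen(
    ["flutter", "run", "-d", "web-server", f"--web-port={PORT}"],
    cwd=PROJECT_DIR,
)

# Preview the output at the local web server address once the server is up.
webbrowser.open(f"http://localhost:{PORT}")
```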
5. Proposal of Image-Based UI Testing Method
5.1. Software Architecture
5.2. Data Pre-Processing for UI Testing
5.2.1. UI Image Capture Step
5.2.2. Black Border Removal Step
5.3. Image-Based UI Testing
- Directory Path: The paths to specific directories are defined for hosting the images and results. They include the “exercise” directory for the correct images of the exercises, the “answer” directory for the corresponding answer images produced by the pre-processing steps, the “result” directory for the processed results, and the “difference” directory for the images that highlight differences between the correct images and the answer images. These paths ensure efficient file management and operations.
- Image Size and Similarity Check: The check_image_size_similarity function ensures that the correct and answer images have exactly matching dimensions. It calculates similarity percentages using either the SIFT or ORB algorithm, depending on whether the sizes match. Both methods detect key points and compute descriptors to match features between images, ultimately providing a similarity score that quantifies how closely the images match (a code sketch of this flow appears after this list).
- Similarity Calculation by SIFT: SIFT is used when images have the same size, providing robust feature matching. It detects key points and computes descriptors in both images, and then matches them using a FLANN-based matcher [37]. Good matches are filtered to calculate the similarity percentage, offering an accurate measurement based on key point matching.
- Similarity Calculation by ORB: ORB is used when images need resizing, offering a faster alternative with lower computational complexity. The resize_image function standardizes image dimensions, ensuring consistency and enhancing similarity calculations. ORB detects key points and computes descriptors, matches them using BFMatcher [38], and calculates similarity based on good matches, providing an efficient method for image comparison.
- Image Difference Highlighting: The results are sorted and saved in a CSV file, providing a comprehensive analysis of image similarities. Then, to identify and highlight the differences between the images, the highlight_image_difference function computes the absolute difference between the images, applies a threshold to create a binary mask, and dilates this mask to enhance visibility. The differences are highlighted in red on the original image, and the result is saved for review by the student or teacher.
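To make the flow above concrete, the following Python sketch combines the listed steps using OpenCV and NumPy. The directory names and the function names check_image_size_similarity, resize_image, and highlight_image_difference follow the description above; the ratio-test value, ORB distance cutoff, resize target, difference threshold, the normalization of the similarity score, and the assumption that answer files share the names of the corresponding correct images are illustrative choices, not the paper’s exact settings.

```python
import csv
import os

import cv2
import numpy as np

# Directory layout described above; answer file names are assumed to match
# the corresponding correct-image file names.
EXERCISE_DIR = "exercise"      # correct images for the exercises
ANSWER_DIR = "answer"          # pre-processed student answer images
RESULT_DIR = "result"          # CSV results
DIFFERENCE_DIR = "difference"  # images with highlighted differences

os.makedirs(RESULT_DIR, exist_ok=True)
os.makedirs(DIFFERENCE_DIR, exist_ok=True)


def resize_image(img, size=(800, 600)):
    """Standardize image dimensions before ORB matching (target size assumed)."""
    return cv2.resize(img, size)


def similarity_sift(img1, img2):
    """Same-size images: SIFT key points matched with a FLANN-based matcher."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)
    if des1 is None or des2 is None:
        return 0.0
    flann = cv2.FlannBasedMatcher(dict(algorithm=1, trees=5), dict(checks=50))
    matches = flann.knnMatch(des1, des2, k=2)
    # Lowe's ratio test (0.7 is an assumed threshold) filters good matches.
    good = [m[0] for m in matches
            if len(m) == 2 and m[0].distance < 0.7 * m[1].distance]
    return 100.0 * len(good) / max(len(kp1), len(kp2), 1)


def similarity_orb(img1, img2):
    """Different-size images: resize, then ORB key points with BFMatcher."""
    img1, img2 = resize_image(img1), resize_image(img2)
    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)
    if des1 is None or des2 is None:
        return 0.0
    bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = bf.match(des1, des2)
    good = [m for m in matches if m.distance < 60]  # assumed distance cutoff
    return 100.0 * len(good) / max(len(kp1), len(kp2), 1)


def check_image_size_similarity(correct_path, answer_path):
    """Use SIFT when the sizes match exactly; otherwise resize and use ORB."""
    img1 = cv2.imread(correct_path, cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread(answer_path, cv2.IMREAD_GRAYSCALE)
    if img1.shape == img2.shape:
        return similarity_sift(img1, img2)
    return similarity_orb(img1, img2)


def highlight_image_difference(correct_path, answer_path, out_path):
    """Mark regions that differ from the correct image in red and save them."""
    img1 = cv2.imread(correct_path)
    img2 = cv2.imread(answer_path)
    img2 = cv2.resize(img2, (img1.shape[1], img1.shape[0]))
    diff = cv2.absdiff(img1, img2)
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 30, 255, cv2.THRESH_BINARY)  # assumed value
    mask = cv2.dilate(mask, np.ones((5, 5), np.uint8), iterations=2)
    highlighted = img1.copy()
    highlighted[mask > 0] = (0, 0, 255)  # red in BGR
    cv2.imwrite(out_path, highlighted)


# Compare every answer image against its correct image and save a CSV report.
rows = []
for name in sorted(os.listdir(ANSWER_DIR)):
    correct = os.path.join(EXERCISE_DIR, name)
    answer = os.path.join(ANSWER_DIR, name)
    score = check_image_size_similarity(correct, answer)
    highlight_image_difference(correct, answer,
                               os.path.join(DIFFERENCE_DIR, name))
    rows.append((name, round(score, 2)))

rows.sort(key=lambda r: r[1], reverse=True)  # sort results before saving
with open(os.path.join(RESULT_DIR, "similarity.csv"), "w", newline="") as f:
    csv.writer(f).writerows([("image", "similarity_percent"), *rows])
```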
Web Interface
- Image Retrieval: Upon receiving the request, the application fetches the selected correct and answer images from the specified folders and prepares the paths for storing and accessing the result files.
- Comparison Process: The application checks whether the CSV file containing the similarity results exists for the selected exercise, and if not, it computes the image similarities using the defined functions and saves the results in a CSV file for future reference.
- Data Presentation: Once the comparison process is completed, the web interface displays the reference and student answer images for the selected exercise and provides an option to view detailed similarity results through a downloadable CSV file.
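The web-interface flow can be outlined with a small Flask route. This is a hedged sketch only: the route names, form field, template name, per-exercise folder layout, and the run_image_comparison placeholder (standing in for the comparison routine sketched in Section 5.3) are assumptions, while the overall flow of fetching images, reusing or computing the per-exercise CSV, and presenting the results follows the three steps above.

```python
import os

from flask import Flask, render_template, request, send_file

app = Flask(__name__)

EXERCISE_DIR = "exercise"
ANSWER_DIR = "answer"
RESULT_DIR = "result"


def run_image_comparison(exercise, csv_path):
    """Placeholder for the comparison routine sketched in Section 5.3."""
    ...


@app.route("/compare", methods=["GET", "POST"])
def compare():
    # Image retrieval: the exercise selected by the user (form field assumed).
    exercise = request.values.get("exercise", "exercise1")
    csv_path = os.path.join(RESULT_DIR, f"{exercise}.csv")

    # Comparison process: compute similarities only when no cached CSV exists.
    if not os.path.exists(csv_path):
        run_image_comparison(exercise, csv_path)

    # Data presentation: show correct and answer images side by side and
    # offer the detailed similarity results as a downloadable CSV file.
    return render_template(
        "compare.html",  # assumed template name
        exercise=exercise,
        correct_images=sorted(os.listdir(os.path.join(EXERCISE_DIR, exercise))),
        answer_images=sorted(os.listdir(os.path.join(ANSWER_DIR, exercise))),
    )


@app.route("/download/<exercise>")
def download(exercise):
    # Serve the cached CSV so the student or teacher can review the results.
    return send_file(os.path.join(RESULT_DIR, f"{exercise}.csv"),
                     as_attachment=True)
```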
6. Evaluation
6.1. Participant Overview and Methodology
6.2. Five Flutter Projects for Exercises
6.3. Results of Five Exercises
6.3.1. Results of Exercise-1 and Exercise-2
6.3.2. Result of Exercise-3
6.3.3. Result of Exercise-4
6.3.4. Result of Exercise-5
6.4. Discussion
6.4.1. Findings
6.4.2. Automation of Multiple UI Screenshots
6.4.3. Limitations and Future Work
6.4.4. Advantages for University Teachers
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Criollo-C, S.; Guerrero-Arias, A.; Jaramillo-Alcázar, Á.; Luján-Mora, S. Mobile Learning Technologies for Education: Benefits and Pending Issues. Appl. Sci. 2021, 11, 4111. [Google Scholar] [CrossRef]
- McQuiggan, S.; Kosturko, L.; McQuiggan, J.; Sabourin, J. Mobile Learning: A Handbook for Developers, Educators, and Learners; Wiley: Hoboken, NJ, USA, 2015. [Google Scholar]
- Flutter. Available online: https://docs.flutter.dev/ (accessed on 1 June 2024).
- Dart. Available online: https://dart.dev/overview/ (accessed on 1 June 2024).
- Aung, S.T.; Funabiki, N.; Aung, L.H.; Kinari, S.A.; Mentari, M.; Wai, K.H. A Study of Learning Environment for Initiating Flutter App Development Using Docker. Information 2024, 15, 191. [Google Scholar] [CrossRef]
- Jackson, S.; Wurst, K.R. Teaching with VS code DevContainers: Conference workshop. J. Comput. Sci. Coll. 2022, 37, 81–82. [Google Scholar]
- Khan, S.; Usman, R.; Haider, W.; Haider, S.M.; Lal, A.; Kohari, A.Q. E-Education Application using Flutter: Concepts and Methods. In Proceedings of the 2023 Global Conference on Wireless and Optical Technologies (GCWOT), Malaga, Spain, 24–27 January 2023; pp. 1–10. [Google Scholar] [CrossRef]
- Boada, I.; Soler, J.; Prados, F.; Poch, J. A teaching/learning support tool for introductory programming courses. In Proceedings of the Information Technology Based Proceedings of the Fifth International Conference on Higher Education and Training (ITHET), Istanbul, Turkey, 31 May–2 June 2004; pp. 604–609. [Google Scholar] [CrossRef]
- Crow, T.; Luxton-Reilly, A.; Wuensche, B. Intelligent Tutoring Systems for Programming Education: A Systematic Review. In Proceedings of the 20th Australasian Computing Education Conference, Brisbane, Australia, 30 January–2 February 2018; pp. 53–62. [Google Scholar] [CrossRef]
- Keuning, H.; Jeuring, J.; Heeren, B. A Systematic Literature Review of Automated Feedback Generation for Programming Exercises. ACM Trans. Comput. Educ. 2018, 19, 1–43. [Google Scholar] [CrossRef]
- Sun, X.; Li, T.; Xu, J. UI Components Recognition System Based On Image Understanding. In Proceedings of the 2020 IEEE 20th International Conference on Software Quality, Reliability and Security Companion (QRS-C), Macau, China, 11–14 December 2020; pp. 65–71. [Google Scholar] [CrossRef]
- CNN. Available online: https://en.wikipedia.org/wiki/Convolutional_neural_network (accessed on 1 June 2024).
- Khaliq, Z.; Farooq, S.U.; Khan, D.A. A Deep Learning-based Automated Framework for Functional User Interface Testing. Inf. Softw. Technol. 2022, 150, 13. [Google Scholar] [CrossRef]
- Wang, W.; Lam, W.; Xie, T. An infrastructure approach to improving effectiveness of Android UI testing tools. In Proceedings of the 30th ACM SIGSOFT International Symposium on Software Testing and Analysis (ISSTA), Virtual, Denmark, 11–17 July 2021; pp. 165–176. [Google Scholar] [CrossRef]
- UIAutomator. Available online: https://developer.android.com/training/testing/other-components/ui-automator (accessed on 1 June 2024).
- Tareen, S.A.K.; Saleem, Z. A comparative analysis of SIFT, SURF, KAZE, AKAZE, ORB, and BRISK. In Proceedings of the 2018 International Conference on Computing, Mathematics and Engineering Technologies (iCoMET), Sukkur, Pakistan, 3–4 March 2018; pp. 1–10. [Google Scholar] [CrossRef]
- Zhong, B.; Li, Y. Image Feature Point Matching Based on Improved SIFT Algorithm. In Proceedings of the 2019 IEEE 4th International Conference on Image, Vision and Computing (ICIVC), Xiamen, China, 5–7 July 2019; pp. 489–493. [Google Scholar] [CrossRef]
- Gupta, S.; Kumar, M.; Garg, A. Improved object recognition results using SIFT and ORB feature detector. Multimed. Tool. Appl. 2019, 78, 34157–34171. [Google Scholar] [CrossRef]
- Andrianova, E.G.; Demidova, L.A. An Approach to Image Matching Based on SIFT and ORB Algorithms. In Proceedings of the 2021 3rd International Conference on Control Systems, Mathematical Modeling, Automation and Energy Efficiency (SUMMA), Lipetsk, Russian Federation, 10–12 November 2021; pp. 534–539. [Google Scholar] [CrossRef]
- Chhabra, P.; Garg, N.K.; Kumar, M. Content-based image retrieval system using ORB and SIFT features. Neur. Comput. Applic. 2020, 32, 2725–2733. [Google Scholar] [CrossRef]
- Flask. Available online: https://flask.palletsprojects.com/en/3.0.x/ (accessed on 1 June 2024).
- OpenCV. Available online: https://docs.opencv.org/4.x/ (accessed on 1 June 2024).
- ORB. Available online: https://docs.opencv.org/3.4/d1/d89/tutorial_py_orb.html (accessed on 1 June 2024).
- SIFT. Available online: https://docs.opencv.org/4.x/da/df5/tutorial_py_sift_intro.html (accessed on 1 June 2024).
- GitHub. Available online: https://docs.github.com/en (accessed on 1 June 2024).
- Moodle. Available online: https://moodle.org/ (accessed on 1 June 2024).
- Docker. Available online: https://docs.docker.com/get-started/overview/ (accessed on 1 June 2024).
- Visual Studio Code. Available online: https://code.visualstudio.com/docs (accessed on 1 June 2024).
- OS. Available online: https://docs.python.org/3/library/os.html (accessed on 1 June 2024).
- Shutil. Available online: https://docs.python.org/3/library/shutil.html (accessed on 1 June 2024).
- Subprocess. Available online: https://docs.python.org/3/library/subprocess.html (accessed on 1 June 2024).
- Xdotool. Available online: https://pypi.org/project/xdotool/ (accessed on 1 June 2024).
- PyAutoGUI. Available online: https://pypi.org/project/PyAutoGUI/ (accessed on 1 June 2024).
- PIL. Available online: https://pillow.readthedocs.io/en/stable/ (accessed on 1 June 2024).
- Numpy. Available online: https://numpy.org/ (accessed on 1 June 2024).
- Pytesseract. Available online: https://pypi.org/project/pytesseract/ (accessed on 1 June 2024).
- FLANN. Available online: https://docs.opencv.org/3.4/d5/d6f/tutorial_feature_flann_matcher.html (accessed on 1 June 2024).
- BFMatcher. Available online: https://docs.opencv.org/3.4/dc/dc3/tutorial_py_matcher.html (accessed on 1 June 2024).
- Muuli, E.; Tonisson, E.; Lepp, M.; Luik, P.; Palts, T.; Suviste, R.; Papli, K.; Sade, M. Using Image Recognition to Automatically Assess Programming Tasks with Graphical Output. Educ. Inf. Technol. 2020, 25, 5185–5203. [Google Scholar] [CrossRef]
- Combefis, S. Automated Code Assessment for Education: Review, Classification and Perspectives on Techniques and Tools. Software 2022, 1, 3–30. [Google Scholar] [CrossRef]
- Mozgovoy, M.; Pyshkin, E. Unity Application Testing Automation with Appium and Image Recognition. Commun. Comput. Inf. Sci. 2018, 779, 139–150. [Google Scholar] [CrossRef]
- Rainforest QA. Available online: https://www.rainforestqa.com/blog/ui-testing-tools (accessed on 20 July 2024).
- Applitools Eyes. Available online: https://applitools.com/platform/eyes/ (accessed on 20 July 2024).
- Screenster. Available online: https://www.screenster.io/ui-testing-automation-tools-and-frameworks/ (accessed on 20 July 2024).
- Flutter Drive. Available online: https://fig.io/manual/flutter/drive (accessed on 1 June 2024).
- Appium. Available online: https://appium.io/docs/en/latest/ (accessed on 1 June 2024).
- Selenium WebDriver. Available online: https://www.selenium.dev/documentation/webdriver/ (accessed on 1 June 2024).
Exercise | Objective | Modification Guidance
---|---|---
Exercise-1 | Container widget as a fundamental UI element used to encapsulate other widgets. |
Exercise-2 | ListView displays scrollable lists of widgets and manipulates their functionality. |
Exercise-3 | AlertDialog widget for displaying critical information and interacting with users. |
Exercise-4 | BottomNavigationBar, layout widgets, text input, conditional UI updates, and asset management. |
Exercise-5 | Create a simple to-do list app with custom widget and state management, input dialog, list display and management, item addition, basic layout, and styling. |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).