Paper
19 July 2024
An augmentation of random forest through using the principle of justifiable granularity
Linying Liu, Jiangyue Liu, Xiubin Zhu
Proceedings Volume 13181, Third International Conference on Electronic Information Engineering, Big Data, and Computer Technology (EIBDCT 2024); 131815B (2024) https://doi.org/10.1117/12.3031228
Event: Third International Conference on Electronic Information Engineering, Big Data, and Computer Technology (EIBDCT 2024), 2024, Beijing, China
Abstract
In recent years, ensemble learning has attracted increasing attention in machine learning because ensemble techniques have repeatedly been shown to improve model performance. By combining multiple weak learners into a single strong learner, ensemble methods reduce model variance, improve robustness, and mitigate the risk of overfitting; they also help reduce data bias and increase model stability. A well-established ensemble algorithm is the Random Forest, which consists of many decision trees, typically hundreds, each trained on a different subset of the data and a different subset of the features. This combination improves the model's robustness and generalization performance. In a regression task, a Random Forest usually produces its result by simply averaging the predictions of its trees, but this basic averaging can introduce significant bias. To address this issue, the principle of justifiable granularity can be applied to the prediction results of a Random Forest regression model, yielding interval-valued rather than purely numerical outputs. The intervals constructed under this principle are determined by two fundamental characteristics, coverage and specificity, so they carry a quantifiable degree of confidence and offer greater interpretability and credibility than single numerical results.
(2024) Published by SPIE. Downloading of the abstract is permitted for personal use only.
Linying Liu, Jiangyue Liu, and Xiubin Zhu "An augmentation of random forest through using the principle of justifiable granularity", Proc. SPIE 13181, Third International Conference on Electronic Information Engineering, Big Data, and Computer Technology (EIBDCT 2024), 131815B (19 July 2024); https://doi.org/10.1117/12.3031228
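To illustrate the idea described in the abstract, the following is a minimal sketch, not the authors' implementation, of how the principle of justifiable granularity might turn the per-tree predictions of a scikit-learn RandomForestRegressor into an interval: coverage is the fraction of tree predictions falling inside a candidate bound, specificity decays with the distance of the bound from the median, and each bound is chosen to maximize their product. The function name `justifiable_interval`, the exponential specificity form, and the `specificity_decay` parameter are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor


def justifiable_interval(values, specificity_decay=2.0):
    """Build an interval around the median of `values` by maximizing
    coverage * specificity on each side (principle of justifiable granularity)."""
    values = np.sort(np.asarray(values, dtype=float))
    med = float(np.median(values))
    spread = float(values.max() - values.min()) or 1.0  # guard against zero spread

    def best_bound(candidates, side):
        best, best_score = med, -np.inf
        for c in candidates:
            # Coverage: share of predictions lying between the median and the candidate bound.
            if side == "upper":
                coverage = np.mean((values >= med) & (values <= c))
            else:
                coverage = np.mean((values >= c) & (values <= med))
            # Specificity: narrower intervals score higher (assumed exponential form).
            specificity = np.exp(-specificity_decay * abs(c - med) / spread)
            score = coverage * specificity
            if score > best_score:
                best, best_score = c, score
        return best

    upper = best_bound(values[values >= med], "upper")
    lower = best_bound(values[values <= med], "lower")
    return lower, upper


# Hypothetical usage: the per-tree predictions for one test sample become an interval
# instead of being collapsed into a single average.
X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X[:-1], y[:-1])
tree_preds = np.array([tree.predict(X[-1:])[0] for tree in rf.estimators_])

print("mean prediction:", tree_preds.mean())
print("justifiable-granularity interval:", justifiable_interval(tree_preds))
```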
KEYWORDS
Random forests
Decision trees
Machine learning
Data modeling
Fuzzy logic
Design
Statistical modeling