A Fuzzy Regression Method Based on Monotone Nonparametric Least Squares Technique
DOI: https://doi.org/10.37256/cm.6520256887

Keywords: fuzzy regression, nonparametric, least squares

Abstract
Nonparametric fuzzy regression techniques have proven useful for modeling vague or imprecise variables, particularly when data availability is limited. By eliminating the need for a predefined functional form, they offer greater adaptability than both machine-learning-based and parametric regression approaches: machine learning methods typically require substantial data to produce reliable outputs, while parametric regression can suffer from poor goodness of fit on small samples. The Fuzzy-Monotone Nonparametric Least Squares (Fuzzy-MNLS) method is a new development in this context. Designed to handle triangular fuzzy outputs with crisp inputs, it builds on the fuzzy least squares framework developed by Diamond and splits the regression task into three components: the center and the left and right endpoints. Each component is then fitted within a Monotone Nonparametric Least Squares (MNLS) framework, which allows Fuzzy-MNLS to accommodate both convex and concave behavior within the regression model, improving accuracy and versatility. Unlike machine-learning-based fuzzy regression methods, Fuzzy-MNLS requires no regularization and remains effective when data is sparse. Illustrative examples show that Fuzzy-MNLS consistently yields higher similarity scores and more accurate forecasts than other least squares methods, making it well suited to scenarios where data availability is limited.
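To make the decomposition described in the abstract more concrete, the sketch below is a simplified illustration rather than the paper's implementation. It assumes triangular fuzzy outputs stored as (left, center, right) endpoints, crisp scalar inputs, and uses scikit-learn's IsotonicRegression as a stand-in monotone nonparametric least squares fitter for each of the three component series; the function names are hypothetical.

```python
# A minimal sketch of the center/left/right decomposition, NOT the authors'
# Fuzzy-MNLS implementation. Assumptions: triangular fuzzy outputs given by
# left, center, and right endpoints; crisp scalar inputs; isotonic regression
# as a stand-in for the monotone nonparametric least squares component fit.
import numpy as np
from sklearn.isotonic import IsotonicRegression

def fit_fuzzy_components(x, y_left, y_center, y_right):
    """Fit one monotone (isotonic) regression per endpoint series."""
    models = {}
    for name, y in (("left", y_left), ("center", y_center), ("right", y_right)):
        model = IsotonicRegression(increasing=True, out_of_bounds="clip")
        model.fit(x, y)
        models[name] = model
    return models

def predict_fuzzy(models, x_new):
    """Return predicted triangular fuzzy outputs as rows (left, center, right)."""
    left = models["left"].predict(x_new)
    center = models["center"].predict(x_new)
    right = models["right"].predict(x_new)
    # Enforce left <= center <= right so predictions remain valid triangular
    # fuzzy numbers after the three components are fitted separately.
    center = np.maximum(center, left)
    right = np.maximum(right, center)
    return np.column_stack([left, center, right])

if __name__ == "__main__":
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y_center = np.array([2.1, 2.9, 4.2, 4.0, 5.3])
    y_left = y_center - np.array([0.3, 0.4, 0.5, 0.4, 0.6])   # center minus left spread
    y_right = y_center + np.array([0.4, 0.3, 0.6, 0.5, 0.7])  # center plus right spread
    models = fit_fuzzy_components(x, y_left, y_center, y_right)
    print(predict_fuzzy(models, np.array([1.5, 3.5])))
```

Under these assumptions each endpoint series is fitted independently and the ordering constraint is imposed only at prediction time; the paper's MNLS framework additionally handles mixed convex and concave regions, which this sketch does not attempt to reproduce.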
License
Copyright (c) 2025 William Chung

This work is licensed under a Creative Commons Attribution 4.0 International License.
