SR-DARTS: Robust Differentiable Neural Architecture Search with Stability Regularization
DOI:
https://doi.org/10.37256/cm.7120268384

Keywords:
neural architecture search, stability regularization

Abstract
Neural Architecture Search (NAS) has emerged as a powerful tool for finding high-performing neural architectures with minimal manual intervention. However, existing NAS approaches suffer from poor robustness because the search process is sensitive to noise and hyperparameter settings. To address this, we propose a robust Differentiable Architecture Search with Stability Regularization (SR-DARTS). SR-DARTS introduces a new stability regularization term into the validation loss function to explicitly constrain the architecture parameters. Specifically, the regularizer drives the mean of the architecture parameters toward zero during the search process, thereby accelerating the convergence of the search algorithm and reducing the variance error during search. Experimental results demonstrate that SR-DARTS discovers accurate neural network architectures and achieves better performance on benchmark datasets.
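The abstract does not give the exact form of the stability term, so the following is only a minimal sketch of the idea it describes: a penalty added to the DARTS validation loss that pushes the mean of the architecture parameters toward zero. The function names, the squared-mean form of the penalty, and the weighting coefficient `lam` are all assumptions for illustration, not the paper's formulation.

```python
import numpy as np

def stability_regularizer(alpha, lam=0.1):
    """Hypothetical stability term: penalize the squared mean of the
    architecture parameters so their average is driven toward zero.

    alpha -- matrix of architecture parameters (edges x candidate ops)
    lam   -- assumed weighting coefficient for the penalty
    """
    return lam * np.mean(alpha) ** 2

def regularized_val_loss(val_loss, alpha, lam=0.1):
    """Validation objective with the added stability term (sketch)."""
    return val_loss + stability_regularizer(alpha, lam)

# Toy architecture parameters for two edges with two candidate ops each.
alpha = np.array([[0.5, -0.1],
                  [0.2, -0.4]])
# mean(alpha) = 0.05, so the penalty is 0.1 * 0.05**2 = 0.00025
print(regularized_val_loss(1.25, alpha, lam=0.1))  # → 1.25025
```

In a differentiable search loop, the gradient of this term with respect to each entry of `alpha` is nonzero whenever the mean is nonzero, so every architecture parameter receives a uniform pull back toward a zero-mean configuration at each update.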
License
Copyright (c) 2026 Peng Zhang, et al.

This work is licensed under a Creative Commons Attribution 4.0 International License.
