SR-DARTS: Robust Differentiable Neural Architecture Search with Stability Regularization

Authors

  • Huakun Wu Cyberspace Institute of Advanced Technology, Guangzhou University, Guangzhou, 510006, China
  • Gusheng Tian Cyberspace Institute of Advanced Technology, Guangzhou University, Guangzhou, 510006, China
  • Peng Zhang Cyberspace Institute of Advanced Technology, Guangzhou University, Guangzhou, 510006, China https://orcid.org/0000-0001-7973-2746

DOI:

https://doi.org/10.37256/cm.7120268384

Keywords:

neural architecture search, stability regularization

Abstract

Neural Architecture Search (NAS) has emerged as a powerful tool for finding strong neural architectures with minimal manual intervention. However, existing NAS approaches suffer from poor robustness because the search process is sensitive to noise and hyperparameter settings. To this end, we propose a robust Differentiable Architecture Search with Stability Regularization (SR-DARTS for short). SR-DARTS introduces a new stability regularization term into the validation loss function to explicitly constrain the architecture parameters. Specifically, the new regularizer drives the mean of the architecture parameters towards zero during the search process, thereby accelerating the convergence of the search algorithm. Consequently, SR-DARTS reduces the variance error during search. Experimental results demonstrate that SR-DARTS discovers accurate neural network architectures and achieves better performance on benchmark datasets.
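A minimal sketch of how a mean-towards-zero penalty on the architecture parameters could be added to the DARTS validation loss, as the abstract describes. The exact form of the SR-DARTS regularizer is not given here, so the squared-mean penalty, the weight `lam`, and the helper names `stability_regularizer` and `arch_parameters()` are illustrative assumptions rather than the authors' implementation.

import torch

def stability_regularizer(alphas, lam=1e-3):
    # Illustrative penalty (assumed form): push the mean of each group of
    # architecture parameters (alpha) towards zero during the search.
    return lam * sum(a.mean().pow(2) for a in alphas)

# Hypothetical use in the architecture-update (validation) step of a
# DARTS-style search loop:
#   loss_val = criterion(model(x_val), y_val) \
#              + stability_regularizer(model.arch_parameters())
#   loss_val.backward()

The key design point, per the abstract, is that the penalty constrains the architecture parameters directly in the validation loss, rather than modifying the network weights or the supernet training objective.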

Published

2026-01-27

How to Cite

Wu H, Tian G, Zhang P. SR-DARTS: Robust Differentiable Neural Architecture Search with Stability Regularization. Contemp. Math. [Internet]. 2026 Jan. 27 [cited 2026 Feb. 8];7(1):1217-30. Available from: https://ojs.wiserpub.com/index.php/CM/article/view/8384