Journal title: PLoS ONE
Date Published: 2018-03-16
Publication Volume: 13
Publication Issue: 3
Publication Begin page: e0192944
Abstract
When standard optimization methods fail to find a satisfactory solution for a parameter fitting problem, a tempting recourse is to adjust parameters manually. While tedious, this approach can be surprisingly powerful in terms of achieving optimal or near-optimal solutions. This paper outlines an optimization algorithm, Adaptive Stochastic Descent (ASD), that has been designed to replicate the essential aspects of manual parameter fitting in an automated way. Specifically, ASD uses simple principles to form probabilistic assumptions about (a) which parameters have the greatest effect on the objective function, and (b) optimal step sizes for each parameter. We show that for a certain class of optimization problems (namely, those with a moderate to large number of scalar parameter dimensions, especially if some dimensions are more important than others), ASD is capable of minimizing the objective function with far fewer function evaluations than classic optimization methods, such as the Nelder-Mead nonlinear simplex, Levenberg-Marquardt gradient descent, simulated annealing, and genetic algorithms. As a case study, we show that ASD outperforms standard algorithms when used to determine how resources should be allocated in order to minimize new HIV infections in Swaziland.
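To make the two adaptive ideas named in the abstract concrete, the following Python sketch illustrates one way an ASD-style loop could work. This is not the authors' reference implementation; the function and parameter names (asd_sketch, grow, shrink, step_init) are assumptions made for illustration only. It shows per-parameter selection probabilities and per-parameter step sizes, both scaled up after steps that improve the objective and scaled down after steps that do not.

```python
import numpy as np

def asd_sketch(objective, x0, maxiters=1000, step_init=0.1,
               grow=2.0, shrink=0.5, rng=None):
    """Minimal sketch of an Adaptive-Stochastic-Descent-style optimizer.

    Keeps, for each (parameter, direction) pair, a selection probability and
    a step size; both are adapted according to whether previous steps in that
    direction reduced the objective.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)
    n = x.size
    # One entry per (parameter, direction): first n are +steps, last n are -steps.
    steps = np.full(2 * n, step_init, dtype=float)
    probs = np.full(2 * n, 1.0 / (2 * n))
    fbest = objective(x)

    for _ in range(maxiters):
        # Pick a parameter and direction, weighted by past success.
        choice = rng.choice(2 * n, p=probs / probs.sum())
        param, sign = choice % n, (1.0 if choice < n else -1.0)
        x_trial = x.copy()
        x_trial[param] += sign * steps[choice]
        f_trial = objective(x_trial)
        if f_trial < fbest:
            # Improvement: accept the step, grow its step size and probability.
            x, fbest = x_trial, f_trial
            steps[choice] *= grow
            probs[choice] *= grow
        else:
            # No improvement: reject the step, shrink its step size and probability.
            steps[choice] *= shrink
            probs[choice] *= shrink
    return x, fbest

# Example: minimize a poorly scaled quadratic, where one dimension matters far more.
if __name__ == "__main__":
    f = lambda v: v[0]**2 + 100.0 * v[1]**2
    xmin, fmin = asd_sketch(f, [3.0, -2.0], maxiters=500)
    print(xmin, fmin)
```

The quadratic example mirrors the problem class highlighted in the abstract: because the second dimension dominates the objective, the sketch quickly learns to propose steps along it more often and with better-scaled sizes.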
Citation
Kerr CC, Dura-Bernal S, Smolinski TG, Chadderdon GL, Wilson DP. Optimization by Adaptive Stochastic Descent. PLoS One. 2018 Mar 16;13(3):e0192944. doi: 10.1371/journal.pone.0192944. PMID: 29547665; PMCID: PMC5856269.
DOI
10.1371/journal.pone.0192944
The following license files are associated with this item:
- Creative Commons
Except where otherwise noted, this item's license is described as Attribution-NonCommercial-NoDerivatives 4.0 International
Related articles
- ASD+M: Automatic parameter tuning in stochastic optimization and on-line learning.
- Authors: Wawrzyński P
- Issue date: 2017 Dec
- A multiresolution stochastic level set method for Mumford-Shah image segmentation.
- Authors: Law YN, Lee HK, Yip AM
- Issue date: 2008 Dec
- Generalized separable parameter space techniques for fitting 1K-5K serial compartment models.
- Authors: Kadrmas DJ, Oktay MB
- Issue date: 2013 Jul
- Parameter estimation in biochemical pathways: a comparison of global optimization methods.
- Authors: Moles CG, Mendes P, Banga JR
- Issue date: 2003 Nov
- Use of a simulated annealing algorithm to fit compartmental models with an application to fractal pharmacokinetics.
- Authors: Marsh RE, Riauka TA, McQuarrie SA
- Issue date: 2007