
Universidad de Cádiz

Área de Biblioteca, Archivo y Publicaciones

Gradient-enhanced stochastic optimization of high-fidelity simulations

Identifiers

URI: http://hdl.handle.net/10498/32433

DOI: 10.1016/J.CPC.2024.109122

ISSN: 0010-4655

Files
OA_2024_0281.pdf (1.646 MB)
Author/s
Quirós Rodríguez, Alejandro; Fosas de Pando, Miguel Ángel; Sayadi, Taraneh
Date
2024
Department
Ingeniería Mecánica y Diseño Industrial
Source
Computer Physics Communications, 2024, Vol. 298, pp. 1-16
Abstract
Optimization and control of complex unsteady flows remain an important challenge due to the large cost of performing a function evaluation, i.e., a full computational fluid dynamics (CFD) simulation. Reducing the number of required function evaluations would help to decrease the computational cost of the overall optimization procedure. In this article, we consider the stochastic, derivative-free, surrogate-model-based Dynamic COordinate search using Response Surfaces (DYCORS) algorithm and propose several enhancements. First, gradient information is added to the surrogate model to improve its accuracy and enhance the convergence rate of the algorithm. Second, the internal parameters of the radial basis function employed to generate the surrogate model are optimized by minimizing the leave-one-out error in the case of the original algorithm, and by using the gradient information in the case of the gradient-enhanced version. We apply the resulting optimization algorithm to the minimization of the total pressure loss through a linear cascade of blades, and we compare the results obtained with the stochastic algorithms at different Reynolds numbers against a gradient-based optimization algorithm. The results show that stochastic optimization outperforms gradient-based optimization even at very low Reynolds numbers, and that the proposed gradient-enhanced version improves the convergence rate of the original algorithm. An open-source implementation of the gradient-enhanced version of the algorithm is available.
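The second enhancement described in the abstract, tuning the internal (shape) parameter of the radial basis function by minimizing the leave-one-out error, can be illustrated with a minimal sketch. The snippet below is a generic illustration, not the paper's open-source implementation: it uses a Gaussian RBF interpolant, Rippa's closed-form expression for the leave-one-out residuals, and a toy 1-D objective standing in for an expensive CFD evaluation; all names and parameter values are illustrative assumptions.

```python
import numpy as np

def rbf_matrix(x, eps):
    # Gaussian RBF kernel matrix for 1-D sample points x
    r = x[:, None] - x[None, :]
    return np.exp(-(eps * r) ** 2)

def loo_error(x, f, eps):
    # Rippa's closed form for leave-one-out residuals:
    # e_i = c_i / (A^{-1})_{ii}, where A c = f
    Ainv = np.linalg.inv(rbf_matrix(x, eps))
    c = Ainv @ f
    return np.linalg.norm(c / np.diag(Ainv))

# Toy objective standing in for an expensive CFD evaluation
x = np.linspace(0.0, 1.0, 9)
f = np.sin(2.0 * np.pi * x)

# Tune the shape parameter eps by grid search on the LOO error
eps_grid = np.linspace(0.5, 10.0, 40)
errors = [loo_error(x, f, e) for e in eps_grid]
eps_best = eps_grid[int(np.argmin(errors))]

# Build the surrogate with the tuned shape parameter
coeffs = np.linalg.solve(rbf_matrix(x, eps_best), f)

def surrogate(xq):
    # Evaluate the RBF interpolant at a query point xq
    phi = np.exp(-(eps_best * (xq - x)) ** 2)
    return phi @ coeffs
```

In a DYCORS-style loop, the surrogate built this way would be evaluated on many randomly perturbed candidate points to pick the next expensive function evaluation; the gradient-enhanced version described in the paper additionally incorporates gradient samples into the fit.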
Subjects
Stochastic optimization; Surrogate model; Radial basis function; Gradient-enhanced radial basis function; High-fidelity simulation
Collections
  • Artículos Científicos [11595]
  • Articulos Científicos Ing. Mec. [310]
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
