DCAMM Seminar - A new stochastic descent method for the efficient solution of structural optimization problems with infinitely many load cases

A DCAMM seminar will be presented by

Professor Michael Stingl
Friedrich-Alexander-University Erlangen-Nürnberg
Lehrstuhl für Angewandte Mathematik 2

Erlangen, Germany



Abstract:

Recently, stochastic gradient (SG) methods have been successfully applied to structural optimization problems with a large number of load cases. The advantage of the SG method is that in each iteration of the algorithm only a single state problem has to be solved, whereas a deterministic method requires the solution of state problems for all individual load cases in every iteration. In this presentation, this idea is generalized in two ways. First, we assume that the objective depends on an arbitrary set of parameters, the only assumption being that this dependency is locally Lipschitz continuous. Second, instead of a discrete parameter set (as in the multiple load case example), we allow for continuous parameter sets. The objective functional, which is minimized over an admissible set, is then given as an integral over these parameters. To name a few examples, the parameters can encode the location of a load in an elastic setting, a wavelength in a time-harmonic setting, or, when properties of particulate systems are optimized, a size distribution in a polydisperse setting.

The presented scheme (SIG) is related to the idea of the stochastic average gradient (SAG) method, which is mainly known from the field of machine learning. In contrast to the SAG method, however, the new SIG scheme requires no discretization of the integral-type objective by a quadrature rule. Thus, apart from its efficiency, a main advantage of the SIG method is its convergence to an optimal solution of the (non-discretized) objective functional. In general, this cannot be expected if a deterministic optimization method or an SG-type algorithm relying on a quadrature rule is applied, since the discretization of the integral can lead to an arbitrary number of artificial local optima. Besides the algorithmic idea, convergence properties as well as numerical results for a linear elastic setting are presented.
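For illustration only, the sketch below shows the general flavour of such a stochastic descent step for an objective given as an integral (expectation) over a continuous parameter set: in each iteration a single parameter value is sampled and only the corresponding gradient is evaluated, instead of summing over all load cases. The simple quadratic model problem, the function names, and the step-size rule are hypothetical and are not taken from the speaker's SIG method.

# Illustrative sketch only (not the SIG/SAG implementation presented in the talk):
# stochastic descent for J(x) = E_omega[ j(x, omega) ], where omega ranges over a
# continuous parameter set. One parameter value is sampled per iteration and a
# single per-parameter gradient is evaluated.
import numpy as np

def grad_j(x, omega):
    # Hypothetical per-parameter gradient of j(x, omega); in structural
    # optimization this step would involve solving one state problem for the
    # load configuration encoded by omega.
    return 2.0 * (x - np.array([np.cos(omega), np.sin(omega)]))

def stochastic_descent(x0, n_iter=10_000, step0=0.5, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for k in range(1, n_iter + 1):
        omega = rng.uniform(0.0, 2.0 * np.pi)   # sample one parameter (e.g. a load angle)
        x -= (step0 / k) * grad_j(x, omega)     # diminishing step size
    return x

print(stochastic_descent([1.0, -1.0]))  # approaches the minimizer of the averaged objective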

Danish pastry, coffee and tea will be served 15 minutes before the seminar starts.

All interested persons are invited.

Time

Fri 05 Apr 19
10:00 - 10:45

Organizer

Where

Building 414, room 061E
Technical University of Denmark
https://www.dcamm.dk/kalender/arrangement?id=8d417026-18c4-4347-a933-793df8e3a2f6