Discrete-Continuous Variational Optimization with Local Gradients

Publication Date: 12/15/2024

Event: OPT2024: 16th Annual Workshop on Optimization for Machine Learning (part of NeurIPS 2024)

Reference: pp. 1-8, 2024

Authors: Jonathan Warrell, NEC Laboratories America, Inc.; Francesco Alesiani, NEC Laboratories Europe; Cameron Smith, Broad Institute; Anja Moesche, NEC Laboratories Europe; Martin Renqiang Min, NEC Laboratories America, Inc.

Abstract: Variational optimization (VO) offers a general approach for handling objectives which may involve discontinuities, or whose gradients are difficult to calculate. Introducing a variational distribution over the parameter space smooths such objectives and renders them amenable to VO methods. However, local gradient information may be available in certain problems, and it is neglected by such an approach. We therefore consider a general method for incorporating local information via an augmented VO objective function to accelerate convergence and improve accuracy. We show how our augmented objective can be viewed as an instance of multilevel optimization. Finally, we show that our method can train a genetic algorithm simulator, using a recursive Wasserstein distance objective.
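To give a concrete feel for the setting described in the abstract, the sketch below illustrates plain variational optimization of a discontinuous objective and a simple way of mixing in local gradient information. It is not the authors' implementation: the toy objective, the Gaussian variational distribution, and the mixing weight `lam` are all assumptions made here for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    """Toy objective with a discontinuity at x = 0 (illustrative, not from the paper)."""
    return (x - 1.0) ** 2 + 2.0 * (x < 0.0)

def local_grad(x):
    """Local (piecewise) gradient, available away from the discontinuity."""
    return 2.0 * (x - 1.0)

def vo_with_local_gradients(mu=3.0, sigma=0.5, lam=0.5, lr=0.05, steps=500, n=64):
    """Minimize E_{x ~ N(mu, sigma^2)}[f(x)] over mu.

    Gradient estimate = (1 - lam) * score-function term + lam * pathwise term
    built from local gradients; a crude stand-in for an augmented VO objective.
    """
    for _ in range(steps):
        eps = rng.standard_normal(n)
        x = mu + sigma * eps
        f = objective(x)
        # Score-function (REINFORCE) gradient of the smoothed objective w.r.t. mu,
        # with the batch mean of f as a baseline to reduce variance.
        g_score = np.mean((f - f.mean()) * (x - mu) / sigma**2)
        # Pathwise gradient using the available local gradient information.
        g_local = np.mean(local_grad(x))
        mu -= lr * ((1.0 - lam) * g_score + lam * g_local)
    return mu

if __name__ == "__main__":
    print("estimated minimizer:", vo_with_local_gradients())
```

On this toy problem the iterate converges near x = 1, the minimizer of the smoothed objective; the local-gradient term mainly reduces estimator variance, which is the kind of benefit the augmented objective in the paper is designed to exploit.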

Publication Link: https://opt-ml.org/papers/2024/paper69.pdf