Tuesday, January 31, 2023

Consensus based optimization with memory effects: random selection and applications

Giacomo Borghi, Sara Grassi, Lorenzo Pareschi (Chaos, Solitons & Fractals 174, 113859, 2023. Preprint arXiv:2301.13242)

In this work we extend the class of Consensus-Based Optimization (CBO) metaheuristic methods by considering memory effects and a random selection strategy. The proposed algorithm iteratively updates a population of particles according to a consensus dynamics inspired by social interactions among individuals. The consensus point is computed taking into account the past positions of all particles. While the method shares features with the popular Particle Swarm Optimization (PSO) method, its exploratory behavior is fundamentally different and allows better control over the convergence of the particle system.
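To make the update concrete, here is a minimal sketch of one iteration of a CBO-type scheme with memory in Python/NumPy. The exponential weighting parameter alpha, the drift and noise parameters lam and sigma, the time step dt, and the anisotropic noise are standard choices from the CBO literature and are assumptions here, not details taken from the paper; the array y of "memory" (personal best) positions stands in for the past positions mentioned above.

```python
import numpy as np

def cbo_memory_step(x, y, f, alpha=50.0, lam=1.0, sigma=0.7, dt=0.01, rng=None):
    """One CBO-style iteration with memory (illustrative parameter names).

    x : (N, d) current particle positions
    y : (N, d) memory positions (best position seen by each particle)
    f : objective function, vectorized over rows
    """
    rng = np.random.default_rng() if rng is None else rng
    # Consensus point: weighted average of the memory positions,
    # with Laplace-principle weights exp(-alpha * f) (shifted for stability).
    fy = f(y)
    w = np.exp(-alpha * (fy - fy.min()))
    x_cons = (w[:, None] * y).sum(axis=0) / w.sum()
    # Drift toward the consensus point plus anisotropic exploration noise.
    diff = x_cons - x
    x_new = x + lam * dt * diff + sigma * np.sqrt(dt) * diff * rng.standard_normal(x.shape)
    # Update memory: keep the better of the old memory and the new position.
    better = f(x_new) < fy
    y_new = np.where(better[:, None], x_new, y)
    return x_new, y_new, x_cons

# Toy usage: minimize a quadratic in 10 dimensions with 200 particles.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    f = lambda z: np.sum(z**2, axis=-1)
    x = rng.uniform(-3, 3, size=(200, 10))
    y = x.copy()
    for _ in range(2000):
        x, y, x_cons = cbo_memory_step(x, y, f, rng=rng)
    print("consensus point:", x_cons)
```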
We discuss some implementation aspects which lead to increased efficiency while preserving the success rate of the optimization process. In particular, we show how employing a random selection strategy to discard particles during the computation improves the overall performance. Several benchmark problems and applications to image segmentation and neural network training are used to validate and test the proposed method. A theoretical analysis allows us to recover convergence guarantees under mild assumptions on the objective function. This is done by first approximating the particle evolution with a continuous-in-time dynamics, and then by taking the mean-field limit of that dynamics. Convergence to a global minimizer is finally proved at the mean-field level.
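As a rough illustration of the random selection idea, the hypothetical helper below discards particles uniformly at random down to a target population size; the actual selection schedule and criteria used in the paper may differ.

```python
import numpy as np

def random_selection(x, y, n_target, rng=None):
    """Randomly discard particles until only n_target remain (illustrative schedule).

    Keeps the memory positions aligned with the surviving particles.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = x.shape[0]
    if n <= n_target:
        return x, y
    keep = rng.choice(n, size=n_target, replace=False)  # indices of surviving particles
    return x[keep], y[keep]
```

Shrinking the population during the run reduces the per-iteration cost of evaluating the objective and computing the consensus point, while the memory positions of the surviving particles retain information gathered earlier, which is consistent with the efficiency gains reported above.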