# [SciPy-User] (Possible) new optimization routines - scipy.optimize

9 messages

## [SciPy-User] (Possible) new optimization routines - scipy.optimize

Hi All,

as my team and I are constantly facing very hard/complex numerical optimization problems, I have taken a look at the various *global* optimization routines available in Python and I thought I could throw in a couple of algorithms I implemented, mostly drawing from my previous thesis work.

I have implemented two routines (based on numpy and scipy), namely:

- AMPGO (Adaptive Memory Programming for Global Optimization): my Python implementation of the algorithm described here:

  http://leeds-faculty.colorado.edu/glover/fred%20pubs/416%20-%20AMP%20(TS)%20for%20Constrained%20Global%20Opt%20w%20Lasdon%20et%20al%20.pdf

  I have added a few improvements here and there based on my Master's Thesis work on the standard Tunnelling Algorithm of Levy, Montalvo and Gomez.

- Firefly: my Python implementation of the Firefly algorithm, following the procedure described here:

  http://www.mathworks.com/matlabcentral/fileexchange/29693-firefly-algorithm

As it appears that most numerical optimization "experts" still report their benchmark results based on "CPU time", "elapsed time" or similarly meaningless performance indicators, I have built a fairly sizeable benchmark test suite to compare various optimization routines. The test suite currently contains:

- 18 one-dimensional test functions with multiple local/global minima;
- 85 multivariate problems (where the number of independent variables ranges from 2 to 10), again with multiple local/global minima.

The main page describing the rules, algorithms and motivation is here:

http://infinity77.net/global_optimization/index.html

Algorithm comparisons:

http://infinity77.net/global_optimization/multidimensional.html
http://infinity77.net/global_optimization/univariate.html

Test functions:

http://infinity77.net/global_optimization/test_functions.html

The overall conclusion is that, while the Firefly method's performance is average at best, AMPGO is superior to all the other algorithms I have tried.
However, to be fair, it is an algorithm designed for low-dimensional optimization problems (i.e., 1-10 variables). I haven't published the code for the two algorithms (yet), as they are not quite up to the documentation standards of scipy. However, if there is interest from the community in adding them (or one of them) to scipy.optimize, I will be happy to polish them up and contribute them to this great library. The same applies to the benchmark test suite.

Andrea.

"Imagination Is The Only Weapon In The War Against Reality."
http://www.infinity77.net

# ------------------------------------------------------------- #
def ask_mailing_list_support(email):

    if mention_platform_and_version() and include_sample_app():
        send_message(email)
    else:
        install_malware()
        erase_hard_drives()
# ------------------------------------------------------------- #

_______________________________________________
SciPy-User mailing list
[hidden email]
http://mail.scipy.org/mailman/listinfo/scipy-user
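The tunnelling idea of Levy, Montalvo and Gomez that AMPGO builds on can be sketched in a few lines. This is an illustrative toy, not Andrea's AMPGO code (the `tunnel` and `toy_global_min` helpers are hypothetical): after a local minimum is found, the objective is deflated around it, so that a fresh local search can only settle at a point whose function value is lower than the best one found so far.

```python
import numpy as np
from scipy.optimize import minimize

def tunnel(func, x_best, f_best, x0, lam=1.0):
    # Tunnelling function in the spirit of Levy & Montalvo: deflate the
    # objective around the best minimiser found so far, so a local search
    # can only settle where f(x) dips below f_best (i.e. in another basin).
    def t(x):
        dist2 = np.sum((x - x_best) ** 2)
        return (func(x) - f_best) / (dist2 ** lam + 1e-12)
    return minimize(t, x0, method="Nelder-Mead")

def toy_global_min(func, x0, n_tunnels=20, seed=0):
    # Alternate local minimisation and tunnelling from randomised starts.
    rng = np.random.default_rng(seed)
    res = minimize(func, x0, method="Nelder-Mead")
    x_best, f_best = res.x, res.fun
    for _ in range(n_tunnels):
        start = x_best + rng.normal(scale=2.0, size=x_best.shape)
        cand = tunnel(func, x_best, f_best, start)
        if func(cand.x) < f_best:
            res = minimize(func, cand.x, method="Nelder-Mead")
            if res.fun < f_best:
                x_best, f_best = res.x, res.fun
    return x_best, f_best

# Quartic with two unequal wells: a plain local search from x = 2 stalls
# in the shallower well; tunnelling can escape towards the deeper one.
quartic = lambda x: x[0] ** 4 - 4.0 * x[0] ** 2 + x[0]
x_best, f_best = toy_global_min(quartic, np.array([2.0]))
```

AMPGO proper adds the "adaptive memory" part (a tabu list of already-visited minima steering the restart points) on top of this basic tunnelling loop, per the Lasdon et al. paper linked above.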

## Re: (Possible) new optimization routines - scipy.optimize

On Thu, Feb 14, 2013 at 4:01 PM, Andrea Gavana <[hidden email]> wrote:
> Hi All,
>
> as my team and I are constantly facing very hard/complex numerical
> optimization problems, I have taken a look at the various *global*
> optimization routines available in Python and I thought I could throw
> in a couple of algorithms I implemented, mostly drawing from my
> previous thesis work.
>
> I have implemented two routines (based on numpy and scipy), namely:
>
> - AMPGO: Adaptive Memory Programming for Global Optimization: this is my Python
>   implementation of the algorithm described here:
>
>   http://leeds-faculty.colorado.edu/glover/fred%20pubs/416%20-%20AMP%20(TS)%20for%20Constrained%20Global%20Opt%20w%20Lasdon%20et%20al%20.pdf
>
>   I have added a few improvements here and there based on my Master Thesis work
>   on the standard Tunnelling Algorithm of Levy, Montalvo and Gomez.

This could also be a good addition, similar to basinhopping. From my perspective, this kind of global optimizer is the most promising (compared to the evolutionary ones, ...).

From a quick browse: is the local optimizer fixed to a specific one, or can it be any available solver, as in basinhopping?

The only thing I might worry about is that it only has 6 citations in Google Scholar (which might not mean much if the optimizer is not widely available). Given that there seem to be many variations of this kind of optimizer, it might be good to have some background on comparisons with similar optimizers.

If you have other comparisons of similar optimizers, it would be useful to see them. Also, given that you have a large benchmark suite, you could compare it with the new basinhopping in scipy.optimize.

> - Firefly: the Firefly algorithm, this is my Python implementation of
>   the procedure described here:
>
>   http://www.mathworks.com/matlabcentral/fileexchange/29693-firefly-algorithm

The fileexchange has a large number of "animal" optimizers, and I doubt they are all good. I think there needs to be a convincing case before any of them is added to scipy. On the other hand, having them available outside of scipy would make it easier to try them out (and see if fireflies, or bees, or ants are doing better :)

Thank you for proposing this, from a potential user. I expect to try out basinhopping soon on estimating parameters of mixture distributions.

Josef
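For readers who want to try the basinhopping routine Josef mentions: its local solver is pluggable through `minimizer_kwargs`, so any `scipy.optimize.minimize` method can serve as the local step. A minimal usage sketch on the 2-D Rastrigin function (the choice of test function and parameter values here is illustrative):

```python
import numpy as np
from scipy.optimize import basinhopping

def rastrigin(x):
    # Classic multimodal benchmark: global minimum 0 at the origin,
    # surrounded by a regular grid of local minima.
    return 10.0 * x.size + np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x))

# The local searches use L-BFGS-B; any scipy.optimize.minimize
# method could be passed instead via minimizer_kwargs.
res = basinhopping(rastrigin, x0=np.array([3.7, -2.3]),
                   minimizer_kwargs={"method": "L-BFGS-B"},
                   niter=500, seed=1)
```

On smooth problems, supplying an analytic gradient through `minimizer_kwargs={"method": "L-BFGS-B", "jac": grad}` typically cuts the function-evaluation count substantially compared to numerical differentiation.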
## Re: (Possible) new optimization routines - scipy.optimize

In reply to this post by Andrea Gavana

Andrea Gavana <[hidden email]> writes:
> Hi All,
>
> as my team and I are constantly facing very hard/complex numerical
> optimization problems, I have taken a look at the various *global*
> optimization routines available in Python and I thought I could throw
> in a couple of algorithms I implemented, mostly drawing from my
> previous thesis work.
>
> I haven't published the code for the two algorithms (yet), as they are
> not quite up to the documentation standards of scipy. However, if
> there is interest from the community to add them (or one of them) to
> scipy.optimize I will be happy to polish them up and contribute to
> this great library. The same applies to the benchmark test suite.
>
> Andrea.

I can't speak for the maintainers; however, as a user I'm definitely interested in scipy having more optimization capabilities, especially in the tricky global optimization area. Having to install a different package for each optimiser you want to try is a pain, especially on Windows, where they often require extensive knowledge of both distutils and general compilation of C/C++ software.

Thanks,
Dave

## Re: (Possible) new optimization routines - scipy.optimize

In reply to this post by josef.pktd

On 14 February 2013 22:53, <[hidden email]> wrote:
> On Thu, Feb 14, 2013 at 4:01 PM, Andrea Gavana <[hidden email]> wrote:
>> [...]
>
> This could also be a good addition, similar to basinhopping.
> From my perspective, this kind of global optimizer is the most
> promising (compared to the evolutionary ones, ...).
>
> From a quick browse: is the local optimizer fixed to a specific one,
> or can it be any available solver, as in basinhopping?
>
> The only thing I might worry about is that it only has 6 citations in
> Google Scholar (which might not mean much if the optimizer is not
> widely available). Given that there seem to be many variations of this
> kind of optimizer, it might be good to have some background on
> comparisons with similar optimizers.
>
> If you have other comparisons of similar optimizers, it would be
> useful to see them. Also, given that you have a large benchmark suite,
> you could compare it with the new basinhopping in scipy.optimize.
OK, I have run the test suite with basinhopping as well, even though I had to modify it a bit to support the maximum number of function evaluations and the tolerance on achieving the global optimum.

Results are here:

- N-D : http://infinity77.net/global_optimization/multidimensional.html
- 1-D : http://infinity77.net/global_optimization/univariate.html

Overall it appears to have average performance, even though I must stress again that, while my test suite is relatively large, it only encompasses low-dimensional problems (1-10 variables), and my stopping/convergence criteria may not be applicable to everyone else's needs.

Andrea.

"Imagination Is The Only Weapon In The War Against Reality."
http://www.infinity77.net
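Andrea's modifications aren't shown, but the general technique of imposing a function-evaluation budget and a known-optimum tolerance on a solver that lacks them natively is easy to sketch. Everything below (`StopCondition`, `BudgetedObjective`) is a hypothetical illustration, not his benchmark code: the objective is wrapped in a counter that raises an exception once either rule triggers, aborting whatever optimizer is driving it.

```python
import numpy as np
from scipy.optimize import basinhopping

class StopCondition(Exception):
    """Raised to abort the optimizer once a stopping rule triggers."""

class BudgetedObjective:
    # Wraps an objective so any optimizer stops after `max_evals`
    # evaluations, or once it comes within `tol` of a known global optimum.
    def __init__(self, func, max_evals=2000, f_glob=None, tol=1e-6):
        self.func, self.max_evals = func, max_evals
        self.f_glob, self.tol = f_glob, tol
        self.nfev, self.best = 0, np.inf

    def __call__(self, x):
        if self.nfev >= self.max_evals:
            raise StopCondition("evaluation budget exhausted")
        self.nfev += 1
        f = self.func(x)
        self.best = min(self.best, f)
        if self.f_glob is not None and self.best - self.f_glob <= self.tol:
            raise StopCondition("within tolerance of the global optimum")
        return f

# Drive basinhopping with the wrapped sphere function; the exception
# aborts the run as soon as a stopping rule fires.
obj = BudgetedObjective(lambda x: np.sum(x ** 2),
                        max_evals=500, f_glob=0.0, tol=1e-8)
try:
    basinhopping(obj, x0=np.array([5.0, 5.0]), niter=100)
except StopCondition:
    pass
```

The same wrapper works unchanged with any optimizer that simply calls the objective, which is what makes cross-solver benchmarks with a common budget possible.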

## Re: (Possible) new optimization routines - scipy.optimize

On Mon, Feb 18, 2013 at 11:41 AM, Andrea Gavana <[hidden email]> wrote:
> On 14 February 2013 22:53, <[hidden email]> wrote:
>> On Thu, Feb 14, 2013 at 4:01 PM, Andrea Gavana <[hidden email]> wrote:
>>> [...]
>>
>> If you have other comparisons of similar optimizers, it would be
>> useful to see them. Also, given that you have a large benchmark suite,
>> you could compare it with the new basinhopping in scipy.optimize.
>
> OK, I have run the test suite with basinhopping as well, even though I
> had to modify it a bit to support the maximum number of function
> evaluations and the tolerance on achieving the global optimum.

Anything that might be useful to incorporate in the scipy version?

> Results are here:
>
> - N-D : http://infinity77.net/global_optimization/multidimensional.html
> - 1-D : http://infinity77.net/global_optimization/univariate.html
>
> Overall it appears to have average performance, even though I must
> stress again that, while my test suite is relatively large, it only
> encompasses low-dimensional problems (1-10 variables) and my
> stopping/convergence criteria may not be applicable to everyone else's
> needs.

Interesting results; AMPGO looks good. I'm a bit surprised that the number of function evaluations of basinhopping is relatively large. (One possible difference from your benchmark is that with smooth models and numerical derivatives, a more efficient local optimizer could be chosen.)

Two things I saw when I looked at your benchmark before:

DE has several cases that are complementary to AMPGO; it might be interesting in applications to cross-check results across optimizers.

Why do DE and some others have a success rate of either 0 or 100 and nothing in between? I don't see what could cause this result.

Thanks,
Josef
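One plausible explanation for the 0-or-100 observation (an assumption about the benchmark setup, not something confirmed in the thread): if the success rate is the fraction of repeated runs that reach the known optimum, then a solver that behaves identically on every run, because it does not re-seed or re-randomise its starting point, can only score 0 or 100. A toy illustration, with stand-in "solvers" that just return a final objective value:

```python
import numpy as np

def success_rate(solver, runs=100, f_glob=0.0, tol=1e-6):
    # Percentage of independent runs ending within `tol` of the known optimum.
    wins = sum(abs(solver(run_id) - f_glob) <= tol for run_id in range(runs))
    return 100.0 * wins / runs

# A deterministic "solver" returns the same value on every run, so its
# success rate is necessarily all-or-nothing.
deterministic = lambda run_id: 0.5                    # always misses

# A stochastic solver re-randomised per run can land anywhere in between.
def stochastic(run_id):
    rng = np.random.default_rng(run_id)
    return 0.0 if rng.random() < 0.7 else 0.5         # succeeds on most runs
```

Under this reading, intermediate success rates are only possible when something varies between runs (seed, start point, or problem instance).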