[SciPy-User] (Possible) new optimization routines - scipy.optimize


[SciPy-User] (Possible) new optimization routines - scipy.optimize

Andrea Gavana
Hi All,

    As my team and I are constantly facing very hard, complex numerical
optimization problems, I have taken a look at the various *global*
optimization routines available in Python, and I thought I could throw
in a couple of algorithms I implemented, drawing mostly on my previous
thesis work.

I have implemented two routines (based on numpy and scipy), namely:

- AMPGO: Adaptive Memory Programming for Global Optimization: this is my Python
  implementation of the algorithm described here:

  http://leeds-faculty.colorado.edu/glover/fred%20pubs/416%20-%20AMP%20(TS)%20for%20Constrained%20Global%20Opt%20w%20Lasdon%20et%20al%20.pdf

  I have added a few improvements here and there based on my Master Thesis work
  on the standard Tunnelling Algorithm of Levy, Montalvo and Gomez.

- Firefly: my Python implementation of the Firefly algorithm, as
  described here:

  http://www.mathworks.com/matlabcentral/fileexchange/29693-firefly-algorithm

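For context, the tunnelling idea behind this family of algorithms can be sketched in a few lines. The following is an illustrative toy, not AMPGO itself: the exponential transform, the `alpha` value and the random-restart scheme are assumptions made up for the example.

```python
import numpy as np
from scipy.optimize import minimize

def tunnelling_sketch(f, x0, bounds, n_cycles=20, alpha=0.05, seed=0):
    """Toy tunnelling-style global search: alternate a local minimization
    with a 'tunnelling' phase that minimizes a transformed objective,
    (f(x) - f_best) * exp(alpha / ||x - x_best||), to escape the basin
    of attraction of the current best minimum.  Illustrative only."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T

    # initial local search from the user-supplied starting point
    res = minimize(f, x0, method="Nelder-Mead")
    x_best, f_best = res.x, res.fun

    for _ in range(n_cycles):
        def tunnel(x):
            # the exponential term penalizes points close to x_best,
            # pushing the search out of the current basin
            d = np.linalg.norm(np.asarray(x) - x_best) + 1e-12
            return (f(x) - f_best) * np.exp(alpha / d)

        start = rng.uniform(lo, hi)          # random restart inside the box
        xt = minimize(tunnel, start, method="Nelder-Mead").x
        res = minimize(f, xt, method="Nelder-Mead")
        if res.fun < f_best:
            x_best, f_best = res.x, res.fun
    return x_best, f_best
```

On a 1-D double well such as f(x) = x^4 - x^2 + 0.3x, started in the shallow right-hand basin, the tunnelling cycles routinely escape to the deeper left-hand minimum.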

As it appears that most numerical optimization "experts" still report
their benchmark results in terms of "CPU time", "elapsed time", or
similarly meaningless performance indicators, I have built a fairly
sizeable benchmark suite to test various optimization routines.
The test suite currently contains:

- 18 one-dimensional test functions with multiple local/global minima;
- 85 multivariate problems (where the number of independent variables
ranges from 2 to 10), again with multiple local/global minima.
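
A fairer yardstick is the number of objective evaluations, which needs only a small counting wrapper around the objective. This is a generic illustration, not the suite's actual code:

```python
class CountingFunction:
    """Wrap an objective so every call is counted, enabling solver
    comparisons by number of function evaluations rather than CPU time.
    An optional hard budget raises once the allowance is exhausted."""

    def __init__(self, f, max_evals=None):
        self.f = f
        self.n_evals = 0
        self.max_evals = max_evals

    def __call__(self, x):
        if self.max_evals is not None and self.n_evals >= self.max_evals:
            raise RuntimeError("evaluation budget exhausted")
        self.n_evals += 1
        return self.f(x)
```

The wrapped callable can be passed to any solver unchanged; after the run, `wrapped.n_evals` gives a solver-independent cost measure.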

The main page describing the rules, algorithms and motivation is here:

http://infinity77.net/global_optimization/index.html

Algorithms comparisons:

http://infinity77.net/global_optimization/multidimensional.html
http://infinity77.net/global_optimization/univariate.html

Test functions:

http://infinity77.net/global_optimization/test_functions.html

The overall conclusion is that, while the Firefly method's performance
is average at best, AMPGO is superior to all the other algorithms I
have tried. To be fair, however, it is an algorithm designed for
low-dimensional optimization problems (i.e., 1-10 variables).

I haven't published the code for the two algorithms (yet), as they are
not quite up to the documentation standards of scipy. However, if
there is interest from the community to add them (or one of them) to
scipy.optimize I will be happy to polish them up and contribute to
this great library. The same applies to the benchmark test suite.


Andrea.

"Imagination Is The Only Weapon In The War Against Reality."
http://www.infinity77.net

# ------------------------------------------------------------- #
def ask_mailing_list_support(email):

    if mention_platform_and_version() and include_sample_app():
        send_message(email)
    else:
        install_malware()
        erase_hard_drives()
# ------------------------------------------------------------- #
_______________________________________________
SciPy-User mailing list
[hidden email]
http://mail.scipy.org/mailman/listinfo/scipy-user

Re: (Possible) new optimization routines - scipy.optimize

josef.pktd
On Thu, Feb 14, 2013 at 4:01 PM, Andrea Gavana <[hidden email]> wrote:

> Hi All,
>
>     as my team and I are constantly facing very hard/complex numerical
> optimization problems, I have taken a look at the various *global*
> optimization routines available in Python and I thought I could throw
> in a couple of algorithms I implemented, mostly drawing from my
> previous thesis work.
>
> I have implemented two routines (based on numpy and scipy), namely:
>
> - AMPGO: Adaptive Memory Programming for Global Optimization: this is my Python
>   implementation of the algorithm described here:
>
>   http://leeds-faculty.colorado.edu/glover/fred%20pubs/416%20-%20AMP%20(TS)%20for%20Constrained%20Global%20Opt%20w%20Lasdon%20et%20al%20.pdf
>
>   I have added a few improvements here and there based on my Master Thesis work
>   on the standard Tunnelling Algorithm of Levy, Montalvo and Gomez.

This could also be a good addition, similar to basinhopping.
From my perspective, this kind of global optimizer is the most
promising (compared to the evolutionary ones, ...).

From a quick browse: is the local optimizer fixed to a specific one,
or can it be any available solver, as in basinhopping?

The only thing I might worry about is that it only has 6 citations in
Google Scholar (which might not mean much if the optimizer is not
widely available).
Given that there seem to be many variations of this kind of
optimizers, it might be good to have some background on comparison
with similar optimizers.

If you have other comparisons of similar optimizers, it would be
useful to see them. Also given that you have a large benchmark suite,
you could compare it with the new basinhopping in scipy.optimize.


>
> - Firefly:  the Firefly algorithm, this is my Python implementation of
> the procedure
>   described here:
>
>   http://www.mathworks.com/matlabcentral/fileexchange/29693-firefly-algorithm

the fileexchange has a large number of "animal" optimizers, and I
doubt they are all good.

I think there needs to be a convincing case before any of them is
added to scipy.
On the other hand, having them available outside of scipy would make
it easier to try them out
(and see whether fireflies, bees or ants are doing better :)

Thank you for proposing this, from a potential user.
I expect to try out basinhopping soon for estimating the parameters
of mixture distributions.

Josef


Re: (Possible) new optimization routines - scipy.optimize

Andrea Gavana
On 14 February 2013 22:53,  <[hidden email]> wrote:

> On Thu, Feb 14, 2013 at 4:01 PM, Andrea Gavana <[hidden email]> wrote:
>> Hi All,
>>
>>     as my team and I are constantly facing very hard/complex numerical
>> optimization problems, I have taken a look at the various *global*
>> optimization routines available in Python and I thought I could throw
>> in a couple of algorithms I implemented, mostly drawing from my
>> previous thesis work.
>>
>> I have implemented two routines (based on numpy and scipy), namely:
>>
>> - AMPGO: Adaptive Memory Programming for Global Optimization: this is my Python
>>   implementation of the algorithm described here:
>>
>>   http://leeds-faculty.colorado.edu/glover/fred%20pubs/416%20-%20AMP%20(TS)%20for%20Constrained%20Global%20Opt%20w%20Lasdon%20et%20al%20.pdf
>>
>>   I have added a few improvements here and there based on my Master Thesis work
>>   on the standard Tunnelling Algorithm of Levy, Montalvo and Gomez.
>
> This could also be a good addition, similar to basinhopping.
> From my perspective, this kind of global optimizers are the most
> promising, (compared to the evolutionary, ...)
>
> From a quick browse: Is the local optimizer fixed to a specific one,
> or can it be any available solver as in basinhopping?

It can be changed to any available solver. As most of the
optimization problems I deal with are horrendously complex (i.e., no
gradient information is available), I decided to use BOBYQA as the
default local solver inside AMPGO. So the benchmark comparison is
fair only against derivative-free algorithms.
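
(For reference, scipy's basinhopping exposes the same choice through its `minimizer_kwargs` argument. A sketch, with Powell standing in for BOBYQA, since BOBYQA is not in scipy.optimize:)

```python
import numpy as np
from scipy.optimize import basinhopping

def rastrigin(x):
    # classic multimodal test function; global minimum f = 0 at the origin
    x = np.asarray(x)
    return 10.0 * x.size + float(np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x)))

np.random.seed(1234)  # basinhopping draws from numpy's global RNG by default

# derivative-free local search plugged into the global basin-hopping loop
res = basinhopping(rastrigin, x0=np.array([2.1, -1.9]),
                   minimizer_kwargs={"method": "Powell"},
                   niter=100, stepsize=0.9)
print(res.x, res.fun)
```

Swapping `"Powell"` for `"Nelder-Mead"` (or a gradient-based method, when derivatives are available) changes only the dictionary entry, not the global loop.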

> The only thing I might worry about that it only has 6 citations in
> Google Scholar (which might not mean much if the optimizer is not
> widely available).

I am not sure how relevant the Google Scholar citation count is to
the effectiveness of an algorithm: AMPGO was the only method able to
crack two of our latest (real-life) optimization problems (42 and 9
variables, respectively), while all the others failed.

> Given that there seem to be many variations of this kind of
> optimizers, it might be good to have some background on comparison
> with similar optimizers.

As far as I know, there is no Open Source implementation of the
Tunnelling Algorithm. I used to have access to a Fortran
implementation (gradient-based only) in the past, but the code
belongs to the National Autonomous University of Mexico. There is no
Python implementation of it. AMPGO is a short-memory tunnelling
algorithm loosely connected with the standard tunnelling method.

> If you have other comparisons of similar optimizers, it would be
> useful to see them. Also given that you have a large benchmark suite,
> you could compare it with the new basinhopping in scipy.optimize.

I'll give it a try tomorrow and report back. I still don't have scipy
0.11 at work (they won't let me upgrade), but I guess I can just get
the Python source code for it and run it, can't I?

>> - Firefly:  the Firefly algorithm, this is my Python implementation of
>> the procedure
>>   described here:
>>
>>   http://www.mathworks.com/matlabcentral/fileexchange/29693-firefly-algorithm
>
> the fileexchange has a large number of "animal" optimizers, and I
> doubt they are all good.
>
> I think there needs to be a convincing case before adding any of them
> should be added to scipy.
> On the other hand, having them available outside of scipy would make
> it easier to try them out.
> (and see if fireflies, or bees or ants are doing better :)

I agree. I did it as an exercise to see if I remembered my Matlab
after 10 years of using Numpy/Scipy only :-)


Andrea.

"Imagination Is The Only Weapon In The War Against Reality."
http://www.infinity77.net

Re: (Possible) new optimization routines - scipy.optimize

josef.pktd
On Thu, Feb 14, 2013 at 5:06 PM, Andrea Gavana <[hidden email]> wrote:

> On 14 February 2013 22:53,  <[hidden email]> wrote:
>> [...]
>>
>> This could also be a good addition, similar to basinhopping.
>> From my perspective, this kind of global optimizers are the most
>> promising, (compared to the evolutionary, ...)
>>
>> From a quick browse: Is the local optimizer fixed to a specific one,
>> or can it be any available solver as in basinhopping?
>
> It can be changed to be any available solver. As most of the
> optimization problems I deal with are horrendously complex (i.e., no
> gradient information available), I decided to use BOBYQA as default
> local solver inside AMPGO. So the benchmark comparison is fair only
> against derivative-free algorithms.
>
>> The only thing I might worry about that it only has 6 citations in
>> Google Scholar (which might not mean much if the optimizer is not
>> widely available).
>
> I am not sure how relevant the Google Scholar is regarding the
> effectiveness of an algorithm: AMPGO was the only method able to crack
> two of our latest (real-life) optimization problems (42 and 9
> variables respectively), while all the others failed.

That sounds good.

I'm all in favor, but I'm just a user and a cheerleader for scipy.optimize
(http://scipy-user.10969.n7.nabble.com/OT-global-optimization-hybrid-global-local-search-td5769.html)

>
>> Given that there seem to be many variations of this kind of
>> optimizers, it might be good to have some background on comparison
>> with similar optimizers.
>
> As far as I know, there are no Open Source implementation of the
> Tunnelling Algorithm. I used to have access to a Fortran
> implementation (gradient-based only) in the past, but the code belongs
> to the National Autonomous University of Mexico. There are no Python
> implementation of it. AMPGO is a short-memory tunnelling algorithm
> loosely connected with the standard tunnelling method.
>
>> If you have other comparisons of similar optimizers, it would be
>> useful to see them. Also given that you have a large benchmark suite,
>> you could compare it with the new basinhopping in scipy.optimize.
>
> I'll give it a try tomorrow and report back. I still don't have scipy
> 0.11 at work (they won't let me upgrade), but I guess I can just get
> the Python source code for it and run it, can't I?

It's essentially just one module to grab and to adjust the imports.
That's what I did.

In the pull request there is also a link to benchmarks
https://github.com/js850/scipy/blob/benchmark_basinhopping/scipy/optimize/benchmarks/_basinhopping_benchmarks.py
which were not added to the scipy source AFAICS.

Josef

>
>>> - Firefly:  the Firefly algorithm, this is my Python implementation of
>>> the procedure
>>>   described here:
>>>
>>>   http://www.mathworks.com/matlabcentral/fileexchange/29693-firefly-algorithm
>>
>> the fileexchange has a large number of "animal" optimizers, and I
>> doubt they are all good.
>>
>> I think there needs to be a convincing case before adding any of them
>> should be added to scipy.
>> On the other hand, having them available outside of scipy would make
>> it easier to try them out.
>> (and see if fireflies, or bees or ants are doing better :)
>
> I agree. I did it as an exercise to see if I remembered my Matlab
> after 10 years of using Numpy/Scipy only :-)
>
>
> Andrea.
>
> "Imagination Is The Only Weapon In The War Against Reality."
> http://www.infinity77.net
> _______________________________________________
> SciPy-User mailing list
> [hidden email]
> http://mail.scipy.org/mailman/listinfo/scipy-user
_______________________________________________
SciPy-User mailing list
[hidden email]
http://mail.scipy.org/mailman/listinfo/scipy-user

Re: (Possible) new optimization routines - scipy.optimize

dhirschfeld
In reply to this post by Andrea Gavana
Andrea Gavana <andrea.gavana <at> gmail.com> writes:

>
> Hi All,
>
>     as my team and I are constantly facing very hard/complex numerical
> optimization problems, I have taken a look at the various *global*
> optimization routines available in Python and I thought I could throw
> in a couple of algorithms I implemented, mostly drawing from my
> previous thesis work.
>
> I haven't published the code for the two algorithms (yet), as they are
> not quite up to the documentation standards of scipy. However, if
> there is interest from the community to add them (or one of them) to
> scipy.optimize I will be happy to polish them up and contribute to
> this great library. The same applies to the benchmark test suite.
>
> Andrea.
>

I can't speak for the maintainers; however, as a user I'm definitely
interested in scipy having more optimization capabilities, especially
in the tricky global optimization area.

Having to install a different package for each optimiser you want
to try is a pain, especially on Windows, where installation often
requires extensive knowledge of both distutils and the general
compilation of C/C++ software.

Thanks,
Dave




Re: (Possible) new optimization routines - scipy.optimize

Andrea Gavana
In reply to this post by josef.pktd
On 14 February 2013 22:53,  <[hidden email]> wrote:

> [...]
> If you have other comparisons of similar optimizers, it would be
> useful to see them. Also given that you have a large benchmark suite,
> you could compare it with the new basinhopping in scipy.optimize.

OK, I have run the test suite with basinhopping as well, although I
had to modify it a bit to support a maximum number of function
evaluations and a tolerance on achieving the global optimum.

Results are here:

- N-D : http://infinity77.net/global_optimization/multidimensional.html
- 1-D : http://infinity77.net/global_optimization/univariate.html

Overall it appears to have average performance, though I must
stress again that, while my test suite is relatively large, it only
encompasses low-dimensional problems (1-10 variables), and my
stopping/convergence criteria may not be applicable to everyone
else's needs.


Andrea.

"Imagination Is The Only Weapon In The War Against Reality."
http://www.infinity77.net


Re: (Possible) new optimization routines - scipy.optimize

josef.pktd
On Mon, Feb 18, 2013 at 11:41 AM, Andrea Gavana <[hidden email]> wrote:

> On 14 February 2013 22:53,  <[hidden email]> wrote:
>> [...]
>>
>> If you have other comparisons of similar optimizers, it would be
>> useful to see them. Also given that you have a large benchmark suite,
>> you could compare it with the new basinhopping in scipy.optimize.
>
> OK, I have run the test suite with basinhopping as well, even though I
> had to modify it a bit to support the maximum number of functions
> evaluations and the tolerance on achieving the global optimum.

Anything that might be useful to incorporate in the scipy version?

>
> Results are here:
>
> - N-D : http://infinity77.net/global_optimization/multidimensional.html
> - 1-D : http://infinity77.net/global_optimization/univariate.html
>
> Overall it appears to have average performances, even though I must
> stress again that, while my test suite is relatively large, it only
> encompasses low-dimensional problem (1-10 variables) and my
> stopping/convergence criteria may not be applicable to everyone else's
> needs.

Interesting results, AMPGO looks good.

I'm a bit surprised that the number of function evaluations for
basinhopping is relatively large.
(One possible difference from your benchmark is that with smooth
models and numerical derivatives, a more efficient local optimizer
could be chosen.)

Two things I saw when I looked at your benchmark before:

DE has several cases that are complementary to AMPGO; it might be
interesting in applications to cross-check results across optimizers.

Why do DE and some others have a success rate of either 0 or 100 and
nothing in between?
I don't see what could cause this result.

Thanks

Josef



Re: (Possible) new optimization routines - scipy.optimize

Andrea Gavana
On 18 February 2013 18:25,  <[hidden email]> wrote:

> On Mon, Feb 18, 2013 at 11:41 AM, Andrea Gavana <[hidden email]> wrote:
>> On 14 February 2013 22:53,  <[hidden email]> wrote:
>>> [...]
>>>
>>> If you have other comparisons of similar optimizers, it would be
>>> useful to see them. Also given that you have a large benchmark suite,
>>> you could compare it with the new basinhopping in scipy.optimize.
>>
>> OK, I have run the test suite with basinhopping as well, even though I
>> had to modify it a bit to support the maximum number of functions
>> evaluations and the tolerance on achieving the global optimum.
>
> Anything that might be useful to incorporate in the scipy version?

It might be. The tolerance on the global optimum is close to
meaningless for real-life problems (as we normally don't know where
the global optimum is), but the maximum-number-of-function-evaluations
stopping condition might be useful. I am not very familiar (i.e.,
close to zero) with the PR process, but I'll see what I can do. I can
of course provide a patch, which is so much easier than the entire PR
chain IMNSHO.
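
(For what it's worth, the global-optimum tolerance can already be emulated without patching scipy, via basinhopping's callback, which halts the run when it returns True. A sketch; `f_glob` is the known optimum of a test problem, so this is a benchmark-only criterion:)

```python
import numpy as np
from scipy.optimize import basinhopping

def benchmark_basinhopping(f, x0, f_glob, tol=1e-6, niter=200):
    """Run basinhopping, stopping early once the best value found is
    within `tol` of the known global optimum `f_glob`."""
    def close_enough(x, fval, accept):
        # returning True tells basinhopping to stop iterating
        return fval <= f_glob + tol
    return basinhopping(f, x0, niter=niter, callback=close_enough)

# hypothetical test problem: shifted sphere, global optimum 0 at (1, -2)
sphere = lambda x: float(np.sum((np.asarray(x) - np.array([1.0, -2.0]))**2))
res = benchmark_basinhopping(sphere, np.array([5.0, 5.0]), f_glob=0.0)
```

The evaluation-budget condition, by contrast, really does need a change inside the routine (or a wrapper around the objective that raises when the budget is spent).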


>> Results are here:
>>
>> - N-D : http://infinity77.net/global_optimization/multidimensional.html
>> - 1-D : http://infinity77.net/global_optimization/univariate.html
>>
>> Overall it appears to have average performances, even though I must
>> stress again that, while my test suite is relatively large, it only
>> encompasses low-dimensional problem (1-10 variables) and my
>> stopping/convergence criteria may not be applicable to everyone else's
>> needs.
>
> Interesting results, AMPGO looks good.
>
> I'm a bit surprised that the number of function evaluations of
> basinhopping is relatively large.
> (One possible difference to your benchmark is that with smooth models
> and numerical derivatives, a more efficient local optimizer could be
> chosen.)

The results on my website were obtained using L-BFGS-B as the local
optimizer. Most of the test functions, however, are designed to fool
gradient-based descent algorithms like BFGS, so I am not particularly
surprised by its performance. I also tried SLSQP and TNC, with
similar results. I couldn't use COBYLA, as its Fortran interface
changed between Scipy 0.9 and 0.11 and I have trouble
compiling/installing anything at work. I'll give it another go
tomorrow by bringing Scipy from my home PC.

> two things I saw when I looked at your benchmark before:
>
> DE has several cases that are complementary to AMPGO, it might be
> interesting in applications to cross-check results across optimizers.

I am not sure I understand what you mean here... Do you mean that
when AMPGO fails, DE succeeds, and vice versa?

> Why does DE and some other have success rate either 0 or 100 and
> nothing in between?
> I don't see a reason what could cause this result.

0 means that, whatever the random starting point was, the algorithm
could never locate the global optimum over 100 trials (100 different
random starting points); 100 means that the global optimum was
always located, irrespective of the starting point chosen.

I see two reasons for this behaviour:

1. For the vast majority of the test functions, the lower/upper bounds
are symmetric around the origin and the actual global optimum is *at*
the origin: most derivative-free algorithms (but not AMPGO), when they
don't know where to turn because they are desperate and can't find a
better minimum, simply choose the middle point of the search space
(which happens to be the global optimum for most test functions -
cheating).

This is very difficult for me to correct, as most of the algorithms'
black magic is hidden behind obsolescent, user-unfriendly Fortran/C
code (as if we needed Fortran/C speed in these algorithms to
calibrate the Large Hadron Collider detectors... please, just use an
intelligent language like Python or, if worst comes to worst,
Matlab). This issue actually got some of the algorithms close to
being excluded from the benchmark - it's not fair. The DIRECT and
MLSL algorithms are just two examples (there are others).

I'll try another approach tomorrow by shifting the lower/upper bounds
to be asymmetric, and we'll see how it goes.

2. There is a (widespread) tendency amongst numerical optimization
"experts" to pick and choose a set of benchmark functions better
suited to demonstrating the superiority of their own algorithm over
other methods (or they simply write new functions on which they know
their algorithm will perform better). To write my benchmark suite I
looked at around 30-40 papers, and most of them showed that
behaviour: two examples are the "Deflected Corrugated Spring" and
"Xin-She Yang" test functions, which are known to be favourable to
DE and PSWARM.

Anyway, I'll run a modified benchmark tomorrow (or in the next few
days) and report back if there is any interest.
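
(The bound-shifting fix mentioned under point 1 is generic enough to sketch; `shift_problem` is a hypothetical helper, not part of the published suite:)

```python
import numpy as np

def shift_problem(f, bounds, shift):
    """Return a shifted objective g(x) = f(x - shift) together with
    correspondingly shifted bounds, so that an optimizer that guesses
    the centre of an (originally symmetric) box no longer lands on
    the global optimum for free."""
    shift = np.asarray(shift, dtype=float)
    g = lambda x: f(np.asarray(x) - shift)
    new_bounds = [(lo + s, hi + s) for (lo, hi), s in zip(bounds, shift)]
    return g, new_bounds
```

Applied to, say, a sphere function on [-5, 5]^2 with shift (1, -2), the optimum moves to (1, -2) while the box becomes [-4, 6] x [-7, 3], so its midpoint is no longer the answer.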

Andrea.

"Imagination Is The Only Weapon In The War Against Reality."
http://www.infinity77.net


Re: (Possible) new optimization routines - scipy.optimize

Pauli Virtanen-3
On 18.02.2013 22:30, Andrea Gavana wrote:
[clip]
> I am not very familiar (i.e., close to
> zero) with the PR process, but I'll see what I can do. I can of course
> provide a patch, which is so much easier than the entire PR chain
> IMNSHO.

It's not that much more difficult than patches, and way easier than
patch queues:

- create account on github.com, log in
- click fork at https://github.com/scipy/scipy
- git clone -o github [hidden email]:YOURUSERNAME/scipy.git

- git checkout -b my-feature
- edit
- git commit -a -m "commit message"
- git push github my-feature
- go to https://github.com/YOURUSERNAME/scipy and click
  "Pull request" next to "my-feature"

--
Pauli Virtanen
