Re: optimization with ill conditioned Hessian (josef.pktd@gmail.com)


Re: optimization with ill conditioned Hessian (josef.pktd@gmail.com)

federico vaggi-2
Hey Josef,

If the problem you are dealing with is some kind of least-squares problem, you might find this paper helpful:

http://arxiv.org/abs/1201.5885
Federico


Message: 1
Date: Fri, 18 Oct 2013 22:16:28 -0400
From: [hidden email]
Subject: [SciPy-User] optimization with ill conditioned Hessian
To: SciPy Users List <[hidden email]>
Message-ID: <[hidden email]>
Content-Type: text/plain; charset=ISO-8859-1

Does scipy have another optimizer besides fmin (Nelder-Mead) that is
robust to a near-singular, high-condition-number Hessian?

fmin_bfgs goes off into neverland; the parameter values become huge
until I get NaNs in my calculations.

What would be nice is an optimizer that uses derivatives but
regularizes, i.e., forces the Hessian (or its equivalent) to be
positive definite.
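
Roughly what I have in mind is a damped Newton step in the spirit of
Levenberg-Marquardt. An untested sketch (grad and hess stand for the
objective's gradient and Hessian; none of this is an existing scipy
function):

import numpy as np

def fmin_damped_newton(grad, hess, x0, lam=1e-3, tol=1e-8, maxiter=200):
    # Newton's method with Levenberg-Marquardt-style damping:
    # bump the Hessian by lam*I until it is positive definite.
    x = np.asarray(x0, dtype=float)
    for _ in range(maxiter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        H = hess(x)
        I = np.eye(len(x))
        while True:
            try:
                # Cholesky succeeds iff H + lam*I is positive definite
                np.linalg.cholesky(H + lam * I)
                break
            except np.linalg.LinAlgError:
                lam *= 10
        x = x - np.linalg.solve(H + lam * I, g)
        lam = max(lam / 10, 1e-12)  # relax the damping again when possible
    return x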


Background
I'm trying to replicate a textbook example whose data and matrix
inverses are "not nice". fmin (Nelder-Mead) gets pretty close to the
Stata numbers. However, fmin_bfgs has been my preferred default
optimizer for some time.

Aside:
It looks like a good test case for making my linear algebra more robust:
np.linalg.pinv(x.T.dot(x)) doesn't seem to be robust enough here.
I have no idea why a textbook would use an example like that, or
whether Stata isn't just making up the numbers.
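
The usual trick is to avoid forming x.T.dot(x) at all and to work from
x directly, so the effective condition number is cond(x) instead of
cond(x)**2. A rough sketch (the near-collinear x and the response y are
made-up stand-ins, not the textbook data):

import numpy as np

rng = np.random.RandomState(0)
x = rng.randn(50, 3)
# append a nearly collinear column to make x'x ill conditioned
x = np.column_stack([x, x[:, :1] + 1e-8 * rng.randn(50, 1)])
y = rng.randn(50)

# least-squares fit without ever forming x'x
params = np.linalg.lstsq(x, y)[0]

# if (x'x)^{-1} itself is needed (e.g. for a covariance matrix),
# build it from the SVD of x and drop numerically zero directions
u, s, vt = np.linalg.svd(x, full_matrices=False)
keep = s > s[0] * 1e-12
xtx_inv = (vt[keep].T / s[keep]**2).dot(vt[keep])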

Thanks,

Josef


_______________________________________________
SciPy-User mailing list
[hidden email]
http://mail.scipy.org/mailman/listinfo/scipy-user

Re: optimization with ill conditioned Hessian (josef.pktd@gmail.com)

josef.pktd
On Sat, Oct 19, 2013 at 5:09 PM, federico vaggi
<[hidden email]> wrote:
> Hey Josef,
>
> If the problem you are dealing with is some kind of least-squares
> problem, you might find this paper helpful:
>
> http://arxiv.org/abs/1201.5885

Thanks for the link.

My problem has a quadratic form, but it cannot be rewritten as a
least-squares problem, at least not in its general form.
That's the reason I'm using the general optimizers, mainly fmin and fmin_bfgs.
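
For concreteness, the calling pattern on a toy badly conditioned
quadratic (made-up A and b0, not the actual problem):

import numpy as np
from scipy import optimize

A = np.diag([1.0, 1e-8])        # condition number 1e8
b0 = np.array([1.0, 2.0])

def obj(b):
    d = b - b0
    return d.dot(A).dot(d)      # quadratic form (b - b0)' A (b - b0)

def grad(b):
    return 2 * A.dot(b - b0)

x0 = np.zeros(2)
print(optimize.fmin(obj, x0))                    # Nelder-Mead
print(optimize.fmin_bfgs(obj, x0, fprime=grad))  # BFGS with gradient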

Josef

_______________________________________________
SciPy-User mailing list
[hidden email]
http://mail.scipy.org/mailman/listinfo/scipy-user