Hi All,
I am trying to fit a three-term Gaussian with a specific form to a data set. The fit isn't working out as nicely as I'd like. Could you help me out? Attached are my code and the data, along with the fit I obtain:

```python
import numpy as np
import scipy.optimize as optimize
import matplotlib.pyplot as plt

data = np.loadtxt('V_lambda_n.dat')
r = data[:, 0]
V = data[:, 1]
std_dev = np.std(data)

def func(x, ps1, ps2, ps3, ps4):
    return (ps1 * np.exp(-(x / ps2) ** 2)
            + ps2 * np.exp(-(x / ps3) ** 2)
            + ps3 * np.exp(-(x / ps4) ** 2))

popt, pcov = optimize.curve_fit(func, r, V, p0=[50, std_dev, 50, std_dev],
                                maxfev=10000)

p1 = plt.plot(r, V, 'bo', label='data')
p2 = plt.plot(r, func(r, *popt), 'r-', label='fit')
plt.xticks(np.linspace(0, 4, 9, endpoint=True))
plt.yticks(np.linspace(-50, 150, 9, endpoint=True))
plt.show()
```

_______________________________________________
SciPy-User mailing list
[hidden email]
https://mail.python.org/mailman/listinfo/scipy-user
Your func should get parameters as a vector/list, not as separate parameters. I’m surprised the code doesn’t just fail. So, try:
```python
def func(x, params):
    ps1, ps2, ps3, ps4 = params
    ...
```
I think that should work, but don’t have time to check right now… Good luck!
Juan.
On 5 Apr 2017, 9:16 AM -0400, Otto Ngeke <[hidden email]>, wrote:
Hi Juan,

Here is my new function as you suggested:

```python
def func(x, *params):
    for i in range(0, len(params), 2):
        ps1 = params[i]
        ps2 = params[i + 1]
    return ps1 * np.exp(-(x / ps2) ** 2)
```

I still get the same fit as before. However, I am worried because my Gaussians are such that the amplitude of the second Gaussian term is the standard deviation of the first: some kind of coupling between the terms. I don't know whether this new function definition actually represents that property.

On Wed, Apr 5, 2017 at 4:45 PM, Juan Nunez-Iglesias <[hidden email]> wrote:
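(As an aside: the loop above overwrites ps1 and ps2 on every pass and then returns a single term, so only the last (amplitude, width) pair contributes. A hedged sketch of a variadic version that sums all pairs — note it does not encode the amplitude/width coupling described above, it just treats the pairs independently:)

```python
import numpy as np

# Sketch only: sum one Gaussian per (amplitude, width) pair in params,
# rather than returning just the final term as the loop version does.
def func(x, *params):
    x = np.asarray(x, dtype=float)
    total = np.zeros_like(x)
    for i in range(0, len(params), 2):
        amp, width = params[i], params[i + 1]
        total = total + amp * np.exp(-(x / width) ** 2)
    return total

# At x = 0 every exponential is 1, so the result is the sum of amplitudes.
print(func(np.array([0.0]), 2.0, 1.0, 3.0, 1.0))  # [5.]
```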
Otto,

I think the simplest conclusion is that your model function (three Gaussians, each centered at x=0, with the amplitude of one constrained to be the width of the next) does not easily match your data. If the data really dip below zero, you will need at least one of the Gaussians to have a negative amplitude, or else you'll never get values below zero. Are you sure this is a good model for your data?

On Wed, Apr 5, 2017 at 8:16 AM, Otto Ngeke <[hidden email]> wrote:
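(This point can be checked numerically with the model from the original post: any all-positive choice of amplitudes yields a strictly positive curve, while a negative amplitude lets it dip below zero. The parameter values below are arbitrary illustrations, not fitted values.)

```python
import numpy as np

# The three-term model from the thread: all terms share a center at x = 0.
def func(x, ps1, ps2, ps3, ps4):
    return (ps1 * np.exp(-(x / ps2) ** 2)
            + ps2 * np.exp(-(x / ps3) ** 2)
            + ps3 * np.exp(-(x / ps4) ** 2))

x = np.linspace(0.0, 4.0, 200)

# All amplitudes positive: every term is positive, so the sum never
# goes below zero anywhere.
y_pos = func(x, 50.0, 1.0, 30.0, 0.5)
print(y_pos.min() > 0.0)   # True

# A negative first amplitude lets the curve dip below zero near x = 0.
y_neg = func(x, -80.0, 1.0, 30.0, 0.5)
print(y_neg.min() < 0.0)   # True
```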
On 5 April 2017 at 17:24, Otto Ngeke <[hidden email]> wrote:
> I still get the same fit as before. However, I am worried because my Gaussians are such that the amplitude of the second Gaussian term is the standard deviation of the first: some kind of coupling between the terms. I don't know if this new function definition actually represents that property.

It should be:

```python
def func(x, params):
    ps1, ps2, ps3, ps4 = params
    return (ps1 * np.exp(-(x / ps2) ** 2)
            + ps2 * np.exp(-(x / ps3) ** 2)
            + ps3 * np.exp(-(x / ps4) ** 2))
```

Note that in your initialisation you have a symmetry between the first and the third Gaussian:

```python
popt, pcov = optimize.curve_fit(func, r, V, p0=[50, std_dev, 50, std_dev],
                                maxfev=10000)
```

So unless curve_fit adds some random noise, the two are always going to be the same, because you cannot break the symmetry. Instead, give them different values, for example:

p0=[50, 1.2 * std_dev, 30, std_dev / 1.2]

(I have no idea whether these particular values make sense, but they will allow you to get two independent Gaussians, not too far from your initial guess.)

For hard fits (and arbitrary fits are surprisingly hard!), iminuit is quite a powerful option.

/David.
On Wed, Apr 5, 2017 at 10:41 AM, Daπid <[hidden email]> wrote:
That is not correct. Setting multiple parameters to the same initial value does not make them keep the same value. Each parameter here is separate, and there is no symmetry here at all.
Certainly, giving initial values that are as close as possible to the expected results is a good idea, but they do not need to be unique.
Lmfit might also be useful here. But the basic problem is that a model of multiple Gaussians with the same center will have a very hard time describing the data shown.

--Matt
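(A quick numerical check of the no-symmetry point: even when ps1 and ps3 start at the same value, the model's sensitivity to the two parameters differs, so the optimizer has no reason to keep them equal. This sketch uses finite differences with the p0 values from the thread, with std_dev stood in by 1.0.)

```python
import numpy as np

def func(x, ps1, ps2, ps3, ps4):
    return (ps1 * np.exp(-(x / ps2) ** 2)
            + ps2 * np.exp(-(x / ps3) ** 2)
            + ps3 * np.exp(-(x / ps4) ** 2))

x = np.linspace(0.1, 4.0, 50)
p0 = np.array([50.0, 1.0, 50.0, 1.0])   # ps1 and ps3 start identical
eps = 1e-5

# Central-difference partial derivative of the model w.r.t. parameter i.
def partial(i):
    dp = np.zeros(4)
    dp[i] = eps
    return (func(x, *(p0 + dp)) - func(x, *(p0 - dp))) / (2 * eps)

d1, d3 = partial(0), partial(2)

# The two columns of the Jacobian differ (ps3 also controls the second
# term's width), so equal starting values do not imply equal updates.
print(np.allclose(d1, d3))   # False
```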
I tried to pay more attention to the initial guess as suggested, and found a very close fit! The new initial guess I used is

p0=[V.max(), std_dev, V.max(), 2]

See attached for the new fit. I then used the resulting values of my parameters ps1, ps2, ps3, ps4 as a new initial guess, but surprisingly there is no improvement in the fit. It just remains the same, even after repeatedly feeding the new parameters back in as the initial guess some five successive times. Is there any way I can improve my guess?

On Wed, Apr 5, 2017 at 6:39 PM, Matt Newville <[hidden email]> wrote:
gaussian_fit2.png (33K)
If you are using the output parameters from a curve fit as the input to a new curve fit, then you won't see any improvement, because you are already at a local/global minimum in chi² space.

On 6 Apr 2017 4:58 pm, "Otto Ngeke" <[hidden email]> wrote:
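(A toy illustration of this fixed-point behaviour, using a one-parameter chi²-like function unrelated to the original data: an iterative optimizer started at a converged point simply stays there.)

```python
# Toy chi^2 with a single parameter and its minimum at p = 3.
def chi2(p):
    return (p - 3.0) ** 2 + 1.0

# One gradient-descent step: p <- p - lr * dchi2/dp.
def grad_step(p, lr=0.1):
    return p - lr * 2.0 * (p - 3.0)

# First "fit": start far away and iterate to convergence.
p = 5.0
for _ in range(100):
    p = grad_step(p)
print(round(p, 6))   # 3.0 -- converged to the minimum

# Second "fit": restarting from the converged value changes nothing,
# because the gradient there is (numerically) zero.
print(abs(grad_step(p) - p) < 1e-9)   # True
```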