# Re: Non-linear regression help in Python

Discussion in 'Python' started by sturlamolden, Feb 14, 2011.

1. ### sturlamolden (Guest)

On 14 Feb, 22:02, Akand Islam <> wrote:
> Hello all,
> I want to do non-linear regression by fitting the model (let say, y =
> a1*x+b1*x**2+c1*x**3/exp(d1*x**4)) where the parameter (say "c1") must
> be in between (0 to 1), and others can be of any values. How can I
> perform my job in python?

First, install NumPy and SciPy!

scipy.optimize.leastsq implements Levenberg-Marquardt, which is close
to the 'gold standard' for non-linear least squares. The algorithm
will be a bit faster and more accurate if you provide the Jacobian of
the residuals, i.e. the partial derivative of

r = y - (a1*x + b1*x**2 + c1*x**3/exp(d1*x**4))

with respect to a1, b1, c1, and d1 (stored in a 4 by n matrix).
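A minimal sketch of such a fit with scipy.optimize.leastsq, returning the Jacobian as a 4 by n array as described (hence col_deriv=1); the synthetic data, "true" parameters, and starting guess are illustrative assumptions, not part of the original question:

```python
import numpy as np
from scipy.optimize import leastsq

def model(p, x):
    a1, b1, c1, d1 = p
    return a1*x + b1*x**2 + c1*x**3 * np.exp(-d1*x**4)

def residuals(p, x, y):
    return y - model(p, x)

def jacobian(p, x, y):
    # Partial derivatives of the residuals w.r.t. a1, b1, c1, d1,
    # stacked as a 4 by n array (matching col_deriv=1 below).
    a1, b1, c1, d1 = p
    e = np.exp(-d1*x**4)
    return np.array([-x, -x**2, -x**3*e, c1*x**7*e])

# Synthetic, noise-free data from assumed "true" parameters.
x = np.linspace(0.0, 1.5, 50)
p_true = np.array([2.0, -1.0, 0.5, 0.3])
y = model(p_true, x)

p0 = np.ones(4)  # starting guess (assumed)
p_fit, ier = leastsq(residuals, p0, args=(x, y),
                     Dfun=jacobian, col_deriv=1)
```

With real, noisy data you would of course not expect p_fit to match the generating parameters exactly, and a sensible starting guess matters.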

If you have prior information about c1, you have a Bayesian regression
problem. You can constrain c1 between 0 and 1 by assuming a beta prior
distribution on c1, e.g.

c1 ~ Be(z,z) with 1 < z < 2

Then proceed as you would with Bayesian regression -- i.e. EM, a Gibbs
sampler, or Metropolis-Hastings. Use scipy.stats.beta to evaluate and
numpy.random.beta to sample the beta distribution. The problem is not
programming it in Python, but getting the correct equations on paper.
Also beware that running the Markov chain Monte Carlo might take a
while.
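For the Metropolis-Hastings route, a minimal random-walk sketch with a Be(z, z) prior on c1, using scipy.stats.beta as suggested; the Gaussian likelihood, noise level sigma, step size, chain length, and starting point are all illustrative assumptions:

```python
import numpy as np
from scipy.stats import beta

def model(p, x):
    a1, b1, c1, d1 = p
    return a1*x + b1*x**2 + c1*x**3 * np.exp(-d1*x**4)

def log_posterior(p, x, y, sigma=0.05, z=1.5):
    # Gaussian likelihood on the residuals plus a Be(z, z) prior on c1.
    c1 = p[2]
    if not (0.0 < c1 < 1.0):
        return -np.inf  # the beta prior has no support outside (0, 1)
    r = y - model(p, x)
    return -0.5 * np.sum(r**2) / sigma**2 + beta.logpdf(c1, z, z)

def metropolis_hastings(x, y, p0, n_steps=10000, step=0.02, seed=0):
    rng = np.random.default_rng(seed)
    p = np.asarray(p0, dtype=float)
    lp = log_posterior(p, x, y)
    samples = np.empty((n_steps, 4))
    for i in range(n_steps):
        prop = p + step * rng.standard_normal(4)  # symmetric random walk
        lp_prop = log_posterior(prop, x, y)
        if np.log(rng.random()) < lp_prop - lp:   # accept/reject
            p, lp = prop, lp_prop
        samples[i] = p
    return samples

# Synthetic data and a starting point near plausible values (assumed).
x = np.linspace(0.0, 1.5, 50)
p_true = np.array([2.0, -1.0, 0.5, 0.3])
y = model(p_true, x)
samples = metropolis_hastings(x, y, p_true)
c1_post = samples[5000:, 2]  # discard burn-in, keep the c1 draws
```

Note every retained c1 draw lies strictly inside (0, 1), which is exactly the constraint the beta prior enforces; tuning the step size (and using numpy.random.beta for an independence proposal instead) is left as an exercise.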

Sturla

sturlamolden, Feb 14, 2011

2. ### sturlamoldenGuest

On 15 Feb, 05:24, Akand Islam <> wrote:

> Dear Sturlamolden,
> Thanks for reply. I will follow-up if I need further assistance.
>
> -- Akand

For this kind of question, the SciPy users mailing list is a better
venue than comp.lang.python.

Sturla

sturlamolden, Feb 15, 2011