Linear regression in NumPy

Discussion in 'Python' started by nikie, Mar 17, 2006.

  1. nikie

    nikie Guest

    I'm a little bit stuck with NumPy here, and neither the docs nor
    trial&error seems to lead me anywhere:
    I've got a set of data points (x/y-coordinates) and want to fit a
    straight line through them, using LMSE linear regression. Simple
    enough. I thought instead of looking up the formulas I'd just see if
    there isn't a NumPy function that does exactly this. What I found was
    "linear_least_squares", but I can't figure out what kind of parameters
    it expects: I tried passing it my array of X-coordinates and the array
    of Y-coordinates, but it complains that the first parameter should be
    two-dimensional. But well, my data is 1d. I guess I could pack the X/Y
    coordinates into one 2d-array, but then, what do I do with the second
    parameter?

    More generally: Is there any kind of documentation that tells me what
    the functions in NumPy do, what parameters they expect, how to call
    them, etc.? All I found was:
    "This function returns the least-squares solution of an overdetermined
    system of linear equations. An optional third argument indicates the
    cutoff for the range of singular values (defaults to 10^-10). There are
    four return values: the least-squares solution itself, the sum of the
    squared residuals (i.e. the quantity minimized by the solution), the
    rank of the matrix a, and the singular values of a in descending
    order."
    It doesn't even mention what the parameters "a" and "b" are for...
    nikie, Mar 17, 2006
    #1

  2. nikie

    Robert Kern Guest

    nikie wrote:
    > <SNIP full quote of the original question about linear_least_squares
    > and the NumPy documentation>


    Look at the docstring. (Note: I am using the current version of numpy from SVN,
    you may be using an older version of Numeric. http://numeric.scipy.org/)

    In [171]: numpy.linalg.lstsq?
    Type: function
    Base Class: <type 'function'>
    String Form: <function linear_least_squares at 0x1677630>
    Namespace: Interactive
    File:
    /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy-0.9.6.2148-py2.4-macosx-10.4-ppc.egg/numpy/linalg/linalg.py
    Definition: numpy.linalg.lstsq(a, b, rcond=1e-10)
    Docstring:
    returns x,resids,rank,s
    where x minimizes 2-norm(|b - Ax|)
    resids is the sum square residuals
    rank is the rank of A
    s contains the singular values of A in descending order

    If b is a matrix then x is also a matrix with corresponding columns.
    If the rank of A is less than the number of columns of A or greater than
    the number of rows, then residuals will be returned as an empty array
    otherwise resids = sum((b - dot(A, x))**2).
    Singular values less than s[0]*rcond are treated as zero.
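
    (That residual definition is easy to confirm directly; a quick sketch
    using the modern name numpy.linalg.lstsq, with made-up example data:)

    ```python
    import numpy as np

    # Overdetermined system: 3 equations, 2 unknowns
    A = np.array([[1., 0.], [1., 1.], [1., 2.]])
    b = np.array([0., 1., 3.])

    x, resids, rank, s = np.linalg.lstsq(A, b, rcond=None)

    # When rank == number of columns <= number of rows, resids should
    # equal the sum of squared residuals computed by hand:
    check = np.sum((b - A @ x) ** 2)
    ```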

    --
    Robert Kern


    "I have come to believe that the whole world is an enigma, a harmless enigma
    that is made terrible by our own mad attempt to interpret it as though it had
    an underlying truth."
    -- Umberto Eco
    Robert Kern, Mar 17, 2006
    #2

  3. nikie

    Guest

    nikie wrote:
    > <SNIP quote of the original question about linear_least_squares>


    Well, it works for me:

    x = Matrix([[1, 1], [1, 2], [1, 3]])
    y = Matrix([[1], [2], [4]])
    print linear_least_squares(x, y)

    Make sure the dimensions are right. X should be n*k, Y should (unless
    you know what you are doing) be n*1. So the first dimension must be
    equal.

    If you wanted to:
    y = Matrix([1, 2, 4])
    it won't work because it'll have dimensions 1*3. You would have to
    transpose it:
    y = transpose(Matrix([1, 2, 4]))
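
    (In current NumPy the same call is spelled numpy.linalg.lstsq, and
    plain arrays work in place of Matrix; a sketch of the dimension rules
    above:)

    ```python
    import numpy as np

    # Design matrix X is n*k (here n=3 rows, k=2 columns: intercept + slope)
    X = np.array([[1., 1.], [1., 2.], [1., 3.]])
    # y must have the same first dimension n (a flat length-n vector is fine)
    y = np.array([1., 2., 4.])

    coef, resids, rank, sv = np.linalg.lstsq(X, y, rcond=None)
    # coef[0] is the intercept, coef[1] the slope, matching the columns of X
    ```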

    Hope this helps.
    , Mar 17, 2006
    #3
  4. nikie

    nikie Guest

    I still don't get it...
    My data looks like this:
    x = [0,1,2,3]
    y = [1,3,5,7]
    The expected output would be something like (2, 1), as y = x*2+1

    (An image sometimes says more than 1000 words, so to make myself clear:
    this is what I want to do:
    http://www.statistics4u.info/fundstat_eng/cc_regression.html)

    So, how am I to fill these matrices?

    (As a matter of fact, I already wrote the whole thing in Python in
    about 9 lines of code, but I'm pretty sure this should have been
    possible using NumPy)
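
    (A hand-rolled version of that kind might look like the sketch below;
    fit_line is a hypothetical helper written for illustration, not a
    reconstruction of the poster's actual code and not a NumPy function:)

    ```python
    def fit_line(x, y):
        # Closed-form LMSE fit of y = m*x + b using the textbook formulas
        n = len(x)
        sx, sy = sum(x), sum(y)
        sxx = sum(v * v for v in x)
        sxy = sum(a * b for a, b in zip(x, y))
        m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
        b = (sy - m * sx) / n
        return m, b
    ```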
    nikie, Mar 17, 2006
    #4
  5. nikie

    Robert Kern Guest

    nikie wrote:
    > I still don't get it...
    > My data looks like this:
    > x = [0,1,2,3]
    > y = [1,3,5,7]
    > The expected output would be something like (2, 1), as y = x*2+1
    >
    > (An image sometimes says more than 1000 words, so to make myself clear:
    > this is what I want to do:
    > http://www.statistics4u.info/fundstat_eng/cc_regression.html)
    >
    > So, how am I to fill these matrices?


    As the docstring says, the problem it solves is min ||A*x - b||_2. In order to
    get it to solve your problem, you need to cast it into this matrix form. This is
    out of scope for the docstring, but most introductory statistics or linear
    algebra texts will cover this.

    In [201]: x = array([0., 1, 2, 3])

    In [202]: y = array([1., 3, 5, 7])

    In [203]: A = ones((len(y), 2), dtype=float)

    In [204]: A[:,0] = x

    In [205]: from numpy import linalg

    In [206]: linalg.lstsq(A, y)
    Out[206]:
    (array([ 2., 1.]),
    array([ 1.64987674e-30]),
    2,
    array([ 4.10003045, 1.09075677]))

    --
    Robert Kern


    "I have come to believe that the whole world is an enigma, a harmless enigma
    that is made terrible by our own mad attempt to interpret it as though it had
    an underlying truth."
    -- Umberto Eco
    Robert Kern, Mar 17, 2006
    #5
  6. nikie

    Matt Crema Guest

    Robert Kern wrote:
    > <SNIP quoted question and linalg.lstsq solution>


    I'm new to numpy myself.

    The above posters are correct to say that the problem must be cast into
    matrix form. However, as this is such a common technique, don't most
    math/stats packages do it behind the scenes?

    For example, in Matlab or Octave I could type:
    polyfit(x,y,1)

    and I'd get the answer with shorter, more readable code. A one-liner!
    Is there a 'canned' routine to do it in numpy?

    btw, I am not advocating that one should not understand the concepts
    behind a 'canned' routine. If you do not understand this concept you
    should take <Robert Kern>'s advice and dive into a linear algebra book.
    It's not very difficult, and it is essential that a scientific
    programmer understand it.

    -Matt
    Matt Crema, Mar 18, 2006
    #6
  7. nikie

    Matt Crema Guest

    Matt Crema wrote:
    > <SNIP full quote of previous post asking for a polyfit-style one-liner>

    Hi again,

    I guess I should have looked first ;)

    m,b = numpy.polyfit(x,y,1)
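
    (With the data from earlier in the thread, that one-liner gives the
    expected slope and intercept; note polyfit returns the highest-order
    coefficient first:)

    ```python
    import numpy as np

    x = [0, 1, 2, 3]
    y = [1, 3, 5, 7]

    # Degree-1 fit: returns (slope, intercept) since highest order comes first
    m, b = np.polyfit(x, y, 1)
    ```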

    -Matt
    Matt Crema, Mar 18, 2006
    #7
  8. nikie

    nikie Guest

    Thank you!

    THAT's what I've been looking for from the start!
    nikie, Mar 18, 2006
    #8
  9. nikie

    Matt Crema Guest

    nikie wrote:
    >
    > <SNIP Found that polyfit is a useful built-in tool for linear regression>


    Hello,

    I'm glad that helped, but let's not terminate this discussion just yet.
    I am also interested in answers to your second question:

    nikie wrote:

    > "More generally: Is there any kind of documentation that tells me what
    > the functions in NumPy do, and what parameters they expect, how to
    > call them, etc.


    As I said, I'm also new to numpy (only been using it for a week), but my
    first impression is that the built-in documentation is seriously
    lacking. For example, the Mathworks docs absolutely crush numpy's. I
    mean this constructively, and not as a shot at numpy.

    <Robert Kern> gave an excellent answer, but I differ with his one point
    that the docstring for "numpy.linalg.lstsq?" contains an obvious answer
    to the question. Good documentation should be written in much simpler
    terms, and examples of the function's use should be included.

    I wonder if anyone can impart some strategies for quickly solving
    problems like "How do I do a linear fit in numpy?" if, for example, I
    don't know which command to use.

    In Matlab, I would have typed:
    "lookfor fit"
    It would have returned 'polyfit'. Then:
    "help polyfit"

    and this problem would have been solved in under 5 minutes.

    To sum up a wordy post, "What do experienced users find is the most
    efficient way to navigate the numpy docs? (assuming one has already
    read the FAQs and tutorials)"
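
    (Short of a real lookfor, Python's own introspection gets part of the
    way there; a crude sketch, not a claim about any built-in numpy search
    tool:)

    ```python
    import numpy as np

    # Crude stand-in for Matlab's "lookfor fit": grep public numpy names
    matches = [name for name in dir(np) if 'fit' in name.lower()]

    # Rough equivalent of "help polyfit": read the docstring directly
    doc = np.polyfit.__doc__ or ''
    ```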

    Thanks.
    -Matt
    Matt Crema, Mar 18, 2006
    #9
  10. nikie

    Robert Kern Guest

    Matt Crema wrote:

    > To sum up a wordy post, "What do experienced users find is the most
    > efficient way to navigate the numpy docs? (assuming one has already
    > read the FAQs and tutorials)"


    You're not likely to get much of an answer here, but if you ask on
    , you'll get plenty of discussion.

    --
    Robert Kern


    "I have come to believe that the whole world is an enigma, a harmless enigma
    that is made terrible by our own mad attempt to interpret it as though it had
    an underlying truth."
    -- Umberto Eco
    Robert Kern, Mar 18, 2006
    #10
  11. nikie

    Christopher Barker Guest

    Matt Crema wrote:
    > > "More generally: Is there any kind of documentation that tells me what
    > > the functions in NumPy do, and what parameters they expect, how to
    > > call them, etc.


    This is a good start too:

    http://www.tramy.us/guidetoscipy.html

    Yes, you have to pay for it, but the money goes to the guy who has done
    a MASSIVE amount of work to get the new numpy out.

    I would like to see a "Mastering Numpy" book much like the excellent
    "Mastering Matlab", but some one needs to write it!

    -Chris
    Christopher Barker, Mar 20, 2006
    #11
  12. nikie

    nikie Guest

    Although I think it's worth reading, it only covers the fundamental
    structure (what arrays are, what ufuncs are..) of NumPy. Neither of the
    functions discussed in this thread (polyfit/linear_least_squares) is
    mentioned in the file.
    nikie, Mar 23, 2006
    #12
  13. nikie

    Robert Kern Guest

    nikie wrote:
    > Although I think it's worth reading, it only covers the fundamental
    > structure (what arrays are, what ufuncs are..) of NumPy. Neither of the
    > functions discussed in this thread (polyfit/linear_least_squares) is
    > mentioned in the file.


    Both functions are described in the full book. Were you just looking at the
    sample chapter?

    --
    Robert Kern


    "I have come to believe that the whole world is an enigma, a harmless enigma
    that is made terrible by our own mad attempt to interpret it as though it had
    an underlying truth."
    -- Umberto Eco
    Robert Kern, Mar 23, 2006
    #13
  14. nikie

    nikie Guest

    Robert Kern wrote:
    > Both functions are described in the full book. Were you just looking at the
    > sample chapter?


    No, I got the full PDF by mail a few days ago, "numpybook.pdf", 261
    pages (I hope we're talking about the same thing). I entered
    "linear_least_squares" and "polyfit" in Acrobat's "find text" box, but
    neither one could be found.
    nikie, Mar 24, 2006
    #14
  15. nikie

    Robert Kern Guest

    nikie wrote:
    > Robert Kern wrote:
    >
    >>Both functions are described in the full book. Were you just looking at the
    >>sample chapter?

    >
    > No, I've got the full PDF by mail a few days ago, "numpybook.pdf", 261
    > pages (I hope we're talking about the same thing). I entered
    > "linear_least_squares" and "polyfit" in acrobat's "find text" box, but
    > neither one could be found.


    The version I have in front of me is a bit shorter, 252 pages, but describes
    polyfit in section 5.3 on page 91 along with the other polynomial functions.
    lstsq (linear_least_squares is a backwards-compatibility alias that was recently
    moved to numpy.linalg.old) is described in section 10.1 on page 149.

    --
    Robert Kern


    "I have come to believe that the whole world is an enigma, a harmless enigma
    that is made terrible by our own mad attempt to interpret it as though it had
    an underlying truth."
    -- Umberto Eco
    Robert Kern, Mar 24, 2006
    #15
  16. nikie

    nikie Guest

    > The version I have in front of me is a bit shorter, 252 pages, but describes
    > polyfit in section 5.3 on page 91 along with the other polynomial functions.
    > lstsq (linear_least_squares is a backwards-compatibility alias that was recently
    > moved to numpy.linalg.old) is described in section 10.1 on page 149.


    Oops, sorry, shouldn't have posted before reading the whole document...
    You are right, of course, both functions are explained.
    I wonder why Acrobat's search function doesn't work, though.
    nikie, Mar 24, 2006
    #16
