how to put constraints on coefficients obtained from Regression.pm

pj

Hi,

I use the regression package to do a multivariate linear regression fit
to my data. Regression.pm generates coefficient values (thetas), but I
want to make them always positive. How do I put constraints (all
coefficients are positive) when using the regression package? Any ideas?
Thanks
 
ctcgag

Hi,

pj said:
I use the regression package to do a multivariate linear regression fit
to my data. Regression.pm generates coefficient values (thetas), but I
want to make them always positive. How do I put constraints (all
coefficients are positive) when using the regression package? Any ideas?

I can't find a top-level module named Regression, so I don't know if it has
any specific capabilities to implement constraints. If not, you could just
iteratively remove any variables that come back with negative coefficients
and rerun the regression.
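
Assuming the module you mean is Statistics::Regression from CPAN (a
guess on my part), a rough, untested sketch of that drop-and-refit idea
might look like the following. The data, variable names, and the fit()
helper are all invented for illustration, and theta() is assumed to
return the fitted coefficients as a list (check the module's docs):

#!/usr/bin/perl
use strict;
use warnings;
use Statistics::Regression;

# Refit y on the named columns of each row; return name => coefficient.
sub fit {
    my ($names, $ys, $rows) = @_;
    my $reg = Statistics::Regression->new( "constrained fit", [ @$names ] );
    for my $i (0 .. $#$ys) {
        $reg->include( $ys->[$i], [ map { $rows->[$i]{$_} } @$names ] );
    }
    my %coef;
    @coef{ @$names } = $reg->theta();   # assumed to return a list of thetas
    return %coef;
}

# Toy data, invented for illustration: an intercept and three predictors,
# with x2 built to have a genuinely negative effect.
my @names = ("const", "x1", "x2", "x3");
my @rows  = map {
    { const => 1.0, x1 => rand(10), x2 => rand(10), x3 => rand(10) }
} 1 .. 20;
my @ys = map { 2 + 3 * $_->{x1} - 0.5 * $_->{x2} + rand() } @rows;

# Drop any predictor (never the intercept) whose coefficient comes back
# negative, refit, and repeat until everything left is non-negative.
my @keep = @names;
my %coef;
while (1) {
    %coef = fit( \@keep, \@ys, \@rows );
    my @bad = grep { $_ ne "const" && $coef{$_} < 0 } @keep;
    last unless @bad;
    my %drop = map { $_ => 1 } @bad;
    @keep = grep { !$drop{$_} } @keep;
}

printf "%-6s %10.4f\n", $_, $coef{$_} for @keep;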

I'm not sure that this is guaranteed to lead to a globally optimal
solution, but if you are that particular about your statistical rigour,
you should probably be using a commercial regression application
anyway.

Xho
 
David K. Wall

pj said:
I use the regression package to do a multivariate linear regression
fit to my data. Regression.pm generates coefficient values
(thetas), but I want to make them always positive. How do I put
constraints (all coefficients are positive) when using the regression
package? Any ideas? Thanks

Huh? That's really strange. Regression coefficients sometimes just ARE
negative because that's the way the data is. You can transform the data
so that the coefficients are more meaningful, e.g., by taking the log or
something, but when you fit a model to your data you have to accept
what the data says, not what you want it to be. That is, if you're
honest.
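
For instance, if the package in question is Statistics::Regression from
CPAN (just a guess), a rough sketch of regressing log(y) instead of y,
on made-up data, might look like this:

use strict;
use warnings;
use Statistics::Regression;

# Invented toy data: positive responses and two predictors.
my @data = (
    [  1.2, 3.0, 0.5 ],
    [  2.5, 4.1, 1.0 ],
    [  4.8, 5.2, 1.4 ],
    [  9.1, 6.0, 2.2 ],
    [ 20.3, 7.3, 3.1 ],
);

# Regress log(y) on the predictors instead of y itself.
my $reg = Statistics::Regression->new( "log-y fit", [ "const", "x1", "x2" ] );
for my $row (@data) {
    my ($y, $x1, $x2) = @$row;
    $reg->include( log($y), [ 1.0, $x1, $x2 ] );
}
my ($b0, $b1, $b2) = $reg->theta();   # assumed to return a list of coefficients
printf "const %.4f  x1 %.4f  x2 %.4f\n", $b0, $b1, $b2;

The coefficients then describe multiplicative effects on y rather than
raw differences, which may be closer to what you actually want to
interpret, but they can still come out negative.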

Ask in a statistics group; I think they'll tell you the same thing. For
what it's worth, my job title is statistician (although I tend to do
more programming these days) and I've never heard of trying to force
all regression coefficients to be positive. The PhD statistician across
the hall (I only have a Master's in stat) hadn't either. He said it
sounded like something a psychologist would try to do. :)
 
A. Sinan Unur

David K. Wall said:
Huh? That's really strange. Regression coefficients sometimes just ARE
negative because that's the way the data is. You can transform the data
so that the coefficients are more meaningful, e.g., by taking the log or
something, but when you fit a model to your data you have to accept
what the data says, not what you want it to be. That is, if you're
honest.

Ask in a statistics group; I think they'll tell you the same thing. For
what it's worth, my job title is statistician (although I tend to do
more programming these days) and I've never heard of trying to force
all regression coefficients to be positive. The PhD statistician across
the hall (I only have a Master's in stat) hadn't either. He said it
sounded like something a psychologist would try to do. :)

Well, there are sometimes perfectly valid reasons for calculating
restricted regressions, mostly in the realm of Likelihood Ratio tests.

I know, I know, getting off-topic here.

Sinan.
 
