Python in Linux - barrier to Python 3.x

Ant

Hi all,

I've just seen this: http://sheddingbikes.com/posts/1285063820.html

Whatever you think of Zed Shaw (author of the Mongrel Ruby server and
relatively recent Python convert), he has a very good point in this. I
run Fedora 12 on my home computers, and find it far too much hassle to
try to get Python 3 installed. Even the 2.x's are behind - IIRC
it currently uses 2.5.

So I really think this is a barrier to entry to Python 3 that we could
do without - it's the only reason I do all of my Python work in 2.x, I
would jump at migrating to Python 3 if it was easily available on
Fedora.

Is there a solution to this that anyone knows of? Has Zed jumped to
conclusions? Have I?
 
Peter Otten

Ant said:
I've just seen this: http://sheddingbikes.com/posts/1285063820.html

Whatever you think of Zed Shaw (author of the Mongrel Ruby server and
relatively recent Python convert), he has a very good point in this. I
run Fedora 12 on my home computers, and find it far too much hassle to
try to get Python 3 installed. Even the 2.x's are behind - IIRC
it currently uses 2.5.

Once you remove the Zedshawisms, the article's claims boil down to:

- If you want to install and run a python script on a wide range of Linux
distributions you have to stay compatible with Python 2.4.

- Users of languages competing with Python tend to avoid applications that
use Python, even if that usage is mostly under the hood -- but they don't
mind using a program written in C.
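
Peter's first point - staying compatible with Python 2.4 - mostly means feature-detecting newer builtins instead of assuming them. A minimal illustrative sketch (the fallback is for any(), which only appeared in Python 2.5):

```python
import sys

# Refuse to run on interpreters older than the floor we target.
if sys.version_info < (2, 4):
    raise SystemExit("Python 2.4 or later required")

# any() was added in Python 2.5, so feature-detect it and fall back
# to a pure-Python version on 2.4.
try:
    any
except NameError:
    def any(iterable):
        for element in iterable:
            if element:
                return True
        return False

print(any([0, "", None, "x"]))  # True on 2.4 and on newer Pythons alike
```

The same pattern (try/except NameError or ImportError) covers most of the 2.4-to-2.5 gap without forking the codebase.
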
So I really think this is a barrier to entry to Python 3 that we could
do without - it's the only reason I do all of my Python work in 2.x, I
would jump at migrating to Python 3 if it was easily available on
Fedora.

Is there a solution to this that anyone knows of?

More practical people than Zed (or his online persona) don't see it as a
problem.
Has Zed jumped to conclusions?

He jumped indeed, to another language, and he will do it again, but not
without telling the world.

I think migration from 2.x to 3.x will be hard for large infrastructures but
that doesn't seem to be your concern.

Peter

PS: Is the Paul Graham quote real?
 
Philip Semanchuk

Hi all,

I've just seen this: http://sheddingbikes.com/posts/1285063820.html

Whatever you think of Zed Shaw (author of the Mongrel Ruby server and
relatively recent Python convert), he has a very good point in this. I
run Fedora 12 on my home computers, and find it far too much hassle to
try to get Python 3 installed. Even the 2.x's are behind - IIRC
it currently uses 2.5.

Don't know about Python 3 on Fedora (I use a Mac), but distrowatch.com reports that Fedora has been using Python >= 2.6 since Fedora 11, which was released in June of 2009.

http://distrowatch.com/table.php?distribution=fedora


bye
Philip
 
 
Diez B. Roggisch

Ant said:
Hi all,

I've just seen this: http://sheddingbikes.com/posts/1285063820.html

Whatever you think of Zed Shaw (author of the Mongrel Ruby server and
relatively recent Python convert), he has a very good point in this. I
run Fedora 12 on my home computers, and find it far too much hassle to
try to get Python 3 installed. Even the 2.x's are behind - IIRC
it currently uses 2.5.

So I really think this is a barrier to entry to Python 3 that we could
do without - it's the only reason I do all of my Python work in 2.x, I
would jump at migrating to Python 3 if it was easily available on
Fedora.

Is there a solution to this that anyone knows of? Has Zed jumped to
conclusions? Have I?

I think he has a very valid point. I've been arguing quite a few times
here that e.g. the stupid splitting up of python and python-dev packages
that a great deal of people trip over should go away.

But usually people here seem to think that other package management
systems are the way to go, and python itself must integrate with
them. E.g. providing dependency information compatible with them and their policies.

I think that's bonkers. You can't support every new kid on the block
claiming to be the shizzle in package management. Or the next distro
with its own packaging policies. And of course the overall release
planning that says "we use that ancient stable version not supported for
years anymore, because it's tried & tested for us".

IMHO the solution to this is the way Apple does it: they have a System
Python. Don't mess with it. Seriously. Don't.

But you can install as many other Python versions as you want, or even bundle one
with your own app that depends on it.

People object to this usually for two reasons:

- additional waste of disk-space. Seriously? A thorough visit of
youporn.com most probably fills your browser cache with more data
than all possible python installations ever can.

- security issues through aged libraries. Certainly a valid point, but
then this problem is not limited to Python and needs a more universal
solution: Meta-information gathering about binary versions of
libraries, and (safe) upgrades for these. Maybe. I haven't given much
thought to this, but I think it's an OS thing more than a package
distro thing.

So, in summary, I think if anything, Python should liberate itself from
the reins of distro package management, and fix whatever issues there
are with setuptools (or distutils or pip or distribute or whatever the
cool kids use these days). And then make people use that to work with
Python-packages, potentially even in individual, isolated VirtualEnvs
because of package version conflicts.

Diez
 
Michele Simionato

Hi all,

I've just seen this:http://sheddingbikes.com/posts/1285063820.html

Whatever you think of Zed Shaw (author of the Mongrel Ruby server and
relatively recent Python convert), he has a very good point in this. I
run Fedora 12 on my home computers, and find it far too much hassle to
try to get Python 3 installed. Even the 2.x's are behind - IIRC
it currently uses 2.5.

So I really think this is a barrier to entry to Python 3 that we could
do without - it's the only reason I do all of my Python work in 2.x, I
would jump at migrating to Python 3 if it was easily available on
Fedora.

Is there a solution to this that anyone knows of? Has Zed jumped to
conclusions? Have I?

Zed's approach (removing Python when he could just have downgraded to
Python 2.4) does not look very smart to me. The post itself is pretty
much bullshit. Yes, there are Linux distributions with old Python
versions out there. Yes, if you don't want to install a newer Python
on such distributions you need to take this fact into account and not
use modern features of Python. But the situation is no different for
other languages such as Perl or Ruby. C is free from this problem
because it is a very old and stable language. There is no more content
in that post and everybody should already know such basic facts.


Michele Simionato
 
D'Arcy J.M. Cain

On Tue, 21 Sep 2010 15:23:42 +0200
So, in summary, I think if anything, Python should liberate itself from
the reins of distro package management, and fix whatever issues there
are with setuptools (or distutils or pip or distribute or whatever the
cool kids use these days). And then make people use that to work with
Python-packages, potentially even in individual, isolated VirtualEnvs
because of package version conflicts.

Install NetBSD package tools (http://pkgsrc.org/) and install as many
versions as you like. A symlink takes care of your default and you can
use .pth files to have separate local libraries. Pkgsrc is designed to
run on all OSes, including Linux distros.
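
For readers who haven't met the .pth mechanism D'Arcy mentions: every line in a *.pth file inside a site directory names a path to append to sys.path. A throwaway, self-contained sketch (directory and module names are invented):

```python
import os
import site
import sys
import tempfile

# Build a throwaway site directory containing a .pth file that points
# at a second directory holding a module.
base = tempfile.mkdtemp()
libdir = os.path.join(base, "mylibs")
os.mkdir(libdir)

# The module our .pth entry will make importable.
with open(os.path.join(libdir, "greeting.py"), "w") as f:
    f.write("MESSAGE = 'hello from a .pth-managed path'\n")

# Each line of a .pth file names a directory to append to sys.path.
with open(os.path.join(base, "extra.pth"), "w") as f:
    f.write(libdir + "\n")

# Processing the site dir reads extra.pth and extends sys.path.
site.addsitedir(base)

import greeting
print(greeting.MESSAGE)  # hello from a .pth-managed path
```

This is exactly what package tools exploit to keep separate local libraries without symlink farms.
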
 
David Cournapeau

I think he has a very valid point. I've been arguing quite a few times
here that e.g. the stupid splitting up of python and python-dev packages
that a great deal of people trip over should go away.

It is not stupid, it makes a lot of sense when you know the
distributions in question. It means you have a consistent behavior
independently of the language. So of course if you don't care about
the rest of the ecosystem, you will think it is useless overhead.

Also, I doubt that the issue is python vs python-dev - of course,
given that the exact issues are not explained, we can only play guess
games.
But usually people here seem to think that other package management
systems are the way to go, and python itself must integrate with
them. E.g. providing dependency information compatible with them and their policies.

I think that's bonkers. You can't support every new kid on the block
claiming to be the shizzle in package management. Or the next distro
with its own packaging policies. And of course the overall release
planning that says "we use that ancient stable version not supported for
years anymore, because it's tried & tested for us".

IMHO the solution to this is the way Apple does it: they have a System
Python. Don't mess with it. Seriously. Don't.

Apple's python has caused more issues than all distributions
put together for Numpy and scipy, at least as far as python itself is
concerned. It is very confusing for many end-users.
So, in summary, I think if anything, Python should liberate itself from
the reins of distro package management, and fix whatever issues there
are with setuptools (or distutils or pip or distribute or whatever the
cool kids use these days). And then make people use that to work with
Python-packages, potentially even in individual, isolated VirtualEnvs
because of package version conflicts.

This kind of thinking mostly shows a poor understanding of complex
deployment scenarios. If everybody worked like that, you would quickly
be unable to build anything stable. True, conflicts are sometimes
unavoidable, but if, whenever a library changes, your only solution is
isolated environments, you quickly have a mess of a system where many
combinations of libraries are not possible.

The fact is that if you need to assemble software from many sources,
you need proper engineering. Isolated environments do not help you
much with that. Maybe you should consider that distros have an
experience that you don't,

cheers,

David
 
Neal Becker

Ant said:
Hi all,

I've just seen this: http://sheddingbikes.com/posts/1285063820.html

Whatever you think of Zed Shaw (author of the Mongrel Ruby server and
relatively recent Python convert), he has a very good point in this. I
run Fedora 12 on my home computers, and find it far too much hassle to
try to get Python 3 installed. Even the 2.x's are behind - IIRC
it currently uses 2.5.

So I really think this is a barrier to entry to Python 3 that we could
do without - it's the only reason I do all of my Python work in 2.x, I
would jump at migrating to Python 3 if it was easily available on
Fedora.

Is there a solution to this that anyone knows of? Has Zed jumped to
conclusions? Have I?

Current fedora release (13) has python3 available.
 
Diez B. Roggisch

David Cournapeau said:
It is not stupid, it makes a lot of sense when you know the
distributions in question. It means you have a consistent behavior
independently of the language. So of course if you don't care about
the rest of the ecosystem, you will think it is useless overhead.

The point is that the distro doesn't care about the python
ecosystem. Which is what I care about, as do a lot of people who want
to ship software.

Don't get me wrong: I'm a Linux user for way over a decade, I enjoy the
package management in providing a consistent distribution.

What I'm talking about here are 3rd-party
developers/companies/whatever, and people who want to install software
that requires recent versions of packages *not* provided by their
distros. I should have made that point clearer I guess.
Also, I doubt that the issue is python vs python-dev - of course,
given that the exact issues are not explained, we can only play guess
games.

The problems explained are simply outdated and crippled python
versions.

And to me, a python version installed that does not have the
distutils module is *crippled*. You can rationalize that as much as you
want through some package philosophy saying "we don't ship development
related files", but to me a simple installation instruction that says

"run 'python setup.py install'"

which fails because of such a (debatable) decision sucks. Yes, there are
corner-cases when you need GCC to compile an extension. But that's still
catering to the 80% or even more (I'm guessing here) of pure-python packages.

Of course, in an ideal world, distutils would hook into the distro's
dependency system and simply say "please install python-dev first".
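
The hook Diez wishes for can at least be approximated from the setup script's side. A hedged sketch (the message text and distro package names are assumptions, and note that very recent Pythons have removed distutils entirely, so the check is written to degrade gracefully):

```python
import sys

# Preflight check a setup script could run: detect a python that has
# been stripped of distutils and point the user at the missing distro
# package instead of dying with a bare ImportError traceback.
try:
    import distutils.core  # split into python-dev/python-devel on some distros
    HAVE_DISTUTILS = True
except ImportError:
    HAVE_DISTUTILS = False

if not HAVE_DISTUTILS:
    sys.stderr.write("distutils is missing - install your distro's "
                     "python-dev (or python-devel) package first\n")

print("distutils present: %s" % HAVE_DISTUTILS)
```
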

But I'm not convinced that putting the weight here on the shoulders of
the python-community to deal with arbitrary decisions of the dozen or
however many distros and packaging schemes out there is possible - or helpful.
Apple's python has caused more issues than all distributions
put together for Numpy and scipy, at least as far as python itself is
concerned. It is very confusing for many end-users.

Exactly. My point is that I can safely install a second version beside
it, and not touch the system's python, which is there and kept stable
for the system's own purposes.
This kind of thinking mostly shows a poor understanding of complex
deployment scenario. If everybody worked like that, you would quickly
be unable to build anything stable. True, conflicts are sometimes
unavoidable, but if every library keeps changing your only solution is
isolated environments, you quickly have a mess of a system where many
combinations of libraries are not possible.

And now you are already in a mess of a system where many combinations of
libraries are not possible. And the thinking of "one version set to rule
them all" shows a poor understanding of the need for legacy code's
specific version requirements, as well as the need for fluctuating
(sometimes cutting-edge) libraries.

Does that suck? Sure. But the answer can't be a system that ties you in
with *outdated* software for *years*. And it can't be Python's or its
community's burden to provide the solutions for the mass of distro providers.

I *am* in the situation of needing to deploy
a TurboGears2 application every day on a debian lenny machine. Guess
what that means:

root@web01 / 15:55:43 # aptitude install python-turbogears -Vs
Reading package lists... Done
Building dependency tree
Reading state information... Done
Reading extended state information
Initializing package states... Done
Reading task descriptions... Done
The following NEW packages will be installed:
python-cheetah{a} [2.0.1-2] python-cherrypy{a} [2.3.0-1] python-configobj{a} [4.5.2-1] python-crypto{a} [2.0.1+dfsg1-2.3+lenny0] python-decoratortools{a} [1.7-1]
python-dispatch{a} [0.5a.svn20080510-1] python-dns{a} [2.3.3-2] python-elementtree{a} [1.2.6-12] python-elixir{a} [0.6.0-1] python-flup{a} [1.0-1] python-formencode{a} [1.0.1-1]
python-kid{a} [0.9.6-1] python-mysqldb{a} [1.2.2-7] python-nose{a} [0.10.3-1] python-openid{a} [2.2.1-2] python-openssl{a} [0.7-2] python-paste{a} [1.7.1-1] python-pastedeploy{a} [1.3.2-1]
python-pastescript{a} [1.6.3-1] python-pkg-resources{a} [0.6c8-4] python-protocols{a} [1.0a.svn20070625-2] python-pysqlite2{a} [2.4.1-1] python-scgi{a} [1.12-0.2]
python-setuptools{a} [0.6c8-4] python-simplejson{a} [1.9.2-1] python-sqlalchemy{a} [0.4.7p1-2] python-sqlobject{a} [0.10.2-3] python-turbogears [1.0.4.4-1] python-turbojson{a} [1.1.2-1]
python-turbokid{a} [1.0.4-2] python-webpy{a} [0.230-1]
0 packages upgraded, 31 newly installed, 0 to remove and 0 not upgraded.
Need to get 4936kB/5188kB of archives. After unpacking 29.3MB will be used.

Turbogears 1.0.4? Thank you - can't use that.
The fact is that if you need to assemble softwares from many sources,
you need proper engineering. Isolated environments do not help you
much with that. Maybe you should consider that distros have an
experience that you don't,

I don't deny them their experience. Do you deny the experience of other
people with *other* needs? As I already said: I don't propose to ditch
the package management. I'm all fine with a distro that carefully
selects its packages and dependencies.

I'm proposing that it should be able to have "system_python" and
"current_python" or whatever you call it on one machine, and then let
them live as happy co-existing eco-systems.

This is not a fully planned-out thing yet. And it would certainly help to
have a central meta-information registry to know exactly *where* and
*which* version of all dependent libraries is installed. And thus make
it possible to say "ok, we upgrade all system copies of openssl, and
btw. you have these additional locations where you should do that as
well." Or something like this.

But the current state of affairs forces me to use customized versions of
python on my machines, with virtualenvs including specific versions of
libraries, bypassing the distro anyway.

Why not try and make this more standardized & thus manageable?

Diez
 
Antoine Pitrou

On Tue, 21 Sep 2010 17:59:27 +0200
The problems explained are simply outdated and crippled python
versions.

And to me, a python version installed that has not the
distutils module is *crippled*. You can rationalize that as much as you
want through some package philosophy saying "we don't ship development
related files", but to me a simple installation instruction that says

comp.lang.python doesn't handle Linux packaging, so why don't you
complain to your distro instead? Ranting on this group has zero chance
of fixing the problem.

Thank you

Antoine.
 
David Cournapeau

David Cournapeau said:
It is not stupid, it makes a lot of sense when you know the
distributions in question. It means you have a consistent behavior
independently of the language. So of course if you don't care about
the rest of the ecosystem, you will think it is useless overhead.

The point is that the distro doesn't care about the python
ecosystem. Which is what I care about, as do a lot of people who want
to ship software.

Don't get me wrong: I'm a Linux user for way over a decade, I enjoy the
package management in providing a consistent distribution.

What I'm talking about here are 3rd-party
developers/companies/whatever, and people who want to install software
that requires recent versions of packages *not* provided by their
distros. I should have made that point clearer I guess.
Also, I doubt that the issue is python vs python-dev - of course,
given that the exact issues are not explained, we can only play guess
games.

The problems explained are simply outdated and crippled python
versions.

And to me, a python version installed that does not have the
distutils module is *crippled*. You can rationalize that as much as you
want through some package philosophy saying "we don't ship development
related files", but to me a simple installation instruction that says

"run 'python setup.py install'"

which fails because of such a (debatable) decision sucks. Yes, there are
corner-cases when you need GCC to compile an extension. But that's still
catering to the 80% or even more (I'm guessing here) of pure-python packages.

Of course, in an ideal world, distutils would hook into the distro's
dependency system and simply say "please install python-dev first".

But I'm not convinced that putting the weight here on the shoulders of
the python-community to deal with arbitrary decisions of the dozen or
however many distros and packaging schemes out there is possible - or helpful.
Apple's python has caused more issues than all distributions
put together for Numpy and scipy, at least as far as python itself is
concerned. It is very confusing for many end-users.

Exactly. My point is that I can safely install a second version besides
it, and don't use the system's python that is there and kept stable for
the system's own belongings.
This kind of thinking mostly shows a poor understanding of complex
deployment scenario. If everybody worked like that, you would quickly
be unable to build anything stable. True, conflicts are sometimes
unavoidable, but if every library keeps changing your only solution is
isolated environments, you quickly have a mess of a system where many
combinations of libraries are not possible.

And now you are already in a mess of a system where many combinations of
libraries are not possible. And the thinking of "one version set to rule
them all" shows a poor understanding of the need for legacy code's
specific version requirements, as well as the need for fluctuating,
(sometimes cutting edge) libraries.

Does that suck? Sure. But the answer can't be a system that ties you in
with *outdated* software for *years*. And it can't be Python's or its
community's burden to provide the solutions for the mass of distro providers.

I *am* in the situation to need to deploy
a TurboGears2 application every day on a debian lenny machine. Guess
what that means:

root@web01 / 15:55:43 # aptitude install python-turbogears -Vs
Reading package lists... Done
Building dependency tree
Reading state information... Done
Reading extended state information
Initializing package states... Done
Reading task descriptions... Done
The following NEW packages will be installed:
 python-cheetah{a} [2.0.1-2]  python-cherrypy{a} [2.3.0-1]  python-configobj{a} [4.5.2-1]  python-crypto{a} [2.0.1+dfsg1-2.3+lenny0]  python-decoratortools{a} [1.7-1]
 python-dispatch{a} [0.5a.svn20080510-1]  python-dns{a} [2.3.3-2]  python-elementtree{a} [1.2.6-12]  python-elixir{a} [0.6.0-1]  python-flup{a} [1.0-1]  python-formencode{a} [1.0.1-1]
 python-kid{a} [0.9.6-1]  python-mysqldb{a} [1.2.2-7]  python-nose{a} [0.10.3-1]  python-openid{a} [2.2.1-2]  python-openssl{a} [0.7-2]  python-paste{a} [1.7.1-1]  python-pastedeploy{a} [1.3.2-1]
 python-pastescript{a} [1.6.3-1]  python-pkg-resources{a} [0.6c8-4]  python-protocols{a} [1.0a.svn20070625-2]  python-pysqlite2{a} [2.4.1-1]  python-scgi{a} [1.12-0.2]
 python-setuptools{a} [0.6c8-4]  python-simplejson{a} [1.9.2-1]  python-sqlalchemy{a} [0.4.7p1-2]  python-sqlobject{a} [0.10.2-3]  python-turbogears [1.0.4.4-1]  python-turbojson{a} [1.1.2-1]
 python-turbokid{a} [1.0.4-2]  python-webpy{a} [0.230-1]
0 packages upgraded, 31 newly installed, 0 to remove and 0 not upgraded.
Need to get 4936kB/5188kB of archives. After unpacking 29.3MB will be used.

Turbogears 1.0.4? Thank you - can't use that.
The fact is that if you need to assemble software from many sources,
you need proper engineering. Isolated environments do not help you
much with that. Maybe you should consider that distros have an
experience that you don't,

I don't deny them their experience. Do you deny the experience of other
people with *other* needs? As I already said: I don't propose to ditch
the package management. I'm all fine with a distro that carefully
selects its packages and dependencies.

In your previous email, you were "suggesting" that we should make
people use a specific set of python-specific tools. That does not
sound very consistent with the idea of letting people choose what they
want to use.

FWIW, I think those tools are already pushed too aggressively,
confusing many people who use pip, virtualenv, etc... for dubious
reasons ("I read somewhere that I should use this"), and causing
numerous bug reports on the numpy/scipy mailing lists.

David
 
Diez B. Roggisch

Antoine Pitrou said:
On Tue, 21 Sep 2010 17:59:27 +0200


comp.lang.python doesn't handle Linux packaging, so why don't you
complain to your distro instead? Ranting on this group has zero chance
of fixing the problem.

comp.lang.python frequently deals with problems caused by this and other
distro-related issues. You are welcome not to participate in these
discussions.

The state of affairs isn't ideal, and there are improvement options on
all sides. I'm just astonished that people seem to think that distros in
general are better than what a more python-centric solution could be,
and should be left alone.

Diez
 
Diez B. Roggisch

David Cournapeau said:
In your previous email, you were "suggesting" that we should make
people use a specific set of python-specific tools. That does not
sound very consistent with the idea of letting people choose what they
want to use.

FWIW, I think those tools are already pushed too aggressively,
confusing many people who use pip, virtualenv, etc... for dubious
reasons ("I read somewhere that I should use this"), and causing
numerous bug reports on the numpy/scipy mailing lists.

What I suggested was that there be a python-centric solution for
managing dependencies for users of Linux, Windows and OSX alike. And
which offers recent versions of python to anybody. A lot of wishful
thinking, admittedly. But less than trying to deal with *all* the
differences in code, style and politics of various distributions.

I was not suggesting that this solution itself be manifold. The sad
truth is that there currently seem to be various attempts to improve or
even fix perceived or real shortcomings of distutils or, probably even
more, setuptools, and from an outside perspective this is a waste. But
then, the survival of the fittest, so to speak, requires the death of
some that are unfit. It's hard to say which approach will "win". So we
seem to be stuck with that at least for a while.

zc.buildout, btw, seems to be going in the general direction of doing
a lot (if not everything) itself. Including complete 3rd-party-packages
and their builds.

http://pypi.python.org/pypi/zc.buildout#buildout-examples

For historic reasons I personally haven't used it yet. But it seems to
scratch an itch, don't you think?

Regarding the "dubiousness" of these reasons - I'm happy if you don't
feel the pain. Good for you. I do, and frankly virtualenv is a
life-saver for me in many situations. I wish it was part of core python,
to create isolated environments. It sure is better than the Java-way of
relying on environment-variables or giant sized commandline argument
lists to specify specific version sets.

However, *both* solutions cater to the obvious need for something other
than pre-packaged versions in the distro. Is that such an abnormal wish?
Amazing.

Diez
 
Emile van Sebille

On 9/21/2010 5:29 AM Ant said...
Is there a solution to this that anyone knows of? Has Zed jumped to
conclusions? Have I?

I'd say the error was in selecting something other than the lowest
common subset of python functions when designing and writing a python
version dependent Mongrel2. There are certainly ways (and perhaps
compatibility libraries as well) to write python at a 2.4 (or 2.2) level
to sidestep installation issues related to python versions. Targeting a
wide range of installation platforms without taking that into account is
the problem. When you issue a 2.5 dependent version and find it easier
to convert it all to C rather than write the (relatively minor) python
fixes, it speaks to the level of python competency as well.

What if I wrote a 64bit app and complained that in 32bit environments it
didn't do the right thing?

Emile
 
Ned Deily

[email protected] (Diez B. Roggisch) said:
The point is that the distro doesn't care about the python
ecosystem. Which is what I care about, as do a lot of people who want
to ship software.

I don't think that is totally accurate or fair. There is regular
participation in the python-dev group by packagers from various distros.
For example, Matthias Klose is not only the primary Debian Python
maintainer, he also has commit privileges for Python itself and he
regularly contributes patches. Currently, I see current Python 2.6.6
and 3.1.2 packages in Debian testing with current Python 2.7 and Python
3.2 alpha coming along in Debian experimental.
 
Ant

On Sep 21, 2010, at 8:29 AM, Ant wrote:
Don't know about Python 3 on Fedora (I use a Mac), but distrowatch.org reports that Fedora has been using Python >= 2.6 since Fedora 11 which was released in June of 2009.

Yes, you are right - I've checked on my home machine, and it is indeed
2.6. Still, no Python 3 unless I upgrade to Fedora 13, and upgrading
an OS in order to get the latest version of one package is a bit much!

I know that this is a distribution issue, and not strictly a Python
one, but the issue exists primarily because python is such a
successful language that it has become deeply embedded in linux
distributions, and so there is now a lot of work involved in checking
that a python upgrade doesn't break things.

Some solution involving virtualenv is perhaps a possibility.
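
The virtualenv route Ant alludes to looks roughly like this. A sketch using the stdlib venv module that newer Pythons ship; in 2010 the third-party virtualenv tool filled the same role with near-identical commands (the environment name and package are invented examples):

```shell
# Create an isolated environment with its own site-packages;
# the distro's interpreter and packages stay untouched.
python3 -m venv tg2env

# Activate it: "python" now resolves to tg2env's interpreter.
. tg2env/bin/activate

# Anything installed from here lands under tg2env/, e.g.:
#   pip install TurboGears2
python -c 'import sys; print(sys.prefix)'   # a path inside tg2env

# Drop back to the system environment.
deactivate
```

The distro keeps its stable system python; the application gets whatever versions it needs.
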
 
Diez B. Roggisch

Ned Deily said:
I don't think that is totally accurate or fair. There is regular
participation in the python-dev group by packagers from various distros.
For example, Matthias Klose is not only the primary Debian Python
maintainer, he also has commit privileges for Python itself and he
regularly contributes patches. Currently, I see current Python 2.6.6
and 3.1.2 packages in Debian testing with current Python 2.7 and Python
3.2 alpha coming along in Debian experimental.

I'm sorry, this was worded stronger than appropriate. Let me rephrase:
The distros have their own (perfectly reasonable) agenda. Yet this may
still conflict with the needs of users regarding e.g. contemporary
package availability. I already mentioned in another post that the
current debian stable features TurboGears 1.0.4. Which is by itself a
problem, but also ties a lot of dependencies to "ancient" versions. So
frankly, if I want to run (which in fact I do) a perfectly fine
TurboGears2 system on lenny, I'm *forced* to use virtualenv and
friends.

In other words: I think that the goals of a linux distribution aren't
necessarily the same as those of a python package maintainer. In
an ideal world, they would be congruent. But they aren't. My wish would
be that, until this congruency is achieved (which I fear isn't feasible), a
python-only package management solution can be implemented and
adopted even by the distros without neglecting their own issues.


Diez
 
Ned Deily

[email protected] (Diez B. Roggisch) said:
I'm sorry, this was worded stronger than appropriate. Let me rephrase:
The distros have their own (perfectly reasonable) agenda. Yet this may
still conflict with the needs of users regarding e.g. contemporary
package availability. I already mentioned in another post that the
current debian stable features TurboGears 1.0.4. Which is by itself a
problem, but also ties a lot of dependencies to "ancient" versions. So
frankly, if I want to run (which in fact I do) a perfectly fine
TurboGears2 system on lenny, I'm *forced* to use virtualenv and
friends.

In other words: I think that the goals of a linux distribution aren't
necessarily the same as those of a python package maintainer. In
an ideal world, they would be congruent. But they aren't. My wish would
be that, until this congruency is achieved (which I fear isn't
feasible), a
python-only package management solution can be implemented and
adopted even by the distros without neglecting their own issues.

Thanks for the clarification.

While I too wish such a general python-only package management solution
existed, the big blocker is and will remain the effort required to
manage the package quirks (local patches), package dependencies,
platform differences, unit testing, unit testing across multiple
platforms, system testing (packages with their dependencies - think
Django or Zope), packaging and distribution. In short think of all the
steps and infrastructure that go into, say, the Debian development
process, which - not to slight the others out there - I consider to be
the most mature and rigorous of the large open source integration
projects. While much of it can be (and, to some extent, already is)
automated, there is still a strong human involvement required at nearly
all stages. Repeatable and predictable does not necessarily mean
totally automated or scalable. Getting to where Debian is today has
required an almost superhuman and ongoing effort and dedication over
many years by many people. Having been intimately involved over the
years in various software release processes, most involving hundreds of
people, I remain somewhat awestruck by what they have accomplished. I'm
not optimistic that the resources or leadership are available in the
community to make something similar happen for Python packages. It's an
enormous task.
 
John Nagle

I don't think that is totally accurate or fair. There is regular
participation in the python-dev group by packagers from various distros.
For example, Matthias Klose is not only the primary Debian Python
maintainer, he also has commit privileges for Python itself and he
regularly contributes patches. Currently, I see current Python 2.6.6
and 3.1.2 packages in Debian testing with current Python 2.7 and Python
3.2 alpha coming along in Debian experimental.

Debian seems to have a well worked out Python policy:

http://www.debian.org/doc/packaging-manuals/python-policy/

They address the need to have multiple versions of Python on
the same machine in a reasonably clean way. So do the ActiveState
people, although their way is different from the Debian way.

Trying to make Python play well with distros is probably
more useful than trying to make distros play well with Python.
Rather than fancier "distutils" or "eggs", I'd suggest developing
tools that take in "setup.py" files and make Windows installers,
RPMs, or whatever the platform likes.

John Nagle
 
