David Cournapeau said:
It is not stupid; it makes a lot of sense when you know the
distributions in question. It means you have consistent behavior
independently of the language. So of course if you don't care about
the rest of the ecosystem, you will think it is useless overhead.
The point is that the distro doesn't care about the Python
ecosystem. Which is what I care about, as do a lot of people who want
to ship software.
Don't get me wrong: I've been a Linux user for well over a decade, and I
appreciate package management for providing a consistent distribution.
What I'm talking about here are 3rd-party
developers/companies/whatever, and people who want to install software
that requires recent versions of packages *not* provided by their
distros. I should have made that point clearer, I guess.
Also, I doubt that the issue is python vs. python-dev - of course,
given that the exact issues are not explained, we can only play
guessing games.
The problems described are simply outdated and crippled Python
versions.
And to me, a Python installation that lacks the
distutils module is *crippled*. You can rationalize that as much as you
want through some packaging philosophy that says "we don't ship
development-related files", but to me, a simple installation instruction
that says "run 'python setup.py install'"
and then fails because of such a (debatable) decision sucks. Yes, there
are corner cases where you need GCC to compile an extension. But that's
still catering to the 80% or even more (I'm guessing here) of packages
that are pure Python.
Of course, in an ideal world, distutils would hook into the distro's
dependency system and simply say "please install python-dev first".
But I'm not convinced that putting the weight on the shoulders of
the Python community to deal with the arbitrary decisions of the dozens
of distros and packaging schemes out there is possible - or helpful.
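To illustrate what that ideal could look like in practice - a minimal sketch only, with function names and messages that are mine, and "python-dev" used as the Debian-style package name - an installer could at least detect the missing module up front instead of dying with an obscure ImportError:

```python
def have_distutils():
    """Return True if the distutils module is importable.

    On some distros the system Python ships without it, so a plain
    'python setup.py install' fails in a confusing way.
    """
    try:
        import distutils.core  # the part setup.py scripts rely on
        return True
    except ImportError:
        return False

def install_hint():
    """Return a hint for the user when distutils is missing, else ''."""
    if have_distutils():
        return ""
    # "python-dev" is the Debian package name; other distros differ.
    return "distutils is missing - please install python-dev first"

print(install_hint() or "distutils found, proceeding")
```

The point is not this particular check, but that the error message would name the distro package to install rather than leaving the user with a traceback.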
Apple's Python has caused more issues for NumPy and SciPy than all
distributions together, at least as far as Python itself is
concerned. It is very confusing for many end-users.
Exactly. My point is that I can safely install a second version beside
it, and leave the system's Python alone - the one that is there, and
kept stable, for the system's own needs.
This kind of thinking mostly shows a poor understanding of complex
deployment scenarios. If everybody worked like that, you would quickly
be unable to build anything stable. True, conflicts are sometimes
unavoidable, but if every library keeps changing and your only solution
is isolated environments, you quickly end up with a mess of a system
where many combinations of libraries are not possible.
And now you are already in a mess of a system where many combinations of
libraries are not possible. And the thinking of "one version set to rule
them all" shows a poor understanding of legacy code's specific version
requirements, as well as of the need for fluctuating (sometimes
cutting-edge) libraries.
Does that suck? Sure. But the answer can't be a system that locks you in
with *outdated* software for *years*. And it can't be Python's, or its
community's, burden to provide the solutions for the mass of distro providers.
I *am* in the situation of needing to deploy
a TurboGears2 application every day on a Debian Lenny machine. Guess
what that means:
root@web01 / 15:55:43 # aptitude install python-turbogears -Vs
Reading package lists... Done
Building dependency tree
Reading state information... Done
Reading extended state information
Initializing package states... Done
Reading task descriptions... Done
The following NEW packages will be installed:
python-cheetah{a} [2.0.1-2] python-cherrypy{a} [2.3.0-1] python-configobj{a} [4.5.2-1] python-crypto{a} [2.0.1+dfsg1-2.3+lenny0] python-decoratortools{a} [1.7-1]
python-dispatch{a} [0.5a.svn20080510-1] python-dns{a} [2.3.3-2] python-elementtree{a} [1.2.6-12] python-elixir{a} [0.6.0-1] python-flup{a} [1.0-1] python-formencode{a} [1.0.1-1]
python-kid{a} [0.9.6-1] python-mysqldb{a} [1.2.2-7] python-nose{a} [0.10.3-1] python-openid{a} [2.2.1-2] python-openssl{a} [0.7-2] python-paste{a} [1.7.1-1] python-pastedeploy{a} [1.3.2-1]
python-pastescript{a} [1.6.3-1] python-pkg-resources{a} [0.6c8-4] python-protocols{a} [1.0a.svn20070625-2] python-pysqlite2{a} [2.4.1-1] python-scgi{a} [1.12-0.2]
python-setuptools{a} [0.6c8-4] python-simplejson{a} [1.9.2-1] python-sqlalchemy{a} [0.4.7p1-2] python-sqlobject{a} [0.10.2-3] python-turbogears [1.0.4.4-1] python-turbojson{a} [1.1.2-1]
python-turbokid{a} [1.0.4-2] python-webpy{a} [0.230-1]
0 packages upgraded, 31 newly installed, 0 to remove and 0 not upgraded.
Need to get 4936kB/5188kB of archives. After unpacking 29.3MB will be used.
Turbogears 1.0.4? Thank you - can't use that.
The fact is that if you need to assemble software from many sources,
you need proper engineering. Isolated environments do not help you
much with that. Maybe you should consider that distros have
experience that you don't.
I don't deny them their experience. But do you deny the experience of
other people with *other* needs? As I already said: I don't propose to
ditch package management. I'm all fine with a distro that carefully
selects its packages and dependencies.
I'm proposing that a machine should be able to have a "system_python"
and a "current_python" (or whatever you call them), and then let
them live as happily co-existing ecosystems.
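To make "happily co-existing" concrete from the application's side: the app targets "current_python" and simply refuses to run under an interpreter that is too old. This is only a sketch of mine - the required version and the function name are illustrative, not part of any distro tooling:

```python
import sys

# Minimum (major, minor) version the application needs; purely illustrative.
REQUIRED = (2, 6)

def acceptable_interpreter(version=None, required=REQUIRED):
    """Return True if the given (major, minor) tuple meets the app's needs."""
    if version is None:
        version = sys.version_info[:2]
    return tuple(version) >= tuple(required)

if not acceptable_interpreter():
    # Fail loudly instead of half-working under the system Python.
    sys.exit("Please run this app with its own 'current_python', "
             "not the system Python at %s" % sys.executable)
print("interpreter OK: %s" % sys.executable)
```

With such a guard, the stable system Python and the app's own newer Python can sit side by side without either stepping on the other.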
This is not a well-planned-out thing, admittedly. And it would certainly
help to have a central meta-information registry that knows exactly
*where* and *which* version of every dependent library is installed, and
thus makes it possible to say "OK, we upgrade all system copies of
OpenSSL, and by the way, here are the additional locations where you
should do that as well." Or something like this.
But the current state of affairs forces me to use customized versions of
Python on my machines, with virtualenvs containing specific versions of
libraries, bypassing the distro anyway.
Why not try to make this more standardized, and thus manageable?
Diez