Eggs, VirtualEnv, and Apt - best practices?


Scott Sharkey

Hello all,

Our development group at work seems to be heading towards adopting
python as one of our standard "systems languages" for internal
application development (yeah!). One of the issues that's come up is
the problem of apt (deb packages) vs. eggs vs. virtual environments.
We're probably gonna end up using Pylons or TurboGears for web-based
apps, and I've recommended virtualenv, but one of the other developers
has hit some "inconsistencies" when mixing systems with Python installed
from apt (all our servers are Debian- or Ubuntu-based) with systems where
it's installed under virtualenv.

I have basically recommended that we only install the python base (core
language) from apt, and that everything else should be installed into
virtual environments. But I wanted to check how other enterprises
are handling this issue. Are you building Python from scratch, using
specific sets of .deb packages, or following some other process?

Any insight into the best way to have a consistent, repeatable,
controllable development and production environment would be much
appreciated.

Suggestions on build/rollout tools (like zc.buildout, Paver, etc) would
also be appreciated.

Thanks!!!

-Scott
 

Dmitry S. Makovey

Scott said:
Any insight into the best way to have a consistent, repeatable,
controllable development and production environment would be much
appreciated.

you have just described OS package building ;)

I can't speak for everybody, but supporting multiple platforms (PHP, Perl,
Python, Java) we found that the only way to stay consistent is to use the OS's
native packaging tools (in your case apt and .deb), and if you're missing
something, roll your own package. After a while you accumulate plenty of
templates to choose from when you need yet another library that isn't
available upstream in your preferred package format. Remember that some
Python tools might depend on non-Python packages, so the only way to make
sure all of that stays consistent across environments is unified package
management.

Sorry, no specific pointers though, as we're a Red Hat shop and debs are not
our everyday business.
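For the .deb case, the roll-your-own step might look roughly like this (a minimal sketch only; package name, version, paths, and dependencies are all illustrative):

```shell
# Stage a trivial binary package layout by hand
# (real setups usually template this, per the post above).
pkg=python-foo_1.0
mkdir -p "$pkg/DEBIAN" "$pkg/usr/lib/python2.5/site-packages"

# A minimal control file; note the non-Python dependency,
# which is exactly what eggs alone can't express.
cat > "$pkg/DEBIAN/control" <<'EOF'
Package: python-foo
Version: 1.0
Architecture: all
Maintainer: Your Name <you@example.com>
Depends: python, imagemagick
Description: in-house build of the foo library
EOF

# Copy the library's files into usr/lib/... here, then build
# (guarded so the sketch is harmless where dpkg isn't present):
if command -v dpkg-deb >/dev/null; then
    dpkg-deb --build "$pkg"    # produces python-foo_1.0.deb
fi
```

The resulting .deb then flows through the same apt repository as everything else, which is the consistency point being made above.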
 

Fredrik Lundh

Dmitry said:
you have just described OS package building ;)

I can't speak for everybody, but supporting multiple platforms (PHP, Perl,
Python, Java) we found that the only way to stay consistent is to use the OS's
native packaging tools (in your case apt and .deb), and if you're missing
something, roll your own package. After a while you accumulate plenty of
templates to choose from when you need yet another library that isn't
available upstream in your preferred package format. Remember that some
Python tools might depend on non-Python packages, so the only way to make
sure all of that stays consistent across environments is unified package
management.

you're speaking for lots of organizations, at least.

rpm/debs from supplier's repository
subversion (or equivalent) -> locally built rpm/debs
+ organization's favourite deployment tools
---------------------------------
deployed application

</F>
 

Diez B. Roggisch

Scott said:
Hello all,

Our development group at work seems to be heading towards adopting
python as one of our standard "systems languages" for internal
application development (yeah!). One of the issues that's come up is
the problem of apt (deb packages) vs. eggs vs. virtual environments.
We're probably gonna end up using Pylons or TurboGears for web-based
apps, and I've recommended virtualenv, but one of the other developers
has hit some "inconsistencies" when mixing systems with Python installed
from apt (all our servers are Debian- or Ubuntu-based) with systems where
it's installed under virtualenv.
I have basically recommended that we only install the python base (core
language) from apt, and that everything else should be installed into
virtual environments. But I wanted to check how other enterprises
are handling this issue. Are you building Python from scratch, using
specific sets of .deb packages, or following some other process?

Any insight into the best way to have a consistent, repeatable,
controllable development and production environment would be much
appreciated.

This is the exact way we are deploying our software. You can even use
the virtualenv --no-site-packages option to completely isolate the VE
from the underlying system site-packages.

I would recommend that the only thing you install into the system Python
is virtualenv, and maybe some uncritical C modules such as psycopg2.
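A minimal sketch of that scheme in shell (app and path names are illustrative, and `python-virtualenv`/`python-psycopg2` are assumed to be the Debian package names):

```shell
# The system Python stays minimal: base language, virtualenv,
# and uncritical C modules only, all from apt.
sudo apt-get install python python-virtualenv python-psycopg2

# Each application gets its own environment, completely isolated
# from the system site-packages.
virtualenv --no-site-packages /srv/ve/myapp

# Everything else is installed into the VE, not the system.
/srv/ve/myapp/bin/easy_install Pylons

# Running the app through the VE's interpreter picks up only
# the VE's own packages.
/srv/ve/myapp/bin/python myapp.py
```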

Currently there is much going on regarding setuptools. A fork,
"Distribute", has been announced, as well as "pyinstall" by Ian Bicking, an
easy_install replacement that deals with some of its ancestor's
shortcomings.

Then people (shameless plug warning: including me) are working on
"eggbasket", a PyPI clone that lets you keep a local repository of eggs
so that you don't fall prey to old versions no longer available on PyPI.

Eggbasket will feature "easterbunny", a tool to publish a virtualenv as a
whole to the eggbasket and also keep track of the precise version set
uploaded. Through a specific URL on eggbasket you can then limit the
contents of eggbasket to that exact version set - which helps when dealing
with subtle (or not so subtle) version conflicts.

I personally can say that I'm really thrilled by the prospects of all
these developments. And as much of a bad rap as setuptools has gotten here
and elsewhere, sometimes rightfully so - it certainly does a lot of things
right, and pushing the whole stack of tools for managing software
dependencies in Python to the next level is of great value.

Diez
 

Diez B. Roggisch

Dmitry said:
you have just described OS package building ;)

I can't speak for everybody, but supporting multiple platforms (PHP, Perl,
Python, Java) we found that the only way to stay consistent is to use the OS's
native packaging tools (in your case apt and .deb), and if you're missing
something, roll your own package. After a while you accumulate plenty of
templates to choose from when you need yet another library that isn't
available upstream in your preferred package format. Remember that some
Python tools might depend on non-Python packages, so the only way to make
sure all of that stays consistent across environments is unified package
management.

That this is a desirable goal can't be argued against. Yet two big
hurdles make it often impractical to be dogmatic about that:

- different OS. I for one don't know of a package management tool
for Windows. And while our servers use Linux (as do I as a developer),
all the rest of our people use Windows. No use telling them to
apt-get install python-imaging.

- keeping track of recent developments. In the Python web framework
world for example (which the OP seems to be working in), things move
fast. Or extremely slowly, when it comes to releases. Take Django - until
two months ago, there hadn't been a stable release for *years*. Virtually
everybody was working with trunk. And given the rather strict packaging
policies of Debian and friends, you'd be cut off from recent developments
as well as from bugfixes.



Diez
 

Diez B. Roggisch

Nick said:
I'll admit to not knowing what you mean by virtual environment...

virtualenv is a simple tool to create isolated Python environments in which
you can install whatever packages you like without interfering with
other installations, even allowing you to install conflicting package
versions (within each VE, of course, the set must be conflict-free).
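For instance (paths and version numbers are made up for illustration), two apps on one box can each pin a different, conflicting version of the same library:

```shell
# Two isolated environments side by side...
virtualenv /home/dev/ve-app1
virtualenv /home/dev/ve-app2

# ...each with its own, mutually conflicting pin of the same package.
/home/dev/ve-app1/bin/easy_install "SQLObject==0.7.10"
/home/dev/ve-app2/bin/easy_install "SQLObject==0.10.2"

# Each VE's python sees only its own copy; the system python sees neither.
```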

Diez
 

Scott Sharkey

Except that we do need multiple different environments on one server,
and also have issues where our servers may be Windows.
Diez said:
That this is a desirable goal can't be argued against. Yet two big
hurdles make it often impractical to be dogmatic about that:

- different OS. I for one don't know of a package management tool
for Windows. And while our servers use Linux (as do I as a developer),
all the rest of our people use Windows. No use telling them to
apt-get install python-imaging.

Exactly!
- keeping track of recent developments. In the Python web framework
world for example (which the OP seems to be working in), things move
fast. Or extremely slowly, when it comes to releases. Take Django - until
two months ago, there hadn't been a stable release for *years*. Virtually
everybody was working with trunk. And given the rather strict packaging
policies of Debian and friends, you'd be cut off from recent developments
as well as from bugfixes.

Very much the case. Most of Debian's packages for Python seem woefully
out of date. And then we're at the whim of the OS provider as to when
updates happen, rather than having that controlled by our own staff.

I am very interested in the eggbasket project - that's something that's
been needed for a while. And I'm aware of the setuptools fork, and the
discussion on the distutils sig mailing list.

Thanks.
-Scott
 

Dmitry S. Makovey

Diez said:
- different OS. I for one don't know of a package management tool
for Windows. And while our servers use Linux (as do I as a developer),
all the rest of our people use Windows. No use telling them to
apt-get install python-imaging.

that is a very valid point, but it seemed that Scott has a homogeneous
environment (Debian/Ubuntu), so my post was addressed at the original
request. I agree that when you throw Windows/MacOS into the mix, things
become "interesting". But then it's better when your developers develop on
the server/platform they are going to be using, with the same stack they
are going to face in production, etc. It all depends on the requirements
and current practices in the company.
- keeping track of recent developments. In the Python web framework
world for example (which the OP seems to be working in), things move
fast. Or extremely slowly, when it comes to releases. Take Django - until
two months ago, there hadn't been a stable release for *years*. Virtually
everybody was working with trunk. And given the rather strict packaging
policies of Debian and friends, you'd be cut off from recent developments
as well as from bugfixes.

that definitely becomes tricky, however not impossible, to track. You do
need a common snapshot for all developers to use anyway - so why not just
package it up?

Note: I do agree that depending on the environment/development
practices/policies/etc. my statement might become invalid or useless.
However, when you're dealing with a homogeneous environment, or you require
development and testing to be done on your servers running the targeted
application stack, things become much easier to manage :)
 

Diez B. Roggisch

Dmitry said:
that is a very valid point, but it seemed that Scott has a homogeneous
environment (Debian/Ubuntu), so my post was addressed at the original
request. I agree that when you throw Windows/MacOS into the mix, things
become "interesting". But then it's better when your developers develop on
the server/platform they are going to be using, with the same stack they
are going to face in production, etc. It all depends on the requirements
and current practices in the company.

Well, you certainly want a desktop-oriented Linux for users, so you
choose Ubuntu - but then on the server you go with a more stable Debian
system. Even though they both have the same technical base and even the
same package management, they are still incompatible with respect to
package versions for Python.

And other constraints such as Photoshop not being available for Linux
can complicate things further.
Dmitry said:
that definitely becomes tricky, however not impossible, to track. You do
need a common snapshot for all developers to use anyway - so why not just
package it up?

I do, but based on Python eggs. They are platform independent (at worst,
you can fall back to the source distribution, albeit that sucks on Windows
most of the time), and as I explained in my other post - things are
moving in the right direction.

Don't get me wrong - I love .deb-based systems. But if using them for my
development means that I have to essentially create a full zoo of
various packages *nobody else* uses - I'd rather stick with what's working
for me.

Diez
 

Dmitry S. Makovey

Diez said:
Well, you certainly want a desktop-oriented Linux for users, so you
choose Ubuntu - but then on the server you go with a more stable Debian
system. Even though they both have the same technical base and even the
same package management, they are still incompatible with respect to
package versions for Python.

And other constraints such as Photoshop not being available for Linux
can complicate things further.

actually I had in mind X11 sessions forwarded from the server to the
desktop - all development tools and libraries are on the server, and all
unrelated packages (like Photoshop etc.) are on the desktop.
Diez said:
I do, but based on Python eggs. They are platform independent (at worst,
you can fall back to the source distribution, albeit that sucks on Windows
most of the time), and as I explained in my other post - things are
moving in the right direction.


/I'll play devil's advocate here even though I see your point/

how do you deal with non-Python dependencies then? surely you don't
package ImageMagick into an egg ;)
Diez said:
Don't get me wrong - I love .deb-based systems. But if using them for my
development means that I have to essentially create a full zoo of
various packages *nobody else* uses - I'd rather stick with what's working
for me.

Looks like if you package those and make them available, you'll have quite
a few people using them. I've seen people looking for pre-packaged Python
libs just to stick with OS package management tools. :)

Eggs and debs are not a silver bullet for *any* scenario, so you have to
weigh what you can get out of either one against what you are going to
sacrifice. In my case I know all our systems (servers) run the same OS;
however, the developers' boxes don't. So I provide them with an environment
on the devel/testing servers that they can use as their primary development
environment, or they can develop on their own boxes (which means they are
on their own hunting dependencies/packages/etc.), but before moving forward
they still have to test on a "certified" server. And I don't suggest that
everybody should run *this* type of environment - it just works better in
our case.
 

r0g

Diez said:
Well, you certainly want a desktop-oriented Linux for users, so you
choose Ubuntu - but then on the server you go with a more stable Debian
system. Even though they both have the same technical base and even the
same package management, they are still incompatible with respect to
package versions for Python.

And other constraints such as Photoshop not being available for Linux
can complicate things further.

Photoshop for Windows runs fine under Wine, or at least it does on my
Ubuntu box - just make sure you install the Windows core fonts. I don't
think many people realise how good Wine is these days, maybe because it
was so useless for so long; it's pretty good right now. There's also
'Crossover' if you need even better out-of-the-box Win32 binary
compatibility.

Roger.
 

jhermann

Our solution consists of:
* our own base Python distribution, decoupled from the OS one (for
various reasons, one being version independence)
* distutils / setuptools / virtualenv included in that Python
installation, with no other eggs installed in site-packages
* virtualenv + Paver to manage build environments
* Paver plugins containing standard (continuous) build targets
* a hand-crafted tool that builds an RPM (.deb would be easy, too)
from an egg URL / filename, packaging a ready-to-run virtualenv
environment into the RPM; it's a rather shallow shell over virtualenv
and rpmbuild, automating the process and enforcing company standards.
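The rough shape of such a wrapper might look like this (a sketch only; all names, versions, and paths are illustrative, and the actual hand-crafted tool is not shown in the post):

```shell
APP=myapp
VERSION=1.0
BUILDROOT=$HOME/rpmbuild/BUILDROOT/$APP-$VERSION

# Build a ready-to-run virtualenv for the application...
virtualenv "/tmp/ve-$APP"
"/tmp/ve-$APP/bin/easy_install" "$APP==$VERSION"

# ...stage the whole environment into the RPM buildroot...
mkdir -p "$BUILDROOT/opt"
cp -a "/tmp/ve-$APP" "$BUILDROOT/opt/$APP"

# ...and let rpmbuild wrap it up, using a spec file that simply
# packages /opt/$APP verbatim.
rpmbuild -bb "$APP.spec"
```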
 
